Creating a Haptic Empathetic Robot Animal That Feels Touch and Emotion

Citable link (URI): http://hdl.handle.net/10900/152955
http://nbn-resolving.de/urn:nbn:de:bsz:21-dspace-1529558
http://dx.doi.org/10.15496/publikation-94294
Document type: Dissertation
Date of publication: 2026-02-23
Language: English
Faculty: 7 Mathematisch-Naturwissenschaftliche Fakultät
Department: Computer Science
Advisor: Kuchenbecker, Katherine J. (Prof. Dr.)
Date of oral examination: 2024-02-23
DDC classification: 004 - Computer science
500 - Natural sciences
Keywords:
robotics
human-robot interaction
haptics
socially assistive robotics
robot-assisted therapy
social touch
tactile sensors
tactile sensing
gesture classification
License: http://tobias-lib.uni-tuebingen.de/doku/lic_ohne_pod.php?la=de http://tobias-lib.uni-tuebingen.de/doku/lic_ohne_pod.php?la=en

Summary:

The dissertation is embargoed until February 23, 2026!

Abstract:

Touch is essential to everyday interaction. Humans commonly use social touch to communicate needs, express emotions, and request attention; however, utilizing touch in these ways can be challenging for children with autism. Furthermore, while socially assistive robots are generally viewed as a promising means of helping children with autism via robot-assisted therapy, existing robots possess almost no touch-perception capabilities. We propose that socially assistive robots could better understand user intentions and needs, and provide increased support opportunities, if they could perceive and intelligently react to touch.

This thesis presents the design, creation, and testing of a touch-perceptive and emotionally responsive robot, which we refer to as the Haptic Empathetic Robot Animal, or HERA. HERA is intended to demonstrate new technical capabilities that therapists could use to teach children with autism about safe and appropriate touch. We divide HERA's development into four principal stages: establishing touch-sensing guidelines, building touch-perceiving sensors, creating a long-term emotion model that responds to touch inputs, and integrating the subsystems for real-time performance.

To start, we wanted to determine whether a touch-perceiving robot would be useful to the autism support community. We therefore begin this dissertation by establishing the first-ever touch-perception guidelines for therapy robots. We examined the existing relevant literature to create an initial set of six tactile-perception requirements, and we then evaluated these requirements through interviews with eleven experienced autism specialists from a variety of backgrounds. Thematic analysis of the specialists' feedback revealed three overarching themes: the touch-seeking and touch-avoiding behavior of autistic children, their individual differences and customization needs, and the roles that a touch-perceiving robot could play in such interactions.
Using feedback collected from the specialists, we refined our initial list into finalized qualitative requirements based on seven attributes: robustness and maintainability, sensing range, feel, gesture identification, spatial, temporal, and adaptation. Finally, by utilizing current best practices in human-robot interaction, tactile sensor development, and signal processing, we transformed these qualitative requirements into quantitative specifications, which future roboticists and engineers can use in the design process of social robots.

We then proceeded to use these touch-perception guidelines to endow HERA with a sense of touch. In the second part of this dissertation, we introduce a low-cost, easy-to-build, soft tactile-perception system that we created for the NAO robot, an existing rigid-bodied robot upon which HERA was built. We also share participants' feedback on touching this system. We installed four custom fabric-and-foam-based resistive sensors on the curved surfaces of a NAO's left upper limb, including its hand, lower arm, upper arm, and shoulder. We then investigated how different users perform common social touches.

In our user study, fifteen adults performed five types of affective touch communication gestures (hitting, poking, squeezing, stroking, and tickling) at two force intensities (gentle and energetic) on the four sensor locations. After training on these touches in a 70%-30% train-test split, a gesture-classification algorithm based on a random forest identified the correct combined touch gesture and force intensity on windows of held-out test data with an average accuracy of 74.1%, which is more than eight times better than chance. Participants rated the sensor-equipped arm as pleasant to touch and liked the robot's presence significantly more after touch interactions.
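The windowed random-forest classification described above can be illustrated with a minimal sketch. The dissertation's actual features and sensor signals are not reproduced here; the hand-crafted window features and the synthetic stand-in data below are our assumptions for illustration only.

```python
# Illustrative sketch of the gesture-classification pipeline: windows of
# tactile-sensor readings are summarized by simple features and classified
# by a random forest. The feature set and synthetic data are assumptions,
# not the dissertation's actual signals.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

GESTURES = ["hit", "poke", "squeeze", "stroke", "tickle"]
INTENSITIES = ["gentle", "energetic"]
LABELS = [f"{g}-{i}" for g in GESTURES for i in INTENSITIES]  # 10 classes

def window_features(window: np.ndarray) -> np.ndarray:
    """Summarize one window of readings (samples x taxels) into a feature vector."""
    return np.concatenate([
        window.mean(axis=0),                            # average pressure per taxel
        window.max(axis=0),                             # peak pressure per taxel
        np.abs(np.diff(window, axis=0)).mean(axis=0),   # high-frequency activity
    ])

# Synthetic stand-in data: 40 windows per class, 50 samples from 4 taxels,
# with a class-dependent offset so the classes are actually separable.
X, y = [], []
for label_idx in range(len(LABELS)):
    for _ in range(40):
        w = rng.normal(loc=0.1 * label_idx, scale=0.3, size=(50, 4))
        X.append(window_features(w))
        y.append(label_idx)
X, y = np.array(X), np.array(y)

# 70%-30% train-test split, as in the study described above.
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)  # chance for 10 balanced classes is 0.1
```

In a real system, each window would come from a sliding buffer of resistive-sensor readings rather than a synthetic generator, but the train/test and classification structure would be the same.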
Our results showed that this type of tactile-perception system can detect necessary social-touch communication cues from users, can be tailored to a variety of robot body parts, and can provide HRI researchers with the tools needed to implement social touch in their own systems.

Next, we needed HERA to react to the social touches that it detected in an emotionally intelligent way, so that children could see the impact of their touches on the robot. However, in the field of human-robot interaction, autonomous physical robots often lack a dynamic internal emotional state, instead displaying brief, fixed emotion routines intended to promote specific user interactions. We hypothesized that users' perceptions of a social robotic system would improve when the robot provides emotional responses on both shorter and longer time scales (reactions and moods), based on touch inputs from the user.

In the third part of the dissertation, we evaluated this proposal through an online study in which 51 diverse participants watched, rated, and commented on nine randomly ordered videos (a three-by-three full-factorial design) of HERA being touched by a human. Users provided the highest ratings in terms of agency, ambient activity, enjoyability, and touch perceptivity for scenarios in which HERA showed emotional reactions and either neutral or emotional moods in response to social touch gestures. Finally, we summarize key qualitative findings about users' preferences for reaction timing, the ability of robot mood to convey persistent memory, and the tendency to perceive neutral behaviors as signs of a curious or self-aware robot.

In the final portion of this dissertation, we utilize our findings to improve and integrate HERA's subsystems. We introduce a mathematical emotion model that can easily be implemented in a social robot to enable it to react intelligently to external stimuli.
The robot's affective state is modeled as a second-order dynamic system analogous to a mass connected to ground by a parallel spring and damper. By adjusting the parameters of this emotion model, one can modify the three main aspects of the robot's personality, which we term disposition, stoicism, and calmness. We also introduce an improved version of our tactile-perception system, with sixteen sensors covering HERA's full body, and upgrades to the sensor design, microcontroller, and gesture-classification approach. We explain the connections between the various subsystems and demonstrate their ability to create a robotic animal that feels both touch and emotion.

The primary contribution of this thesis is to present practical methods for enabling robots to perceive and respond to dynamic social touch. Additionally, we present touch-perception guidelines that translate the needs of autism therapists into specifications for future roboticists and engineers, easy-to-follow DIY instructions on how to build our fabric-based tactile sensors, and a mathematical representation of our emotion response algorithm. With these systems, we can create more socially intelligent robots that can use their sense of touch to better interact with people in care settings and while completing daily tasks.
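The mass-spring-damper analogy described above can be sketched as a short simulation. This is an illustrative reconstruction, not the dissertation's implementation: the mapping of disposition, stoicism, and calmness onto the equilibrium position, mass, and damping coefficient, and the specific parameter values, are our assumptions.

```python
# Sketch of a second-order (mass-spring-damper) emotion model: the affective
# state x is pushed by touch stimuli and settles back toward a resting value.
# The parameter-to-personality mapping below is an illustrative assumption.
from dataclasses import dataclass

@dataclass
class EmotionModel:
    disposition: float = 0.0   # resting affective state (spring equilibrium)
    stoicism: float = 2.0      # mass m: larger -> stimuli move the state less
    calmness: float = 1.5      # damping b: larger -> fewer mood oscillations
    stiffness: float = 1.0     # spring k: pull back toward the disposition
    x: float = 0.0             # current affective state
    v: float = 0.0             # rate of change of affect

    def step(self, stimulus: float, dt: float = 0.01) -> float:
        """Advance m*x'' + b*x' + k*(x - disposition) = stimulus by one
        time step dt, using semi-implicit Euler integration."""
        a = (stimulus - self.calmness * self.v
             - self.stiffness * (self.x - self.disposition)) / self.stoicism
        self.v += a * dt
        self.x += self.v * dt
        return self.x

model = EmotionModel()
# A pleasant stroke modeled as a brief positive force (1 s at 100 Hz):
# the state rises away from the disposition...
for _ in range(100):
    model.step(stimulus=1.0)
peak = model.x
# ...then, with no further input, decays back toward the disposition.
for _ in range(5000):
    model.step(stimulus=0.0)
settled = model.x
```

In this formulation, a large "stoicism" (mass) makes the robot hard to perturb, a large "calmness" (damping) suppresses oscillatory mood swings, and the "disposition" sets the mood the robot drifts back to between touches.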
