Editor’s Note: David Feil-Seifer, a Postdoctoral Fellow in the Computer Science Department at Yale University, wrote this essay for Circuit Cellar. Feil-Seifer focuses his research on socially assistive robotics (SAR), particularly the study of human-robot interaction for children with autism spectrum disorders (ASD). His dissertation work addressed autonomous robot behavior so that socially assistive robots can recognize and respond to a child’s behavior in unstructured play. He recently was hired as Assistant Professor of Computer Science at the University of Nevada, Reno.
Health care and education crises loom on the horizon. Baby boomers are aging and requiring more care, which puts pressure on caregivers. The US nursing shortage is projected to worsen. Similarly, the rapid growth in diagnoses of developmental disorders suggests a greater need for educators, one the education system is struggling to meet. These large and growing shortfalls in caregivers and educators may be addressed (in part) through the use of socially assistive robotics.
In health care, non-contact repetitive tasks make up a large part of a caregiver’s day. Tasks such as monitoring instruments only require a check to verify that readings are within norms. By offloading these tasks to an automated system, a nurse or doctor could spend more time doing work that better leverages their medical training. A robot can effectively perform simple repetitive tasks (e.g., monitoring breath spirometry exercises or post-stroke rehabilitation compliance).
I coined the term “socially assistive robotics” (SAR) to describe robots that provide such assistance through social rather than physical interaction. My research involves developing SAR algorithms and complete systems for domains such as post-stroke rehabilitation, elder care, and therapeutic interaction for children with autism spectrum disorders (ASD). A key challenge for such autonomous SAR systems is the ability to sense, interpret, and properly respond to human social behavior.
One of my research priorities is developing a socially assistive robotic system for children with ASD. ASD is characterized by social impairments, communication difficulties, and repetitive and stereotyped behaviors. Significant anecdotal evidence indicates that some children with ASD respond socially to robots, which could have therapeutic ramifications. We envision a robot that could act as a catalyst for social interaction, both human-robot and human-human, thus aiding ASD users’ human-human socialization. In such a scenario, the robot is not specifically generating social behavior or participating in social interaction, but instead behaves in a way known to provoke human-human interaction.
Enabling a robot to exhibit and understand social behavior with a child is challenging. Children are highly individual, so technology used for social interaction must be robust to be effective. I developed an autonomous robot that recognizes and appropriately responds to a child’s free-form behavior in play contexts, similar to those seen in some more traditional ASD therapies.
To detect and mitigate child distress, I developed a methodology for learning and then applying a data-driven spatiotemporal model of social behavior based on distance-based features to automatically differentiate between typical vs. aversive child-robot interactions. Using a Gaussian mixture model learned over distance-based feature data, the developed system was able to detect and interpret social behavior with sufficient accuracy to recognize child distress. The robot can use this to change its own behavior to encourage positive social interaction.
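The idea of classifying interaction style with a Gaussian mixture model over distance-based features can be sketched in a few lines. The following is a minimal illustration only, using synthetic data and hypothetical features (child-robot distance and radial velocity); the actual features, window sizes, and model parameters of the published system are not shown here. One mixture is fit per interaction class, and a window of observations is labeled by whichever model assigns it the higher average log-likelihood.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Hypothetical distance-based features per time step:
# [child-robot distance (m), radial velocity (m/s)]
rng = np.random.default_rng(0)
typical = rng.normal([1.0, 0.0], [0.2, 0.1], size=(200, 2))   # child stays near robot
aversive = rng.normal([2.5, 0.4], [0.4, 0.2], size=(200, 2))  # child retreats

# One mixture per interaction class
gmm_typical = GaussianMixture(n_components=2, random_state=0).fit(typical)
gmm_aversive = GaussianMixture(n_components=2, random_state=0).fit(aversive)

def classify(window: np.ndarray) -> str:
    """Label a window of feature vectors by comparing average log-likelihoods."""
    if gmm_typical.score(window) >= gmm_aversive.score(window):
        return "typical"
    return "aversive"

print(classify(rng.normal([1.0, 0.0], [0.2, 0.1], size=(30, 2))))  # typical
print(classify(rng.normal([2.5, 0.4], [0.4, 0.2], size=(30, 2))))  # aversive
```

A classifier like this gives the robot a signal it can act on: a run of “aversive” windows triggers a change in the robot’s behavior to re-engage the child.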
To encourage human-human interaction once human-robot interaction was achieved, I developed a navigation planner that used the above spatiotemporal model. This was used to maintain the robot’s spatial relationship with a child to sustain interaction while also guiding the child to a particular location in a room. This could be used to encourage a child to move toward another interaction partner (e.g., a parent). The desired spatial interaction behavior is achieved by modifying an established trajectory planner to weigh candidate trajectories based on conformity to a trained model of the desired behavior.
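The core of such a planner is a scoring function over candidate trajectories that trades off progress toward the goal against conformity to the learned spatial model. The sketch below is a simplified stand-in, not the published planner: the weights, the preferred interaction distance, and the point-endpoint representation of a trajectory are all illustrative assumptions.

```python
import math

# Hypothetical parameters (assumptions, not the published planner's values)
PREFERRED_DIST = 1.2     # metres: spacing the trained model says sustains interaction
W_GOAL, W_SOCIAL = 1.0, 2.0

def score(traj_end, child_pos, goal_pos):
    """Lower is better: remaining distance to the goal, plus deviation
    from the model-preferred child-robot spacing."""
    goal_cost = math.dist(traj_end, goal_pos)
    social_cost = abs(math.dist(traj_end, child_pos) - PREFERRED_DIST)
    return W_GOAL * goal_cost + W_SOCIAL * social_cost

def pick_trajectory(candidates, child_pos, goal_pos):
    """Choose the candidate trajectory endpoint with the lowest combined cost."""
    return min(candidates, key=lambda end: score(end, child_pos, goal_pos))

child, goal = (0.0, 0.0), (4.0, 0.0)
candidates = [(0.2, 0.0), (1.2, 0.0), (3.5, 0.0)]
print(pick_trajectory(candidates, child, goal))  # (1.2, 0.0)
```

The winning candidate closes on the goal without breaking the preferred spacing, so the robot drifts toward the target location while the child, maintaining the comfortable distance, follows.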
I also developed a methodology for robot behavior that provides autonomous feedback for a robot-child imitation and turn-taking game. This was accomplished by incorporating an established therapeutic model of feedback along with a trained model of imitation behavior. This is used as part of an autonomous system that can play Simon Says, recognize when the rules have been violated, and provide appropriate feedback.
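The rule-checking at the heart of a Simon Says game reduces to a small decision table once perception is abstracted away. The sketch below assumes a hypothetical boolean interface (whether the prompt began with “Simon says” and whether the child acted); the real system's vision-based imitation recognition and its graded therapeutic feedback model are elided.

```python
def feedback(simon_said: bool, child_acted: bool) -> str:
    """Map one Simon Says turn to a feedback category (illustrative labels)."""
    if simon_said and child_acted:
        return "praise"             # correct imitation
    if simon_said and not child_acted:
        return "encourage"          # missed prompt: offer graded assistance
    if not simon_said and child_acted:
        return "gentle correction"  # rule violated: acted without "Simon says"
    return "praise"                 # correctly withheld the action

print(feedback(True, True))   # praise
print(feedback(False, True))  # gentle correction
```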
A growing body of data supports the hypothesis that robots have the potential to aid in addressing the needs of people through non-contact assistance. My research, along with that of many others, has resulted in technical advances for robots providing assistance to people. However, there is a long way to go before these systems can be deployed as a therapeutic platform. Given that the beneficiary populations are growing, and therapeutic needs are increasing far more rapidly than the resources available to address them, SAR could provide lasting benefits to people in need.