Leveraging Physical Interactions to Enable Robotic Assistive Care (via Zoom)

Abstract: How do we build robots that can assist people with mobility limitations in activities of daily living? To successfully perform these activities, a robot needs to be able to physically interact with humans and objects in unstructured human environments. In this talk, I will focus on one such activity: feeding. Successful robot-assisted feeding depends on reliable bite acquisition of hard-to-model deformable food items and easy bite transfer. Using insights from human studies, I will showcase algorithms and technologies that leverage multiple sensing modalities to perceive varied food item properties and determine successful strategies for bite acquisition and transfer. Drawing on feedback from all stakeholders, I will show how we built an autonomous robot-assisted feeding system using these algorithms and technologies and deployed it in the real world, where it fed real users with mobility limitations.

Bio: Tapomayukh "Tapo" Bhattacharjee is an Assistant Professor in the Department of Computer Science at Cornell University, where he directs the EmPRISE Lab. He completed his Ph.D. in Robotics at the Georgia Institute of Technology and was an NIH Ruth L. Kirschstein NRSA postdoctoral research associate in Computer Science & Engineering at the University of Washington. He wants to enable robots to assist people with mobility limitations with activities of daily living, and he believes that enabling efficient and safe physical interactions between robots and their immediate environments is the key. His work spans the fields of human-robot interaction, haptic perception, and robot manipulation.