Social Robots

Some of our lab's early work (Lisetti et al., 2004; Murphy et al., 2002) helped pave the way to establish human-robot interaction (HRI) as a necessary new field of study. Although the two fields are closely related, robots' inherent physicality means that some of the research questions HRI raises differ from those of human-computer interaction (HCI); they need to be studied in terms of both how they relate to, and how they differ from, HCI research questions.

Dr. Lisetti served on the Program Committee of the 1st Human-Robot Interaction (HRI) Conference in 2006, again in 2007 and 2008, and was the HRI Conference Publicity Chair in 2009.


Towards building rapport with a Human Support Robot

Toyota Human Support Robot research prototype.

Human support robots (mobile robots able to perform useful domestic manipulation tasks) might be better accepted by people if they can communicate in ways people naturally understand: speech, but also facial expressions, postures, and other nonverbal cues. Subtle (unconscious) mirroring of nonverbal cues during conversation promotes rapport building, which is essential for good communication.

We investigate whether, as in human-human communication, the ability of a robot to mirror its user's head movements and facial expressions in real time can improve the user's experience with it. We describe the technical integration of a Toyota Human Support Robot (HSR) with a facially expressive 3D embodied conversational agent (ECA), named ECA-HSR. The HSR and the ECA are aware of the user's head movements and facial emotions, and can mirror them in real time. We then discuss a user study we designed to assess the impact of each modality.
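Conceptually, the real-time mirroring loop tracks the user's head pose and smoothly replays it on the robot's head joints. The sketch below is a hypothetical illustration only, not the actual ECA-HSR code: the tracker and robot interfaces, their method names, and the smoothing factor are all our assumptions.

```python
# Hypothetical sketch of a head-mirroring loop (names and interfaces are illustrative).
import time

class HeadMirror:
    def __init__(self, tracker, robot_head, smoothing=0.3):
        self.tracker = tracker          # assumed to yield the user's (pan, tilt) in radians
        self.robot_head = robot_head    # assumed to accept target (pan, tilt) joint angles
        self.smoothing = smoothing      # low-pass factor to keep mirrored motion subtle
        self._pan, self._tilt = 0.0, 0.0

    def step(self):
        pose = self.tracker.latest_head_pose()
        if pose is None:                # no face detected this frame
            return
        pan, tilt = pose
        # Exponential smoothing avoids jittery, uncanny robot head motion.
        self._pan += self.smoothing * (pan - self._pan)
        self._tilt += self.smoothing * (tilt - self._tilt)
        self.robot_head.move_to(self._pan, self._tilt)

    def run(self, hz=15):
        # Update at a modest rate; mirroring should feel responsive but not twitchy.
        while True:
            self.step()
            time.sleep(1.0 / hz)
```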

Our results suggest that interacting with an ECA-HSR that mirrors both the user's head movements and facial expressions is preferred over the other conditions.

Publication: Pasternak, K., Wu, Z., Visser, U., and Lisetti, C. (2021). Towards building rapport with a Human Support Robot. In Proceedings of the 2021 RoboCup Conference Symposium. [PDF]; demo URL: https://www.youtube.com/watch?v=r83PNP0GTVw&list=PLbunW0E2d97v3qgQ2HZdbjeHx0AdhXnOq&index=21


Cherry, the Little Red Social Robot

Cherry, the social robot

In 2000, Cherry (shown on the left) was one of the first fully integrated mobile robotic systems to test human-robot interaction in social contexts by combining a mobile robot with an embodied conversational agent.

We built Cherry to give visitors guided tours of our Computer Science suite. Cherry had a map of the office floor, could recognize faculty members' faces using (third-party) face recognition software, and could talk to visitors about each faculty member's research. She would also get frustrated, and show it, if she kept trying to find a professor whose door was always closed…

We evaluated people's reactions before and after they met Cherry: the more people interacted with her, the more they liked her. When the project ended, many people told us they missed her roaming around our Computer Science floor, and asked if we could bring her back. If you are wondering about the red feather boa attire, we aimed to keep users' expectations in line with Cherry's actual ability level by keeping her look fun (while hiding a multitude of unsightly, distracting cables).

Publication: C. Lisetti, S. Brown, K. Alvarez, and A. Marpaung. A Social Informatics Approach to Human-Robot Interaction with a Service Social Robot. IEEE Transactions on Systems, Man and Cybernetics, Vol. 34, No. 2, pp. 195-209, 2004.


AAAI Robot Competition: Hors d’Oeuvres Anyone? 

Our team of cooperating robots at the AAAI National Conference on AI robot competition, Hors d'Oeuvres Anyone?

In 2000, we won the Nils Nilsson Award for Integrating AI Technologies and the Technical Innovation Award at the National Conference on Artificial Intelligence organized by the Association for the Advancement of Artificial Intelligence (AAAI), where we competed in the AAAI Robot Competition: Hors d'Oeuvres Anyone? The task was to use robots to entertain, engage, and serve hors d'oeuvres to as many guests as possible at the conference's main reception, without running anyone over!

We were the first team to introduce a pair of collaborating social robots: Butler (the taller one) and its assistant (the smaller one). Butler moved around the crowd offering hors d'oeuvres on its tray; a laser sensor let it detect when a treat was taken, and hence determine when the tray was running low on food and needed refilling. Sonars enabled both robots to avoid obstacles such as guests… The assistant stood by the hors d'oeuvres refill station until Butler called it over to bring a full tray for a tray exchange.

They were designed with an emotion-based three-layer architecture that simulated some of the roles of emotions in human decision-making: for example, if the assistant was too slow, Butler would express increasing frustration over time, until it eventually decided to go get the tray itself. If you're wondering about the armadillo attire, the competition was in Texas… and we honored the local fauna.
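The frustration dynamic can be pictured as a simple accumulator that rises while the assistant has not arrived and triggers a behavior switch once it crosses a threshold. The sketch below is a hypothetical illustration under that assumption, not the architecture's actual code; the rise rate, threshold, and names are all illustrative.

```python
# Hypothetical sketch of a frustration accumulator driving a behavior switch.
import time

class FrustrationMonitor:
    def __init__(self, rise_per_second=0.05, threshold=1.0):
        self.level = 0.0                  # current frustration level, 0.0 to threshold
        self.rise = rise_per_second       # how fast frustration grows while waiting
        self.threshold = threshold        # point at which the robot changes strategy
        self._last = time.monotonic()

    def update(self, assistant_arrived):
        now = time.monotonic()
        dt = now - self._last
        self._last = now
        if assistant_arrived:
            self.level = 0.0              # relief: reset once the assistant shows up
        else:
            self.level = min(self.level + self.rise * dt, self.threshold)
        return self.level

    def should_fetch_tray_itself(self):
        # Above the threshold, stop waiting and act autonomously.
        return self.level >= self.threshold
```

In this reading, the emotion signal acts as a compact summary of how long a goal has been blocked, letting the deliberative layer abandon a stalled plan without explicitly enumerating every failure case.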

Publication:  R. Murphy, C. Lisetti, R. Tardif, L. Irish, and A. Gage.   Emotion-Based Control of Cooperating Heterogeneous Mobile Robots.  IEEE Transactions on Robotics and Automation, Vol. 18, No. 5, October 2002.

The AAAI prize included the red Amigobot, which we used to create Cherry, our little red robot project (shown above).