Authors:
Geoffrey A. Hollinger, Yavor Georgiev, Anthony Manfredi, Bruce A. Maxwell, Zachary A. Pezzementi, and Benjamin Mitchell
IEEE/RSJ Int'l Conf. on Intelligent Robots and Systems
Abstract:
In this paper, we describe a robot that interacts with humans in a crowded conference environment. The robot detects faces, determines the shirt color of onlooking conference attendees, and reacts with a combination of speech, music, and movement responses. It continuously updates an internal emotional state, modeled after findings from human psychology research. Using empirically determined mapping functions, the robot’s state in the emotion space is translated into a particular set of sound and movement responses. We successfully demonstrate this system at the AAAI ’05 Open Interaction Event, showing the potential for emotional modeling to improve human-robot interaction.
link
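The abstract does not give the model's details, but the update-and-mapping loop it describes might look roughly like the following Python sketch, assuming a two-dimensional valence/arousal emotion space that decays toward a neutral resting state. All names, constants, and thresholds here are illustrative assumptions, not values from the paper.

```python
# Hypothetical sketch of the emotion-update-and-mapping loop described in the
# abstract. The valence/arousal state, decay constant, and response thresholds
# are illustrative assumptions, not the paper's empirically determined values.

NEUTRAL = (0.0, 0.0)   # resting emotional state (valence, arousal)
DECAY = 0.95           # per-step pull back toward neutral

def update_emotion(state, stimulus):
    """Decay the state toward neutral, then add the stimulus vector."""
    valence, arousal = state
    dv, da = stimulus
    valence = DECAY * valence + dv
    arousal = DECAY * arousal + da
    # Clamp to the unit square so the mapping functions stay in range.
    clamp = lambda x: max(-1.0, min(1.0, x))
    return (clamp(valence), clamp(arousal))

def select_response(state):
    """Map a point in emotion space to a (speech, music, movement) response."""
    valence, arousal = state
    speech = "greeting" if valence > 0.3 else "grumble"
    music = "upbeat" if arousal > 0.5 else "calm"
    movement = "approach" if valence > 0.0 and arousal > 0.0 else "idle"
    return speech, music, movement

# Example: a detected face with a liked shirt color nudges valence and
# arousal upward a little on each perception cycle.
state = NEUTRAL
for _ in range(10):
    state = update_emotion(state, stimulus=(0.1, 0.15))
print(state, select_response(state))
```

In this sketch the decay term keeps the robot's mood drifting back to neutral between stimuli, so sustained interaction is needed to hold an excited state, which is one plausible reading of the continuously updated emotional state the abstract mentions.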