Sunday, June 11, 2006

PAL lab meeting 14 June, 2006 (Chihao): Speaker Attention System for Mobile Robots Using Microphone Array and Face Tracking

From 2006 IEEE International Conference on Robotics and Automation

Authors: Kai-Tai Song, Jwu-Sheng Hu, Chi-Yi Tsai, Chung-Min Chou, Chieh-Cheng Cheng, Wei-Han Liu, and Chia-Hsing Yang

Abstract:
This paper presents a real-time human-robot interface system (HRIS) that processes both speech and vision information to improve the quality of communication between a human and an autonomous mobile robot. The HRIS comprises a real-time speech attention system and a real-time face tracking system. In the speech attention system, a microphone-array voice acquisition system has been developed to estimate the speaker's direction and purify the speaker's speech signal in a noisy environment. The face tracking system aims to track the speaker's face under illumination variation and react to face motion. The proposed HRIS gives a robot the abilities to find a speaker's direction, track the speaker's face, turn its body toward the speaker, focus its attention on the speaker who is talking to it, and purify the speaker's speech. Experimental results show that the HRIS not only purifies the speech signal with significant performance but also tracks a face under illumination variation in real time.
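As a rough illustration of the direction-finding idea in the abstract (not the authors' actual algorithm, which uses a multi-microphone array and signal purification), the speaker's bearing can be sketched from the time difference of arrival between just two microphones, found by cross-correlation; the function and signal names below are hypothetical:

```python
import numpy as np

def estimate_doa(left, right, fs, mic_dist, c=343.0):
    """Estimate the speaker's direction (degrees, 0 = straight ahead)
    from the inter-microphone time delay via cross-correlation.
    Positive angles mean the source is closer to the left microphone."""
    # Lag (in samples) at which `right` best matches a delayed `left`.
    corr = np.correlate(right, left, mode="full")
    lag = int(np.argmax(corr)) - (len(left) - 1)
    tau = lag / fs  # time difference of arrival in seconds
    # Far-field model: tau = (mic_dist / c) * sin(theta); clip for arcsin.
    return float(np.degrees(np.arcsin(np.clip(tau * c / mic_dist, -1.0, 1.0))))

# Hypothetical usage: a noise burst arriving 4 samples later at the right mic.
rng = np.random.default_rng(0)
src = rng.standard_normal(2048)
left_sig = src
right_sig = np.concatenate([np.zeros(4), src[:-4]])
angle = estimate_doa(left_sig, right_sig, fs=16000, mic_dist=0.2)
```

A real system like the one in the paper would use more microphones and a beamformer to both localize the speaker and enhance the speech, but the delay-to-angle geometry above is the common starting point.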



Shao-Wen (Any) Yang sent an account and password to our lab (subject: "Access ICRA'06 papers via WWW").
