Monday, February 27, 2012

Lab Meeting Feb. 29 (Hank): Creating Household Environment Map for Environment Manipulation Using Color Range Sensors on Environment and Robot

Authors: Yohei Kakiuchi, Ryohei Ueda, Kei Okada, and Masayuki Inaba

Abstract— A humanoid robot working in a household environment with people needs to localize itself and continuously update the locations of obstacles and manipulable objects. Achieving such a system requires a robust perception method that can efficiently track the frequently changing environment.

We propose a method for mapping a household environment using multiple stereo and depth cameras mounted on the humanoid head and in the environment. The method relies on colored 3D point cloud data computed from the sensors. We achieve robot localization by matching the point clouds from the robot sensor data directly with the environment sensor data. Object detection is performed using the Iterative Closest Point (ICP) algorithm with a database of known point cloud models. In order to guarantee accurate object detection results, objects are only detected within the robot sensor data. Furthermore, we utilize the environment sensor data to map the remaining obstacles as bounding convex hulls.

We show experimental results of creating a household environment map with known object labels and of estimating the robot position within this map.
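As a rough illustration of the two point cloud operations named in the abstract (ICP registration of the robot cloud against the environment cloud, and wrapping unmodeled obstacles in bounding convex hulls), here is a minimal sketch using the Open3D library. This is not the authors' implementation; the file names and parameter values are placeholder assumptions.

```python
# Sketch only: ICP alignment and convex hull extraction on colored point clouds.
import numpy as np
import open3d as o3d

def localize_by_icp(robot_cloud, env_cloud, voxel=0.02, max_dist=0.05):
    """Estimate the robot-to-environment transform by registering the
    robot's point cloud against the environment sensor cloud with ICP."""
    src = robot_cloud.voxel_down_sample(voxel)   # downsample for speed
    tgt = env_cloud.voxel_down_sample(voxel)
    result = o3d.pipelines.registration.registration_icp(
        src, tgt, max_dist, np.eye(4),
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation  # 4x4 homogeneous transform

def obstacle_hull(obstacle_cloud):
    """Approximate an unmodeled obstacle by the convex hull of its points."""
    hull_mesh, _ = obstacle_cloud.compute_convex_hull()
    return hull_mesh

if __name__ == "__main__":
    # "robot.pcd" and "environment.pcd" are hypothetical example files.
    robot_cloud = o3d.io.read_point_cloud("robot.pcd")
    env_cloud = o3d.io.read_point_cloud("environment.pcd")
    T = localize_by_icp(robot_cloud, env_cloud)
    print("Estimated robot pose in the environment frame:\n", T)
```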


[link]
