Wednesday, January 15, 2014

Lab Meeting, January 16th, 2014 (Henry Lu): Jaeyong Sung, Colin Ponce, Bart Selman, and Ashutosh Saxena, "Unstructured Human Activity Detection from RGBD Images," IEEE International Conference on Robotics and Automation (ICRA), 2012

Title:
Unstructured Human Activity Detection from RGBD Images
Authors:
Jaeyong Sung, Colin Ponce, Bart Selman, and Ashutosh Saxena
Dept. of Computer Science, Cornell University, Ithaca, NY, USA
Abstract:
Being able to detect and recognize human activities is essential for several applications, including personal assistive robotics. In this paper, we perform detection and recognition of unstructured human activity in unstructured environments. We use an RGBD sensor (Microsoft Kinect) as the input sensor, and compute a set of features based on human pose and motion, as well as on image and point-cloud information. Our algorithm is based on a hierarchical maximum entropy Markov model (MEMM), which considers a person's activity as composed of a set of sub-activities. We infer the two-layered graph structure using a dynamic programming approach. We test our algorithm on detecting and recognizing twelve different activities performed by four people in different environments, such as a kitchen, a living room, and an office, and achieve good performance even when the person was not seen before in the training set.
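The dynamic-programming inference the abstract mentions is, at its core, a Viterbi-style pass over per-frame sub-activity scores. As a rough illustration (not the authors' actual two-layer implementation; the state set, scores, and function names below are hypothetical), a single-layer version of that recursion looks like this:

```python
def viterbi(log_emit, log_trans, log_init):
    """Most-likely sub-activity sequence for one chain of frames.

    log_emit:  T x S per-frame log scores for each of S sub-activity states
               (hypothetical stand-ins for the paper's pose/point-cloud features)
    log_trans: S x S log transition scores between sub-activities
    log_init:  length-S initial log scores
    """
    T, S = len(log_emit), len(log_emit[0])
    # dp[s] = best log score of any state path ending in state s at the current frame
    dp = [log_init[s] + log_emit[0][s] for s in range(S)]
    back = []  # backpointers: best predecessor of each state at each frame
    for t in range(1, T):
        row, ptr = [], []
        for s in range(S):
            prev = max(range(S), key=lambda p: dp[p] + log_trans[p][s])
            row.append(dp[prev] + log_trans[prev][s] + log_emit[t][s])
            ptr.append(prev)
        dp, _ = row, back.append(ptr)
    # Recover the best path by following backpointers from the best final state.
    s = max(range(S), key=lambda s: dp[s])
    path = [s]
    for ptr in reversed(back):
        s = ptr[s]
        path.append(s)
    return path[::-1]
```

In the paper's hierarchical setting, a second DP layer would additionally choose how frames group into sub-activities under each candidate high-level activity; the sketch above only shows the frame-level chain.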



