Wednesday, October 26, 2011

Lab Meeting October 27, 2011 (ShaoChen): A multiple hypothesis people tracker for teams of mobile robots (ICRA 2010)

Title: A multiple hypothesis people tracker for teams of mobile robots (ICRA 2010)

Authors: Tsokas, N.A. and Kyriakopoulos, K.J.

Abstract: This paper tackles the problem of tracking walking people with multiple moving robots equipped with laser rangefinders. We present an adaptation of the classic Multiple Hypothesis Tracking method, which allows for one-to-many associations between targets and measurements in each cycle and is thus capable of operating in a multi-sensor scenario. In the context of two experiments, the successful integration of our tracking algorithm into a dual-robot setup is assessed.
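
The adaptation described in the abstract can be pictured with a small toy: branch association hypotheses, score them, and prune, while allowing a track to absorb detections coming from more than one robot in the same cycle. The Python sketch below only illustrates that idea and is not the authors' algorithm; the averaging update, gate radius, clutter penalty and pruning depth are assumed placeholders for the paper's filtering and likelihood model.

import math

class Hypothesis:
    def __init__(self, tracks, log_likelihood=0.0):
        self.tracks = tracks                    # {track_id: (x, y) position estimate}
        self.log_likelihood = log_likelihood

def gaussian_loglik(track_pos, z, sigma=0.3):
    dx, dy = z[0] - track_pos[0], z[1] - track_pos[1]
    return -(dx * dx + dy * dy) / (2 * sigma ** 2) - math.log(2 * math.pi * sigma ** 2)

def expand(hypothesis, measurements_per_sensor, gate=1.0, sigma=0.3):
    """Branch a hypothesis on every detection: each measurement is either
    clutter or assigned to a gated track, so one track may end up updated
    by detections from several sensors within the same cycle."""
    new_hyps = [hypothesis]
    for sensor_meas in measurements_per_sensor:
        for z in sensor_meas:
            branched = []
            for h in new_hyps:
                # Option 1: treat z as clutter (fixed penalty, no update).
                branched.append(Hypothesis(dict(h.tracks), h.log_likelihood - 3.0))
                # Option 2: assign z to any track inside the gate.
                for tid, pos in h.tracks.items():
                    if math.dist(pos, z) < gate:
                        tracks = dict(h.tracks)
                        # Crude averaging update standing in for a Kalman filter.
                        tracks[tid] = ((pos[0] + z[0]) / 2, (pos[1] + z[1]) / 2)
                        branched.append(Hypothesis(
                            tracks, h.log_likelihood + gaussian_loglik(pos, z, sigma)))
            new_hyps = branched
    return new_hyps

def prune(hypotheses, keep=10):
    """Keep only the most likely hypotheses to bound the branching."""
    return sorted(hypotheses, key=lambda h: h.log_likelihood, reverse=True)[:keep]

# Two robots each report one laser detection of the same walking person.
h0 = Hypothesis({"person_1": (0.0, 0.0)})
robot_scans = [[(0.1, 0.05)], [(0.05, -0.1)]]   # one detection list per robot
best = prune(expand(h0, robot_scans))[0]
print(best.tracks, best.log_likelihood)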

Wednesday, October 12, 2011

Lab Meeting October 13, 2011 (Alan): A Model-Selection Framework for Multibody Structure-and-Motion of Image Sequences (IJCV 2008)

Title: A Model-Selection Framework for Multibody Structure-and-Motion of Image Sequences (IJCV 2008)

Authors: Konrad Schindler, David Suter and Hanzi Wang

Abstract: Given an image sequence of a scene consisting of multiple rigidly moving objects, multibody structure-and-motion (MSaM) is the task of segmenting the image feature tracks into the different rigid objects and computing the multiple-view geometry of each object. We present a framework for multibody structure-and-motion based on model selection. In a recover-and-select procedure, a redundant set of hypothetical scene motions is generated. Each subset of this pool of motion candidates is regarded as a possible explanation of the image feature tracks, and the most likely explanation is selected with model selection. The framework is generic and can be used with any parametric camera model, or with a combination of different models. It can deal with sets of correspondences that change over time, and it is robust to realistic amounts of outliers. The framework is demonstrated for different camera and scene models.

Link
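
For intuition on the recover-and-select procedure described above, here is a minimal Python sketch: sample a redundant pool of candidate motions from random point subsets, then greedily keep a candidate only while its explanatory gain beats a fixed model cost. It deliberately simplifies the paper's framework: 2-D affine motions stand in for multiple-view geometry, and a crude count-based score stands in for the actual model-selection criterion; all thresholds are illustrative assumptions.

import numpy as np

def fit_affine(src, dst):
    """Least-squares 2-D affine motion, dst ~ src @ A + t."""
    X = np.hstack([src, np.ones((len(src), 1))])        # n x 3
    params, *_ = np.linalg.lstsq(X, dst, rcond=None)    # 3 x 2
    return params

def residuals(params, src, dst):
    X = np.hstack([src, np.ones((len(src), 1))])
    return np.linalg.norm(X @ params - dst, axis=1)

def sample_candidates(src, dst, n_candidates=50, sample_size=3, rng=None):
    """Recover step: a redundant pool of hypothetical motions from minimal samples."""
    if rng is None:
        rng = np.random.default_rng(0)
    cands = []
    for _ in range(n_candidates):
        idx = rng.choice(len(src), size=sample_size, replace=False)
        cands.append(fit_affine(src[idx], dst[idx]))
    return cands

def select_models(src, dst, candidates, inlier_thresh=2.0, model_cost=20.0):
    """Select step: greedily add the candidate explaining the most
    still-unexplained tracks, as long as the gain exceeds the model cost."""
    unexplained = np.ones(len(src), dtype=bool)
    chosen = []
    while True:
        best_gain, best = 0.0, None
        for c in candidates:
            inliers = (residuals(c, src, dst) < inlier_thresh) & unexplained
            gain = inliers.sum() - model_cost
            if gain > best_gain:
                best_gain, best = gain, (c, inliers)
        if best is None:
            return chosen
        c, inliers = best
        chosen.append(c)
        unexplained &= ~inliers

# Two rigid 2-D motions: half the tracks translate right, the other half down.
rng = np.random.default_rng(1)
src = rng.uniform(0, 100, size=(60, 2))
dst = src.copy()
dst[:30] += [5.0, 0.0]
dst[30:] += [0.0, -7.0]
models = select_models(src, dst, sample_candidates(src, dst))
print("selected", len(models), "motion models")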

Tuesday, October 11, 2011

Lab Meeting October 13, 2011 (Jeff): Object Mapping, Recognition, and Localization from Tactile Geometry

Title: Object Mapping, Recognition, and Localization from Tactile Geometry

Authors: Zachary Pezzementi, Caitlin Reyda, and Gregory D. Hager

Abstract:

We present a method for performing object recognition using multiple images acquired from a tactile sensor. The method relies on using the tactile sensor as an imaging device, and builds an object representation based on mosaics of tactile measurements. We then describe an algorithm that is able to recognize an object using a small number of tactile sensor readings. Our approach makes extensive use of sequential state estimation techniques from the mobile robotics literature, whereby we view the object recognition problem as one of estimating a consistent location within a set of object maps. We examine and test approaches based on both traditional particle filtering and histogram filtering. We demonstrate both the mapping and recognition/localization techniques on a set of raised letter shapes using real tactile sensor data.

Link:
IEEE International Conference on Robotics and Automation (ICRA), 2011
http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=5980363
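
The histogram-filter variant mentioned in the abstract can be illustrated with a tiny Python example: keep a joint belief over (object map, sensor cell), reweight it with each binary contact reading, and shift it by the known sensor motion; the posterior mass per map then serves as the recognition score. The maps, sensor model and motion below are toy assumptions, not the paper's data or implementation.

import numpy as np

# Toy tactile "maps": 1 = raised surface, 0 = background (cf. the raised letter shapes).
maps = {
    "L": np.array([[1, 0, 0],
                   [1, 0, 0],
                   [1, 1, 1]], dtype=float),
    "I": np.array([[0, 1, 0],
                   [0, 1, 0],
                   [0, 1, 0]], dtype=float),
}

# Uniform prior over every (object, cell) hypothesis.
belief = {name: np.full(m.shape, 1.0 / (len(maps) * m.size)) for name, m in maps.items()}

def update(belief, reading, p_correct=0.9):
    """Bayes measurement update: cells whose map value matches the binary
    reading are weighted by p_correct, the others by 1 - p_correct."""
    new = {name: belief[name] * np.where(m == reading, p_correct, 1.0 - p_correct)
           for name, m in maps.items()}
    total = sum(b.sum() for b in new.values())
    return {name: b / total for name, b in new.items()}

def predict(belief, shift=(0, 1)):
    """Motion update for a known sensor displacement of one grid cell to the
    right (np.roll wraps at the border, acceptable for this toy example)."""
    return {name: np.roll(b, shift, axis=(0, 1)) for name, b in belief.items()}

# Slide the sensor one cell to the right between three contact readings.
for z in [1, 1, 1]:
    belief = update(belief, z)
    belief = predict(belief)

# Recognition = posterior probability mass accumulated by each object map.
for name, b in belief.items():
    print(name, round(float(b.sum()), 3))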