Monday, November 02, 2009

Lab Meeting November 4, 2009 (Jimmy): Learning To Detect Unseen Object Classes by Between-Class Attribute Transfer

Title: Learning To Detect Unseen Object Classes by Between-Class Attribute Transfer
Authors: Christoph H. Lampert, Hannes Nickisch, and Stefan Harmeling
In: CVPR2009

Abstract
We study the problem of object classification when training and test classes are disjoint, i.e. no training examples of the target classes are available. This setup has hardly been studied in computer vision research, but it is the rule rather than the exception, because the world contains tens of thousands of different object classes and for only a very few of them image collections have been formed and annotated with suitable class labels.

In this paper, we tackle the problem by introducing attribute-based classification. It performs object detection based on a human-specified high-level description of the target objects instead of training images. The description consists of arbitrary semantic attributes, like shape, color or even geographic information. Because such properties transcend the specific learning task at hand, they can be pre-learned, e.g. from image datasets unrelated to the current task. Afterwards, new classes can be detected based on their attribute representation, without the need for a new training phase. In order to evaluate our method and to facilitate research in this area, we have assembled a new large-scale dataset, “Animals with Attributes”, of over 30,000 animal images that match the 50 classes in Osherson’s classic table of how strongly humans associate 85 semantic attributes with animal classes. Our experiments show that by using an attribute layer it is indeed possible to build a learning object detection system that does not require any training images of the target classes.

[link]
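To make the idea concrete before the meeting, here is a minimal sketch of how attribute-based zero-shot classification can be wired together, in the spirit of the paper's direct attribute prediction: per-attribute classifiers are pre-learned on images of the seen classes, and at test time an unseen class is scored by how well its attribute signature matches the predicted attributes. The toy data, the class/attribute matrices, and the simple product-of-probabilities matching rule below are my own illustrative assumptions, not the authors' code.

# Sketch of attribute-based zero-shot classification (illustrative only).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy setup: 4 binary attributes, 3 seen (training) classes, 2 unseen (test) classes.
# Rows are classes, columns are human-specified attributes (cf. Osherson's table).
seen_attr = np.array([[1, 0, 1, 0],
                      [0, 1, 1, 0],
                      [1, 1, 0, 1]])
unseen_attr = np.array([[0, 0, 1, 1],
                        [1, 0, 0, 1]])

# Fake image features for the seen classes (100 samples per class, 5 dims).
X_train = np.vstack([rng.normal(loc=c, scale=1.0, size=(100, 5))
                     for c in range(3)])
y_train = np.repeat(np.arange(3), 100)

# Step 1: pre-learn one classifier per attribute, using each image's class-level
# attribute annotation as its attribute label.
attr_clfs = []
for a in range(seen_attr.shape[1]):
    attr_labels = seen_attr[y_train, a]
    clf = LogisticRegression(max_iter=1000).fit(X_train, attr_labels)
    attr_clfs.append(clf)

# Step 2: at test time, predict attribute probabilities for a new image and
# score each *unseen* class by how well its attribute signature matches.
def predict_unseen(x):
    x = x.reshape(1, -1)
    p = np.array([clf.predict_proba(x)[0, 1] for clf in attr_clfs])
    # Simple matching score: product of p (where the class has the attribute)
    # or 1 - p (where it does not).
    scores = [np.prod(np.where(sig == 1, p, 1 - p)) for sig in unseen_attr]
    return int(np.argmax(scores))

x_test = rng.normal(size=5)
print("predicted unseen class index:", predict_unseen(x_test))

In the paper the attribute predictions over the 85 semantic attributes are combined probabilistically; the product rule above is just the simplest stand-in for that matching step.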

I will also try to introduce the NIPS 2009 paper Zero-Shot Learning with Semantic Output Codes by M. Palatucci, D. Pomerleau, G. Hinton, and T.M. Mitchell, which provides a formalization of the problem.
