Sunday, March 19, 2006

My talk this Wednesday.

My talk this Wednesday will have two parts.
First, I'll give a brief demo of the material from my last talk (about AdaBoost).
Second, I'll talk about an extension of AdaBoost.
This time I'll introduce the AdaBoost algorithm in the multiclass setting.
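As a quick refresher for the first part, here is a rough sketch of plain binary AdaBoost in the usual +1/-1 formulation, using decision stumps from scikit-learn as the weak learner. This is only my own illustrative sketch (the function names and the choice of stumps are mine), not pseudocode taken from either paper.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_train(X, y, n_rounds=50):
    """Binary AdaBoost with labels y in {-1, +1} and decision stumps as weak learners."""
    n = len(y)
    D = np.full(n, 1.0 / n)                 # distribution over training examples
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=D)
        pred = stump.predict(X)
        eps = float(np.sum(D[pred != y]))   # weighted training error
        if eps >= 0.5:                      # weak learner no better than random guessing
            break
        eps = max(eps, 1e-10)               # avoid division by zero for a perfect stump
        alpha = 0.5 * np.log((1 - eps) / eps)
        # Multiplicative weight update: misclassified examples gain weight.
        D *= np.exp(-alpha * y * pred)
        D /= D.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(stumps, alphas, X):
    """Weighted-majority vote of the weak hypotheses."""
    score = sum(a * s.predict(X) for s, a in zip(stumps, alphas))
    return np.sign(score)

The point to remember for the talk is the multiplicative weight update: examples the current stump misclassifies get their weight multiplied up, so later stumps concentrate on them.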
My talk is based on the following two papers:

1.
Title : A decision-theoretic generalization of on-line learning and an application to boosting
Author : Yoav Freund and Robert E. Schapire, AT&T Labs
This paper appears in : Journal of Computer and System Sciences, 55(1):110-139, August 1997.
Abstract :
In the first part of the paper we consider the problem of dynamically apportioning resources among a set of options in a worst-case on-line framework. The model we study can be interpreted as a broad, abstract extension of the well-studied on-line prediction model to a general decision-theoretic setting. We show that the multiplicative weight-update rule of Littlestone and Warmuth can be adapted to this model yielding bounds that are slightly weaker in some cases, but applicable to a considerably more general class of learning problems. We show how the resulting learning algorithm can be applied to a variety of problems, including gambling, multiple-outcome prediction, repeated games and prediction of points in R^n. In the second part of the paper we apply the multiplicative weight-update technique to derive a new boosting algorithm. This boosting algorithm does not require any prior knowledge about the performance of the weak learning algorithm. We also study generalizations of the new boosting algorithm to the problem of learning functions whose range, rather than being binary, is an arbitrary finite set or a bounded segment of the real line.
link
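For reference, the heart of the boosting algorithm in this paper, as I read it, is a multiplicative weight update over the training examples. In LaTeX notation it is roughly:

w_i^{(t+1)} = w_i^{(t)} \, \beta_t^{\,1 - |h_t(x_i) - y_i|},
\qquad \beta_t = \frac{\epsilon_t}{1 - \epsilon_t},
\qquad \epsilon_t = \sum_i p_i^{(t)} \, |h_t(x_i) - y_i|,

where p^{(t)} is the normalized weight vector handed to the weak learner, and the final hypothesis is a weighted vote of the h_t with weights \log(1/\beta_t).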

2.
Title : Improved Boosting Algorithms Using Confidence-rated Predictions
Author : Robert E. Schapire and Yoram Singer, AT&T Labs
This paper appears in : Machine Learning, 37(3):297-336, 1999.
Abstract :
We describe several improvements to Freund and Schapire's AdaBoost boosting algorithm, particularly in a setting in which hypotheses may assign confidences to each of their predictions. We give a simplified analysis of AdaBoost in this setting, and we show how this analysis can be used to find improved parameter settings as well as a refined criterion for training weak hypotheses. We give a specific method for assigning confidences to the predictions of decision trees, a method closely related to one used by Quinlan. This method also suggests a technique for growing decision trees which turns out to be identical to one proposed by Kearns and Mansour. We focus next on how to apply the new boosting algorithms to multiclass classification problems, particularly to the multi-label case in which each example may belong to more than one class. We give two boosting methods for this problem, plus a third method based on output coding. One of these leads to a new method for handling the single-label case which is simpler but as effective as techniques suggested by Freund and Schapire. Finally, we give some experimental results comparing a few of the algorithms discussed in this paper.
link
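As far as I understand this paper, the multiclass/multi-label method I plan to present (AdaBoost.MH) reduces the problem to binary AdaBoost over (example, label) pairs. Below is a very rough sketch of that reduction; the helper names and the weak_learner interface are hypothetical placeholders of my own, not an API from the paper.

import numpy as np

def adamh_train(X, Y, labels, weak_learner, n_rounds=50):
    """Sketch of an AdaBoost.MH-style reduction (my reading of the paper).

    X : n training examples; Y : list of label sets; labels : the k classes.
    weak_learner(X, targets, weights) is a placeholder that must return an object h
    whose h.predict(X) gives an (n, k) array of values in {-1, +1}.
    """
    n, k = len(X), len(labels)
    # One binary target per (example, label) pair: +1 if the label applies to the example.
    Yb = np.array([[1.0 if lab in Y[i] else -1.0 for lab in labels] for i in range(n)])
    D = np.full((n, k), 1.0 / (n * k))        # distribution over (example, label) pairs
    hyps, alphas = [], []
    for _ in range(n_rounds):
        h = weak_learner(X, Yb, D)            # trained on the pair problem
        pred = h.predict(X)                   # shape (n, k), values in {-1, +1}
        eps = float(np.sum(D[pred != Yb]))    # weighted error over pairs
        if eps >= 0.5:
            break
        eps = max(eps, 1e-10)
        alpha = 0.5 * np.log((1 - eps) / eps)
        D *= np.exp(-alpha * Yb * pred)       # same multiplicative update, per pair
        D /= D.sum()
        hyps.append(h)
        alphas.append(alpha)
    return hyps, alphas

def adamh_predict(hyps, alphas, X, labels):
    """Single-label prediction: pick the label with the highest combined score."""
    score = sum(a * h.predict(X) for h, a in zip(hyps, alphas))   # shape (n, k)
    return [labels[j] for j in np.argmax(score, axis=1)]

The update is the same multiplicative rule as in the binary case; the only change is that the distribution lives on (example, label) pairs instead of examples.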
