The Dynamics of AdaBoost

Feb 25, 2007 · 24,780 views

One of the most successful and popular learning algorithms is AdaBoost, a classification algorithm designed to construct a "strong" classifier from a "weak" learning algorithm. Shortly after AdaBoost's development nine years ago, scientists derived margin-based generalization bounds to explain its unexpectedly good performance. Their result predicts that AdaBoost yields the best possible performance if it always achieves a "maximum margin" solution. Yet does AdaBoost achieve a maximum margin solution? Empirical and theoretical studies conducted since then conjectured the answer to be "yes." To answer this question, we look to AdaBoost's dynamics: we simplify AdaBoost to reveal a nonlinear iterated map, and we analyze its convergence in cases where cyclic behavior is found. This cyclic behavior provides the key to answering whether AdaBoost always maximizes the margin, and the answer turns out to be the opposite of what was thought to be true! In this talk, I will introduce AdaBoost, describe our analysis of AdaBoost viewed as a dynamical system, briefly mention a new boosting algorithm that always maximizes the margin with a fast convergence rate, and, if time permits, reveal a surprising new result about AdaBoost and the problem of bipartite ranking.
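As background for the talk, the standard AdaBoost round structure can be sketched in a few lines. This is a generic textbook sketch, not the speaker's code: the 1-D dataset, the threshold-stump weak learner, and all function names here are illustrative assumptions.

```python
import math

def adaboost(X, y, T):
    """Textbook AdaBoost on sorted 1-D data with decision stumps
    h(x) = p if x < theta else -p. Returns a list of (alpha, theta, p)."""
    n = len(X)
    w = [1.0 / n] * n                                  # uniform initial weights
    thresholds = [x - 0.5 for x in X] + [X[-1] + 0.5]  # midpoint candidates
    ensemble = []
    for _ in range(T):
        # weak learning step: pick the stump with least weighted error
        best = None
        for theta in thresholds:
            for p in (1, -1):
                preds = [p if x < theta else -p for x in X]
                eps = sum(wi for wi, yi, hi in zip(w, y, preds) if yi != hi)
                if best is None or eps < best[0]:
                    best = (eps, theta, p, preds)
        eps, theta, p, preds = best
        # vote weight; the max() guards the degenerate eps = 0 case
        alpha = 0.5 * math.log((1 - eps) / max(eps, 1e-12))
        ensemble.append((alpha, theta, p))
        # reweight: misclassified points gain weight, correct ones lose it
        w = [wi * math.exp(-alpha * yi * hi) for wi, yi, hi in zip(w, y, preds)]
        z = sum(w)                                     # normalization constant
        w = [wi / z for wi in w]
    return ensemble

def predict(ensemble, x):
    """Strong classifier: sign of the alpha-weighted vote of the stumps."""
    s = sum(a * (p if x < theta else -p) for a, theta, p in ensemble)
    return 1 if s >= 0 else -1

X = [0, 1, 2, 3, 4, 5]     # the labels flip twice, so no single
y = [1, 1, -1, -1, 1, 1]   # stump can separate this data
model = adaboost(X, y, T=3)
print([predict(model, xi) for xi in X])  # matches y after three rounds
```

It is exactly this per-round reweighting map on `w`, iterated over rounds, that the talk studies as a nonlinear dynamical system.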

Except where otherwise noted, content on this site is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 4.0 International license.