The Dynamics of AdaBoost

Published on Feb 4, 2025 · 24,725 views

One of the most successful and popular learning algorithms is AdaBoost, a classification algorithm designed to construct a "strong" classifier from a "weak" learning algorithm. Just after the …
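As a concrete reference for the algorithm the talk analyzes, here is a minimal AdaBoost sketch in Python (NumPy), assuming one-dimensional decision stumps as the weak learner; the stump-based weak learner and all function names are illustrative choices, not taken from the talk.

# Minimal AdaBoost sketch, assuming decision stumps as the weak learner.
# Illustrative only; the talk does not prescribe this implementation.
import numpy as np

def stump_predict(x, threshold, sign):
    # Weak classifier h(x) = sign if x > threshold, else -sign.
    return sign * np.where(x > threshold, 1.0, -1.0)

def best_stump(x, y, d):
    # Pick the stump with the smallest weighted error under distribution d.
    best_err, best_threshold, best_sign = np.inf, 0.0, 1.0
    for threshold in np.unique(x):
        for sign in (1.0, -1.0):
            err = np.sum(d * (stump_predict(x, threshold, sign) != y))
            if err < best_err:
                best_err, best_threshold, best_sign = err, threshold, sign
    return best_err, best_threshold, best_sign

def adaboost(x, y, rounds=50):
    m = len(y)
    d = np.full(m, 1.0 / m)             # distribution over training examples
    stumps, alphas = [], []
    for _ in range(rounds):
        err, threshold, sign = best_stump(x, y, d)
        r = 1.0 - 2.0 * err             # the "edge"; assumes 0 < err < 1/2
        alpha = 0.5 * np.log((1.0 + r) / (1.0 - r))
        h = stump_predict(x, threshold, sign)
        d = d * np.exp(-alpha * y * h)  # up-weight misclassified examples
        d = d / d.sum()                 # renormalize to a distribution
        stumps.append((threshold, sign))
        alphas.append(alpha)
    # Combined classifier: sign(sum_t alpha_t * h_t(x)).
    return stumps, alphas

In the talk's notation, the vector of accumulated alpha coefficients plays roughly the role of λ, whose final value λ_final determines the margin question raised below.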

Presentation

Dynamics of AdaBoost (00:02)
A Story about AdaBoost (00:12)
The question remained (until recently): Does AdaBoost maximize the margin? (02:08)
Overview of Talk (06:39)
A Sample Problem (07:32)
Say you have a database of news articles… (07:34)
Examples of Classification Tasks (08:30)
Examples of classification algorithms (08:55)
Training Data: {(x_i, y_i)}, i = 1, …, m, where (x_i, y_i) is chosen i.i.d. from an unknown probability distribution on X × {-1, 1} (09:15)
How do we construct a classifier? (09:31)
Say we have a “weak” learning algorithm (09:43)
Boosting algorithms combine weak classifiers in a meaningful way (Schapire ’89) (10:03)
AdaBoost (Freund and Schapire ’96) (11:14)
AdaBoost (12:57)
Does AdaBoost choose λ_final so that the margin µ(f) is maximized? That is, does AdaBoost maximize the margin? No! (18:17)
About the proof… (20:11)
Analyzing AdaBoost using Dynamical Systems (20:28)
Smallest Non-Trivial Case (21:13)
TITLE (22:24)
Two possible stable cycles! (24:57)
Generalization of smallest non-trivial case (27:11)
Empirically Observed Cycles (29:08)
If AdaBoost cycles, we can calculate the margin it will asymptotically converge to in terms of the edge values (31:43)
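The last chapter's claim has a closed form in the published version of this work (Rudin, Daubechies, and Schapire, "The Dynamics of AdaBoost: Cyclic Behavior and Convergence of Margins", JMLR 2004). The LaTeX below is a sketch reconstructed from that setting, not quoted from the slides: α_t is AdaBoost's step size and r_t the edge of the weak classifier chosen at step t.

% If the weight vectors repeat with period T, every example that keeps
% nonzero weight satisfies y_i sum_t alpha_t h_t(x_i) = -sum_t ln Z_t
% with Z_t = sqrt(1 - r_t^2), so all of them share one asymptotic margin:
\[
  \alpha_t = \frac{1}{2}\ln\frac{1+r_t}{1-r_t},
  \qquad
  \mu_{\mathrm{cycle}}
  = \frac{\sum_{t=1}^{T}\ln\bigl(1/(1-r_t^{2})\bigr)}
         {\sum_{t=1}^{T}\ln\bigl((1+r_t)/(1-r_t)\bigr)}.
\]

Plugging the edge values of an observed cycle into this ratio gives the margin that AdaBoost converges to, which is how the cyclic analysis answers the margin question raised at 18:17.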