Classification
MLP
Multilayer Perceptron (French: Perceptron Multi-Couches, PMC)
Gradient Backpropagation
(French: Rétropropagation du Gradient)
Stochastic
with Inertia
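As a minimal sketch (plain Python, hypothetical names): one stochastic gradient step with inertia, i.e. a momentum term that carries over a fraction of the previous update:
<code python>
def sgd_momentum_step(w, v, grad, lr=0.01, beta=0.9):
    """One stochastic gradient step with inertia (momentum).

    w    -- current weights (float or numpy array)
    v    -- velocity, i.e. the previous update direction
    grad -- gradient of the loss on a single training example
    beta -- inertia coefficient; beta = 0 gives plain stochastic descent
    """
    v = beta * v - lr * grad   # keep a fraction of the last step
    return w + v, v
</code>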
Simulated Annealing
(French: Recuit Simulé)
Newton
Quick Def
Second-order method (uses the Hessian as well as the gradient).
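A minimal sketch of one Newton step, assuming the gradient and Hessian of the loss are available (hypothetical names):
<code python>
import numpy as np

def newton_step(w, grad, hess):
    """One Newton step: w <- w - H^{-1} g.

    Uses second-order (curvature) information from the Hessian,
    unlike plain gradient descent which uses only the gradient.
    """
    return w - np.linalg.solve(hess, grad)
</code>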
RBFNN
Radial Basis Function Neural Networks
- k-means to place the centers, then gradient descent (sketched below)
- incremental addition of neurons, then an exact method
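A minimal sketch of the first recipe, assuming scikit-learn for the k-means step and Gaussian basis functions of fixed width (hypothetical names; a sketch, not a reference implementation):
<code python>
import numpy as np
from sklearn.cluster import KMeans  # assumption: scikit-learn available

def fit_rbfnn(X, y, n_centers=10, sigma=1.0, lr=0.01, epochs=200):
    """Recipe 1: place the centers with k-means, then train the
    linear output layer by gradient descent on the squared error."""
    centers = KMeans(n_clusters=n_centers, n_init=10).fit(X).cluster_centers_
    # Gaussian radial activations, shape (n_samples, n_centers)
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    phi = np.exp(-d2 / (2 * sigma ** 2))
    w = np.zeros(n_centers)
    for _ in range(epochs):
        err = phi @ w - y               # residuals
        w -= lr * phi.T @ err / len(y)  # gradient of the mean squared error
    return centers, w
</code>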
SVM
Support Vector Machine
Decision tree
(French: arbre de décision)
- ID3 (based on entropy)
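A minimal sketch of the entropy computation that drives ID3's greedy split choice (numpy arrays assumed, hypothetical helper names):
<code python>
import numpy as np

def entropy(labels):
    """Shannon entropy of a label array, in bits."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def information_gain(labels, attribute_values):
    """Entropy reduction from splitting on one attribute;
    ID3 picks the attribute with the largest gain at each node."""
    gain = entropy(labels)
    for v in np.unique(attribute_values):
        subset = labels[attribute_values == v]
        gain -= len(subset) / len(labels) * entropy(subset)
    return gain
</code>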
k-nearest neighbors
(French: k plus proches voisins)
Boosting
[Freund,Schapire, 1995]
Boosting by majority
AdaBoost
ADAptive BOOSTing, [Freund,Schapire, 1996]
The first and standard version is referred to as Discrete AdaBoost.
Quick Def
Greedy approach: weak classifiers are added one at a time, each trained on the reweighted sample (see the sketch below).
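A minimal sketch of that greedy loop in the binary case (labels in {-1,+1}; train_weak(X, y, D) is a hypothetical weak learner that fits under sample weights D; a sketch, not the algorithm as stated in the papers cited below):
<code python>
import numpy as np

def discrete_adaboost(X, y, train_weak, T=50):
    """Greedy stagewise selection: each round adds one weak
    classifier trained on the reweighted sample."""
    n = len(y)
    D = np.full(n, 1.0 / n)               # uniform initial weights
    ensemble = []
    for _ in range(T):
        h = train_weak(X, y, D)           # weak learner under weights D
        pred = h(X)                       # predictions in {-1, +1}
        eps = D[pred != y].sum()          # weighted error
        if eps >= 0.5:                    # no better than chance: stop
            break
        alpha = 0.5 * np.log((1 - eps) / eps)
        D *= np.exp(-alpha * y * pred)    # up-weight the mistakes
        D /= D.sum()
        ensemble.append((alpha, h))
    return lambda Xq: np.sign(sum(a * h(Xq) for a, h in ensemble))
</code>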
Discrete AdaBoost
[Freund,Schapire, 1996]
References
Full Definition
Real AdaBoost
[Friedman,Hastie,Tibshirani, 1998]
References
LogitBoost
[Friedman,Hastie,Tibshirani, 1998]
References
Gentle AdaBoost
[Friedman,Hastie,Tibshirani, 1998]
References
FloatBoost
Objective
AdaBoost is a sequential forward search procedure that uses a greedy selection strategy to minimize a margin-based cost on the training set. A crucial heuristic assumption behind such a sequential forward search is monotonicity: adding a new weak classifier to the current set must not decrease the value of the performance criterion. The premise of the sequential procedure in AdaBoost breaks down when this assumption is violated. Floating Search is a sequential feature selection procedure with backtracking, designed to cope with non-monotonic criterion functions.
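A minimal sketch of the floating-search idea in the abstract (criterion is a hypothetical cost over a set of selected weak classifiers, lower is better): each greedy forward step is followed by backward steps that drop earlier members whenever that improves the criterion, which is exactly the correction the monotonicity assumption makes unnecessary.
<code python>
def floating_search(candidates, criterion, T=50, max_rounds=1000):
    """Sequential forward selection with conditional backtracking.

    candidates -- pool of weak classifiers (or features)
    criterion  -- cost of a selected subset; lower is better
    """
    selected = []
    for _ in range(max_rounds):           # safety cap for the sketch
        if len(selected) >= T:
            break
        # Forward step: greedily add the best remaining candidate.
        pool = [c for c in candidates if c not in selected]
        if not pool:
            break
        best = min(pool, key=lambda c: criterion(selected + [c]))
        selected.append(best)
        # Backward steps: drop an earlier member (never the one just
        # added) while doing so strictly improves the criterion.
        improved = True
        while improved and len(selected) > 1:
            improved = False
            for c in selected[:-1]:
                trial = [s for s in selected if s is not c]
                if criterion(trial) < criterion(selected):
                    selected = trial
                    improved = True
                    break
    return selected
</code>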
Full Definition
AdaBoost.Reg
[Freund,Schapire, 1997]
Objective
An extension of AdaBoost to regression problems
References
Multiclass AdaBoost.M1
[Freund,Schapire, 1997]
Objective
Basic extension of AdaBoost to multiclass problems. A weak classifier needs an error rate of less than 1/2, which is much stronger than random guessing (accuracy 1/k) and often too difficult to achieve.
Quick Def
A weak classifier assigns a label in {1,…,k} to each example.
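In the notation of [Freund,Schapire, 1997]: with weighted error $\epsilon_t < 1/2$, set $\beta_t = \epsilon_t / (1 - \epsilon_t)$, multiply the weight of each correctly classified example by $\beta_t$ (then renormalize), and predict with

$$ H(x) = \arg\max_{y \in \{1,\dots,k\}} \sum_{t : h_t(x) = y} \log \frac{1}{\beta_t} $$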
References
Full Definition
Multiclass AdaBoost.M2
[Freund,Schapire, 1997]
Objective
Tries to overcome the difficulty of AdaBoost.M1 by extending the communication between the boosting algorithm and the weak learner: the algorithm focuses not only on hard instances but also on classes that are hard to distinguish.
Quick Def
A weak classifier assigns a vector in [0,1]^k to each example, and the pseudo-loss also weights, for each example, how the weak classifier performs over the different classes.
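Written out (notation of [Freund,Schapire, 1997]; $D_t$ is a distribution over pairs $(i, y)$ of examples and incorrect labels $y \neq y_i$), the pseudo-loss of round $t$ is

$$ \epsilon_t = \frac{1}{2} \sum_{(i,y) : y \neq y_i} D_t(i,y) \big( 1 - h_t(x_i, y_i) + h_t(x_i, y) \big) $$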
References
Full Definition
Multilabel AdaBoost.MR
[Schapire,Singer, 1998]
References
Multilabel AdaBoost.MH
[Schapire,Singer, 1998]
References
Multiclass AdaBoost.MO
[Schapire,Singer, 1998]
References
Multiclass AdaBoost.OC
[Schapire, 1997]
References
Multiclass AdaBoost.ECC
[Guruswami,Sahai, 1999]
References
AdaBoost.M1W
[Eibl,Pfeiffer, 2002]
GrPloss
[Eibl,Pfeiffer, 2003]
References
BoostMA
[Eibl,Pfeiffer, 2003]
References
SAMME
Stagewise Additive Modeling using a Multi-class Exponential loss function, [Zhu,Rosset,Zou, 2006]
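SAMME's key change to the AdaBoost.M1 recipe is an extra $\log(K-1)$ term in the classifier weight, which relaxes the weak-learner requirement from accuracy above $1/2$ to accuracy above $1/K$ (random guessing over $K$ classes):

$$ \alpha_t = \log \frac{1 - \epsilon_t}{\epsilon_t} + \log(K - 1) $$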
References
GAMBLE
Gentle Adaptive Multiclass Boosting Learning
UBoost
Quick Def
Uneven (asymmetric) loss function + greedy selection
LPBoost
Objective
Not greedy: computes the weak-classifier weights exactly, by linear programming.
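As a sketch of the usual soft-margin formulation ($h_j$ the weak hypotheses, $\alpha_j$ their weights, $\rho$ the margin, $\xi_i$ slack variables, $D$ a trade-off constant), the linear program being solved is

$$ \max_{\alpha, \rho, \xi} \; \rho - D \sum_{i=1}^{n} \xi_i \quad \text{s.t.} \quad y_i \sum_j \alpha_j h_j(x_i) \ge \rho - \xi_i, \quad \sum_j \alpha_j = 1, \quad \alpha \ge 0, \; \xi \ge 0 $$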
References
TotalBoost
TOTALly corrective BOOSTing, [Warmuth,Liao,Ratsch, 2006]
References
RotBoost
[Li,Abu-Mostafa,Pratap, 2003]
References
alphaBoost
[Li,Abu-Mostafa,Pratap, 2003]
References
CGBoost
Conjugate Gradient BOOSTing, [Li,Abu-Mostafa,Pratap, 2003]
References
Bootstrap Aggregating
Cascades of detectors
Trees of detectors
Regression
- MLP (Multilayer Perceptron)
- RBFNN (Radial Basis Function Neural Network)
- SVR (Support Vector Regressor)