====== Classification ======

===== MLP =====
Multi-Layer Perceptron, //PMC (Perceptron Multi-Couches)//
====Gradient Backpropagation====
//Rétropropagation du gradient//
===Stochastic===
===with Inertia (momentum)===
===Simulated Annealing===
//Recuit Simulé//
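
A minimal sketch of the stochastic update with an inertia term, for a one-hidden-layer network; the toy data, layer sizes, and learning constants are assumptions, not taken from any reference on this page:
<code python>
import numpy as np

# Stochastic gradient backpropagation with inertia (momentum) for a
# one-hidden-layer MLP; toy data and all sizes are assumptions.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float).reshape(-1, 1)

W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros((1, 8))
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros((1, 1))
params = [W1, b1, W2, b2]
vel = [np.zeros_like(p) for p in params]
lr, mu = 0.5, 0.9  # step size and inertia coefficient

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(100):
    for i in rng.permutation(len(X)):        # stochastic: one example per step
        x, t = X[i:i+1], y[i:i+1]
        h = sigmoid(x @ W1 + b1)             # forward pass
        out = sigmoid(h @ W2 + b2)
        d_out = (out - t) * out * (1 - out)  # backpropagate squared error
        d_h = (d_out @ W2.T) * h * (1 - h)
        grads = [x.T @ d_h, d_h, h.T @ d_out, d_out]
        for p, v, g in zip(params, vel, grads):
            v *= mu; v -= lr * g             # inertia: keep part of last step
            p += v
</code>
With ''mu = 0'' this reduces to plain stochastic backpropagation.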

====Newton====
==Quick Def==
Second-order method: uses the Hessian (curvature) as well as the gradient, as sketched below.
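
For contrast with the first-order gradient steps above, one Newton step on a convex quadratic toy loss (the matrix and vector are purely illustrative):
<code python>
import numpy as np

# One Newton step: w <- w - H^{-1} g, with g and H the gradient and
# Hessian of the loss at w.
A = np.array([[3.0, 1.0], [1.0, 2.0]])  # SPD matrix => convex quadratic
b = np.array([1.0, -1.0])
loss = lambda w: 0.5 * w @ A @ w - b @ w

w = np.zeros(2)
g = A @ w - b                            # gradient at w
H = A                                    # Hessian (constant for a quadratic)
w = w - np.linalg.solve(H, g)            # Newton step
print(loss(w))                           # the quadratic is minimized in one step
</code>
On a non-quadratic loss the step is repeated; near the optimum convergence is much faster than gradient descent, at the cost of forming and solving with the Hessian.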

=====RBFNN=====
Radial Basis Function Neural Networks
  * __k-means then gradient descent__ (the center-then-weights recipe is sketched below)
  * __incremental addition of neurons then exact method__
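
A sketch under stated assumptions (toy data, a simple width heuristic, NumPy): centers come from k-means as in the first recipe, but the linear output weights are then obtained by a direct least-squares solve rather than gradient descent:
<code python>
import numpy as np

# RBF network: centers from k-means, a shared width from center spacing,
# linear output weights by least squares. Data and sizes are assumptions.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 2))
y = np.sin(X[:, 0]) + X[:, 1] ** 2       # toy regression target

def kmeans(X, k, iters=50):
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        assign = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(assign == j):
                centers[j] = X[assign == j].mean(axis=0)
    return centers

k = 10
centers = kmeans(X, k)
sigma = np.mean(np.linalg.norm(centers[:, None] - centers, axis=-1))
Phi = np.exp(-((X[:, None] - centers) ** 2).sum(-1) / (2 * sigma ** 2))
design = np.hstack([Phi, np.ones((len(X), 1))])  # RBF activations + bias
w, *_ = np.linalg.lstsq(design, y, rcond=None)   # "exact" linear solve
pred = design @ w
</code>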

=====SVM=====
Support Vector Machine

=====Decision tree=====
//arbre de décision//
  * __ID3__ (based on entropy; the information-gain criterion is sketched below)
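
ID3 grows the tree by choosing, at each node, the attribute with the largest information gain; a sketch of that entropy-based criterion (the toy arrays are illustrative):
<code python>
import numpy as np

# Entropy and information gain, the splitting criterion ID3 is based on.
def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

def information_gain(attr, labels):
    # H(labels) - sum_v P(attr = v) * H(labels | attr = v)
    gain = entropy(labels)
    for v in np.unique(attr):
        mask = attr == v
        gain -= mask.mean() * entropy(labels[mask])
    return gain

labels = np.array([0, 0, 1, 1])
print(information_gain(np.array([0, 0, 1, 1]), labels))  # 1.0 bit: perfect split
print(information_gain(np.array([0, 1, 0, 1]), labels))  # 0.0 bit: useless split
</code>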

=====k-nearest neighbors=====
//k plus proches voisins//
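
A minimal sketch of the rule itself: classify a query by majority vote among its k nearest training examples (the data and the choice of k are illustrative):
<code python>
import numpy as np

# k-nearest neighbors: label a query by majority vote among the k
# closest training examples (Euclidean distance).
def knn_predict(X_train, y_train, x, k=3):
    d = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(d)[:k]
    values, counts = np.unique(y_train[nearest], return_counts=True)
    return values[np.argmax(counts)]

X_train = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y_train = np.array([0, 0, 1, 1])
print(knn_predict(X_train, y_train, np.array([0.95, 0.9])))  # -> 1
</code>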

=====Boosting=====
[Freund, Schapire]

====Boosting by majority====
[Freund, 1995]

====AdaBoost====
ADAptive BOOSTing, [Freund, Schapire, 1997]

The first and standard version is referred to as [[#discrete_adaboost|Discrete AdaBoost]].

==Quick Def==
Greedy approach: at each round, re-weight the examples and add the weak classifier with the lowest weighted error.

===Discrete AdaBoost===
[Freund, Schapire, 1997]
==References==
  - [[|1997, Freund & Schapire, A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting]]
==Full Definition==
{{ adaboost-discrete.png }}
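
As a complement to the figure, a sketch of the Discrete AdaBoost loop with threshold stumps as weak learners; the stump search and the ''{-1,+1}'' label convention are implementation assumptions:
<code python>
import numpy as np

# Discrete AdaBoost with threshold stumps: re-weight examples, pick the
# stump with lowest weighted error, weigh it by alpha = 0.5*log((1-e)/e).
def fit_stump(X, y, w):
    best = (None, np.inf)  # ((feature, threshold, sign), weighted error)
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            for s in (1, -1):
                pred = s * np.sign(X[:, f] - t + 1e-12)
                err = w[pred != y].sum()
                if err < best[1]:
                    best = ((f, t, s), err)
    return best

def adaboost(X, y, T=20):          # y must be in {-1, +1}
    w = np.full(len(X), 1.0 / len(X))
    ensemble = []
    for _ in range(T):
        (f, t, s), err = fit_stump(X, y, w)
        err = max(err, 1e-12)
        alpha = 0.5 * np.log((1 - err) / err)
        pred = s * np.sign(X[:, f] - t + 1e-12)
        w *= np.exp(-alpha * y * pred)      # up-weight the mistakes
        w /= w.sum()
        ensemble.append((alpha, f, t, s))
    return ensemble

def predict(ensemble, X):
    score = sum(a * s * np.sign(X[:, f] - t + 1e-12)
                for a, f, t, s in ensemble)
    return np.sign(score)
</code>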

===Real AdaBoost===
[Friedman, Hastie, Tibshirani, 1998]
==References==
  - [[|1998, Friedman, Hastie & Tibshirani, Additive Logistic Regression: a Statistical View of Boosting]]

===LogitBoost===
[Friedman, Hastie, Tibshirani, 1998]
==References==
  - [[|1998, Friedman, Hastie & Tibshirani, Additive Logistic Regression: a Statistical View of Boosting]]

===Gentle AdaBoost===
[Friedman, Hastie, Tibshirani, 1998]
==References==
  - [[|1998, Friedman, Hastie & Tibshirani, Additive Logistic Regression: a Statistical View of Boosting]]

===FloatBoost===
[Li, Zhang, 2004]
==Objective==
AdaBoost is a sequential forward search procedure that uses a greedy selection strategy to minimize a margin-based cost on the training set. A crucial heuristic assumption behind such a forward search is monotonicity: adding a new weak classifier to the current set must not decrease the performance criterion. The premise of the sequential procedure breaks down when this assumption is violated. Floating Search is a sequential feature-selection procedure with backtracking that copes with non-monotonic criteria; FloatBoost incorporates it into AdaBoost by removing previously added weak classifiers whenever that improves the criterion (see the sketch below).
==Full Definition==
{{ floatboost.png }}
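
A sketch of the backtracking (conditional exclusion) step alone, around an abstract criterion ''J'' where lower is better; weak-learner training and re-weighting proceed as in AdaBoost and are omitted here:
<code python>
def floatboost_step(ensemble, new_weak, J):
    """One FloatBoost-style step: add new_weak, then conditionally remove
    previously added weak classifiers while that lowers the criterion J."""
    ensemble = ensemble + [new_weak]
    while len(ensemble) > 1:
        current = J(ensemble)
        trials = [[g for g in ensemble if g is not h] for h in ensemble]
        best = min(trials, key=J)
        if J(best) < current:   # monotonicity violated: backtrack
            ensemble = best
        else:
            break
    return ensemble
</code>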

===AdaBoost.Reg===
[Freund, Schapire, 1997]
==Objective==
An extension of AdaBoost to regression problems.
==References==
  - [[|1997, Freund & Schapire, A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting]]

===Multiclass AdaBoost.M1===
[Freund, Schapire, 1997]
==Objective==
Basic extension of AdaBoost to multiclass problems. The weak classifier needs an error rate below 1/2, which is stronger than random guessing (1/k) and often too hard to obtain: for k = 10 classes random guessing already gives 90% error, yet M1 still demands below 50%.
==Quick Def==
A weak classifier associates to an example a label in ''{1, ..., k}''.
==References==
  - [[|1997, Freund & Schapire, A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting]]
==Full Definition==
{{ adaboost-m1.png }}

===Multiclass AdaBoost.M2===
[Freund, Schapire, 1997]
==Objective==
Tries to overcome the difficulty of AdaBoost.M1 by extending the communication between the boosting algorithm and the weak learner: the weak learner minimizes a pseudo-loss, so the algorithm focuses not only on hard instances but also on classes that are hard to distinguish.
==Quick Def==
A weak classifier associates to an example a vector in ''[0,1]^k''.
==References==
  - [[|1997, Freund & Schapire, A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting]]
==Full Definition==
{{ adaboost-m2.png }}

===Multilabel AdaBoost.MR===
[Schapire, Singer, 1998]
==References==
  - [[|1998, Schapire & Singer, Improved Boosting Algorithms Using Confidence-rated Predictions]]

===Multilabel AdaBoost.MH===
[Schapire, Singer, 1998]
==References==
  - [[|1998, Schapire & Singer, Improved Boosting Algorithms Using Confidence-rated Predictions]]
  - [[|1998]]

===Multiclass AdaBoost.MO===
[Schapire, Singer, 1998]
==References==
  - [[|1998, Schapire & Singer, Improved Boosting Algorithms Using Confidence-rated Predictions]]
  - [[|2006, Li, Multiclass Boosting with Repartitioning]]

===Multiclass AdaBoost.OC===
[Schapire, 1997]
==References==
  - [[|1997, Schapire, Using Output Codes to Boost Multiclass Learning Problems]]
  - [[|2006, Li, Multiclass Boosting with Repartitioning]]

===Multiclass AdaBoost.ECC===
[Guruswami, Sahai, 1999]
==References==
  - [[|1999, Guruswami & Sahai, Multiclass Learning, Boosting, and Error-Correcting Codes]]
  - [[|2006, Li, Multiclass Boosting with Repartitioning]]

===AdaBoost.M1W===
[Eibl, Pfeiffer, 2002]

===GrPloss===
[Eibl, Pfeiffer, 2003]
==References==
  - [[|2003, Eibl & Pfeiffer]]

===BoostMA===
[Eibl, Pfeiffer, 2003]
==References==
  - [[|2003, Eibl & Pfeiffer]]

===SAMME===
Stagewise Additive Modeling using a Multi-class Exponential loss function, [Zhu et al., 2006]

==References==
  - [[|2006, Zhu et al., Multi-class AdaBoost]]
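
SAMME's change relative to AdaBoost.M1 is the extra ''log(K-1)'' term in each weak classifier's weight, which relaxes the required accuracy from better than 1/2 to merely better than random guessing among K classes; a small sketch (the function name is illustrative):
<code python>
import numpy as np

# SAMME weight for a weak classifier with weighted error err on a
# K-class problem; the log(K-1) term is the change w.r.t. AdaBoost.M1.
def samme_alpha(err, K):
    return np.log((1 - err) / err) + np.log(K - 1)

# alpha stays positive as long as err < 1 - 1/K, i.e. the weak learner
# only has to beat random guessing among K classes.
print(samme_alpha(0.85, 10) > 0)  # True: 85% error is still usable when K = 10
</code>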

===GAMBLE===
Gentle Adaptive Multiclass Boosting Learning

====UBoost====

==Quick Def==
Uneven loss function + greedy

====LPBoost====
Linear Programming BOOSTing, [Demiriz, Bennett, Shawe-Taylor, 2000]

==Objective==
Not greedy: finds the exact optimum of a linear program over the ensemble weights (see the sketch below).

==References==
  - [[2000_Demiriz-Bennett-ShaweTaylor_Linear Programming Boosting via Column Generation.pdf|Local Copy]]
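
A sketch of the soft-margin LP being solved, written with ''scipy.optimize.linprog'' over a fixed, precomputed pool of weak-learner outputs; the matrix ''H'' and the penalty ''D'' are assumptions, and the column-generation part of the paper, which grows the pool on the fly, is omitted:
<code python>
import numpy as np
from scipy.optimize import linprog

# The soft-margin LP that LPBoost solves exactly (no greedy steps):
#   max  rho - D * sum(xi)   s.t.  y_i * (H @ a)_i >= rho - xi_i,
#        sum(a) = 1, a >= 0, xi >= 0
# H[i, j] = h_j(x_i) is assumed precomputed for a fixed pool of weak
# learners; y holds labels in {-1, +1}.
def lpboost_weights(H, y, D=0.1):
    n, m = H.shape
    yH = y[:, None] * H
    # Variables z = [a_1..a_m, rho, xi_1..xi_n]; minimize -rho + D*sum(xi).
    c = np.concatenate([np.zeros(m), [-1.0], D * np.ones(n)])
    A_ub = np.hstack([-yH, np.ones((n, 1)), -np.eye(n)])  # rho - yH a - xi <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.ones(m), [0.0], np.zeros(n)])[None, :]
    b_eq = [1.0]
    bounds = [(0, None)] * m + [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub, b_ub, A_eq, b_eq, bounds=bounds)
    return res.x[:m]   # convex combination weights over the weak learners
</code>
Real LPBoost does not enumerate all weak learners up front: column generation repeatedly asks the weak learner for the column that most violates the current dual constraints and re-solves.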

====TotalBoost====
TOTALly corrective BOOSTing, [Warmuth, Liao, Rätsch, 2006]
==References==
  - [[|2006, Warmuth, Liao & Rätsch, Totally Corrective Boosting Algorithms that Maximize the Margin]]

====RotBoost====
[Li, Abu-Mostafa, Pratap, 2003]

==References==
  - [[2003_Li-AbuMostafa-Pratap_CGBoost - Conjugate Gradient in Function Space.pdf|Local Copy]]

====alphaBoost====
[Li, Abu-Mostafa, Pratap, 2003]

==References==
  - [[2003_Li-AbuMostafa-Pratap_CGBoost - Conjugate Gradient in Function Space.pdf|Local Copy]]

====CGBoost====
Conjugate Gradient BOOSTing, [Li, Abu-Mostafa, Pratap, 2003]

==References==
  - [[2003_Li-AbuMostafa-Pratap_CGBoost - Conjugate Gradient in Function Space.pdf|Local Copy]]

====Bootstrap Aggregating====
//Bagging//, [Breiman, 1996]
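
Each member is trained on a bootstrap resample of the data and the ensemble aggregates by vote; a minimal sketch, assuming ''fit''/''predict'' callables for the base learner and ''{-1,+1}'' labels:
<code python>
import numpy as np

# Bootstrap aggregating: fit each base model on a resample drawn with
# replacement, then aggregate the predictions by majority vote.
def bag(fit, predict, X, y, n_models=25, seed=0):
    rng = np.random.default_rng(seed)
    models = []
    for _ in range(n_models):
        idx = rng.integers(0, len(X), size=len(X))  # bootstrap resample
        models.append(fit(X[idx], y[idx]))
    def vote(Xq):
        preds = np.stack([predict(m, Xq) for m in models])
        return np.sign(preds.sum(axis=0))           # majority vote, +/-1 labels
    return vote
</code>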

=====Cascades of detectors=====

=====Trees of detectors=====

====== Regression ======

  * **MLP (Multi-Layer Perceptron)**
  * **RBFNN (Radial Basis Function Neural Network)**
  * **SVR (Support Vector Regressor)**