====== Classification ======

===== MLP =====
Multi-Layer Perceptron, //PMC (Perceptron Multi-Couches)//

====Gradient Backpropagation====
//Rétropropagation du Gradient//

===Stochastic===

===with Inertia===

===Simulated Annealing===
//Recuit Simulé//

====Newton====
==Objective==
Converges faster than gradient descent (quadratic local convergence).
==Quick Def==
Second-order method (uses the Hessian).

=====RBFNN=====
Radial Basis Function Neural Networks

===First method===
==Objective==
You have to choose ''k'' (the number of centers) beforehand.
==Quick Def==
k-means, then gradient descent.

===Second method===
==Quick Def==
Incremental addition of neurons, then an exact method.

=====SVM=====
Support Vector Machine

=====Decision tree=====
//arbre de décision//

===ID3===
==Quick Def==
Based on entropy: at each node, split on the attribute with the highest information gain.
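As an illustration of the entropy criterion, a minimal sketch of the information-gain computation ID3 relies on (the toy data and helper names are invented for the example):

<code python>
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels, in bits."""
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def information_gain(examples, labels, attribute):
    """Entropy reduction obtained by splitting the examples on one attribute."""
    total = len(examples)
    split = {}
    for x, y in zip(examples, labels):
        split.setdefault(x[attribute], []).append(y)
    remainder = sum(len(ys) / total * entropy(ys) for ys in split.values())
    return entropy(labels) - remainder

# ID3 greedily picks, at each node, the attribute with the highest gain:
examples = [{"outlook": "sunny",    "windy": False},
            {"outlook": "rainy",    "windy": True},
            {"outlook": "sunny",    "windy": True},
            {"outlook": "overcast", "windy": False}]
labels = ["no", "yes", "no", "yes"]
best = max(("outlook", "windy"), key=lambda a: information_gain(examples, labels, a))
print(best, information_gain(examples, labels, best))
</code>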
=====k-nearest neighbors=====
//k plus proches voisins//

=====Boosting=====
[Freund,Schapire, 1995]
==Quick Def==
Consists of combining many weak classifiers to obtain a strong one.

====Boosting by majority====

====AdaBoost====
ADAptive BOOSTing, [Freund,Schapire, 1996]
The first and standard version is referred to as [[#Discrete_AdaBoost|Discrete AdaBoost]].
==Quick Def==
Greedy approach.

===Discrete AdaBoost===
[Freund,Schapire, 1996]
==References==
  - [[|1997,Freund-Schapire,A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting]] {{1997_Freund-Schapire_A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting.pdf|[Local Copy]}}
==Full Definition==
{{ adaboost-discrete.png }}
{{ adaboost-discrete_2.png }}
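To make the full definition above concrete, a minimal sketch of Discrete AdaBoost for binary labels in ''{-1,+1}''; the decision-stump weak learner and all helper names are illustrative assumptions, not part of the original formulation:

<code python>
import numpy as np

def train_stump(X, y, w):
    """Weighted decision stump: threshold on a single feature (illustrative weak learner)."""
    best = None
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for sign in (+1, -1):
                pred = np.where(X[:, j] <= thr, sign, -sign)
                err = np.sum(w[pred != y])
                if best is None or err < best[0]:
                    best = (err, j, thr, sign)
    return best  # (weighted error, feature index, threshold, polarity)

def stump_predict(stump, X):
    _, j, thr, sign = stump
    return np.where(X[:, j] <= thr, sign, -sign)

def discrete_adaboost(X, y, T=20):
    """Discrete AdaBoost: labels y must be in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)                 # uniform initial example weights
    ensemble = []
    for _ in range(T):
        stump = train_stump(X, y, w)
        pred = stump_predict(stump, X)
        err = max(np.sum(w[pred != y]), 1e-12)
        if err >= 0.5:                      # weak learner no better than chance: stop
            break
        alpha = 0.5 * np.log((1.0 - err) / err)
        w *= np.exp(-alpha * y * pred)      # boost the weights of misclassified examples
        w /= w.sum()
        ensemble.append((alpha, stump))
    return ensemble

def adaboost_predict(ensemble, X):
    score = sum(alpha * stump_predict(stump, X) for alpha, stump in ensemble)
    return np.sign(score)

# Toy usage
X = np.array([[0.1], [0.4], [0.35], [0.8], [0.9], [0.6]])
y = np.array([+1, +1, +1, -1, -1, -1])
model = discrete_adaboost(X, y, T=10)
print(adaboost_predict(model, X))
</code>

Each round re-weights the training set so that the next weak classifier concentrates on the examples misclassified so far; the final decision is a weighted vote of the weak classifiers.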
===Real AdaBoost===
[Friedman,Hastie,Tibshirani, 1998]
==References==
  - [[|1998,Friedman-Hastie-Tibshirani,Additive Logistic Regression - a Statistical View of Boosting]] {{1998_Friedman-Hastie-Tibshirani_Additive Logistic Regression - a Statistical View of Boosting.pdf|[Local Copy]}}
  - [[|1998,Schapire-Singer,Improved Boosting Algorithms Using Confidence-rated Predictions]] {{1998_Schapire-Singer_Improved Boosting Algorithms Using Confidence-rated Predictions.pdf|[Local Copy]}}
==Full Definition==
{{ adaboost-real.png }}

===LogitBoost===
[Friedman,Hastie,Tibshirani, 1998]
==References==
  - [[|1998,Friedman-Hastie-Tibshirani,Additive Logistic Regression - a Statistical View of Boosting]] {{1998_Friedman-Hastie-Tibshirani_Additive Logistic Regression - a Statistical View of Boosting.pdf|[Local Copy]}}
==Full Definition==
{{ logitboost.png }}
{{ logitboost-multi.png }}

===Gentle AdaBoost===
[Friedman,Hastie,Tibshirani, 1998]
==References==
  - [[|1998,Friedman-Hastie-Tibshirani,Additive Logistic Regression - a Statistical View of Boosting]] {{1998_Friedman-Hastie-Tibshirani_Additive Logistic Regression - a Statistical View of Boosting.pdf|[Local Copy]}}
==Full Definition==
{{ adaboost-gentle.png }}

===Probabilistic AdaBoost===
[Friedman,Hastie,Tibshirani, 1998]
==References==
  - [[|1998,Friedman-Hastie-Tibshirani,Additive Logistic Regression - a Statistical View of Boosting]] {{1998_Friedman-Hastie-Tibshirani_Additive Logistic Regression - a Statistical View of Boosting.pdf|[Local Copy]}}

===FloatBoost===
==Objective==
AdaBoost is a sequential forward search procedure that uses a greedy selection strategy to minimize a certain margin on the training set. A crucial heuristic assumption of such a sequential forward search is monotonicity, i.e. that adding a new weak classifier to the current set does not decrease the value of the performance criterion. The premise offered by the sequential procedure in AdaBoost breaks down when this assumption is violated. Floating Search is a sequential feature-selection procedure with backtracking, designed to deal with non-monotonic criterion functions for feature selection.
==Full Definition==
{{ floatboost.png }}

===AdaBoost.Reg===
[Freund,Schapire, 1997]
==Objective==
An extension of AdaBoost to regression problems.
==References==
  - [[|1997,Freund-Schapire,A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting]] {{1997_Freund-Schapire_A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting.pdf|[Local Copy]}}

===Multiclass AdaBoost.M1===
[Freund,Schapire, 1997]
==Objective==
Basic extension of AdaBoost to multiclass problems. The weak classifier needs an error rate below 1/2, which is a stronger requirement than merely beating random guessing (accuracy ''1/k'') and is often too difficult to obtain.
==Quick Def==
A weak classifier assigns to each example a single label among the ''k'' classes.
==References==
  - [[|1997,Freund-Schapire,A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting]] {{1997_Freund-Schapire_A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting.pdf|[Local Copy]}}
==Full Definition==
{{ adaboost-m1.png }}
{{ adaboost-m1_2.png }}

===Multiclass AdaBoost.M2===
[Freund,Schapire, 1997]
==Objective==
Tries to overcome the difficulty of AdaBoost.M1 by extending the communication between the boosting algorithm and the weak learner. The algorithm focuses not only on hard instances, but also on classes that are hard to distinguish.
==Quick Def==
A weak classifier associates to an example a vector in ''[0,1]^k'', and the pseudo-loss also takes into account weights reflecting the performance of the weak classifier on the different classes for the same example.
==References==
  - [[|1997,Freund-Schapire,A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting]] {{1997_Freund-Schapire_A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting.pdf|[Local Copy]}}
==Full Definition==
{{ adaboost-m2.png }}
{{ adaboost-m2_2.png }}

===Multilabel AdaBoost.MR===
[Schapire,Singer, 1998]
==References==
  - [[|1998,Schapire-Singer,Improved Boosting Algorithms Using Confidence-rated Predictions]] {{1998_Schapire-Singer_Improved Boosting Algorithms Using Confidence-rated Predictions.pdf|[Local Copy]}}
==Full Definition==
{{ adaboost-r.png }}

===Multilabel AdaBoost.MH===
[Schapire,Singer, 1998]
==References==
  - [[|1998,Schapire-Singer,Improved Boosting Algorithms Using Confidence-rated Predictions]] {{1998_Schapire-Singer_Improved Boosting Algorithms Using Confidence-rated Predictions.pdf|[Local Copy]}}
  - [[|1998,Friedman-Hastie-Tibshirani,Additive Logistic Regression - a Statistical View of Boosting]] {{1998_Friedman-Hastie-Tibshirani_Additive Logistic Regression - a Statistical View of Boosting.pdf|[Local Copy]}}
==Full Definition==
{{ adaboost-mh.png }}

===Multiclass AdaBoost.MO===
[Schapire,Singer, 1998]
==References==
  - [[|1998,Schapire-Singer,Improved Boosting Algorithms Using Confidence-rated Predictions]] {{1998_Schapire-Singer_Improved Boosting Algorithms Using Confidence-rated Predictions.pdf|[Local Copy]}}
  - [[|2006,Sun-Todorovic-Li,Unifying multi-class AdaBoost algorithms with binary base learners under the margin framework]] {{2006_Sun-Todorovic-Li_Unifying multi-class AdaBoost algorithms with binary base learners under the margin framework.pdf|[Local Copy]}}

===Multiclass AdaBoost.OC===
[Schapire, 1997]
==References==
  - [[|2006,Sun-Todorovic-Li,Unifying multi-class AdaBoost algorithms with binary base learners under the margin framework]] {{2006_Sun-Todorovic-Li_Unifying multi-class AdaBoost algorithms with binary base learners under the margin framework.pdf|[Local Copy]}}

===Multiclass AdaBoost.ECC===
[Guruswami,Sahai, 1999]
==References==
  - [[|2006,Sun-Todorovic-Li,Unifying multi-class AdaBoost algorithms with binary base learners under the margin framework]] {{2006_Sun-Todorovic-Li_Unifying multi-class AdaBoost algorithms with binary base learners under the margin framework.pdf|[Local Copy]}}

===AdaBoost.M1W===
[Eibl,Pfeiffer, 2002]

===GrPloss===
[Eibl,Pfeiffer, 2003]
==References==
  - [[|2003,Eibl-Pfeiffer,Multiclass-Boosting for Weak Classifiers]] {{2003_Eibl-Pfeiffer_Multiclass-Boosting for Weak Classifiers.pdf|[Local Copy]}}

===BoostMA===
[Eibl,Pfeiffer, 2003]
==References==
  - [[|2003,Eibl-Pfeiffer,Multiclass-Boosting for Weak Classifiers]] {{2003_Eibl-Pfeiffer_Multiclass-Boosting for Weak Classifiers.pdf|[Local Copy]}}

===SAMME===
Stagewise Additive Modeling using a Multi-class Exponential loss function, [Zhu,Rosset,Zou, 2006]
==References==
  - [[|2006,Zhu-Rosset-Zou,Multi-class AdaBoost]] {{2006_Zhu-Rosset-Zou_Multi-class AdaBoost.pdf|[Local Copy]}}

===GAMBLE===
Gentle Adaptive Multiclass Boosting Learning, [Huang,Ertekin,Song, 2005]
==References==
  - [[|2005,Huang-Ertekin-Song,Efficient Multiclass Boosting Classification with Active Learning]] {{2005_Huang-Ertekin-Song_Efficient Multiclass Boosting Classiffication with Active Learning.pdf|[Local Copy]}}

====UBoost====
==Quick Def==
Uneven loss function + greedy approach.

====LPBoost====
==Objective==
Not greedy; exact (linear programming solved by column generation).
==References==
  - {{2000_Demiriz-Bennett-ShaweTaylor_Linear Programming Boosting via Column Generation.pdf|[Local Copy]}}

====TotalBoost====
TOTALly corrective BOOSTing, [Warmuth,Liao,Ratsch, 2006]
==References==
  - [[|2006,Warmuth-Liao-Ratsch,Totally Corrective Boosting Algorithms that Maximize the Margin]] {{2006_Warmuth-Liao-Ratsch_Totally Corrective Boosting Algorithms that Maximize the Margin.pdf|[Local Copy]}}

====RotBoost====
[Li,Abu-Mostafa,Pratap, 2003]
==References==
  - {{2003_Li-AbuMostafa-Pratap_CGBoost - Conjugate Gradient in Function Space.pdf|[Local Copy]}}

====alphaBoost====
[Li,Abu-Mostafa,Pratap, 2003]
==References==
  - {{2003_Li-AbuMostafa-Pratap_CGBoost - Conjugate Gradient in Function Space.pdf|[Local Copy]}}

====MILBoost====
Multiple Instance Learning BOOSTing, [Viola,Platt, 2005]
==References==
  - [[|2005,Viola-Platt-Zhang,Multiple Instance Boosting for Object Detection]] {{2005_Viola-Platt-Zhang_Multiple Instance Boosting for Object Detection.pdf|[Local Copy]}}

====CGBoost====
Conjugate Gradient BOOSTing, [Li,Abu-Mostafa,Pratap, 2003]
==References==
  - {{2003_Li-AbuMostafa-Pratap_CGBoost - Conjugate Gradient in Function Space.pdf|[Local Copy]}}

====Bootstrap Aggregating====

=====Cascades of detectors=====
==Quick Def==
A cascade of classifiers is a degenerate decision tree in which, at each stage, a classifier is trained to detect almost all objects of interest while rejecting a certain fraction of the non-object patterns (e.g. each stage eliminates 50% of the non-object patterns while falsely rejecting 0.1% of the objects; after 20 stages one can expect a false alarm rate of 0.5^20 and a hit rate of 0.999^20). It makes it possible to focus attention on certain regions and dramatically increases speed.
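A quick numerical check of the rates quoted above, assuming the per-stage rates are independent and identical across the 20 stages:

<code python>
# Per-stage behaviour assumed in the example above: each stage keeps 99.9% of
# the objects (falsely rejects 0.1%) and rejects 50% of the non-object patterns.
stage_hit_rate = 0.999
stage_false_alarm_rate = 0.5
stages = 20

# Overall cascade rates = product of the per-stage rates
hit_rate = stage_hit_rate ** stages                    # ~0.98
false_alarm_rate = stage_false_alarm_rate ** stages    # ~9.5e-7

print(f"hit rate = {hit_rate:.4f}, false alarm rate = {false_alarm_rate:.2e}")
</code>

Most non-object patterns are rejected by the earliest (and cheapest) stages, which is where the speed-up comes from.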
=====Trees of detectors=====

====== Regression ======
  * **MLP (Multi-Layer Perceptron)**
  * **RBFNN (Radial Basis Function Neural Network)**
  * **SVR (Support Vector Regressor)**