====== Classification ======


===== MLP =====

Multilayer Perceptron, //PMC (Perceptron Multi-Couches)//

====Gradient Backpropagation====

//Rétropropagation du Gradient//

===Stochastic===

===with Inertia===

===Simulated Annealing===

//Recuit Simulé//
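
As a rough illustration of the first two variants above, here is a minimal sketch of a stochastic update with inertia (momentum) for a single linear neuron; the toy data, learning rate ''eta'' and inertia coefficient ''mu'' are assumptions made for the example, not prescriptions:

<code python>
import numpy as np

# Toy data: 100 examples, 3 features, real-valued targets.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(100, 3)), rng.normal(size=100)

w = np.zeros(3)      # weights of one linear neuron
v = np.zeros(3)      # "inertia" buffer: remembers the previous step
eta, mu = 0.01, 0.9  # learning rate and inertia coefficient (assumed values)

for epoch in range(10):
    for i in rng.permutation(len(X)):    # stochastic: one example at a time
        grad = (w @ X[i] - y[i]) * X[i]  # gradient of the squared error on example i
        v = mu * v - eta * grad          # inertia: blend the new gradient with the old step
        w = w + v
</code>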
  
====Newton====

==Objective==

Converges faster than gradient descent.

==Quick Def==

Second-order method: it uses the Hessian of the error, not just its gradient.
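
A hedged sketch of the second-order idea: a Newton step rescales the gradient by the inverse Hessian, so on the illustrative quadratic error below (a stand-in for a real network error, not this page's training setup) it reaches the minimum in a single step, where gradient descent would need many:

<code python>
import numpy as np

# Illustrative quadratic error E(w) = 0.5 w^T A w - b^T w (stand-in assumption).
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 1.0])

w = np.zeros(2)
grad = A @ w - b                     # first-order information
hess = A                             # second-order information (the Hessian)
w = w - np.linalg.solve(hess, grad)  # Newton step: exact minimizer of a quadratic
print(w, A @ w - b)                  # the gradient is now zero
</code>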


=====RBFNN=====

Radial Basis Function Neural Networks

===First method===

==Objective==

You have to choose ''k'' (the number of centers) beforehand.

==Quick Def==

k-means then gradient descent (sketched below)

===Second method===

==Quick Def==

incremental addition of neurons then exact method
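
A minimal sketch of the first method (k-means for the centers, then gradient descent on the output weights); the dataset, the Gaussian width ''sigma'' and the use of scikit-learn's ''KMeans'' are assumptions made for the example:

<code python>
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X, y = rng.normal(size=(200, 2)), rng.normal(size=200)

k, sigma = 10, 1.0  # number of centers (must be chosen) and RBF width (assumed value)
centers = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X).cluster_centers_

# Gaussian radial basis activations: one column per center.
d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
Phi = np.exp(-d2 / (2 * sigma ** 2))

w, eta = np.zeros(k), 0.05  # output weights, trained by gradient descent
for _ in range(500):
    w -= eta * Phi.T @ (Phi @ w - y) / len(X)
</code>

The second method would instead grow the centers one by one and, after each addition, solve for the output weights exactly, since that is a linear least-squares problem.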


=====SVM=====

Support Vector Machine


=====Decision tree=====

//arbre de décision//

===ID3===

==Quick Def==

based on entropy


=====k-nearest neighbors=====

//k plus proches voisins//


=====Boosting=====

[Freund,Schapire, 1995]

==Quick Def==

Consists in combining many weak classifiers to obtain a single strong one.
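
As a toy numeric illustration of why combining weak classifiers can pay off (plain majority voting over independent classifiers here, which is much cruder than boosting's reweighting; the accuracy and count are assumed values): eleven independent classifiers that are each right 60% of the time already give a far stronger majority vote:

<code python>
from math import comb

p, n = 0.6, 11  # accuracy of each weak classifier, number of classifiers (assumed)
# Probability that a strict majority of the n classifiers is correct.
majority = sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n // 2 + 1, n + 1))
print(majority)  # ~0.75, versus 0.6 for any single classifier
</code>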


====Boosting by majority====


====AdaBoost====

ADAptive BOOSTing, [Freund,Schapire, 1996]

The first and standard version is referred to as [[#Discrete_AdaBoost|Discrete AdaBoost]].

==Quick Def==

Greedy approach

===Discrete AdaBoost===

[Freund,Schapire, 1996]

==References==

  - [[|1997,Freund-Schapire,A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting]] {{1997_Freund-Schapire_A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting.pdf|[Local Copy]}}

==Full Definition==

{{  adaboost-discrete.png  }}
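
In case the figure does not render, here is a minimal runnable sketch of Discrete AdaBoost; the decision-stump weak learner, the number of rounds ''T'' and labels ''y'' in ''{-1,+1}'' are illustrative choices, not part of the original definition:

<code python>
import numpy as np

def fit_stump(X, y, w):
    """Weighted decision stump: threshold one feature, predict in {-1, +1}."""
    best = (0, 0.0, 1, np.inf)  # (feature, threshold, sign, weighted error)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for s in (1, -1):
                pred = np.where(X[:, j] >= t, s, -s)
                err = w[pred != y].sum()
                if err < best[3]:
                    best = (j, t, s, err)
    return best

def adaboost(X, y, T=20):  # T rounds: an illustrative choice
    n = len(X)
    w = np.full(n, 1.0 / n)  # start with uniform example weights
    ensemble = []
    for _ in range(T):
        j, t, s, err = fit_stump(X, y, w)
        err = max(err, 1e-12)                  # guard against a perfect stump
        alpha = 0.5 * np.log((1 - err) / err)  # vote weight of this stump
        pred = np.where(X[:, j] >= t, s, -s)
        w *= np.exp(-alpha * y * pred)         # greedy step: boost weights of mistakes
        w /= w.sum()
        ensemble.append((alpha, j, t, s))
    return ensemble

def predict(ensemble, X):
    votes = sum(a * np.where(X[:, j] >= t, s, -s) for a, j, t, s in ensemble)
    return np.sign(votes)
</code>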


===Real AdaBoost===

[Friedman,Hastie,Tibshirani, 1998]

==References==

  - [[|1998,Friedman-Hastie-Tibshirani,Additive Logistic Regression - a Statistical View of Boosting]] {{1998_Friedman-Hastie-Tibshirani_Additive Logistic Regression - a Statistical View of Boosting.pdf|[Local Copy]}}
  - [[|1998,Schapire-Singer,Improved Boosting Algorithms Using Confidence-rated Predictions]] {{1998_Schapire-Singer_Improved Boosting Algorithms Using Confidence-rated Predictions.pdf|[Local Copy]}}


===LogitBoost===

[Friedman,Hastie,Tibshirani, 1998]

==References==

  - [[|1998,Friedman-Hastie-Tibshirani,Additive Logistic Regression - a Statistical View of Boosting]] {{1998_Friedman-Hastie-Tibshirani_Additive Logistic Regression - a Statistical View of Boosting.pdf|[Local Copy]}}


===Gentle AdaBoost===

[Friedman,Hastie,Tibshirani, 1998]

==References==

  - [[|1998,Friedman-Hastie-Tibshirani,Additive Logistic Regression - a Statistical View of Boosting]] {{1998_Friedman-Hastie-Tibshirani_Additive Logistic Regression - a Statistical View of Boosting.pdf|[Local Copy]}}


===Probabilistic AdaBoost===

[Friedman,Hastie,Tibshirani, 1998]

==References==

  - [[|1998,Friedman-Hastie-Tibshirani,Additive Logistic Regression - a Statistical View of Boosting]] {{1998_Friedman-Hastie-Tibshirani_Additive Logistic Regression - a Statistical View of Boosting.pdf|[Local Copy]}}


===FloatBoost===

==Objective==

AdaBoost is a sequential forward search procedure that uses a greedy selection strategy to minimize a margin-based cost on the training set. A crucial heuristic assumption behind such a forward search is monotonicity, i.e. that adding a new weak classifier to the current set never decreases the value of the performance criterion. The premise of AdaBoost's sequential procedure breaks down when this assumption is violated. Floating Search is a sequential feature-selection procedure with backtracking, designed to cope with nonmonotonic criterion functions.

==Full Definition==

{{  floatboost.png  }}
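
Again in case the figure does not render, a highly simplified sketch of the backtracking idea (the actual algorithm also tracks the best cost seen at each ensemble size); ''train_weak'' and ''cost'' are hypothetical placeholders, not functions from the paper:

<code python>
def floatboost_step(H, train_weak, cost):
    """One forward step plus conditional exclusion on the ensemble H (a list).

    train_weak and cost are hypothetical placeholders: train_weak(H) returns
    the best new weak classifier given H, cost(H) scores the ensemble.
    """
    H = H + [train_weak(H)]          # forward: add the best new weak classifier
    improved = True
    while improved and len(H) > 1:   # backtrack: drop a member while it lowers the cost
        improved = False
        for h in list(H):
            if cost([g for g in H if g is not h]) < cost(H):
                H.remove(h)
                improved = True
                break
    return H
</code>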


===AdaBoost.Reg===

[Freund,Schapire, 1997]

==Objective==

An extension of AdaBoost to regression problems.

==References==

  - [[|1997,Freund-Schapire,A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting]] {{1997_Freund-Schapire_A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting.pdf|[Local Copy]}}


===Multiclass AdaBoost.M1===

[Freund,Schapire, 1997]

==Objective==

Basic extension of AdaBoost to multiclass problems. A weak classifier needs an error rate below 1/2, a much stronger requirement than beating random guessing (accuracy ''1/k'') and one that is often too difficult to meet: with ''k = 10'' classes, a random guess is right only 10% of the time, yet M1 still demands over 50% accuracy.

==Quick Def==

A weak classifier assigns to each example a label in ''{0,...,k}''.

==References==

  - [[|1997,Freund-Schapire,A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting]] {{1997_Freund-Schapire_A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting.pdf|[Local Copy]}}

==Full Definition==

{{  adaboost-m1.png  }}


===Multiclass AdaBoost.M2===

[Freund,Schapire, 1997]

==Objective==

Tries to overcome the difficulty of AdaBoost.M1 by extending the communication between the boosting algorithm and the weak learner. The algorithm focuses not only on hard instances but also on classes that are hard to distinguish.

==Quick Def==

A weak classifier assigns to each example a vector in ''[0,1]^k'', and the pseudo-loss also takes into account weights reflecting the weak classifier's performance over the different classes of the same example.

==References==

  - [[|1997,Freund-Schapire,A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting]] {{1997_Freund-Schapire_A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting.pdf|[Local Copy]}}

==Full Definition==

{{  adaboost-m2.png  }}
===Multilabel AdaBoost.MR===

[Schapire,Singer, 1998]

==References==

  - [[|1998,Schapire-Singer,Improved Boosting Algorithms Using Confidence-rated Predictions]] {{1998_Schapire-Singer_Improved Boosting Algorithms Using Confidence-rated Predictions.pdf|[Local Copy]}}


===Multilabel AdaBoost.MH===

[Schapire,Singer, 1998]

==References==

  - [[|1998,Schapire-Singer,Improved Boosting Algorithms Using Confidence-rated Predictions]] {{1998_Schapire-Singer_Improved Boosting Algorithms Using Confidence-rated Predictions.pdf|[Local Copy]}}
  - [[|1998,Friedman-Hastie-Tibshirani,Additive Logistic Regression - a Statistical View of Boosting]] {{1998_Friedman-Hastie-Tibshirani_Additive Logistic Regression - a Statistical View of Boosting.pdf|[Local Copy]}}


===Multiclass AdaBoost.MO===

[Schapire,Singer, 1998]

==References==

  - [[|1998,Schapire-Singer,Improved Boosting Algorithms Using Confidence-rated Predictions]] {{1998_Schapire-Singer_Improved Boosting Algorithms Using Confidence-rated Predictions.pdf|[Local Copy]}}
  - [[|2006,Sun-Todorovic-Li,Unifying multi-class AdaBoost algorithms with binary base learners under the margin framework]] {{2006_Sun-Todorovic-Li_Unifying multi-class AdaBoost algorithms with binary base learners under the margin framework.pdf|[Local Copy]}}


===Multiclass AdaBoost.OC===

[Schapire, 1997]

==References==

  - 
  - [[|2006,Sun-Todorovic-Li,Unifying multi-class AdaBoost algorithms with binary base learners under the margin framework]] {{2006_Sun-Todorovic-Li_Unifying multi-class AdaBoost algorithms with binary base learners under the margin framework.pdf|[Local Copy]}}


===Multiclass AdaBoost.ECC===

[Guruswami,Sahai, 1999]

==References==

  - 
  - [[|2006,Sun-Todorovic-Li,Unifying multi-class AdaBoost algorithms with binary base learners under the margin framework]] {{2006_Sun-Todorovic-Li_Unifying multi-class AdaBoost algorithms with binary base learners under the margin framework.pdf|[Local Copy]}}


===AdaBoost.M1W===

[Eibl,Pfeiffer, 2002]


===GrPloss===

[Eibl,Pfeiffer, 2003]

==References==

  - [[|2003,Eibl-Pfeiffer,Multiclass-Boosting for Weak Classifiers]] {{2003_Eibl-Pfeiffer_Multiclass-Boosting for Weak Classifiers.pdf|[Local Copy]}}


===BoostMA===

[Eibl,Pfeiffer, 2003]

==References==

  - [[|2003,Eibl-Pfeiffer,Multiclass-Boosting for Weak Classifiers]] {{2003_Eibl-Pfeiffer_Multiclass-Boosting for Weak Classifiers.pdf|[Local Copy]}}


===SAMME===

Stagewise Additive Modeling using a Multi-class Exponential loss function, [Zhu,Rosset,Zou, 2006]

==References==

  - [[|2006,Zhu-Rosset-Zou,Multi-class AdaBoost]] {{2006_Zhu-Rosset-Zou_Multi-class AdaBoost.pdf|[Local Copy]}}


===GAMBLE===

Gentle Adaptive Multiclass Boosting Learning, [Huang,Ertekin,Song, 2005]

==References==

  - [[|2005,Huang-Ertekin-Song,Efficient Multiclass Boosting Classification with Active Learning]] {{2005_Huang-Ertekin-Song_Efficient Multiclass Boosting Classiffication with Active Learning.pdf|[Local Copy]}}


====UBoost====

==Quick Def==

Uneven loss function + greedy


====LPBoost====

==Objective==

Not greedy, exact.

==References==

  - [[|2000,Demiriz-Bennett-ShaweTaylor,Linear Programming Boosting via Column Generation]] {{2000_Demiriz-Bennett-ShaweTaylor_Linear Programming Boosting via Column Generation.pdf|[Local Copy]}}


====TotalBoost====

TOTALly corrective BOOSTing, [Warmuth,Liao,Ratsch, 2006]

==References==

  - [[|2006,Warmuth-Liao-Ratsch,Totally Corrective Boosting Algorithms that Maximize the Margin]] {{2006_Warmuth-Liao-Ratsch_Totally Corrective Boosting Algorithms that Maximize the Margin.pdf|[Local Copy]}}


====RotBoost====

[Li,Abu-Mostafa,Pratap, 2003]

==References==

  - [[|2003,Li-AbuMostafa-Pratap,CGBoost - Conjugate Gradient in Function Space]] {{2003_Li-AbuMostafa-Pratap_CGBoost - Conjugate Gradient in Function Space.pdf|[Local Copy]}}


====alphaBoost====

[Li,Abu-Mostafa,Pratap, 2003]

==References==

  - [[|2003,Li-AbuMostafa-Pratap,CGBoost - Conjugate Gradient in Function Space]] {{2003_Li-AbuMostafa-Pratap_CGBoost - Conjugate Gradient in Function Space.pdf|[Local Copy]}}


====MILBoost====

Multiple Instance Learning BOOSting, [Viola,Platt, 2005]

==References==

  - [[|2005,Viola-Platt-Zhang,Multiple Instance Boosting for Object Detection]] {{2005_Viola-Platt-Zhang_Multiple Instance Boosting for Object Detection.pdf|[Local Copy]}}


====CGBoost====

Conjugate Gradient BOOSTing, [Li,Abu-Mostafa,Pratap, 2003]

==References==

  - [[|2003,Li-AbuMostafa-Pratap,CGBoost - Conjugate Gradient in Function Space]] {{2003_Li-AbuMostafa-Pratap_CGBoost - Conjugate Gradient in Function Space.pdf|[Local Copy]}}


====Bootstrap Aggregating====


=====Cascades of detectors=====

==Quick Def==

A cascade of classifiers is a degenerate decision tree in which, at each stage, a classifier is trained to detect almost all objects of interest while rejecting a certain fraction of the non-object patterns (e.g. if each stage eliminates 50% of the non-object patterns while falsely rejecting only 0.1% of the objects, then after 20 stages one can expect a false alarm rate of 0.5^20 and a hit rate of 0.999^20). This makes it possible to focus attention on promising regions and dramatically increases speed.
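
The arithmetic of that example can be checked directly; the per-stage rates below are the ones assumed in the quick definition:

<code python>
per_stage_pass, per_stage_miss, stages = 0.5, 0.001, 20  # rates assumed above
false_alarm = per_stage_pass ** stages     # 0.5**20, about 9.5e-7
hit_rate = (1 - per_stage_miss) ** stages  # 0.999**20, about 0.98
print(false_alarm, hit_rate)
</code>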


=====Trees of detectors=====


====== Regression ======

  * **MLP (Multilayer Perceptron)**
  * **RBFNN (Radial Basis Function Neural Network)**
  * **SVR (Support Vector Regressor)**