====== Methods ======

This is a classification of techniques and algorithms, intended to give a broad view of the solutions available for classical problems.

Other pages give more details about them; they can be seen as memos, with references for finding more information (including the original paper when possible).
===== Learning =====

==== Supervised learning ====
=== Classification ===

  * __MLP (Multi-Layer Perceptron)__ - //PMC (perceptron multicouches)//
    * gradient backpropagation - //rétropropagation du gradient//
      * stochastic
    * simulated annealing - //recuit simulé//
    * Newton (second order)
  * __RBFNN (Radial Basis Function Neural Network)__
    * k-means then gradient descent
    * incremental addition of neurons then exact method
  * __SVM (Support Vector Machine)__
  * __Decision tree__
    * ID3 (based on entropy)
  * __k-nearest neighbors__
  * __Boosting__
    * Boosting by majority
    * AdaBoost (ADAptive BOOSTing)
      * Discrete AdaBoost
      * Real AdaBoost
      * Gentle AdaBoost
      * LogitBoost
      * Probabilistic AdaBoost
      * FloatBoost
      * AdaBoost.Reg
      * Multiclass AdaBoost.M1
      * Multiclass AdaBoost.M2
      * Multilabel AdaBoost.MR
      * Multilabel AdaBoost.MH
      * Multiclass AdaBoost.MO
      * Multiclass AdaBoost.OC
      * Multiclass AdaBoost.ECC
      * GrPloss
      * BoostMA
      * AdaBoost.M1W
      * SAMME (Stagewise Additive Modeling using a Multi-class Exponential loss function)
      * GAMBLE (Gentle Adaptive Multiclass Boosting Learning)
      * UBoost
      * LPBoost (Linear Programming BOOSTing)
      * TotalBoost (TOTALly corrective BOOSTing)
      * RotBoost
      * alphaBoost
      * MILBoost (Multiple Instance Learning BOOSTing)
      * CGBoost (Conjugate Gradient BOOSTing)
  * __Bootstrap Aggregating (bagging)__
  * __Cascades of detectors__ [[classification]]
  * __Trees of detectors__
=== Regression ===

  * __MLP (Multi-Layer Perceptron)__
  * __RBFNN (Radial Basis Function Neural Network)__
  * __SVR (Support Vector Regressor)__
=== Pattern recognition ===

  * __Viola-Jones Detector__ [[pattern-recognition]]
    * with Extended Set of Haar features
    * Stumps or CART trees
    * Rotation Invariant
    * **Multiview**
      * Parallel Cascades
      * Pyramid Cascade
      * Tree Cascade
      * Vector Boosting
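The Viola-Jones detector owes much of its speed to the integral image, which lets any rectangular Haar feature be evaluated in constant time. A minimal sketch, with plain Python lists standing in for a grayscale image (names are illustrative):

```python
def integral_image(img):
    """Summed-area table: ii[y][x] = sum of img[0..y][0..x]."""
    h, w = len(img), len(img[0])
    ii = [[0] * w for _ in range(h)]
    for y in range(h):
        row = 0
        for x in range(w):
            row += img[y][x]
            ii[y][x] = row + (ii[y - 1][x] if y > 0 else 0)
    return ii

def rect_sum(ii, x, y, w, h):
    """Sum of pixels in the rectangle with top-left (x, y) and size w*h, in O(1)."""
    a = ii[y + h - 1][x + w - 1]
    b = ii[y - 1][x + w - 1] if y > 0 else 0
    c = ii[y + h - 1][x - 1] if x > 0 else 0
    d = ii[y - 1][x - 1] if x > 0 and y > 0 else 0
    return a - b - c + d

def haar_two_vertical(ii, x, y, w, h):
    """Two-rectangle Haar feature: left half minus right half (w must be even)."""
    half = w // 2
    return rect_sum(ii, x, y, half, h) - rect_sum(ii, x + half, y, half, h)
```

Only four lookups are needed per rectangle, regardless of its size, which is what makes exhaustive scanning over positions and scales affordable.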
==== Unsupervised learning ====

=== Vector quantization / Clustering ===

  * k-means - //k-moyennes//
  * GNG (Growing Neural Gas)
  * Self-organizing maps (Kohonen) - //cartes auto-organisatrices de Kohonen//
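k-means itself is short enough to sketch; this is Lloyd's algorithm on 2-D points (toy illustrative code with a fixed seed, not a reference implementation):

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means (Lloyd's algorithm) on 2-D points given as (x, y) tuples."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)         # initialize on k random points
    for _ in range(iters):
        # assignment step: each point goes to its nearest center
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda c: (p[0] - centers[c][0]) ** 2
                                + (p[1] - centers[c][1]) ** 2)
            clusters[i].append(p)
        # update step: move each center to the mean of its cluster
        for i, cl in enumerate(clusters):
            if cl:
                centers[i] = (sum(p[0] for p in cl) / len(cl),
                              sum(p[1] for p in cl) / len(cl))
    return centers
```

The two alternating steps each reduce the total squared distortion, so the procedure always converges, though only to a local optimum that depends on initialization.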
==== Reinforcement learning ====

  * Q-learning
  * Value iteration
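Q-learning can be sketched on a toy chain world: the agent starts at the left end and is rewarded for reaching the right end. The update rule is the standard tabular one; the environment and all parameters are illustrative choices for this page.

```python
import random

def q_learning(n_states=5, episodes=500, alpha=0.5, gamma=0.9, eps=0.2, seed=0):
    """Tabular Q-learning on a 1-D chain: action 0 = left, 1 = right;
    reward 1 for entering the rightmost (terminal) state, 0 elsewhere."""
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(n_states)]
    for _ in range(episodes):
        s = 0
        while s != n_states - 1:
            # epsilon-greedy action selection
            if rng.random() < eps:
                a = rng.randrange(2)
            else:
                a = 0 if q[s][0] >= q[s][1] else 1
            s2 = max(0, s - 1) if a == 0 else s + 1
            r = 1.0 if s2 == n_states - 1 else 0.0
            # Q-learning update: bootstrap from the best value of the next state
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s = s2
    return q
```

After training, the greedy policy reads directly off the table: in every state the "right" action has the higher Q-value, and the values approach the discounted optimum (gamma^distance to the goal).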
===== Planning =====

==== Symbolic ====

=== State space search ===

  * A*
    * WA* (Weighted A*)
    * IDA* (Iterative Deepening A*)
  * Dijkstra
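A sketch of A* on a 4-connected grid with the Manhattan-distance heuristic (illustrative code; '#' marks walls, unit move cost):

```python
import heapq

def astar(grid, start, goal):
    """A* over a 4-connected grid of one-character cells; returns the path
    as a list of (row, col) tuples, or None if the goal is unreachable."""
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])   # admissible heuristic
    open_heap = [(h(start), 0, start)]      # entries are (f = g + h, g, node)
    g = {start: 0}
    came = {}
    while open_heap:
        _, gc, cur = heapq.heappop(open_heap)
        if cur == goal:
            path = [cur]                    # walk the parent links back to start
            while cur in came:
                cur = came[cur]
                path.append(cur)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if (0 <= nxt[0] < len(grid) and 0 <= nxt[1] < len(grid[0])
                    and grid[nxt[0]][nxt[1]] != '#'):
                ng = gc + 1
                if ng < g.get(nxt, float('inf')):
                    g[nxt] = ng
                    came[nxt] = cur
                    heapq.heappush(open_heap, (ng + h(nxt), ng, nxt))
    return None
```

With the heuristic set to zero this degenerates to Dijkstra; WA* multiplies h by a weight > 1, trading optimality for speed.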
=== Logics ===

Based on STRIPS-like languages (ADL, PDDL).

  * __GraphPlan__
    * Stan
  * __SATplan__
==== Others ====

  * __Genetic algorithms__
  * __Ant colonies__
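A toy genetic algorithm on the classic "max-ones" exercise: evolve a bitstring toward all ones with tournament selection, one-point crossover, and per-bit mutation. All parameters are arbitrary illustrations, not recommended settings.

```python
import random

def genetic_max_ones(length=20, pop_size=30, generations=60, seed=1):
    """Minimal genetic algorithm maximizing the number of 1-bits."""
    rng = random.Random(seed)
    fitness = lambda ind: sum(ind)
    pop = [[rng.randrange(2) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        new_pop = [max(pop, key=fitness)]            # elitism: keep the best
        while len(new_pop) < pop_size:
            # tournament selection of two parents (tournaments of size 3)
            p1 = max(rng.sample(pop, 3), key=fitness)
            p2 = max(rng.sample(pop, 3), key=fitness)
            cut = rng.randrange(1, length)           # one-point crossover
            child = p1[:cut] + p2[cut:]
            # per-bit mutation with small probability
            child = [b ^ 1 if rng.random() < 0.02 else b for b in child]
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)
```

Elitism makes the best fitness monotone across generations, while mutation keeps the search from stalling once the population has converged.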
==== Specific ====

=== Path planning ===

  * __Configuration space__
  * __Potential fields__
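A rough sketch of the potential-fields idea: an attractive force pulls toward the goal, a repulsive force pushes away from nearby obstacles, and the robot follows the resulting gradient. The force law, influence radius, and step size here are illustrative choices, not a canonical formulation.

```python
import math

def potential_step(pos, goal, obstacles, step=0.1):
    """One gradient step on an attractive + repulsive potential field."""
    fx = goal[0] - pos[0]                    # attractive force toward the goal
    fy = goal[1] - pos[1]
    for ox, oy in obstacles:
        dx, dy = pos[0] - ox, pos[1] - oy
        d = math.hypot(dx, dy)
        if 1e-9 < d < 1.0:                   # repulsion only within radius 1
            k = (1.0 / d - 1.0) / (d ** 2)   # grows as the obstacle gets close
            fx += k * dx
            fy += k * dy
    n = math.hypot(fx, fy)
    if n < 1e-9:
        return pos                           # equilibrium (possible local minimum)
    return (pos[0] + step * fx / n, pos[1] + step * fy / n)

def plan(start, goal, obstacles, max_steps=500):
    """Follow the field until near the goal or out of steps."""
    pos = start
    for _ in range(max_steps):
        if math.hypot(pos[0] - goal[0], pos[1] - goal[1]) < 0.15:
            return pos
        pos = potential_step(pos, goal, obstacles)
    return pos
```

The well-known weakness shows up directly in the code: when the forces cancel, the robot stops in a local minimum that is not the goal.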
===== Perception =====

==== Vision ====

=== Color Quantization ===

  * RGB cone
=== Image segmentation ===

  * Floodfill
  * Watershed - //lignes de partage des eaux//
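Floodfill is simple enough to sketch: an iterative 4-connected version over a list-of-lists label image (illustrative code):

```python
from collections import deque

def flood_fill(grid, start, new_label):
    """Iterative 4-connected flood fill; relabels the region containing `start`."""
    rows, cols = len(grid), len(grid[0])
    r0, c0 = start
    old = grid[r0][c0]
    if old == new_label:
        return grid                          # nothing to do, avoid infinite loop
    queue = deque([start])
    grid[r0][c0] = new_label
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == old:
                grid[nr][nc] = new_label
                queue.append((nr, nc))
    return grid
```

The explicit queue avoids the deep recursion a naive recursive fill would hit on large regions.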
=== Filters ===

  * **Anti-noise (smoothing)**
    * Median Filter - //filtre médian//
    * Vector Median Filter
    * Kuwahara filter
    * Peer Group Filtering
    * Anisotropic Filtering
  * **Gradient** [[vision]]
    * Prewitt
    * Roberts
    * Sobel
    * Laplace
    * Scharr
  * **Morphological**
    * dilation - //dilatation//
    * erosion - //érosion//
    * opening - //ouverture//
    * closing - //fermeture//
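As an example of a gradient filter, a direct (unoptimized) Sobel convolution; the |Gx| + |Gy| sum is used here as a cheap stand-in for the true gradient magnitude, and border pixels are simply left at 0:

```python
def sobel_magnitude(img):
    """Approximate gradient magnitude |Gx| + |Gy| with the 3x3 Sobel kernels."""
    kx = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal derivative
    ky = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical derivative
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(kx[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(ky[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = abs(gx) + abs(gy)
    return out
```

On a vertical step edge the response is strong along the edge columns and zero in the flat regions, which is exactly what the edge detectors below threshold and thin.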
=== Edge detection ===

  * Canny
  * Canny-Deriche
=== Pattern recognition ===

  * Least-Squares Regression - //régression aux moindres carrés//
  * __Hough Transforms__
    * Standard Hough Transform
    * Randomized Hough Transform
    * Connective Randomized Hough Transform
    * Combinatorial Hough Transform
    * Adaptive Hough Transform
    * Probabilistic Hough Transform
    * Adaptive Probabilistic Hough Transform
    * Progressive Probabilistic Hough Transform
    * Hierarchical Hough Transform
    * Sampling Hough Transform
    * Generalized Hough Transform
  * UpWrite method
  * Curvogram
  * **Shape Descriptors**
    * Ankerst's shape histograms
    * Chamfer matching
    * Contour Likelihood Measurement
=== Tracking ===

  * Kalman Filter
  * Generalized Kalman Filter
  * Correlation Tracking
    * ACA (Area Correlation Algorithm)
  * KLT Tracker (Kanade-Lucas-Tomasi)
  * IPAN Tracker
  * MeanShift
  * **Features detection**
    * Harris detector
    * SUSAN detector
    * Multiresolution Contrast detector
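A scalar Kalman filter in its simplest form, as used for tracking a slowly drifting quantity: a random-walk state model with a predict step followed by a measurement correction. The noise variances are illustrative, not tuned values.

```python
def kalman_1d(measurements, q=1e-3, r=0.5, x0=0.0, p0=1.0):
    """Scalar Kalman filter (random-walk model).
    q = process noise variance, r = measurement noise variance."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                  # predict: uncertainty grows between measurements
        k = p / (p + r)            # Kalman gain: trust measurement vs. prediction
        x = x + k * (z - x)        # correct the estimate with the innovation
        p = (1 - k) * p            # corrected uncertainty shrinks
        estimates.append(x)
    return estimates
```

The gain k adapts automatically: it is large while the state is uncertain and settles to a small steady-state value once the estimate has locked on.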
==== Sensors fusion ====

  * Kalman filter
  * Particle filter (Bayesian network) - //filtrage particulaire//
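The particle filter can be sketched in a 1-D setting: a bootstrap filter with predict / weight / resample steps over a cloud of samples. The motion model, noise levels, and particle count are all illustrative.

```python
import math
import random

def particle_filter_1d(measurements, n=1000, motion_noise=0.2, meas_noise=0.5, seed=0):
    """Bootstrap particle filter for a 1-D random-walk state observed in noise."""
    rng = random.Random(seed)
    particles = [rng.uniform(-10, 10) for _ in range(n)]   # diffuse prior
    estimates = []
    for z in measurements:
        # predict: apply the random-walk motion model to every particle
        particles = [p + rng.gauss(0, motion_noise) for p in particles]
        # weight: Gaussian likelihood of the measurement given each particle
        weights = [math.exp(-0.5 * ((z - p) / meas_noise) ** 2) for p in particles]
        total = sum(weights)
        if total == 0:                       # degenerate case: reset to uniform
            weights = [1.0] * n
        # resample (multinomial) proportionally to the weights
        particles = rng.choices(particles, weights=weights, k=n)
        estimates.append(sum(particles) / n) # state estimate = particle mean
    return estimates
```

Unlike the Kalman filter above, nothing here assumes linearity or Gaussian posteriors, which is why particle filters suit multimodal fusion problems at the cost of many samples.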