====== Catalog of methods in AI ======

This is a classification of techniques and algorithms, intended to give a broad view of the solutions available for classical problems.

Other pages give more details about each of them; they can be seen as memos with references to find more information (the original paper when possible).

===== Learning =====
==== Supervised learning ====
=== Classification ===

  * __MLP (Multi-Layer Perceptron)__ - //PMC (Perceptron multicouches)//
    * gradient backpropagation - //rétropropagation du gradient//
      * stochastic
      * with inertia
    * simulated annealing - //recuit simulé//
    * Newton (second order)
  * __RBFNN (Radial Basis Function Neural Networks)__
    * k-means then gradient descent
    * incremental addition of neurons then exact method
  * __SVM (Support Vector Machine)__
  * __Decision tree__ - //arbre de décision//
    * ID3 (based on entropy)
  * __k-nearest neighbors__ - //k plus proches voisins//
  * __Boosting__ (a decision-stump AdaBoost sketch follows this list)
    * Boosting by majority
    * AdaBoost (ADAptive BOOSTing)
    * Discrete AdaBoost
    * Real AdaBoost
    * Gentle AdaBoost
    * LogitBoost
    * Probabilistic AdaBoost
    * FloatBoost
    * AdaBoost.Reg
    * Multiclass AdaBoost.M1
    * Multiclass AdaBoost.M2
    * Multilabel AdaBoost.MR
    * Multilabel AdaBoost.MH
    * Multiclass AdaBoost.MO
    * Multiclass AdaBoost.OC
    * Multiclass AdaBoost.ECC
    * GrPloss
    * BoostMA
    * AdaBoost.M1W
    * SAMME (Stagewise Additive Modeling using a Multi-class Exponential loss function)
    * GAMBLE (Gentle Adaptive Multiclass Boosting Learning)
    * UBoost
    * LPBoost (Linear Programming BOOSTing)
    * TotalBoost (TOTALly corrective BOOSTing)
    * RotBoost
    * alphaBoost
    * MILBoost (Multiple Instance Learning BOOSTing)
    * CGBoost (Conjugate Gradient BOOSTing)
    * Bootstrap Aggregating
  * __Cascades of detectors__ [[classification]]
  * __Trees of detectors__

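To make the Boosting entries above concrete, here is a minimal sketch of Discrete AdaBoost over decision stumps. It assumes NumPy; the function names and the toy two-blob dataset are illustrative, not taken from the referenced pages.

<code python>
import numpy as np

def train_adaboost(X, y, n_rounds=20):
    """Discrete AdaBoost with decision stumps; y must be in {-1, +1}."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)          # sample weights
    ensemble = []                    # list of (feature, threshold, polarity, alpha)
    for _ in range(n_rounds):
        best, best_err = None, np.inf
        # exhaustive search for the best decision stump under current weights
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for polarity in (1, -1):
                    pred = polarity * np.where(X[:, j] > thr, 1, -1)
                    err = np.sum(w[pred != y])
                    if err < best_err:
                        best_err, best = err, (j, thr, polarity)
        err = max(best_err, 1e-12)
        alpha = 0.5 * np.log((1 - err) / err)     # stump weight
        j, thr, polarity = best
        pred = polarity * np.where(X[:, j] > thr, 1, -1)
        w *= np.exp(-alpha * y * pred)            # re-weight samples
        w /= w.sum()
        ensemble.append((j, thr, polarity, alpha))
    return ensemble

def predict_adaboost(ensemble, X):
    score = sum(alpha * polarity * np.where(X[:, j] > thr, 1, -1)
                for j, thr, polarity, alpha in ensemble)
    return np.sign(score)

# toy usage: two Gaussian blobs
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.r_[np.full(50, -1), np.full(50, 1)]
model = train_adaboost(X, y)
print("training accuracy:", np.mean(predict_adaboost(model, X) == y))
</code>

The re-weighting step is what distinguishes boosting from Bootstrap Aggregating: misclassified samples gain weight, so the next stump focuses on them.
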
=== Regression ===

  * __MLP (Multi-Layer Perceptron)__
  * __RBFNN (Radial Basis Function Neural Network)__
  * __SVR (Support Vector Regression)__

=== Pattern recognition ===

  * __Viola-Jones Detector__ [[pattern-recognition]] (an integral-image Haar feature sketch follows this list)
    * with Extended Set of Haar features
    * Stumps or CART trees
    * Rotation Invariant
    * **Multiview**
      * Parallel Cascades
      * Pyramid Cascade
      * Tree Cascade
      * Vector Boosting

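As a small illustration of the building block behind the Viola-Jones detector, the sketch below computes an integral image and evaluates one two-rectangle Haar feature with four lookups per rectangle. NumPy is assumed; the 8x8 test image and function names are illustrative.

<code python>
import numpy as np

def integral_image(img):
    """Summed-area table: ii[y, x] = sum of img[:y+1, :x+1]."""
    return np.cumsum(np.cumsum(img, axis=0), axis=1)

def rect_sum(ii, top, left, h, w):
    """Sum of the h-by-w rectangle whose top-left corner is (top, left)."""
    ii = np.pad(ii, ((1, 0), (1, 0)))   # zero-pad so the formula also works on the border
    return (ii[top + h, left + w] - ii[top, left + w]
            - ii[top + h, left] + ii[top, left])

def haar_two_rect_vertical(ii, top, left, h, w):
    """Two-rectangle feature: upper half minus lower half."""
    upper = rect_sum(ii, top, left, h // 2, w)
    lower = rect_sum(ii, top + h // 2, left, h // 2, w)
    return upper - lower

img = np.zeros((8, 8))
img[:4, :] = 1.0                        # bright top half, dark bottom half
ii = integral_image(img)
print(haar_two_rect_vertical(ii, 0, 0, 8, 8))   # strongly positive response
</code>
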
==== Unsupervised learning ====
=== Vector quantization / Clustering ===

  * Sequential leader
  * k-means - //k-moyennes// (see the sketch after this list)
  * GNG (Growing Neural Gas)
  * Self-organizing maps (Kohonen) - //cartes auto-organisatrices de Kohonen//

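A minimal sketch of k-means (Lloyd's algorithm), which also appears above as an initialization step for RBF networks. NumPy is assumed; the three-blob toy data are illustrative.

<code python>
import numpy as np

def kmeans(X, k, n_iters=100, seed=0):
    """Lloyd's algorithm: alternate assignment and centroid update."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]   # random init
    for _ in range(n_iters):
        # assign each point to its nearest centroid
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # move each centroid to the mean of its assigned points
        new_centroids = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                  else centroids[j] for j in range(k)])
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    return centroids, labels

# toy usage: three well-separated blobs
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(c, 0.3, (40, 2)) for c in (0.0, 3.0, 6.0)])
centroids, labels = kmeans(X, k=3)
print(np.round(np.sort(centroids[:, 0]), 1))   # roughly 0, 3, 6
</code>
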
==== Reinforcement learning ====

  * __MDP (Markov Decision Processes)__
    * Q-learning
    * Value iteration (sketched below on a toy MDP)
    * Policy iteration

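A hedged sketch of value iteration on a hypothetical two-state, two-action MDP; the transition matrix, rewards, and discount factor below are invented for illustration.

<code python>
import numpy as np

# Hypothetical 2-state, 2-action MDP, invented for illustration.
# P[s, a, s'] = transition probability, R[s, a] = expected immediate reward.
P = np.array([[[0.9, 0.1],    # state 0, action 0
               [0.2, 0.8]],   # state 0, action 1
              [[0.0, 1.0],    # state 1, action 0
               [0.5, 0.5]]])  # state 1, action 1
R = np.array([[0.0, 1.0],
              [2.0, 0.0]])
gamma = 0.9

def value_iteration(P, R, gamma, tol=1e-8):
    """Iterate the Bellman optimality backup until the value function converges."""
    V = np.zeros(P.shape[0])
    while True:
        Q = R + gamma * P @ V          # Q[s, a] = R[s, a] + gamma * sum_s' P[s, a, s'] V[s']
        V_new = Q.max(axis=1)
        if np.max(np.abs(V_new - V)) < tol:
            return V_new, Q.argmax(axis=1)   # optimal values and greedy policy
        V = V_new

V, policy = value_iteration(P, R, gamma)
print("V* =", np.round(V, 3), "policy =", policy)
</code>

Q-learning estimates the same Q table from sampled transitions instead of a known model.
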
===== Planning =====
==== Symbolic ====
=== State space search ===

  * A* (sketched below on a grid)
  * WA* (Weighted A*)
  * IDA* (Iterative Deepening A*)
  * Dijkstra

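A minimal A* sketch on a 4-connected grid with a Manhattan-distance heuristic; the grid, start, and goal are illustrative. With the heuristic set to zero this reduces to Dijkstra, and with the heuristic scaled by a weight w > 1 it becomes WA*.

<code python>
import heapq

def astar(grid, start, goal):
    """A* on a 4-connected grid; grid cells equal to 1 are obstacles."""
    def h(p):                                   # admissible Manhattan heuristic
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    rows, cols = len(grid), len(grid[0])
    open_heap = [(h(start), 0, start)]          # (f = g + h, g, node)
    best_g = {start: 0}
    parent = {}
    while open_heap:
        f, g, node = heapq.heappop(open_heap)
        if node == goal:                        # reconstruct path
            path = [node]
            while node in parent:
                node = parent[node]
                path.append(node)
            return path[::-1]
        if g > best_g.get(node, float("inf")):
            continue                            # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nb = (node[0] + dr, node[1] + dc)
            if 0 <= nb[0] < rows and 0 <= nb[1] < cols and grid[nb[0]][nb[1]] == 0:
                ng = g + 1
                if ng < best_g.get(nb, float("inf")):
                    best_g[nb] = ng
                    parent[nb] = node
                    heapq.heappush(open_heap, (ng + h(nb), ng, nb))
    return None                                 # no path

grid = [[0, 0, 0, 0],
        [1, 1, 0, 1],
        [0, 0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))
</code>
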
=== Logics ===

  * __GraphPlan__
    * Stan
    * IPP
    * SGP
  * __SATplan (SATisfiability PLANning)__

==== Others ====

  * __Genetic algorithms__ - //algorithmes génétiques// (a basic sketch follows)
  * __Ant colonies__ - //colonies de fourmis//

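A hedged sketch of a basic genetic algorithm (tournament selection, one-point crossover, bit-flip mutation) applied to the toy OneMax problem; every parameter value below is an illustrative assumption.

<code python>
import random

def genetic_algorithm(fitness, n_bits=30, pop_size=40, generations=60,
                      p_crossover=0.9, p_mutation=0.02, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def tournament():                        # pick the better of two random parents
            a, b = rng.choice(pop), rng.choice(pop)
            return a if fitness(a) >= fitness(b) else b
        children = []
        while len(children) < pop_size:
            p1, p2 = tournament(), tournament()
            if rng.random() < p_crossover:       # one-point crossover
                cut = rng.randrange(1, n_bits)
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            else:
                c1, c2 = p1[:], p2[:]
            for c in (c1, c2):                   # bit-flip mutation
                children.append([bit ^ 1 if rng.random() < p_mutation else bit for bit in c])
        pop = children[:pop_size]
    return max(pop, key=fitness)

best = genetic_algorithm(fitness=sum)            # OneMax: maximize the number of 1 bits
print(sum(best), "out of 30 bits set")
</code>
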
==== Specific ====
=== Path planning ===

  * __Configuration space__
  * __Potential fields__

===== Perception =====
==== Vision ====
=== Color Quantization ===

  * RGB cone
  * YUV polygon
  * HSV rectangle
  * Lab

=== Image segmentation ===

  * Floodfill - //remplissage par diffusion// (sketched below)
  * Watershed - //lignes de partage des eaux//

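A minimal breadth-first floodfill sketch over a small label image; the image and the 4-connectivity choice are illustrative.

<code python>
from collections import deque

def flood_fill(img, seed, new_label):
    """Breadth-first flood fill: relabel the 4-connected region containing `seed`."""
    rows, cols = len(img), len(img[0])
    old_label = img[seed[0]][seed[1]]
    if old_label == new_label:
        return img
    queue = deque([seed])
    img[seed[0]][seed[1]] = new_label
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and img[nr][nc] == old_label:
                img[nr][nc] = new_label
                queue.append((nr, nc))
    return img

img = [[0, 0, 1],
       [0, 1, 1],
       [1, 1, 0]]
print(flood_fill(img, (0, 0), 7))   # the top-left 0-region becomes 7
</code>
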
=== Filters ===

  * **Anti-noise (smoothing)**
    * Median Filter - //filtre médian//
    * Vector Median Filter
    * Kuwahara filter
    * Peer Group Filtering
    * Anisotropic Filtering
  * **Gradient** [[vision]] (a Sobel sketch follows this list)
    * Prewitt
    * Roberts
    * Sobel
    * Laplace
    * Scharr
  * **Morphological**
    * dilation - //dilatation//
    * erosion - //érosion//
    * opening - //ouverture//
    * closing - //fermeture//

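A hedged NumPy sketch of the Sobel gradient filter, with the 2-D convolution written out explicitly; the step-edge test image is illustrative.

<code python>
import numpy as np

SOBEL_X = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)
SOBEL_Y = SOBEL_X.T

def convolve2d(img, kernel):
    """'Valid' 2-D convolution (kernel flipped, no padding)."""
    kh, kw = kernel.shape
    k = kernel[::-1, ::-1]
    out = np.empty((img.shape[0] - kh + 1, img.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * k)
    return out

def sobel_magnitude(img):
    gx = convolve2d(img, SOBEL_X)
    gy = convolve2d(img, SOBEL_Y)
    return np.hypot(gx, gy)          # gradient magnitude

img = np.zeros((6, 6))
img[:, 3:] = 1.0                     # vertical step edge
print(sobel_magnitude(img))          # strongest response along the edge
</code>
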
=== Edge detection ===

  * Canny detector
  * Canny-Deriche

=== Pattern recognition ===

  * Mean-Square Regression - //régression aux moindres carrés//
  * __Hough Transforms__ [[pattern-recognition]] (the standard variant is sketched after this list)
    * Standard Hough Transform
    * Randomized Hough Transform
    * Connective Randomized Hough Transform
    * Combinatorial Hough Transform
    * Adaptive Hough Transform
    * Probabilistic Hough Transform
    * Adaptive Probabilistic Hough Transform
    * Progressive Probabilistic Hough Transform
    * Hierarchical Hough Transform
    * Sampling Hough Transform
    * Generalized Hough Transform
    * UpWrite method
    * Curvogram
  * **Shape Descriptors**
    * Ankerst's shape histograms
    * Chamfer matching
    * Contour Likelihood Measurement

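A hedged sketch of the Standard Hough Transform for straight lines, accumulating votes in (theta, rho) space; NumPy is assumed and the edge points are illustrative.

<code python>
import numpy as np

def hough_lines(points, img_diag, n_theta=180, n_rho=200):
    """Accumulate votes for lines rho = x*cos(theta) + y*sin(theta)."""
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    rhos = np.linspace(-img_diag, img_diag, n_rho)
    acc = np.zeros((n_rho, n_theta), dtype=int)
    for x, y in points:
        rho_vals = x * np.cos(thetas) + y * np.sin(thetas)   # one rho per theta
        rho_idx = np.digitize(rho_vals, rhos) - 1
        acc[rho_idx, np.arange(n_theta)] += 1
    return acc, thetas, rhos

# illustrative edge points lying on the line y = x
points = [(i, i) for i in range(20)]
acc, thetas, rhos = hough_lines(points, img_diag=30)
r, t = np.unravel_index(acc.argmax(), acc.shape)
print("theta ~", round(np.degrees(thetas[t]), 1), "deg, rho ~", round(rhos[r], 1))
</code>

Roughly speaking, the probabilistic and randomized variants listed above trade this exhaustive voting for sampling of the edge points.
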
=== Tracking ===

  * Kalman Filter (a 1-D constant-velocity sketch follows this list)
  * Generalized Kalman Filter
  * Correlation Tracking
  * ACA (Area Correlation Algorithm)
  * KLT Tracker (Kanade-Lucas-Tomasi)
  * IPAN Tracker
  * MeanShift
  * **Features detection**
    * Harris detector
    * SUSAN detector
    * Multiresolution Contrast detector

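A hedged sketch of a linear Kalman filter tracking a 1-D position under a constant-velocity model; the noise levels and the simulated measurements are illustrative assumptions.

<code python>
import numpy as np

def kalman_track(measurements, dt=1.0, q=1e-3, r=0.25):
    """Constant-velocity Kalman filter; state = [position, velocity]."""
    F = np.array([[1.0, dt], [0.0, 1.0]])     # state transition
    H = np.array([[1.0, 0.0]])                # we only measure position
    Q = q * np.eye(2)                         # process noise covariance
    R = np.array([[r]])                       # measurement noise covariance
    x = np.array([[measurements[0]], [0.0]])  # initial state estimate
    P = np.eye(2)                             # initial state covariance
    estimates = []
    for z in measurements:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update
        y = np.array([[z]]) - H @ x           # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        estimates.append(float(x[0, 0]))
    return estimates

# illustrative noisy measurements of an object moving at constant speed 1.0
rng = np.random.default_rng(0)
true_pos = np.arange(30, dtype=float)
z = true_pos + rng.normal(0, 0.5, size=30)
est = kalman_track(z)
print("last true/filtered/measured:", true_pos[-1], round(est[-1], 2), round(z[-1], 2))
</code>
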
==== Sensor fusion ====

  * Kalman filter
  * Particle filter (Bayesian network) - //filtrage particulaire// (sketched below)
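A hedged sketch of a bootstrap particle filter on 1-D position measurements with a random-walk motion model; the models, noise values, and data are illustrative assumptions.

<code python>
import numpy as np

def particle_filter(measurements, n_particles=500, motion_std=0.5, meas_std=1.0, seed=0):
    """Bootstrap particle filter: predict, weight by likelihood, resample."""
    rng = np.random.default_rng(seed)
    particles = rng.normal(measurements[0], meas_std, n_particles)   # init around first measurement
    estimates = []
    for z in measurements:
        # predict: propagate particles through a random-walk motion model
        particles = particles + rng.normal(0.0, motion_std, n_particles)
        # update: weight each particle by the Gaussian measurement likelihood
        weights = np.exp(-0.5 * ((z - particles) / meas_std) ** 2)
        weights /= weights.sum()
        estimates.append(float(np.sum(weights * particles)))         # posterior mean
        # resample (multinomial) to avoid weight degeneracy
        particles = rng.choice(particles, size=n_particles, p=weights)
    return estimates

# illustrative target drifting at +0.3 per step, observed with noise
rng = np.random.default_rng(1)
true_pos = 0.3 * np.arange(50)
z = true_pos + rng.normal(0, 1.0, size=50)
est = particle_filter(z)
print("last true/filtered:", round(true_pos[-1], 2), round(est[-1], 2))
</code>
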
