1. Gaussian Mixture Models (GMM)
For a $D$-dimensional feature vector $x$, the mixture density used for the likelihood function is defined as
$p(x|\lambda) = \sum_{i=1}^{M}w_{i}p_{i}(x)$.
The density is a weighted linear combination of $M$ unimodal Gaussian densities $p_{i}(x)$, where the mixture weights satisfy $\sum_{i=1}^{M} w_{i} = 1$ and $\lambda = \{w_{i}, \mu_{i}, \Sigma_{i}\}$ denotes the model parameters. Each component density is
$p_{i}(x) = \frac{1}{(2\pi)^{D/2}|\Sigma_{i}|^{1/2}} \exp\left[-\frac{1}{2}(x-\mu_{i})^{T}\Sigma_{i}^{-1}(x-\mu_{i})\right]$.
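As a concrete sketch, the two formulas above translate directly into NumPy. The function name `gmm_density` and the parameter layout are illustrative assumptions, not part of the original text:

```python
import numpy as np

def gmm_density(x, weights, means, covs):
    """p(x|lambda) = sum_i w_i * p_i(x) for one feature vector x."""
    D = x.shape[0]
    density = 0.0
    for w, mu, cov in zip(weights, means, covs):
        diff = x - mu
        # Gaussian normalization: (2*pi)^(D/2) * |Sigma_i|^(1/2)
        norm = (2 * np.pi) ** (D / 2) * np.sqrt(np.linalg.det(cov))
        # Quadratic form: (x - mu_i)^T Sigma_i^{-1} (x - mu_i)
        quad = diff @ np.linalg.inv(cov) @ diff
        density += w * np.exp(-0.5 * quad) / norm
    return density

# Example: a 2-component mixture in D = 2 dimensions (made-up parameters).
weights = np.array([0.3, 0.7])
means = np.array([[0.0, 0.0], [2.0, 2.0]])
covs = np.array([np.eye(2), np.eye(2)])
print(gmm_density(np.array([1.0, 1.0]), weights, means, covs))
```

In practice one usually works with log-densities (e.g. via the log-sum-exp trick) to avoid numerical underflow when $D$ is large.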
2. Support Vector Machines (SVM)
An SVM is a two-class classifier constructed from sums of a kernel function $K(\cdot,\cdot)$:
$f(x) = \sum_{i=1}^{N}\alpha_{i} t_{i} K(x,x_{i}) + d$.
The $t_{i} \in \{+1,-1\}$ are the ideal outputs, subject to $\sum_{i=1}^{N}\alpha_{i} t_{i}=0$ and $\alpha_{i} > 0$. The $x_{i}$ are support vectors, obtained from the training set by an optimization process.
$K(\cdot , \cdot)$ is constrained to have certain properties (the Mercer condition), so that it can be expressed as
$K(x,y) = b(x)^{T} b(y)$.
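A minimal sketch of evaluating the decision function $f(x)$ once training has produced the $\alpha_{i}$, $t_{i}$, $x_{i}$, and $d$ (the optimization step itself is omitted, and all names and parameter values below are hypothetical):

```python
import numpy as np

def svm_score(x, alphas, targets, support_vectors, d, kernel):
    """f(x) = sum_i alpha_i * t_i * K(x, x_i) + d."""
    return sum(a * t * kernel(x, sv)
               for a, t, sv in zip(alphas, targets, support_vectors)) + d

# With a linear kernel K(x, y) = x^T y and made-up trained parameters:
alphas = np.array([0.5, 0.5])               # alpha_i > 0
targets = np.array([+1, -1])                # ideal outputs t_i
svs = np.array([[1.0, 1.0], [-1.0, -1.0]])  # support vectors x_i
d = 0.0                                     # bias term
score = svm_score(np.array([0.5, 0.5]), alphas, targets, svs, d,
                  kernel=lambda x, y: x @ y)
print(np.sign(score))                       # class decision is sign(f(x))
```

Note that the example parameters respect the constraint $\sum_{i}\alpha_{i} t_{i} = 0.5 - 0.5 = 0$, and the class decision is the sign of $f(x)$.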
Kernel function examples:
http://www.shamoxia.com/html/y2010/2292.html
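The linked page is in Chinese; for convenience, a few standard kernels satisfying the Mercer condition are sketched below (the parameter defaults $c$, $p$, and $\gamma$ are arbitrary illustrative choices, not taken from the linked page):

```python
import numpy as np

def linear_kernel(x, y):
    return x @ y                                   # K(x, y) = x^T y

def polynomial_kernel(x, y, c=1.0, p=3):
    return (x @ y + c) ** p                        # K(x, y) = (x^T y + c)^p

def rbf_kernel(x, y, gamma=0.5):
    return np.exp(-gamma * np.sum((x - y) ** 2))   # K(x, y) = exp(-gamma * ||x - y||^2)
```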