Recognition of a Mixture of Multiple Gaussian Patterns

On the Parameters Estimation of the Generalized Gaussian Mixture Model

2009

Parameter estimation for mixture distributions is an important task in statistical signal processing; pattern recognition, blind equalization, and other modern statistical tasks often call for mixture estimation. This paper aims to provide a realistic distribution based on the mixture of generalized Gaussian distributions (MGG), which has the advantage of characterizing the variability of the shape parameter in each component of the mixture. We propose a formulation of the Expectation-Maximization (EM) algorithm under the generalized Gaussian distribution. To this end, two different methods are proposed to include the shape parameter estimation. In the first method, a derivation of the likelihood function is used to update the mixture parameters. In the second approach, we propose an extension of the "classical" EM algorithm that estimates the shape parameter in terms of kurtosis. The Kullback-Leibler divergence (KLD) is used to compare and evaluate these algorithms for MGG parameter estimation. An...
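As a concrete illustration of the second approach, the sketch below runs a moment-based EM loop for a mixture of generalized Gaussians in which each component's shape parameter is updated by numerically inverting the theoretical kurtosis relation of the generalized Gaussian. The moment-matching M-step updates, the kurtosis clipping range, and the bracketing interval for the shape parameter are illustrative assumptions, not the paper's exact derivation.

```python
# Sketch: EM for a mixture of generalized Gaussians with a kurtosis-based
# shape update. Moment-matching M-step updates are illustrative assumptions.
import numpy as np
from scipy.special import gamma
from scipy.optimize import brentq

def gg_pdf(x, mu, alpha, beta):
    """Generalized Gaussian density with location mu, scale alpha, shape beta."""
    return beta / (2 * alpha * gamma(1 / beta)) * np.exp(-(np.abs(x - mu) / alpha) ** beta)

def gg_kurtosis(beta):
    """Theoretical (non-excess) kurtosis of a generalized Gaussian."""
    return gamma(5 / beta) * gamma(1 / beta) / gamma(3 / beta) ** 2

def em_mgg(x, K, n_iter=100):
    # Crude initialization: equal weights, spread means, common scale, Gaussian shape.
    w = np.full(K, 1 / K)
    mu = np.quantile(x, np.linspace(0.1, 0.9, K))
    alpha = np.full(K, x.std())
    beta = np.full(K, 2.0)
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each sample.
        dens = np.stack([w[k] * gg_pdf(x, mu[k], alpha[k], beta[k]) for k in range(K)])
        r = dens / dens.sum(axis=0, keepdims=True)
        # M-step: responsibility-weighted moment updates per component.
        for k in range(K):
            nk = r[k].sum()
            w[k] = nk / len(x)
            mu[k] = (r[k] * x).sum() / nk                  # weighted mean
            var = (r[k] * (x - mu[k]) ** 2).sum() / nk     # weighted variance
            m4 = (r[k] * (x - mu[k]) ** 4).sum() / nk      # weighted 4th moment
            kurt = np.clip(m4 / var ** 2, 1.9, 25.0)       # empirical kurtosis
            # Update the shape by numerically inverting the kurtosis relation.
            beta[k] = brentq(lambda b: gg_kurtosis(b) - kurt, 0.3, 10.0)
            # Match the scale to the variance: Var = alpha^2 Gamma(3/b)/Gamma(1/b).
            alpha[k] = np.sqrt(var * gamma(1 / beta[k]) / gamma(3 / beta[k]))
    return w, mu, alpha, beta
```

On data drawn from a well-separated two-component MGG, the loop should approximately recover the generating parameters up to component relabeling.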

Single-Gaussian and Gaussian-Mixture Models Estimation for Pattern Recognition

Single-Gaussian and Gaussian-Mixture Models are utilized in various pattern recognition tasks. The model parameters are usually estimated via Maximum Likelihood Estimation (MLE) with respect to the available training data. However, if only a small amount of training data is available, the resulting model will not generalize well. Loosely speaking, classification performance on an unseen test set may be poor. In this paper, we propose a novel estimation technique for the model variances. Once the variances have been estimated using MLE, they are multiplied by a scaling factor, which reflects the amount of uncertainty present in the limited sample set. The optimal value of the scaling factor is based on the Kullback-Leibler criterion and on the assumption that the training and test sets are sampled from the same source distribution. In addition, in the case of GMM, the proper number of components can be determined.
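The mechanism can be illustrated with a small Monte Carlo simulation: fit a univariate Gaussian by MLE on a limited sample, then search for the variance scaling factor that minimizes the average KL divergence from the source distribution (known here, since we simulate it). The grid search below is a stand-in for the paper's analytically derived optimum.

```python
# Sketch: find the variance scaling factor that minimizes average KL divergence
# from the (known, simulated) source distribution for small training sets.
import numpy as np

def kl_gauss(mu0, var0, mu1, var1):
    """KL( N(mu0, var0) || N(mu1, var1) ) for univariate Gaussians."""
    return 0.5 * (np.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

rng = np.random.default_rng(0)
mu_true, var_true, n = 0.0, 1.0, 10            # deliberately small training set
factors = np.linspace(0.5, 3.0, 101)
avg_kl = np.zeros_like(factors)
for _ in range(2000):                          # Monte Carlo over training sets
    x = rng.normal(mu_true, np.sqrt(var_true), size=n)
    mu_ml, var_ml = x.mean(), x.var()          # MLE (biased variance)
    avg_kl += [kl_gauss(mu_true, var_true, mu_ml, f * var_ml) for f in factors]
avg_kl /= 2000
print("best scaling factor ~", factors[np.argmin(avg_kl)])
```

For small sample sizes the minimizer typically exceeds 1, i.e., inflating the ML variance compensates for the uncertainty in the limited sample.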

Parameter Estimation for a Mixture of Two Univariate Gaussian Distributions: A Comparative Analysis of the Proposed and Maximum Likelihood Methods

Two approaches to parameter estimation for a mixture of two univariate Gaussian distributions are numerically compared. The proposed method (PM) is based on decomposing a continuous function into its odd and even components and estimating them as polynomials; the other is the usual maximum likelihood (ML) method via the expectation-maximisation (EM) algorithm. An overlapped mixture of two univariate Gaussian distributions is simulated. The PM and ML are used to re-estimate the known mixture model parameters, and the measure of performance is the absolute percentage error. The PM produces results comparable to those of the ML approach. Given that the PM produces good estimates, and knowing that the ML always converges given good initial guess values (IGVs), it is recommended that the PM be used symbiotically with the ML, providing IGVs for the EM algorithm.
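The recommended symbiotic workflow can be sketched as follows: rough initial guess values, which the paper obtains from the PM, are handed to EM as its starting point. The IGVs below are placeholders standing in for the PM output, and scikit-learn's GaussianMixture stands in for the ML/EM re-estimation step.

```python
# Sketch: seed EM with externally obtained initial guess values (IGVs).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Overlapped mixture of two univariate Gaussians, as in the simulation study.
x = np.concatenate([rng.normal(0.0, 1.0, 600), rng.normal(2.0, 1.5, 400)])[:, None]

# Placeholder IGVs standing in for the PM estimates.
igv_weights = np.array([0.55, 0.45])
igv_means = np.array([[0.2], [1.8]])
igv_precisions = np.array([[[1.0]], [[0.5]]])   # inverse covariances

gm = GaussianMixture(
    n_components=2,
    weights_init=igv_weights,
    means_init=igv_means,
    precisions_init=igv_precisions,
).fit(x)
print("means:", gm.means_.ravel(), "stds:", np.sqrt(gm.covariances_.ravel()))
```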

A Spectral Analysis Approach for Gaussian Mixture Estimation

arXiv preprint arXiv:0710.1760, 2007

This paper deals with the estimation of a one-dimensional Gaussian mixture. Given a set of observations of a K-component Gaussian mixture, we focus on the estimation of the component expectations. The number of components is assumed to be known. Our method is based ...
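Since the abstract is truncated before the method itself, the sketch below only reproduces the stated problem setup: observations drawn from a one-dimensional K-component Gaussian mixture whose component expectations are the quantities to be estimated, with K assumed known.

```python
# Sketch of the problem setup only; the paper's spectral method is truncated above.
import numpy as np

rng = np.random.default_rng(2)
K = 3                                         # number of components, assumed known
means = np.array([-3.0, 0.0, 4.0])            # component expectations (the targets)
weights = np.array([0.3, 0.4, 0.3])
sigmas = np.array([1.0, 0.8, 1.2])

z = rng.choice(K, size=5000, p=weights)       # latent component labels
obs = rng.normal(means[z], sigmas[z])         # the observed mixture sample
print("sample of", obs.size, "observations; target means:", means)
```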