The well-known mixture of experts (ME) model is usually trained by the expectation-maximization (EM) algorithm for maximum-likelihood learning. However, the number of experts must be determined in advance, and it is rarely known beforehand. Derived from regularization theory, a regularized minimum cross-entropy (RMCE) algorithm is proposed to train the ME model; it performs model selection automatically. When time series are modeled by ME, climate prediction experiments demonstrate that the RMCE algorithm outperforms the EM algorithm. We also compare the RMCE algorithm with other regression methods, such as the back-propagation (BP) algorithm and the normalized radial basis function (NRBF) network, and find that RMCE still gives promising results.
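For readers unfamiliar with the architecture being compared, the sketch below shows a minimal mixture-of-experts regression model of the kind the abstract refers to: linear experts combined by a softmax gating network and fitted with a plain EM loop. This is a hypothetical illustration under assumed modeling choices (linear experts, Gaussian noise, a gradient update for the gate); it is not the RMCE algorithm, and the function names fit_me and predict_me are invented for this sketch.

# Toy mixture-of-experts (ME) regression trained by EM: linear experts + softmax gate.
# Hypothetical illustration only; this is not the RMCE algorithm described above.
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def fit_me(X, y, K=3, n_iter=100, lr=0.1):
    n, d = X.shape
    Xb = np.hstack([X, np.ones((n, 1))])        # inputs with a bias column
    W = rng.normal(scale=0.1, size=(K, d + 1))  # expert regression weights
    V = np.zeros((K, d + 1))                    # gating-network weights
    s2 = np.ones(K)                             # expert noise variances
    for _ in range(n_iter):
        # E-step: responsibility of expert k for point i is proportional to
        # gate_k(x_i) * N(y_i | expert_k(x_i), s2_k)
        gate = softmax(Xb @ V.T)
        mu = Xb @ W.T
        lik = np.exp(-0.5 * (y[:, None] - mu) ** 2 / s2) / np.sqrt(2 * np.pi * s2)
        h = gate * lik
        h /= h.sum(axis=1, keepdims=True) + 1e-12
        # M-step: weighted least squares per expert, then a gradient step on the gate
        for k in range(K):
            hk = h[:, k]
            A = Xb.T @ (hk[:, None] * Xb) + 1e-6 * np.eye(d + 1)
            W[k] = np.linalg.solve(A, Xb.T @ (hk * y))
            r = y - Xb @ W[k]
            s2[k] = max(float((hk * r ** 2).sum() / hk.sum()), 1e-6)
        V += lr * (h - gate).T @ Xb / n
    return W, V, s2

def predict_me(X, W, V):
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return (softmax(Xb @ V.T) * (Xb @ W.T)).sum(axis=1)

In a time-series setting such as the climate experiments mentioned above, X would hold lagged observations and y the next value, with the number of experts K fixed before training, which is exactly the model-selection burden that RMCE is designed to remove.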
Mixture of experts (MoE) models are successful neural-network architectures fo...
In Gaussian mixture (GM) modeling, it is crucial to select the number of Gaussians for a sample data...
We consider the problem of prediction of stationary time series, using the archit...
The well-known mixtures of experts (ME) model has been used in many different areas to account for n...
Curve detection is a basic problem in image processing and remains a difficult problem. In this pape...
Today, there is growing interest in the automatic classification of a variety of tasks, such as weat...
We consider the Mixture of Experts (MoE) modeling for clustering heterogeneou...
Nowadays, there is growing interest in pattern recognition for tasks such as prediction of weathe...
Frequency prediction after a disturbance has received increasing research attention given its substa...
This paper presents a new approach to estimating mixture models based on a recent inference principl...
Mixtures-of-Experts models and their maximum likelihood estimation (MLE) via the EM algorithm have b...
Variable selection is fundamental to high-dimensional statistical modeling, an...
As for Gaussian mixture modeling, the key problem is to select the number of Gaussians in the mixtur...
This paper presents a maximum entropy framework for the aggregation of expert opinions where the exp...