A novel random-gradient-based algorithm is developed for online tracking of the minor component (MC) associated with the smallest eigenvalue of the autocorrelation matrix of the input vector sequence. Five available learning algorithms for tracking a single MC are extended to track multiple MCs, i.e., the minor subspace (MS). To overcome the dynamical divergence of some available random-gradient-based algorithms, we propose a modification of the Oja-type algorithms, called OJAm, which works satisfactorily. The averaging differential equation and the energy function associated with OJAm are given. It is shown that the averaging differential equation globally asymptotically converges to an invariant set. T...
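The abstract above describes Oja-type random-gradient rules for minor component tracking and the divergence problem that motivates explicit stabilization. As a minimal sketch of the idea (not the OJAm update itself, whose exact form is not given here), the classic anti-Hebbian Oja MCA rule with per-step renormalization illustrates one such stochastic-gradient tracker; the synthetic input stream and step size below are illustrative assumptions:

```python
import numpy as np

def track_minor_component(samples, eta=0.01, seed=0):
    """Online minor-component tracking via the anti-Hebbian Oja rule.

    Per sample x: y = w.x, then w <- w - eta*(y*x - y^2*w), renormalized.
    This is stochastic gradient descent on the Rayleigh quotient w'Cw on
    the unit sphere, so for small eta the weight vector w converges toward
    the eigenvector of C = E[x x'] with the smallest eigenvalue. The
    renormalization step plays the stabilizing role that OJAm builds into
    its update; without it, this rule can diverge.
    """
    rng = np.random.default_rng(seed)
    n = samples.shape[1]
    w = rng.standard_normal(n)
    w /= np.linalg.norm(w)
    for x in samples:
        y = w @ x
        w -= eta * (y * x - (y * y) * w)  # anti-Hebbian (sign-flipped Oja) step
        w /= np.linalg.norm(w)            # explicit stabilization on the unit sphere
    return w

# Synthetic zero-mean stream with autocorrelation matrix C = diag(4, 2, 0.5);
# the minor component of C is the third coordinate axis.
rng = np.random.default_rng(1)
X = rng.standard_normal((30000, 3)) * np.sqrt([4.0, 2.0, 0.5])
w = track_minor_component(X)
# w aligns (up to sign) with the minor eigenvector e3.
```

The same skeleton extends to minor subspace tracking by replacing the vector w with an n-by-r matrix and the scalar renormalization with an orthonormalization step.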
In this letter, we propose a class of self-stabilizing learning algorithms for minor component analy...
A fast algorithm is proposed for optimal supervised learning in multiple-layer neural networks. The ...
This paper provides a performance analysis of a least mean square (LMS) dominant invarian...
This brief deals with the problem of minor component analysis (MCA). Artificial neural networks can ...
The eigenvector associated with the smallest eigenvalue of the autocorrelation matrix of input signa...
We introduce a novel information criterion (NIC) for searching for the optimum weights of a two-laye...
The dual purpose principal and minor subspace gradient flow can be used to track principal...
Minor component analysis (MCA) is an important statistical tool for signal processing and data analy...
This paper introduces a new algorithm for tracking the minor subspace of the correlation matrix asso...
A principal component analysis (PCA) neural network is developed for online extraction of th...
Minor subspace extraction is concerned with extracting multiple minor components from an autocorrela...
Learning expressive probabilistic models correctly describing the data is a ub...
We propose a new self-organizing net based on the principle of Least Mean Square Error Reconstructi...
This paper studies the asymptotic properties of multilayer neural network models used...
The stability of minor component analysis (MCA) learning algorithms is an important problem in many ...