We propose the use of divergences in gradient descent learning for supervised and unsupervised vector quantization as an alternative to the squared Euclidean distance. The approach is based on the determination of the Fréchet derivatives of the divergences, which can be plugged directly into the online learning rules. We provide the mathematical foundation of the respective framework. This framework covers the usual gradient descent learning of prototypes as well as parameter optimization and relevance learning for improved performance.
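The abstract above describes replacing the squared Euclidean distance in online vector quantization with a divergence whose derivative with respect to the prototype drives the update. A minimal sketch of that idea, using the generalized Kullback-Leibler divergence as one concrete choice (the function names and the winner-take-all scheme here are illustrative assumptions, not the authors' code):

```python
import numpy as np

def gkl_divergence(x, w):
    # Generalized Kullback-Leibler divergence D(x || w) for positive vectors.
    return float(np.sum(x * np.log(x / w) - x + w))

def gkl_grad_w(x, w):
    # Derivative of D(x || w) with respect to the prototype w,
    # playing the role of the Fréchet derivative in the update rule.
    return 1.0 - x / w

def online_vq_step(x, prototypes, lr=0.05):
    # Winner-take-all online step: find the prototype closest to x
    # under the divergence and move it along the negative gradient.
    dists = [gkl_divergence(x, w) for w in prototypes]
    k = int(np.argmin(dists))
    prototypes[k] -= lr * gkl_grad_w(x, prototypes[k])
    np.clip(prototypes[k], 1e-8, None, out=prototypes[k])  # keep entries positive
    return k
```

Swapping in another divergence only requires exchanging `gkl_divergence` and its gradient; the update rule itself is unchanged, which is the point of the framework.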
This paper analyses the Contrastive Divergence algorithm for learning statistical parameters. We rel...
Schneider P, Biehl M, Hammer B. Distance learning in discriminative vector quantization. Neural Comp...
Learning vector quantization (LVQ) is one of the most powerful approaches for prototype based classi...
Villmann T, Haase S, Schleif F-M, Hammer B. Divergence Based Online Learning in Vector Quantization....
Villmann T, Haase S, Schleif F-M, Hammer B, Biehl M. The Mathematics of Divergence Based Online Lear...
We propose relevance learning for an unsupervised online vector quantization algorithm based on stochas...
Mwebaze E, Schneider P, Schleif F-M, Haase S, Villmann T, Biehl M. Divergence based Learning Vector ...
We discuss the use of divergences in dissimilarity-based classification. Divergences can be employed...
Mwebaze E, Schneider P, Schleif F-M, et al. Divergence based classification in Learning Vector Quant...
This thesis gives a systematic analysis of divergence-based learning algorithms and le...
Vector Quantizers (VQ) can be exploited for classification. In particular the gradient of the error ...
Learning vector quantization applying non-standard metrics became quite popular for classification p...
Functional Bregman divergences are an important class of divergences in machine learning that genera...