We present a regularization method which extends the recently introduced Generalized Matrix LVQ (GMLVQ). This learning algorithm extends the concept of adaptive distance measures in LVQ to the use of relevance matrices. In general, relevance learning can display a tendency towards over-simplification in the course of training. An overly pronounced elimination of dimensions in feature space can have negative effects on the performance and may lead to instabilities in the training. Complementing the standard GMLVQ cost function with an appropriate regularization term prevents this unfavorable behavior and can help to improve the generalization ability. The approach is first tested and illustrated in terms of artificial model data. Furthermore we appl...
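As a rough illustration of the idea described above, the following sketch shows how a penalty term can be attached to a GMLVQ-style cost function; the notation (margin function $\Phi$, prototypes $w_J, w_K$, relevance matrix $\Lambda = \Omega^\top \Omega$, regularization strength $\mu$) follows common GMLVQ conventions, but the specific penalty shown here is an illustrative assumption rather than a reproduction of the authors' exact formulation.

% Sketch of a regularized matrix relevance learning cost (illustrative assumption).
% d_Lambda is the adaptive distance, E_GMLVQ the standard cost over examples xi_i
% with closest correct prototype w_J and closest incorrect prototype w_K.
\begin{align}
  d_\Lambda(\xi, w) &= (\xi - w)^\top \Omega^\top \Omega \, (\xi - w), \\
  E_{\mathrm{GMLVQ}} &= \sum_i \Phi\!\left(
      \frac{d_\Lambda(\xi_i, w_J) - d_\Lambda(\xi_i, w_K)}
           {d_\Lambda(\xi_i, w_J) + d_\Lambda(\xi_i, w_K)} \right), \\
  % Penalty discouraging a degenerate (rank-deficient) relevance matrix,
  % i.e. the over-pronounced elimination of feature dimensions:
  E_{\mathrm{reg}} &= E_{\mathrm{GMLVQ}} \;-\; \frac{\mu}{2}\,
      \ln \det\!\left(\Omega\, \Omega^\top\right).
\end{align}

In this sketch, $\mu \ge 0$ controls the strength of the regularization: $\mu = 0$ recovers the unregularized cost, while larger $\mu$ increasingly penalizes relevance matrices that collapse onto a low-dimensional subspace.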