We present a method that allows training a Generalized Matrix Learning Vector Quantization (GMLVQ) model for classification using data from several, possibly non-calibrated, sources without explicit transfer learning. This is achieved by a siamese-like GMLVQ architecture comprising separate sets of prototypes for the target classification task and for learning the separation of the sources. In this architecture, a linear map is trained by means of GMLVQ to distinguish the sources in the mapped space, in parallel to learning the classification task. The projection onto the null space of this map provides a common representation of the data from the different sources, in which the classifier can be learned jointly on all sources.
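To make the null-space projection concrete, the following is a minimal sketch, not the paper's implementation: it assumes a source-discrimination matrix (here hypothetically named `omega_src`) has already been learned, and shows that projecting data onto its null space removes exactly the directions along which the sources can be told apart.

```python
import numpy as np
from scipy.linalg import null_space

# Hypothetical names: `omega_src` stands in for the linear map learned by
# GMLVQ for source discrimination; `X` holds samples (one per row).
def project_to_common_space(X, omega_src, rcond=1e-8):
    """Project data onto the null space of the source-discrimination map,
    yielding a representation in which the sources are indistinguishable."""
    N = null_space(omega_src, rcond=rcond)  # columns span ker(omega_src)
    P = N @ N.T                             # orthogonal projector onto the null space
    return X @ P                            # common representation of all source data

# Illustration with random stand-in data (not from the paper)
rng = np.random.default_rng(0)
omega_src = rng.normal(size=(2, 5))         # rank-2 map on 5-dimensional data
X = rng.normal(size=(10, 5))
X_common = project_to_common_space(X, omega_src)

# In the projected space the discrimination map vanishes, so no source
# information remains for the subsequent target-classification learning.
assert np.allclose(X_common @ omega_src.T, 0.0, atol=1e-8)
```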