This paper proposes two new training algorithms for multilayer perceptrons based on evolutionary computation, regularization, and transduction. Regularization is a commonly used technique for preventing a learning algorithm from overfitting the training data. In this context, this work introduces and analyzes a novel regularization scheme for neural networks (NNs), named eigenvalue decay, which aims at improving the classification margin. The introduction of eigenvalue decay led to the development of a new training method based on the same principles as the SVM, hence named Support Vector NN (SVNN). Finally, by analogy with the transductive SVM (TSVM), a transductive NN (TNN) is proposed, exploiting the SVNN in order to address transductive...
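As a rough illustration of the idea behind eigenvalue decay, the sketch below penalizes the dominant eigenvalue of WW^T (i.e., the squared spectral norm of a weight matrix W) on top of an ordinary data loss. The function name, the constant C, and the toy regression data are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def eigenvalue_decay_penalty(W, C=0.1):
    # The dominant eigenvalue of W W^T equals the squared largest
    # singular value (spectral norm) of W; penalizing it discourages
    # large weight amplification along any single direction.
    lam_max = np.linalg.eigvalsh(W @ W.T).max()
    return C * lam_max

# Toy example: squared-error loss on random data plus the penalty.
rng = np.random.default_rng(0)
W = rng.standard_normal((3, 5))   # hypothetical hidden-layer weights
X = rng.standard_normal((5, 10))  # inputs
y = rng.standard_normal((3, 10))  # targets
data_loss = np.mean((W @ X - y) ** 2)
total_loss = data_loss + eigenvalue_decay_penalty(W)
```

In practice such a penalty would be added to the training objective and minimized jointly with the data loss, analogously to how weight decay penalizes the Frobenius norm of W.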