In this paper, we discuss the implementation of a wrapper-based neural network feature selection approach, introduced here as the Weight Cascaded Retraining (WCR) algorithm. The first part of the paper outlines the algorithm and elaborates on its formal underpinnings. Central to the feature pruning approach is an iterative, guided optimization in which the optimized weight vector of one iteration is passed on to the next. This gives rise to a cascaded form of neural network retraining. In the second part of the paper, the theoretical exposition of the WCR algorithm is illustrated and benchmarked on publicly available UCI case material. It is illustrated that WCR-based neural n...
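To make the cascading idea concrete, the following minimal sketch shows one plausible reading of such a loop: a model is trained, the apparently weakest feature is pruned, and the surviving weights warm-start the next retraining round. This is not the WCR algorithm as defined in the paper; the single logistic unit standing in for the network, the magnitude-based pruning rule, and the synthetic data are illustrative assumptions only.

# Illustrative sketch of a cascaded-retraining feature pruning loop.
# Assumptions (not from the paper): a single logistic unit replaces the
# neural network, plain gradient descent is the trainer, and the pruning
# criterion is the smallest absolute input weight.
import numpy as np

rng = np.random.default_rng(0)

def train(X, y, w, epochs=200, lr=0.1):
    """Gradient-descent training of a logistic unit, warm-started from w."""
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))      # sigmoid activation
        w -= lr * X.T @ (p - y) / len(y)      # gradient step
    return w

# Synthetic data: only the first two of five features carry signal.
X = rng.normal(size=(200, 5))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)

active = list(range(X.shape[1]))              # surviving feature indices
w = np.zeros(len(active))                     # initial weight vector

while len(active) > 2:
    w = train(X[:, active], y, w)             # retrain on the current subset
    drop = int(np.argmin(np.abs(w)))          # weakest feature by |weight|
    print(f"pruning feature {active[drop]}, weights {np.round(w, 2)}")
    active.pop(drop)
    w = np.delete(w, drop)                    # cascade: keep remaining weights

print("selected features:", active)

The essential point carried over from the text is the warm start: each retraining round inherits the optimized weights of the previous round rather than starting from scratch, which is what makes the retraining cascaded.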