We show that delayed feedback neural networks for storing limit cycles can be trained using a global training algorithm. The storage capacity of these networks is found to be proportional to the delay length, as in networks trained by correlation learning based on Hebb's rule, but is much higher than that of the latter. The generalization capacity of the networks is also higher than that of the latter. Another interesting finding is that spurious states, or unwanted attractors, disappear entirely in networks trained by the global training algorithm provided the memorized limit cycles are sufficiently long. The dynamics of the networks is investigated as a function of the length of the limit cycles.
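To make the setting concrete, the following is a minimal sketch of the correlation learning based on Hebb's rule that the global algorithm is compared against, written under assumed notation: a cycle of P binary patterns is stored via one weight matrix per delay tap, and retrieval feeds the D most recent states back through those matrices. The parameters N, P, D, the weight formula, and the update rule are illustrative assumptions, not the paper's own construction or its global training algorithm.

```python
import numpy as np

# Minimal sketch, not the paper's global training algorithm: a Hebb-type
# correlation rule for storing one limit cycle of P binary patterns in a
# network with D delayed feedback taps.  N, P, D, the weight formula and
# the update rule are illustrative assumptions.

rng = np.random.default_rng(0)
N, P, D = 100, 6, 3                          # neurons, cycle length, delay taps
xi = rng.choice([-1, 1], size=(P, N))        # random +/-1 patterns forming the cycle

# One weight matrix per delay d (d = 0 means "one step back"):
# W[d][i, j] correlates the next pattern xi(t+1) with the delayed pattern xi(t-d).
W = np.zeros((D, N, N))
for d in range(D):
    for t in range(P):
        W[d] += np.outer(xi[(t + 1) % P], xi[(t - d) % P]) / N

# Retrieval: the new state is driven by the D most recent states through the
# delayed weight matrices; the stored cycle should be reproduced step by step.
history = [xi[(-d) % P] for d in range(D)]   # s(t), s(t-1), ..., s(t-D+1) at t = 0
for t in range(P):
    field = sum(W[d] @ history[d] for d in range(D))
    state = np.where(field >= 0, 1, -1)
    target = xi[(t + 1) % P]
    print(f"step {t}: overlap with stored pattern = {state @ target / N:+.2f}")
    history = [state] + history[:-1]
```

With these assumed sizes the printed overlaps stay close to +1.00, i.e. the Hebb-rule weights reproduce the stored cycle; the abstract's claim is that a global training algorithm achieves this for many more and longer cycles per delay tap.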