Abstract – Training a neural network is a difficult optimization problem because of its numerous local minima. As an alternative to local search algorithms, many global search algorithms have been used to train neural networks. However, local search algorithms use computational resources more efficiently, so numerous random restarts of a local algorithm may be more effective than a global algorithm at obtaining a low value of the objective function. This study uses Monte-Carlo simulations to determine the efficiency of a local search algorithm relative to 9 stochastic global algorithms: 2 simulated annealing algorithms, 1 simple random stochastic algorithm, 1 genetic algorithm and 5 evolutionary strategy algorithms. The comput...
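The restart strategy this abstract compares against global search can be sketched as follows. This is a minimal illustration, not the study's actual procedure; the greedy local minimizer, function names, and parameters are all illustrative assumptions.

```python
import random

def local_search(f, x0, step=0.1, iters=500, seed=0):
    """Greedy local descent: accept a random step only if it improves f."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)
        fc = f(cand)
        if fc < fx:
            x, fx = cand, fc
    return x, fx

def random_restarts(f, n_starts=50, lo=-3.0, hi=3.0, seed=0):
    """Run the local search from many random starting points; keep the best result."""
    rng = random.Random(seed)
    best = None
    for i in range(n_starts):
        x0 = rng.uniform(lo, hi)
        x, fx = local_search(f, x0, seed=i)
        if best is None or fx < best[1]:
            best = (x, fx)
    return best

# A multimodal test function with many local minima; global minimum at x = 0.
import math
x, fx = random_restarts(lambda x: x**2 + 2 * math.sin(8 * x)**2)
```

Each restart is cheap, so the relevant comparison is the total number of function evaluations spent by all restarts versus a single run of a global algorithm.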
Random cost simulations were introduced as a method to investigate optimization problems in systems...
In recent decades, research on optimizing the parameters of the artificial neural network (ANN) mod...
In view of several limitations of gradient search techniques (e.g. backpropagation), global search t...
Training a neural network is a difficult optimization problem because of its numerous local minima. M...
Neural network learning is the essence of an ANN. There are many problems associated with the mult...
Abstract—This paper presents a new method that integrates tabu search, simulated annealing, genetic...
In this thesis, a new global optimization technique and its applications, in particular to neural networ...
Simulated Annealing is a meta-heuristic that performs a randomized local search to reach near-optima...
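The randomized local search with temperature-controlled acceptance that this snippet describes can be sketched as below. This is a generic single-variable sketch, not the cited work's implementation; the step size, cooling schedule, and test function are illustrative assumptions.

```python
import math
import random

def simulated_annealing(f, x0, step=0.5, t0=1.0, cooling=0.995, iters=2000, seed=0):
    """Minimize f by randomized local search with a geometrically cooling temperature."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best_x, best_fx = x, fx
    t = t0
    for _ in range(iters):
        # Propose a random neighbor of the current point.
        cand = x + rng.uniform(-step, step)
        fc = f(cand)
        # Always accept improvements; accept worse moves with probability exp(-delta/T),
        # which lets the search escape local minima while T is still high.
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < best_fx:
                best_x, best_fx = x, fx
        t *= cooling  # lower the temperature each iteration
    return best_x, best_fx

# Example: a one-dimensional function with several local minima.
x, fx = simulated_annealing(lambda x: (x**2 - 1)**2 + 0.3 * math.sin(5 * x), x0=3.0)
```

As the temperature falls, the acceptance probability for uphill moves shrinks toward zero and the method degenerates into a plain local search, which is why the cooling schedule largely determines solution quality.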
Artificial Neural Networks have gained popularity in recent years because of their ability to approx...
Approaches combining genetic algorithms and neural networks have received a great deal of attention ...
It has been demonstrated that genetic algorithms (GAs) can help search the global (or near global) o...
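The GA search that these snippets credit with finding global (or near-global) optima can be sketched for a real-valued objective as follows. This is a minimal sketch assuming tournament selection, blend crossover, Gaussian mutation, and elitism; none of these operator choices or parameters come from the cited works.

```python
import math
import random

def genetic_minimize(f, n_pop=30, generations=100, lo=-5.0, hi=5.0,
                     mut_rate=0.2, mut_scale=0.3, seed=0):
    """Real-valued GA: tournament selection, blend crossover, Gaussian mutation, elitism."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi)] for _ in range(n_pop)]  # one gene per individual
    for _ in range(generations):
        scored = sorted(pop, key=f)
        children = list(scored[:2])  # elitism: carry the two best forward unchanged
        while len(children) < n_pop:
            # Tournament selection: best of 3 random individuals, twice.
            p1 = min(rng.sample(pop, 3), key=f)
            p2 = min(rng.sample(pop, 3), key=f)
            # Blend crossover: random convex combination of the parents.
            a = rng.random()
            child = [a * g1 + (1 - a) * g2 for g1, g2 in zip(p1, p2)]
            # Gaussian mutation on each gene with probability mut_rate.
            child = [g + rng.gauss(0, mut_scale) if rng.random() < mut_rate else g
                     for g in child]
            children.append(child)
        pop = children
    best = min(pop, key=f)
    return best, f(best)

# Minimize a simple multimodal function of one variable.
best, fbest = genetic_minimize(lambda v: v[0]**2 + math.sin(3 * v[0]))
```

In the neural-network setting the genome would be the flattened weight vector rather than a single gene, but the selection/crossover/mutation loop is the same.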