Function approximation is an important task in settings where computation must rely on information extracted from data samples of real-world processes. Neural networks and wavenets have recently attracted attention as tools for building efficient solutions to many real-world function approximation problems. This paper shows how feedforward neural networks can be built using a different type of activation function, referred to as the PPS-wavelet. An algorithm is presented for generating a family of PPS-wavelets that can be used to efficiently construct feedforward networks for function approximation.
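As a rough illustration of the general idea (not the paper's PPS-wavelet algorithm), the sketch below trains a single-hidden-layer feedforward network whose hidden units use a wavelet-shaped activation to approximate a 1-D target function. The Mexican-hat function, the network size, and the plain gradient-descent training loop are all assumptions chosen only to make the example self-contained; the actual PPS-wavelet family and construction procedure are defined in the paper.

```python
# Minimal sketch, assuming a Mexican-hat activation as a stand-in for the
# PPS-wavelet: a single-hidden-layer feedforward network trained by plain
# gradient descent to approximate a 1-D function from samples.
import numpy as np

rng = np.random.default_rng(0)

def mexican_hat(x):
    # psi(x) = (1 - x^2) * exp(-x^2 / 2), a common wavelet-shaped activation
    return (1.0 - x**2) * np.exp(-0.5 * x**2)

def mexican_hat_grad(x):
    # d/dx [(1 - x^2) * exp(-x^2/2)] = (x^3 - 3x) * exp(-x^2/2)
    return (x**3 - 3.0 * x) * np.exp(-0.5 * x**2)

# Target function to approximate from sampled data
f = lambda x: np.sin(3 * x) + 0.5 * np.cos(7 * x)
X = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
Y = f(X)

H = 20                                    # number of hidden (wavelet) units
W1 = rng.normal(0.0, 1.0, (1, H))         # input -> hidden weights (dilations)
b1 = rng.normal(0.0, 1.0, (1, H))         # hidden biases (translations)
W2 = rng.normal(0.0, 0.1, (H, 1))         # hidden -> output weights
b2 = np.zeros((1, 1))

lr = 0.01
for epoch in range(5000):
    Z = X @ W1 + b1                       # hidden pre-activations
    A = mexican_hat(Z)                    # wavelet-activated hidden layer
    Yhat = A @ W2 + b2                    # linear output layer
    err = Yhat - Y                        # residual
    # Backpropagation for the mean squared error loss
    gW2 = A.T @ err / len(X)
    gb2 = err.mean(axis=0, keepdims=True)
    dZ = (err @ W2.T) * mexican_hat_grad(Z)
    gW1 = X.T @ dZ / len(X)
    gb1 = dZ.mean(axis=0, keepdims=True)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

print("final MSE:", float(np.mean(err**2)))
```

In the paper's setting, the Mexican-hat function would be replaced by a member of the generated PPS-wavelet family, with the same feedforward structure of dilations, translations, and a linear output layer.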
The authors present a class of efficient algorithms for PAC learning continuous functions and regres...
Research into Wavelet Neural Networks was conducted on numerous occasions in the past. Based on prev...
In this paper we develop a theoretical description of standard feedforward neural networks in terms o...
Wavelet functions have been successfully used in many problems as the activation function of feedfor...
Neural networks and wavelet transform have been recently seen as attractive tools for developing efi...
Abstract:- Function approximation, which finds the underlying relationship from a given finite input...
The main purpose of this paper is to investigate theoretically and experimentally the use of family ...
A new architecture based on wavelets and neural networks is proposed and implemented for learning a ...
This paper proposes a comparison between wavelet neural networks (WNN), RBF neural network and polyn...
Neural networks are massively parallel, distributed processing systems representing a new comput...
The wavelet network has been introduced as a special feedforward neural network supported by the wav...
Wavelet functions have been used as the activation function in feedforward neural networks. An abund...
We compare activation functions in terms of the approximation power of their feedforward nets. We co...
In this dissertation, we have investigated the representational power of multilayer feedforward neur...