The human brain’s ability to extract information from multidimensional data is modeled by the Nonlinear Line Attractor (NLA) network, in which nodes are connected by polynomial weight sets. This architecture assumes that each neuron is fully connected to all other neurons, creating a huge web of connections. We envision instead that each neuron should be connected only to a group of surrounding neurons, with connection strengths that decrease with distance from the neuron. To develop the weighted NLA architecture, we use a Gaussian weighting strategy to model this proximity, which also reduces computation time significantly. Once all data has been trained in the NLA network, the weight set can be reduced using a locality preserving nonlin...
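The Gaussian weighting idea in this abstract can be sketched as follows: each pairwise connection strength falls off with the distance between neurons. This is a minimal illustration, not the paper's implementation; the neuron positions, the `sigma` parameter, and the function name are assumptions for the example.

```python
import numpy as np

def gaussian_weights(positions, sigma=1.0):
    """Weight each pairwise neuron connection by a Gaussian of inter-neuron distance.

    positions: (n, d) array of neuron coordinates.
    Returns an (n, n) matrix with entries near 1 for nearby neurons,
    decaying toward 0 as the distance grows.
    """
    diff = positions[:, None, :] - positions[None, :, :]
    dist_sq = np.sum(diff ** 2, axis=-1)
    return np.exp(-dist_sq / (2.0 * sigma ** 2))

# Three neurons on a line at x = 0, 1, 3
pos = np.array([[0.0], [1.0], [3.0]])
W = gaussian_weights(pos, sigma=1.0)
```

Multiplying a full weight set elementwise by such a matrix suppresses distant connections, which is one way the proximity-based reduction described above could cut computation.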
Finding useful representations of data in order to facilitate scientific knowledge generation is a u...
Data mining techniques have become extremely important with the proliferation of data. One technique...
An attractor neural network on the small-world topology is studied. A learning pattern is presented...
The human brain has the capability to process high quantities of data quickly for detection and reco...
Development of a mathematical model for learning a nonlinear line of attraction is presented in this...
Artificial neural networks are an area of research that has been explored extensively. With the for...
Neural Networks have become increasingly popular in recent years due to their ability to accurately ...
Attractor networks are widely believed to underlie the memory systems of animals across different sp...
Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Compute...
Line attractor networks have become standard workhorses of computational accounts of neural populati...
The analysis is restricted to the features of neural networks endowed to the latter by the inborn (n...
An overview of neural networks, covering multilayer perceptrons, radial basis functions, constructiv...
A conventional view of information processing by line (manifold) attractor networks holds that they ...
A nonlinear recurrent neural network is trained to synthesize chaotic signals. The identification pr...
The size of the basins of attraction around fixed points in recurrent neural nets (NNs) can be modif...