<p><b>A)</b> Comparison between the inferred nonlinearity in the range of energies observed in the dataset and the log of the density of states at the same energies, showing the increasing match between the two quantities as the population size, <i>N</i>, increases. Both axes are normalized by the population size so that all curves have a similar scale. Nonlinearity can be shifted by an arbitrary constant without changing the model; to remove this redundancy, we set <i>V</i> (0) = 0 for all nonlinearities. <b>B)</b> The population size dependence of the average squared distance between the density of states and the inferred nonlinearity. Since the nonlinearity can be shifted by an arbitrary constant, we chose this offset so as to minimize t...
The dynamic range of stimulus processing in living organisms is much larger than a single neural net...
Neural scaling laws (NSL) refer to the phenomenon where model performance improves with scale. Sharm...
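Although the abstract is truncated here, neural scaling laws are conventionally written as a power law, L(N) ∝ N^(−α), for loss L against scale N. A minimal sketch of recovering the exponent from loss-vs-scale data on log-log axes (the numbers below are synthetic, for illustration only):

```python
import numpy as np

# Hypothetical loss-vs-parameter-count data following L(N) = a * N**(-alpha);
# the exponent 0.076 is an assumed value, not one taken from the abstract.
N = np.array([1e6, 1e7, 1e8, 1e9])
L = 5.0 * N ** (-0.076)

# On log-log axes the power law is linear: log L = log a - alpha * log N,
# so the fitted slope gives -alpha.
slope, intercept = np.polyfit(np.log(N), np.log(L), 1)
est_alpha = -slope
print(round(est_alpha, 3))  # recovers 0.076
```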
<p><b>A</b> Downscaling facilitates simulations, while taking the <i>N</i> → ∞ limit often affords a...
It took until the last decade to finally see a machine match human performance on essentially any ta...
We consider an oriented network in which two subgraphs (modules X and Y), with a given intra-modular...
<p>A. The nonlinearities in the LN model framework for a GS (red) ( pS/µm<sup>2</sup> and pS/µm<sup...
<p>Example of a toy-network illustrating that the degree to which any given metric of neuron embedde...
A simple expression for a lower bound of Fisher information is derived for a network of recurrently ...
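The specific bound derived in this abstract is cut off, but a generic illustration of the kind of quantity involved is the linear Fisher information, I = f′(s)ᵀ Σ⁻¹ f′(s), often used as a tractable lower bound on the full Fisher information of a population. A sketch with hypothetical tuning-curve derivatives and noise covariance (not the paper's own expression):

```python
import numpy as np

# Hypothetical tuning-curve derivatives f'(s) and noise covariance Sigma
# for a small population; purely illustrative numbers.
fp = np.array([1.0, -0.5, 0.8])
Sigma = np.array([[1.0, 0.2, 0.1],
                  [0.2, 1.0, 0.3],
                  [0.1, 0.3, 1.0]])

# Linear Fisher information: I = f'(s)^T Sigma^{-1} f'(s).
# Using solve() avoids explicitly inverting Sigma.
I_lin = float(fp @ np.linalg.solve(Sigma, fp))
print(round(I_lin, 3))
```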
Many complex systems of great interest (ecologies, economies, immune systems, etc.) can be described a...
The dynamic range quantifies the range of inputs that a neural network can discriminate. It is maxim...
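The dynamic range mentioned here is conventionally defined (following the Kinouchi–Copelli convention) as Δ = 10 log₁₀(s₀.₉ / s₀.₁), where s₀.₁ and s₀.₉ are the stimulus intensities at which the response crosses 10% and 90% of its full range. A minimal sketch on a toy saturating response curve (the curve itself is hypothetical):

```python
import numpy as np

def dynamic_range(stimuli, responses):
    """Delta = 10*log10(s_90 / s_10), where s_10 and s_90 are the stimuli
    at which the (monotonic) response crosses 10% and 90% of its range."""
    s = np.asarray(stimuli, dtype=float)
    r = np.asarray(responses, dtype=float)
    r_min, r_max = r.min(), r.max()
    r10 = r_min + 0.10 * (r_max - r_min)
    r90 = r_min + 0.90 * (r_max - r_min)
    # responses are assumed monotonically increasing in stimulus,
    # so we can invert the curve by interpolation
    s10 = np.interp(r10, r, s)
    s90 = np.interp(r90, r, s)
    return 10.0 * np.log10(s90 / s10)

# Toy Hill-type response curve saturating at 1 (illustrative only)
s = np.logspace(-3, 1, 200)   # stimulus intensities over four decades
F = s / (s + 0.1)
delta = dynamic_range(s, F)
print(round(delta, 2))
```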
The dynamics of a probabilistic neural network is characterized by the distribution ν(x′|x) of...
Abstract: We investigate the structure of the loss function landscape for neural networks subject to...
<p>The linear response for linear neurons depends on the network structure both explicitly and impli...
A. The maximal Lyapunov exponent (MLE) as a function of the projection width (σi) and the time const...
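The maximal Lyapunov exponent referenced in this panel can, for any one-dimensional map, be estimated by averaging the log of the local derivative along an orbit. A self-contained sketch on the logistic map (a standard textbook example, not the network model from the figure; at r = 4 the exact value is ln 2):

```python
import math

def logistic_mle(r, x0=0.2, n_transient=1000, n_iter=100_000):
    """Estimate the maximal Lyapunov exponent of the logistic map
    x -> r*x*(1-x) by averaging ln|f'(x)| = ln|r*(1 - 2x)| along an orbit."""
    x = x0
    # discard the transient so the average is taken on the attractor
    for _ in range(n_transient):
        x = r * x * (1 - x)
    acc = 0.0
    for _ in range(n_iter):
        x = r * x * (1 - x)
        acc += math.log(abs(r * (1 - 2 * x)))
    return acc / n_iter

mle = logistic_mle(4.0)
print(round(mle, 3))  # close to ln 2 ~ 0.693
```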
In this note, a correlation metric $\kappa_c$ is proposed which is based on the universal behavior o...