Advanced deep learning architectures consist of tens of fully connected and convolutional hidden layers, already extended to hundreds, and are far from their biological realization. Their biologically implausible dynamics stem from the backpropagation technique, which changes a weight in a non-local manner, since the number of routes between an output unit and a given weight is typically large. Here, offline and online learning of the CIFAR-10 database on 3-layer tree architectures, inspired by experimentally observed dendritic tree adaptations, outperforms the achievable success rates of the 5-layer convolutional LeNet. Its highly pruned tree backpropagation procedure, in which a single route connects an output unit and a weight, represents an efficie...
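The single-route property described above lends itself to a compact illustration. The sketch below is not the paper's code; the branch count, layer widths, and class count are illustrative assumptions. It builds a small tree-structured network in PyTorch where disjoint input patches feed independent branches and only the root layer is fully connected, so the gradient reaching any branch weight travels exactly one route from a given output unit.

```python
import torch
import torch.nn as nn

# Minimal sketch (assumed sizes) of a tree architecture: no hidden unit is
# shared between branches, so each weight has a single route to an output.
class TreeNet(nn.Module):
    def __init__(self, in_features=3072, branches=16, hidden=8, classes=10):
        super().__init__()
        self.patch = in_features // branches
        self.stage1 = nn.ModuleList(nn.Linear(self.patch, hidden)
                                    for _ in range(branches))
        self.stage2 = nn.ModuleList(nn.Linear(hidden, 1)
                                    for _ in range(branches))  # one top unit per branch
        self.root = nn.Linear(branches, classes)               # fully connected root

    def forward(self, x):                    # x: (batch, in_features)
        chunks = x.split(self.patch, dim=1)  # non-overlapping input patches
        tops = [torch.relu(s2(torch.relu(s1(c))))
                for s1, s2, c in zip(self.stage1, self.stage2, chunks)]
        return self.root(torch.cat(tops, dim=1))

net = TreeNet()
out = net(torch.randn(4, 3072))
out.sum().backward()  # each branch weight receives gradient along one route
```

Because the branches share no units, pruning dense connectivity down to a tree is what collapses the many backpropagation routes of a fully connected network into one per weight.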
Bayesian optimization (BO) is a popular paradigm for global optimization of expensive black-box func...
Error backpropagation is a highly effective mechanism for learning high-quality hierarchical feature...
[Figure caption] Panel A: Schematic of a sparsely connected network with 3 hidden layers. The output layer is fully connect...
The realization of complex classification tasks requires training of deep learning (DL) architecture...
An underlying mechanism for successful deep learning (DL) with a limited deep architecture and datas...
In this era of artificial intelligence, deep neural networks like Convolutional Neural Networks (CNN...
Training deep neural networks with the error backpropagation algorithm is considered implausible fro...
Learning classification tasks of (2^n × 2^n) inputs typically consists of ≤ n (2×2) max-pooling (MP) ...
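The bound quoted in this snippet follows from a simple halving argument: each (2×2) max-pooling divides both spatial dimensions by two, so a (2^n × 2^n) input supports at most n such poolings before the feature map reaches 1×1. A minimal check, with the illustrative choice n = 5 (matching, e.g., 32×32 CIFAR images):

```python
import torch
import torch.nn as nn

# Each (2x2) max-pooling halves both spatial dimensions, so a (2^n x 2^n)
# input admits at most n poolings before shrinking to 1x1. Here n = 5.
n = 5
x = torch.randn(1, 1, 2**n, 2**n)     # (batch, channels, 32, 32)
pool = nn.MaxPool2d(kernel_size=2)
for step in range(1, n + 1):
    x = pool(x)
    print(step, tuple(x.shape[-2:]))  # (16,16) (8,8) (4,4) (2,2) (1,1)
```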
The paper characterizes classes of functions for which deep learning can be exponentially better tha...
Although the Capsule Network is powerful at defining the positional relationship between features in dee...
In this work, we suggest Kernel Filtering Linear Overparameterization (KFLO), where a linear cascade...
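The KFLO construction itself is cut off above, so the following is only a generic sketch of the linear-cascade idea the snippet alludes to, not KFLO's actual method: stacking linear (activation-free) convolutions overparameterizes training, yet the cascade collapses into a single equivalent kernel for inference. All layer shapes here are assumptions.

```python
import torch
import torch.nn as nn

# Two linear 1x1 convolutions used as a cascade during training (no
# nonlinearity between them), collapsed into one kernel for inference.
a = nn.Conv2d(3, 8, kernel_size=1, bias=False)  # linear expansion
b = nn.Conv2d(8, 4, kernel_size=1, bias=False)  # linear projection

# Composing two linear maps is a matrix product of their kernels.
wa = a.weight.flatten(1)          # (8, 3)
wb = b.weight.flatten(1)          # (4, 8)
merged = nn.Conv2d(3, 4, kernel_size=1, bias=False)
with torch.no_grad():
    merged.weight.copy_((wb @ wa).view(4, 3, 1, 1))

x = torch.randn(2, 3, 5, 5)
print(torch.allclose(b(a(x)), merged(x), atol=1e-5))  # True
```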
The brain can efficiently learn a wide range of tasks, motivating the search for biologically inspir...
Deep neural networks follow a pattern of connectivity that was loosely inspired by neurobiology. The...
Deep learning (DL) is playing an increasingly important role in our lives. It has already made a hug...
The design of deep neural networks remains somewhat of an art rather than precise science. By tentat...