We propose StitchNet, a novel neural network creation paradigm that stitches together fragments (one or more consecutive network layers) from multiple pre-trained neural networks. StitchNet allows the creation of high-performing neural networks without the large compute and data requirements needed under traditional model creation processes via backpropagation training. We leverage Centered Kernel Alignment (CKA) as a compatibility measure to efficiently guide the selection of these fragments in composing a network for a given task tailored to specific accuracy needs and computing resource constraints. We then show that these fragments can be stitched together to create neural networks with accuracy comparable to that of traditionally train...
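The compatibility measure referenced above, linear Centered Kernel Alignment (CKA), can be computed directly from two layers' activation matrices. A minimal pure-Python sketch, assuming the standard linear-CKA formula (Kornblith et al., 2019); the activation values below are made-up illustration data, not taken from the paper:

```python
# Linear CKA between two activation matrices X (n x p) and Y (n x q),
# where each row is one input example's activations at a layer.
# CKA(X, Y) = ||X^T Y||_F^2 / (||X^T X||_F * ||Y^T Y||_F) after
# centering each feature column. Pure Python for self-containment.

def center_columns(m):
    """Subtract each column's mean (center the features)."""
    n = len(m)
    means = [sum(row[j] for row in m) / n for j in range(len(m[0]))]
    return [[row[j] - means[j] for j in range(len(row))] for row in m]

def matmul_t(a, b):
    """Compute a^T @ b for row-major matrices a (n x p), b (n x q)."""
    p, q, n = len(a[0]), len(b[0]), len(a)
    return [[sum(a[k][i] * b[k][j] for k in range(n)) for j in range(q)]
            for i in range(p)]

def fro_norm(m):
    """Frobenius norm: sqrt of the sum of squared entries."""
    return sum(v * v for row in m for v in row) ** 0.5

def linear_cka(x, y):
    """Similarity in [0, 1]; 1 means perfectly aligned representations."""
    x, y = center_columns(x), center_columns(y)
    num = fro_norm(matmul_t(x, y)) ** 2
    return num / (fro_norm(matmul_t(x, x)) * fro_norm(matmul_t(y, y)))

# Hypothetical activations of three inputs at a candidate stitch point.
acts = [[0.2, 1.1], [0.9, -0.3], [1.5, 0.4]]
print(round(linear_cka(acts, acts), 6))  # -> 1.0 (identical activations)
```

Because CKA is invariant to isotropic scaling and orthogonal transforms of the features, it can score how well one fragment's output representation matches another fragment's expected input without any gradient computation, which is what makes the fragment search cheap relative to backpropagation training.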
In this paper, we propose to regularize deep neural nets with a new type of multitask learning where...
Living neural networks emerge through a process of growth and self-organization that begins with a s...
A relaxed group-wise splitting method (RGSM) is developed and evaluated for channel pruning of deep ...
A typical feed forward neural network relies solely on its training algorithm, such as backprop or q...
Scaling model capacity has been vital in the success of deep learning. For a typical network, necess...
Connectivity patterns in biological brains exhibit many repeating motifs. This repetition mirrors in...
Improving the efficiency of neural networks has great potential impact due to their wide range of pos...
Unlike cloud-based deep learning models that are often large and uniform, edge-deployed models usual...
Research in neuroevolution, that is, evolving artificial neural networks (ANNs) through evolutionary ...
In this paper we present a modified neural network architecture and an algorithm that enables neural...
In the context of multi-task learning, neural networks with branched architectures have often been e...
We propose a novel deep neural network that is both lightweight and effectively structured for model...
Most uses of machine learning today involve training a model from scratch for a particular task, or ...
Deep learning uses neural networks which are parameterised by their weights. The neural networks are...
Synaptic plasticity allows cortical circuits to learn new tasks and to adapt to changing environment...