Consider a learning algorithm that involves an internal call to an optimization routine such as a generalized eigenvalue problem, a cone program, or even sorting. Integrating such a method as a layer within a trainable deep network in a numerically stable way is not simple: for instance, strategies for differentiable eigendecomposition and sorting have emerged only recently. We propose an efficient and differentiable solver for general linear programming problems that can be used in a plug-and-play manner as a layer within deep neural networks. Our development is inspired by a fascinating but not widely used link between the dynamics of slime mold (Physarum) and mathematical optimization schemes such as steepest descent. We descr...
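As a rough illustration of the Physarum–optimization link the abstract mentions, the standard Physarum dynamics for a linear program min c^T x subject to Ax = b, x >= 0 (with c > 0) iterate a flow computation and an Euler step. The sketch below is an assumption-laden toy, not the paper's solver; the function name `physarum_lp`, the step size `h`, and the iteration count are illustrative choices.

```python
import numpy as np

def physarum_lp(A, b, c, x0, h=0.1, iters=2000):
    """Toy Physarum dynamics for min c^T x s.t. Ax = b, x >= 0 (c > 0).

    One step: W = diag(x_i / c_i), solve (A W A^T) p = b for "pressures" p,
    set the induced flow q = W A^T p, then move x toward q.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        W = np.diag(x / c)                   # conductances scaled by cost
        p = np.linalg.solve(A @ W @ A.T, b)  # pressures from the constraints
        q = W @ A.T @ p                      # flow induced by the pressures
        x = x + h * (q - x)                  # Euler step of dx/dt = q - x
    return x

# Toy LP: min x1 + 2*x2  s.t.  x1 + x2 = 1, x >= 0  (optimum at (1, 0))
A = np.array([[1.0, 1.0]])
b = np.array([1.0])
c = np.array([1.0, 2.0])
x = physarum_lp(A, b, c, x0=[0.5, 0.5])
```

Because every step keeps x strictly positive and q satisfies Aq = b, the iterate stays near the feasible region while mass shifts toward the cheaper coordinates, which is the behavior the Physarum analogy captures.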
Rapid progress in deep learning is leading to a diverse set of quickly changing models, with a drama...
In this paper the authors describe a novel terminal attractor algorithm for solving linear systems, ...
Problems in operations research are typically combinatorial and high-dimensional. To a degree, linea...
Classic algorithms and machine learning systems like neural networks are both abundant in everyday l...
The deep learning community has devised a diverse set of methods to make gradient optimization, usin...
We propose and analyze two classes of neural network models for solving linear programming (LP) prob...
We study the problem of learning differentiable functions expressed as programs in a domain-specific...
This paper introduces Deep Statistical Solvers (DSS), a new class of trainable...
We present DANTE, a novel method for training neural networks using the alternating minimization pri...
Differentiable programming is a fresh programming paradigm which composes parameterized algorithmic ...
Learning a deep neural network requires solving a challenging optimization problem: it is a high-dim...
The objectives of this study are the analysis and design of efficient computational methods for deep...
Optimization and machine learning are both extremely active research topics. In this thesis, we expl...
Deep neural network optimization is challenging. Large gradients in their chaotic loss landscape lea...
We propose a fast second-order method that can be used as a drop-in replacement for current deep lea...