With this package, you can generate mixed-integer linear programming (MIP) models of trained artificial neural networks (ANNs) that use the rectified linear unit (ReLU) activation function. At the moment, only TensorFlow sequential models are supported; interfaces to both the Pyomo and Gurobi modeling environments are offered. ReLU ANNs can be used to approximate complex functions from data. To embed these functions into optimization problems, strong formulations of the network are needed. This package employs progressive bound-tightening procedures to produce tight MIP encodings of ReLU networks, allowing the user to embed complex, nonlinear functions into mixed-integer programs. Note that the training of ReLU ANNs is not part of this package; an already-trained model is expected as input.
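The bound-tightening idea behind such MIP encodings can be illustrated with a small, self-contained sketch. This is not the package's own API; the function name and the toy network below are invented for illustration. Propagating interval bounds through each affine layer yields per-neuron pre-activation bounds `lo` and `hi`, which are exactly the big-M constants needed to model `y = max(0, s)` with a binary variable `z` via `y >= s`, `y >= 0`, `y <= s - lo*(1 - z)`, and `y <= hi*z`:

```python
def propagate_bounds(weights, biases, input_lb, input_ub):
    """Interval-arithmetic bound propagation through a ReLU network.

    weights[k] is the weight matrix of layer k (a list of rows) and
    biases[k] its bias vector.  Returns, for every layer, the
    (lower, upper) bounds on the pre-activation values.  In a big-M
    MIP encoding these bounds are the constants in the ReLU
    constraints; tighter bounds give a stronger formulation.
    """
    lb, ub = list(input_lb), list(input_ub)
    pre_activation_bounds = []
    for W, b in zip(weights, biases):
        lo, hi = [], []
        for row, bias in zip(W, b):
            # A positive weight passes the input's lower bound through
            # to the neuron's lower bound; a negative weight swaps the
            # roles of the two input bounds.
            lo.append(bias + sum(w * (lb[j] if w >= 0 else ub[j])
                                 for j, w in enumerate(row)))
            hi.append(bias + sum(w * (ub[j] if w >= 0 else lb[j])
                                 for j, w in enumerate(row)))
        pre_activation_bounds.append((lo, hi))
        # ReLU clips the interval at zero before the next layer.
        lb = [max(l, 0.0) for l in lo]
        ub = [max(h, 0.0) for h in hi]
    return pre_activation_bounds
```

For a toy 2-2-1 network on the input box [-1, 1]^2, this yields pre-activation bounds of [-2, 2] for both hidden neurons and [0, 4] for the output. Interval propagation like this is only the cheapest level of bound tightening; progressive procedures refine such bounds further (for example by solving an auxiliary problem per neuron) before the final MIP is built.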