This release fixes a number of bugs:

- Properly handle InputLayer in Keras Sequential models (thanks @QuantumDancer)
- Fix bug supporting TimeDistributed BatchNormalization layers
- Fix bugs in lwtnn-split-keras-network.py
- Fix some annoying warnings in compilation
- Fixes for compilation errors in gcc11 (thanks @matthewfeickert)
- Replace broken link for boost in the minimal install (now getting boost from a public CERN URL)

There were some tweaks to reduce element-wise calls through std::function in activation functions, which should mean a lot less pointer dereferencing.

There were also some improvements to overall code health, all from @matthewfeickert:

- Add GitHub Actions based CI
- Code linting, also add pre-commit hooks to run linting, and enfor...
Main changes since the previous release: Added the ability for Graph and LightweightGraph to return...
This release adds one bug fix to v2.7. Previously the CMake build would include the system default v...
This is a bugfix release. It should resolve some issues people saw where the sigmoid activation...
This adds a few new features and bug fixes: Add SimpleRNN layer (thanks @laurilaatu) Add Python 3.1...
This release adds a few minor things to the Python code. Nothing changes any C++ code, but there are some...
The main improvement in version 2.1 is that it now supports the Keras functional API in the Graph and Li...
This release is a major improvement when building with CMake: CMake can automatically install Eigen...
This release fixes bugs in the CMake code. CMake would build the project fine, but the project would ...
Changes since the last release: Add (some) support for ELUs (thanks @demarley) Sequence inputs now ...
This is a bugfix release which only affects networks that used ELU activation functions. Since versi...
This release adds several new features: Inputs are read into LightweightGraph lazily. In some cases...
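The lazy input reading described above can be sketched roughly as follows. This is an illustrative Python sketch of the general idea, not lwtnn's actual C++ implementation; the `LazyInputs` class, its callable-based interface, and the input names used here are assumptions made for the example.

```python
class LazyInputs:
    """Illustrative sketch of lazy input reading: each named input is
    backed by a zero-argument callable that is only invoked the first
    time something actually asks for that input."""

    def __init__(self, sources):
        self._sources = sources  # name -> zero-argument callable
        self._cache = {}

    def get(self, name):
        # Evaluate and cache the input only on first access; inputs
        # that are never requested are never computed at all.
        if name not in self._cache:
            self._cache[name] = self._sources[name]()
        return self._cache[name]


calls = []
inputs = LazyInputs({
    "jet_pt": lambda: calls.append("jet_pt") or 42.0,
    "jet_eta": lambda: calls.append("jet_eta") or -1.2,
})
value = inputs.get("jet_pt")  # only "jet_pt" is evaluated here
```

The payoff is the same as in the release note: when a network only consumes some of the declared inputs, the unused ones are never read.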
Changes since the last release: Various tweaks to the test executables Fix bug where Softmax could ...
We've added lots of new features since release v1.0: New layers: we now support batch normalizati...
Major Change: Templated Classes The biggest change in this release is that all the core matrix class...
This release introduces several parameterized activation functions: The old ELU function had a hard...
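For reference, the standard parameterized ELU is the identity for positive inputs and alpha * (exp(x) - 1) for negative ones, with alpha as the free parameter. A minimal sketch of that definition (the function name and default alpha here are illustrative, not lwtnn's API):

```python
import math

def elu(x, alpha=1.0):
    # Parameterized ELU: x for x > 0, alpha * (exp(x) - 1) otherwise.
    # alpha controls the saturation value for large negative inputs.
    return x if x > 0 else alpha * (math.exp(x) - 1.0)
```

With alpha = 1.0 this reduces to the usual fixed ELU; exposing alpha is what makes the activation parameterized.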