What's changed
- Simplified imports: accumulators can now be imported directly, without an intermediate step
- Renamed GAModelWrapper -> GradientAccumulateModel
- Renamed GAOptimizerWrapper -> GradientAccumulateOptimizer
- Updated README and all CIs accordingly
- Deprecated tensorflow==2.2 due to a tensorflow-addons incompatibility; tf >= 2.3 is now supported

Full Changelog: https://github.com/andreped/GradientAccumulator/compare/v0.3.0...v0.3.1

How to install?

```
pip install gradient-accumulator==0.3.1
```

New API! Model wrapper:

```python
from gradient_accumulator import GradientAccumulateModel

model = Model(...)
model = GradientAccumulateModel(accum_steps=4, inputs=model.input, outputs=model.output)
```

Optimizer wrapper: from gradient_accumulator import GradientAccumulateModel o...
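To make the idea behind the wrappers concrete, here is a framework-agnostic sketch of gradient accumulation itself: gradients from `accum_steps` micro-batches are summed and the averaged gradient is applied once, emulating a larger effective batch size. This is an illustration of the technique, not the library's implementation; `sgd_with_accumulation` and `grad_fn` are hypothetical names.

```python
# Minimal sketch of gradient accumulation in pure Python.
# One parameter update is applied per `accum_steps` micro-batches.

def sgd_with_accumulation(params, grad_fn, batches, accum_steps=4, lr=0.1):
    """Apply one SGD step per `accum_steps` micro-batches."""
    accum = [0.0 for _ in params]
    for step, batch in enumerate(batches, start=1):
        grads = grad_fn(params, batch)                # gradient for this micro-batch
        accum = [a + g for a, g in zip(accum, grads)]  # accumulate, do not apply yet
        if step % accum_steps == 0:
            # Apply the averaged gradient, then reset the accumulator.
            params = [p - lr * a / accum_steps for p, a in zip(params, accum)]
            accum = [0.0 for _ in params]
    return params
```

With `accum_steps=4` this matches the `GradientAccumulateModel(accum_steps=4, ...)` call above in spirit: four forward/backward passes, one optimizer step.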
Changes: Added mixed precision support (only float16 currently, which is compatible with NVIDIA GPU...
This is a minor patch release. What's changed: Fixed typo by renaming use_acg to use_agc
Changes Added a new feature to call Python functions in the C++ layer. 1f680066294c46abde303659ea0a...
What's changed Fix for GradientAccumulateOptimizer to support tf >= 2.10 by dynamically inheriting ...
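The "dynamically inheriting" fix mentioned above can be illustrated with the general pattern of choosing a base class at runtime, so one wrapper tracks an API that moved between versions. The classes below are stand-ins, not the real Keras optimizer bases.

```python
# Sketch of the dynamic-inheritance pattern: select the base class from the
# detected runtime version, then define the wrapper against it.

class OldOptimizerBase:          # stands in for the pre-2.10 optimizer base
    api_version = "old"

class NewOptimizerBase:          # stands in for the tf >= 2.10 optimizer base
    api_version = "new"

def make_accumulate_optimizer(tf_version):
    """Build the wrapper class against whichever base the runtime provides."""
    base = NewOptimizerBase if tf_version >= (2, 10) else OldOptimizerBase

    class GradientAccumulateOptimizer(base):
        # The real wrapper would also hold the inner optimizer and accum_steps.
        pass

    return GradientAccumulateOptimizer
```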
What's changed Added experimental Optimizer wrapper solution through GAOptimizerWrapper by @andrepe...
What's Changed Added custom AccumBatchNormalization layer with gradient accumulation support. Added...
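A short note on why batch normalization needs a dedicated layer under gradient accumulation: batch statistics should reflect the whole accumulation window, not each micro-batch in isolation. The sketch below (pure Python, hypothetical `pooled_mean_var` helper; the real `AccumBatchNormalization` is a Keras layer) shows the pooled statistics a BN layer would want over a window of micro-batches.

```python
# Pool mean/variance over all samples in an accumulation window, as if the
# micro-batches had been one large batch.

def pooled_mean_var(micro_batches):
    """Mean and (population) variance over every sample in the window."""
    samples = [x for batch in micro_batches for x in batch]
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n
    return mean, var
```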
What's Changed AGC and mixed precision are now compatible Support for both float16 and bfloat16 on ...
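For readers unfamiliar with AGC (adaptive gradient clipping, Brock et al.): a gradient is rescaled whenever its norm exceeds a fixed fraction of the corresponding parameter's norm. A minimal pure-Python sketch of that rule, with a hypothetical `agc_clip` helper (the library applies this to tensors, typically unit-wise):

```python
# Adaptive gradient clipping: rescale grad when ||g|| > clipping * ||p||.

def agc_clip(param, grad, clipping=0.01, eps=1e-3):
    p_norm = max(sum(p * p for p in param) ** 0.5, eps)  # eps guards zero params
    g_norm = sum(g * g for g in grad) ** 0.5
    max_norm = clipping * p_norm
    if g_norm > max_norm:
        scale = max_norm / g_norm
        return [g * scale for g in grad]   # shrink to the allowed norm
    return list(grad)                      # small gradients pass through
```

Because the rule is relative to parameter scale, it composes naturally with float16/bfloat16 mixed precision once loss-scaling is handled.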
New feature! Multi-GPU support has now been added! Support has been added for both optimizer and mo...
This is a minor patch release. What's changed: Added support for tensorflow-metal, enabling GA on m...
What's Changed v0.5.0 zenodo + cite by @andreped in https://github.com/andreped/GradientAccumulator...
GradientAccumulator is now available on PyPI: https://pypi.org/project/gradient-accumulator/#files ...
What's Changed Added issue templates by @andreped in https://github.com/andreped/GradientAccumulato...
What's Changed Add seamless support for mixed precision in AccumBatchNormalization by @andreped in ...
Summary The main feature of this patch release is that AccumBN can now be used as drop-in replacemen...
Zenodo DOI release and updated README to contain updated documentation regarding installation and us...