Overview

This is the first joint release with pytorch-bearer, here we come...

This release extends the training features by adding Tensor Processing Unit (TPU) support, see docs. It brings in the flexibility of pytorch-bearer's extended support for user-defined callbacks, see docs. To make development easier, we have added a profiling tool for training runs, see docs.

We have also added automatic sampler setup: depending on whether you run under DDP or on TPU, Lightning configures the sampler correctly (you need to do nothing). We have extended support for multiple loggers, which can now be passed to the Trainer as an iterable (e.g. list, tuple, etc.), see docs, and added support for step-based learning rate scheduling (see the sketches below).

Finally, we fixed many reported issues, as you can...
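To make the new Trainer options concrete, here is a minimal sketch of how they might be combined. The callback, logger names, and log paths are placeholders, and the argument names (`num_tpu_cores`, `callbacks`, `profiler`, `logger`) follow the 0.7-era API, so check the docs for the exact signatures in your installed version:

```python
import pytorch_lightning as pl
from pytorch_lightning.callbacks import Callback
from pytorch_lightning.loggers import TensorBoardLogger


class PrintingCallback(Callback):
    """Illustrative user-defined callback announcing the start and end of training."""

    def on_train_start(self, trainer, pl_module):
        print("Training is starting")

    def on_train_end(self, trainer, pl_module):
        print("Training is done")


trainer = pl.Trainer(
    num_tpu_cores=8,                 # train on 8 TPU cores; omit to stay on CPU/GPU
    callbacks=[PrintingCallback()],  # user-defined callbacks
    profiler=True,                   # profile the training run with the built-in profiler
    logger=[                         # multiple loggers passed as an iterable
        TensorBoardLogger("lightning_logs/", name="run_a"),
        TensorBoardLogger("lightning_logs/", name="run_b"),
    ],
)
# trainer.fit(my_model)  # my_model is a LightningModule defined elsewhere
```

Step-based learning rate scheduling is configured from `configure_optimizers`. The sketch below uses a hypothetical LightningModule and returns the scheduler as a dict whose `"interval": "step"` entry asks Lightning to step it every batch rather than once per epoch; the optimizer and scheduler choices are only examples:

```python
import torch
import pytorch_lightning as pl


class LitModel(pl.LightningModule):
    # forward / training_step / dataloaders omitted for brevity

    def configure_optimizers(self):
        optimizer = torch.optim.Adam(self.parameters(), lr=1e-3)
        scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=1000, gamma=0.5)
        # "interval": "step" tells Lightning to call scheduler.step() after every
        # training batch instead of once per epoch (the default).
        return [optimizer], [{"scheduler": scheduler, "interval": "step"}]
```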