bregmanlearning 0.0.0
A PyTorch extension providing Bregman-based optimizers
Free software: BSD 3-Clause License
Installation
The package can be installed from PyPI using:
pip install bregman-learning
Usage
The library provides two Bregman-based optimizers, several regularizers for these optimizers, and functions for pre- and postprocessing the networks.
The Bregman-based optimizers provided are LinBreg and AdaBreg. Their usage is similar to that of Adam and SGD, their non-Bregman counterparts. Instead of:
from torch.optim import Adam
...
optimizer = Adam(model.parameters(), lr=learning_rate)
the optimizers are created using:
from bregman import AdaBreg, L1
...
optimizer = AdaBreg(
    model.parameters(),
    reg=L1(rc=regularization_constant),
    lr=learning_rate
)
where the L1 regularizer can be interchanged with any regularizer in the library.
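Since the optimizers follow the standard torch.optim interface, they drop into an ordinary PyTorch training loop. The following is a minimal sketch; the linear model, random data, loss function, and the values rc=0.1 and lr=1e-3 are illustrative placeholders, not library defaults:
import torch
from torch import nn
from bregman import AdaBreg, L1

# Placeholder model and data; any torch.nn.Module works here.
model = nn.Linear(10, 2)
inputs = torch.randn(32, 10)
targets = torch.randint(0, 2, (32,))
loss_fn = nn.CrossEntropyLoss()

# Constructed as shown above; LinBreg can be swapped in the same way.
optimizer = AdaBreg(model.parameters(), reg=L1(rc=0.1), lr=1e-3)

for epoch in range(10):
    optimizer.zero_grad()                      # reset gradients
    loss = loss_fn(model(inputs), targets)     # forward pass
    loss.backward()                            # backward pass
    optimizer.step()                           # Bregman update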
For the best results when using sparsity-promoting regularizers, the networks have to be pre- and postprocessed accordingly. For the L12 regularizer, this can be done using:
from bregman import sparsify
...
sparsify(model, density_level=0.2)
and:
from bregman import simplify
...
pruned_model = simplify(model)
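Putting the pieces together, a plausible end-to-end workflow, sketched here on the assumption that sparsify is the preprocessing step applied before training and simplify the postprocessing step applied afterwards, as the snippets above suggest, looks like this (the linear model and the values rc=0.1 and lr=1e-3 are again placeholders):
from torch import nn
from bregman import AdaBreg, L1, sparsify, simplify

model = nn.Linear(10, 2)  # placeholder network

# Preprocess: sparse initialization keeping roughly 20% of the weights.
sparsify(model, density_level=0.2)

# Train with a Bregman optimizer, as in the loop above.
optimizer = AdaBreg(model.parameters(), reg=L1(rc=0.1), lr=1e-3)
# ... training loop ...

# Postprocess: prune the trained network to its remaining nonzero structure.
pruned_model = simplify(model)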
Citing
If you use this code, please use the citation information in the CITATION.cff file or click the "Cite this repository" button in the sidebar.
Changelog
0.0.0 (2022-06-17)
First release on PyPI.