adam-atan2-pytorch 0.0.12

Creator: bradpython12

0 purchases

Description:

Adam-atan2 - Pytorch
Implementation of the proposed Adam-atan2 optimizer in Pytorch.
A multi-million dollar paper out of Google DeepMind proposes a small change to the Adam update rule (using atan2) that removes the epsilon altogether, for numerical stability and scale invariance.
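
The gist of the change: standard Adam divides the bias-corrected first moment by the square root of the second moment plus epsilon, while Adam-atan2 feeds those two quantities to atan2 instead, which is bounded and invariant to a common rescaling of its arguments, so no epsilon is needed. Below is a minimal sketch of a single parameter update under that rule; it is illustrative only, not the library's internals, and the scaling constants a and b (and their values) are assumptions based on the paper's description.

import torch

def adam_atan2_update(p, grad, exp_avg, exp_avg_sq, step, lr = 1e-4,
                      beta1 = 0.9, beta2 = 0.99, a = 1.27, b = 1.):
    # running first and second moments, as in standard Adam
    exp_avg.lerp_(grad, 1. - beta1)
    exp_avg_sq.lerp_(grad * grad, 1. - beta2)

    # bias corrections
    m_hat = exp_avg / (1. - beta1 ** step)
    v_hat = exp_avg_sq / (1. - beta2 ** step)

    # standard Adam would use:  update = m_hat / (v_hat.sqrt() + eps)
    # Adam-atan2 replaces the epsilon-guarded division with atan2,
    # which is bounded and scale invariant
    update = a * torch.atan2(m_hat, b * v_hat.sqrt())

    # gradient descent step
    p.data.add_(update, alpha = -lr)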
Install
$ pip install adam-atan2-pytorch

Usage
import torch
from torch import nn

# toy model

model = nn.Linear(10, 1)

# import AdamAtan2 and instantiate with parameters

from adam_atan2_pytorch import AdamAtan2

opt = AdamAtan2(model.parameters(), lr = 1e-4)

# forward and backwards

for _ in range(100):
    loss = model(torch.randn(10))
    loss.backward()

    # optimizer step

    opt.step()
    opt.zero_grad()

Citations
@inproceedings{Everett2024ScalingEA,
    title = {Scaling Exponents Across Parameterizations and Optimizers},
    author = {Katie Everett and Lechao Xiao and Mitchell Wortsman and Alex Alemi and Roman Novak and Peter J. Liu and Izzeddin Gur and Jascha Narain Sohl-Dickstein and Leslie Pack Kaelbling and Jaehoon Lee and Jeffrey Pennington},
    year = {2024},
    url = {https://api.semanticscholar.org/CorpusID:271051056}
}

@inproceedings{Kumar2023MaintainingPI,
    title = {Maintaining Plasticity in Continual Learning via Regenerative Regularization},
    author = {Saurabh Kumar and Henrik Marklund and Benjamin Van Roy},
    year = {2023},
    url = {https://api.semanticscholar.org/CorpusID:261076021}
}

License

For personal and professional use. You cannot resell or redistribute these repositories in their original state.

Customer Reviews

There are no reviews.