bayestorch 0.0.3


BayesTorch

Welcome to bayestorch, a lightweight Bayesian deep learning library built on PyTorch for fast prototyping. It provides the basic building blocks for the following Bayesian inference algorithms:

Bayes by Backprop (BBB)
Markov chain Monte Carlo (MCMC)
Stein variational gradient descent (SVGD)


💡 Key features

Low-code definition of Bayesian (or partially Bayesian) models
Support for custom neural network layers
Support for custom prior/posterior distributions
Support for layer/parameter-wise prior/posterior distributions
Support for composite prior/posterior distributions
Highly modular object-oriented design
User-friendly and easily extensible APIs
Detailed API documentation


🛠️ Installation
Using pip
First, install Python 3.6 or later, then open a terminal and run:
pip install bayestorch

From source
First, install Python 3.6 or later. Then clone or download and extract the repository, navigate to <path-to-repository>, open a terminal, and run:
pip install -e .
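
To confirm that the package is importable, a quick smoke test (pip show bayestorch prints the installed version without relying on any package attribute):
python -c "import bayestorch"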


▶️ Quickstart
Here are a few code snippets showcasing some key features of the library.
For complete training loops, please refer to examples/mnist and examples/regression.
Bayesian model trainable via Bayes by Backprop
from torch.nn import Linear

from bayestorch.distributions import (
    get_mixture_log_scale_normal,
    get_softplus_inv_scale_normal,
)
from bayestorch.nn import VariationalPosteriorModule


# Define model
model = Linear(5, 1)

# Define log scale normal mixture prior over the model parameters
prior_builder, prior_kwargs = get_mixture_log_scale_normal(
    model.parameters(),
    weights=[0.75, 0.25],
    locs=(0.0, 0.0),
    log_scales=(-1.0, -6.0),
)

# Define inverse softplus scale normal posterior over the model parameters
posterior_builder, posterior_kwargs = get_softplus_inv_scale_normal(
    model.parameters(), loc=0.0, softplus_inv_scale=-7.0, requires_grad=True,
)

# Define Bayesian model trainable via Bayes by Backprop
model = VariationalPosteriorModule(
    model, prior_builder, prior_kwargs, posterior_builder, posterior_kwargs
)
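
The snippet above only defines the model; for the library's own training loops, see the examples mentioned earlier. For intuition, here is a minimal, library-independent sketch of the objective that Bayes by Backprop optimizes (the negative ELBO: the data negative log-likelihood plus the KL divergence from the variational posterior to the prior), written in plain PyTorch with the same inverse softplus scale parameterization. All names here (mu, rho, and so on) are illustrative and not part of the bayestorch API:

import torch
import torch.nn.functional as F

torch.manual_seed(0)
x = torch.randn(64, 5)                                # toy inputs
y = x @ torch.randn(5, 1) + 0.1 * torch.randn(64, 1)  # toy targets

# Variational posterior q(w) = Normal(mu, softplus(rho)^2) over the weights
mu = torch.zeros(5, 1, requires_grad=True)
rho = torch.full((5, 1), -7.0, requires_grad=True)    # inverse softplus scale
prior = torch.distributions.Normal(0.0, 1.0)          # simple N(0, 1) prior
optimizer = torch.optim.Adam([mu, rho], lr=1e-2)

for step in range(1000):
    sigma = F.softplus(rho)
    w = mu + sigma * torch.randn_like(mu)             # reparameterization trick
    nll = F.mse_loss(x @ w, y, reduction="sum")       # one-sample Monte Carlo NLL
    kl = torch.distributions.kl_divergence(
        torch.distributions.Normal(mu, sigma), prior
    ).sum()                                           # KL(posterior || prior), closed form
    loss = nll + kl                                   # negative ELBO
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()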

Partially Bayesian model trainable via Bayes by Backprop
from torch.nn import Linear

from bayestorch.distributions import (
    get_mixture_log_scale_normal,
    get_softplus_inv_scale_normal,
)
from bayestorch.nn import VariationalPosteriorModule


# Define model
model = Linear(5, 1)

# Define log scale normal mixture prior over `model.weight`
prior_builder, prior_kwargs = get_mixture_log_scale_normal(
    [model.weight],
    weights=[0.75, 0.25],
    locs=(0.0, 0.0),
    log_scales=(-1.0, -6.0),
)

# Define inverse softplus scale normal posterior over `model.weight`
posterior_builder, posterior_kwargs = get_softplus_inv_scale_normal(
    [model.weight], loc=0.0, softplus_inv_scale=-7.0, requires_grad=True,
)

# Define partially Bayesian model trainable via Bayes by Backprop
model = VariationalPosteriorModule(
    model, prior_builder, prior_kwargs,
    posterior_builder, posterior_kwargs, [model.weight],
)

Composite prior
from torch.distributions import Independent
from torch.nn import Linear

from bayestorch.distributions import (
    CatDistribution,
    get_laplace,
    get_normal,
    get_softplus_inv_scale_normal,
)
from bayestorch.nn import VariationalPosteriorModule


# Define model
model = Linear(5, 1)

# Define normal prior over `model.weight`
weight_prior_builder, weight_prior_kwargs = get_normal(
    [model.weight],
    loc=0.0,
    scale=1.0,
    prefix="weight_",
)

# Define Laplace prior over `model.bias`
bias_prior_builder, bias_prior_kwargs = get_laplace(
    [model.bias],
    loc=0.0,
    scale=1.0,
    prefix="bias_",
)

# Define composite prior over the model parameters
prior_builder = (
    lambda **kwargs: CatDistribution([
        Independent(weight_prior_builder(**kwargs), 1),
        Independent(bias_prior_builder(**kwargs), 1),
    ])
)
prior_kwargs = {**weight_prior_kwargs, **bias_prior_kwargs}

# Define inverse softplus scale normal posterior over the model parameters
posterior_builder, posterior_kwargs = get_softplus_inv_scale_normal(
    model.parameters(), loc=0.0, softplus_inv_scale=-7.0, requires_grad=True,
)

# Define Bayesian model trainable via Bayes by Backprop
model = VariationalPosteriorModule(
    model, prior_builder, prior_kwargs, posterior_builder, posterior_kwargs,
)
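
For intuition, the composite prior concatenates the two per-parameter priors so that the joint log-density of the flattened parameter vector is the sum of the weight and bias terms. Here is a minimal plain-torch sketch of that computation (illustrative only; it bypasses CatDistribution and the rest of bayestorch):

import torch
from torch.distributions import Independent, Laplace, Normal

# Flattened parameters of Linear(5, 1): 5 weights followed by 1 bias
params = torch.randn(6)

weight_prior = Independent(Normal(torch.zeros(5), torch.ones(5)), 1)
bias_prior = Independent(Laplace(torch.zeros(1), torch.ones(1)), 1)

# Joint log-density under the composite prior: sum of the two pieces
log_prob = weight_prior.log_prob(params[:5]) + bias_prior.log_prob(params[5:])
print(log_prob)  # a scalar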


📧 Contact
[email protected]

License

For personal and professional use. You cannot resell or redistribute these repositories in their original state.
