🚧 Under Active Refactoring 🚧

The library is currently undergoing a major rewrite. The API might be unstable. Check back soon!


😎 Bensemble: Bayesian Multimodeling Project


Bensemble is a library for Bayesian deep learning that integrates established methods for neural network ensembling and uncertainty quantification. It provides building blocks that slot directly into existing PyTorch workflows.


Key Resources

  • 📘 Documentation: full API reference and user guides.
  • 📝 Tech Report: in-depth technical details and theoretical background.
  • ✍️ Blog Post: summary of the project and motivation.
  • 📊 Benchmarks: comparison of methods on standard datasets.

Features

  • PyTorch-Native: all layers and methods are compatible with standard PyTorch.
  • Modularity: drop-in BayesianLinear and BayesianConv2d layers with a built-in Local Reparameterization Trick (LRT); see the sketch after this list.
  • Core Bayesian Methods: implements canonical algorithms, from Variational Inference to scalable Laplace approximations.
  • Modern Stack: built with uv, fully typed, and tested.
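
As a minimal sketch of how these layers compose (assuming BayesianConv2d follows the usual nn.Conv2d argument order of in_channels, out_channels, kernel_size, plus the prior_sigma argument shown in the Quick Start below; check the documentation for the exact signature), a small Bayesian CNN could look like:

import torch.nn as nn

from bensemble.layers import BayesianConv2d, BayesianLinear

# Hypothetical Bayesian CNN for 28x28 single-channel inputs.
# Each Bayesian layer samples its weights on the forward pass,
# so repeated calls give different (stochastic) predictions.
cnn = nn.Sequential(
    BayesianConv2d(1, 16, kernel_size=3, prior_sigma=1.0),  # -> 16 x 26 x 26
    nn.ReLU(),
    nn.MaxPool2d(2),                                         # -> 16 x 13 x 13
    nn.Flatten(),
    BayesianLinear(16 * 13 * 13, 10, prior_sigma=1.0),
)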

Installation

You can install bensemble using pip:

pip install bensemble

Or, if you prefer using uv for lightning-fast installation:

uv pip install bensemble

Quick Start

Build a Bayesian Neural Network using our layers and write a standard PyTorch training loop.

import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Import building blocks
from bensemble.layers import BayesianLinear
from bensemble.losses import VariationalLoss, GaussianLikelihood
from bensemble.utils import get_total_kl, predict_with_uncertainty

# 0. Prepare Dummy Data
X_train = torch.randn(100, 10)
y_train = torch.randn(100, 1)

X_test = torch.randn(5, 10)

dataset = TensorDataset(X_train, y_train)
train_loader = DataLoader(dataset, batch_size=10, shuffle=True)

# 1. Define Model using Bayesian Layers
model = nn.Sequential(
    BayesianLinear(10, 50, prior_sigma=1.0),
    nn.ReLU(),
    BayesianLinear(50, 1, prior_sigma=1.0),
)

# 2. Define Objectives (Likelihood + Divergence)
likelihood = GaussianLikelihood()
criterion = VariationalLoss(likelihood, alpha=1.0)

optimizer = torch.optim.Adam(
    list(model.parameters()) + list(likelihood.parameters()), lr=0.01
)

# 3. Train Model
model.train()
for epoch in range(100):
    for x, y in train_loader:
        optimizer.zero_grad()

        preds = model(x)
        kl = get_total_kl(model)

        loss = criterion(preds, y, kl)

        loss.backward()
        optimizer.step()

# 4. Predict
mean, std = predict_with_uncertainty(model, X_test, num_samples=100)

print(f"Prediction: {mean[0].item():.2f}")
print(f"Uncertainty: ±{std[0].item():.2f}")

Development Setup

If you want to contribute to bensemble or run tests, we recommend using uv to manage the environment.

# 1. Clone the repository
git clone https://github.com/intsystems/bensemble.git
cd bensemble

# 2. Create and activate virtual environment via uv
uv venv
source .venv/bin/activate  # on Windows: .venv\Scripts\activate

# 3. Install in editable mode with dev dependencies
uv pip install -e ".[dev]"

Algorithms & Demos

We have implemented four distinct approaches. Check out the interactive demos for each:

  • Variational Inference: approximates the posterior with Gaussian distributions using the Local Reparameterization Trick. Demo: Open Notebook
  • Laplace Approximation: fits a Gaussian around the MAP estimate using Kronecker-Factored Curvature (K-FAC). Demo: Open Notebook
  • Variational Rényi: generalization of VI minimizing the $\alpha$-divergence (Rényi); see the note below. Demo: Open Notebook
  • Probabilistic Backprop: propagates moments through the network using Assumed Density Filtering (ADF). Demo: Open Notebook
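
For reference (the standard definition, not a statement about this library's particular implementation), the Rényi $\alpha$-divergence targeted by Variational Rényi is

$$D_{\alpha}\big(q(w)\,\|\,p(w \mid \mathcal{D})\big) = \frac{1}{\alpha - 1} \log \int q(w)^{\alpha}\, p(w \mid \mathcal{D})^{1-\alpha}\, dw,$$

which recovers the KL divergence used by standard Variational Inference in the limit $\alpha \to 1$.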

Development & Testing

The library is covered by a comprehensive test suite to ensure reliability.

Run Tests

pytest tests/

Linting

We use ruff to keep code clean:

ruff check .
ruff format .

Authors

This is an educational project on Bayesian multimodeling by first-year master's students (2025/26 academic year).

Developed by: Соболевский Федор, Набиев Мухаммадшариф, Василенко Дмитрий, Касюк Вадим.

License

This project is licensed under the MIT License - see the LICENSE file for details.
