Tutorials & Notebooks

Data Science Corner

First-principles Bayesian ML — from the mathematics to working code, one notebook at a time.

Jupyter Notebooks

01

Laplace Approximation for Logistic Regression

Python · Jupyter

A ground-up implementation of the Laplace approximation applied to logistic regression — turning a point estimate into a full posterior distribution with minimal overhead (sketched in code below).

Concepts Covered

Laplace Approximation · Logistic Regression · Posterior Estimation · Bayesian Inference
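A minimal sketch of the notebook's central step in Python with NumPy (the function names, Newton-step fit, and unit prior variance are illustrative assumptions, not the notebook's actual choices): fit the MAP weights, then read a Gaussian posterior off the curvature at that point.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def laplace_posterior(X, y, prior_var=1.0, n_iter=25):
    # Illustrative sketch: Newton's method to the MAP estimate, then the
    # Laplace approximation N(w_map, H^-1), where H is the Hessian of the
    # negative log posterior evaluated at w_map.
    d = X.shape[1]
    w = np.zeros(d)
    for _ in range(n_iter):
        p = sigmoid(X @ w)
        grad = X.T @ (y - p) - w / prior_var          # gradient of the log posterior
        H = X.T @ (X * (p * (1 - p))[:, None]) + np.eye(d) / prior_var
        w += np.linalg.solve(H, grad)                 # Newton step toward w_map
    p = sigmoid(X @ w)
    H = X.T @ (X * (p * (1 - p))[:, None]) + np.eye(d) / prior_var
    return w, np.linalg.inv(H)                        # posterior approx. N(w, cov)

The minimal overhead the description mentions is visible here: the Gaussian posterior costs only one extra Hessian inversion on top of the usual MAP fit.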
02

Fixed Basis Regression & Variational Inference

Julia · Jupyter

Synthetic univariate data serves as a playground for implementing four inference methods entirely from first principles, with an emphasis on translating mathematical derivations directly into code (one method is sketched below).

Concepts Covered

Maximum Likelihood (MLE) · Maximum a Posteriori (MAP) · Gibbs Sampling · Mean-Field Variational Inference · Fixed Basis Functions
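To give a flavor of the first-principles style, here is the simplest of the four methods, MAP regression with fixed Gaussian basis functions, sketched in Python rather than the notebook's Julia; the basis width, precisions alpha and beta, and the sinusoidal target are illustrative assumptions.

import numpy as np

def rbf_design(x, centers, width=0.1):
    # Fixed Gaussian basis functions, one column per center
    return np.exp(-(x[:, None] - centers[None, :]) ** 2 / (2 * width ** 2))

def map_weights(Phi, y, alpha=1.0, beta=25.0):
    # Closed-form MAP for prior w ~ N(0, (1/alpha) I) and noise precision beta:
    # w_map = beta (alpha I + beta Phi^T Phi)^{-1} Phi^T y
    A = alpha * np.eye(Phi.shape[1]) + beta * Phi.T @ Phi
    return beta * np.linalg.solve(A, Phi.T @ y)

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 50)
y = np.sin(2 * np.pi * x) + 0.2 * rng.standard_normal(50)  # synthetic univariate data
Phi = rbf_design(x, np.linspace(0, 1, 9))
y_hat = Phi @ map_weights(Phi, y)                          # fitted curve at the inputs

Gibbs sampling and mean-field variational inference then replace the single w_map with samples from, or a factorised approximation to, the full posterior over the same design matrix.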
03

Bayesian Logistic Regression via Metropolis–Hastings

Python · Jupyter

A real Kaggle dataset drives this end-to-end walkthrough: preprocessing to support the i.i.d. assumption, a scratch-built binary logistic regression, full Bayesian uncertainty estimation, and a head-to-head comparison with the GLM package (the sampler's core is sketched below).

Concepts Covered

Metropolis–Hastings MCMC · Data Preprocessing · Binary Logistic Regression · Uncertainty Estimation · GLM Comparison
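The sampler's core fits in a few lines. This is a hedged sketch using a symmetric random-walk proposal; the step size, prior variance, and function names are assumptions for illustration rather than the notebook's actual settings.

import numpy as np

def log_posterior(w, X, y, prior_var=10.0):
    z = X @ w
    # Numerically stable Bernoulli log likelihood plus a Gaussian log prior (up to a constant)
    return np.sum(y * z - np.logaddexp(0.0, z)) - 0.5 * np.sum(w ** 2) / prior_var

def metropolis_hastings(X, y, n_samples=5000, step=0.05, seed=0):
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    lp = log_posterior(w, X, y)
    samples = []
    for _ in range(n_samples):
        w_prop = w + step * rng.standard_normal(w.shape)  # symmetric random-walk proposal
        lp_prop = log_posterior(w_prop, X, y)
        if np.log(rng.random()) < lp_prop - lp:           # accept with prob min(1, ratio)
            w, lp = w_prop, lp_prop
        samples.append(w.copy())
    return np.array(samples)

Percentiles of the retained draws (after discarding burn-in) give per-coefficient credible intervals, the kind of uncertainty estimate the notebook sets against a GLM fit.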

More notebooks on the way — Variational Autoencoders, Gaussian Processes, and Bayesian Neural Networks.

All notebooks are open source — browse or fork the full repo.

View DS-Tutorials on GitHub