Bayesian Thinking

Notes & simulations

Bayesian inference — updating beliefs with data. Start with a prior, observe data, get a posterior. Just the core logic, with simulations.
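
In symbols: p(θ | data) ∝ p(data | θ) · p(θ), i.e. posterior ∝ likelihood × prior.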

Builds on: Statistical Inference

The workflow

Every Bayesian analysis follows the same loop:

1. Write down the likelihood — What’s your data model? Normal, binomial, Poisson? See Bayes’ Theorem for how the likelihood plugs into the updating formula.
2. Choose a prior — What do you believe before seeing data? Vague or informative? See Priors & Posteriors to watch different priors get updated by data.
3. Compute the posterior — Conjugate pair → formula; otherwise → MCMC. See Priors & Posteriors for closed-form updates, MCMC for sampling when no formula exists. (A conjugate example is sketched below.)
4. Summarize & interpret — Posterior mean, credible intervals, posterior probabilities. See Bayesian Regression for credible intervals and how to read Bayesian output.
5. Check sensitivity — How much do results change with different priors? See Shrinkage for why the prior pulls estimates and when that helps vs hurts.
6. Scale up — Multiple groups? Partial pooling via hierarchical models. See Hierarchical Models for how groups borrow strength from each other.
7. Compare models — Which model fits better? See Model Comparison for Bayes factors, BIC, and Lindley’s paradox.
8. Check model fit — Does the model actually describe the data? See Posterior Predictive Checks to simulate from the model and compare.

The core loop: likelihood + prior → posterior → interpret → check.
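
To make the loop concrete, here is a minimal sketch of steps 1 through 5 for a coin-flip example, using the conjugate Beta-Binomial pair; the counts and priors are made up for illustration.

```python
from scipy import stats

# Step 1: likelihood. Binomial: k heads in n flips (made-up data).
k, n = 14, 20

# Step 2: prior. Beta(a, b); Beta(1, 1) is flat (vague).
a, b = 1.0, 1.0

# Step 3: posterior. Beta is conjugate to the binomial, so the
# update is just a formula: Beta(a + k, b + n - k).
posterior = stats.beta(a + k, b + n - k)

# Step 4: summarize. Posterior mean and a 95% credible interval.
print("posterior mean:", posterior.mean())
print("95% credible interval:", posterior.interval(0.95))

# Step 5: sensitivity. Redo with an informative Beta(10, 10) prior
# and compare: the data pull both posteriors toward k/n.
alt = stats.beta(10 + k, 10 + n - k)
print("informative-prior mean:", alt.mean())
```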

Foundations

  • Bayes’ Theorem — how the likelihood plugs into the updating formula
  • Priors & Posteriors — watch different priors get updated by data

Computation

  • MCMC — Sampling from posteriors when closed-form solutions don’t exist
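
A bare-bones random-walk Metropolis sketch, assuming a standard normal target just to show the mechanics; real problems swap in their own log-posterior.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_post(theta):
    # Stand-in log-posterior: a standard normal. In practice this is
    # log(prior) + log(likelihood), up to an additive constant.
    return -0.5 * theta**2

# Random-walk Metropolis: propose a nearby point, accept it with
# probability min(1, posterior ratio), otherwise stay put.
theta = 0.0
samples = []
for _ in range(10_000):
    proposal = theta + rng.normal(scale=1.0)
    if np.log(rng.uniform()) < log_post(proposal) - log_post(theta):
        theta = proposal
    samples.append(theta)

chain = np.array(samples[1_000:])  # drop burn-in
print("mean:", chain.mean(), "sd:", chain.std())  # should be near 0 and 1
```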

Hierarchical Modeling

  • Shrinkage — Why pulling estimates toward the mean beats taking them at face value
  • Hierarchical Models — Partial pooling: the killer app of Bayesian inference
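
A small simulation of the idea, using a crude empirical-Bayes version of partial pooling (normal model, variances treated as known); the group sizes and scales are made up.

```python
import numpy as np

rng = np.random.default_rng(1)

# Made-up setup: 8 groups, true means drawn from a population,
# 5 noisy observations per group.
true_means = rng.normal(loc=50, scale=5, size=8)
data = [rng.normal(mu, 10, size=5) for mu in true_means]

raw = np.array([d.mean() for d in data])  # no pooling: raw group means
grand = raw.mean()                        # complete pooling: one mean

# Partial pooling: each group estimate is pulled toward the grand
# mean, and the pull is stronger when within-group noise is large
# relative to the between-group spread.
sigma2 = 10**2 / 5                        # variance of each raw group mean
tau2 = max(raw.var() - sigma2, 1e-9)      # crude between-group variance
w = tau2 / (tau2 + sigma2)                # weight on each group's own data
shrunk = grand + w * (raw - grand)

print("raw means:   ", raw.round(1))
print("shrunk means:", shrunk.round(1))
print("true means:  ", true_means.round(1))
```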

Model Checking

  • Model Comparison — Bayes factors, BIC, and Lindley’s paradox
  • Posterior Predictive Checks — simulate from the model and compare
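
As a toy illustration of BIC-based comparison (not from these notes), here is a sketch pitting a fixed-mean normal model against a free-mean one on made-up data; the last line uses the rough BIC approximation to the Bayes factor.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
y = rng.normal(loc=0.3, scale=1.0, size=50)  # made-up data
n = len(y)

# Model 0: Normal(0, sigma), one free parameter.
sigma0 = np.sqrt(np.mean(y**2))              # MLE of sigma given mu = 0
bic0 = 1 * np.log(n) - 2 * stats.norm(0, sigma0).logpdf(y).sum()

# Model 1: Normal(mu, sigma), two free parameters.
mu1, sigma1 = y.mean(), y.std(ddof=0)        # MLEs
bic1 = 2 * np.log(n) - 2 * stats.norm(mu1, sigma1).logpdf(y).sum()

# Lower BIC wins; exp((bic0 - bic1) / 2) roughly approximates the
# Bayes factor for model 1 over model 0.
print("BIC0:", round(bic0, 1), "BIC1:", round(bic1, 1))
print("approx BF(1 vs 0):", round(np.exp((bic0 - bic1) / 2), 2))
```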

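A minimal posterior predictive check, reusing the coin-flip posterior from the conjugate sketch above; the counts and seed are made up.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Posterior from the coin-flip sketch: Beta(1 + 14, 1 + 6).
k, n = 14, 20
posterior = stats.beta(1 + k, 1 + n - k)

# Posterior predictive: draw a theta, simulate a replicated dataset
# of the same size, repeat many times.
thetas = posterior.rvs(size=5_000, random_state=rng)
k_rep = rng.binomial(n, thetas)

# If the observed count sits in the bulk of the replicated counts,
# the model describes this aspect of the data adequately.
p = (k_rep >= k).mean()
print("P(replicated k >= observed k):", round(p, 2))
```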
Application

  • Bayesian Regression — credible intervals and how to read Bayesian output

Bigger Picture

How does Bayesian inference relate to causal inference?

They’re different questions:

  • Question — Bayesian inference asks “What should I believe, given the data?”; causal inference asks “Does X cause Y?”
  • Framework — Bayesian inference: prior + likelihood → posterior. Causal inference: potential outcomes and DAGs.
  • Key concept — Bayesian inference: updating beliefs. Causal inference: counterfactuals.

You can combine them — Bayesian causal inference uses Bayesian methods to estimate causal effects — but each stands on its own.