Introduction to Bayesian Thinking
Bayesian inference is a framework for updating beliefs with data. You start with what you think is plausible (a prior), observe data, and arrive at an updated belief (a posterior). That’s it — the rest is mechanics.
This short course builds intuition through simulations. No measure theory, no MCMC (yet) — just the core logic.
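Here is the flavor of simulation the course uses: a minimal sketch (not part of the course materials) that estimates a coin's heads probability with a flat prior and a grid approximation. The coin's true bias, the 10-flip sample, and the grid size are all arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate the data: 10 flips of a coin whose true heads probability is 0.7.
# (Both numbers are arbitrary choices for this sketch.)
flips = rng.binomial(1, 0.7, size=10)
heads, n = flips.sum(), flips.size

# Grid approximation: score every candidate value of theta, the heads probability.
theta = np.linspace(0, 1, 1001)
prior = np.ones_like(theta)                           # flat prior: all values equally plausible
likelihood = theta**heads * (1 - theta)**(n - heads)  # binomial likelihood, up to a constant

# Bayes' rule: the posterior is proportional to prior times likelihood.
posterior = prior * likelihood
posterior /= posterior.sum()                          # normalize so the grid sums to 1

print(f"Observed {heads}/{n} heads")
print(f"Posterior mean for theta: {(theta * posterior).sum():.3f}")
```

Everything in the course follows this pattern: write down a prior, multiply by the likelihood of the observed data, normalize, and inspect the result.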
Prerequisites: Foundations of Statistical Inference
Topics
- Bayes’ Theorem — The engine behind everything: how evidence updates beliefs
- Priors & Posteriors — Watch your prior get overwhelmed by data
- Bayesian vs Frequentist — Same question, two philosophies, different answers
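To see the last bullet in action, here is a small sketch (illustrative only; the coin, the Beta(2, 2) prior, and the sample sizes are arbitrary choices) that computes a frequentist maximum-likelihood estimate and a Bayesian posterior mean from the same simulated data. With 10 flips the prior visibly pulls the estimate toward 0.5; with 10,000 flips the two answers essentially agree.

```python
import numpy as np

rng = np.random.default_rng(0)
true_p = 0.7  # arbitrary "true" heads probability for the simulation

for n in (10, 10_000):
    heads = rng.binomial(n, true_p)

    # Frequentist answer: the maximum-likelihood estimate is the observed frequency.
    mle = heads / n

    # Bayesian answer: with a Beta(2, 2) prior (a mild pull toward 0.5), the posterior
    # is Beta(2 + heads, 2 + n - heads), whose mean is (heads + 2) / (n + 4).
    posterior_mean = (heads + 2) / (n + 4)

    print(f"n={n:>6}  MLE={mle:.3f}  posterior mean={posterior_mean:.3f}")
```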
How does Bayesian inference relate to causal inference?
They answer different questions:
| | Bayesian inference | Causal inference |
|---|---|---|
| Question | What should I believe given the data? | Does X cause Y? |
| Framework | Prior × likelihood → posterior | Potential outcomes, DAGs |
| Key concept | Updating beliefs | Counterfactuals |
You can combine them — Bayesian causal inference uses Bayesian methods to estimate causal effects — but each stands on its own.