Current teaching
Bayesian Dynamic Modeling in Macro & Finance - 2026
- Course: Bayesian Dynamic Modeling in Macro & Finance
- Period/Level: 1st semester of 2026 – Undergraduate course
- Time: Thursdays from 9:45am to 11:45am
- Instructor: Professor Hedibert F. Lopes, PhD, ISBA Fellow, ISI Fellow – www.hedibert.org
- Teaching assistant:
- Guilherme Piantino – guilhermejlp@al.insper.edu.br
- 7 Wednesdays from 3pm to 4:30pm: Feb 25th, Mar 11th and 25th, Apr 8th and 22nd, May 6th and 20th.
- Objective: The main goal of the course is to make the student familiar with and able to implement univariate and multivariate modern state-space time series models, particularly applied to macroeconomic and quantitative financial data. The inferential paradigm is Bayesian and posterior computation and model selection are aided by customized R packages.
- Students: Economics and Administration (≥ 6th semester) – Engineering (≥ 7th semester)
- Course prerequisites: Prerequisites include mathematical maturity in Calculus and Linear Algebra, alongside proficiency in R programming. Students are expected to have a strong foundational grasp of Statistics, Econometrics, and Time Series analysis. Prior exposure to ‘R for Data Science’ is highly advantageous.
- Brief course description: The main goal of the course is to make the student familiar with, and able to implement, univariate and multivariate modern state-space time series models, particularly applied to macroeconomic and quantitative financial data. We will quickly review standard observation-driven time series models, such as the family of autoregressive integrated moving average (ARIMA) models, generalized autoregressive conditionally heteroskedastic (GARCH) models, vector autoregressive (VAR) models and dynamic conditional correlation (DCC) models. Then we devote the remainder of the course to dynamic models (aka state-space models), starting with local level, local growth and both deterministic and trigonometric seasonal components. We will derive the Bayesian sequential learning of the parameters and states (which are also parameters), as well as the joint smoothed distribution of the state variables (aka latent variables or hidden states). We will discuss Markov switching models and various stochastic volatility models. We finish with multivariate dynamic models, including the well-established class of time-varying parameter VAR models, dynamic factor models, factor stochastic volatility models and various time-varying covariance models. The inferential approach of this course is predominantly Bayesian, so we will briefly introduce key ingredients of Bayesian inference, model selection and criticism, along with an introduction to the main Monte Carlo methods for Bayesian inference, such as MC integration, sampling-importance-resampling (SIR), Markov chain Monte Carlo (MCMC) and sequential Monte Carlo (SMC).
- Homework assignment
- R software for Bayesian computation: All classroom examples and implementations as well as projects will be carried out by the open-source statistical software R.
- Module 1: Foundations of Bayesian inference and computation
- Classes 1 & 2: The Bayesian Paradigm: Likelihoods, priors, predictives and posteriors. Bayesian model selection, Bayes Factors, and Information Criteria.
- Very first example: The Beta-Binomial model
- Gaussian linear regression: MLE, conjugate and conditionally conjugate priors
- Bayesian Beta regression (data)
- Basic Bayesian ingredients
- Physicists A and B: Tiago Mendonca’s shiny
- Physicists A, B, C and D: Normal model and 4 priors
- Bernoulli trials with Beta prior & Uniform trials with Pareto prior
- Bayesian inference for the correlation coefficient of bivariate Gaussian data
- Bayesian hierarchical Beta-Binomial regression
- Zero-inflated Poisson data
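Since the Beta-Binomial model is the course's very first example, a minimal R sketch of the conjugate update may help; the prior and data below are made-up illustrative numbers, not taken from the course materials.

```r
# Beta-Binomial conjugate updating: prior Beta(a0, b0), data y successes in n trials.
a0 <- 1; b0 <- 1                      # uniform prior on the success probability
y  <- 7; n  <- 10                     # hypothetical data
a1 <- a0 + y; b1 <- b0 + n - y        # posterior is Beta(a1, b1) by conjugacy
post.mean <- a1 / (a1 + b1)
ci <- qbeta(c(0.025, 0.975), a1, b1)  # 95% equal-tail credible interval
cat("Posterior Beta(", a1, ",", b1, "): mean =", round(post.mean, 3),
    " 95% CI = [", round(ci[1], 3), ",", round(ci[2], 3), "]\n")
```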
- Classes 3 & 4: Simulation-Based Inference: MC and MCMC methods (SIR, Gibbs Sampling and Metropolis-Hastings algorithm)
- In class examples
- Sampling N(0,1) draws via rnorm, Box-Muller, Marsaglia and SIR schemes
- Bayesian computation
- Normal vs Student’s t model – closed form vs MC-based posterior inferences
- AR(1) with missing values
- Comparing SIR, RWMH, IMH and Gibbs sampler
- Nonlinear regression model – posterior and predictive inference via SIR
- Our first dynamic linear model: the Gaussian local level model
- Banana-shape posterior: Posterior inference via SIR
- Learning the number of degrees of freedom of a Student’s t via SIR
- Bayesian regression with the normal-gamma (NG) prior
- A bit of Monte Carlo simulation and integration
- Normal vs skew-normal distributions
- The potential fallacy of using the prior as proposal in SIR algorithms
- Bayesian learning of correlation in bivariate normal data via SIR algorithm
- Data augmentation/Gibbs sampler – Linkage example
- Data augmentation/Gibbs sampler – Mean of Student’s t data
- Bayesian hierarchical linear regression – Gibbs Sampler
- AR(p) models – Bayesian updating with conjugate prior vs Gibbs sampler
- Return on education – Gaussian linear model with conjugate prior
- Gaussian linear regression model – Gibbs sampler for conditionally conjugate prior
- Normal linear models: subset selection via BIC and shrinkage/sparsity via regularizing priors – Stock and Watson’s (2002) macro data
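A bare-bones sketch of the SIR algorithm used in several of the examples above, here for the mean of Gaussian data with the prior as proposal (all numbers are simulated and hypothetical):

```r
# Sampling-importance-resampling (SIR) for theta in y_i ~ N(theta, 1),
# with prior theta ~ N(0, 10) used as the proposal density.
set.seed(1)
y <- rnorm(50, mean = 1.5, sd = 1)                   # simulated data
M <- 100000
theta <- rnorm(M, 0, sqrt(10))                       # 1) draw from the proposal (prior)
logw  <- -0.5 * colSums(outer(y, theta, "-")^2)      # 2) log-likelihood weights
w     <- exp(logw - max(logw))                       #    stabilize before exponentiating
draws <- sample(theta, M, replace = TRUE, prob = w)  # 3) resample with weights
c(post.mean = mean(draws), post.sd = sd(draws))
```

Using the prior as proposal is simple but can be inefficient when the posterior sits far from the prior, which is precisely the "potential fallacy" example listed above.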
- Module 2: Observation-driven models and dynamic models
- Class 5: Univariate Trends and Volatility: ARIMA models and GARCH models
- Autoregressive (AR) models and moving average (MA) models
- AR(2): h-step-ahead forecasting
- Bayesian AR(p): conjugate analysis vs Gibbs sampler
- Revisiting regression with autocorrelated errors: SIR vs Gibbs
- Threshold AR (TAR) model: Gibbs and Metropolis steps
- Seasonal models
- Modeling monthly totals of international airline passengers from 1949-1960
- Glossary of ARCH models
- Bayesian GARCH (David Ardia’s tutorial)
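To fix notation for the GARCH part of the class, here is a sketch that simulates a GARCH(1,1) path; the parameter values are made up for illustration.

```r
# Simulating a GARCH(1,1) process:
# y_t = sigma_t * eps_t,  sigma_t^2 = omega + alpha * y_{t-1}^2 + beta * sigma_{t-1}^2
set.seed(42)
n <- 5000
omega <- 0.1; alpha <- 0.1; beta <- 0.8   # alpha + beta < 1 => covariance stationary
y  <- numeric(n)
s2 <- numeric(n)
s2[1] <- omega / (1 - alpha - beta)       # start at the unconditional variance
y[1]  <- sqrt(s2[1]) * rnorm(1)
for (t in 2:n) {
  s2[t] <- omega + alpha * y[t - 1]^2 + beta * s2[t - 1]
  y[t]  <- sqrt(s2[t]) * rnorm(1)
}
var(y)   # should be close to omega / (1 - alpha - beta) = 1
```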
- Classes 6 & 7: Introduction to Dynamic Linear Models, Filtering and Smoothing: Local level, local trend, and seasonality, forward filtering and backward smoothing densities and algorithms
- Dynamic models: local level model
- The local level model as a multivariate normal update
- Dynamic models: normal dynamic linear model
- Comparing block-move and single-move MCMC for the local linear model
- Dynamic models: dynamic generalized linear model
- Dynamic linear regression
- AR(1) plus noise model
- Hidden Markov Model: Variance Switching
- R package dlm: Bayesian and Likelihood Analysis of Dynamic Linear Models
- R package bsts: Bayesian Structural Time Series (short tutorial by H.F.Lopes)
- R package kDGLM: Dynamic Generalized Linear Models
- Comparing dlm, bsts and kDGLM
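The forward-filtering recursions for the Gaussian local level model fit in a few lines; the sketch below simulates data and runs the Kalman filter (the values of V, W and the prior are illustrative choices, not course defaults).

```r
# Kalman filter for the Gaussian local level model (simulated data).
# Model: y_t = mu_t + v_t, v_t ~ N(0, V);  mu_t = mu_{t-1} + w_t, w_t ~ N(0, W)
set.seed(10)
n <- 200; V <- 1; W <- 0.1
mu <- cumsum(rnorm(n, 0, sqrt(W)))         # latent level (random walk)
y  <- mu + rnorm(n, 0, sqrt(V))            # observations
m0 <- 0; C0 <- 100                         # vague prior: mu_0 ~ N(m0, C0)
m <- numeric(n); C <- numeric(n)           # filtered moments
for (t in 1:n) {
  a <- if (t == 1) m0 else m[t - 1]        # prior mean at time t
  R <- (if (t == 1) C0 else C[t - 1]) + W  # prior variance at time t
  A <- R / (R + V)                         # Kalman gain
  m[t] <- a + A * (y[t] - a)               # posterior mean E[mu_t | y_{1:t}]
  C[t] <- A * V                            # posterior variance
}
cor(m, mu)                                 # filtered means track the latent level
```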
- Class 8: Multivariate time series: VAR, BVAR and related models
- Module 3: Particle filtering, SV and MSSV models & TVP-VAR models
- Class 9: Sequential Monte Carlo: Bootstrap filter, auxiliary particle filter and particle learning
- Class 10: Stochastic Volatility: SV, MSSV and related models
- Class 11: Time-varying parameter VAR: TVP-VAR, Factor-augmented VAR.
- Module 4: High-dimensional and covariance modeling
- Class 12: Latent Factor Structures: Dynamic factor models (DFM) for high-dimensional data reduction.
- Class 13: Multivariate Volatility: Factor stochastic volatility (FSV) models and Cholesky SV (CSV) models. Advanced time-varying covariance models and recent developments in the field.
- Bibliography (do not buy these books!)
- A First Course in Bayesian Statistical Methods (2009) by Hoff
- Introduction to Bayesian Econometrics (2008) by Greenberg
- Bayesian Data Analysis – 3rd edition (2013) by Gelman, Carlin, Stern, Dunson, Vehtari & Rubin
- Bayesian Methods for Data Analysis – 3rd edition (2008) by Carlin & Louis
- Bayesian Computation with R (2009) by Albert
- Bayesian Econometrics (2003) by Koop
- An Introduction to Modern Bayesian Econometrics (2004) by Lancaster
- Bayesian Econometric Methods – 2nd edition (2019) by Chan, Koop, Poirier & Tobias
- Bayesian Reasoning and Machine Learning (2012) by Barber
- Time Series Modeling, Computation, and Inference – 2nd edition (2021) by Prado, Ferreira & West
- MCMC: Stochastic Simulation for Bayesian Inference – 2nd edition (2006) by Gamerman & Lopes
Bayesian Learning - Professional Master in Economics - 2026
- Course: Bayesian Learning
- Level: Professional Master in Economics – 2nd quarter of 2026
- Professor: Hedibert Freitas Lopes – www.hedibert.org (hedibertfl@insper.edu.br)
- Teaching assistant: To be announced
- Syllabus: The ultimate goal of this course is to enable graduates to critically decide between the classical or Bayesian approach, or a combination of both, when faced with real-world decision-making problems under uncertainty. Areas where these real-world problems arise, as examples discussed throughout the course, include microeconomics, macroeconomics, finance, quantitative marketing, among many others. With this objective in mind, we will study the basic ingredients of the Bayesian paradigm: formulation of the model and prior (starting with the binomial model), model comparison and combination, computational aspects, and Bayesian decision-making. In the second part of the course, the Bayesian approach to traditional linear regression and logistic regression models will be introduced, as well as their modern versions where priors are treated as regularization mechanisms and sparsity inducers. Sparsity will be present throughout the 2nd and 3rd parts of the course when dealing with high-dimensional and/or highly complex models. In the third and final part of the course, we will present several statistical models currently used for this purpose, such as mixture models, hierarchical models, factor models, and regression tree models, as well as models based on neural networks and models that use texts and documents as data (text modeling). All, it is worth mentioning, under the unified and coherent Bayesian approach. All calculations during the course will be performed using the R statistical package.
- Homework assignments:
- Final project – paper presentation:
- The final project consists of an oral presentation of a high-level scientific article chosen from the list provided below. You must submit two deliverables to my institutional email (above) in a single PDF file: a 5- to 7-page summary and a link to a video presentation (maximum 15 minutes) using slides. The final grade is composed of homework assignments (50%), the final project (40%), and class participation (10%).
Course notes (+ R code & references)
- Bayesian ingredients
- The Monty Hall problem
- Flipping one of three coins three times and observing three heads (R code)
- Chapter 1 of “Bayes’ Rule: A Tutorial Introduction to Bayesian Analysis” (slides)
- Tiago Mendonca’s shiny for the physicists example
- Physicists A, B, C and D: Normal model and 4 priors
- Histórias da Matemática: Da Contagem nos Dedos à Inteligência Artificial (by Marcelo Viana, IMPA)
- Bayesian computation
- Monte Carlo Integration – Excerpts from chapter 3 of Gamerman and Lopes (2006)
- Banana-shape posterior: Posterior inference via SIR
- Learning the number of degrees of freedom of a Student’s t (Rmarkdown)
- Bayesian regression with the normal-gamma (NG) prior
- A bit of Monte Carlo simulation and integration
- Back to the physicists example
- Normal vs skew-normal distributions: an exercise in Bayesian learning
- The potential fallacy of using the prior as proposal in SIR algorithms
- Bayesian learning of correlation in bivariate normal data via SIR algorithm
- Smith and Gelfand (1992) Bayesian Statistics without Tears: A Sampling-Resampling Perspective. The American Statistician, 46(2), 84-88.
- Hamiltonian Monte Carlo: a brief introduction
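As a warm-up for the Monte Carlo material above, a one-screen sketch of plain MC integration, using an integrand of my choosing for which the exact answer is available for checking:

```r
# Monte Carlo integration: estimate E[exp(-X^2)] for X ~ N(0,1).
# The exact value is 1/sqrt(3), so the estimate can be verified.
set.seed(7)
M <- 1e6
x <- rnorm(M)
g <- exp(-x^2)
est <- mean(g)               # MC estimate of the expectation
se  <- sd(g) / sqrt(M)       # Monte Carlo standard error
c(estimate = est, exact = 1/sqrt(3), std.error = se)
```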
- Bayesian linear regression
- Boston housing data: ML and Bayesian inference
- Motorcycle example
- Regressing weight on height: SIR for Gaussian and Student’s t models
- Bayesian regularization
- iid Bernoulli or logit regression or probit regression? (Rmarkdown)
- Bayesian Poisson regression model versus i.i.d. Poisson model
- Another example of Poisson regression
- A few R packages for Bayesian inference in linear models
- Stan/rstan: Bayes Sparse Regression
- Bayesian classification via logistic regression
- Sparse logistic regression for the spam/ham dataset (data)
- Marketing campaigns of a Portuguese banking institution (data)
- Sparse logistic regression: comparison of regularization and Bayesian implementations
- Gelman, Jakulin, Pittau and Su (2008) A weakly informative default prior distribution for logistic and other regression models, AOAS, 2(4), 1360-1383.
- Polson, Scott and Windle (2013) Bayesian Inference for logistic models using Pólya-Gamma latent variables, JASA, 108, 1339-1349.
- Other important modeling structures
- Factor models (Additional material)
- Time-varying variance/covariance
- Finite mixture of distributions
- Machine Learning 1: Tree models
- Classification and regression trees (CART)
- Bayesian CART
- Bootstrap aggregating (bagging)
- Boosting (weak/stronger learners)
- Random forests
- Bayesian additive regression trees (BART) + (Rob’s notes) + (a list of slides/examples/people/papers)
- Machine Learning 2: Modeling text
- Machine Learning 3: Neural nets
Additional supporting material
- Stan/rstan for posterior inference: Hamiltonian MC (HMC) methods – by Hedibert Lopes (February 2021)
- MC and MCMC: Key References – by Hedibert Lopes (February 2021)
- R packages for Bayesian linear regression – by Hedibert Lopes (February 2020)
- R packages for Bayesian Econometrics – by Hedibert Lopes (March 2014)
- CRAN Task View on Bayesian Inference (July 2023)
- Mathematics for Machine Learning – by Deisenroth, Faisal and Ong (2020)
- Conceitos e análises estatísticas com R e JASP – by Luis Anunciação (September 2021)
- Data Science, Marketing and Business by Pedro Fernandes & Paulo Marques (October 2019)
- Aprendizado de Máquina: Uma Abordagem Estatística (by Rafael Izbicki & Tiago Mendonça)
- Estatística e Ciência de Dados (by Pedro Morettin & Julio Singer)
Advanced Bayesian Econometrics - Doctorate in Business Economics - 2026
- Course: Advanced Bayesian Econometrics
- Level: Doctorate in Business Economics – 4th quarter of 2026
- Professor: Hedibert Freitas Lopes – www.hedibert.org
- Objective: The goal of the course is to enable the student to critically decide between a Bayesian, a frequentist, or a Bayesian-frequentist compromise when facing real-world problems in the fields of micro- and macro-econometrics and finance, as well as in quantitative marketing, strategy and business administration. With this end in mind, we will visit well-known Bayesian issues, such as prior specification, model comparison and model averaging, but also study regularization via the Bayesian LASSO, spike-and-slab and related schemes, “small n, large p” issues, and Bayesian statistical learning via additive regression trees, random forests, large-scale VAR and (dynamic) factor models.
- Course description: Basic ingredients: prior, posterior, and predictive distributions, sequential Bayes, conjugate analysis, exchangeability, principles of data reduction and decision theory. Model criticism: Bayes factor, computing marginal likelihoods, Savage-Dickey ratio, reversible jump MCMC, Bayesian model averaging and deviance information criterion. Modern computation via (Markov chain) Monte Carlo methods: Monte Carlo integration, sampling-importance resampling, Gibbs sampler, Metropolis-Hastings algorithms. Mixture models, Hierarchical models, Bayesian regularization, Instrumental variables modeling, Large-scale (sparse) factor modeling, Bayesian additive regression trees (BART) and related topics, Dynamic models, Sequential Monte Carlo algorithms, Bayesian methods in microeconometrics, macroeconometrics, marketing and finance.
- Part I Bayesian ingredients: i) Inference: likelihood, prior, predictive and posterior distributions; ii) Model criticism: marginal likelihoods, Bayes factor, model averaging and decision theory; and iii) Computation: an introduction to (Markov chain and sequential) Monte Carlo methods.
- Part II Multivariate models: i) Large-scale vector autoregressive models; ii) Factor models and other dimension reduction models; and iii) Time-varying high-dimensional covariance models.
- Part III Modern Bayesian statistical learning: i) Mixture models and the Dirichlet process: handling non-Gaussian models; ii) Regularization: sparsity via shrinkage and variable selection; iii) Large vector-autoregressive and factor models: combining sparsity and parsimony; iv) Classification and support vector machines; v) Regression trees and random forests; and vi) Latent Dirichlet allocation: Text as data, text mining.
- Paper presentation
- Homework assignments
LECTURE NOTES
PART I: Bayesian ingredients
- Basic Bayes
- Exchangeability
- Principles of data reduction
- More on estimators
- Decision theory (Nuisance parameters + travel insurance example)
- Decision Theory: Principles and Approaches, by Parmigiani and Inoue (with contributions by Lopes), 2009, Wiley. (TOC)
- Introdução à Teoria da Decisão – by Victor Fossaluza (IME-USP)
- James-Stein estimator (from The Bayesian Choice (2nd edition) by C.P.Robert)
- Bayesian model criticism (pages 1-6 & 32-34)
- Additional reading material:
- Chapter 2 of Gamerman and Lopes (2006) – Compact, but easy to read.
- Chapters 2-4 of Migon, Gamerman and Louzada (2014) – Integrates classical and Bayesian inference.
- Chapter 1 and 2 of Gelman et al. (2013) – Application-oriented.
- Chapter 4 (Sections 4.1-4.4) of Berger (1985) – More technical.
- van de Schoot, R., Depaoli, S., King, R. et al. (2021) Bayesian statistics and modelling. Nature Reviews Methods Primers, 1, 1.
- Discussion about p-values
PART II: Bayesian Computation
- Monte Carlo (MC) methods
- Markov chain: a brief review
- Markov chain Monte Carlo (MCMC) algorithms
- MC and MCMC: Key References
- More on Bayesian model criticism
- Hamiltonian Monte Carlo: A toy example
- Stan/rstan for posterior inference: Hamiltonian MC (HMC) methods
- Banana shaped bivariate target: MH vs HMC
- Bayesian hierarchical modeling – the Beta-Binomial case
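A toy random-walk Metropolis sampler, in the spirit of the MCMC material above; the N(0,1) target and the proposal scale are illustrative choices, not from the lecture notes.

```r
# Random-walk Metropolis targeting a N(0,1) density.
set.seed(123)
M <- 50000
x <- numeric(M); x[1] <- 0
tau <- 2.4                                  # proposal sd (tuning parameter)
log.target <- function(z) -0.5 * z^2        # log-density up to a constant
for (t in 2:M) {
  prop <- x[t - 1] + tau * rnorm(1)         # symmetric random-walk proposal
  log.alpha <- log.target(prop) - log.target(x[t - 1])
  x[t] <- if (log(runif(1)) < log.alpha) prop else x[t - 1]
}
c(mean = mean(x), sd = sd(x), accept = mean(diff(x) != 0))
```

With a symmetric proposal the Hastings ratio reduces to the ratio of target densities, which is why only `log.target` appears in the acceptance step.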
PART III: Bayesian Learning
- Modeling with mixtures of distributions
- Fundamentos de Aprendizagem Estatística + R code + MC exercise
- Multiple linear regression: selection, shrinkage, sparsity
- Motorcycle example
- Slides from the 2015 School of Time Series and Econometrics tutorial
- Hahn, He and Lopes (2018) Gaussian linear regression with arbitrary sparsity + slides of a talk + R package bayeslm
- Fava and Lopes (2021) The illusion of the illusion of sparsity + slides of a talk + UFPE webinar (62′ and forward)
- van Erp, Oberski and Mulder (2019) Shrinkage priors for Bayesian penalized regression, Journal of Mathematical Psychology, 89, 31-50.
- Michael Betancourt’s Bayes Sparse Regression (stan/rstan example)
- Classification: logistic regression and discriminant analysis
- Sparse logistic regression for the spam/ham dataset (data)
- Marketing campaigns of a Portuguese banking institution (data)
- Sparse logistic regression: comparison of regularization and Bayesian implementations
- Gelman, Jakulin, Pittau and Su (2008) A weakly informative default prior distribution for logistic and other regression models, AOAS, 2(4), 1360-1383.
- Polson, Scott and Windle (2013) Bayesian Inference for logistic models using Pólya-Gamma latent variables, JASA, 108, 1339-1349.
- Multivariate models and dimension reduction
- Classification and regression trees (CART)
- I highly recommend checking out (studying!) the slides of the Machine Learning course by Paulo Orenstein, Assistant Professor at IMPA.
- Chapter 12 and Chapter 13 are about “Tree-based methods”.
- Also, Chapter 15 and Chapter 16 are about “Deep learning”.
- Bayesian CART
- Bootstrap aggregating (bagging)
- Bayesian additive regression trees (BART)
- Example 1: ICU data: CART, BART and random forest (R code)
- Example 2: Stock and Watson’s (2002) macro data (data)
- More examples: Four BART applications & 2 reviews + cute CART trees and 3D plots
- More recent references
- Latent Dirichlet Allocation (LDA)
- Neural Networks
Complementary material to PART III
- Boosting (weak/stronger learners)
- Random forests
- Bayesian instrumental variables
- General linear and hierarchical models
- Limited dependent variable models
- Spatial models
- P.Richard Hahn’s top 25 books on Statistics, Causal Inference, Statistical Computing, Machine Learning and Data Science
Bibliography: Bayesian econometrics
- Zellner (1971) An Introduction to Bayesian Inference in Econometrics
- Goel and Iyengar (1992) Bayesian Analysis in Statistics and Econometrics
- West and Harrison (1997) Bayesian Forecasting and Dynamic Models (2nd edition)
- Dorfman (1997) Bayesian Economics Through Numerical Methods
- Bauwens, Lubrano and Richard (2000) Bayesian Inference in Dynamic Econometric Models
- Koop (2003) Bayesian Econometrics
- Geweke (2005) Contemporary Bayesian Econometrics and Statistics
- Lancaster (2004) Introduction to Modern Bayesian Econometrics
- Rossi, Allenby and McCulloch (2005) Bayesian Statistics and Marketing
- Prado and West (2010) Time Series: Modeling, Computation and Inference
- Geweke, Koop and Van Dijk (2011) The Oxford Handbook of Bayesian Econometrics
- Greenberg (2013) Introduction to Bayesian Econometrics
- Herbst and Schorfheide (2015) Bayesian Estimation of DSGE Models
- Chan, Koop, Poirier and Tobias (2019) Bayesian Econometric Methods (2nd edition)
- Broemeling (2019) Bayesian Analysis of Time Series
- Bernardi, Grassi and Ravazzolo (2020) Bayesian Econometrics
Bibliography: Bayesian statistics
- Berger (1985) Statistical Decision Theory and Bayesian Analysis
- Bernardo and Smith (2000) Bayesian Theory
- Gelman and Hill (2006) Data Analysis Using Regression and Multilevel/Hierarchical Models
- Robert (2007) The Bayesian Choice: From Decision-Theoretic Foundations to Computational Implementation
- Hoff (2009) A First Course in Bayesian Statistical Methods
- Carlin and Louis (2009) Bayesian Methods for Data Analysis (3rd edition)
- Gelman, Carlin, Stern, Dunson, Vehtari and Rubin (2013) Bayesian Data Analysis (3rd edition)
- Migon, Gamerman and Louzada (2015) Statistical Inference: An Integrated Approach (2nd edition)
- Reich and Ghosh (2019) Bayesian Statistical Methods
- Held and Sabanes-Bove (2020) Likelihood and Bayesian Inference: With Applications in Biology and Medicine
Bibliography: Bayesian computation
- Gilks, Richardson and Spiegelhalter (1995) Markov Chain Monte Carlo in Practice
- Doucet, de Freitas and Gordon (2001) Sequential Monte Carlo Methods in Practice
- Robert and Casella (2004) Monte Carlo Statistical Methods (2nd edition)
- Gamerman and Lopes (2006) MCMC: Stochastic Simulation for Bayesian Inference, Second Edition
- Marin and Robert (2007) Bayesian Core: A Practical Approach to Computational Bayesian Statistics
- Albert (2009) Bayesian Computation with R
- Brooks, Gelman, Jones and Meng (2011) Handbook of Markov Chain Monte Carlo
- Givens and Hoeting (2012) Computational Statistics (2nd edition)
- Marin and Robert (2014) Bayesian Essentials with R (complete solution manual)
- Turkman, Paulino and Mueller (2019) Computational Bayesian Statistics: An Introduction
- McElreath (2020) Statistical Rethinking: A Bayesian course with Examples in R and STAN
- Chopin and Papaspiliopoulos (2020) An Introduction to Sequential Monte Carlo
Bibliography: (Bayesian) statistical learning
- Bishop (2006) Pattern Recognition and Machine Learning
- Hastie, Tibshirani and Friedman (2008) The Elements of Statistical Learning, 2nd edition
- Murphy (2012) Machine Learning: A Probabilistic Perspective
- Barber (2012) Bayesian Reasoning and Machine Learning
- James, Witten, Hastie and Tibshirani (2013) An Introduction to Statistical Learning
- Hastie, Tibshirani and Wainwright (2015) Statistical Learning with Sparsity
- Efron and Hastie (2016) Computer Age Statistical Inference: Algorithms, Evidence and Data Science
- Fernandez and Marques (2018) Data Science, Marketing and Business
- Izbicki & Santos (2020) Aprendizado de máquina: uma abordagem estatística
Bibliography: Classical Monte Carlo papers
- Metropolis and Ulam (1949) The Monte Carlo method. JASA, 44, 335-341.
- Metropolis, Rosenbluth, Rosenbluth, Teller and Teller (1953) Equation of state calculations by fast computing machines. Journal of Chemical Physics, 21, 1087-1092.
- Hastings (1970) Monte Carlo sampling methods using Markov chains and their applications. Biometrika, 57, 97-109.
- Peskun (1973) Optimum Monte Carlo sampling using Markov chains. Biometrika, 60, 607-612.
- Besag (1974) Spatial Interaction and the Statistical Analysis of Lattice Systems. JRSS-B, 36, 192-236.
- Kirkpatrick, Gelatt and Vecchi (1983) Optimization by Simulated Annealing. Science, 220 (4598), 671-680.
- Geman and Geman (1984) Stochastic relaxation, Gibbs distributions, and the Bayesian restoration of images. IEEE Trans. Pattern Analysis and Machine Intelligence, 6, 721-741.
- Pearl (1987) Evidential reasoning using stochastic simulation of causal models. Artificial intelligence, 32, 245-257.
- Tanner and Wong (1987) The Calculation of Posterior Distributions by Data Augmentation. JASA, 82, 528-540.
- Geweke (1989) Bayesian Inference in Econometric Models Using Monte Carlo Integration. Econometrica, 57, 1317-1339.
- Gelfand and Smith (1990) Sampling-Based Approaches to Calculating Marginal Densities. JASA, 85, 398-409.
- Casella and George (1992) Explaining the Gibbs Sampler. The American Statistician, 46, 167-174.
- Gilks and Wild (1992) Adaptive Rejection Sampling for Gibbs Sampling. Applied Statistics, 41, 337-348.
- Smith and Gelfand (1992) Bayesian Statistics without Tears: A Sampling-Resampling Perspective. The American Statistician, 46, 84-88.
- Chib and Greenberg (1995) Understanding the Metropolis-Hastings algorithm. The American Statistician, 49, 327-335.
Advanced Econometrics - Professional Master in Economics - 2026
- Course: Advanced Econometrics
- Level: Professional Master in Economics – 4th quarter of 2026
- Professor: Hedibert Freitas Lopes – www.hedibert.org
- Objective: The main objective of this course is to introduce basic aspects of i) statistical learning, ii) Bayesian learning, iii) micro-econometrics, and iv) macro-econometrics that are necessary for the master’s degree program.
- Brief course description: Regression with endogeneity, regression with measurement error, instrumental variables; potential outcomes, the Neyman-Rubin model, selection bias, reverse causality, and omitted variables; panel data, hierarchical models, fixed effects and random effects, difference-in-differences methods; ARIMA models, long memory, unit roots, GARCH models, and stochastic volatility; vector autoregressive models, factor models with stochastic volatility, and multivariate models with time-varying parameters; logistic regression, performance metrics for classification, training and testing data, cross-validation, bias-variance trade-off; prior, posterior, and predictive distributions, sequential Bayes and conjugate analysis, Monte Carlo methods.
- Teaching assistant: To be announced
- Final exam: You will NOT be asked to write R scripts or similar during the final exam
- Evaluation: 30% final exam, 60% homework assignments (15% each), 10% participation
- Homework assignments: Homework can be done by groups of no more than 4 students.
Class 1: Brief review of basic ingredients: i) Parametric models, likelihoods, estimators and their sampling distributions; ii) Gaussian linear regression: estimation and variable selection; iii) AR(1) model: estimation, unit root, equilibrium distribution; iv) AR(p) model: connection to Gaussian linear regression; v) VAR(1) model: multivariate estimation, matrix notation.
- Bernoulli trials: iid vs logistic regression
- Selecting the sample size of iid Bernoulli trials
- Poisson model: maximum likelihood estimation, sufficiency, unbiasedness, consistency, efficiency
- Comparing estimators: Mean Square Error (MSE)
- Comparing estimators: sample mean, sample weighted mean, trimmed mean, sample median
- Poisson model: R code for the coal mining disaster data
- Poisson model: Our first Bayesian experience
- Bayesian hierarchical Beta-Binomial regression
- Gaussian linear regression (pages 1-22) – A few examples
- Autoregressive model of order one (pages 1-17) – A few examples
- A bit about Monte Carlo simulation and the bootstrap technique
- Bivariate normal correlation coefficient: MLE, sample correlation and the bootstrap
- Complementary bibliography: Introdução à Inferência Estatística (2010, Heleno Bolfarine & Monica Sandoval) – Sections 1.1.4, 1.3, 2.1, 2.2, 3.1, 3.2.
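The bootstrap items above can be illustrated with the bivariate-normal correlation example; the data below are simulated with a made-up true correlation of 0.7.

```r
# Nonparametric bootstrap for the sample correlation (simulated data).
set.seed(2026)
n <- 100; rho <- 0.7
x <- rnorm(n)
y <- rho * x + sqrt(1 - rho^2) * rnorm(n)    # bivariate normal pair, corr = rho
r.hat <- cor(x, y)                           # sample correlation (also the MLE here)
B <- 5000
r.boot <- replicate(B, {
  idx <- sample(n, replace = TRUE)           # resample pairs with replacement
  cor(x[idx], y[idx])
})
c(estimate = r.hat, boot.se = sd(r.boot))
quantile(r.boot, c(0.025, 0.975))            # percentile bootstrap interval
```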
Class 2: Introduction to statistical learning: i) Linear and log-linear regression modeling; ii) Training and testing samples; iii) Validation. Bias-variance trade-off.
- Multiple linear regression
- CAPM model
- Return on education
- Professors salaries
- Motorcycle example + HTML version from Rmarkdown
- We studied the selection of the polynomial order that best fits the “motorcycle” data. We discussed cross-validation (CV), replicated CV, leave-one-out CV (LOOCV), and 10-fold CV. After concluding that the polynomial model of order 12 was the best, we implemented a bootstrap algorithm to obtain approximate confidence bands.
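A compact sketch of the 10-fold CV procedure described above, run on simulated data rather than the motorcycle data (the sine curve and noise level are arbitrary choices):

```r
# 10-fold cross-validation for polynomial order selection on simulated data.
set.seed(314)
n <- 200
x <- sort(runif(n, 0, 1))
y <- sin(2 * pi * x) + rnorm(n, 0, 0.3)       # true curve plus noise
fold <- sample(rep(1:10, length.out = n))     # random fold assignment
cv.mse <- sapply(1:8, function(p) {           # candidate polynomial orders 1..8
  mean(sapply(1:10, function(k) {
    fit <- lm(y ~ poly(x, p), subset = (fold != k))   # fit on 9 folds
    pred <- predict(fit, newdata = data.frame(x = x[fold == k]))
    mean((y[fold == k] - pred)^2)                     # test error on held-out fold
  }))
})
which.min(cv.mse)                             # order with smallest CV error
```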
- Logistic regression
- Gaussian vs Poisson regression: Counts of visitors to a website
- Fundamentos de Machine Learning (Notas do Prof. Paulo C. Marques F.) – Código R
Classes 3+4: Introduction to Bayesian learning: Prior, posterior, and predictive distributions; Sequential Bayesian updating and conjugate analysis, Bayes factor, posterior model probability, model selection; Monte Carlo & Markov Chain Monte Carlo methods; Sparsity in linear and log-linear models.
- Bayesian ingredients
- Sequential Bayes: i) Bernoulli trials with Beta prior and ii) Uniform trials with Pareto prior
- Sequential Bayes: Bernoulli trials with discrete success probability
- Tiago Mendonca’s shiny for the physicists example
- Physicists A, B, C and D: Normal model and 4 priors
- Flipping one of three coins three times and observing three heads (R code)
- Chapter 1 of “Bayes’ Rule: A Tutorial Introduction to Bayesian Analysis” (slides)
- Bayesian inference for the correlation coefficient of bivariate Gaussian data
- Bayesian stochastic volatility (SV) model with various error distributions (SV in R: Hosszejni & Kastner, 2021)
- Bayesian Logistic Regression
- Bayesian Beta Regression
- Bayesian hierarchical Beta-Binomial regression
- Bayesian hierarchical linear regression
- Zero-inflated Poisson data
- Three examples of HMC/RSTAN in action: iid Gaussian, Gaussian linear regression, and stochastic volatility AR(1) model.
- Sparse priors: Brief summaries of Carvalho, Polson and Scott (2010) and Griffin and Brown (2010)
- Sparse priors: simulation exercise – graphs
Classes 5+6: Introduction to causal inference: Simpson’s Paradox, Directed Acyclic Graphs (Paths, Junctions, Chains, Forks, Colliders and d-separation), Potential outcome, average treatment effect (ATE), quantile treatment effect (QTE), conditional average treatment effect (CATE), Stable Unit Treatment Value Assumption (SUTVA), Instrumental Variables (IV), Difference in Difference (DiD) and Regression Discontinuity Design (RDD).
- Omitted variable example + Error-in-variable problem
- Notes on causality
- Example 1: Instrumental variables (IV) + dataset
- Example 2: Difference in Difference (DiD) + dataset
- Example 3: Regression Discontinuity Design (RDD) + dataset
- CausalML Book – Applied Causal Inference: An introduction to the emerging fusion of machine learning and causal inference.
- Athey and Imbens (2017) The State of Applied Econometrics: Causality and Policy Evaluation
- Li, Ding and Mealli (2023) Bayesian causal inference: a critical review
- Neal (2020) Introduction to Causal Inference from a Machine Learning Perspective
- Neal (2020) Introduction to Causal Inference (course with many illustrations, very didactic)
- Cunningham (2021) Causal Inference: The Mixtape
- Hahn and Herren (2025) Regression Adjustment for Causal Inference: A Primer with Examples
- Heckman and Pinto (2015) Causal analysis after Haavelmo
- Pearl (2013) Reflections on Heckman and Pinto’s “Causal Analysis After Haavelmo”
- Heckman and Pinto (2022) Causality and econometrics
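The DiD estimator discussed above amounts to one interaction coefficient in a two-way regression; here is a sketch on simulated data (the true effect tau = 2 is made up):

```r
# Difference-in-differences via the interaction term of a two-way regression.
set.seed(11)
n <- 1000
treated <- rbinom(n, 1, 0.5)                  # treatment group indicator
post    <- rbinom(n, 1, 0.5)                  # post-period indicator
tau <- 2                                      # true treatment effect
y <- 1 + 0.5 * treated + 0.3 * post + tau * treated * post + rnorm(n)
fit <- lm(y ~ treated * post)
coef(fit)["treated:post"]                     # DiD estimate of tau
```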
Class 7: More econometric models: i) General linear models: heteroskedasticity, Student’s t errors and autoregressive errors); ii) Hierarchical models; iii) Limited dependent variable models (Tobit, probit, ordered probit, multinomial probit).
Class 8: Introduction to univariate time series econometrics: i) Autoregressive moving average models; ii) Unit root econometrics; iii) Seasonal models; iv) ARCH/GARCH and related models; v) Stochastic volatility models.
- Autoregressive (AR) models and moving average (MA) models
- Unit-root nonstationarity and long-memory processes
- Seasonal models
- Glossary of ARCH models
- Bayesian GARCH
- Stochastic volatility (SV) models
- GARCH(1,1) vs SV-AR(1) models with Student’s t errors
- Petrobrás data: ARCH(1), GARCH(1,1) and SV-AR(1) models
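To contrast with GARCH, a sketch simulating the SV-AR(1) model with illustrative parameter values; note the fat tails relative to a Gaussian (kurtosis 3):

```r
# Simulating a stochastic volatility SV-AR(1) process.
# Log-volatility: h_t = mu + phi * (h_{t-1} - mu) + sigma * eta_t, eta_t ~ N(0,1)
set.seed(99)
n <- 3000
mu <- -1; phi <- 0.95; sigma <- 0.2
h <- numeric(n)
h[1] <- mu + sigma / sqrt(1 - phi^2) * rnorm(1)   # draw from the stationary law
for (t in 2:n) h[t] <- mu + phi * (h[t - 1] - mu) + sigma * rnorm(1)
y <- exp(h / 2) * rnorm(n)                        # returns with time-varying scale
c(sd.y = sd(y), kurtosis = mean(y^4) / var(y)^2)  # sample kurtosis exceeds 3
```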
Class 9: Introduction to multivariate time series econometrics: i) Factor SV models; ii) Dynamic Conditional Correlation (DCC) models; iii) Vector autoregressive (VAR) models, impulses-response function, structural VAR (SVAR); iv) Large BVAR, Factor-augmented VAR (FAVAR); v) Time-varying parameter (TVP)-VAR; vi) Bayesian VAR (BVAR) and Bayesian FAVAR (BFAVAR).
- Vector autoregressive models (VAR)
- Large Bayesian VAR (BVAR), Factor-Augmented VAR (FAVAR), Time-Varying-Parameter-BVAR (TVP-BVAR) & Bayesian FAVAR
- Factor Stochastic Volatility (FSV)
- FSV x Dynamic Conditional Correlation (DCC)
Class: Final exam
Basic bibliography
- Mostly Harmless Econometrics: An Empiricist’s Companion (Angrist and Pischke, 2009)
- Analysis of Financial Time Series, 3rd Edition (Tsay, 2010)
- An Introduction to Statistical Learning (James, Witten, Hastie and Tibshirani, 2023) – https://www.statlearning.com
- Introduction to Bayesian Econometrics (Greenberg, 2013)
Additional bibliography
- Introduction to Econometrics, 3rd edition (Stock and Watson, 2010)
- Introductory Econometrics: A Modern Approach (Wooldridge, 2012)
- Time Series Analysis (Hamilton, 1994)
- Aprendizado de Máquina: Uma Abordagem Estatística (Izbicki and Mendonça, 2020) – https://tiagoms.com/publications/ame
- Estatística e Ciência de Dados (Morettin and Singer, 2021) – https://www.ime.usp.br/~pam/cdadosf3.pdf
- Introduction to Modern Bayesian Econometrics (Lancaster, 2004)
- Bayesian Econometric Methods, 2nd edition (Chan, Koop, Poirier and Tobias)
- Bayesian Statistics and Marketing (Rossi, Allenby and McCulloch, 2005)
- Time Series: Modeling, Computation, and Inference (Prado and West, 2010)