When observing data, the key question is: what can I learn from the observation? Bayesian inference treats all parameters of the model as random variables, and the main task is to update their distribution as new data is observed. Hence, quantifying the uncertainty of the parameter estimates is always part of the task. In this course we introduce the basic theoretical concepts of Bayesian statistics and Bayesian inference. We discuss computational techniques and their implementations, different types of models, as well as model selection procedures. We work with existing datasets and use the PyMC3 framework for the practicals.
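The update step described above is expressed by Bayes' theorem (also the first topic below). For orientation only, its standard form is shown here, with θ denoting the model parameters and y the observed data:

```latex
% Bayes' theorem: the prior p(\theta) is updated to the posterior
% p(\theta \mid y) once the data y have been observed.
p(\theta \mid y) = \frac{p(y \mid \theta)\, p(\theta)}{p(y)},
\qquad
p(y) = \int p(y \mid \theta)\, p(\theta)\, \mathrm{d}\theta .
```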
The main topics are:
- Bayes theorem
- Prior and posterior distributions
- Computational challenges and techniques: MCMC, variational approaches
- Models: Mixture Models, Bayesian Neural Networks, Variational Autoencoder, Normalising Flows
- PyMC3 framework for Bayesian computation (a minimal usage sketch follows this list)
- Running Bayesian models on a supercomputer
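To give a flavour of what setting up a Bayesian model in PyMC3 looks like, below is a minimal sketch that infers the mean of a Normal distribution from a few observations and samples the posterior with MCMC. The toy data and variable names are illustrative assumptions and not taken from the course material.

```python
# Minimal PyMC3 sketch (illustrative only): infer the unknown mean of a
# Normal distribution from a handful of observations.
import numpy as np
import pymc3 as pm

data = np.array([4.8, 5.1, 5.3, 4.9, 5.0])  # toy observations (assumed)

with pm.Model():
    # Prior: belief about the mean before seeing the data
    mu = pm.Normal("mu", mu=0.0, sigma=10.0)
    # Likelihood: how the data are generated given the parameter
    pm.Normal("obs", mu=mu, sigma=1.0, observed=data)
    # Posterior: updated belief, drawn via MCMC (NUTS by default)
    trace = pm.sample(1000, tune=1000, chains=2, random_seed=42,
                      return_inferencedata=False)

# Posterior mean and standard deviation of the parameter
print(trace["mu"].mean(), trace["mu"].std())
```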
Contents level (in hours / in %):
- Beginner's contents: 4.5 h (30 %)
- Intermediate contents: 10.5 h (70 %)
- Advanced contents: 0 h (0 %)
- Community-targeted contents: 0 h (0 %)
Prerequisites:
Participants should be familiar with general statistical concepts such as distributions and samples. Furthermore, familiarity with fundamental Machine Learning concepts such as regression, classification, and training is helpful.
A personal institutional email address (university/research institution, government agency, organisation, or company) is required to register for JSC training courses. If you don't have an institutional email address, please get in touch with the contact person for this course.
Target Audience:
PhD students and Postdocs
Learning Outcome:
The ability to set up a Bayesian approach within a given framework (e.g. PyMC3)
Language:
This course is given in English.
Duration:
5 half days
Dates:
16-20 March 2026, 13:00 - 17:00
Venue:
Online
Number of Participants:
Maximum 25
Instructors:
Alina Bazarova, Jose Robledo (JSC)
Fees:
This course is offered free of charge.