ST451 Half Unit
Bayesian Machine Learning
This information is for the 2023/24 session.
Teacher responsible
Dr Konstantinos Kalogeropoulos
Availability
This course is available on the MSc in Applied Social Data Science, MSc in Data Science, MSc in Econometrics and Mathematical Economics, MSc in Health Data Science, MSc in Quantitative Methods for Risk Management, MSc in Statistics, MSc in Statistics (Financial Statistics), MSc in Statistics (Financial Statistics) (Research), MSc in Statistics (Research), MSc in Statistics (Social Statistics) and MSc in Statistics (Social Statistics) (Research). This course is available as an outside option to students on other programmes where regulations permit.
This course has a limited number of places (it is controlled access) and demand is typically very high. Priority is given to Department of Statistics students and those with the course listed in their programme regulations.
Pre-requisites
Basic knowledge of probability and statistics, from a course such as ST202 Probability, Distribution Theory and Inference or an equivalent. Previous programming experience is not required, but students with no prior experience in Python must complete an online pre-sessional Python course from the Digital Skills Lab before the start of the course (https://moodle.lse.ac.uk/course/view.php?id=7696).
Course content
The course sets up the foundations of probabilistic machine learning and covers its core algorithms. Several techniques that are probabilistic in nature are introduced, and standard topics are revisited from a Bayesian viewpoint. The module provides training in state-of-the-art methods that have been applied successfully to tasks such as natural language processing, image recognition and fraud detection.
The first part of the module covers the basic concepts of Bayesian inference, such as prior and posterior distributions, Bayesian estimation, model choice and forecasting. These concepts are also illustrated in real-world applications, modelled via linear models for regression and classification, and compared with alternative approaches.
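As a flavour of the prior-to-posterior updating covered in this part of the module, the sketch below (illustrative only, not course material) shows the standard conjugate Beta-Bernoulli model: a Beta prior on a success probability, updated with binary data, yields a Beta posterior in closed form, and its mean is the Bayes estimator under squared-error loss.

```python
# Conjugate Beta-Bernoulli updating: Beta(a, b) prior on a success
# probability p, observed Bernoulli data, Beta posterior in closed form.

def beta_bernoulli_update(a, b, successes, failures):
    """Return the Beta posterior parameters after observing the data."""
    return a + successes, b + failures

def posterior_mean(a, b):
    """Bayes estimator of p under squared-error loss: the posterior mean."""
    return a / (a + b)

# Uniform Beta(1, 1) prior; observe 7 successes in 10 trials.
a_post, b_post = beta_bernoulli_update(1, 1, successes=7, failures=3)
print(a_post, b_post)                             # 8 4
print(round(posterior_mean(a_post, b_post), 3))   # 0.667
```

The same conjugate structure underlies the Bayesian linear regression and model-comparison material later in the course, where Gaussian priors play the role of the Beta prior here.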
The second part of the module introduces and provides training in further topics of probabilistic machine learning such as Graphical models, mixtures and cluster analysis, Variational approximation, advanced Monte Carlo sampling methods, sequential data and Gaussian processes. All topics are illustrated via real-world examples and are contrasted against non-Bayesian approaches.
Teaching
This course will be delivered through a combination of classes, lectures and Q&A sessions totalling a minimum of 35 hours across the Winter Term.
Syllabus:
- Bayesian inference concepts: Prior and posterior distributions, Bayes estimators, credible intervals, Bayes factors, Bayesian forecasting, posterior predictive distribution.
- Linear models for regression: Linear basis function models, Bayesian linear regression, Bayesian model comparison.
- Linear models for classification: Probabilistic generative models, Probabilistic discriminative models, The Laplace approximation, Bayesian logistic regression.
- Variational inference, Variational linear and logistic regression.
- Graphical models: Bayesian networks, Conditional independence, Markov random fields.
- Mixture models and Clustering: Clustering, Mixtures, The EM algorithm.
- Sampling methods: Basic sampling algorithms, Markov chain Monte Carlo, Gibbs sampling.
- Sequential data: Markov models, Hidden Markov models, Linear dynamical systems.
- Gaussian processes: Bayesian non-parametrics, Gaussian processes for regression and classification.
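To illustrate the kind of computation the syllabus involves, here is a minimal NumPy sketch (not course code) of Bayesian linear regression in the notation of Bishop (2006): a Gaussian prior N(0, α⁻¹I) on the weights and Gaussian noise with precision β give a closed-form Gaussian posterior with covariance S_N = (αI + βΦᵀΦ)⁻¹ and mean m_N = βS_NΦᵀt. The data-generating weights and hyperparameter values below are illustrative choices, not values from the course.

```python
import numpy as np

def posterior(Phi, t, alpha=1.0, beta=25.0):
    """Posterior mean m_N and covariance S_N of the regression weights."""
    S_N_inv = alpha * np.eye(Phi.shape[1]) + beta * Phi.T @ Phi
    S_N = np.linalg.inv(S_N_inv)
    m_N = beta * S_N @ Phi.T @ t
    return m_N, S_N

# Synthetic data from a line with weights (0.5, 2.0) plus Gaussian noise.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=50)
t = 0.5 + 2.0 * x + rng.normal(0.0, 0.2, size=50)

# Linear basis functions: phi(x) = [1, x].
Phi = np.column_stack([np.ones_like(x), x])
m_N, S_N = posterior(Phi, t)
print(m_N)  # posterior mean close to the true weights (0.5, 2.0)
```

With conjugate Gaussian priors the posterior is exact; the variational and Laplace approximations on the syllabus become necessary once this conjugacy is lost, as in Bayesian logistic regression.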
Formative coursework
Students will be expected to produce 10 problem sets in the WT.
These problem sets prepare students for both summative assessment components. They include theoretical exercises targeting learning outcomes a and b, as well as computer-based assignments (for learning outcome c) whose results must be presented in a suitable form for the purposes of learning outcome d. Additionally, and mostly in relation to learning outcome b, students will be encouraged to share and compare their responses to some of the more challenging parts of the problem sets through dedicated Moodle forums.
Indicative reading
- C. M. Bishop, Pattern Recognition and Machine Learning, Springer 2006
- K. Murphy, Machine Learning: A Probabilistic Perspective, MIT Press, 2012
- S. Rogers and M. Girolami, A First Course in Machine Learning, Second Edition, Chapman and Hall/CRC, 2016
- D. J. C. MacKay, Information Theory, Inference and Learning Algorithms, Cambridge University Press, 2003
- D. Barber, Bayesian Reasoning and Machine Learning, Cambridge University Press 2012
Assessment
Exam (50%, duration: 2 hours) in the spring exam period.
Project (50%) in the ST.
Student performance results
(2019/20 - 2021/22 combined)
| Classification | % of students |
| --- | --- |
| Distinction | 42.2 |
| Merit | 35.8 |
| Pass | 13.8 |
| Fail | 8.3 |
Key facts
Department: Statistics
Total students 2022/23: 22
Average class size 2022/23: 11
Controlled access 2022/23: Yes
Lecture capture used 2022/23: Yes (LT)
Value: Half Unit
Course selection videos
Some departments have produced short videos to introduce their courses. Please refer to the course selection videos index page for further information.
Personal development skills
- Self-management
- Team working
- Problem solving
- Application of information skills
- Communication
- Application of numeracy skills