Uncertainty Quantification - 2024 entry
MODULE TITLE | Uncertainty Quantification | CREDIT VALUE | 15 |
---|---|---|---|
MODULE CODE | MTHM063 | MODULE CONVENER | Dr Stefan Siegert (Coordinator) |
DURATION: TERM | 1 | 2 | 3 |
---|---|---|---|
DURATION: WEEKS | 11 | | |
Number of Students Taking Module (anticipated) | 30 |
---|---|
Computer models are used in many areas of science: engineers use them to optimise airfoils, epidemiologists use them to identify disease hotspots, and atmospheric scientists use them to estimate climate change impacts. However, even the most complex computer models are only approximations of the real world, and hence our knowledge about the real-world system remains uncertain. This course provides a detailed introduction to the field of uncertainty quantification in mathematical and computational modelling. The key concepts are taught through a mix of mathematical theory, computational methods, and real-world examples. After taking this course, you will have the expertise to advise scientists in different fields on computer model experimentation and to carry out independent uncertainty quantification analyses.
Prerequisites: MTH1004 and MTH2006 (or equivalent), and programming experience in a high-level language such as R or Python
This module will cover key mathematical and computational concepts from the field of uncertainty quantification, including emulation, experimental design, sensitivity analysis, calibration, and prediction. All methods will be taught through a mix of mathematical theory, computer exercises and applications. Upon completion, students will be able to confidently discuss the strengths and weaknesses of computer models, advise on experimental design, calibrate and evaluate models, and interpret simulation experiment results in the presence of observation and model error. During the course, students will read and discuss some of the seminal academic papers in the field, learn about current practice in research and industry, and receive guidance and resources for further study.
On successful completion of this module you should be able to:
Module Specific Skills and Knowledge
2. critically discuss common sources of uncertainty in computer model experiments in different fields
Discipline Specific Skills and Knowledge
5. independently gain knowledge in new areas of mathematics and science
Personal and Key Transferable / Employment Skills and Knowledge
7. demonstrate the ability to efficiently manage information and time
1. Computer simulation experiments: Fundamentally, computer simulation models are complicated input-output relationships that use initial conditions, boundary conditions and model parameters to generate data about the outcome of the simulated experiment. We introduce basic concepts and themes in computer simulation experiments in various fields and discuss common sources of uncertainty. We illustrate these concepts with simple pedagogic examples as well as with complex examples from real-world applications in engineering, the atmospheric sciences and biology.
2. Foundations of Probability and Statistics: We use probability as the fundamental tool to quantify uncertain outcomes. We review the most important concepts from probability theory and statistical modelling, including probability distributions, expectation, variance, covariance, sampling, conditional probability, parameter estimation, and regression analysis. To prepare for the chapter on Gaussian process emulation we introduce random vectors, the multivariate normal distribution, and related concepts.
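A minimal sketch of the multivariate-normal conditioning calculation that underpins Gaussian process emulation, assuming NumPy is available and using made-up means, variances and correlation, might look like this:

```python
import numpy as np

# Joint distribution of (X1, X2): illustrative means, standard deviations and correlation
mu = np.array([1.0, 2.0])
sd = np.array([0.5, 1.5])
rho = 0.8
Sigma = np.array([[sd[0] ** 2, rho * sd[0] * sd[1]],
                  [rho * sd[0] * sd[1], sd[1] ** 2]])

# Condition X2 on an observed value of X1 using the standard
# multivariate-normal conditioning formulas:
#   E[X2 | X1 = x1]   = mu2 + Sigma21 / Sigma11 * (x1 - mu1)
#   Var[X2 | X1 = x1] = Sigma22 - Sigma21^2 / Sigma11
x1_obs = 1.8
cond_mean = mu[1] + Sigma[1, 0] / Sigma[0, 0] * (x1_obs - mu[0])
cond_var = Sigma[1, 1] - Sigma[1, 0] ** 2 / Sigma[0, 0]
print(f"X2 | X1={x1_obs}: mean {cond_mean:.3f}, variance {cond_var:.3f}")

# Monte Carlo check: sample the joint distribution and look near x1_obs
rng = np.random.default_rng(0)
samples = rng.multivariate_normal(mu, Sigma, size=200_000)
near = samples[np.abs(samples[:, 0] - x1_obs) < 0.01]
print(f"Monte Carlo check: mean {near[:, 1].mean():.3f}, variance {near[:, 1].var():.3f}")
```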
3. Emulation: Complex computer models often take a long time to run, which makes a detailed analysis of their input-output relationship time-consuming. To address this, we fit "emulators", fast statistical models that approximate the output of the complex model at any input value. We focus on the class of Gaussian process emulators, including their fitting, hyperparameter estimation, validation, and diagnostics.
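A rough sketch of such an emulator, assuming scikit-learn is available and using an arbitrary cheap toy function in place of an expensive simulator, might look like this:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def simulator(x):
    # Stand-in for an expensive computer model: a cheap analytic function
    return np.sin(3 * x) + 0.5 * x

# A small design: a few input values at which the simulator has been run
X_train = np.linspace(0.0, 2.0, 6).reshape(-1, 1)
y_train = simulator(X_train).ravel()

# Gaussian process emulator with a squared-exponential (RBF) kernel;
# hyperparameters are estimated by maximising the marginal likelihood
kernel = ConstantKernel(1.0) * RBF(length_scale=0.5)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(X_train, y_train)

# Predict at new inputs, with uncertainty, without re-running the simulator
X_new = np.linspace(0.0, 2.0, 5).reshape(-1, 1)
mean, sd = gp.predict(X_new, return_std=True)
for x, m, s in zip(X_new.ravel(), mean, sd):
    print(f"x={x:.2f}: emulator mean {m:+.3f} +/- {2 * s:.3f}, truth {simulator(x):+.3f}")
```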
4. Experimental design: When simulation experiments are expensive, we must choose wisely at which input values to run the model in order to learn about its output behaviour. We introduce different experimental design strategies to choose suitable input points, such as fractional factorial, central composite, Latin hypercube, quasi-Monte Carlo and adaptive designs.
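For example, a space-filling Latin hypercube design could be generated with SciPy's quasi-Monte Carlo module; the input dimension, bounds and number of runs below are purely illustrative:

```python
from scipy.stats import qmc

# Latin hypercube design: 10 runs over 3 input parameters
sampler = qmc.LatinHypercube(d=3, seed=42)
design_unit = sampler.random(n=10)  # points in the unit cube [0, 1]^3

# Rescale to the physical ranges of the (hypothetical) model inputs
lower = [0.1, 273.0, 0.0]   # illustrative lower bounds
upper = [2.0, 310.0, 5.0]   # illustrative upper bounds
design = qmc.scale(design_unit, lower, upper)

print(design)  # each row is one simulator run; each column covers its range evenly
```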
5. Sensitivity analysis: The outputs of simulation models are sensitive to variations in their input parameters. The aim of sensitivity analysis is to apportion the output uncertainty to the individual input parameters, identifying the parameters to which the model is most sensitive and those that are irrelevant. You will learn methods for derivative-based (local) sensitivity analysis and variance-based (global) sensitivity analysis.
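A minimal sketch of a variance-based analysis, estimating first-order Sobol indices with a Monte Carlo pick-and-freeze scheme on a made-up three-input toy model, might look like this:

```python
import numpy as np

def model(x):
    # Toy simulator with three inputs: x1 matters most, x3 not at all
    return 4.0 * x[:, 0] + 1.0 * x[:, 1] ** 2 + 0.0 * x[:, 2]

rng = np.random.default_rng(1)
n, d = 100_000, 3
A = rng.uniform(size=(n, d))  # two independent input samples
B = rng.uniform(size=(n, d))
yA, yB = model(A), model(B)
var_y = np.var(np.concatenate([yA, yB]))

# First-order index S_i = Var(E[Y | X_i]) / Var(Y), estimated by
# replacing column i of A with column i of B ("pick and freeze")
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]
    S_i = np.mean(yB * (model(ABi) - yA)) / var_y
    print(f"S_{i + 1} ~= {S_i:.3f}")
```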
6. Calibration and parameter tuning: Simulation models typically have unknown parameters, and we can use observations of the real-world system to learn about these parameters. In doing so, we must account for the fact that the observations can be noisy and incomplete, and that the simulation model is only an approximation of the real-world system. You will learn a number of different strategies for tuning and calibration, such as discrepancy modelling, numerical optimisation, history matching, and Bayesian calibration.
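As a small sketch of one such strategy, history matching, candidate parameter values can be ruled out when their implausibility exceeds a conventional cut-off of 3; the toy simulator, observation and variances below are made up:

```python
import numpy as np

def simulator(theta):
    # Toy "simulator": quadratic response to a single parameter
    return theta ** 2 + 1.0

# Observation of the real system, with assumed error variances
z = 5.0                  # observed value
var_obs = 0.25           # observation error variance (assumed)
var_discrepancy = 0.50   # model discrepancy variance (assumed)

# Implausibility I(theta) = |z - f(theta)| / sqrt(total variance)
theta_grid = np.linspace(0.0, 4.0, 81)
implausibility = np.abs(z - simulator(theta_grid)) / np.sqrt(var_obs + var_discrepancy)

# Keep the "not ruled out yet" parameter values
not_ruled_out = theta_grid[implausibility < 3.0]
print(f"Plausible theta range: [{not_ruled_out.min():.2f}, {not_ruled_out.max():.2f}]")
```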
7. Prediction: We can use the calibrated simulation model to make predictions about the real-world system. To communicate uncertainty, predictions take the form of a probability distribution over all possible outcomes. We should then assess how good or bad our probabilistic predictions were. In this part of the course, you will learn the basics of probabilistic forecast evaluation, including reliability analysis, scoring rules, and skill scores.
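As a small illustration of a scoring rule and a skill score, the Brier score of a set of probability forecasts for a binary event can be compared against a climatological reference; the forecast probabilities and outcomes below are made up:

```python
import numpy as np

# Probability forecasts for a binary event, and what actually happened (1 = event occurred)
forecasts = np.array([0.9, 0.7, 0.2, 0.1, 0.8, 0.3])  # illustrative values
outcomes = np.array([1, 1, 0, 0, 0, 1])

# Brier score: mean squared difference between forecast probability and outcome
# (a proper scoring rule; lower is better, 0 is perfect)
brier = np.mean((forecasts - outcomes) ** 2)

# Reference forecast: always issue the climatological base rate
base_rate = outcomes.mean()
brier_ref = np.mean((base_rate - outcomes) ** 2)

# Brier skill score: 1 = perfect, 0 = no better than climatology, < 0 = worse
bss = 1.0 - brier / brier_ref
print(f"Brier score {brier:.3f}, reference {brier_ref:.3f}, skill score {bss:.3f}")
```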
Scheduled Learning & Teaching Activities | 33 | Guided Independent Study | 117 | Placement / Study Abroad | 0 |
---|---|---|---|---|---|
Category | Hours of study time | Description |
---|---|---|
Scheduled learning and teaching activities | 33 | Lectures and example classes |
Guided independent study | 117 | Study of lecture notes and readings, working on exercises, revision, assessments |
Form of Assessment | Size of Assessment (e.g. duration/length) | ILOs Assessed | Feedback Method |
---|---|---|---|
Exercise sheets | 4 exercise sheets | All | Verbal/written feedback on request |
Online quizzes | 4 quizzes | All | Automated scoring and general feedback, verbal/written individual feedback on request |
Coursework | 60% | Written Exams | 40% | Practical Exams | 0% |
---|---|---|---|---|---|
Form of Assessment | % of Credit | Size of Assessment (e.g. duration/length) | ILOs Assessed | Feedback Method |
---|---|---|---|---|
Midterm test | 20 | 40 minutes | All | Written |
Midterm test | 20 | 40 minutes | All | Written |
UQ Report | 60 | 15 pages | All | Written |
Original Form of Assessment | Form of Re-assessment | ILOs Re-assessed | Time Scale for Re-assessment |
---|---|---|---|
Midterm test | Midterm test | All | Referral/deferral period |
Midterm test | Midterm test | All | Referral/deferral period |
UQ Report | UQ Report | All | Referral/deferral period |
Deferrals: Reassessment will be by coursework and/or written exam in the deferred element only. For deferred candidates, the module mark will be uncapped.
Referrals: Reassessment will be by a single exam only, worth 100% of the module mark. As it is a referral, the mark will be capped at 40%.
The resources listed below indicate the type and level of information that you are expected to consult. Further guidance will be provided by the Module Convener.
Reading list for this module:
CREDIT VALUE | 15 | ECTS VALUE | |
---|---|---|---|
PRE-REQUISITE MODULES | None |
---|---|
CO-REQUISITE MODULES | None |
NQF LEVEL (FHEQ) | | AVAILABLE AS DISTANCE LEARNING | No |
---|---|---|---|
ORIGIN DATE | Monday 11th March 2024 | LAST REVISION DATE | Tuesday 9th April 2024 |
KEY WORDS SEARCH | None Defined |
---|---|
Please note that all modules are subject to change; please get in touch if you have any questions about this module.