Algorithms for Uncertainty Quantification - Summer 18
- Dr. Tobias Neckel
- Time and Place
- Lecture: Tuesday, 14:15-15:45 MI 02.07.023
- Tutorial: Wednesday, 12:15-13:45 MI 02.07.023
- Master students, e.g. of CSE, mathematics, informatics, data science, data engineering and analytics, physics,...
- Friedrich Menhorn
- Exam (preliminary): 01.08.2018, 11:00-12:15
- Semesterwochenstunden / ECTS Credits
- 4 SWS (2V+2Ü) / 5 Credits
- Algorithms for UQ (IN2345)
- As announced in the tutorial we will swap lecture and tutorial in week 24. That means: Tutorial: June 12, 14:00-16:00; Lecture: June 13, 12:00-14:00.
- The evaluation of the lecture takes place during the lecture on June 13, 2018. Please bring your laptop.
- Typos in the slides of §6 have been fixed. A print version of the slides is now also available.
- The first lecture takes place on April 10, 2018.
Computer simulations of many phenomena rely heavily on input data that, in many cases, are not known exactly but are subject to random effects. Uncertainty Quantification (UQ) is a cutting-edge research field that supports decision making under such uncertainties. Typical questions tackled in this course are: “How can I incorporate measurement errors into simulations and still obtain meaningful output?”, “What can I do to be 98.5% sure that my robot trajectory will be safe?”, “Which algorithms are available?”, “What is a good measure of the complexity of UQ algorithms?”, “What is the potential of the different algorithms for parallelization and high-performance computing?”, and “Is there software available for UQ, or do I need to program everything from scratch?”
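The first of these questions can be illustrated in a few lines. The sketch below (a toy model with numbers chosen purely for illustration, not taken from the course material) propagates a normally distributed measurement error through a simulation by plain Monte Carlo sampling, the brute-force approach treated early in the course:

```python
import numpy as np

# Hypothetical setup: a measured input x is only known up to Gaussian
# noise, x ~ N(mu=2.0, sigma=0.1). We push this uncertainty through a
# toy model f(x) = x**2 by plain Monte Carlo sampling.
rng = np.random.default_rng(42)

def model(x):
    return x ** 2

samples = rng.normal(loc=2.0, scale=0.1, size=100_000)
outputs = model(samples)

mean = outputs.mean()      # estimate of E[f(x)]
std = outputs.std(ddof=1)  # estimate of the output spread
# Analytically, E[x^2] = mu^2 + sigma^2 = 4.01 and the output
# standard deviation is sqrt(4*mu^2*sigma^2 + 2*sigma^4) ≈ 0.40.
print(mean, std)
```

The estimator's error decays like 1/sqrt(N), independent of the number of uncertain inputs, which is exactly why Monte Carlo is robust but expensive, and why the later lectures look for cheaper alternatives.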
In particular, this course will cover:
- Brief repetition of basic probability theory and statistics
- 1st class of algorithms: sampling methods for UQ (Monte Carlo): the brute-force approach
- More advanced sampling methods: Quasi Monte Carlo & Co.
- Relevant properties of interpolation & quadrature
- 2nd class of algorithms: stochastic collocation via the pseudo-spectral approach: Is it possible to obtain accurate results at (much) lower cost?
- 3rd class of algorithms: stochastic Galerkin: Are we willing to (heavily) modify our software to gain accuracy?
- Dimensionality reduction in UQ: apply hierarchical methodologies such as tree-based sparse grid quadrature. What does the connection to machine learning and classification problems look like?
- Which parameters actually do matter? => sensitivity analysis (Sobol’ indices etc.)
- What if there is an infinite amount of parameters? => approximation methods for random fields (KL expansion)
- Software for UQ: What packages are available? What are the advantages and downsides of major players (such as chaospy, UQTk, and DAKOTA)?
- Outlook: inverse UQ problems, data aspects, real-world measurements
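To give a taste of the second class of algorithms, the pseudo-spectral approach can be sketched in a few lines of numpy, assuming a one-dimensional standard normal input ξ and the toy model f(ξ) = ξ²; all names and numbers here are illustrative and not taken from the course material:

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

# Non-intrusive (pseudo-spectral) polynomial chaos sketch: expand
# f(xi) in probabilists' Hermite polynomials He_k, which are
# orthogonal with respect to the standard normal density. The
# coefficients c_k = E[f(xi) He_k(xi)] / E[He_k(xi)^2] are estimated
# by Gauss-Hermite quadrature, i.e. by evaluating the model only at
# the quadrature nodes.

def model(xi):
    return xi ** 2

order = 3                           # PCE truncation order
nodes, weights = hermegauss(8)      # Gauss-HermiteE nodes and weights
weights = weights / weights.sum()   # normalize to a probability measure

coeffs = []
for k in range(order + 1):
    basis_k = hermeval(nodes, [0] * k + [1])  # He_k at the nodes
    norm_k = math.factorial(k)                # E[He_k^2] = k!
    c_k = np.sum(weights * model(nodes) * basis_k) / norm_k
    coeffs.append(c_k)

# Since xi^2 = He_2(xi) + 1, the exact expansion is
# c_0 = 1, c_1 = 0, c_2 = 1, c_3 = 0.
print(np.round(coeffs, 6))
```

For smooth models, a handful of model evaluations at quadrature nodes can replace tens of thousands of Monte Carlo samples; mean and variance then follow directly from the coefficients (mean = c_0, variance = sum of c_k² k! for k ≥ 1).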
- 10.04.18: Introduction
- 24.04.18: Repetition: probability theory & statistics, print version (4x2 slides)
- 08.05.18: Intro sampling methods, print version (4x2 slides)
- 15.05.18: More advanced sampling methods, print version (4x2 slides)
- 23.05.18: Aspects of interpolation and quadrature, print version (4x2 slides)
- 29.05.18: Polynomial chaos 1: the pseudo-spectral approach, print version (4x2 slides)
- 05.06.18: Polynomial Chaos 2: the stochastic Galerkin approach, print version (4x2 slides)
- 13.06.18: Sparse grids in Uncertainty Quantification, print version (4x2 slides)
- 19.06.18: Sensitivity analysis, print version (4x2 slides)
- 26.06.18: Random fields, print version (4x2 slides)
- 03.07.18: Software for UQ, print version (4x2 slides)
Worksheets and Solutions
| No. | Topic | Worksheet | Date | Materials |
|-----|-------|-----------|------|-----------|
| 1 | Python overview | Worksheet1 | April 11 | Template; Solution 1, Solution 2, Solution 3 |
| 2 | Probability and statistics overview | Worksheet2 | May 02 | Solution 1, Solution 6, Solution.pdf |
| 3 | Standard Monte Carlo sampling | Worksheet3 | May 09 | Template; Solution 2, Solution 3, Solution 4, Solution 5, Solution.pdf |
| 4 | More advanced sampling techniques | Worksheet4 | May 16 | Template |
| 5 | Aspects of interpolation and quadrature | Worksheet5 | May 30 | Template |
| 6 | Polynomial Chaos 1: the pseudo-spectral approach | Worksheet6 | June 06 | Template |
| 7 | Polynomial Chaos 2: the stochastic Galerkin approach | Worksheet7 | June 12 | |
| 8 | The sparse pseudo-spectral approach | Worksheet8 | June 20 | Template |
| 9 | Sobol' indices for global sensitivity analysis | Worksheet9 | June 27 | Ex2 Template |
| 10 | Random fields in Uncertainty Quantification | Worksheet10 | July 04 | Template |
- first exam (preliminary, check TUMonline):
- WED, Aug 01, 2018, 11:00-12:15 (75 min)
- covered topics (preliminary): everything except:
- inverse problems (lecture 12)
- details of pure Python programming
- specific API of chaospy (or other packages)
- style of exam exercises: similar to tutorials
- allowed material: tba
- Most likely a written exam. However, if the number of registered candidates is low, the exam will be carried out orally (about 30 min).
- Old exam: Exam SS17
- R. C. Smith, Uncertainty Quantification – Theory, Implementation, and Applications, SIAM, 2014
- D. Xiu, Numerical Methods for Stochastic Computations – A Spectral Method Approach, Princeton Univ. Press, 2010
- T. J. Sullivan, Introduction to Uncertainty Quantification, Texts in Applied Mathematics 63, Springer, 2015
- Advanced Monte Carlo:
- Interpolation and Quadrature:
- Polynomial Chaos Expansion:
- Sparse Grids:
- Sensitivity Analysis:
- Smith, Chapters 13 and 15
- Random Fields:
- Stochastic Processes: