Low Rank Approximation
- Term
- Summer 2018
- Lecturer
- Univ.-Prof. Dr. Daniel Kressner: John von Neumann Lecturer
- Time and Place
- Lecture: Mondays; see the News section below for dates and room
- Audience
- MA5328
- Master CSE
- Master Mathematics
- Topmath
- Master Informatics
- Master Mathematics in Data Science
- Master Data Engineering and Analytics
- Tutorials
- no tutorials
- Exam
- 60-minute written exam or 20-minute oral exam
- Semesterwochenstunden / ECTS Credits
- 2 SWS / 3 credits
- TUMonline
- https://campus.tum.de/tumonline/wblvangebot.wbshowlvoffer?ppersonnr=333087
News
The first lecture will be on Monday, 16.4.2018, 2pm (14:00) in room 02.08.020, M11.
The lecture dates have been fixed to 16.4., 23.4., 30.4., 14.5., 28.5., 4.6., 11.6., 18.6., 25.6., 2.7. (Mondays, 2pm).
See also TUMOnline at https://campus.tum.de/tumonline/wbLv.wbShowLVDetail?pStpSpNr=950377693&pSpracheNr=1
Contents
Low-rank compression is a ubiquitous tool in scientific computing and data analysis. There have been numerous exciting developments in this area during the last decade, and the goal of this course is to give an overview of these developments, covering theory, algorithms, and applications of low-rank matrix and tensor compression. Specifically, the following topics will be covered:
1. Theory
   - Low-rank matrix and tensor formats (CP, Tucker, TT, hierarchical Tucker)
   - A priori approximation results
2. Algorithms
   - Basic operations with low-rank matrices and tensors
   - SVD-based compression
   - Randomized compression
   - Alternating optimization
   - Riemannian optimization
   - Nuclear norm minimization
   - Adaptive cross approximation and variants
3. Applications
   - Image processing
   - Matrix and tensor completion
   - Model reduction
   - Solution of large- and extreme-scale linear algebra problems from various applications (dynamics and control, uncertainty quantification, quantum computing, ...)
   - Tensors in deep learning
Depending on how the course progresses and the interest of the participants, hierarchical low-rank formats (HODLR, HSS, H-matrices) may be covered as well.
Hands-on examples using publicly available software (in Matlab, Python, and Julia) will be provided throughout the course.
Lecture slides
- Slides 1: Basic concepts and subspace iteration lecture1.pdf (covered on 16.4. and 23.4.)
- Slides 2: Randomized low-rank approximation lecture2.pdf (covered on 23.4.)
- Slides 3: Low-rank approximation by deterministic column/row selection lecture3.pdf (covered on 30.4.) (updated on 28.5.)
- Slides 4: Randomized sampling and intro to tensors lecture4.pdf (covered on 14.5.)
- Slides 5: CP and Tucker decompositions lecture5.pdf (covered on 28.5.)
- Slides 6: Basics of TT decomposition lecture6.pdf (covered on 4.6. and 11.6.) (updated on 11.6.)
- Slides 7: ALS lecture7.pdf (covered on 11.6.)
- Slides 8: Optimization on low-rank manifolds lecture8.pdf (covered on 25.6. and 2.7.)
- Slides 9: Dynamical low-rank approximation lecture9.pdf (covered on 2.7.)
Mini projects
- ALS for CP decomposition project_cp.pdf
- HOSVD and ACA project_hosvd_aca.pdf
- Randomized SVD and HOSVD project_randomized_SVD.pdf
- Tensor rank project_tensorrank.pdf
Literature & external links
Pointers to the literature can be found in the slides.