Low Rank Approximation

Term: Summer 2018
Lecturer: Univ.-Prof. Dr. Daniel Kressner (John von Neumann Lecturer)
Time and Place: Lecture on Mondays; see the News section below for dates, time, and room
Audience: MA5328; Master CSE, Master Mathematics, TopMath, Master Informatics, Master Mathematics in Data Science, Master Data Engineering and Analytics
Tutorials: no tutorials
Exam: 60-minute written exam or 20-minute oral exam
Semesterwochenstunden / ECTS Credits: 2 SWS / 3 credits
TUMonline: https://campus.tum.de/tumonline/wblvangebot.wbshowlvoffer?ppersonnr=333087



News

The first lecture will be on Monday, 16.4.2018, 2pm (14:00) in room 02.08.020, M11.

The lecture dates have been fixed to 16.4., 23.4., 30.4., 14.5., 28.5., 4.6., 11.6., 18.6., 25.6., and 2.7. (Mondays, 2pm).

See also TUMonline: https://campus.tum.de/tumonline/wbLv.wbShowLVDetail?pStpSpNr=950377693&pSpracheNr=1

Contents

Low-rank compression is a ubiquitous tool in scientific computing and data analysis. There have been numerous exciting developments in this area during the last decade, and the goal of this course is to give an overview of these developments, covering theory, algorithms, and applications of low-rank matrix and tensor compression. Specifically, the following topics will be covered:

1. Theory

  • Low-rank matrix and tensor formats (CP, Tucker, TT, hierarchical Tucker)
  • A priori approximation results

2. Algorithms

  • Basic operations with low-rank matrices and tensors
  • SVD-based compression (a minimal sketch follows after this outline)
  • Randomized compression
  • Alternating optimization
  • Riemannian optimization
  • Nuclear norm minimization
  • Adaptive cross approximation and variants

3. Applications

  • Image processing
  • Matrix and tensor completion
  • Model reduction
  • Solution of large- and extreme-scale linear algebra problems from various applications (dynamics and control, uncertainty quantification, quantum computing, ...)
  • Tensors in deep learning
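
To make the item "SVD-based compression" above concrete, here is a minimal NumPy sketch of the best rank-k approximation obtained by truncating the SVD, which is optimal in the spectral and Frobenius norms by the Eckart-Young-Mirsky theorem. This is illustrative only, not course material; the test matrix and the rank k are made up.

```python
import numpy as np

def truncated_svd(A, k):
    """Best rank-k approximation of A (Eckart-Young-Mirsky),
    computed here via a full SVD for simplicity."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :k], s[:k], Vt[:k, :]

# Illustrative test matrix with geometrically decaying singular values
n = 200
Q1, _ = np.linalg.qr(np.random.randn(n, n))
Q2, _ = np.linalg.qr(np.random.randn(n, n))
sigma = 2.0 ** -np.arange(n)
A = Q1 @ np.diag(sigma) @ Q2.T

k = 10
Uk, sk, Vtk = truncated_svd(A, k)
Ak = Uk @ np.diag(sk) @ Vtk
# The spectral-norm error of the best rank-k approximation equals sigma_{k+1}
print(np.linalg.norm(A - Ak, 2), sigma[k])
```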

Depending on how the course progresses and on the interest of the participants, hierarchical low-rank formats (HODLR, HSS, H-matrices) may be covered as well.

Hands-on examples using publicly available software (in Matlab, Python, and Julia) will be provided throughout the course.
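
As a flavor of what such a hands-on example might look like in Python, the following sketch implements randomized low-rank approximation via a Gaussian range finder with a few steps of subspace iteration, in the spirit of Halko, Martinsson, and Tropp, covering the "Randomized compression" topic listed above. Function names, the oversampling parameter, and the test matrix are illustrative assumptions, not code from the course.

```python
import numpy as np

def randomized_low_rank(A, k, p=10, q=1):
    """Randomized rank-k approximation via a Gaussian range finder with
    q steps of subspace iteration (Halko/Martinsson/Tropp style).
    k: target rank, p: oversampling parameter, q: power-iteration steps."""
    m, n = A.shape
    Omega = np.random.randn(n, k + p)   # Gaussian test matrix
    Q, _ = np.linalg.qr(A @ Omega)      # orthonormal basis for the sampled range
    for _ in range(q):                  # optional subspace (power) iteration
        Q, _ = np.linalg.qr(A.T @ Q)
        Q, _ = np.linalg.qr(A @ Q)
    B = Q.T @ A                         # small (k+p) x n matrix
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ Ub)[:, :k], s[:k], Vt[:k, :]

# Illustrative use on a matrix of exact rank 60
A = np.random.randn(500, 60) @ np.random.randn(60, 400)
U, s, Vt = randomized_low_rank(A, k=60)
print(np.linalg.norm(A - U @ np.diag(s) @ Vt) / np.linalg.norm(A))
```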

Lecture slides

  • Slides 1: Basic concepts and subspace iteration lecture1.pdf (covered on 16.4. and 23.4.)
  • Slides 2: Randomized low-rank approximation lecture2.pdf (covered on 23.4.)
  • Slides 3: Low-rank approximation by deterministic column/row selection lecture3.pdf (covered on 30.4.) (updated on 14.5.)
  • Slides 4: Randomized sampling and intro to tensors lecture4.pdf (covered on 14.5.; a toy tensor-train sketch follows below)
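
To complement the tensor part announced for Slides 4, here is a toy sketch of the TT-SVD construction (sequential truncated SVDs of matrix unfoldings, following Oseledets). It is meant purely as an illustration; the tolerance handling, the helper tt_to_full, and the rank-1 test tensor are made-up examples, not the course's reference implementation.

```python
import numpy as np

def tt_svd(X, eps=1e-10):
    """Toy TT-SVD: compress a d-way array X into tensor-train cores
    G_1, ..., G_d by successive truncated SVDs of matrix unfoldings."""
    dims = X.shape
    d = len(dims)
    delta = eps * np.linalg.norm(X) / np.sqrt(d - 1)   # per-step truncation level
    cores, r = [], 1
    C = X.reshape(r * dims[0], -1)
    for k in range(d - 1):
        U, s, Vt = np.linalg.svd(C, full_matrices=False)
        tail = np.sqrt(np.cumsum(s[::-1] ** 2))[::-1]  # error if only the first j values are kept
        rk = max(1, int(np.sum(tail > delta)))         # smallest admissible rank
        cores.append(U[:, :rk].reshape(r, dims[k], rk))
        C = (s[:rk, None] * Vt[:rk, :]).reshape(rk * dims[k + 1], -1)
        r = rk
    cores.append(C.reshape(r, dims[-1], 1))
    return cores

def tt_to_full(cores):
    """Contract the TT cores back into the full array (for checking only)."""
    X = cores[0]
    for G in cores[1:]:
        X = np.tensordot(X, G, axes=([-1], [0]))
    return np.squeeze(X, axis=(0, -1))

# Illustrative check on a small rank-1 tensor
x, y, z = np.random.randn(3, 20)
X = x[:, None, None] * y[None, :, None] * z[None, None, :]
cores = tt_svd(X, eps=1e-12)
print([G.shape for G in cores], np.linalg.norm(X - tt_to_full(cores)))
```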

Literature & external links

Pointers to the literature can be found in the slides.