Compact Course: Scalable kernel methods in machine learning - Summer 16
- Term: Summer 16
- Lecturer: Prof. George Biros; contact: Arash Bakhtiari
- Time and Place: see below
- Audience: all interested students, in particular students of BGCE, TopMath, CSE, Mathematics, and Informatics
- Tutorials: n.a.
- Exam: n.a.
- Semesterwochenstunden / ECTS Credits: n.a.
- TUMonline: n.a.
Dates
- 23.06.2016, 4pm to 6.30pm, MI 03.13.010
- 24.06.2016, 4pm to 6.30pm, MI 03.13.010
- 27.06.2016, 9am to 11.30am, MI 01.06.020
- 28.06.2016, 4pm to 6.30pm, MI 03.13.010
- 29.06.2016, 4pm to 6.30pm, MI 01.10.011
- 30.06.2016, 9am to 11.30am, MI 01.11.018
- 01.07.2016, 2pm to 4.30pm, MI 01.10.011
Registration
- Registration is closed.
Description
We will consider supervised, unsupervised, and approximation algorithms: nearest neighbors, regression, density estimation, and scattered data approximation. These methods require that each entry in our dataset corresponds to a point in a metric space. Kernel methods generalize this notion by using a kernel function whose arguments are two points in the dataset. This modification significantly expands the applicability and power of well-understood machine learning algorithms. At the core of many kernel methods lies the so-called kernel matrix, a dense, square N-by-N matrix with N being O(10^5)--O(10^9). Solving linear systems with kernel matrices is an algebraic operation required in many kernel methods. Scalable kernel methods require the use of kernel matrix approximations.
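To make the role of the kernel matrix concrete, here is a minimal Python/NumPy sketch (the Gaussian kernel, the toy data, and all names are illustrative assumptions, not material from the course): it forms the dense N-by-N kernel matrix explicitly and solves a regularized linear system of the kind that appears in kernel ridge regression.

```python
import numpy as np

def gaussian_kernel_matrix(X, bandwidth=1.0):
    """Dense N-by-N kernel matrix K[i, j] = exp(-||x_i - x_j||^2 / (2 h^2))."""
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(X**2, axis=1)[None, :] - 2 * X @ X.T
    return np.exp(-np.maximum(sq, 0.0) / (2.0 * bandwidth**2))

# Toy data: N points in d dimensions with noisy scalar labels.
rng = np.random.default_rng(0)
N, d = 500, 3
X = rng.standard_normal((N, d))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(N)

# Kernel ridge regression: solve (K + lambda * I) w = y with a direct solver.
K = gaussian_kernel_matrix(X, bandwidth=1.0)
lam = 1e-3
w = np.linalg.solve(K + lam * np.eye(N), y)

# Predictions at the training points are K @ w.
print("training RMSE:", np.sqrt(np.mean((K @ w - y)**2)))
```

Note that forming K already costs O(N^2) storage and a direct solve O(N^3) work, which is exactly why the approximations discussed below become necessary once N reaches 10^5 or more.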
We will discuss the following topics:
- Introduction to learning and approximation.
- Practical considerations for real applications: normalization, cross-validation, dimensionality reduction.
- Basic concepts for supervised and semi-supervised learning.
- Fundamental theoretical models in kernel methods.
- The Nyström method and its variants for low-rank approximations of kernel matrices (a sketch of the basic idea follows below).
- Block low-rank and hierarchical matrices for approximations of kernel matrices.

Throughout, we will discuss several numerical examples related to machine learning problems in signal and image analysis.
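As a rough illustration of the Nyström idea (uniform landmark sampling and the Gaussian kernel below are assumptions for this sketch; the course covers the method and its variants in more depth): sample m of the N points, form the tall kernel block C and the small block W on the sampled points, and use K ≈ C W^+ C^T so that matrix-vector products cost O(Nm) instead of O(N^2).

```python
import numpy as np

def gaussian_kernel(A, B, bandwidth=1.0):
    """Cross-kernel block K[i, j] = exp(-||a_i - b_j||^2 / (2 h^2))."""
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2 * A @ B.T
    return np.exp(-np.maximum(sq, 0.0) / (2.0 * bandwidth**2))

rng = np.random.default_rng(0)
N, d, m = 2000, 3, 100                  # m landmarks, m << N
X = rng.standard_normal((N, d))

# Uniformly sampled landmark points (one simple variant; others exist).
idx = rng.choice(N, size=m, replace=False)
C = gaussian_kernel(X, X[idx])          # N x m block of kernel columns
W = C[idx, :]                           # m x m block on the landmarks

# Nystrom approximation K_approx = C @ pinv(W) @ C.T, never formed densely:
# a matrix-vector product with K_approx costs O(N m) instead of O(N^2).
v = rng.standard_normal(N)
Kv_approx = C @ np.linalg.lstsq(W, C.T @ v, rcond=None)[0]

# Compare against the exact product, affordable at this small N.
Kv_exact = gaussian_kernel(X, X) @ v
print("relative error:", np.linalg.norm(Kv_approx - Kv_exact) / np.linalg.norm(Kv_exact))
```

Increasing m, or choosing landmarks more carefully (e.g., by leverage scores), trades extra work for accuracy.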
Prerequisites
- Graduate-level numerical linear algebra.
- Undergraduate-level optimization (nonlinear programming).
- Undergraduate-level probability and statistics.
- Programming (MATLAB or Python).
Textbook
There will be no textbook for the class. We will use lecture notes and readings. An excellent general reference for machine learning (available online) can be found in [1]; Chapter 2 is a good introductory reading.
Assessment
- Class participation