SCCS Colloquium - Dec 5, 2019

Date: December 5, 2019
Room: 00.08.053
Time: 15:00 - 16:00

Julian Suk: Second-Order Optimisation for the Training of Deep Neural Networks

Master's thesis introduction talk. Julian is advised by Severin Reiz.

Deep neural networks have become the most prominent model in machine learning due to their flexibility and, consequently, their broad applicability. Training real-world deep neural networks requires vast computational resources. Gradient descent methods still enjoy great popularity, but Hessian-based optimisation techniques are on the rise. While computing the second derivative of the cost function remains computationally expensive, a potentially much faster convergence rate justifies considering such methods. Moreover, gradient descent is inherently sequential and cannot take full advantage of highly parallelised computing architectures. This motivates exploring second-order optimisation methods in the context of high-performance computing as well.
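
As a minimal illustration of the contrast the abstract draws (a hypothetical example, not taken from the thesis), the sketch below compares a plain gradient-descent step with a Newton step, which rescales the gradient by the inverse Hessian, on a simple quadratic cost. On a quadratic, the Newton step jumps to the minimiser in a single iteration, which is the kind of convergence advantage that motivates second-order methods.

```python
import numpy as np

# Hypothetical quadratic cost f(w) = 0.5 * w @ A @ w - b @ w;
# its gradient is A @ w - b and its Hessian is the constant matrix A.
A = np.array([[3.0, 0.5],
              [0.5, 1.0]])
b = np.array([1.0, 2.0])

def grad(w):
    return A @ w - b

def hess(w):
    return A

w_gd = np.zeros(2)  # gradient-descent iterate
w_nt = np.zeros(2)  # Newton iterate
lr = 0.1            # fixed step size for gradient descent

for _ in range(20):
    w_gd = w_gd - lr * grad(w_gd)                          # first-order step
    w_nt = w_nt - np.linalg.solve(hess(w_nt), grad(w_nt))  # second-order step

w_star = np.linalg.solve(A, b)  # exact minimiser of the quadratic
print("gradient descent error:", np.linalg.norm(w_gd - w_star))
print("Newton error:          ", np.linalg.norm(w_nt - w_star))
```

In practice, methods at deep-learning scale avoid forming the Hessian explicitly and work with Hessian-vector products or curvature approximations instead; the sketch only serves to show why a curvature-rescaled step can converge so much faster than a fixed-step gradient update.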

Keywords: Deep neural networks, optimisation, machine learning

Niklas Stotzem: Normalization of Datasets for Sparse Grids Data-Driven Methods in the SG++ Datamining Pipeline

MSE Research Internship talk. Niklas is advised by Kilian Röhner.

The SG++ datamining pipeline offers different machine learning models based on sparse grid methods. These methods operate on data points located in the unit hypercube [0,1]^d. So far, the pipeline has only supported pre-scaled data. The goal of the research project was to integrate a scaling algorithm and to investigate a heuristic approach that uses only part of the available data.
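
As an illustration of the kind of preprocessing described above (a generic min-max scaling sketch; the algorithm actually integrated into the SG++ pipeline and its API may differ), the following maps each feature of a dataset affinely into [0,1]. The second function is a hypothetical variant in the spirit of the heuristic mentioned in the abstract: it estimates the scaling bounds from a random subset of the points rather than from the full dataset.

```python
import numpy as np

def minmax_scale(data):
    """Affinely map each feature column of `data` into [0, 1]."""
    lo = data.min(axis=0)
    hi = data.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)  # guard against constant features
    return (data - lo) / span

def minmax_scale_subset(data, fraction=0.1, seed=0):
    """Hypothetical heuristic (assumption, for illustration): estimate the
    bounds from a random subset instead of scanning every point, then clip
    anything outside the estimated range back into [0, 1]."""
    rng = np.random.default_rng(seed)
    n = max(1, int(fraction * len(data)))
    sample = data[rng.choice(len(data), size=n, replace=False)]
    lo, hi = sample.min(axis=0), sample.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)
    return np.clip((data - lo) / span, 0.0, 1.0)

# Usage: scale a random dataset and check that it lands in the unit hypercube.
X = np.random.default_rng(1).normal(size=(1000, 3))
print(minmax_scale(X).min(axis=0), minmax_scale(X).max(axis=0))
```

Clipping in the subset variant reflects one possible design choice: points outside the sampled range are forced onto the boundary of the hypercube so that downstream sparse grid methods still receive valid inputs.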

Keywords: Sparse Grids, Feature Scaling