SCCS Colloquium - Apr 17, 2019

From Sccswiki
Revision as of 13:02, 9 April 2019 by Makis (talk | contribs)
Date: April 17, 2019
Room: 02.07.023
Time: 15:00 - 16:00

Jan Schopohl: Domain Parallelization of SGDE based Classification

This is a Bachelor's thesis submission talk. Jan is advised by Kilian Röhner.

This thesis describes the cluster-level domain parallelization of a sparse grid density estimation (SGDE) based classification algorithm. The online phase of the implementation of this algorithm in the SG++ toolbox was parallelized using MPI and the ScaLAPACK library, so that distributed compute resources can be used efficiently and the implementation is no longer constrained to a single node. Instead of a data-parallel approach, model parallelization was chosen in order to achieve greater flexibility and efficiency on complex grids. The parallel classifier was integrated into the existing datamining pipeline of the SG++ toolbox. The performance of the new approach was evaluated for weak and strong scaling, and the influence of the parallelization parameters was analyzed.
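The row-to-rank mapping behind a ScaLAPACK-style distribution can be illustrated with a minimal sketch (plain Python, no MPI dependency; the function name and the simplified 1-D layout are illustrative assumptions, not the actual SG++/ScaLAPACK implementation, which uses a 2-D block-cyclic layout over a process grid):

```python
# Sketch of a 1-D block-cyclic distribution of matrix rows across ranks,
# the distribution scheme ScaLAPACK is built on. Illustrative only.

def block_cyclic_rows(n_rows, n_ranks, block_size):
    """Return a list mapping each row index to the rank that owns it."""
    owner = []
    for row in range(n_rows):
        block = row // block_size        # which contiguous block the row falls in
        owner.append(block % n_ranks)    # blocks are dealt out to ranks cyclically
    return owner

# Example: 10 rows, 3 ranks, blocks of 2 rows.
# Rows 0-1 go to rank 0, rows 2-3 to rank 1, rows 4-5 to rank 2,
# then the cycle restarts: rows 6-7 to rank 0, rows 8-9 to rank 1.
owners = block_cyclic_rows(10, 3, 2)
```

The cyclic dealing is what balances the load when the work per row is uneven, which is one reason a model-parallel decomposition can stay efficient on complex grids.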

Keywords: Sparse Grids, SG++, Classification, Parallelization

Language: English

Kislaya Ravi: Neural Network Hyperparameter Optimization using SNOWPAC

This is a Master's thesis submission talk. Kislaya is advised by Friedrich Menhorn and examined by Prof. Hans-Joachim Bungartz and Prof. Laura Leal-Taixe (TUM Computer Vision).

Automatically searching for the optimal set of hyperparameters is crucial in practical applications of deep learning. In this work, we optimize hyperparameters using mixed-integer SNOWPAC (Stochastic Nonlinear Optimization With Path-Augmented Constraints), a trust-region method for stochastic, nonlinear, constrained, derivative-free optimization.

We present a new extension of SNOWPAC for solving mixed-integer optimization problems and compare its performance against various existing optimizers on different benchmark problems. We then couple it with neural network training to optimize the hyperparameters. We create eight neural network hyperparameter optimization problems with between six and nineteen unknown parameters, and optimize their hyperparameters using SNOWPAC as well as existing methods such as HORD, HORD-ISP, Spearmint, TPE and SMAC. Finally, we compare these tools according to different criteria and show that, statistically, SNOWPAC not only finds hyperparameter sets with lower validation error but also requires less non-evaluation time.
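The trust-region idea underlying this approach can be sketched as a toy derivative-free loop over one integer and one continuous hyperparameter (a hypothetical minimal example, not the SNOWPAC API: SNOWPAC fits surrogate models inside the trust region and handles stochastic constraints, rather than sampling randomly):

```python
import random

def trust_region_search(f, x0, radius=4.0, iters=200, seed=0):
    """Toy derivative-free trust-region loop over a mixed-integer point.

    x0 = (integer hyperparameter, continuous hyperparameter).
    The trust region expands after an accepted step and shrinks after a
    rejected one. Illustrative only.
    """
    rng = random.Random(seed)
    best_x, best_f = x0, f(x0)
    for _ in range(iters):
        step = max(1, round(radius))
        cand = (best_x[0] + rng.randint(-step, step),   # integer coordinate
                best_x[1] + rng.uniform(-radius, radius))  # continuous coordinate
        fc = f(cand)
        if fc < best_f:                     # accept: move and expand the region
            best_x, best_f = cand, fc
            radius = min(radius * 1.5, 8.0)
        else:                               # reject: stay and shrink the region
            radius = max(radius * 0.7, 0.1)
    return best_x, best_f

# Toy "validation error" surface, minimized at 8 layers and a
# learning-rate exponent of -3 (both values are made up for illustration).
err = lambda x: (x[0] - 8) ** 2 + (x[1] + 3.0) ** 2
best, val = trust_region_search(err, (1, 0.0))
```

Because only function values are used, the same loop shape applies when `f` is an expensive, noisy neural network training run, which is the setting the thesis targets.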

Keywords: Optimization, Trust-region method, Hyperparameters, SNOWPAC

Language: English