
Running Research and Development Projects

From Sccswiki


Excellence Initiative: IGSSE

Distributed stochastic simulation for the hydroelastic analysis of very large floating structures

Project type IGSSE Project Team
Funded by Excellence Initiative of the German federal and state governments

Begin October 2008
End September 2011
Leader Univ.-Prof. Dr. Hans-Joachim Bungartz, Dr. rer. nat. Miriam Mehl
Staff Bernhard Gatzhammer, M.Sc., Dipl.-Inf. Marion Bendig
Contact person Dr. rer. nat. Miriam Mehl
Co-operation partner Prof. Dr. Ernst Rank, Dr. Ralf-Peter Mundani, PD Dr. Alexander Düster, Prof. PhD Chien Ming Wang (Singapore), SOFiSTiK AG (Oberschleißheim)

Brief description

Very large floating structures (VLFS) are increasingly employed by a number of countries to create land space from the ocean. These “swimming islands” are of pontoon type and benefit from high stability, low manufacturing costs, and easy maintenance. Owing to their much larger dimensions in length than in depth, VLFS are relatively flexible and thus have to be robustly designed against wave-induced deformations and stresses. As such a reliability analysis involves many uncertainties, efficient methods have to be developed that allow for both the modelling of uncertain behaviour and the handling of the computational complexity. The main objective of this project is the development and implementation of a prototype for the hydroelastic analysis of VLFS. Stochastic finite elements are the method of choice for the planned reliability analysis over huge sets of different structural properties, while sophisticated techniques of modern grid computing will tackle the computational cost of such complex parameter studies.
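The idea of a reliability analysis under uncertain structural properties can be illustrated with a plain Monte Carlo estimate of a failure probability. This is a toy sketch, not the project's model: the midspan deflection formula for a simply supported beam is standard, but the lognormal stiffness distribution, load, and failure threshold are invented for illustration, and the project itself uses stochastic finite elements rather than a closed-form response.

```python
import math
import random

def max_deflection(E, I=1.0, q=1.0, L=10.0):
    # Midspan deflection of a simply supported beam under uniform load:
    # w_max = 5 q L^4 / (384 E I)
    return 5.0 * q * L**4 / (384.0 * E * I)

def failure_probability(n_samples=20000, threshold=0.15, seed=42):
    """Monte Carlo estimate of P(deflection > threshold) under an
    uncertain (lognormally distributed) stiffness E."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        E = rng.lognormvariate(math.log(1000.0), 0.2)  # uncertain stiffness
        if max_deflection(E) > threshold:
            failures += 1
    return failures / n_samples
```

Plain Monte Carlo converges slowly, which is exactly why the project pairs stochastic discretisations with grid computing for large parameter studies.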

Excellence Initiative: IAS

The Institute for Advanced Study (IAS) of Technische Universität München is the centerpiece of TUM’s institutional strategy to promote top-level research in the so-called Excellence Initiative by the German federal and state governments.

HPC - Tackling the Multi-Challenge

Project type IAS focus group
Funded by Excellence Initiative of the German federal and state governments
Begin 2010
End 2013
Leader Univ.-Prof. Dr. Hans-Joachim Bungartz
Staff Dr. rer. nat. habil. Miriam Mehl, Dr. rer. nat. Dirk Pflüger, Christoph Kowitz, M.Sc., Valeriy Khakhutskyy, M.Sc., Dipl.-Math. Benjamin Uekermann, Arash Bakhtiari, M.Sc. (hons)
Contact person Dr. rer. nat. habil. Miriam Mehl
Co-operation partner Prof. George Biros (Georgia, USA), Markus Hegland (Canberra, Australia)

Brief description

High-performance computing (HPC) is a thriving cross-sectional research field of utmost relevance in science and engineering. Indeed, scientific progress depends more and more on insight gained by computational research. With the increased technological potential, however, the requirements are growing, too – leading to several computational challenges, all related to some “multi-X” notion: multi-disciplinary, multi-physics, multi-scale, multi-dimensional, multi-level, multi-core. This focus group will primarily address the three topics multi-physics (mp), multi-dimensional (md), and multi-core (mc).
The interplay of these three subtopics is straightforward: both mp and md are among the usual suspects that need and, thus, drive HPC technology and mc; mp frequently appears in the context of optimisation or parameter identification and estimation – thriving topics of current md research; and present as well as future mc technology is inspired by algorithmic patterns such as those provided by mp and md. Hence, it is not only reasonable to address mp, md, and mc in an integral way – it is essential, and this IAS focus group offers the unique chance of doing so at a very high international level.

Bayern Excellent: MAC@IGSSE

The Munich Centre of Advanced Computing (MAC) is a research consortium established at TUM to bundle research activities related to computational science and engineering (CSE) as well as high-performance computing (HPC) – across disciplines, across departments, and across institutions. In MAC, seven of TUM's departments and other Munich research institutions (Ludwig-Maximilians-Universität, Max Planck institutes, the Leibniz Supercomputing Centre of the Bavarian Academy of Sciences and Humanities) as well as TUM's international partners such as KAUST, the King Abdullah University of Science and Technology, join forces to ensure the sustainable usage of current and future HPC architectures for the most relevant and most challenging CSE applications.


Efficient Parallel Strategies in Computational Modelling of Materials

Project type Förderprogramm "Bayern exzellent": Munich Centre of Advanced Computing (MAC)
Funded by Bavarian state government, Technische Universität München
Begin 2008
End 2012
Leader Prof. Dr. Dr. h.c. Notker Rösch
subproject: Univ.-Prof. Dr. Hans-Joachim Bungartz
Staff Martin Roderus
Contact person Martin Roderus
Co-operation partner Prof. Dr. Dr. h.c. Notker Rösch, Prof. Dr. Arndt Bode, Prof. Dr. Michael Gerndt, Prof. Dr. Heinz-Gerd Hegering

Brief description

The project will develop a new paradigm for the parallelisation of density functional theory (DFT) methods for electronic structure calculations and implement this new strategy. Advanced embedding techniques will account for environment effects (e.g. solvent, support) on a system, which requires a strong modularisation of the DFT approach, facilitating task-specific parallelisation, memory management, and low-level optimisation. Efficiency will be further increased by dynamical adaptation to varying resource usage at the module level and by pooling of applications.


A High-End Toolbox for Simulation and Optimisation of Multi-Physics PDE Models

Project type Förderprogramm "Bayern exzellent": Munich Centre of Advanced Computing (MAC)
Funded by Bavarian state government, Technische Universität München
Begin 2008
End 2012
Leader Prof. Dr. Michael Ulbrich
subproject: Univ.-Prof. Dr. Hans-Joachim Bungartz, Dr. rer. nat. Miriam Mehl
Staff Janos Benk, M.Sc.
Contact person Dr. rer. nat. Miriam Mehl
Co-operation partner Prof. Dr. Michael Ulbrich, Prof. Dr. Martin Brokate, Prof. Dr. Ernst Rank, Prof. Dr. Ronald Hoppe (Augsburg)

Brief description

The project aims at bundling forces to overcome conceptual drawbacks of current simulation software and to take a big step towards a future generation of simulation and optimisation tools for complex systems. The goal is to develop a rapid prototyping HPC software platform for both simulation and optimisation. The design will be hierarchical, with high-performance components on all levels, ranging from problem formulation via discretisation to numerics and parallelisation. The work will be interwoven with theoretical investigations of innovative numerical algorithms.


A Scalable Infrastructure for Computational Steering

Project type Förderprogramm "Bayern exzellent": Munich Centre of Advanced Computing (MAC)
Funded by Bavarian state government, Technische Universität München
Begin 2008
End 2012
Leader Prof. Dr. Rüdiger Westermann
subproject: Univ.-Prof. Dr. Hans-Joachim Bungartz
Staff Daniel Butnaru, M.Sc.
Contact person Univ.-Prof. Dr. Hans-Joachim Bungartz
Co-operation partner Prof. Dr. Rüdiger Westermann, Prof. Bernd Brügge, Ph.D., Prof. Dr. Ernst Rank, Prof. Dr.-Ing. Wolfgang Wall

Brief description

The goal of this project is to design and prototype a scalable infrastructure for computational steering. It will be targeted at the computational engineering domain, which makes it possible to leverage existing cooperative developments as a starting point and to use real-world data that is representative in size, modality, and structure of what is available in other scientific areas such as geology or biology. The infrastructure implements a processing pipeline ranging from scalable data processing workflows to interactive visualisation and human-computer interaction in virtual and augmented reality environments.

Excellence Initiative: MAC@KAUST

Simulation of CO2 Sequestration

Project type Strategic Partnership with the King Abdullah University of Science and Technology (KAUST)
Funded by KAUST
Begin 2009
End 2013
Leader Univ.-Prof. Dr. Hans-Joachim Bungartz
Staff see Munich Centre of Advanced Computing
Contact person Tobias Weinzierl
Co-operation partner Prof. Dr. Dr.-Ing. habil. Arndt Bode (Computer Architecture), Prof. Dr. Martin Brokate (Numerical Mathematics and Control Theory), Prof. Dr. Drs. h.c. Karl-Heinz Hoffmann (Numerical Mathematics and Control Theory), Prof. Dr.-Ing. Michael Manhart (Hydromechanics), Prof. Dr. Michael Ulbrich (Mathematical Optimisation)

Brief description

The goal of this project is to design and investigate novel approaches to the modelling and simulation of CO2 sequestration processes, in particular in the context of enhanced oil recovery. The project will involve both fine-grain simulations – with all related aspects from multi-phase schemes via numerical algorithmics to high-performance computing issues – and homogenisation approaches to efficiently capture the fine-grain effects on the macro-scale. For that, groups with expertise in flow physics, mathematical modelling, numerical analysis, numerical algorithmics, optimisation and inverse problems, and high-performance computing and HPC systems join forces. Topics addressed will cover multi-scale modelling and homogenisation, fully resolved pore-scale simulation, constrained optimisation of the sequestration process, enhanced numerics and parallelisation, and HPC implementation.


Virtual Arabia

Project type Strategic Partnership with the King Abdullah University of Science and Technology (KAUST)
Funded by KAUST
Begin 2009
End 2013
Leader Tobias Weinzierl
Staff see Munich Centre of Advanced Computing
Contact person Univ.-Prof. Dr. Hans-Joachim Bungartz
Co-operation partner Prof. Dr. Dr.-Ing. habil. Arndt Bode (Computer Architecture), Prof. Gudrun Klinker, Ph.D. (Augmented Reality), Prof. Dr. Ernst Rank (Computation in Engineering), Prof. Dr. Rüdiger Westermann (Computer Graphics & Visualization)

Brief description

The goal of this project is to develop a virtual environment for the interactive visual exploration of Saudi Arabia. In contrast to virtual globe viewers like Google Earth, this environment will allow the user to look both above and underneath the earth's surface in an integrated way. It will thus provide interactive means for the visual exploration of 3D geological structures and dynamic seismic processes as well as atmospheric processes and effects and existing or planned infrastructure. The specific techniques required to support such functionality will be integrated into a generic infrastructure for visual computing. The project will cooperate with the KAUST 3D Modelling and Visualisation Centre and the KAUST Computational Earth Sciences Centre.

G8-Initiative: Nuclear Fusion Simulations at Exascale (Nu-FuSe)

Project type G8 Research Councils Initiative on Multilateral Research Funding
Funded by G8 group of leading industrial nations
Begin July 2011
End April 2015
Leader Univ.-Prof. Dr. Hans-Joachim Bungartz
Staff Dr. rer. nat. Tobias Neckel
Contact person Dr. rer. nat. Tobias Neckel
Co-operation partner Prof. Frank Jenko (Max-Planck-Institut für Plasmaphysik, IPP)

Brief description

The G8 project Nu-FuSE is an international project seeking to significantly improve computational modelling capabilities to the level required by the new generation of fusion reactors. The focus is on three specific scientific areas: fusion plasma; the materials from which fusion reactors are built; and the physics of the plasma edge. This will require computing at the “exascale” level across a range of simulation codes, working together towards fully integrated fusion tokamak modelling.

Exploiting upcoming exascale systems effectively for fusion modelling creates significant challenges around scaling, resiliency, result validation, and programmability. This project will focus on meeting these challenges by improving the performance and scaling of community modelling codes to enable simulations orders of magnitude larger than those currently undertaken.

HEPP: International Helmholtz Graduate School for Plasma Physics

Project type Helmholtz Graduate School Scholarship
Funded by Helmholtz Gemeinschaft
Begin November 2011
End October 2014
Leader Univ.-Prof. Dr. Hans-Joachim Bungartz
Staff Dr. rer. nat. Tobias Neckel
Contact person Dr. rer. nat. Tobias Neckel
Co-operation partner Prof. Frank Jenko (Max-Planck-Institut für Plasmaphysik, IPP)

Brief description

The fundamental equations used to understand and predict various phenomena in plasma physics share a very important feature: they are all nonlinear. This implies that analytical techniques – although also very important – are limited in practice, calling for a numerical approach. Fortunately, the capabilities of modern supercomputers have reached a level which makes it possible to tackle some outstanding open issues in theoretical plasma physics, including, e.g., turbulence, nonlinear magnetohydrodynamics, and plasma-wall interaction.

Given the multiscale nature of most problems of interest, advanced algorithms and efficient implementations on massively parallel platforms are usually required in order to tackle them. In this context, a close collaboration of theoretical plasma physicists with applied mathematicians and computer scientists can be of great benefit. Thus, state-of-the-art numerical techniques, hardware-aware implementation strategies, and scalable parallelization approaches are explored in terms of their potential to minimize the overall computational requirements and to maximize the reliability and robustness of the simulations.


DFG - German Research Foundation

Priority Program 1648 SPPEXA - Software for Exascale Computing

Coordination Project

Funded by DFG
Begin 2012
End 2016
Leader Univ.-Prof. Dr. Hans-Joachim Bungartz
Staff Benjamin Peherstorfer, M.Sc., Dr. rer. nat. Tobias Neckel
Contact person Univ.-Prof. Dr. Hans-Joachim Bungartz

Brief description

The Priority Programme (SPP) SPPEXA differs from other SPPs with respect to its genesis, its volume, its funding via DFG's Strategy Fund, the range of disciplines involved, and its clear strategic orientation towards a set of time-critical objectives. Therefore, despite its distributed structure, SPPEXA also resembles a Collaborative Research Centre to a large extent. Its successful implementation and evolution will require both more, and more intense, structural measures. The Coordination Project comprises all intended SPPEXA-wide activities, including steering and coordination, internal and international collaboration and networking, and educational activities.

Reference: Priority Program 1648 SPPEXA - Software for Exascale Computing

ExaFSA - Exascale Simulation of Fluid-Structure-Acoustics Interaction

Funded by DFG
Begin 2012
End 2016
Leader Univ.-Prof. Dr. Miriam Mehl
Staff Dipl.-Math. Benjamin Uekermann
Contact person Univ.-Prof. Dr. Miriam Mehl

Brief description

In scientific computing, an increasing need for ever more detailed insights and optimization leads to improved models, often including several physical effects described by different types of equations. The complexity of the corresponding solver algorithms and implementations typically leads to coupled simulations that reuse existing software codes for different physical phenomena (multiphysics simulations) or for different parts of the simulation pipeline such as grid handling, matrix assembly, system solvers, and visualization. Accuracy requirements can only be met with a high spatial and temporal resolution, making exascale computing a necessary technology to address runtime constraints for realistic scenarios. However, running a multicomponent simulation efficiently on massively parallel architectures is far more challenging than the parallelization of a single simulation code. Open questions range from suitable load balancing strategies and bottleneck-avoiding communication via interactive visualization for online analysis of results and the synchronization of several components to parallel numerical coupling schemes. We intend to tackle these challenges for fluid-structure-acoustics interactions, which are extremely costly due to the large range of scales involved. Specifically, this requires innovative surface and volume coupling numerics between the different solvers as well as sophisticated dynamic load balancing and in-situ coupling and visualization methods.

Reference: Priority Program 1648 SPPEXA - Software for Exascale Computing
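A baseline form of the partitioned coupling described above can be sketched as a fixed-point iteration between two single-physics solvers that exchange interface data, with constant under-relaxation. This is a generic illustration under assumed scalar solvers, not the project's coupling code; `fluid_solve` and `structure_solve` are hypothetical stand-ins for full solvers, and production schemes typically replace the constant relaxation with accelerated (e.g. quasi-Newton) updates.

```python
def staggered_coupling(fluid_solve, structure_solve, d0, omega=0.5,
                       tol=1e-8, max_iters=200):
    """Fixed-point iteration coupling two solvers via interface data.

    fluid_solve: maps an interface displacement to interface forces.
    structure_solve: maps interface forces back to a displacement.
    Returns the converged displacement and the iteration count."""
    d = d0
    for k in range(max_iters):
        f = fluid_solve(d)              # forces for current displacement
        d_new = structure_solve(f)      # structural response
        residual = abs(d_new - d)
        d = d + omega * (d_new - d)     # under-relaxed update
        if residual < tol:
            return d, k + 1
    raise RuntimeError("coupling iteration did not converge")

# Toy linear "solvers": the coupled fixed point is d = 0.5 * (-0.8 d + 1).
d, iters = staggered_coupling(lambda d: -0.8 * d + 1.0,
                              lambda f: 0.5 * f, 0.0)
```

The under-relaxation factor trades speed for robustness: too large an `omega` can make strongly coupled problems diverge, which is one reason numerical coupling schemes are a research topic in their own right.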

EXAHD - An Exa-Scalable Two-Level Sparse Grid Approach for Higher-Dimensional Problems in Plasma Physics and Beyond

Funded by DFG
Begin 2012
End 2016
Leader Univ.-Prof. Dr. Hans-Joachim Bungartz
Staff
Contact person Univ.-Prof. Dr. Hans-Joachim Bungartz

Brief description

Higher-dimensional problems (i.e., beyond four dimensions) appear in medicine, finance, and plasma physics, posing a challenge for tomorrow's HPC. As an example application, we consider turbulence simulations for plasma fusion with one of the leading codes, GENE, which promises to advance science on the way to carbon-free energy production. While higher-dimensional applications involve a huge number of degrees of freedom, such that exascale computing becomes necessary, mere domain decomposition approaches for their parallelization are infeasible, since the communication explodes with increasing dimensionality. Thus, to ensure high scalability beyond domain decomposition, a second major level of parallelism has to be provided. To this end, we propose to employ the sparse grid combination scheme, a model reduction approach for higher-dimensional problems. It computes the desired solution via a combination of smaller, anisotropic, and independent simulations, and thus provides this extra level of parallelization. In its randomized asynchronous and iterative version, it will break the communication bottleneck in exascale computing, achieving full scalability. Our two-level methodology enables novel approaches to scalability (ultra-scalable due to numerically decoupled subtasks), resilience (fault and outlier detection and even compensation without the need for recomputation), and load balancing (high-level compensation for insufficiencies on the application level).

Reference: Priority Program 1648 SPPEXA - Software for Exascale Computing
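The combination scheme described above can be made concrete in two dimensions: solve the problem on several small anisotropic full grids and combine the results with coefficients +1 and -1. A minimal sketch, using trapezoidal quadrature on the unit square as a stand-in for a real PDE solve (the actual project combines independent GENE-like simulations, not quadratures):

```python
def grid_quadrature(f, lx, ly):
    """Trapezoidal quadrature of f on [0,1]^2 using an anisotropic
    full grid with 2**lx x 2**ly cells."""
    nx, ny = 2**lx, 2**ly
    hx, hy = 1.0 / nx, 1.0 / ny
    total = 0.0
    for i in range(nx + 1):
        wi = 0.5 if i in (0, nx) else 1.0
        for j in range(ny + 1):
            wj = 0.5 if j in (0, ny) else 1.0
            total += wi * wj * f(i * hx, j * hy)
    return total * hx * hy

def combination_technique(f, n):
    """2D sparse grid combination technique (n >= 2): add the grids on
    the level diagonal l1 + l2 = n, subtract those with l1 + l2 = n - 1.
    Each contributing grid is small and fully independent of the others,
    which is the extra level of parallelism the text refers to."""
    result = 0.0
    for l1 in range(1, n):          # l1 + l2 = n, with l1, l2 >= 1
        result += grid_quadrature(f, l1, n - l1)
    for l1 in range(1, n - 1):      # l1 + l2 = n - 1
        result -= grid_quadrature(f, l1, n - 1 - l1)
    return result
```

Each anisotropic grid has far fewer points than the full isotropic grid of level n, and the subproblems communicate only through the final combination step.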

SFB-TRR 89: Invasive Computing

Funded by DFG
Begin Mid 2010
End 1st phase in mid 2014
Leader Univ.-Prof. Dr. Hans-Joachim Bungartz
Staff Dipl.-Inf. Martin Schreiber, Dr. rer. nat. Tobias Neckel, Dr. rer. nat. Tobias Weinzierl, Univ.-Prof. Dr. Michael Bader
Contact person Univ.-Prof. Dr. Hans-Joachim Bungartz

Brief description

In the proposed CRC/Transregio, we intend to investigate a completely novel paradigm for designing and programming future parallel computing systems, called invasive computing. The main idea and novelty of invasive computing is to introduce resource-aware programming support in the sense that a given program gets the ability to explore and dynamically spread its computations to neighbouring processors, similar to a phase of invasion, and then to execute portions of code with a high degree of parallelism in parallel, based on the available (invasible) region of a given multi-processor architecture. Afterwards, once the program terminates or if the degree of parallelism should become lower again, the program may enter a retreat phase, deallocate resources, and resume execution, for example, sequentially on a single processor. In order to support this idea of self-adaptive and resource-aware programming, not only are new programming concepts, languages, compilers, and operating systems necessary, but also revolutionary architectural changes in the design of MPSoCs (Multi-Processor Systems-on-a-Chip) must be provided so as to efficiently support invasion, infection, and retreat operations, involving concepts for dynamic processor, interconnect, and memory reconfiguration.

Reference: Transregional Collaborative Research Centre 89 - Invasive Computing
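The invade/infect/retreat life cycle described above can be sketched as a toy resource protocol. Everything here is hypothetical for illustration (the `ResourceManager` class and `invasive_sum` function are invented, and the "parallel" section runs sequentially); the actual project targets MPSoC hardware with language, compiler, and OS support:

```python
class ResourceManager:
    """Toy pool of processing elements shared by invasive programs."""
    def __init__(self, total):
        self.free = total

    def invade(self, requested):
        # Grant as many processing elements as are currently available.
        granted = min(requested, self.free)
        self.free -= granted
        return granted

    def retreat(self, count):
        self.free += count

def invasive_sum(data, manager, requested=4):
    claim = manager.invade(requested)          # invasion phase
    chunk = (len(data) + claim - 1) // claim
    # infection phase: run the work on the claimed resources
    # (modelled sequentially here, one partial sum per claimed element)
    partials = [sum(data[i * chunk:(i + 1) * chunk]) for i in range(claim)]
    manager.retreat(claim)                     # retreat phase
    return sum(partials)
```

The key property the sketch mimics is resource awareness: the program adapts its degree of parallelism to what `invade` actually granted, rather than assuming a fixed processor count.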

Numerical Aspects of the Simulation of Quantum Many-body Systems

Project type QCCC project
Funded by Quantum Computing, Control and Communication (QCCC)
Begin January 2008
End December 2012
Leader Univ.-Prof. Dr. Thomas Huckle
Staff Dipl.-Math. Konrad Waldherr
Contact person Univ.-Prof. Dr. Thomas Huckle
Co-operation partner Dr. Thomas Schulte-Herbrueggen (Chemistry, TUM)

Brief description

In recent years, growing attention has been devoted to many-body quantum systems from the point of view of quantum information. Indeed, after the initial investigation of simple systems such as single or two qubits, the need to understand the characteristics of a realistic quantum information device necessarily leads to the study of many-body quantum systems. These studies are also driven by the very fast development of experiments, which in recent years have reached the goal of coherent control of a few qubits (ion traps, charge qubits, etc.) with a roadmap for further scaling and improvement of coherent control and manipulation techniques. Also, new paradigms for performing quantum information tasks, such as quantum information transfer or quantum cloning, without direct control of the whole quantum system but using our knowledge of it, have increased the need for tools to understand in detail the behaviour of many-body quantum systems as we find them in nature. These new goals of the quantum information community lead to an unavoidable exchange of knowledge with other communities that already have the know-how and the insight to address such problems, for example the condensed matter, computational physics, or quantum chaos communities. Applying known techniques and developing new ones from a quantum information perspective have already produced fast and unexpected developments in these fields. The comprehension of many-body quantum systems, ranging from a few qubits to the thermodynamic limit, is thus needed and welcome not only to develop useful quantum information devices, but also to lead us to a better understanding of the quantum world.

Reference: Computations in Quantum Tensor Networks

VW-Stiftung: ASCETE (Advanced Simulation of Coupled Tsunami-Earthquake Events)

Project type Call "Extreme Events: Modelling, Analysis and Prediction"
Funded by Volkswagen Stiftung
Begin February 2012
End January 2015
Leader Univ.-Prof. Dr. Jörn Behrens (KlimaCampus, Univ. Hamburg)
Staff Univ.-Prof. Dr. Michael Bader, Alexander Breuer, Kaveh Rahnema
Contact person Univ.-Prof. Dr. Michael Bader
Co-operation partner Univ.-Prof. Dr. Jörn Behrens (KlimaCampus, Univ. Hamburg), Univ.-Prof. Dr. Heiner Igel (Geophysics, Univ. München), Dr. Martin Käser (Geophysics, Univ. München), Dr. Luis Angel Dalguer (ETH Zürich); see official ASCETE webpage

Brief description

Earthquakes and tsunamis represent the most dangerous natural catastrophes and can cause large numbers of fatalities and severe economic loss in a single and unexpected extreme event, as shown in Sumatra in 2004, Samoa in 2009, Haiti in 2010, and Japan in 2011. Both phenomena are consequences of the complex system of interactions of tectonic stress, fracture mechanics, rock friction, rupture dynamics, fault geometry, ocean bathymetry, and coastline geometry. The ASCETE project forms an interdisciplinary research consortium that – for the first time – will couple the most advanced simulation technologies for earthquake rupture dynamics and tsunami propagation to understand the fundamental conditions of tsunami generation. To our knowledge, tsunami models that consider the fully dynamic rupture process coupled to hydrodynamic models have not been investigated yet. Therefore, the proposed project is original and unique in its character, and has the potential to gain insight into the underlying physics of earthquakes capable of generating devastating tsunamis.


BMBF: HPC Software for Scalable, Parallel Hardware

The following two BMBF projects were established within the BMBF call "HPC Software for Scalable, Parallel Hardware" in 2008.


Highly Scalable Eigenvalue Solvers for Petaflop Applications (ELPA)

Website of the project

Project type BMBF-Projekt; "HPC Software für skalierbare Parallelrechner"
Funded by BMBF
Begin 2008
End 2012
Leader Rechenzentrum Garching, Dr. Hermann Lederer
Staff Thomas Auckenthaler, Univ.-Prof. Dr. Michael Bader, Univ.-Prof. Dr. Hans-Joachim Bungartz, Univ.-Prof. Dr. Thomas Huckle
Contact person Thomas Auckenthaler
Co-operation partners Rechenzentrum Garching (Dr. H. Lederer),
Bergische Universität Wuppertal, Lehrstuhl für Angewandte Informatik (Prof. A. Frommer, Prof. B. Lang),
Fritz-Haber-Institut, Berlin, Abt. Theorie (Prof. M. Scheffler, Dr. V. Blum),
Max-Planck-Institut für Mathematik in den Naturwissenschaften, Leipzig, Abt. Komplexe Strukturen in Biologie und Kognition (Prof. J. Jost),
IBM Deutschland GmbH

Brief description

The ELPA project will develop highly scalable solvers for eigenvalue problems. The primary goal is the design and implementation of a highly scalable direct eigensolver for large, dense, symmetric matrices. Integration of the resulting code into a common library is planned. In addition, the use of iterative solvers for specific eigenproblems will also be investigated.
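For context, the mathematical problem ELPA targets can be illustrated with the classical Jacobi rotation method, which repeatedly annihilates the largest off-diagonal entry of a symmetric matrix. This sequential toy is only a conceptual stand-in: it handles small matrices, whereas ELPA aims at scalable direct solvers for matrices many orders of magnitude larger.

```python
import math

def jacobi_eigenvalues(A, tol=1e-12, max_rotations=100):
    """Eigenvalues of a small dense symmetric matrix (list of lists)
    via classical Jacobi rotations; returned sorted ascending."""
    n = len(A)
    a = [row[:] for row in A]
    for _ in range(max_rotations):
        # locate the largest off-diagonal element
        p, q, biggest = 0, 1, 0.0
        for i in range(n):
            for j in range(i + 1, n):
                if abs(a[i][j]) > biggest:
                    p, q, biggest = i, j, abs(a[i][j])
        if biggest < tol:
            break
        # rotation angle that annihilates a[p][q]
        theta = 0.5 * math.atan2(2.0 * a[p][q], a[q][q] - a[p][p])
        c, s = math.cos(theta), math.sin(theta)
        for k in range(n):          # apply rotation to columns p, q
            akp, akq = a[k][p], a[k][q]
            a[k][p] = c * akp - s * akq
            a[k][q] = s * akp + c * akq
        for k in range(n):          # apply rotation to rows p, q
            apk, aqk = a[p][k], a[q][k]
            a[p][k] = c * apk - s * aqk
            a[q][k] = s * apk + c * aqk
        a[p][q] = a[q][p] = 0.0
    return sorted(a[i][i] for i in range(n))
```

Scalable direct solvers of the kind ELPA develops instead reduce the matrix to tridiagonal form and solve that in parallel; the similarity-transform idea, however, is the same.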


Innovative HPC-Methoden und Einsatz für hochskalierbare Molekulare Simulation (IMEMO)

Project type BMBF-Projekt; "HPC Software für skalierbare Parallelrechner"
Funded by BMBF
Begin 2008
End 2012
Leader Prof. Dr.-Ing. Michael Resch, HLRS, Universität Stuttgart
Staff Martin Buchholz, Ekaterina Elts, M.Sc., Wolfgang Eckhardt, Univ.-Prof. Dr. Michael Bader, Univ.-Prof. Dr. Hans-Joachim Bungartz
Contact person Martin Buchholz
Co-operation partners Institut für Techno- und Wirtschaftsmathematik (ITWM) an der Fraunhofer Gesellschaft (Dr. Franz-Josef Pfreundt),
Höchstleistungsrechenzentrum (HLRS) der Universität Stuttgart (Prof. Dr.-Ing. Michael Resch),
Lehrstuhl für Thermodynamik (LTD) an der Universität Kaiserslautern (Prof. Dr.-Ing. Hans Hasse),
Lehrstuhl für Thermodynamik und Energietechnik (ThEt) an der Universität Paderborn (Prof. Dr.-Ing. Jadran Vrabec)

Brief description

Within the IMEMO project, our SCCS group will develop efficient algorithms for the parallelisation of large-scale molecular simulations. One of the main questions is dynamic load balancing in settings where strong imbalances occur, such as during condensation processes, where the number of molecules in different parts of the computational domain varies over several orders of magnitude. A further important focus is the development of hierarchical parallel algorithms on highly parallel clusters of manycore processors.
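The load balancing question can be illustrated with a greedy cost-based partition of a 1D row of cells among ranks, where the per-cell cost stands in for the local molecule count. This is a simplified, hypothetical sketch (real molecular dynamics codes partition in 3D and rebalance dynamically as molecules move):

```python
def balanced_partition(cell_costs, n_ranks):
    """Split a 1D row of cells into n_ranks contiguous slices with
    roughly equal total cost, by cutting whenever the running cost
    passes the next rank's share of the total."""
    total = sum(cell_costs)
    boundaries = [0]
    acc = 0.0
    for i, cost in enumerate(cell_costs):
        acc += cost
        while (len(boundaries) < n_ranks
               and acc >= total * len(boundaries) / n_ranks):
            boundaries.append(i + 1)
    boundaries.append(len(cell_costs))
    return [cell_costs[boundaries[k]:boundaries[k + 1]]
            for k in range(n_ranks)]
```

With a uniform split by cell count, a dense droplet region would land entirely on one rank; weighting by cost moves the cut so that both ranks receive comparable work.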

BMBF: Program Math

Non-Linear Characterization and Analysis of FEM Simulation Results for Motor-Car Components and Crash Tests (SIMDATA-NL)

Project type BMBF support program: Mathematics for innovations in the Industrial and Service Sectors
Funded by BMBF
Begin July 2010
End June 2013
Leader Univ.-Prof. Dr. Hans-Joachim Bungartz
Staff Benjamin Peherstorfer, M.Sc., Dr. rer. nat. Dirk Pflüger
Contact person Dr. rer. nat. Dirk Pflüger
Co-operation partner Prof. Dr. Michael Griebel (INS, Bonn)

Prof. Dr. Claudia Czado (Mathematical Statistics, TU München), Dr. Jochen Garcke (Institute of Mathematics, TU Berlin), Clemens-August Thole, Prof. Dr. Ulrich Trottenberg (SCAI, St. Augustin), AUDI AG, PDTec AG, Volkswagen AG

Brief description

The project aims at extracting the (few) effective dimensions in high-dimensional simulation data in the context of automotive design. Linear methods such as principal component analysis alone are not sufficient for many of these applications due to significant non-linear effects. Therefore, they will be complemented by methods that are able to resolve nonlinear relationships, especially by means of sparse grid discretizations.
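The limitation of linear methods can be shown with a tiny example: for 2D points sampled from a parabola (an intrinsically one-parameter set), the sample covariance still has two clearly nonzero eigenvalues, so PCA cannot compress the data to one linear dimension without loss. A self-contained sketch, unrelated to the project's actual crash-test data:

```python
import math

def covariance_eigenvalues(points):
    """Eigenvalues of the 2x2 sample covariance of 2D points; in PCA
    these are the variances along the two principal axes."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    disc = math.sqrt(max(tr * tr / 4.0 - det, 0.0))
    return tr / 2.0 + disc, tr / 2.0 - disc

# One curved and one straight data set, both described by one parameter.
parabola = [(x / 50.0, (x / 50.0) ** 2) for x in range(-50, 51)]
line = [(x / 50.0, 2.0 * x / 50.0) for x in range(-50, 51)]
```

For the line, the second eigenvalue is (numerically) zero, so PCA recovers the single effective dimension; for the parabola it stays well above zero, which is the kind of nonlinear structure the project addresses with sparse-grid-based methods.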

EU: Tempus CANDI

Project type EU Tempus Project
Funded by EU
Begin January 2010
End December 2013
Leader University Vienna
Staff Univ.-Prof. Dr. Hans-Joachim Bungartz, Univ.-Prof. Dr. Ernst W. Mayr, Univ.-Prof. Dr. Helmut Seidl, Dr. rer. nat. Tobias Weinzierl
Contact person Dr. rer. nat. Tobias Weinzierl
Co-operation partner see official webpage

Brief description

The CANDI project will develop both the infrastructure for e-Learning / Retraining, and the skills necessary to transfer existing courses and curricula to an e-Learning environment. The project is set up in a way to address multiple problems simultaneously:

  • Most obviously, CANDI will help to educate large numbers of students. Additional costs for the infrastructure will be modest, since no new buildings are necessary, existing teaching personnel can be employed, and only modest investment in computer infrastructure is necessary.
  • CANDI will help to narrow the gap between the education level in central universities and the provinces.
  • CANDI will train the local university staff in systematic and effective use of e-Learning, presentation technology, and related didactic skills. Existing e-Learning approaches we saw in Central Asia mostly involve electronic versions of course notes on the internet.
  • Importantly, CANDI will use e-Learning not only to teach students, but also to teach university staff, in particular at institutions in provincial cities. In fact, e-Learning will also become the main medium to teach e-Learning skills.
  • CANDI will support the retraining of industry staff. On the other hand, CANDI will also open opportunities for industry to deliver applied courses and lectures to a university audience.
  • CANDI will employ cheap open source solutions for e-Learning.

In addition to these direct effects, CANDI will also have important positive indirect effects on universities and industries in Uzbekistan and Kazakhstan:
  • CANDI will have a pilot phase where existing courses from European partners will be transferred into the e-Learning framework. Since these courses will reflect the state of the art in their respective areas (mostly Computer Science, Chemistry, Computational Science, Soft Skills), they will by their nature improve the quality of the curricula inside and outside of e-Learning.
  • The establishment of standardized e-learning courses facilitates the convergence of different academic systems, and thus the possibility of a credit transfer system.
  • CANDI will improve the English and soft skill knowledge of all participants, thereby improving the ability of Central Asian staff to achieve sustainability by international grants.
  • By building the competence for e-Learning, CANDI will also contribute to the knowledge base in software engineering and programming in Uzbekistan and Kazakhstan.

EU: Tempus Belgrad

Project type EU Tempus Project
Funded by EU
Begin 15 January 2009
End 14 January 2012
Leader Faculty of Mechanical Engineering, University of Belgrade
Staff Prof. Dr.-Ing. Martin Gabi (Universität Karlsruhe), Prof. Dr. rer. nat. Ernst Rank (TUM), Univ.-Prof. Dr. Hans-Joachim Bungartz (TUM), Dr. Mihailo Ristic (Imperial College London), Prof. Dr. Javier Alvarez del Castillo (Universitat Politècnica de Catalunya), The German University in Cairo - GUC, Prof. Dr. Milos Nedeljkovic (University of Belgrade), Prof. Dr. Milan Matijevic (University of Kragujevac), Prof. Dr. Dragan Lazic (University of Belgrade), Prof. Dr. Zarko Cojbasic (University of Nis)
Contact person Prof. Milos Nedeljkovic
Co-operation partner ASIIN e.V. (Düsseldorf), Andrej Vrbancic (Robotina doo, Slovenija), Prof. Dr. Radivoje Mitrovic (Ministry of Education, Serbia), National Tempus Office Serbia, Dr. Zaljko Despotovic (Institute "Mihajlo Pupin", Serbia), Rectorate of University of Belgrade, Biserka Ilic (Informatika doo, Serbia), Dusan Babic (IvDam Process Control doo, Serbia)


ENB:

Bavarian Graduate School of Computational Engineering (BGCE)

Website of the BGCE

Project type Elite Study Program
Funded by Elite Network of Bavaria
Begin April 2005
End April 2015
Leader Univ.-Prof. Dr. Hans-Joachim Bungartz
Staff Dr. rer. nat. Tobias Neckel, Dipl.-Inf. Marion Bendig
Contact person Dr. rer. nat. Tobias Neckel
Co-operation partner International Master's Program Computational Science and Engineering (TUM)

International Master's Program Computational Mechanics (TUM)
International Master's Program Computational Engineering (U Erlangen)

Brief description

The Bavarian Graduate School of Computational Engineering is an association of three Master's programs: Computational Engineering (CE) at the University of Erlangen-Nürnberg, and Computational Mechanics (COME) and Computational Science and Engineering (CSE), both at TUM. Funded by the Elitenetzwerk Bayern, the Bavarian Graduate School offers an Honours program for gifted and highly motivated students. The Honours program extends the regular Master's programs with several additional academic offerings:

  • additional courses in the area of computational engineering, in particular block courses and summer academies
  • courses and seminars on "soft skills", such as communication, management, and leadership
  • an additional semester project closely connected to current research

Students who complete the regular program with an above-average grade and who also successfully finish the Honours program earn the academic degree "Master of Science with Honours".

KONWIHR (Bavarian Competence Network for Technical and Scientific High Performance Computing):

Optimization of Dense and Sparse Matrix Kernels for SeisSol on SuperMUC

Project type KONWIHR III project
Funded by Bayer. Staatsministerium für Wissenschaft, Forschung und Kunst
Begin 2013
End 2014
Leader Univ.-Prof. Dr. Michael Bader
Staff Alexander Breuer, Alexander Heinecke
Contact person Univ.-Prof. Dr. Michael Bader
Co-operation partner Geophysics group, Department of Earth and Environmental Sciences, University of Munich (Dr. Christian Pelties, Prof. Dr. Heiner Igel)

Brief description

SeisSol is one of the leading simulation codes for earthquake scenarios, in particular for the accurate simulation of dynamic rupture processes. In this project, we optimize the performance of SeisSol via a code-generation approach. In a two-step procedure, the element matrices that express SeisSol's innermost kernel operations are extracted, and optimized kernel implementations (exploiting SIMD operations, register blocking, etc.) are generated and integrated into SeisSol. Performance evaluation will be done on the SuperMUC platform.
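The two-step code-generation idea can be illustrated with a minimal sketch: for a fixed, small matrix size (known at generation time, as for SeisSol's element matrices), a generator emits a fully unrolled kernel instead of a generic triple loop, which a compiler can then vectorize aggressively. The function names and sizes below are purely illustrative, not SeisSol's actual kernel interface.

```python
def generate_kernel(m, n, k, name="kernel"):
    """Emit Python source for an unrolled C += A*B kernel (A: m x k, B: k x n).

    In a real setting the generator would emit C with SIMD intrinsics;
    here we emit plain Python to keep the sketch self-contained.
    """
    lines = [f"def {name}(A, B, C):"]
    for i in range(m):
        for j in range(n):
            # One fully unrolled dot product per output entry.
            terms = " + ".join(f"A[{i}][{p}] * B[{p}][{j}]" for p in range(k))
            lines.append(f"    C[{i}][{j}] += {terms}")
    return "\n".join(lines)

# Generate and compile a specialized 2x2 kernel.
src = generate_kernel(2, 2, 2, name="mm2x2")
namespace = {}
exec(src, namespace)
mm2x2 = namespace["mm2x2"]

A = [[1.0, 2.0], [3.0, 4.0]]
B = [[5.0, 6.0], [7.0, 8.0]]
C = [[0.0, 0.0], [0.0, 0.0]]
mm2x2(A, B, C)  # C now holds the product A*B
```

The benefit of this approach is that loop bounds, index arithmetic, and sparsity patterns are resolved at generation time rather than at run time, which is exactly what makes specialized kernels faster than generic BLAS calls for the very small matrices arising per element.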

MISTI MIT-TUM Project: Combining Model Reduction with Sparse Grids into a Multifidelity Framework for Design, Control and Optimization

Webpage of MISTI

Project type MISTI Germany Project
Funded by MISTI
Begin January 2012
End September 2013
Leader Univ.-Prof. Dr. Hans-Joachim Bungartz
Staff Daniel Butnaru, M.Sc., Benjamin Peherstorfer, M.Sc.
Contact person Univ.-Prof. Dr. Hans-Joachim Bungartz
Co-operation partner Univ.-Prof. Dr. Karen Willcox (MIT)

Brief description

Many engineering problems require repeated simulations in order to model and optimize a real-life system. Such models are typically quite complex, and a single solution usually involves a huge computational effort. If a large number of such expensive solutions is needed, the models become impractical and alternatives are sought, with the goal of enabling interactive yet highly reliable, high-accuracy simulations. Surrogate models mimic the behavior of the simulation model as closely as possible while being computationally much cheaper to evaluate. While certain surrogate methods exist and perform well for specific problems, their adoption is slowed by their complexity and intrusiveness: they need to be reconsidered for each problem class and are sensitive to the characteristics of the underlying simulation.

In this project we establish a collaboration between MIT and TUM in the area of model reduction, with an initial focus on non-intrusive methods. These treat the simulation as a black box and, based only on a number of snapshots, deliver an approximation that can then be queried efficiently. The joint work will combine MIT's model-reduction techniques with TUM's sparse grid methods, with the goal of delivering a novel non-intrusive model reduction technique.
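A minimal sketch of the non-intrusive, snapshot-based idea: treat the full model as a black box, collect a handful of snapshots over the parameter range, build a reduced basis via the singular value decomposition (proper orthogonal decomposition), and represent new states by a few coefficients in that basis. The toy "simulation" below is a stand-in, not one of the actual MIT or TUM codes.

```python
import numpy as np

def simulate(mu, x):
    """Black-box 'full' model: a parameterized field evaluated on grid x."""
    return np.sin(mu * x) + 0.1 * np.cos(3 * mu * x)

# Collect snapshots of the black-box model over the parameter range.
x = np.linspace(0.0, np.pi, 200)
params = np.linspace(1.0, 2.0, 10)
snapshots = np.column_stack([simulate(mu, x) for mu in params])

# POD basis: leading left singular vectors of the snapshot matrix.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
r = 4                       # reduced dimension
basis = U[:, :r]

# A new full-order state is represented by r coefficients instead of
# 200 grid values, and reconstructed by expanding in the POD basis.
u_new = simulate(1.55, x)
u_red = basis.T @ u_new
u_rec = basis @ u_red

rel_err = np.linalg.norm(u_new - u_rec) / np.linalg.norm(u_new)
```

Crucially, nothing above required access to the simulator's internals (system matrices, residuals); only snapshot outputs were used, which is what "non-intrusive" means here. In a full surrogate, the reduced coefficients would additionally be interpolated over the parameter space, e.g. on a sparse grid, so that no new full-model solve is needed at query time.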

BaCaTeC: Scalable Tsunami and Atmospheric Simulation on Heterogeneous Manycore Platforms

Project type High-tech research collaborations between Bavaria and California
Funded by BaCaTeC
Begin July 2012
End December 2013
Leader Univ.-Prof. Dr. Michael Bader, Prof. Francis X. Giraldo
Staff Kaveh Rahnema, Alexander Breuer
Contact person Univ.-Prof. Dr. Michael Bader

Brief Description:

Applications in the geosciences rely more than most on the availability of extreme computational performance. The required HPC and supercomputing platforms increasingly build on heterogeneous accelerator hardware: general-purpose graphics processing units (GPGPUs) and mainstream manycore developments, such as Intel's MIC architecture, will soon dominate this field. While fully exploiting the performance of such platforms is a challenge of its own, the applications' need for dynamic adaptive mesh refinement makes the development of algorithms and software even more demanding. Within this BaCaTeC project, the two involved groups, together with two consultants at Intel, will combine their expertise to optimize their simulation codes for tsunami and atmospheric simulation (both using discontinuous Galerkin methods for discretization) for CPU- and GPU-based platforms.