Adieu SuperMUC

It weighs as much as 42 elephants and is now a thing of the past: after computing more than nine billion core hours and completing 6.3 million runs for researchers, the 250-ton SuperMUC, actually a system consisting of two machines, is being partly decommissioned and recycled. The Free State of Bavaria and the German Federal Government invested around 133 million euros in SuperMUC Phase 1 and Phase 2 at the Leibniz Supercomputing Centre (LRZ) in Garching near Munich. "The rapid development of computer technology means that such a computer is outdated within six to seven years. Its operation is no longer economical; a new system provides more performance for less money," explains Professor Dieter Kranzlmüller, Director of the LRZ. "But computing power is not everything. As a service provider, we support and advise scientists in the modelling of data and in the development of simulations for supercomputers."

6 Computer Years Replace 24,000 Human Years

Between 2012 and 2018/19, SuperMUC was used by 2,230 researchers from 23 nations for 820 projects. The largest project, "Observable Nucleons as a Test Case for (Particle) Physics Beyond Standard Models", consumed almost 212 million core hours. For comparison: at least 300 people would have had to calculate for 80 years to evaluate the data on neutrons and protons and complete the task set by the Deutsches Elektronen-Synchrotron (DESY), a research centre of the Helmholtz Association. These calculations help to characterise dark matter in space and may provide ideas for high-tech materials.
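
A rough, purely illustrative arithmetic check of this comparison, using only the figures quoted above (the conversion of core hours into core-years is an assumption for illustration, not an official LRZ calculation), sketched in Python:

    people = 300                  # "at least 300 people"
    years_each = 80               # "would have had to calculate for 80 years"
    person_years = people * years_each
    print(person_years)           # 24,000 person-years, the figure in the heading above

    core_hours = 212_000_000      # "almost 212 million core hours"
    core_years = core_hours / (24 * 365)
    print(round(core_years))      # roughly 24,200 core-years of machine time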

"It is unworthy to waste the time of outstanding people with servant-like computing", said the researcher Gottfried Wilhelm Leibniz. Because machines calculate more accurately and faster, SuperMUC (Phase 1 and 2 combined) were built from 12525 nodes and around 240,000 computing cores. Arranged in 238 racks and networked with more than 250 kilometers of fiber-optic cable and 46 kilometres of copper tubing, the computer achieved 6.8 petaflops of peak performance. In 2012, the first implemention of SuperMUC– SuperMUC Phase 1– was ranked the fourth fastest computer in the world, in 2018 it only reached 64th place. The whole system achieved 6.8 PFLOP/s in its best times. Computer records are more volatile than sports records.

Big data in science

By the time SuperMUC was installed, computational fluid dynamics (approx. 2.16 billion core hours), astrophysics (1.9 billion) and bioinformatics (1 billion) were producing so much data that only a supercomputer could evaluate it. Digitalisation has led to an explosion in data volumes in other domain sciences as well. SuperMUC calculated for meteorologists (153 million core hours), plasma physicists (80 million), structural engineers and materials experts (about 35 million each), and most recently even for economists (10,000 hours) and chemical physicists (42,239 hours).

During SuperMUC’s years of operation, the LRZ supported researchers in their endeavours: over the course of 150 workshops and courses, 4,000 specialists were trained to work on SuperMUC. In addition, science and research have become more international and more closely networked: SuperMUC processed almost 5.4 million runs from within Germany, mainly from Bavaria (4.5 million) and North Rhine-Westphalia (171,426), while 10 percent of its computing time, or about 3 percent of its jobs, was awarded to researchers abroad, mainly in Italy (34,381 jobs) and France (34,170). SuperMUC was also used almost 114,000 times from elsewhere in the EU and 30,000 times from Norway, Switzerland, Turkey, Israel, New Zealand and the USA.

Fame and glory

The Garching supercomputer won international praise: for example, it computed several 2D and 3D models of earthquake waves for the SeisSol project. One of the international team's simulations reached the finals of the Gordon Bell Prize for high-performance computing in 2014. But fast computing requires energy, and since SuperMUC went into operation its electricity has come from renewable sources. The supercomputer needed an average of 30 GWh per year, about as much as a city with 30,000 inhabitants.

That sounds like a lot, but the system was extremely economical: SuperMUC reused water and heat in as many ways as possible, and its award-winning, innovative technology saved Bavaria and the federal government more than 5 million euros in electricity costs.
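
A similarly rough sketch of the energy comparison, again in Python and based only on the numbers given here (the resulting per-capita figure is an illustration, not an official statistic):

    annual_energy_gwh = 30                # "an average of 30 GWh per year"
    inhabitants = 30_000                  # "a city with 30,000 inhabitants"
    kwh_per_person = annual_energy_gwh * 1_000_000 / inhabitants
    print(kwh_per_person)                 # 1000.0 kWh per inhabitant per year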


[Infographic: Achievements of SuperMUC]