

You have just been appointed chairman of the Gauss Centre for Supercomputing (GCS). What initiatives have you placed on the agenda for the next two years?
Prof. Dieter Kranzlmueller: Exciting challenges lie ahead of us, for which we are jointly developing strategies within the GCS network. We are in the planning phase for the successor systems to Blue Lion at LRZ, Herder at HLRS, and Jupiter at JSC – in other words, the post-exascale generation of GCS systems. Classical high-performance computing is currently being transformed by artificial intelligence (AI) and quantum computing. While AI – and with it, graphics processing units (GPUs) – has become almost routine in scientific computing, we are now beginning to operate the first quantum systems, which are connected to our supercomputers as accelerators and give researchers additional capabilities for computation and simulation. This sparks new demands among users, and we are in the process of identifying these needs and aligning them with the latest technological developments.
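To make the accelerator idea concrete, here is a minimal, purely illustrative Python sketch of the hybrid pattern behind many quantum-classical workflows: a classical optimiser running on the supercomputer repeatedly submits small evaluation jobs to a quantum device. The `quantum_expectation` function below is a classical stand-in for such a job, not an interface of any GCS system.

```python
import numpy as np

def quantum_expectation(theta: np.ndarray) -> float:
    """Stand-in for a job sent to a quantum accelerator.

    In a real hybrid workflow, this call would be routed through the
    centre's middleware to the attached quantum system; here we emulate
    it classically with a simple cost landscape.
    """
    return float(np.cos(theta[0]) + np.sin(theta[1]))

def hybrid_loop(steps: int = 50, lr: float = 0.1) -> np.ndarray:
    """Classical optimiser steering a (simulated) quantum kernel."""
    theta = np.array([0.5, 0.5])
    for _ in range(steps):
        grad = np.zeros_like(theta)
        for i in range(len(theta)):
            # Finite differences: each evaluation is one "quantum" job.
            shift = np.zeros_like(theta)
            shift[i] = 1e-3
            grad[i] = (quantum_expectation(theta + shift)
                       - quantum_expectation(theta - shift)) / 2e-3
        theta -= lr * grad  # classical update on the HPC side
    return theta

print("optimised parameters:", hybrid_loop())
```

The point of the pattern is that the quantum device acts like any other accelerator: the surrounding orchestration, data handling, and optimisation remain classical HPC tasks.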
Much of this involves long-term planning. How is that being affected by the change in government?
Kranzlmueller: As publicly funded scientific computing centres, we coordinate closely with funding agencies at both the federal and state levels. The change in government in Berlin has transformed the Bundesministerium für Bildung und Forschung (the Federal Ministry of Education and Research) into the Federal Ministry of Research, Technology and Space. The new name signals a shift in focus that will influence our tasks: supercomputing is part of science and research, which in turn drive technological advances in IT as well as in space exploration. We will now work with the responsible officials in the ministries to determine how we, as HPC centres, can respond to these demands.
Europe is also promoting supercomputing, currently with a particular focus on building scalable resources for AI.
Kranzlmueller: After the decision to establish 13 AI Factories for science and industry in Europe – including one each at HLRS in Stuttgart and at JSC in Jülich – the EU is planning additional, larger so-called Giga Factories to ensure that Europe remains at the forefront of AI development and application. The tender for potential locations is currently underway; here, we at GCS are called upon to contribute our expertise in building and operating powerful large-scale facilities. Highly scalable AI clusters raise important questions, for example how they will be powered, which affects site selection, as well as questions of operations and staffing. Technical and political challenges, the development of innovative systems, additional services: within the GCS network, we will not run out of work. Personally, I approach these tasks with respect, but also with excitement at the opportunity to help shape the future over the next few years together with Thomas Lippert and Michael Resch.
What priorities are personally important to you in this regard?
Kranzlmueller: My top priority is providing efficient, sustainable, and innovative computing resources for leading, internationally relevant research projects. Training and qualifying scientists so that they can make even better use of the resources at the three GCS centres is also very important to me. Today, this especially means taking the growing technical heterogeneity and complexity of systems into account when developing scientific codes, as well as addressing questions of energy consumption and efficiency. AI and, more recently, quantum computing are becoming ever more closely integrated with classical supercomputing, opening up many exciting new methods and opportunities – especially for processing research data from a wide range of disciplines. For the scientific computing centres, this means not only evaluating technical innovations but, above all, significantly expanding and continually improving the support and consulting services offered to researchers.
You’ve already touched on this: we are seeing growing heterogeneity in high-performance computing systems, due to different types of processors and accelerators as well as quantum systems. What does this mean for future GCS acquisitions?
Kranzlmueller: Each centre follows a complementary strategy in designing its computing architecture, so that the three supercomputers can demonstrate strengths in different tasks and meet different user needs. System architectures will certainly become even more complex. We are currently discussing what our users will need in order to work effectively with these systems – specifically, whether large central processing unit (CPU) resources will continue to be provided at the GCS centres. CPUs remain in high demand for classical simulation and modelling. So on the one hand, research's growing appetite for computing power can only be satisfied through heterogeneous systems. On the other hand, society, politics, and even manufacturers are currently placing a strong focus on AI – but classical high-performance computing (HPC) must not be overlooked in the process.
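One widely used way for application codes to cope with this heterogeneity is to program against a common array API and pick the backend at run time. The NumPy/CuPy pairing below is a minimal sketch of that pattern, assuming CuPy and a CUDA device are available for the accelerated path; it is not a description of GCS software.

```python
import numpy as np

try:
    import cupy as cp  # GPU path, assuming CuPy and a CUDA device exist
    xp = cp
except ImportError:
    xp = np            # CPU fallback: identical array API

def jacobi_step(u):
    """One Jacobi relaxation step on a 2D grid, backend-agnostic."""
    v = u.copy()
    v[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1] +
                            u[1:-1, :-2] + u[1:-1, 2:])
    return v

u = xp.zeros((1024, 1024))
u[0, :] = 1.0          # fixed boundary condition on one edge
for _ in range(100):
    u = jacobi_step(u)
print("backend:", xp.__name__,
      "| mean of first interior row:", float(u[1].mean()))
```

The same source then runs on a CPU partition or a GPU-accelerated node – exactly the portability question users face as architectures diversify.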
Does this mean HPC funding is falling behind, given that programs for AI and research are currently at the top of the political agenda in Germany and Europe?
Kranzlmueller: That would be very short-sighted. Results from HPC are fundamentally important for AI applications. In the environmental and natural sciences, for example, mathematical-physical simulations provide the data for advanced statistical models – and conversely, these models help to optimise, extend, or vary the mathematical models.
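As a toy illustration of that two-way loop, the following Python sketch fits a cheap statistical surrogate to a handful of expensive simulation runs and then uses it to suggest where to simulate next. The `simulate` function is a hypothetical stand-in for a real HPC code.

```python
import numpy as np

def simulate(x: float) -> float:
    """Stand-in for one expensive simulation run on a supercomputer."""
    return float(np.sin(3 * x) * np.exp(-x))  # toy 'physics'

# 1) HPC side: a few costly simulation runs produce training data.
xs = np.linspace(0.0, 2.0, 8)
ys = np.array([simulate(x) for x in xs])

# 2) Statistics/AI side: fit a cheap surrogate model to that data.
surrogate = np.poly1d(np.polyfit(xs, ys, deg=5))

# 3) The surrogate scans the parameter space at negligible cost and
#    points the next expensive simulation at the most promising region.
grid = np.linspace(0.0, 2.0, 1000)
x_next = grid[np.argmin(surrogate(grid))]
print(f"surrogate suggests the next simulation at x = {x_next:.3f}")
```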
Do the European AI and Giga Factories, or regional initiatives like BayernKI, compete with the supercomputing centres?
Kranzlmueller: There’s no need to immediately assume competition — the AI Factories and regional initiatives complement supercomputing, create synergies between computing centres, and offer science and research more options for efficient, tailored data processing. For Germany, the EuroHPC Joint Undertaking has placed two European AI Factories in GCS centres. In addition, we have built up all necessary AI resources: the current HPC systems are GPU-accelerated and AI-capable. As with supercomputing, the demand for more performance grows with experience in applying AI methods. In that sense, AI also prepares our users for supercomputing: for example, working with large language models (LLMs) requires not only fast data throughput but also high computing power — and that is often still underestimated.
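How quickly LLM work turns into an HPC problem can be seen with a rough back-of-envelope estimate. The figures below use a commonly cited rule of thumb, roughly 16 bytes of model and optimiser state per parameter for mixed-precision Adam training; they are illustrative, not GCS measurements.

```python
# Memory for model and optimiser state when training a dense
# transformer with Adam in mixed precision (rule of thumb,
# before activations and framework overhead).
params = 7e9                         # e.g. a 7-billion-parameter model
bytes_per_param = (2 + 2             # bf16 weights + bf16 gradients
                   + 4               # fp32 master copy of the weights
                   + 4 + 4)          # fp32 Adam moments m and v
total_gib = params * bytes_per_param / 2**30
print(f"state alone: about {total_gib:.0f} GiB")  # ~104 GiB
# Already beyond a single GPU's memory, so the model must be sharded
# across many devices: high computing power and fast data throughput.
```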
Due to the energy-hungry nature of AI applications, the energy question is becoming more pressing: The LRZ is planning a substation for its own energy supply. Are there joint efforts or guidelines within GCS for greater sustainability?
Kranzlmueller: Within GCS, we are in close dialogue on these topics, testing and researching different approaches so that they can be compared. Binding guidelines would restrict or regulate our basic research. This also applies to other areas such as system and software development or building management – factors that likewise influence the energy consumption of computing systems. Because each computing centre is independent in strategic matters, diverse solutions emerge, which we can evaluate and compare. That is efficient, practical, and also politically desired.
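One simple metric that makes such comparisons between centres possible is Power Usage Effectiveness (PUE): total facility energy divided by the energy consumed by the IT equipment itself. The numbers in this small sketch are hypothetical, chosen only to show the calculation.

```python
# Power Usage Effectiveness: facility energy / IT equipment energy.
it_energy_mwh = 40_000        # hypothetical: energy drawn by the systems
facility_energy_mwh = 46_000  # hypothetical: systems + cooling + losses
pue = facility_energy_mwh / it_energy_mwh
print(f"PUE = {pue:.2f}")     # 1.0 is the ideal; lower is better
```

Different cooling and building-management choices show up directly in such figures, which is what makes comparing approaches across independent centres meaningful.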
Currently, researchers apply for computing time for large-scale projects through GCS, and the Scientific Steering Committee assigns them to the appropriate centre in Garching, Jülich, or Stuttgart. Are applications now also being forwarded to AI clusters or AI Factories?
Kranzlmueller: The three supercomputing centres collaborate closely with the National High-Performance Computing alliance (NHR-Verein) and use the same application system. This allows us to flexibly assign projects to the centre with the most suitable resources – whether that is a supercomputer or an AI cluster. In addition, GCS and NHR jointly host a regular AI and HPC Café, where researchers can quickly clarify practical questions online with experts.
Supercomputing is international – how is it affected by current U.S. politics? Are research projects impacted by funding cuts or halted data transfers?
Kranzlmueller: Science itself is non-political in its content, but it does provide impetus for policymaking. I am very glad that the currently erratic U.S. political climate has not affected collaborations between research institutions. At the same time, it is clear that calls for European sovereignty – especially in areas like IT and digital transformation – are gaining traction. That is a positive development. I think it would be beneficial if we focused more on our strengths and worked more closely together in Europe on topics like energy efficiency, software development, and basic research. We should also expand international partnerships with Asia and the Global South. There is still much we can learn from their ways of thinking and working – which would benefit us not only scientifically, but also culturally and strategically. (vs | LRZ)