

High-performance computing (HPC) has become an indispensable tool for astrophysics. Researchers who want to better understand our universe often turn to supercomputing to model phenomena in our galaxy that are either too large or too complex to probe otherwise. Recently, a research team led by James Beattie, a postdoctoral researcher at Princeton University and the Canadian Institute for Theoretical Astrophysics at the University of Toronto, focused on better understanding how magnetic fields influence the highly turbulent motions of the interstellar medium (ISM). Using HPC resources at the Leibniz Supercomputing Centre (LRZ), one of the three centers that comprise the Gauss Centre for Supercomputing (GCS), the team was able to model turbulence in the ISM in unprecedented detail, calling long-held assumptions about the role of magnetic turbulence into question and opening new research directions for next-generation experiments in space. The team published its work in Nature Astronomy.
“The ISM is ultimately the source from which new stars are formed,” said Salvatore Cielo, application support specialist in LRZ’s Astrolab and co-author on the paper. “Understanding exactly how this process works is a longstanding problem in astrophysics, one that theory alone cannot yet explain from first principles. In fact, our simulations managed to predict a behavior of the magnetic energy that has evaded other theoretical models.” In addition to Princeton University, the University of Toronto, and LRZ, the team received support from researchers at the Australian National University and Heidelberg University.
Physicists consider turbulence one of the last major unsolved challenges in their discipline, and for good reason: turbulent motions of fluids are inherently chaotic, and accurately studying or simulating how these chaotic motions influence one another across all scales has proven elusive. Moreover, turbulent vortices fragment into ever smaller ones, creating a “cascade” of energy from large to small structures. The ISM adds a further challenge: turbulence in the galaxy is also influenced by magnetic fields. While the magnetic fields from far-away stars may be weak, turbulent vortices in the ISM amplify them like a spinning dynamo.
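To see why the largest structures dominate this cascade, consider the classic Kolmogorov spectrum for unmagnetized turbulence, in which energy falls off with wavenumber k as E(k) ∝ k^(-5/3). The short calculation below is a purely illustrative sketch (the scale ranges are assumed, and real magnetized ISM turbulence, the subject of this study, deviates from this simple picture):

```python
# Energy budget of an idealized Kolmogorov cascade, E(k) ~ k^(-5/3).
# Purely illustrative: magnetized ISM turbulence deviates from this picture.
def band_energy(k_lo, k_hi):
    # Analytic integral of k^(-5/3) dk, i.e. -(3/2) * k^(-2/3)
    return 1.5 * (k_lo ** (-2 / 3) - k_hi ** (-2 / 3))

total = band_energy(1, 10_000)  # four decades of scales, arbitrary units
for k_lo, k_hi in [(1, 10), (10, 100), (100, 1_000), (1_000, 10_000)]:
    share = band_energy(k_lo, k_hi) / total
    print(f"k = {k_lo:>5}-{k_hi:<6}: {100 * share:5.1f}% of turbulent energy")
```

Nearly 80 percent of the energy resides in the largest decade of scales, which is why the cascade is described as feeding energy from large structures down to ever smaller ones.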
To model such a pervasive, fluid-like structure inside the ISM, researchers use magnetohydrodynamics (MHD) simulations. On a supercomputer, the researchers create a grid, then solve the MHD equations within each grid cell; these equations describe how the magnetic fields, particles, and gas in the ISM influence one another. The finer the grid’s mesh, the more physics they can fold into the simulation, but more grid points mean higher computational costs. “Turbulence in the ISM does not have any defined geometry or symmetry that you can use to simplify the modeling, so you really have to optimize your code and tackle these calculations with a brute-force computational approach,” Cielo said.
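To make the grid-based approach concrete, the toy solver below steps a one-dimensional, isothermal version of the ideal MHD equations on a periodic grid with a simple Lax-Friedrichs finite-volume update. It is a minimal sketch under assumed parameters (grid size, sound speed, and field strengths are arbitrary), not the production code used in this study, which solves the full 3D equations with far more sophisticated numerics:

```python
# Toy 1D isothermal ideal-MHD solver on a periodic grid (Lax-Friedrichs).
# A minimal sketch of the grid-based approach described above, NOT the
# production code used in the study; all parameter values are illustrative.
import numpy as np

N   = 512            # grid cells (the actual study used a 10,080^3 grid in 3D)
dx  = 1.0 / N        # cell size, arbitrary units
cs  = 1.0            # isothermal sound speed (gas pressure p = cs^2 * rho)
Bx  = 1.0            # field component along x, constant in 1D ideal MHD
cfl = 0.4            # Courant safety factor for the time step

# Conserved variables per cell: density, x/y momentum, transverse field By.
x = (np.arange(N) + 0.5) * dx
U = np.stack([1.0 + 0.1 * np.sin(2 * np.pi * x),   # perturbed density
              np.zeros(N), np.zeros(N),            # momenta start at rest
              0.1 * np.ones(N)])                   # weak transverse field

def flux(U):
    """Physical flux of the 1D isothermal ideal-MHD equations."""
    rho, mx, my, By = U
    vx, vy = mx / rho, my / rho
    p_tot = cs**2 * rho + 0.5 * (Bx**2 + By**2)    # gas + magnetic pressure
    return np.stack([
        mx,                        # mass flux
        mx * vx + p_tot - Bx**2,   # x-momentum (pressure and magnetic tension)
        my * vx - Bx * By,         # y-momentum
        By * vx - Bx * vy,         # induction equation for By
    ])

t, t_end = 0.0, 0.2
while t < t_end:
    rho, mx = U[0], U[1]
    # The fast magnetosonic speed bounds signal speeds and sets the time step.
    cfast = np.sqrt(cs**2 + (Bx**2 + U[3]**2) / rho)
    dt = cfl * dx / np.max(np.abs(mx / rho) + cfast)

    F = flux(U)
    Up, Fp = np.roll(U, -1, axis=1), np.roll(F, -1, axis=1)
    # Lax-Friedrichs interface flux at i+1/2, then a conservative update.
    Fh = 0.5 * (F + Fp) - 0.5 * (dx / dt) * (Up - U)
    U  = U + (dt / dx) * (np.roll(Fh, 1, axis=1) - Fh)
    t += dt

print("density range after t=0.2:", U[0].min(), "-", U[0].max())
```

Because each cell exchanges information only with its immediate neighbors during an update, solvers of this kind can be decomposed across many thousands of cores, which is exactly the brute-force scaling Cielo describes.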
To get a more accurate view of how magnetic turbulence influences the ISM, the team created a high-resolution grid with 10,080 cells per dimension, representing a portion of the ISM about 30 light-years per side: large enough to encompass structures such as extremely dense regions and voids, as well as plasmoids (a form of confined magnetic “bubbles”), and to provide statistics on how the magnetic fields align with the flow. The model can also be scaled down to study finer phenomena, such as the wind and plasma ejections coming from the sun, which influence satellites and space travel.
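A quick back-of-the-envelope calculation shows why a 10,080^3 grid demands a machine of this class. The number of variables stored per cell and their precision in the sketch below are assumptions for illustration, not figures from the paper:

```python
# Rough sizing of a 10,080^3 simulation grid. The per-cell variable count
# and precision are illustrative assumptions, not figures from the paper.
n_cells_1d = 10_080
box_ly     = 30.0        # simulated box size in light-years
n_vars     = 8           # e.g. density, energy, 3 velocity and 3 B components
bytes_each = 4           # single precision (float32)

cells = n_cells_1d ** 3
snapshot_tb = cells * n_vars * bytes_each / 1e12

print(f"total cells:        {cells:.2e}")                        # ~1.0e12
print(f"cell size:          {box_ly / n_cells_1d * 63_241:.0f} AU per side")
print(f"one field snapshot: {snapshot_tb:.0f} TB")               # ~33 TB
```

Roughly a trillion cells means tens of terabytes just to hold a single snapshot of the fields in memory, before any fluxes, communication buffers, or I/O are counted, which helps explain why the run spanned roughly half of SuperMUC-NG Phase 1.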
As a member of LRZ’s Astrolab, Cielo works closely with researchers who have won allocations of computing time for astrophysics research. Working with Beattie’s team, Cielo helped optimize the code to run across 140,000 cores on LRZ’s SuperMUC-NG Phase 1, roughly half the entire system. The team used mixed-precision algorithms, an optimization also common in machine learning and artificial intelligence applications, to reduce the memory load in less-critical parts of the simulation, freeing resources to expand the simulation’s size. Cielo also led an interactive, three-dimensional visualization of the data, which likewise required hundreds of nodes on SuperMUC-NG Phase 1.

This visualization, along with a detailed 2D visualization created by Beattie, gave the team new insights into how magnetic fields alter the energy cascading through the ISM: they suppress some of the smallest-scale turbulent motions while enhancing Alfvén waves, low-frequency disturbances powered by magnetic fields in the cosmos. These findings suggest that magnetic fields may play a bigger role in stabilizing the ISM against excessive fragmentation than previous models have indicated, and they give scientists a new line of inquiry toward more accurate magnetic turbulence simulations in the future. “With this extremely high-resolution work, we are able to make qualitatively new predictions that take us another step closer to unraveling the mystery of stability and energy transport in the ISM,” Cielo said. “The problem of magnetized turbulence touches fundamental chords in several branches of physics, so we hope that many other scientists will find this work inspiring. We encourage curious people to read the paper, read about the research happening on our groups’ sites, and contact us about exciting new research projects for SuperMUC-NG.”
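The mixed-precision idea mentioned above can be illustrated with a small, generic example: keep bulk field data in single precision to halve its memory footprint, while carrying out sensitive accumulations in double precision. This is a sketch of the general technique on assumed data, not the scheme used in the team’s code:

```python
# Generic mixed-precision pattern: bulk data stored in float32 (half the
# memory of float64), with sensitive global reductions accumulated in
# float64. An illustration of the technique, not the team's implementation.
import numpy as np

rng = np.random.default_rng(42)
# Ten million values with a nonzero mean, stored in single precision.
field = (1.0 + 0.01 * rng.standard_normal(10_000_000)).astype(np.float32)
print(f"storage: {field.nbytes / 1e6:.0f} MB as float32 "
      f"(vs {2 * field.nbytes / 1e6:.0f} MB as float64)")

chunks = np.split(field, 1_000)

# Accumulating entirely in float32 lets round-off error build up...
total32 = np.float32(0.0)
for chunk in chunks:
    total32 = np.float32(total32 + chunk.sum(dtype=np.float32))

# ...whereas summing each chunk in float32 but accumulating the partial
# results in float64 keeps the global reduction accurate.
total_mixed = np.float64(0.0)
for chunk in chunks:
    total_mixed += np.float64(chunk.sum(dtype=np.float32))

exact = field.sum(dtype=np.float64)
print(f"pure float32 error:    {abs(total32 - exact):.2e}")
print(f"mixed-precision error: {abs(total_mixed - exact):.2e}")
```

In a simulation that is limited by memory, halving the bytes per value shrinks both the footprint and the data traffic, which is the same reason the technique is popular in machine learning.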
Cielo continues to support the team as it uses its findings to inform new simulations running on LRZ resources. As the center finishes acceptance testing of SuperMUC-NG Phase 2, a significant upgrade to the center’s flagship system, the team looks forward to porting its application to take advantage of the GPUs available on the new machine. “This is another aspect of supporting our users for the next generation of HPC—it requires large efforts to port software to run on GPUs; there is no free lunch,” Cielo said. Cielo and the research team have already begun exploring several options for porting this and other applications to GPUs, as well as evaluating existing GPU-ready codes. The team is also in the early stages of exploring AI-accelerated algorithms that could help free up precious memory in a simulation primarily limited by data moving on and off memory.
For Cielo, the team’s focused work has created a strong foundation to build on, and he and other support staff in LRZ’s Astrolab will continue to provide optimization and visualization support to further sharpen the insights gleaned from this recent work. “They run very optimized software, and we help create the perfect computational environment for the work—the software, compilers, and everything needed to run well on our hardware,” Cielo said. “SuperMUC-NG Phase 1 and 2 are well-suited for these kinds of simulations, and with a great research team and a good code, you can scale up to new computational heights.”

Eric Gedenk | GCS
Photo credit © James Beattie | Princeton University