Technology

AI for new science discovery: Researchers create first Milky Way simulation to track more than 100 billion stars

Sounak Mukhopadhyay
Researchers have developed the first Milky Way simulation capable of tracking over 100 billion stars for ten thousand years. Utilising deep learning and high-resolution physics, this method enhances astrophysics and offers potential applications in climate and weather research.

Researchers in Japan and Spain have created the first Milky Way simulation capable of tracking more than 100 billion individual stars over a period of 10,000 years, combining deep learning with high-resolution physics.

The researchers are led by Keiya Hirashima of the RIKEN Centre for Interdisciplinary Theoretical and Mathematical Sciences (iTHEMS) in Japan, working with partners at the University of Tokyo and the Universitat de Barcelona in Spain, according to ANI.

Scientists have long struggled to model a galaxy as large as the Milky Way with enough detail to track single stars. Existing simulations can handle systems of about one billion suns, which falls far short of the Milky Way’s more than 100 billion stars.

The new AI model learned how gas behaves, making the method hundreds of times faster than older approaches while tracking 100 times more stars than earlier work.

The breakthrough, showcased at the SC ’25 supercomputing conference, marks a big step for astrophysics, high-performance computing and AI-supported modelling. The same method could also help large Earth system studies, including climate and weather research.

In conventional models, each tiny “particle” often represents nearly 100 stars, which hides individual behaviour and weakens accuracy. A further problem arises from the tiny time steps required to capture fast events, such as supernovae.

Smaller steps demand huge computing power: a full star-by-star Milky Way simulation would take decades with conventional methods, and simply adding more supercomputer cores is neither efficient nor practical.
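The time-step bottleneck above can be illustrated with a back-of-the-envelope cost model. The numbers here are hypothetical, chosen only to show the scaling; they are not figures from the study:

```python
# Illustrative cost model (hypothetical numbers, not from the study):
# total work scales with the number of time steps, and the step size
# is set by the fastest process the simulation must resolve.

def simulation_steps(total_years, timestep_years):
    """Number of time steps needed to cover total_years."""
    return total_years / timestep_years

# A coarse step is adequate for slow galactic dynamics, but resolving
# a fast event such as a supernova forces a far smaller step.
coarse = simulation_steps(10_000, 100)   # slow dynamics only
fine = simulation_steps(10_000, 0.01)    # resolving fast blast physics

print(f"coarse-step run: {coarse:,.0f} steps")
print(f"fine-step run:   {fine:,.0f} steps")
print(f"cost blow-up:    {fine / coarse:,.0f}x")
```

With these illustrative values, resolving the fast events inflates the step count by a factor of 10,000, and that multiplier applies to every one of the simulation's billions of particles.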

How researchers tackled limits of galactic modelling

Hirashima and his team tackled the limits of galactic modelling by combining deep learning with traditional physics. Their method used a trained surrogate model that learned gas behaviour from detailed supernova simulations.

This surrogate predicted how gas spreads for nearly 100,000 years after each blast without slowing the main run. The hybrid approach kept the galaxy’s broad structure accurate while still capturing small-scale events, such as individual supernovae.
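The hybrid idea can be sketched in a few lines. This is a minimal illustration, not the authors' code: the function names, the stand-in surrogate formula, and the density values are all hypothetical, standing in for a trained neural network and a full physics solver:

```python
# Sketch of the hybrid loop: advance the galaxy on coarse physics
# steps, and when a supernova fires, ask a pre-trained surrogate for
# the resulting gas state instead of integrating the blast on tiny
# time steps. All values and formulas here are illustrative.

def surrogate_gas_response(local_gas_density):
    # Stand-in for the trained model: returns the gas density after
    # the blast has dispersed material (hypothetical formula).
    return 0.5 * local_gas_density

def hybrid_step(gas_density, supernova_occurred):
    if supernova_occurred:
        # One surrogate call replaces the expensive fine-grained
        # blast physics covering ~100,000 years of gas evolution.
        return surrogate_gas_response(gas_density)
    # Otherwise advance with the ordinary coarse physics step
    # (placeholder here: density left unchanged).
    return gas_density

density = 1.0
density = hybrid_step(density, supernova_occurred=False)  # ordinary step
density = hybrid_step(density, supernova_occurred=True)   # surrogate handles blast
print(density)  # 0.5
```

The design point is that the surrogate is only invoked at supernova events, so the main run never drops to the tiny time steps those events would otherwise demand.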

Testing the results against large runs on the Fugaku and Miyabi supercomputers, the team found close agreement. Similar methods could transform weather, ocean and climate studies.

"I believe that integrating AI with high-performance computing marks a fundamental shift in how we tackle multi-scale, multi-physics problems across the computational sciences," ANI quoted Hirashima as saying.

"This achievement also shows that AI-accelerated simulations can move beyond pattern recognition to become a genuine tool for scientific discovery -- helping us trace how the elements that formed life itself emerged within our galaxy," Hirashima added.

by Mint