Astrophysicists reveal the largest simulation of the universe ever made

To understand how the universe formed, astronomers have created more than 160 simulations of how gravity shapes the distribution of dark matter.

The newly released suite of cosmic simulations is the largest ever produced, collectively tracking nearly 60 trillion particles.

The simulation suite, called AbacusSummit, will be instrumental in extracting the secrets of the universe from upcoming sky surveys, its creators hope. They presented AbacusSummit in several research papers recently published in Monthly Notices of the Royal Astronomical Society.

AbacusSummit is the product of researchers at the Center for Computational Astrophysics (CCA) at the Flatiron Institute in New York City and the Center for Astrophysics | Harvard & Smithsonian. Consisting of more than 160 simulations, it models how particles in the universe move under the pull of their mutual gravity. Such models, known as N-body simulations, capture the behavior of dark matter, the mysterious and invisible substance that makes up 27 percent of the universe and interacts only through gravity.

How gravity shapes the distribution of dark matter

The AbacusSummit collection includes hundreds of simulations of how gravity shapes the distribution of dark matter throughout the universe. Here, a snapshot from one of the simulations is shown at a scale spanning 1.2 billion light-years. The simulation replicates the large-scale structures of our universe, such as the cosmic web and colossal galaxy clusters. Credit: The AbacusSummit Team; layout and design by Lucy Reading-Ikkanda

Lehman Garrison, lead author of one of the new papers and a research associate at the CCA, led the development of the simulation code, called Abacus, together with graduate student Nina Maksimova and astronomy professor Daniel Eisenstein, both of the Center for Astrophysics. The simulations were run on the Summit supercomputer at the US Department of Energy's Oak Ridge Leadership Computing Facility in Tennessee.


Many upcoming surveys will produce maps of the universe in unprecedented detail in the years to come. These include the Dark Energy Spectroscopic Instrument (DESI), a spectrographic survey instrument retrofitted onto the Mayall Telescope atop Kitt Peak in Arizona; the Nancy Grace Roman Space Telescope; the Vera C. Rubin Observatory; and the Euclid spacecraft. One of the goals of these big-budget missions is to improve estimates of the cosmological and astrophysical parameters that determine how the universe behaves and how it looks.

Scientists will make those improved estimates by comparing the new observations with computer simulations of the universe that use different values for those parameters, such as the nature of the dark energy pulling the universe apart.
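
As a toy illustration of that comparison, here is a minimal Python sketch (not the surveys' actual analysis pipelines): model predictions are computed on a grid of candidate parameter values and scored against an observed statistic, with the hypothetical `model_prediction` function standing in for a full simulation.

```python
import numpy as np

def model_prediction(omega_m, sigma8, k):
    """Toy stand-in for a simulation-based prediction of a
    clustering statistic (NOT a real cosmological model)."""
    return omega_m * np.exp(-k) + sigma8 * k

# Hypothetical "observed" data points with uncertainties.
k = np.linspace(0.1, 1.0, 10)
rng = np.random.default_rng(42)
observed = model_prediction(0.31, 0.81, k) + rng.normal(0, 0.01, k.size)
sigma = np.full(k.size, 0.01)

# Scan a grid of candidate parameter values, as a survey analysis
# might compare data against simulations run with different cosmologies.
best = None
for omega_m in np.linspace(0.25, 0.35, 21):
    for sigma8 in np.linspace(0.75, 0.85, 21):
        resid = (observed - model_prediction(omega_m, sigma8, k)) / sigma
        chi2 = np.sum(resid**2)
        if best is None or chi2 < best[0]:
            best = (chi2, omega_m, sigma8)

print(f"best fit: omega_m={best[1]:.3f}, sigma8={best[2]:.3f}, chi2={best[0]:.1f}")
```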

AbacusSummit leverages parallel computing

Abacus uses parallel computer processing to speed up its calculations of how particles move under their mutual gravity. A sequential processing approach (top) calculates the attraction between each pair of particles one at a time. Parallel processing (bottom) instead divides the work across multiple compute cores, allowing many particle interactions to be computed simultaneously. Credit: Lucy Reading-Ikkanda/Simons Foundation
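
A small Python sketch of the contrast the figure describes, using a toy direct-sum gravity calculation (illustrative only, not Abacus's actual GPU kernels): the sequential version visits pairs one at a time, while the data-parallel version evaluates all pairs at once, the way a GPU would assign pairs to threads.

```python
import numpy as np

G = 1.0  # gravitational constant in simulation units

def forces_sequential(pos, mass):
    """Sequential approach: visit each particle pair one at a time."""
    n = len(pos)
    acc = np.zeros_like(pos)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            d = pos[j] - pos[i]
            r2 = d @ d + 1e-10  # softening avoids division by zero
            acc[i] += G * mass[j] * d / r2**1.5
    return acc

def forces_parallel(pos, mass):
    """Data-parallel approach: compute all pairwise separations at once;
    on a GPU each pair would map to its own thread."""
    d = pos[None, :, :] - pos[:, None, :]          # (n, n, 3) separations
    r2 = np.sum(d * d, axis=-1) + 1e-10
    np.fill_diagonal(r2, np.inf)                   # no self-interaction
    return G * np.sum(mass[None, :, None] * d / r2[:, :, None]**1.5, axis=1)

pos = np.random.rand(200, 3)
mass = np.ones(200)
assert np.allclose(forces_sequential(pos, mass), forces_parallel(pos, mass))
```

The assertion at the end confirms that the two strategies give the same answer; only the scheduling of the work differs.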

“The next generation of cosmological surveys will map the universe in great detail and explore a wide range of cosmological questions,” said Eisenstein, a co-author of the new MNRAS papers. “But seizing this opportunity requires an ambitious new generation of numerical simulations. We believe AbacusSummit will be a bold step for the synergy between computation and experiment.”


The decade-long project was daunting. N-body calculations, which attempt to compute the motions of objects, such as planets, that interact gravitationally, have been a foremost challenge in physics since the days of Isaac Newton. The difficulty comes from each object interacting with every other object, no matter how far apart they are. That means that as you add more objects, the number of interactions grows rapidly.
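
To see how quickly the work piles up: N bodies produce N(N-1)/2 distinct pairs, so the pair count grows roughly as the square of the particle count. A quick illustration using the article's own figure of nearly 60 trillion particles:

```python
# N bodies interact in N * (N - 1) / 2 distinct pairs.
for n in (10, 1_000, 1_000_000, 60_000_000_000_000):
    print(f"{n:>17,} bodies -> {n * (n - 1) // 2:.3e} pairs")
```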

There is no general solution to the N-body problem for three or more massive bodies; the available calculations are only approximations. A common approach is to freeze time, calculate the total force acting on each object, and then nudge each one according to the net force it experiences. Time then moves forward slightly, and the process repeats.
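
A minimal sketch of that freeze-compute-nudge cycle, using a simple direct force sum and basic symplectic Euler stepping (production codes such as Abacus use far more sophisticated numerical methods; all names here are illustrative):

```python
import numpy as np

def net_accelerations(pos, mass, G=1.0, soft=1e-3):
    """Freeze time: total gravitational pull on each body from all others."""
    d = pos[None, :, :] - pos[:, None, :]          # pairwise separations
    r2 = np.sum(d * d, axis=-1) + soft**2          # softened distances
    np.fill_diagonal(r2, np.inf)                   # no self-interaction
    return G * np.sum(mass[None, :, None] * d / r2[:, :, None]**1.5, axis=1)

def step(pos, vel, mass, dt):
    """Nudge each body by the net force it feels, then advance time."""
    acc = net_accelerations(pos, mass)
    vel = vel + acc * dt
    pos = pos + vel * dt
    return pos, vel

rng = np.random.default_rng(0)
pos = rng.random((100, 3))
vel = np.zeros((100, 3))
mass = np.ones(100)
for _ in range(10):   # time moves forward a bit; the process repeats
    pos, vel = step(pos, vel, mass, dt=1e-3)
```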

Using this approach, AbacusSummit processed its enormous number of particles thanks to clever code, a new numerical method, and ample computing power. The Summit supercomputer was the world's fastest at the time the team ran its calculations; it is still the fastest computer in the United States.

The team designed its codebase, called Abacus, to take full advantage of Summit's parallel processing power, whereby many calculations can be performed simultaneously. In particular, Summit boasts many graphics processing units, or GPUs, which excel at parallel processing.

Running N-body calculations with parallel processing requires careful algorithm design, because an entire simulation requires a large amount of memory to store. That means Abacus cannot simply make copies of the simulation for different nodes of the supercomputer to work on. Instead, the code divides each simulation into a grid of cells. Initial calculations provide a fair approximation of the effect of distant particles at any given point in the simulation (distant particles play a much smaller role than nearby ones). Abacus then groups nearby cells and splits them off so that the computer can work on each group independently, combining the approximation of distant particles with precise calculations of the nearby ones.
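
Here is a toy one-dimensional sketch of that near/far split (purely illustrative; Abacus's actual far-field method is a high-order approximation, not the crude center-of-mass lumping used below): particles in nearby cells are summed exactly, while each distant cell is replaced by a single point mass.

```python
import numpy as np

G = 1.0

def cell_index(x, ncell):
    """Assign each particle on the unit interval to a grid cell."""
    return np.minimum((x * ncell).astype(int), ncell - 1)

def forces_near_far(pos, mass, ncell=8, near=1):
    """Toy near/far force split on a 1-D unit domain."""
    idx = cell_index(pos, ncell)
    # Total mass and center of mass per cell, for the far field.
    m_cell = np.bincount(idx, weights=mass, minlength=ncell)
    com = np.bincount(idx, weights=mass * pos, minlength=ncell)
    com = np.divide(com, m_cell, out=np.zeros(ncell), where=m_cell > 0)

    acc = np.zeros_like(pos)
    for i in range(len(pos)):
        for c in range(ncell):
            if abs(c - idx[i]) <= near:
                # Near field: exact pairwise sums over neighboring cells.
                for j in np.flatnonzero(idx == c):
                    if j == i:
                        continue
                    d = pos[j] - pos[i]
                    acc[i] += G * mass[j] * np.sign(d) / (d * d + 1e-6)
            elif m_cell[c] > 0:
                # Far field: lump the whole distant cell into one mass.
                d = com[c] - pos[i]
                acc[i] += G * m_cell[c] * np.sign(d) / (d * d + 1e-6)
    return acc

pos = np.random.rand(500)
mass = np.ones(500)
acc = forces_near_far(pos, mass)
```

Because each group of nearby cells only needs its own particles plus the precomputed far-field summary, the groups can be farmed out to different nodes independently.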


“The Abacus algorithm fits well with the capabilities of modern supercomputers, offering a highly regular computational pattern for the massive parallelism of GPU co-processors,” said Maksimova.

Thanks to its design, Abacus achieves very high speeds, updating 70 million particles per second per node of the Summit supercomputer, while also performing analysis of the simulations as they run. Each particle represents a clump of dark matter with 3 billion times the mass of the Sun.
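
A rough back-of-the-envelope reading of those figures, using only numbers quoted in this article plus a hypothetical node count:

```python
# Back-of-the-envelope using only figures quoted in this article:
total_particles = 60e12      # nearly 60 trillion across the suite
n_sims = 160                 # more than 160 simulations
rate_per_node = 70e6         # particles updated per second per node

per_sim = total_particles / n_sims
print(f"average particles per simulation: {per_sim:.2e}")

# Hypothetical node count, purely to show the scale of one update pass:
nodes = 1000
print(f"one pass over a simulation on {nodes} nodes: "
      f"~{per_sim / (rate_per_node * nodes):.0f} s")
```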

“Our vision was to create this code to provide the simulations needed for this specific new brand of galaxy survey,” Garrison said. “We wrote the code to make the simulations faster and more accurate than ever before.”

Eisenstein, a member of the DESI collaboration, which recently began its survey to map an unprecedented portion of the universe, says he is eager to use Abacus in the future.

“Cosmology is leaping forward because of the interdisciplinary amalgamation of amazing observations and modern computing,” he said. “The next decade promises to be an exciting age in our study of the sweeping history of the universe.”

Reference: “AbacusSummit: A Massive Set of High-Accuracy, High-Resolution N-Body Simulations” by Nina A. Maksimova, Lehman H. Garrison, Daniel J. Eisenstein, Boryana Hadzhiyska, Sownak Bose and Thomas P. Satterthwaite, 7 September 2021, Monthly Notices of the Royal Astronomical Society.
DOI: 10.1093/mnras/stab2484

Additional co-creators of Abacus and AbacusSummit include Sihan Yuan of Stanford University, Philip Pinto of the University of Arizona, Sownak Bose of Durham University in the UK, and Center for Astrophysics researchers Boryana Hadzhiyska, Thomas Satterthwaite and Douglas Ferrer. The simulations were run on the Summit supercomputer under an allocation from the Advanced Scientific Computing Research Leadership Computing Challenge.
