At the Very Beginning

An image of the birth of a star

We’re all familiar with a typical telescope and how it works: you look into one end and see something far away in more detail than the naked eye allows – often, the objects are so distant you couldn’t see them at all without a powerful telescope. These are optical telescopes; we can see what’s on the other end because we can see light waves. But when cosmologists, scientists who study the formation and development of the universe, survey the sky, they need a different tool. They use radio signals to explore outer space. A radio telescope sends researchers a different kind of picture, one made of radio waves. These waves come from hydrogen atoms scattered across the universe, and they can paint just as vivid a picture for a cosmologist as light can.

These radio waves tell us a lot about the history of the sky. The HERA Telescope team, led by UC Berkeley’s Joshua Dillon, used ACCESS supercomputing resources to help analyze all that radio wave data. With the power of the Pittsburgh Supercomputing Center’s (PSC) Bridges-2, they’ve been able to clean up the data they receive and test a number of hypotheses about the beginnings of the universe. As it turns out, when you point a radio telescope at the sky, you get a flood of data that needs to be analyzed – far too much for a desktop computer to work through.

… our work with Bridges-2 … fell into two broad categories. One was validation – trying to simulate data that looks like our data but [in which] we fully understand what’s going on … The other is what we call parameter inference – looking at the measurements we’ve made given some hypothesis’s predictions … This required a fair amount of high-performance computing … Having big systems like Bridges-2 available is very useful.

Joshua Dillon, UC Berkeley
The HERA radio telescope array in South Africa

The ways in which supercomputers can be deployed to help researchers continue to multiply. With the power of cyberinfrastructure, scientists like Joshua Dillon can make the most of their research by finding innovative ways to incorporate supercomputers into their work. In this case, the parallel processing power of Bridges-2 let the team run enough simulations to isolate the data that mattered most.

Another perspective

With an event as big as the Big Bang, there are a number of nuanced and exciting ways to study it. For example, the ACCESS resource Anvil helped researchers study the Big Bang from a completely different angle: quark-gluon plasma. This plasma was created by the Big Bang, but it lasted only fractions of a second before it began converting into the very building blocks of matter – the protons and neutrons that make up everything around us. Chun Shen, a theoretical physicist at Wayne State University, worked with the JETSCAPE team to run Monte Carlo simulations of Big Bang-like events on Anvil.

“We are trying to understand the property of nuclear matter under extremely hot and extremely dense conditions,” says Shen. “These conditions occurred a few microseconds after the Big Bang and can be recreated in a particle accelerator by colliding heavy nuclei at near speed of light. What we have found on the experimental side is that there is a new phase of matter called quark-gluon plasma created in this type of collision, but the lifetime of this is really short – it’s in terms of 10 to the minus 23 or 24 seconds. It’s very, very, very short-lived. So we have to rely on model simulations in combination with the experimental measurements to tell us what actually happened during this short amount of time.”

With their allocation, Shen and his team were able to make significant progress on their research. “The large amount of simulations that we run on Anvil,” says Shen, “are to train efficient model emulators, because the models we developed are sophisticated, and they run very slowly. They are very inefficient in exploring a high-dimensional parameter space. We have about 20 model parameters related to the characterizing properties of the matter, and if I want to constrain these 20 parameters simultaneously with experimental data, it’s a 20-dimensional parameter space, which is very big. And if you have a very slow model, it’s hard to explore such a big space. So we train the model emulator using Gaussian processes, and then the emulator can run very fast.”
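The emulator idea Shen describes can be sketched in miniature. The toy example below is hypothetical – it is not the JETSCAPE code, and the “slow model” is just a stand-in function – but it shows the same pattern: run an expensive model at a handful of parameter settings, fit a Gaussian-process-style surrogate (here, the noise-free GP posterior mean with an RBF kernel), and then evaluate the cheap surrogate instead of the slow model.

```python
import numpy as np

def slow_model(x):
    # Stand-in for an expensive physics simulation (toy 1-D function).
    return np.sin(3.0 * x) + 0.5 * x

def rbf_kernel(a, b, length=0.5):
    # Squared-exponential (RBF) kernel between two sets of 1-D points.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

# "Expensive" training runs at a handful of parameter settings.
x_train = np.linspace(0.0, 2.0, 15)
y_train = slow_model(x_train)

# Fit: solve K alpha = y once (a small jitter keeps K well-conditioned).
K = rbf_kernel(x_train, x_train) + 1e-6 * np.eye(len(x_train))
alpha = np.linalg.solve(K, y_train)

def emulator(x_new):
    # Fast surrogate: GP posterior mean, a kernel-weighted
    # combination of the stored training runs.
    return rbf_kernel(np.atleast_1d(x_new), x_train) @ alpha

# Later evaluations are cheap matrix-vector products, not full simulations.
y_fast = emulator(np.array([0.3, 1.1, 1.9]))
```

In the real problem the inputs live in a roughly 20-dimensional parameter space and the emulator is trained on many full simulation runs, but the payoff is the same: once `alpha` is computed, each surrogate evaluation costs a small linear-algebra step rather than a slow simulation.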

If you’re a researcher who wants to explore how supercomputers can help you, visit the ACCESS Allocations page to learn more.

You can read more about these stories here:
HERA Telescope Team Uses Bridges-2 for Critical Measurement of Early Universe 

Purdue’s Anvil supercomputer helps researchers look at the origins of the universe


Project Details

Resource Provider Institutions: Pittsburgh Supercomputing Center (PSC), Rosen Center for Advanced Computing (RCAC)
Affiliations: For PSC: HERA Team, UC Berkeley; For RCAC: Wayne State University, JETSCAPE
Funding Agency: NSF

The science story featured here was enabled by the ACCESS program, which is supported by National Science Foundation grants #2138259, #2138286, #2138307, #2137603, and #2138296.
