
Ancient white dwarf explosions seem to disobey long-established "rules"

An example of a Type Ia supernova, named G299 (Image: NASA/CXC/U.Texas)

White dwarf stars have long been thought to follow very specific "rules" – if they're over a certain mass limit, they'll explode in a supernova with a very predictable brightness and duration. But now Caltech astronomers have found a strange twist to the pattern that they can't explain. According to a new study, white dwarfs used to explode at lower masses than they do today.

White dwarfs are one of the last stages of life for stars like our Sun. After these stars exhaust their fuel supply, they shed their outer layers into a planetary nebula and leave behind a small, dim core – a white dwarf. In some cases, these remnants can also go on to explode later, and it's long been thought that there was a clear-cut mass limit, above which white dwarfs would explode. In 1930, Indian astrophysicist Subrahmanyan Chandrasekhar determined that limit to be 1.4 solar masses. The idea has generally held up over the years, but there are some exceptions.

Now, a team of Caltech astronomers has investigated the Chandrasekhar limit and found an unexpected pattern: in the early universe, white dwarfs more often exploded at smaller masses, with explosions shifting toward higher masses as time went by.

The team made this discovery using the Keck II telescope in Hawaii. They started by looking at ancient galaxies – those that had apparently stopped producing stars around a billion years after the beginning of the universe. The stars in these galaxies seemed to have far less nickel in them than usual.

What does that have to do with anything? When white dwarfs explode, heavy elements like nickel and iron are forged in the explosion and go on to seed future stars with these elements. Lower concentrations of nickel suggest that the white dwarfs that preceded these stars exploded at lower masses – around that of the Sun, so quite a bit lower than the Chandrasekhar limit.

"We found that, in the early universe, white dwarfs were exploding at lower masses than later in the universe's lifetime," says Evan Kirby, lead researcher on the study. "It's still unclear what has driven this change."

It's important to properly understand why this may be, since white dwarfs that explode are key to our understanding of the cosmos. However they happen, white dwarf supernovae are remarkably consistent, always reaching the same maximum brightness and lasting the same amount of time before fading. These events, known as Type Ia supernovae, are so uniform that they're often called standard candles and can be used to measure distances.

"We call Type Ia supernovae 'standardizable candles'," says Kirby. "If you look at a candle at a distance, it will look dimmer than when it's up close. If you know how bright it is supposed to be up close, and you measure how bright it is at a distance, you can calculate that distance," says Kirby. "Type Ia supernovae have been very useful in calculating things like the rate of expansion of the universe. We use them all the time in cosmology. So, it's important to understand where they come from and characterize the white dwarfs that generate these explosions."

Next up, the team plans to study the concentrations of other elements, like manganese, to back up the findings.

The research was published in the Astrophysical Journal.

Source: Caltech
