Messages from a long-lost universe

[Image © Tim Gravestock]

At first, there didn't seem anything earth-shattering about the tiny point of light that pricked the southern Californian sky on a mild night in early April 2007. Only the robotic eyes of the Nearby Supernova Factory, a project designed to spy out distant stellar explosions, spotted it from the Palomar Observatory, high in the hills between Los Angeles and San Diego.

The project's computers automatically forwarded the images to a data server to await analysis. The same routine kicks in scores of times each year when a far-off star in its death throes explodes onto the night sky, before fading back to obscurity once more.

But this one did not fade away. It got brighter. And brighter. That's when human eyes became alert.

The supernova finally reached its peak brightness after 77 days. After 200 days - long after most supernovae have dwindled back into obscurity - it was still burning brightly. Only in October 2008, an unprecedented 555 days after it was first spotted, had it faded enough for the supernova hunters to call off their observations.

Digesting what they had seen took longer still. SN 2007bi, as dry protocol labelled the event, was one of the most extreme explosions ever recorded, of that there was no doubt. It was so intense that it didn't fit any model of how normal stars die. But then, it was rapidly becoming clear that, in life as in death, this had been no normal star.

If the interpretation of what popped up that April night is correct, this was a star that should not have existed, in a place where it should never have been. It was a mind-bogglingly massive star that was a throwback to a universe long since gone. It was a star that time forgot.

That picture began to emerge only after long months of monitoring the supernova's afterglow with the Samuel Oschin Telescope, a 61-year-old veteran atop Mount Palomar. This afterglow is powered by the decay of heavy radioactive elements generated in the runaway processes of nuclear fusion that occur during the initial explosion. The critical process is the decay of radioactive nickel, which quickly turns to cobalt, which in turn decays to iron, radiating gamma rays as it does so. The brightness and persistence of the afterglow reveal how much of these elements the supernova produced.
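That decay chain can be sketched numerically. Below is a minimal Python sketch of the two-step nickel-to-cobalt-to-iron chain, using the standard half-lives of nickel-56 and cobalt-56 and approximate per-decay energies; these specific figures are not from the article, but they illustrate why a long-lived, bright afterglow points to a large initial nickel mass.

```python
import math

# Half-lives of the decay chain that powers a supernova afterglow
# (standard nuclear-data values; the chain itself is described above).
T_NI = 6.1    # days, nickel-56 -> cobalt-56
T_CO = 77.2   # days, cobalt-56 -> iron-56

LAM_NI = math.log(2) / T_NI
LAM_CO = math.log(2) / T_CO

def decay_power(t, n0=1.0):
    """Relative radioactive heating rate at time t (days) for an
    initial nickel abundance n0, from the two-step Bateman solution.
    Per-decay energies (MeV) are approximate illustrative values."""
    q_ni, q_co = 1.72, 3.60  # rough energy release per decay, MeV
    n_ni = n0 * math.exp(-LAM_NI * t)
    n_co = n0 * LAM_NI / (LAM_CO - LAM_NI) * (
        math.exp(-LAM_NI * t) - math.exp(-LAM_CO * t))
    return LAM_NI * n_ni * q_ni + LAM_CO * n_co * q_co

# Late in the light curve the nickel is gone, so the glow halves every
# ~77 days -- the cobalt half-life. The heating is also directly
# proportional to the initial nickel mass, which is why an afterglow
# ten times too bright demands ten times the usual nickel.
late_ratio = decay_power(300 + T_CO) / decay_power(300)
```

The key point of the sketch is the linearity: doubling the initial nickel doubles the heating at every epoch, so the brightness and duration of the tail translate directly into a nickel mass.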

[Image © New Scientist]
Plugging these observations into models of conventional supernovae brought a startling conclusion. To keep the supernova glowing that brightly, and for that long, the explosion must have produced 10 times more radioactive nickel than a normal supernova can muster - a discrepancy so huge that it demanded an explanation.

A clue to what was going on came in a few largely forgotten papers buried in journals from 40 years ago. In the core of any massive star, the outward pressure of photons created in nuclear fusion reactions counters the weight of the material bearing down on it, preventing the star from collapsing in on itself. Sometimes, in stars many times the mass of the sun, gravity can eventually overwhelm this photon pressure, initiating what is known as a core-collapse, or type II, supernova. That is one of two common types of supernova. The other, called type Ia, occurs when a dying white dwarf star accretes mass from a companion star and grows unstable, igniting in a final searing fireball.

In the old papers, astronomers speculated on what might happen to cause a truly giant star - one bigger than about 200 suns - to go supernova. In this case, they calculated, the core of the star could eventually become so hot during nuclear fusion that photons would start to convert spontaneously into pairs of particles: an electron and its antimatter doppelgänger, a positron. This would rob the star of some of the photon pressure needed to support its outer layers, causing it to collapse in on itself in a gargantuan supernova that would vaporise the star. This final titanic burst of fusion would create vast quantities of heavy radioactive elements, far more than a conventional supernova can produce. The astronomers called it a "pair-instability" supernova.
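The threshold behind this instability can be checked on the back of an envelope: a photon can only materialise into an electron-positron pair if it carries at least the rest-mass energy of both particles. A minimal sketch using standard physical constants (the specific numbers are not from the article):

```python
# Pair production needs a photon of at least twice the electron's
# rest-mass energy. Constants are standard reference values.
M_E_C2_MEV = 0.511          # electron rest-mass energy, MeV
K_B_MEV_PER_K = 8.617e-11   # Boltzmann constant, MeV per kelvin

threshold_mev = 2 * M_E_C2_MEV          # 1.022 MeV minimum photon energy
t_char = threshold_mev / K_B_MEV_PER_K  # temperature where kT matches it

# t_char comes out around 1.2e10 kelvin. In practice the instability
# sets in well below that, because the thermal photon spectrum has a
# high-energy tail: a core at a few billion kelvin already contains
# enough photons above 1 MeV to start sapping the pressure.
```

This is why only truly giant stars are vulnerable: only their cores get hot enough for an appreciable fraction of photons to cross that energy threshold.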

Implausible interloper

No supernova explosion answering to this description had ever been witnessed, and the idea remained a mere twinkling in the theorists' eyes. That is, it did until Avishay Gal-Yam, an astrophysicist at the Weizmann Institute of Science in Rehovot, Israel, and his collaborators started looking at SN 2007bi. The more they compared the data with various supernova models, the more they became convinced that the pair-instability model was the answer to the conundrum this explosion posed. "Only a pair-instability supernova can produce that much radioactive nickel," says Gal-Yam. With the model, they could even calculate how big the exploding star had been: a whopping 300 times the mass of the sun (Nature, vol 462, p 624).

Problem solved? Not a bit of it. The finding came with a massive sting in its tail: according to all our theories and all our observations, stars that big simply should not exist.

At least, they should not exist in the kind of universe we see around us today. In the decades since the pair-instability model was born, theory and some comprehensive sweeps of the night sky have combined to show that the composition of the modern cosmos prevents stars reaching such huge sizes. The presence of appreciable quantities of what astronomers call metals - elements heavier than hydrogen and helium - causes gas clouds to collapse speedily into "pocket-sized" stars. That is why most stars today are celestial minnows, containing less mass than our sun. The absolute upper limit on a modern star, theory and observations of our galaxy agree, lies at about 150 solar masses. A monster of 300 solar masses is an implausible interloper into this settled scene.

Things were different in early cosmic times, some 13 billion years ago in the pristine universe immediately after the big bang. Back then, solar giants ruled the roost. Only hydrogen, helium and trace amounts of lithium were floating around the cosmos, and much bigger quantities of these elements had to accumulate before they fell under the spell of gravity and were pulled together to form a star. As a result, the first stars in the universe were humongous, containing anything up to several hundred solar masses.

Fossil universes

Existing before proper galaxies had been able to form, these stars lived brief, wild lives of just a few million years as they furiously burned their vast stocks of hydrogen. Yet in their violent deaths, these stars were of huge significance. As theory has it, these explosions fused the first elements heavier than hydrogen, helium and lithium. They provided the raw materials for the cosmos we see today: its galaxies, its sun-like stars, its planets and, in one insignificant corner at least, its life.

No one has ever seen these cosmic giants directly. We would dearly love to, if only to confirm the grounds for our own existence. Unfortunately, we can't. Even as they were sowing the seeds of the future cosmos, these megastars were precipitating their own demise. By increasing the metal content of the cosmos as they died, they destroyed the very conditions that nurtured them in the first place. By the end of the first few hundred million years after the big bang, metal levels were so high that stars of their like could never form again. Direct evidence for the existence of megastars lies far beyond the horizon of even our most powerful telescopes.

Or does it? If SN 2007bi is what it seems, we might have found a get-out clause: a loophole that allows us to spy if not the first megastars, then something very similar. Against the odds, the cosmic trailblazers may have lived on into the modern universe. But how?

The secret lies in where this supernova was situated: an otherwise unassuming dwarf galaxy some 1.6 billion light years away from Earth. Dwarf galaxies, as their name suggests, are runtish structures that never made it to full size. Whereas a fully formed galaxy such as our own Milky Way contains several hundred billion stars, a dwarf galaxy can have as few as a couple of million.

Observations of the distant universe show that dwarf galaxies were once much more prevalent. "We know that the first galaxies to form were dwarfs," says Nils Bergvall of the Uppsala Observatory in Sweden. The idea is that these were the basic blocks that built up to form the much larger galaxies of today.

We also know that dwarf galaxies, even those relatively nearby which we can see as they were in comparatively recent cosmic time, have just 5 to 10 per cent of the metals that our sun has - or markedly less than one-hundredth of their total mass. The earliest dwarf galaxies may have had even less.
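The arithmetic behind that comparison can be checked directly, assuming a commonly quoted solar metal mass fraction of about 0.014 (a figure not given in the article):

```python
# Metals make up roughly 1.4 per cent of the sun's mass (an assumed,
# commonly used value). Dwarf galaxies with 5-10 per cent of the
# sun's metal abundance are therefore far below one part in a
# hundred metals by mass.
Z_SUN = 0.014
low, high = 0.05 * Z_SUN, 0.10 * Z_SUN  # 5-10 per cent of solar
```

Even at the high end, the metal fraction is well under a tenth of a per cent - the near-pristine conditions the article describes.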

We have been slow to grasp the implication: that the tiniest dwarf galaxies could be pristine chunks of the early universe, preserving its composition and conditions in a cosmos that has long since moved on. Their degree of preservation could be the result of their sheer dwarfishness: because gravity within them is weaker than within a normal galaxy, a supernova exploding within one will fling the metal-rich products outwards at such speed that they mostly escape altogether.

If the original conditions of the universe were preserved in these dwarf galaxies, there would be no reason why further waves of megastars should not continually form and die within them throughout cosmic time. If it is the absence of metals that determines stellar size, behemoth stars are not restricted to the furthest reaches of the universe: they could be found in any dwarf galaxy with a low enough metal content, including places well within reach of our telescopes. It is a line of reasoning that the identification of SN 2007bi now seems to support in spectacular fashion.

The discovery of a nearby population of megastars in what amounts to suspended animation would have huge implications for stellar science. We do not understand the processes of star formation and death as well as we would like to think. "It is surprisingly difficult to get the models to agree with the observations," says Gal-Yam. He cites the example of gold, the abundance of which in the universe essentially defies explanation, although most astronomers assume it must somehow be made in supernovae. To find some answers, we might need to look no further than nearby dwarf galaxies.

But wait a moment. If these huge living fossils have always been lurking on our cosmic doorsteps, how come we have not seen them before now? Stars that big would surely be hard to overlook, either during their tempestuous lives or spectacular deaths. Yet apart from one peculiarly luminous supernova in 1999, we have never seen anything that looks like SN 2007bi.

Part of the explanation, says Alexei Filippenko of the University of California, Berkeley, is that we have been looking in the wrong places. "Telescope time is precious, and in a pathetic dwarf galaxy there are not that many stars, so not that many opportunities for one to go supernova," he says. Astronomers have understandably focused their attention on the big galaxies that are richly stocked with stars.

Tantalising glimpse

That is now changing as fast robotic sky searches, such as the Palomar Transient Factory based at the observatory that first spotted SN 2007bi, swing into action. Such projects make no judgement about where best to look; they just keep their electronic eyes open for anything that is changing in the sky. This new strategy is already bearing fruit. "We are now tracking a number of supernovae that could also turn out to be pair-instability supernovae. But we want to be absolutely certain before we announce," says Filippenko.

Direct observations of any living megastars lurking out there are trickier. Giant stars with their huge stocks of hydrogen and helium fuel would be so hot that most of their energy would be emitted as ultraviolet light, which is absorbed by Earth's atmosphere before it reaches ground-based telescopes. "Without seeing the ultraviolet, these stars will just hide away and look like ordinary high-mass stars," says Gal-Yam.

Because astronomers have traditionally believed that there is little of interest to see at ultraviolet wavelengths, there are no general-purpose ultraviolet space telescopes, either. The Hubble Space Telescope can see at these wavelengths, but the kind of painstaking programme needed to map relatively nearby dwarf galaxies would mean tying it up for thousands of hours of observation time. Gal-Yam has just submitted a proposal to do just that, but he is competing against about 40 other projects.

Hubble was serviced for the final time last year, and attention is now switching to its replacement, NASA's James Webb Space Telescope, which is scheduled for launch in 2014. But this telescope has no ultraviolet capability. "Once Hubble is gone, we are going to be totally blind," says Gal-Yam. "There is an urgency about doing this work."

As it stands, that April supernova could have been a tantalising and wholly unexpected glimpse into a universe we thought we would never see, that of the first stars, the cosmos makers. That would be an explosion to truly blow our minds.