Lay report on the
Discussion Meeting:
The search for dark
matter and dark energy in the Universe
Only 5 % of the
Universe is made up of the conventional kind of mass and energy that we can
observe. Of the rest, one third is described as dark matter and thought to
consist of strange particles which are extremely difficult to detect. Many
different kinds of dark matter particles have been postulated and experiments
set up to find them, but as yet, none have been found. The remaining two thirds
are required to account for the accelerating expansion of the Universe and
mystify researchers even more. This missing dark energy could be an invariable
property of space (cosmological constant), or a more dynamic effect (e.g.
quintessence). Current instrumentation does not allow researchers to determine
the right answer, but future experiments might.
“I know nothing except the fact of my ignorance,” said Socrates. Some 2400 years later, scientists studying the entirety of the world we live in, i.e. the physical composition of the Universe, are ready to compete with the philosopher in modest proclamations of their ignorance. They can even put a number on it. Their ignorance amounts to a whopping 95%. In other words, 5% of the Universe consists of matter as we know it, of atoms that make up stars, elephants and kitchen tables. As for the remaining 95%, we don’t know what it is made of. Cosmologists have a range of colourful theories about what it might turn out to be, but as yet they don’t have solid evidence to prove or disprove any of them. As Sir Martin Rees, the Astronomer Royal, said at the Royal Society meeting on dark matter and dark energy, “any cosmologist ought to express embarrassment, because 95% of the Universe is not accounted for at all.”
But how did the
researchers arrive at this stunning dilemma? Essentially, it all comes from
different kinds of observations, which determine the mass contained in the
Universe in different ways and arrive at wildly different results.
Imagine you wanted to know the number of people in a city and counted their
heads. As a control, another researcher would count their legs. If the
data tells you
there are ten times more heads than legs in the city and you’ve double-checked
the figures, you would have to question your fundamental assumptions that people
have one head and two legs, and start considering exotic possibilities instead,
such as legless or multi-headed people. This is analogous to what happened when
astrophysicists tried to sum up the mass contained in the Universe.
The first discrepancy
they noticed is what we now call dark matter. It arises essentially from our
understanding of the nuclear fusion events by which heavier elements formed in
the early days of the Universe. From the observed distribution of elements and
isotopes, cosmologists can calculate an upper limit on the amount of
ordinary, baryonic matter (baryons are “heavy” particles,
including the protons and neutrons in ordinary atoms). The trouble is, the
gravity of this total baryonic mass is not nearly enough to hold together the
galaxies that we can observe. Hence the postulate of dark matter, which
contributes many times more mass to each galaxy than the stars whose luminosity
we see (and the planets and brown dwarfs whose presence we infer). Recent
measurements of the Cosmic Microwave Background (CMB, a kind of radiation left
over from the Big Bang and permeating the Universe everywhere) have confirmed
the isotope results, and thus the need to find the missing kind of matter
(Richard Bond, Toronto).
Even though the
precise nature of dark matter remains mysterious, most researchers assume that
its particles move slowly (have low energy) and therefore call it “cold” dark matter. Reviewing the
particles which might be behind this phenomenon, Rocky Kolb (Fermilab) listed a
number of candidates including both cold and the currently less fashionable
warmer varieties of dark matter. Some of the candidates are well known
particles, like the neutrinos which were first postulated by Wolfgang Pauli, but
have only recently been shown to possess a small mass (Dave Wark, Rutherford
Appleton Lab.). Many others have been conjured up more recently with the
explicit goal of accounting for the missing mass in our galaxies.
The difficult part is
to detect the particles experimentally. By definition, being “dark” means they
are not going to show up in any telescopes and will only very weakly, if at all,
interact with ordinary matter. For one promising kind of dark matter candidate,
the WIMPs (Weakly Interacting Massive Particles), several kinds of
detectors are already in development or in use. Like neutrino detectors,
they aim to detect the very rare “side effects” produced by such particles passing
through a large amount of ordinary matter. Cryogenic WIMP detectors such as the
CRESST (Cryogenic Rare Event Search with Superconducting Thermometers)
experiment (Hans Kraus, Oxford) have been used since the mid-1990s. They operate
at only a few thousandths of a Kelvin above absolute zero and try to pick up the
minute energy released if a WIMP hits an atomic nucleus. To exclude disturbance
from cosmic radiation, the experiments are typically conducted underground.
CRESST, for instance, is located in a tunnel under the Gran Sasso massif in the
Italian Apennines. There are also competing, non-cryogenic approaches to finding WIMPs
relying on scintillation or ionisation events in a variety of materials (Peter
Smith, Rutherford Appleton Lab.). So far, these detectors have not found a
single WIMP, but eventually they will allow researchers to close in on them
until their existence can be confirmed or ruled out.
Other researchers
point their detectors at other candidate particles. One of the longest-serving
hypothetical candidates is the axion, which was originally postulated by
theoretical physicists in the 1970s to solve a symmetry problem in the strong
interaction (the so-called strong CP problem). It was later realised that a halo of axions around our galaxy would
also fix the dark matter problem (Karl van Bibber, Lawrence Livermore National
Laboratory). By definition, an axion would very rarely interact with ordinary
matter in a way that would make it observable. However, an experimental setup
known as a microwave trap, involving a resonant cavity, temperatures near absolute
zero, and a very strong magnet, should in principle be able to convert an axion
into a microwave photon which could then be detected. A variety of detectors
have been built based on this principle, including the Axion Dark-Matter
Experiment (ADMX) at the Lawrence Livermore Laboratory. With these, researchers
are scanning the range of possible properties that an axion might have with
increasing sensitivity, but so far the existence of this particle could be
neither confirmed nor ruled out.
It is comforting then
that there is some experimental evidence of the dark matter halo around
galaxies. It comes from the observation of gravitational lensing: the
gravitational field of large structures, such as galaxies, can bend the path of
light so that an object that would normally be hidden from our view behind
such a galaxy appears as several images on different sides of it at the same time (Peter
Schneider, Bonn, Germany). Essentially, such observations confirm the existence
of the dark matter halo. Recently, predicted details like the cosmic shear (a
slight elliptical distortion of the images of distant background objects) have also
been confirmed. While the interpretation of the data is made complicated by the
fact that gravitational lenses are irregular in their shape, theoretical
simulations can help to understand the probable shapes and sizes of galactic
halos (Julio Navarro, Victoria, Canada).
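The scale of this bending can be put in numbers. For a compact mass M and light passing at a distance b from it, general relativity gives the standard deflection angle (a textbook result, not one quoted at the meeting):

```latex
\hat{\alpha} = \frac{4\,G M}{c^{2}\, b}
```

Because the observed image positions fix the deflection angle, the lensing mass, dark matter included, can be weighed directly.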
But just as
researchers had become comfortable with the assumption that the majority of our
galaxy (and of the Universe at large) is made up of cold dark matter, the second
major deficit in the balance sheets turned up, which can be explained neither by
the luminous ordinary matter, nor the gravitationally active dark matter.
Observations of distant supernovae have confirmed that the Universe is expanding
at an accelerating rate, which requires a total mass/energy content almost three times
the combined amounts of ordinary and dark matter (Saul Perlmutter,
Berkeley, California). Considering the behaviour of the Universe as a whole, the
balance sheet now looks as follows:
5 % ordinary matter
30 % dark matter (WIMPs, axions, or some other strange
particles)
65 % some kind of unclustered energy
The latter has been
dubbed “dark energy”. Together with the dark matter it adds up to the 95% of the
Universe we know nothing about.*
There are various ways
of accounting for the missing 65 % of unclustered energy. The simplest one would
be to assign an energy to space itself. This is known as vacuum energy or the
cosmological constant. (Albert Einstein introduced it to reconcile relativity
with his view of a cosmos that does not change over time.) This constant would
have been unchanged since the beginning of the Universe. Thus, as the Universe
expands and the density of baryonic mass decreases, it becomes a more and more
important part of the balance sheet. This is exactly what troubles physicists.
At earlier times in cosmic history, the cosmological constant would have been a
negligible factor in the total mass/energy density, many orders of magnitude
smaller than the baryonic and cold dark matter. Given the wide range of
possibilities in astronomical measures of energy, space and time, it is a rather
strange coincidence that the vacuum energy should become an important factor
just at the very moment when our civilisation has become sufficiently advanced
to observe it.
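The dilution argument can be made concrete with a toy calculation (the densities below are illustrative round numbers in arbitrary units, not measured values): matter thins out as the cube of the cosmic scale factor, while a cosmological constant stays fixed, so the constant term inevitably wins in the end.

```python
# Toy sketch: why a constant vacuum energy eventually dominates over matter.
# Densities are in arbitrary units and purely illustrative.

def matter_density(a, rho_m_today=0.35):
    """Matter (ordinary + dark) dilutes as the cube of the scale factor a
    (a = 1 today, a < 1 in the past)."""
    return rho_m_today / a ** 3

def vacuum_density(a, rho_lambda=0.65):
    """A cosmological constant does not dilute as space expands."""
    return rho_lambda

for a in (0.01, 0.1, 1.0):
    ratio = vacuum_density(a) / matter_density(a)
    print(f"scale factor {a:5.2f}: vacuum/matter ratio = {ratio:.2e}")
```

At one hundredth of today's size the vacuum term is utterly negligible; today it is the larger term, hence the "why now?" coincidence described above.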
Alternative
explanations have been offered, including the quintessence model (Paul
Steinhardt, Princeton). This model replaces the constant vacuum energy with a
parameter that changed gradually during the evolution of the Universe.
Representing a kind of dynamically changing “negative gravity”, quintessence
comes with some counter-intuitive concepts, including negative pressures, but it
has the pragmatic advantage that it can be easily tuned to fit all the data
available.
So what’s the future
for our Universe?
While there are a
number of ways to explain the history and current state of the Universe, the
models diverge sharply in their predictions of the future. If the dark energy is
a cosmological constant, the Universe will continue to expand at an ever
accelerating rate. We will not be around to witness this, but we can appreciate
that such a development would be bad news for life in the Universe, as matter
and energy will be too thinly spread to support life anywhere. With quintessence
accounting for the dark energy, the expansion will proceed a little more slowly,
at a more or less constant speed, and it may or may not level off at some point.
So how can we find out
what future the mysterious dark energy is driving us towards? With current
instruments we can’t, but there is hope that some of the cosmological experiments in
the pipeline could help to clarify this issue. Some small hint could already be
contained in the data from the Microwave Anisotropy Probe (MAP)*. But the biggest hopes rest on the
observations of distant supernovae with new instruments, including a new
orbit-based telescope known as Supernova Acceleration Probe (SNAP) and the
ground-based Large Aperture Synoptic Survey Telescope.
It is clear that both
the dark matter and the dark energy field are driving instrumental development
forward, as the current tools simply don’t allow us to make sense of most of the
Universe. Thus the field is characterised more by theories and profound
questions than by definitive answers, and the recent meeting at the Royal
Society was no exception, in that no new, definitive answers were presented. Its
strength, says George Kalmus, one of the organisers, “was the synthesis of
recent data in order to try to understand what we think we know and what we know
we don’t know.” This, again, sounds
a lot like Socrates.
* Data of the
Wilkinson Microwave Anisotropy Probe (WMAP) released on 11.02.2003 allowed
researchers to refine this estimate as follows: 4 % normal matter, 23 % dark
matter, and 73 % dark energy.
When theoreticians get
stuck, they sometimes resort to the emergency exit of postulating a new particle
or substance which cannot be observed by current methods. Sometimes it turns out
to be non-existent (like, most famously, phlogiston), but sometimes it can be
tracked down decades later, like the neutrino reluctantly introduced by Wolfgang
Pauli in 1930. Similarly, the dark matter problem has generated a variety of
hypothetical particles and on top of that adopted a few more, including Pauli’s
neutrino, to fix that giant hole in the Universe. Here is a short overview of
past and present candidate explanations:
There is no dark matter, and the
observations are explained by Modified Newtonian Dynamics
(MOND)
This school of
thinking (dismissed by most researchers in the field) claims that Newton’s law
of gravitation might be different over very long distances. While the MOND
theory can explain a few observations at the cost of replacing Newton’s law with
a less elegant equation, it does not consistently explain the entire
Universe.
The dark matter is ordinary (baryonic) matter hidden in dark objects
In the early days of
dark matter research, it was considered possible that the missing mass might
consist of ordinary matter after all, only hidden in unexpected places, such as
dwarf stars, planets, or gas clouds. By now, however, a wealth of results
including isotope abundance, gravitational lensing, and the microwave background
has proven unequivocally that the dark matter must be in non-baryonic particles.
Most crucially, baryonic matter would not be able to account for the large scale
structures (galaxies upwards) we see in the Universe.
Hot dark matter, e.g. neutrinos
After many years of
uncertainty, researchers have recently shown that neutrinos have a small mass.
Thus they could contribute to the dark matter, especially if there are other
kinds of neutrinos apart from the three detectable ones. In this context,
they are known as “hot dark matter”. So far, however, dark matter that is
predominantly hot cannot explain the observed structure of the
Universe.
Warm dark matter,
e.g. gravitinos, which arise from the theory of supersymmetry as partners of the
graviton
Currently thought to
be among the less promising candidates.
WIMPs (Weakly
Interacting Massive Particles)
These represent an
entire class of possible constituents of cold dark matter, including neutralinos
(postulated from supersymmetry theory). Several facilities have been built with
the sole goal of finding them.
Axions
First postulated to
plug a gap in the Standard Model of particle physics, axions entered the dark
matter field when simulations predicted that their overall abundance in the
Universe should be very high if their mass is small enough. There is also the
idea of large-scale axion clusters, which may be distributed over light years of
space.
Thermal relic cold dark matter
Without the need to
specify which kind of particle it refers to, this theory assumes that the
present non-interacting cold matter was once in thermal equilibrium with the
rest of the Universe. At one point, it decoupled and went its separate way. This
theory has proven useful for predictions even though it does not tell us what
particles the dark matter is made of.
There are many more
candidate ideas for dark matter, and they come in all shapes and sizes. As Rocky
Kolb pointed out ironically, the mass has been “pinned down to within 65 orders
of magnitude”, and the interaction characteristics range from absolutely
non-interacting (apart from gravitation) through to strongly
interacting.
How can researchers
make sense of all this and pick the right explanation from such a large range of
possibilities? There are some who hope that fundamental physics will one day
come up with a Theory of Everything, which will, by definition, explain the
entire Universe including dark matter and dark energy. More likely, the solution
will come from a lot more observations and experiments, and a lot of hard work.
Astronomical
observations allow us to observe the past, because the light from distant stars
takes millions of years before it arrives in our telescopes. Moreover, the
distribution of matter and radiation in the Universe allows researchers to
retrace its evolution and to describe in reasonable detail what happened between
the Big Bang and now. There is, however, no window into the future of the
Universe, and at the moment “what happens next” appears more uncertain than
ever. The problem is that the future of the Universe depends very sensitively on
the nature of the dark energy.
The previous
hypothesis that the Universe is dominated by gravitating matter (whether
dark or luminous) and will, after a period of expansion, collapse
back in a Big Crunch was shattered by relatively simple astronomical
observations of type Ia supernovae. Essentially, these supernovae
can serve as beacons because they are all extremely similar in their intrinsic brightness
and spectral characteristics. Thus, from the apparent magnitude with which a
distant supernova is observed, astronomers can deduce its distance in time and
space. From the wavelength shift towards the red end of the spectrum that the
light has suffered in transit they can calculate how much the Universe has
expanded during that time.
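The reasoning can be sketched in a few lines of code. This assumes the standard distance-modulus relation and a typical type Ia peak absolute magnitude of about -19.3 (a textbook value; the apparent magnitude and redshift below are invented for illustration):

```python
def luminosity_distance_parsec(apparent_mag, absolute_mag=-19.3):
    """Distance modulus m - M = 5 * log10(d / 10 pc), solved for d in parsecs."""
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

def expansion_factor(redshift):
    """Since emission, the Universe has stretched by a factor 1 + z."""
    return 1.0 + redshift

# A hypothetical supernova observed at magnitude 24 with redshift 0.5:
d = luminosity_distance_parsec(24.0)
print(f"distance ~ {d:.2e} parsec; the Universe has expanded "
      f"{expansion_factor(0.5):.1f}-fold since the light left it")
```

Comparing the distance implied by the dimming with the expansion implied by the redshift, across many supernovae, is what traces out the expansion history.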
This is a surprisingly
simple way of monitoring the expansion history of the Universe. Saul Perlmutter
and his colleagues used this approach in the mid-1990s, expecting to find out
how fast the expansion of the Universe slows down. It took only a few dozen
supernovae to convince them that their fundamental assumption was wrong and to
prove that the expansion rate of the Universe is currently accelerating.
Combined with the
current knowledge of ordinary and dark matter, and the cosmic microwave
background, this insight leads directly to the conclusion that two thirds of the
mass/energy content of the Universe must be tearing things apart instead of
pulling them together, like gravitating matter would. It does not tell us yet
what the nature of this dark energy is, and whether it is invariable
(cosmological constant) or changes over time (e.g. quintessence). If it is a
constant vacuum energy, there is the dilemma that it would have to be many
orders of magnitude different from what particle physicists predict it to be,
and the intriguing coincidence that a parameter set at the birth of the Universe
to a value insignificantly small at that time should begin dominating cosmology
exactly at the time when we turn up to see it happening. On the other hand,
allowing the dark energy to change over time throws up even more questions and
new variables that we know nothing about. Can supernova observations be improved
to reveal more about the dark energy mystery?
At present, data from
supernovae in the significant distance range are building up quickly. Soon there
will be hundreds of them and at that point, the information to be gained is no
longer limited by the statistical error, but by the risk of systematic error.
There are three kinds of significant errors that need to be controlled: Firstly,
comparing supernovae from different epochs in the history of the Universe
carries the risk that the average properties of these events might have shifted
over time. For instance, the age of the star at explosion, or the elemental
composition might be different in an average modern supernova, if compared to
average early supernovae. Luckily, the spectroscopic analysis of the light
obtained from the supernovae allows the observer to compile a complete picture
of their physical properties. By making sure that they have large numbers of
well-characterised supernovae and then comparing only like with like,
astronomers can rule out any adverse effect of time
differences.
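The shift from statistical to systematic limitation can be illustrated with a toy error budget (the scatter and floor values are invented for illustration): the statistical error shrinks with the square root of the sample size, while the systematic floor does not.

```python
import math

def total_error(n_supernovae, per_object_scatter=0.15, systematic_floor=0.02):
    """Combine in quadrature a statistical error that shrinks as 1/sqrt(N)
    with a fixed systematic floor (all values in magnitudes, illustrative)."""
    statistical = per_object_scatter / math.sqrt(n_supernovae)
    return math.hypot(statistical, systematic_floor)

for n in (10, 100, 1000):
    print(f"{n:5d} supernovae: total error ~ {total_error(n):.3f} mag")
```

Beyond a few hundred objects the fixed floor dominates, which is why the effort moves to controlling systematics rather than simply collecting more supernovae.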
Secondly, interstellar
dust in the light path might distort the observation of distant events. To rule
this out, observers must perform control experiments, for example checking for
scattering of the light from nearby objects in other parts of the spectrum,
particularly for X-ray sources. Lastly, very distant events might be amplified
by gravitational lensing, so a careful observation of the environment is
required to account for this possibility.
In the short term,
researchers are addressing these issues with a large-scale project called the
Nearby Supernova Factory, which will record complete high precision datasets of
hundreds of supernovae every year. To look further back in time and be able to
monitor the time when the Universe switched from slowing down to accelerating
(i.e. when dark energy gained the upper hand over gravitating matter),
researchers are planning to launch a space-based telescope and spectrometer
dedicated to this task, called SNAP (SuperNova Acceleration Probe). This
probe, used in collaboration by many research groups mainly based in the US and
France, will monitor thousands of supernovae with unprecedented detail.
Selecting suitable
subsets from this large number of supernovae, researchers will be able to reduce
the error bars significantly and thereby narrow down the range of possibilities
currently open to the nature of the dark energy and the fate of our Universe. In
combination with continuing progress in other experimental observations
including the probing of the microwave background, mass density, and
gravitational lensing, they might even approach definitive answers to
cosmology’s most puzzling mysteries.
Further Reading:
Ostriker, J.P. and Steinhardt, P.J. The Quintessential Universe. Scientific American, January 2001, page 36. (This article is part of a special issue on cosmology.)
Brumfiel, G. Cosmology gets real. Nature 422, 108-110, 13.3.2003. (News Feature)
Fukugita, M. The dark side. Nature 422, 489-491, 3.4.2003. (News & Views Feature)