I was quite excited to see that IOP PhysicsWorld published an article today on dark matter decaying into dark energy. The article discusses a recently accepted paper by Salvatelli et al. in Physical Review Letters.
The gist of this recent PRL paper by Salvatelli et al. is the following: the tension between Planck's CMB data (analyzed with a LambdaCDM model) and several other data sources, such as measurements of Ho (the Hubble constant at z=0) by...you guessed it...the Hubble Space Telescope, can be resolved in a model in which dark matter decays into dark energy, provided that this interaction only turns on after a redshift of z=0.9. There has been a major problem reconciling the low value of Ho estimated from Planck's CMB data (Ho = 67.3 +/- 1.2 km/s/Mpc) with the much higher value measured by the Hubble Space Telescope (Ho = 73.8 +/- 2.4 km/s/Mpc).
However, when using a model in which dark matter can decay into dark energy, together with redshift-space distortion (RSD) data on the fluctuations of the matter density as a function of redshift z, the Planck estimate of the Hubble constant at z=0 becomes Ho = 68.0 +/- 2.3 km/s/Mpc. This new model eases the tension between the Planck data and the Hubble Space Telescope measurement of Ho.
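To see how much the tension eases, we can quantify the disagreement as a naive number of sigma, treating the two measurements as independent and Gaussian (a simplification I'm making here, not a calculation from the paper):

```python
import math

# Naive "tension" between two measurements, in units of the combined error,
# assuming independent Gaussian errors.
def tension_sigma(h0_a, err_a, h0_b, err_b):
    """Difference between two Ho values divided by the quadrature-summed errors."""
    return abs(h0_a - h0_b) / math.sqrt(err_a**2 + err_b**2)

# Values quoted above (km/s/Mpc)
planck_lcdm = tension_sigma(67.3, 1.2, 73.8, 2.4)        # Planck (LambdaCDM) vs HST
planck_interacting = tension_sigma(68.0, 2.3, 73.8, 2.4)  # Planck (interacting) vs HST

print(f"LambdaCDM vs HST:         {planck_lcdm:.1f} sigma")
print(f"Interacting model vs HST: {planck_interacting:.1f} sigma")
```

The gap drops from roughly 2.4 sigma to under 2 sigma, which is what "easing the tension" means in practice.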
So, let's go into the details of the model:
(1) Dark matter can decay into dark energy (the reverse, dark energy decaying into dark matter, is also possible in the model)
(2) The interaction between dark matter and dark energy is labeled 'q' in their model. When 'q' is negative, dark matter can decay into dark energy; when 'q' is positive, dark energy can decay into dark matter; and when 'q' is zero, there is no interaction.
(3) The group has binned 'q' into a constant value over each of four redshift intervals:
Bin#1 is 2.5 < z < primordial epoch (in other words, from the Big Bang until ~2.6 billion years after the Big Bang)
Bin#2 is 0.9 < z < 2.5 (in other words, from ~2.6 to ~6.3 billion years after the Big Bang)
Bin#3 is 0.3 < z < 0.9
Bin#4 is 0.0 < z < 0.3 (i.e. most recent history)
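The redshift bin edges above can be converted into approximate cosmic ages. Here's a sketch of that conversion, assuming a flat LambdaCDM background with Omega_m = 0.315 and Ho = 67.3 km/s/Mpc (parameter choices of mine for illustration, not values pulled from the paper):

```python
import math

H0_PER_GYR = 67.3 / 978.0   # Ho converted to 1/Gyr (1 km/s/Mpc ~ 1/978 Gyr^-1)
OMEGA_M, OMEGA_L = 0.315, 0.685

def age_at_redshift(z, steps=200_000):
    """Age of the universe at redshift z, in Gyr.

    Integrates dt = da / (a * H(a)) from a = 0 to a = 1/(1+z) with a simple
    Riemann sum, where H(a) = Ho * sqrt(Omega_m/a^3 + Omega_L).
    """
    a_end = 1.0 / (1.0 + z)
    da = a_end / steps
    total = 0.0
    for i in range(1, steps + 1):
        a = i * da
        total += da / (a * H0_PER_GYR * math.sqrt(OMEGA_M / a**3 + OMEGA_L))
    return total

for z in (2.5, 0.9, 0.3, 0.0):
    print(f"z = {z}: age ~ {age_at_redshift(z):.1f} Gyr")
```

With these inputs, z = 2.5 corresponds to an age of about 2.6 Gyr and z = 0.9 to about 6.3 Gyr, which is where the bin-edge ages above come from.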
The best-fit values of these parameters are the following (see Table I and Fig. 1 of their paper for the actual values):
q1 = -0.1 +/- 0.4 (in other words, q1 is well within 1 sigma of zero)
q2 = -0.3 +0.25/-0.1 (in other words, q2 is only roughly 1 sigma away from zero)
q3 = -0.5 +0.3/-0.16 (in other words, q3 is roughly 2 sigma away from zero)
q4 = -0.9 +0.5/-0.3 (in other words, q4 is roughly 2 sigma away from zero)
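The "N sigma from zero" statements follow directly from the numbers above. A quick sketch, using the error bar on the side toward zero (the upper error, since all the best-fit values are negative):

```python
# Best-fit q values and their upper (toward-zero) errors, as quoted above.
best_fit = {
    "q1": (-0.1, 0.4),
    "q2": (-0.3, 0.25),
    "q3": (-0.5, 0.3),
    "q4": (-0.9, 0.5),
}

# Distance from zero in units of the relevant error bar.
significance = {name: abs(q) / err_up for name, (q, err_up) in best_fit.items()}
for name, n_sigma in significance.items():
    print(f"{name}: {n_sigma:.2f} sigma from zero")
```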
There is a trend: q(z) becomes more negative as z decreases toward its present-day value of z=0.
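To make the sign convention concrete, here is a toy evolution of the two densities. This is my own simplification (interaction term Q = q*H*rho_de with w = -1 for dark energy, integrated with a crude Euler step), not the paper's full machinery, but it shows why a negative q means dark matter feeds dark energy:

```python
# Toy sketch of coupled continuity equations in e-folds of the scale factor
# (x = ln a), with interaction Q = q*H*rho_de and w = -1 for dark energy:
#   d(rho_dm)/dx = -3*rho_dm + q*rho_de
#   d(rho_de)/dx =           - q*rho_de
# A negative q drains dark matter and grows dark energy.
def evolve(q, rho_dm=1.0, rho_de=0.05, n_efolds=2.0, steps=10_000):
    dx = n_efolds / steps
    for _ in range(steps):
        d_dm = (-3.0 * rho_dm + q * rho_de) * dx   # dilution + transfer
        d_de = (-q * rho_de) * dx                   # transfer only (w = -1)
        rho_dm += d_dm
        rho_de += d_de
    return rho_dm, rho_de

dm_coupled, de_coupled = evolve(q=-0.3)
dm_free, de_free = evolve(q=0.0)
print(f"q=-0.3: rho_de grows from 0.05 to {de_coupled:.3f}")
print(f"q= 0.0: rho_de stays at {de_free:.3f}")
```

With q = 0 the dark-energy density is untouched; with q < 0 it grows exponentially in ln(a).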
Salvatelli et al. have also considered a simpler model with only 2 bins. In this case, bin#1 is z > 0.9 (q12 = 0) and bin#2 is z < 0.9 (q34 = -0.128). In this case, there is only a roughly 1% chance (about 2.5 sigma) that q34 is zero or positive, and a 99% chance that q34 is less than zero. Using this model, the group then created Figure 2, which shows the difference between this model and the standard LambdaCDM model. The y-axis of the figure is f*sigma8, where f is the growth rate of structure and sigma8 is the root-mean-square (RMS) amplitude of matter density fluctuations (in spheres of radius 8 Mpc/h). Both f and sigma8 are functions of time (i.e., functions of z). The experimental data points are plotted with error bars, alongside the predictions of the standard LambdaCDM model and of this new interacting model in which dark matter can decay into dark energy.
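For the growth-rate half of that y-axis quantity, there is a standard LambdaCDM approximation, f(z) ~ Omega_m(z)^0.55 (the 0.55 exponent is the usual textbook approximation, not something taken from the paper). A quick sketch, again assuming Omega_m = 0.315:

```python
OMEGA_M, OMEGA_L = 0.315, 0.685

def growth_rate(z):
    """f(z) = dlnD/dlna ~ Omega_m(z)^0.55, the standard flat-LambdaCDM fit."""
    e2 = OMEGA_M * (1 + z) ** 3 + OMEGA_L      # (H(z)/Ho)^2
    omega_m_z = OMEGA_M * (1 + z) ** 3 / e2    # matter fraction at redshift z
    return omega_m_z ** 0.55

for z in (0.0, 0.5, 1.0, 2.0):
    print(f"z = {z}: f ~ {growth_rate(z):.2f}")
```

The growth rate climbs toward 1 at high z (matter domination) and falls to about 0.5 today, as dark energy suppresses structure growth; f*sigma8 then multiplies this by the fluctuation amplitude at each z.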
While there is large scatter in the experimental data points, this new model does seem to fit the data better than LambdaCDM. (Though it should be pointed out that the extracted values of f*sigma8 are highly dependent on model assumptions, and hence these numbers tend to shift over time as different assumptions are used to pull f*sigma8 from the raw data.)
It should also be pointed out that Wang et al. 2014 found that if q were constant in z, it couldn't ease the tension between all three data sets. It's only in the recent work by Salvatelli et al., in which the interaction turns on below z = 0.9, that all three data sets (Planck, RSD, and Hubble) can be reconciled.
So, what happened at z=0.9? In other words, what's so special about z=0.9? The authors of the paper aren't really sure, but as seen in their Figure#5, it does seem as if the interaction between dark matter and dark energy is turning on around z~1. The first thing to note is that the energy density of dark energy doesn't overtake the energy density of matter until z ~ 0.3, and at redshifts above z ~ 2 the contribution of dark energy to the total energy density is quite negligible. In other words, we can explain the expansion of the universe at early times just by knowing the energy in photons and matter. So, it's no real surprise that something happens in their model around z~1: this is roughly the era when dark energy starts to accelerate the expansion of the universe (the transition from deceleration to acceleration occurs near z ~ 0.6-0.7). Before that, the universe was expanding, but the rate of expansion was decelerating, as expected when matter is the dominant energy source in the universe. (See the figure below created by Prof. Whittle at U. Virginia...also, you definitely should watch his Great Courses lecture series on cosmology if you haven't already.)
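Both of those transition redshifts follow from simple algebra in flat LambdaCDM (again with my assumed Omega_m = 0.315). The densities are equal when Omega_m*(1+z)^3 = Omega_L, and acceleration begins when Omega_m*(1+z)^3 = 2*Omega_L (since the acceleration goes as -(rho_m - 2*rho_de) for w = -1):

```python
OMEGA_M, OMEGA_L = 0.315, 0.685

# Matter and dark-energy densities equal: Omega_m*(1+z)^3 = Omega_L
z_equality = (OMEGA_L / OMEGA_M) ** (1.0 / 3.0) - 1.0

# Acceleration begins: Omega_m*(1+z)^3 = 2*Omega_L
# (from a_ddot ~ -(rho_m + rho_de + 3*p_de) with p_de = -rho_de)
z_acceleration = (2.0 * OMEGA_L / OMEGA_M) ** (1.0 / 3.0) - 1.0

print(f"matter/dark-energy equality: z ~ {z_equality:.2f}")
print(f"acceleration begins:         z ~ {z_acceleration:.2f}")
```

Both land comfortably below z = 1, so the epoch where Salvatelli et al. need the interaction to switch on is exactly the epoch where dark energy starts to matter dynamically.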
So, we have to ask ourselves the real question: why is the energy density of dark energy only a few times greater than the energy density of dark matter today? A simple guess is that dark matter can decay into something that accelerates the expansion of the universe. However, it turns out that if dark matter decays into relativistic particles, the Hubble parameter still continues to decrease toward zero. Yet this is not what we see experimentally. Instead, the Hubble expansion rate appears to be approaching a constant, positive value. This implies that there is a vacuum energy density of ~(2 meV)^4 that permeates space. The total energy associated with this vacuum energy density of ~(2 meV)^4 can still be zero, on the argument that this energy is gravitationally self-bound as it expands.
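That (2 meV)^4 figure can be checked directly: take the observed dark-energy density and express it as an energy scale in natural units, where rho = E^4/(hbar*c)^3. A sketch, with standard physical constants and my assumed Omega_L = 0.685, Ho = 67.3 km/s/Mpc:

```python
import math

H0_SI = 67.3 * 1000 / 3.0857e22      # Ho in 1/s (km/s/Mpc -> SI)
G = 6.674e-11                        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8                          # speed of light, m/s
HBAR_C_EV_M = 1.97327e-7             # hbar*c in eV*m
EV_PER_J = 1.0 / 1.602e-19
OMEGA_L = 0.685

# Critical density as an energy density (J/m^3), then the dark-energy share in eV/m^3
rho_crit = 3 * H0_SI**2 * C**2 / (8 * math.pi * G)
rho_de_ev_m3 = OMEGA_L * rho_crit * EV_PER_J

# In natural units rho = E^4 / (hbar*c)^3, so E = (rho * (hbar*c)^3)^(1/4)
energy_scale_mev = 1000 * (rho_de_ev_m3 * HBAR_C_EV_M**3) ** 0.25
print(f"dark-energy scale ~ {energy_scale_mev:.1f} meV")
```

The result comes out at about 2.2 meV, confirming the ~(2 meV)^4 scale quoted above.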
So, we are left with some tough questions: What is dark matter? What is dark energy? Are they related? The goal of this post is just to leave you (the reader) trying to figure out ways to answer these questions.