Tuesday, December 8, 2015

Updates on Warm Dark Matter & What's Up with the Lyman Alpha Forest

A couple of quick updates followed by a discussion of the most recent M. Viel paper regarding the Lyman Alpha Forest and limits on Warm Dark Matter.

Update#1: Jeltema and Profumo find no evidence for a 3.5 keV line in the Draco dwarf spheroidal galaxy. This is yet another paper that finds no evidence for the 3.5 keV line when looking at dwarf galaxies. The source of the 3.5 keV line may be unique to elliptical galaxies and likely has nothing to do with dark matter.

Update#2: (Related to the LHC) Both the ATLAS and CMS collaborations recently published data from Run II at 13 TeV and report no evidence for resonances (that would produce di-jets) with masses up to ~6 TeV. In other words, cold dark matter and supersymmetry are running out of places to hide.

Update#3: Baur et al. (including M. Viel) posted a pre-print of a paper submitted to JCAP titled "Lyman Alpha Forests cool Warm Dark Matter."

The pre-print argues that "Using an unprecedentedly large sample of medium resolution QSO spectra from the ninth data release of SDSS, along with a state-of-the-art set of hydrodynamical simulations to model the Lyman-alpha forest in the non-linear regime, we issue the tightest bounds to date on pure dark matter particles: mX > 4.35 keV (95% CL) for early decoupled thermal relics such as a hypothetical gravitino, and its corresponding bound for a non-resonantly produced right-handed neutrino ms > 31.7 keV (95% CL)."
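For reference, the quoted thermal-relic and sterile-neutrino bounds are usually related through the Viel et al. (2005) mapping between a thermal relic and a non-resonantly produced (Dodelson-Widrow) sterile neutrino with the same free-streaming scale. A minimal sketch in Python (the 4.43 keV prefactor and the assumed omega_dm ≈ 0.1225 come from that older fit, not from the Baur et al. pre-print) roughly reproduces the numbers above:

```python
# Thermal-relic to Dodelson-Widrow sterile-neutrino mass mapping
# (Viel et al. 2005). The 4.43 keV prefactor and omega_dm value are
# assumptions taken from that fit, not from the Baur et al. pre-print.

def sterile_mass_keV(m_thermal_keV, omega_dm=0.1225):
    """Sterile neutrino mass with the same free-streaming scale
    as a thermal relic of mass m_thermal_keV."""
    return 4.43 * m_thermal_keV**(4.0 / 3.0) * (0.1225 / omega_dm)**(1.0 / 3.0)

print(sterile_mass_keV(4.35))  # ~31.5 keV, close to the quoted 31.7 keV bound
```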

However, I think that we need to be very skeptical of this work. Here are some of my issues with the manuscript, as well as with many of the other Lyman Alpha Forest manuscripts out there.


(1) No data are provided. Unlike previous manuscripts by Viel et al., which include the data from MIKE+HIRES, this manuscript makes no attempt to provide the underlying data.

(2) In addition, there was no attempt to plot the results in a format used by other research groups. For example, P(k) vs. k plots are normally presented as power spectra [h^-3*Mpc^3] vs. k [h*Mpc^-1]. And since no data are provided, there is no way for other researchers to compare these results against their own. Even when the data are given (such as in "Neutrino masses and cosmology with Lyman-alpha forest power spectrum"), they are presented as P(k)*k/pi vs. k in units of (s/km). This makes it hard for non-experts in the field to compare the Lyman Alpha forest data to data that can be generated, for example, by the free software CAMB. (A rough sketch of the velocity-to-comoving unit conversion is given after this list.)

(3) The following is one of my main pet peeves with Lyman Alpha data. There's no real attempt by researchers in the field to fit their data jointly with overlapping data, such as lensing. (Though this has been done by other researchers, such as Hunt and Sarkar 2015.)

(4) It appears that Baur et al. only use data out to a k-space value of 0.02 s/km and out to a redshift of 4.4; however, there's no real way to constrain warm dark matter until you go out to k values closer to 0.1 or 0.2 s/km at redshifts of 5 or greater. So, I'm skeptical that Baur et al. have really improved the constraints on warm dark matter particles.
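As a rough illustration of the unit issue in point (2) and the scales in point (4), here is a minimal sketch of converting a velocity-space wavenumber in s/km into a comoving wavenumber in h/Mpc, assuming a flat LCDM background with placeholder values H0 = 70 km/s/Mpc and Omega_m = 0.3 (my own assumptions, not numbers from the paper):

```python
import math

# Convert a Lyman-alpha velocity-space wavenumber k_v [s/km] into a
# comoving wavenumber [h/Mpc] via k_c = k_v * H(z) / (1 + z).
# H0 and Omega_m are illustrative placeholder values.

H0 = 70.0                     # km/s/Mpc
h = H0 / 100.0
Omega_m, Omega_L = 0.3, 0.7

def hubble(z):
    """H(z) in km/s/Mpc for a flat LCDM background."""
    return H0 * math.sqrt(Omega_m * (1.0 + z)**3 + Omega_L)

def k_velocity_to_comoving(k_v, z):
    """k_v in s/km -> comoving k in h/Mpc at redshift z."""
    return k_v * hubble(z) / (1.0 + z) / h

print(k_velocity_to_comoving(0.02, 4.4))  # ~2.6 h/Mpc (Baur et al.'s largest k and z)
print(k_velocity_to_comoving(0.1, 5.0))   # ~13 h/Mpc (the regime I argue is needed)
```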

It could be that the improved limits on warm dark matter presented by Baur et al. are correct. But without the data being corroborated by data overlapping in k-space (such as weak lensing), it's hard to take their constraints seriously. As such, I think that we need to wait for a major organization (such as the Planck collaboration) to properly integrate Lyman Alpha Forest data with their CMB data (along with BAO and galaxy cluster data from others) before there can be any reliable constraints on the mass of the dark matter particle.

And this leads to my main pet peeve (as an outsider to this community): I have yet to see a study that uses as much real data as possible and allows all of the unknown variables to vary simultaneously. Why waste your time varying only a few of the parameters and using only some of the freely available data?

So, here's the list of the data and possible variables of interest.

Data:
(1) CMB TT, TE, EE, BB (WMAP, Planck, ACT, SPT, ACTPol, SPTPol, BICEP, KECK, PolarBear, etc...)
(2) BAO (SDSS, Lyman Alpha, etc...)
(3) Galaxy Clusters
(4) Weak lensing (overlap with CMB data)
(5) Limits on He4, D, and He3
(6) Small scale Lyman Alpha forest
(7) Small-scale galaxy counting at the kpc scale
(8) Optical depth constraints at z~6 
(9) Particle physics mass constraints (top quark, W/Z mass, Higgs mass) to help constrain inflation potentials
(10) Mixing in the quark sector and lepton sector
(11) Neutrino mass difference constraints

(Note that I plan to create a website where I will put all of the above data sets into one Excel file with multiple tabs. I have downloaded the raw data from NASA or from arXiv manuscripts and have put the data into an Excel spreadsheet so that the data can easily be graphed and analyzed by non-experts.)
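Once such a spreadsheet exists, reading it should be straightforward; here is a minimal sketch using pandas, assuming a hypothetical file name and hypothetical tab names (the actual file is not yet published):

```python
import pandas as pd

# Hypothetical file and sheet names; the real spreadsheet is not yet available.
sheets = pd.read_excel("cosmology_data.xlsx", sheet_name=None)  # dict of DataFrames, one per tab

cmb_tt = sheets["CMB_TT"]   # e.g. columns: ell, D_ell, error
bao = sheets["BAO"]         # e.g. columns: z, distance_ratio, error

print(cmb_tt.head())
print(bao.head())
```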

Variables:
(1) Inflation potential quadratic term
(2) Inflation potential quartic term
(3) Inflation potential higher-order terms (?)
(4) Friction between the inflation potential and particles (reheating)
(5) Dark matter density
(6) Dark matter rest mass (or equivalent rest mass if non-thermal)
(7) Dark energy density today
(8) Change of dark energy density with z
(9) Baryon density today
(10) Photon density today
(11) Term related to the baryon density asymmetry
(12) Term related to the lepton density asymmetry
(13) Spatial curvature, k
(14) Scalar power, As (if terms 1-4 do not constrain this term)
(15) Tensor power, AT (if terms 1-4 do not constrain this term)
(16-18) Masses of the three neutrino species
(19) Delta Neff, i.e. other species that contribute to the radiative energy density


(This set is large, and more fundamental than the current six parameters used by most researchers: dark matter density, baryon density, scalar power, n_s, optical depth, and age of the universe. Note that only the first two of those six LCDM parameters are "fundamental" in the sense listed above; the other four are derived parameters.)
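As an example of what "derived" means here, the age of the universe follows from the expansion history set by the density parameters rather than being fundamental itself. A minimal sketch, assuming a flat LCDM background with illustrative values (not anyone's published best fit):

```python
import math
from scipy.integrate import quad

# Age of a flat LCDM universe from its density parameters:
# t0 = integral over z of dz / [(1+z) * H(z)].
# Parameter values below are illustrative placeholders.
H0 = 67.7                      # km/s/Mpc
Omega_m, Omega_L = 0.31, 0.69

km_per_Mpc = 3.0857e19
H0_per_Gyr = H0 / km_per_Mpc * 3.156e16   # convert H0 to 1/Gyr

def integrand(z):
    return 1.0 / ((1.0 + z) * math.sqrt(Omega_m * (1.0 + z)**3 + Omega_L))

t0_Gyr = quad(integrand, 0.0, math.inf)[0] / H0_per_Gyr
print(round(t0_Gyr, 2))        # roughly 13.8 Gyr for these placeholder values
```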

So, I think that what we need most of all is a framework for taking in the diverse sets of data out there (both from the astrophysics community and from the particle physics community) and integrating them into a single model that can vary all >19 independent, fundamental parameters and optimize their fit to all of the known data.
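To make the idea concrete, here is a toy sketch of such a framework: every dataset contributes a chi-squared term, and all parameters vary simultaneously in a single minimization. Everything in it (the parameter vector, the "likelihoods," the numbers) is a hypothetical placeholder, not a real pipeline:

```python
import numpy as np
from scipy.optimize import minimize

# Toy global fit: each dataset contributes a chi-squared term, and all
# fundamental parameters vary at once. The "model" functions and numbers
# below are placeholders for illustration only, not real likelihoods.

def chi2_cmb(params):
    omega_dm, omega_b = params[0], params[1]
    return ((omega_dm - 0.12) / 0.002)**2 + ((omega_b - 0.022) / 0.0003)**2

def chi2_lyman_alpha(params):
    m_dm_keV = params[2]
    # Placeholder pseudo-measurement of a free-streaming scale.
    return ((m_dm_keV - 5.0) / 1.5)**2

def chi2_lensing(params):
    omega_dm = params[0]
    return ((omega_dm - 0.118) / 0.004)**2

def total_chi2(params):
    return chi2_cmb(params) + chi2_lyman_alpha(params) + chi2_lensing(params)

# In a real framework this vector would hold all >19 fundamental parameters.
initial_guess = np.array([0.12, 0.022, 5.0])
result = minimize(total_chi2, initial_guess, method="Nelder-Mead")
print(result.x, result.fun)
```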

Until then, we seem to be wasting our time making constraints that are only valid under certain conditions. I know that it's not easy herding scientists together, but I think that this is the only way to make any real headway into constraining the many, many fundamental parameters that we want to know (such as the dark matter rest mass, curvature, the inflation potential, dark energy's time dependence, and the relationship, if any, between the Higgs scalar field, the dark energy scalar field, and the inflationary scalar field).
