
Archive for December, 2015

After a long-anticipated data run, LHC proton-proton running concludes in early November.  A mere six weeks later, on a mid-December afternoon, the ATLAS and CMS collaborations present their first results from the full dataset to a packed CERN auditorium, with people all over the world watching the live webcast.  Both collaborations see slight excesses in events with photon pairs; the CMS excess is quite modest, but the ATLAS data show something that could be interpreted as a peak.  If it holds up with additional data, it would herald a major discovery.  While the experimenters caution that the results do not have much statistical significance, news outlets around the world run breathless stories about the possible discovery of a new particle.

December 15, 2015? No — December 13, 2011, four years ago.  That seminar presented what we now know were the first hints of the Higgs boson in the LHC data.  At the time, everyone was hedging their bets, and saying that the effects we were seeing could easily go away with more data.  Yet now we look back and know that it was the beginning of the end for the Higgs search.  And even at the time, everyone was feeling pretty optimistic.  Yes, we had seen effects of that size go away before, but at this time four years ago, a lot of people were guessing that this one wouldn’t (while still giving all of the caveats).

But while both experiments are reporting an effect at 750 GeV — and some people are getting very excited about it — it seems to me that more caution is needed here than was needed with the emerging evidence for the Higgs boson.  What’s different about what we’re seeing now compared to what we saw in 2011?

I found it instructive to look back at the presentations of four years ago.  Then, ATLAS had an effect in diphotons around an invariant mass of 125 GeV that had a 2.8 standard deviation local significance, which was reduced to 1.5 standard deviations when the “look elsewhere effect” (LEE) was taken into account.  (The LEE exists because if there is a random fluctuation in the data, it might appear anywhere, not just the place you happen to be looking, and the statistical significance needs to be de-weighted for that.)  In CMS, the local significance was 2.1 standard deviations.  Let’s compare that to this year, when both experiments see an effect in diphotons around an invariant mass of 750 GeV.  At ATLAS, it’s a 3.6 standard deviation local effect, which is reduced to 2.0 standard deviations after the LEE.  For CMS the respective values are 2.6 and 1.2 standard deviations.  So the 2015 signals sound even stronger than the 2011 ones, although on their own they are still quite weak: five standard deviations is the usual threshold for claiming a discovery, because a fluctuation of that size would be very unlikely to occur by chance.
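To see how a look-elsewhere correction deflates a local excess, here is a minimal sketch using the crudest possible approach, a Bonferroni-style trials factor; the number of trial mass hypotheses below is a made-up illustration, not the correction ATLAS or CMS actually computed, and the experiments use more careful methods.

```python
from scipy.stats import norm

def global_significance(local_sigma, n_trials):
    """Roughly convert a local significance (in sigma) into a global one."""
    p_local = norm.sf(local_sigma)             # one-sided local p-value
    p_global = min(1.0, n_trials * p_local)    # crude trials-factor (Bonferroni) correction
    return norm.isf(p_global)                  # back to a significance in sigma

# A 3.6 sigma local excess, de-weighted for (hypothetically) 80 independent mass hypotheses.
print(round(global_significance(3.6, 80), 1))
```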

But the 2011 signals had some other things going for them.  The first was experimental: there were simultaneous excesses in other channels that were consistent with what you’d expect from a Higgs decay.  This included in particular the ZZ channel, which had a low expected rate but also very low backgrounds and excellent mass resolution.  In 2011, both experiments had the beginnings of signals in ZZ too (although at a slightly different putative Higgs mass value), along with some early hints in other decay channels.  There were multiple results supporting the diphotons, whereas in 2015 there are no apparent excesses in other channels pointing to anything at 750 GeV.

And on top of that, there was something else going for the Higgs in December 2011: there was good reason to believe it was on the way.  From a myriad of other experiments we had indirect evidence that a Higgs boson ought to exist, and in a mass range where the LHC effects were showing up.  This indirect evidence came through the interpretation of the “standard model” theory that had done an excellent job of describing all other data in particle physics and thus gave us confidence that it could make predictions about the Higgs too.  And for years, both the Tevatron and the LHC had been slowly but surely directly excluding other possible masses for the Higgs.  If a Higgs were going to show up, it made perfect sense for it to happen right where the early effects were being observed, at just that level of significance with so little data.

Do we have any of that with the 750 GeV effect in 2015?  No.  There are no particular reasons to expect this decay with this rate at this mass (although in the wake of last week’s presentations, there have been many conjectures as to what kind of new physics could make this happen).  Thus, one can’t help but think that this is some kind of fluctuation.  If you look at enough possible new-physics effects, you have a decent chance of seeing some number of fluctuations at this level, and that seems to be the most reasonable hypothesis right now.

But really there is no need to speculate.  In 2016, the LHC should deliver ten times as much data as it did this year.  That’s even better than what happened in 2012, when the LHC exceeded its 2011 performance by a mere factor of five.  We can anticipate another set of presentations in December 2016, and by then we will know for sure if 2015 gave us a fluctuation or the first hint of a new physical theory that will set the research agenda of particle physics for years to come.  And if it is the latter, I will be the first to admit that I got it wrong.


Si, et vraiment seulement si…

Wednesday, December 16th, 2015

Si le LHC était une échelle et les nouvelles particules tant recherchées, des boîtes cachées sur les étagères les plus hautes, la montée en énergie du LHC s’apparente à l’acquisition d’une échelle plus longue donnant accès aux dernières étagères. Fin 2012, les échelles étaient plus courtes, mais on en avait dix fois plus, facilitant l’exploration des étagères à notre portée. ATLAS et CMS viennent de jeter leur premier coup d’œil à un endroit jamais exploré auparavant mais auront besoin de plus de données pour les inspecter en profondeur.

Le 15 décembre, lors du séminaire de fin d’année, les expériences CMS et ATLAS du CERN ont présenté leurs premiers résultats basés sur les toutes nouvelles données accumulées en 2015 depuis la reprise du Grand collisionneur de hadrons (LHC) à 13 TeV, l’énergie d’exploitation la plus haute jamais atteinte. Bien que la quantité de données ne soit que le dixième de ce qu’elle était à plus basse énergie (soit 4 fb-1 pour ATLAS et 2,8 fb-1 pour CMS pour les données recueillies à 13 TeV comparés à 25 fb-1 à 8 TeV pour chaque expérience), cette augmentation en énergie met désormais des particules hypothétiques plus massives à la portée des expériences.

Les deux expériences ont d’abord démontré comment leurs détecteurs se sont comportés après plusieurs améliorations majeures, y compris l’acquisition des données à deux fois le taux utilisé en 2012. Les deux groupes ont contrôlé sous toutes les coutures comment les particules déjà connues se comportent à plus haute énergie, sans trouver d’anomalies. Mais c’est dans la recherche de particules nouvelles et plus lourdes que tous les espoirs sont permis. Les deux groupes ont exploré des douzaines de possibilités différentes, triant des milliards d’événements.

Chaque événement est un cliché de ce qui s’est produit lorsque deux protons entrent en collision dans le LHC. L’énergie dégagée par la collision se matérialise sous forme de particules lourdes et instables qui se désintègrent aussitôt, provoquant de mini feux d’artifice. En attrapant, identifiant et regroupant toutes les particules qui s’échappent du point de collision, on peut reconstruire les particules originales qui ont été produites.

Les expériences CMS et ATLAS ont toutes deux trouvé de petits excès en sélectionnant les événements contenant deux photons. Dans plusieurs de ces événements, les deux photons semblent venir de la désintégration d’une particule ayant une masse d’environ 750 GeV, soit 750 fois plus lourde qu’un proton ou 6 fois la masse d’un boson de Higgs. Puisque les deux expériences ont regardé une multitude de combinaisons différentes, en vérifiant à chaque fois des douzaines de valeurs de masse pour chaque combinaison, on s’attend toujours à trouver de telles fluctuations statistiques.

ATLAS-diphoton

Partie supérieure : la masse combinée exprimée en GeV pour toutes les paires de photons trouvées dans les données récoltées à 13 TeV par ATLAS. Le trait rouge montre à quoi on s’attend venant de sources aléatoires (communément appelé bruit de fond). Les points noirs correspondent aux données et les lignes, les erreurs expérimentales. La petite bosse à 750 GeV est ce qui est maintenant intrigant. La partie du bas montre la différence entre les points noirs (les données) et la courbe rouge (le bruit de fond), montrant clairement un petit excès de 3,6σ ou 3,6 fois l’erreur expérimentale. Quand on prend en compte toutes les fluctuations possibles à toutes les valeurs de masse considérées, l’excès n’est plus que de 2,0σ.

Ce qui est intrigant, c’est que les deux équipes ont trouvé la même chose exactement au même endroit, sans s’être consultées et en utilisant des techniques de sélection conçues pour ne pas biaiser les données. Néanmoins, les deux groupes expérimentaux sont extrêmement prudents, déclarant qu’une fluctuation statistique est toujours possible jusqu’à ce que plus de données soient disponibles pour tout vérifier avec une précision accrue.

CMS-combined-p0

CMS a légèrement moins de données qu’ATLAS à 13 TeV et par conséquent, décèle un effet beaucoup plus petit. Dans leurs seules données prises à 13 TeV, l’excès à 760 GeV est de 2,6σ, 3,0σ lorsque combiné avec les données de 8 TeV. Mais au lieu de juste évaluer cette probabilité localement, les physiciens et physiciennes préfèrent prendre en compte les fluctuations pour toutes les valeurs de masse considérées. La probabilité n’est alors que de 1,2σ, pas de quoi fouetter un chat. C’est “l’effet de regarder ailleurs” : il prend en compte qu’on finit toujours par trouver une fluctuation quelque part quand on regarde dans tant d’endroits.

Les théoriciens et théoriciennes se retiennent beaucoup moins. Depuis des décennies, on sait que le Modèle standard, le modèle théorique actuel de la physique des particules, n’explique pas tout, sans pouvoir progresser. Tout le monde espère donc qu’un indice viendra des données expérimentales pour aller de l’avant. Beaucoup ont dû travailler dur toute la nuit car huit nouveaux articles sont apparus dès ce matin, proposant des explications variées sur la nature possible de la nouvelle particule, si particule nouvelle il y a. Quelques personnes pensent que cela pourrait être une particule liée à la matière sombre, d’autres penchent pour un autre type de boson de Higgs tel que prédit par la Supersymétrie ou même y voient les premiers signes de nouvelles dimensions. D’autres proposent qu’un tel effet ne pourrait se produire que si cette particule s’accompagne d’une deuxième particule plus lourde. Tout le monde suggère quelque chose au-delà du Modèle standard.

Deux choses sont certaines : tout d’abord, le nombre d’articles théoriques dans les prochaines semaines va exploser. Et deuxièmement, on ne pourra pas dire s’il y a une nouvelle particule sans plus de données. Avec un peu de chance, nous pourrions en savoir plus dès l’été prochain lorsque le LHC aura produit plus de données. D’ici là, tout cela n’est que pure spéculation.

Ceci étant dit, il ne faut toutefois pas oublier que le boson de Higgs a fait son apparition de façon très semblable. Les premiers signes de son existence étaient déjà visibles en juillet 2011. Avec plus de données, ces signes s’étaient renforcés en décembre 2011 à un autre séminaire de fin d’année mais sa découverte n’a pu être établie que lorsque encore plus de données eurent été recueillies et analysées en juillet 2012. Ouvrir ses cadeaux avant Noël n’est jamais une bonne idée.

Passez de bonnes Fêtes, Pauline Gagnon

Pour en savoir plus sur la physique des particules et les enjeux du LHC, consultez mon livre : « Qu’est-ce que le boson de Higgs mange en hiver et autres détails essentiels ». Pour recevoir un avis lors de la parution de nouveaux blogs, suivez-moi sur Twitter : @GagnonPauline ou par e-mail en ajoutant votre nom à cette liste de distribution.


If, and really only if…

Wednesday, December 16th, 2015

If the LHC were a ladder and the sought-after new particles were boxes hidden on the top shelves, operating the LHC at higher energy is like getting a longer ladder that gives us access to the highest shelves. By the end of 2012 the ladders were shorter, but we had ten times more data than we do now, making it easier to explore the shelves within reach. ATLAS and CMS have just had their first glimpse at a place never reached before, but more data is still needed to explore it thoroughly.

On December 15, at the End-of-the-Year seminar, the CMS and ATLAS experiments at CERN presented their first results using the brand-new data accumulated in 2015 since the restart of the Large Hadron Collider (LHC) at 13 TeV, the highest operating energy so far. Although the size of the data sample is still only one tenth of what was available at lower energy (namely 4 fb-1 for ATLAS and 2.8 fb-1 for CMS collected at 13 TeV, compared to 25 fb-1 at 8 TeV for each experiment), the increase in energy puts more massive hypothetical particles within reach of the experiments.

Both experiments showed how well their detectors performed after several major improvements, including collecting data at twice the rate used in 2012. The two groups made several checks on how known particles behave at higher energy, finding no anomalies. But it is in searches for new, heavier particles that everyone hopes to see something exciting. Both groups explored dozens of different possibilities, sifting through billions of events.

Each event is a snapshot of what happens when two protons collide in the LHC. The energy released by the collision materializes into some heavy and unstable particle that breaks apart mere instants later, giving rise to a mini firework. By catching, identifying and regrouping all particles that fly apart from the collision point, one can reconstruct the original particles that were produced.

Both CMS and ATLAS found small excesses when selecting events containing two photons. In several events, the two photons seem to come from the decay of a particle having a mass around 750 GeV, that is, 750 times heavier than a proton or 6 times the mass of a Higgs boson. Since the two experiments looked at dozens of different combinations, checking dozens of mass values for each combination, such small statistical fluctuations are always expected.
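As a rough illustration of what reconstructing a 750 GeV particle from two photons involves, here is a minimal sketch of the diphoton invariant-mass calculation; the photon momenta and angles below are invented numbers chosen only to land near 750 GeV, not real ATLAS or CMS measurements.

```python
import math

def diphoton_mass(pt1, eta1, phi1, pt2, eta2, phi2):
    """Invariant mass (GeV) of two (massless) photons given pt in GeV, eta and phi."""
    # For massless particles: m^2 = 2 * pt1 * pt2 * (cosh(d_eta) - cos(d_phi))
    return math.sqrt(2.0 * pt1 * pt2 *
                     (math.cosh(eta1 - eta2) - math.cos(phi1 - phi2)))

# Hypothetical photon pair, chosen to give a mass near 750 GeV.
print(diphoton_mass(350.0, 0.3, 0.1, 350.0, -0.5, 3.0))
```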

ATLAS-diphoton

Top part: the combined mass, given in units of GeV, for all pairs of photons found in the 13 TeV data by ATLAS. The red curve shows what is expected from random sources (i.e. the background). The black dots correspond to the data and the bars to the experimental errors. The small bump at 750 GeV is what is now intriguing. The bottom plot shows the difference between the black dots (data) and the red curve (background), clearly showing a small excess of 3.6σ, or 3.6 times the experimental error. When one takes into account all possible fluctuations at all mass values, the significance is only 2.0σ.

What’s intriguing here is that both groups found the same thing at exactly the same place, without having consulted each other and using selection techniques designed not to bias the data. Nevertheless, both experimental groups are extremely cautious, stating that a statistical fluctuation is always possible until more data is available to check this with increased accuracy.

CMS-combined-p0

CMS has slightly less data than ATLAS at 13 TeV and hence sees a much smaller effect. In their 13 TeV data alone, the excess at 760 GeV is about 2.6σ, and 3.0σ when combined with the 8 TeV data. But instead of just quoting this local probability, experimentalists prefer to take into account the fluctuations in all the mass bins considered. The significance is then only 1.2σ, nothing to write home about. This is the “look-elsewhere effect”: it takes into account that one is bound to see a fluctuation somewhere when one looks in so many places.

Theorists show less restraint. For decades, they have known that the Standard Model, the current theoretical model of particle physics, is flawed, and they have been looking for a clue from experimental data on how to go further. Many of them must have been hard at work all night, because eight new papers appeared this morning, proposing various explanations for what the new particle could be, if something really is there. Some think it could be a particle related to Dark Matter, others that it could be another type of Higgs boson predicted by Supersymmetry, or even a first sign of extra dimensions. Others propose that such an effect could only arise if this particle is accompanied by a second, heavier one. All suggest something beyond the Standard Model.

Two things are certain: first, the number of theoretical papers in the coming weeks will explode; and second, establishing whether a new particle is really there will require more data. With some luck, we could know more by next summer, once the LHC has delivered more data. Until then, it all remains pure speculation.

This being said, let’s not forget that the Higgs boson made its entrance in a similar fashion. The first signs of its existence appeared in July 2011. With more data, they became clearer in December 2011 at a similar End-of-the-Year seminar. But it was only once enough data had been collected and analysed, in July 2012, that its discovery was beyond doubt. Opening one’s gifts before Christmas is never a good idea.

Have a good Holiday Season, Pauline Gagnon

To learn more about particle physics and what might be discovered at the LHC, don’t miss my upcoming book: « Who cares about particle physics: Making sense of the Higgs boson, Large Hadron Collider and CERN ». To be alerted of new postings, follow me on Twitter: @GagnonPauline or sign up on this mailing list to receive an e-mail notification.


It’s amazing that so much hard work (and such high levels of stress) can be condensed down so much… 5 pages, 3 plots and a table – and the new world leading limit on the WIMP-nucleon spin-independent elastic scattering cross section, of course.

Yes, the LUX Run 3 reanalysis results are finally out. It’s been in the works for over a year, and it has been a genuinely wonderful experience to watch this paper grow – and seeing my own plot in there has felt like sending forth a child into the world!

As much as I worked to improve our signal efficiency at low energies, the real star of the LUX reanalysis show was the “D-D” calibration – D-D standing for deuterium-deuterium. We calibrated the detector’s response to nuclear recoils (which we expect WIMP dark matter to cause) with something that sounds like it is out of science fiction, a D-D generator. This generator uses the fusion of deuterium (think heavy hydrogen – one proton, one neutron) to generate neutrons that are focussed into a beam and sent into the detector.

Quick LUX 101 – LUX is a dark matter search experiment. Dark matter is that mysterious dark, massive substance that makes up 27% of our universe. How does LUX look for dark matter? Well, it is a ‘dual phase xenon TPC’ detector, and it lives 4850 feet underground at the Sanford Underground Research Facility. It must be underground to shield it from as much cosmic radiation as possible, as it is looking for a very rare, weakly interacting dark matter particle called a WIMP.

LUX is basically a big tank of liquid xenon, with a gas layer on top. It is sensitive to particles that enter this xenon – photons and electrons cause what we call an electron recoil (think of them bouncing off an atomic electron), whilst neutrons cause a nuclear recoil (bouncing off a xenon nucleus). We expect that WIMPs will interact with the atomic nuclei too, just incredibly rarely – so understanding the detector response to these nuclear recoils is of utmost importance. Both electron recoils and nuclear recoils inside the liquid xenon cause a flash of light, a signal we call “S1”, the scintillation signal. Any light in LUX is picked up by two arrays of photomultiplier tubes, 122 in total.

Recoils can also ionise the xenon; electrons are ‘knocked off’ their atoms by the collision. If you place an electric field over the xenon volume, you can push these electrons along instead of letting them recombine with their atoms. In LUX, the electrons are pushed all the way to the top, into the gaseous xenon layer. There they cause a second flash of light via scintillation in the gas, “S2”, the ionisation signal (as its source is the ionised electrons).

Two signals mean two things. First, discrimination between electron recoils (background) and nuclear recoils (possible dark matter signal!), thanks to the differing distribution of energy between S1 and S2 for each type of recoil. Second, 3D position reconstruction: the XY coordinates can be determined from which photomultiplier tubes light up, whilst the time between the S1 and S2 tells us the depth of the interaction. This XYZ position is very important; we use the xenon to shield itself from radiation from the detector materials and from the surrounding rock. With the 3D position of all our events, we can restrict ourselves to the very inner region of the detector, where it is very quiet, and look there for those rare dark matter interactions.

Schematic of the LUX detector

Schematic of the LUX detector. On the left, it is demonstrated how the S1 and S2 signals can provide 3D position reconstruction. The right shows the inside of the detector, and the position of the photomultiplier tubes that collect light emitted by the scintillation of xenon.
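To make the position reconstruction described above a little more concrete, here is a minimal sketch: XY from a weighted average of the S2 light over a toy top PMT array, Z from the S1-to-S2 drift time. The drift speed, PMT layout and signal sizes are illustrative stand-ins, not real LUX parameters, and real analyses use more sophisticated likelihood-based methods.

```python
DRIFT_SPEED_MM_PER_US = 1.5   # assumed electron drift speed in the liquid xenon (illustrative)

def reconstruct_position(pmt_xy, s2_per_pmt, drift_time_us):
    """Toy 3D reconstruction: XY from the S2 light pattern, Z from the S1-S2 time difference."""
    total = sum(s2_per_pmt)
    # XY: centroid of the S2 light seen by the top PMT array
    x = sum(px * s for (px, _), s in zip(pmt_xy, s2_per_pmt)) / total
    y = sum(py * s for (_, py), s in zip(pmt_xy, s2_per_pmt)) / total
    # Z: the electrons drift upward at a known speed, so the S1-to-S2 delay gives the depth
    z_depth = drift_time_us * DRIFT_SPEED_MM_PER_US   # depth below the liquid surface, in mm
    return x, y, z_depth

# Hypothetical event: four top PMTs at the corners of a square (mm), unequal S2 light, 200 us drift.
print(reconstruct_position([(-50, -50), (-50, 50), (50, -50), (50, 50)],
                           [120, 80, 260, 140], 200.0))
```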

 

Back to the deuterium-deuterium fusion neutron gun – it’s actually a wonderfully simple but extremely clever idea. We fire a beam of neutrons into our detector, all at the same energy (monoenergetic, or monochromatic), at a set position. We then select events in our data along that beam, and look for those neutrons that scattered a second time in the detector. Because of that XYZ position reconstruction, if we have signals from two different scatters, we can determine the angle of scattering. As the initial energy is known, the energy of the recoil can then be calculated via simple kinematics. Matching the recoil energy with the size of the two signals allows us to calibrate the nuclear recoil response of the detector extremely well. The light yield (in S1), which is tougher to measure than the charge yield (in S2) because we are talking about individual photons, was measured down to 1.1 keV. (keV are kiloelectronvolts, or 1000 times the energy of a single electron moved across a potential difference of 1 V. In other words, a tiny quantity – 1 keV is only 1.6×10^-16 joules!) The charge yield was measured below 1 keV. In the previous LUX results, we had assumed a conservative hard cut-off, i.e. that we would measure no light for recoils below 3 keV. Now we know that isn’t the case, and we can extend our sensitivity to lower energies – which corresponds to lighter WIMPs.
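For a feel for the “simple kinematics” involved, here is a minimal sketch of the recoil energy as a function of the reconstructed scattering angle, using two-body elastic scattering of a 2.45 MeV D-D neutron off a xenon nucleus; the angles in the example are hypothetical, and treating the measured angle as the centre-of-mass angle is only a good approximation because xenon is so much heavier than the neutron.

```python
import math

M_NEUTRON = 1.0          # neutron mass, in units of the neutron mass
M_XENON = 131.3          # average xenon mass number, used as the mass ratio to the neutron (approximate)
E_NEUTRON_KEV = 2450.0   # D-D fusion neutron energy, about 2.45 MeV

def recoil_energy_keV(theta_deg):
    """Nuclear-recoil energy for a neutron elastically scattering through theta degrees."""
    theta = math.radians(theta_deg)
    prefactor = 2.0 * M_NEUTRON * M_XENON / (M_NEUTRON + M_XENON) ** 2
    return E_NEUTRON_KEV * prefactor * (1.0 - math.cos(theta))

# Hypothetical scattering angles reconstructed from the two scatter positions.
for angle in (10, 30, 60, 90):
    print(f"{angle:3d} deg  ->  {recoil_energy_keV(angle):5.1f} keV recoil")
```

Small scattering angles correspond to sub-keV recoils, which is what lets the calibration reach the low energies quoted above.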


Upper limits on the spin-independent elastic WIMP-nucleon cross section at 90% CL. Observed limit in black, with the 1- and 2-σ ranges of background-only trials shaded green and yellow.

This improvement in low energy calibration, as well as a more streamlined and improved analysis framework, has led to a huge improvement in LUX’s low WIMP mass limit. In the plot above, which shows the WIMP mass against the probability of interaction with a nucleus, everything above the black line is now ruled out. If you’ve been following the WIMP dark matter saga, you will know that a few experiments were claiming hints of signals in that low mass region, but this new result definitely lays those signals to rest.

Getting a paper ready for publication has turned out to be far harder work than I expected. It requires a lot of teamwork, perseverance, brain power and very importantly, the ability to take on criticism and use it to improve. I must have remade the LUX efficiency plot over 100 times, and a fair few of those times it was because someone didn’t quite like the way I’d formatted it. In a collaboration, you have to be willing to learn from others and compromise. In the last few days before we finished I did not benefit at all from my UK time zone, as I stayed up later and later to finish things off. But – it was worth it! Now, if I search for my name on arXiv, I come up 4 times (3 LUX papers and the LZ Conceptual Design Report). As pathetic as it sounds, this is actually quite exciting for me, and is what I hope to be the foundations of a long career in physics.

The new LUX results are obviously nowhere near as exciting as an actual WIMP discovery, but they are another step on the way there. LUX Run 4 is in full swing, in which we will obtain over 3 times more data, increasing our sensitivity even further, and who knows – those WIMPs might just finally show their faces.


Dark Matter: A New Hope

Monday, December 7th, 2015

[Apologies for the title, couldn’t resist the temptation to work in a bit of Star Wars hype]

To call the direct detection of dark matter “difficult” is a monumental understatement. To date, we have had no definite, direct detection on Earth of this elusive particle that we suspect should be all around us. This seems somewhat of a paradox when our best astronomical observations indicate that there’s about five times more dark matter in the universe than the ordinary, visible matter that appears to make up the world we see. So what’s the catch? Why is it so tricky to find?

An enhanced image of the “Bullet Cluster”: two colliding galaxy clusters are observed, with the ordinary “baryonic” matter (coloured red) interacting as expected and the dark matter of each cluster, inferred from gravitational lensing (coloured blue), passing straight through. Source: NASA Astronomy Picture of the Day 24/08/2006

The difficulty lies in the fact that dark matter does not interact with light (that is, electromagnetically) or noticeably with atoms as we know them (that is, with the strong force, which holds together atomic nuclei). In fact, the only reason we know it exists is because of how it interacts gravitationally. Amongst other things, we see galaxies rotate much faster than they would without the presence of some unseen “dark matter”. Unfortunately, none of the particles we know from the Standard Model of particle physics are suitable candidates for explaining dark matter of this sort. There are, however, several attempts in the works to try and detect it via weak nuclear interactions on Earth and pin down its nature, such as the recently approved LUX-ZEPLIN experiment, which should be built and collecting data by 2020.
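As a rough illustration of the rotation-curve argument, here is a minimal sketch comparing the circular speed expected from visible matter alone, v = sqrt(G M / r), with a flat observed rotation speed; the mass, radii and observed speed are round illustrative numbers, not a fit to any real galaxy.

```python
import math

G = 4.3e-6  # gravitational constant in kpc * (km/s)^2 per solar mass

def circular_speed(visible_mass_msun, radius_kpc):
    """Circular speed (km/s) if only the enclosed visible mass were pulling on the star."""
    return math.sqrt(G * visible_mass_msun / radius_kpc)

VISIBLE_MASS = 5e10      # toy galaxy: 5e10 solar masses of visible matter, essentially all enclosed
OBSERVED_SPEED = 220.0   # a typical observed flat rotation speed, km/s

for r in (10, 20, 30, 40):
    print(f"r = {r:2d} kpc: expected {circular_speed(VISIBLE_MASS, r):5.1f} km/s, "
          f"observed ~{OBSERVED_SPEED:.0f} km/s")
```

The visible-matter prediction falls off with radius while observed rotation curves stay roughly flat, which is exactly the mismatch that unseen mass is invoked to explain.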

Direct detection, however, isn’t the only possible way physicists can get a handle on dark matter. In February 2014, an X-ray signal at 3.5 keV was detected by XMM-Newton, an X-ray space observatory operated by the European Space Agency, in orbit around Earth. Ever since, there’s been buzz amongst particle cosmologists that the signal may be from some kind of dark matter decay or annihilation process. One of the strongest candidates to explain the signal has been the sterile neutrino, a hypothetical cousin of the Standard Model neutrino. Neutrinos are ghostly particles that also interact incredibly rarely with ordinary matter* but, thanks to the remarkable work of experimentalists, were first detected in 1956. Their exact nature was later probed by two famous experiments, SNO and Super-Kamiokande, which demonstrated that neutrinos do in fact have mass by observing a phenomenon known as neutrino oscillations. As reported on this blog in October, the respective heads of each collaboration were awarded the 2015 Nobel Prize in Physics for their efforts in this field.

“Handedness” refers to how a particle spins about the axis it travels along. Standard Model neutrinos (first observed in 1956) are all observed as left-handed. Sterile neutrinos, a hypothetical dark matter candidate, would be right-handed, causing them to spin the opposite way along their axes. Image source: ysfine.com

The hope amongst some physicists is that, as well as the neutrinos that have been studied in detectors for the last half-century, there exists a sort of heavier “mirror image” of these particles that could act as a suitable dark matter candidate. Neutrinos are only found to “spin” one way relative to the axis of their propagation, while the hypothesised sterile neutrinos would spin the opposite way round (in more technical terms, they have opposite chirality). This difference might seem trivial, but in the mathematical structure underpinning the Standard Model it would fundamentally change how often these new particles interact with known particles. Although sterile neutrinos are predicted to interact incredibly rarely with ordinary matter, there are processes that could allow them to decay and emit an X-ray signal carrying half the mass-energy of the original particle. Given the sheer number of them expected in dense places such as the centres of galaxies, where XMM-Newton was collecting data, such a signal would in principle be measurable.
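The arithmetic linking the observed line to the particle mass is simple (assuming the minimal two-body decay of a sterile neutrino into a photon and an essentially massless active neutrino, so that each daughter carries half the rest energy):

E_γ ≈ m_s c² / 2  ⇒  m_s ≈ 2 × 3.5 keV/c² = 7 keV/c²,

which is the mass discussed below.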

This all seems well and good, but how well does the evidence measure up? Since the announcement of the signal, the literature has gone back and forth on the issue, with the viability of sterile neutrinos as a dark matter candidate being brought into question. It is thought that the gravitational presence of dark matter played a crucial role in the formation of galaxies in the early universe, and the best description we have relies on dark matter being “cold”, i.e. having a velocity dispersion such that the particles don’t whizz around at speeds close to the speed of light, since otherwise their kinematic properties are difficult to reconcile with cosmological models. However, neutrinos are notorious for having masses so small they have yet to be directly measured, and to explain the signal at 3.5 keV the relevant sterile neutrino would have to have a relatively small mass of ~7 keV/c², more than ten million times lighter than the usual prediction for dark matter at ~100 GeV/c². This means that, at the energies predicted by cosmological models for dark matter production, our sterile neutrinos would be “lukewarm”: they would move around at speeds that are a significant fraction of, though still well below, the speed of light.

A further setback has been that the nature of the signal itself has been called into question, since the resolution of the initial measurements from XMM-Newton (and accompanying X-ray satellite experiments such as Chandra) was not sharp enough to definitively determine the signal’s origin. XMM-Newton built up a profile of X-ray spectra by averaging across measurements from just 73 galaxy clusters, and it will take further measurements to rule out the possibility that the signal is simply from the atomic spectra of potassium and sulphur ions found in hot cosmic plasmas.

But there remains hope.

A recent pre-print submitted to the Monthly Notices of the Royal Astronomical Society (MNRAS) by several leading cosmologists has outlined how a 7 keV/c² sterile neutrino could be compatible with the development of galactic structure. To slow the sterile neutrinos down enough to bring them in line with cosmological observations, a “lepton asymmetry” (a breaking of the symmetry between particles and antiparticles) has to be introduced into the model. While this may initially seem like extra theoretical baggage, since lepton asymmetry has yet to be observed, there are theoretical frameworks that can introduce such an asymmetry with the introduction of two much heavier sterile neutrinos at the GeV scale.

A Black Brant XII sounding rocket, similar to the type that could be used to carry microcalorimeters, capable of recording X-ray signals of the type XMM-Newton and Chandra have been observing in galactic nuclei. These rockets are used to conduct scientific experiments in sub-orbital flight, including attempts at dark matter detection. Source: NASA/Wallops

Under such a model, not only could our dark matter candidate be accommodated, but neutrino oscillations could also be explained. Finally, baryogenesis, the question of why there was slightly more matter than antimatter in the early universe, could also find an explanation in such a theory. This would resolve one of the largest puzzles in physics: the Standard Model predicts nearly equal amounts of particles and antiparticles in the early universe, which should have annihilated to leave nothing but radiation, rather than the rich and exciting universe we inhabit today. On the experimental side, there are a few proposed experiments to measure the X-ray signal more carefully, determine its shape and compare it with the predictions of such models, such as flying sounding rockets carrying calorimeters to observe a broader section of the sky than XMM or Chandra did.

With the experts’ opinions divided and further research yet to be done, it would be premature to end this article with any sort of verdict on whether the signal can or will gather the support of the community and become verified as a full-blown dark matter signal. At the time of writing, a paper has been released claiming the signal is better explained as emission from the plasmas found in galactic nuclei. A further preprint submitted to MNRAS, put on arXiv just days ago, claims the sterile neutrino hypothesis is incompatible with the signal but that axions (a dark matter model that supposes a totally different type of particle outside the Standard Model) remain a candidate to explain it. Perhaps sterile neutrinos are not the particles we’re looking for.

This kind of endeavour is just one of the hundreds of ways particle physicists and our colleagues in astrophysics are looking for evidence of new, fundamental physics. The appeal for me, as someone whose work will probably only have relevance to huge, Earth-bound experiments like the Large Hadron Collider, is the crossover between modelling the birth of colossal objects like galaxies and theories of subatomic particle production, using the comparison between the two as a consistency check. Regardless of whether future rocket-based experiments can gather enough data to fully validate the signal in terms of theories produced by physicists here on Earth, it is a perfect example of the breadth of activity physicists are engaged in as we attempt to answer the big questions, such as the nature of dark matter, through our research.

Kind regards to Piotr Oleśkiewicz (Durham University) for bringing this topic to my attention and for his insights on cosmology, and to Luke Batten (University College London) for a few corrections.

*The oft-quoted fact about neutrinos is that 65 billion solar neutrinos pass through just one square centimetre of area on earth every single second. The vast majority of these neutrinos will whizz straight through you without ever having noticed you were there, but by chance, in your entire lifetime, it’s likely that at least one or two will have the courtesy to notice you and bump off one of your atoms. The other interesting fact is that due to the decay of potassium in your bones, you actually emit about three hundred neutrinos a second.


The Large Hadron Collider is almost done running for 2015.  Proton collisions ended in early November, and now the machine is busy colliding lead nuclei.  As we head towards the end-of-year holidays, and the annual CERN shutdown, everyone wants to know — what have we learned from the LHC this year, our first year of data-taking at 13 TeV, the highest collision energies we have ever achieved, and the highest we might hope to have for years to come?

We will get our first answers to this question at a CERN seminar scheduled for Tuesday, December 15, where ATLAS and CMS will be presenting physics results from this year’s run.  The current situation is reminiscent of December 2011, when the experiments had recorded their first significant datasets from LHC Run 1, and we saw what turned out to be the first hints of the evidence for the Higgs boson that was discovered in 2012.  The experiments showed a few early results from Run 2 during the summer, and some of those have already resulted in journal papers, but this will be our first chance to look at the broad physics program of the experiments.  We shouldn’t have expectations that are too great, as only a small amount of data has been recorded so far, much less than we had in 2012.  But what science might we hope to hear about next week?  

Here is one thing to keep in mind — the change in collision energy affects particle production rates, but not the properties of the particles that are produced.  Any measurement of a particle production rate at a new collision energy is inherently interesting, as is any measurement that has never been done before.  Thus any measurement of a production rate that is possible with this amount of data would be a good candidate for presentation.  (The production rates of top quarks at 13 TeV have already been measured by both CMS and ATLAS; maybe there will be additional measurements along these lines.)

We probably won’t hear anything new about the Higgs boson.  While the Higgs production rates are larger than in the previous run, the amount of data recorded is still relatively small compared to the 2010-12 period.  This year, the LHC has delivered about 4 fb-1 of data, which could be compared to the 5 fb-1 that was delivered in 2011.  At that time there wasn’t enough data to say anything definitive about the Higgs boson, so it is hard to imagine that there will be much in the way of Higgs results from the new data (not even the production rate at 13 TeV), and certainly nothing that would tell us anything more about its properties than we already know from the full Run 1 dataset of 30 fb-1.  We’ll all probably have to wait until sometime next year before we will know more about the Higgs boson, and if anything about it will disagree with what we expect from the standard model of particle physics.

If there is anything to hope for next week, it is some evidence for new, heavy particles.  Because the collision energy has been increased from 8 TeV to 13 TeV, the ability to create a heavy particle of a given mass has increased too.  A little fooling around with the “Collider Reach” tool (which I had discussed here) suggests that even the small amount of data we have in hand now gives us a better chance of observing such particles than the entire Run 1 dataset did, as long as the particle masses are above about 3 TeV.  Of course there are many theories that predict the existence of such particles, the most famous of which is supersymmetry.  But so far there has been scant evidence of any new phenomena in previous datasets.  If we were to get even a hint of something at a very high mass, it would definitely focus our scientific efforts for 2016, when we might get about ten times as much data as we did this year.

Will we get that hint, like we did with the Higgs boson four years ago?  Tune in on December 15 to find out!
