

Frenzy among theorists

Thursday, February 4th, 2016

Since December 15, I have counted 200 new theoretical papers, each one suggesting one or several possible explanations for a new particle not yet discovered. This flurry of activity started when the CMS and ATLAS Collaborations both reported having found a few events that could possibly reveal the presence of a new particle decaying to two photons. Its mass would be around 750 GeV, that is, five times the mass of the Higgs boson.

No one knows yet if all this excitement is warranted, but it clearly illustrates how much physicists are hoping for a huge discovery in the coming years. Will it be like the Higgs boson, which was officially discovered in July 2012 but had already given some faint signs of its presence a year earlier? Right now, there is not enough data to tell. And just as I wrote in July 2011, it is as if we were trying to guess whether the train is coming by looking into the far distance on a grey winter day. Only time will tell if the indistinct shape barely visible above the horizon is the long-awaited train or just an illusion. But until more data become available, everybody will keep their eyes on that spot.

The Noon Train (Le train de midi), Jean-Paul Lemieux, National Gallery of Canada

Due to the difficulties inherent in restarting the LHC at higher energy, the amount of data collected at 13 TeV in 2015 by ATLAS and CMS was very limited. Given that small data samples are always prone to large statistical fluctuations, the experimentalists exercised great caution when they presented these results, clearly stating that any claim would be premature.

But theorists, who have been craving signs of something new for decades, jumped on it. Within a single month, including the end-of-year holiday period, 170 theoretical papers were published, suggesting just as many possible interpretations for this yet undiscovered new particle.

No new data will come for a few more months due to annual maintenance. The Large Hadron Collider is due to restart on March 21 and should deliver the first collisions to the experiments around April 18. The hope is to collect a data sample of 30 fb-1 in 2016, compared with about 4 fb-1 in 2015. Later this summer, when more data are available, we will know whether this new particle exists or not.

This possibility is, however, extremely exciting, since the Standard Model of particle physics is now complete: all expected particles have been found. But since this model leaves many open questions, theorists are convinced that there ought to be a more encompassing theory. Hence, discovering a new particle, or measuring anything with a value different from its predicted one, would reveal at long last what the new physics beyond the Standard Model could be.

No one knows yet what form this new physics will take. This is why so many different theoretical explanations have been proposed for this possible new particle. I have compiled some of them in the table below. Many of these papers simply describe the properties a new boson would need to fit the actual data. The solutions proposed are incredibly diversified, the most recurrent ones being various versions of dark matter or supersymmetric models, new gauge symmetries, Hidden Valley models, Grand Unified Theories, extra or composite Higgs bosons, and extra dimensions. There is enough to suit every taste: axizillas, dilatons, dark pion cousins of a G-parity odd WIMP, one-family walking technipions or trinification.

It is therefore crystal clear: it could be anything or nothing at all… But every time accelerators have gone up in energy, new discoveries have been made. So we could be in for a hot summer.

Pauline Gagnon

To learn more about particle physics, don’t miss my book, which will come out in English in July.

To be alerted of new postings, follow me on Twitter: @GagnonPauline or sign up on this mailing list to receive an e-mail notification.

A partial summary of the number of papers published so far, with the type of solutions they propose to explain the nature of the new particle, if there is one. Just about all known theoretical models can be adapted to produce a new particle with characteristics compatible with the few events observed. This table is indicative only and by no means strictly exact, since many proposals were rather hard to categorize. Will one of these ideas be the right one?


In the late 1980s, as particle colliders probed deeper into the building blocks of nature, there were hints of a strange and paradoxical behaviour in the heart of atoms. Fundamental particles have a curious quantum mechanical property known as “spin”, which the electron carries in magnitude ½. While the description of the electron’s spin is fairly simple, protons are made up of many particles whose “spins” can add together in complicated ways; yet remarkably, the proton’s total spin turns out to be the same as the electron’s: ½. This led to one of the great mysteries of modern physics: how do all the particles inside the proton conspire together to give it a spin of ½? And what might this mean for our understanding of hadrons, the particles that make up most of the visible universe?

[This article is largely intended for a lay audience and contains an introduction to foundational ideas such as spin. If you’ve had a basic introduction to quantum mechanics before, you may wish to skip to the section marked —— ]

We’ve known about the proton’s existence for nearly a hundred years, so you’d be forgiven for thinking that we know all there is to know about it. For many of us, our last exposure to the word “proton” was in high school chemistry, where protons were described as little spheres of positive charge that clump with neutrons to make atomic nuclei, around which negatively charged electrons orbit to create all the atoms, which make up Life, the Universe and Everything [1].

The simple, three-quark model of a proton (each coloured circle is a type of “quark”).

Like many ideas in science, this is a simplified model that serves as a good introduction to a topic, but skips over the gory details and the bizarre, underlying reality of nature. In this article, we’ll focus on one particular aspect, the quantum mechanical “spin” of the proton. The quest to measure its origin has sparked discovery, controversy and speculation that has lasted 30 years, the answer to which is currently being sought at a unique particle collider in New York.

The first thing to note is that protons, unlike electrons [2], are composite particles, made up of lots of other particles. The usual description is that the proton is made up of three smaller “quarks” which, as far as we know, can’t be broken down any further. This picture works remarkably well at low energies, but at very high energies, like those being reached at the LHC, it turns out to be inadequate. At that point, we have to get into the nitty-gritty and consider things like quark-antiquark pairs that live inside the proton, interacting dynamically with other quarks without changing the overall charge. Furthermore, there are particles called gluons that are exchanged between quarks, making them “stick” together in the proton and playing a crucial role in providing an accurate description for particle physics experiments.

So on closer inspection, our little sphere of positive charge turns out to be a buzzing hive of activity, with quarks and gluons all shuffling about, conspiring to create what we call the proton. It is by inferring the nature of these particles within the proton that a successful model of the strong nuclear force, known as Quantum Chromodynamics (QCD), was developed. The gluons were predicted and verified to be the carriers of this force between quarks. More on them later.

A more detailed model of the proton. The golden chains between the quarks (the coloured spheres) are representations of gluons transferred between them. Quark-antiquark pairs are also visible, with arrows representing spins.

That’s the proton, but what exactly is spin? It’s often compared to angular momentum, like objects in our everyday experience might have. Everyone who’s ever messed around on an office chair knows that once you get spun around in one, it takes a bit of effort to stop, because the angular momentum you’ve built up keeps you going. If you did this a lot, you might have noticed that if you started spinning with your legs and arms outstretched and brought them inwards while spinning, you’d begin to spin faster! This is because angular momentum (L) is proportional to the radial (r) distribution of matter (i.e. how far out things are from the axis of rotation) multiplied by the speed of rotation [3] (v). To put it mathematically, L = m × v × r, where m is just your constant mass. Since L is constant, as you decrease r (by bringing your arms and legs inwards), v (the speed at which you’re spinning) increases to compensate. All fairly simple stuff.
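The trade-off between r and v in L = m × v × r can be sketched in a few lines of Python. A minimal illustration of the office-chair example (the numbers here are made up for illustration, not taken from the text):

```python
def speed_after_pulling_in(m, v_initial, r_initial, r_final):
    """Return the new tangential speed after the radius changes,
    assuming angular momentum L = m * v * r is conserved."""
    L = m * v_initial * r_initial       # angular momentum before
    return L / (m * r_final)            # solve L = m * v_new * r_final for v_new

# Arms out at r = 0.8 m, spinning at 1.0 m/s; pull in to r = 0.4 m:
v_new = speed_after_pulling_in(m=70.0, v_initial=1.0, r_initial=0.8, r_final=0.4)
print(v_new)  # 2.0 -- halving r doubles the speed
```

Halving the radius doubles the speed, exactly as on the office chair.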

So clearly, for something to have angular momentum it needs to be distributed radially. Surely r has to be greater than 0 for L to be greater than 0. This is true, but it turns out that’s not all there is to the story. A full description of angular momentum at the quantum (atomic) level is given by something we denote as “J”. I’ll skip the details, but it turns out J = L + S, where L is orbital angular momentum, in a fashion similar to what we’ve discussed, and S? S is a slightly different beast.

Both L and S can only take on discrete values at the microscopic level, that is, they have quantised values. But whereas a point-like particle cannot have L > 0 in its rest frame (since if it isn’t moving around and v = 0, then L = 0), S will have a non-zero value even when the particle isn’t moving. S is what we call Spin. For the electron and quarks, it takes on the value of ½ in natural units.

Spin has a lot of very strange properties. You can think of it like a little arrow pointing in a direction in space but it’s not something we can truly visualise. One is tempted to think of the electron like the Earth, a sphere spinning about some kind of axis, but the electron is not a sphere, it’s a point-like particle with no “structure” in space. While an electron can have many different values of L depending on its energy (and atomic structure depends on these values), it only has one intrinsic magnitude of spin: ½. However, since spin can be thought of as an arrow, we have some flexibility. Loosely speaking, spin can point in many different directions but we’ll consider it as pointing “up” (+½) or “down” (- ½). If we try to measure it along a particular axis, we’re bound to find it in one of these states relative to our direction of measurement.

Focus on one of the red faces. As the cube rotates, every 360 degrees the red ribbon appears to go alternately above and below the cube! Because the cube is coupled to its environment, it takes 720 degrees to return it to its original orientation.


One of the peculiar things about spin-½ is that it causes the wave-function of the electron to exhibit some mind-bending properties. For example, you’d think rotating any object by 360 degrees would put it back into exactly the same state it was in, but it turns out that doesn’t hold true for electrons. Rotating an electron by 360 degrees introduces a negative sign into its wave-function! You have to spin it another 360 degrees to get it back into the same state! There are ways to visualise systems with similar behaviour (see right) but that’s just a sort of “metaphor” for what really happens to the electron. This links into Pauli’s famous conclusion that no two identical particles with spin-½ (or any other half-integer spin) can share the same quantum mechanical state.
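For the mathematically inclined, the sign flip follows from the standard spin-½ rotation operator about the z-axis, which acts on a two-component spinor with the phases e^(∓iθ/2). A small numerical sketch (textbook quantum mechanics, not specific to this article):

```python
import cmath
import math

def rotate_z(state, theta):
    """Rotate a two-component spinor (up, down) by angle theta about z.
    For spin-1/2, R(theta) is diagonal with entries exp(-i*theta/2)
    and exp(+i*theta/2)."""
    up, down = state
    return (cmath.exp(-1j * theta / 2) * up,
            cmath.exp(+1j * theta / 2) * down)

spin_up = (1.0 + 0j, 0j)
once = rotate_z(spin_up, 2 * math.pi)   # one full turn: state -> -state
twice = rotate_z(spin_up, 4 * math.pi)  # two full turns: back to itself
print(once[0])   # approximately -1
print(twice[0])  # approximately +1
```

One full 360-degree turn multiplies the state by −1; only after 720 degrees does it return to itself, mirroring the ribbon-and-cube picture above.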

——

Spin is an important property of matter that only really manifests on the quantum scale, and while we can’t visualise it, it ends up being important for the structure of atoms and for how all solid objects obtain the properties they do. The other important property of spin is that the spin of a free particle likes to align with magnetic fields [4] (and the bigger the spin, the greater the magnetic coupling to the field). Using this property, it was discovered that the proton also has angular momentum J = ½. Since the proton is a stable particle, it was modelled to be in a low energy state with L = 0 and hence J = S = ½ (that is to say, the orbital angular momentum is assumed to be zero, and hence we may simply call J the “spin”). The fact that the proton has spin, and that this spin aligns with magnetic fields, is a crucial part of what makes MRI machines work.

Once we got a firm handle on quarks in the late 1960s, the spin structure of the proton was thought to be fairly simple. The proton has spin-½. Quarks, from scattering experiments and symmetry considerations, were also inferred to have spin-½. Therefore, if the three quarks that make up the proton were in an “up-down-up” configuration, the spin of the proton naturally comes out as ½ – ½ + ½ = ½. Not only does this add up to the measured spin, but it also gives a pleasant symmetry to the quantum description of the proton, consistent with the Pauli exclusion principle (it doesn’t matter which of the three quarks is the “down” quark). But hang on, didn’t I say that the three-quarks story was incomplete? At high energies, there should be a lot more quark-antiquark pairs (sea quarks) involved, messing everything up! Even so, theorists predicted that these quark-antiquark pairs would tend not to be polarised, that is, have a preferred direction, and hence would not contribute to the total spin of the proton.
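The "up-down-up" bookkeeping above is just arithmetic over the quarks' ±½ spin projections. A tiny sketch (purely illustrative: real quark spins combine as quantum states, not as classical arrows):

```python
from fractions import Fraction
from itertools import product

half = Fraction(1, 2)

# All 2^3 ways of orienting three quark spins "up" (+1/2) or "down" (-1/2),
# and the naive total spin projection each combination gives:
totals = {combo: sum(combo) for combo in product((half, -half), repeat=3)}

# The up-down-up configuration reproduces the proton's measured spin:
print(totals[(half, -half, half)])  # 1/2
```

Any configuration with two quarks up and one down gives ½, which is why it does not matter which of the three quarks is the "down" one.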

If you can get the entirety of the proton spinning in a particular direction (i.e. polarising it), it turns out the scattering of an electron against its constituent quarks should be sensitive to their spin! Thus, by scattering electrons at high energy, one could check the predictions of theorists about how the quarks’ spin contributes to the proton.

In a series of perfectly conducted experiments, the theory was found to be absolutely spot on with no discrepancy whatsoever. Several Nobel prizes were handed out and the entire incident was considered resolved, now just a footnote in history. OK, not really.

In truth, the total opposite happened. Although the experiments had a reasonable amount of uncertainty due to the inherent difficulty of polarising protons, a landmark paper by the European Muon Collaboration found results consistent with the quarks contributing absolutely no overall spin to the proton whatsoever! The measurements could be interpreted with the overall spin from the quarks being zero [5]. This was a complete shock to most physicists, who were expecting verification from what was supposed to be a fairly straightforward measurement. Credit where it is due: there were theorists who had pointed out that the assumption about orbital angular momentum (L = 0) was rather ad hoc, and that L > 0 could account for some of the missing spin. Scarcely anyone would have expected, however, that the quarks would carry so little of the spin. Although the strong nuclear force, which governs how quarks and gluons combine to form the proton, has been tested to remarkable accuracy, the nature of its self-interaction makes it incredibly difficult to draw predictions from.

The Feynman diagram for Deep Inelastic Scattering (electron line at the top, proton on the bottom, with a photon exchanged between them). This type of scattering is sensitive to quark spin.

Later experiments (led by the father-and-son rivals Vernon and Emlyn Hughes [6], of CERN and SLAC respectively) managed to bring this to a marginally less shocking conclusion. The greater accuracy of the measurements from these collaborations showed that the total spin contribution from the quarks was actually closer to ~30%. An important discovery was that the sea quarks, thought not to be important, actually have measurable polarisation. Although this cleared up some of the discrepancy, it still left 60-70% of the spin unaccounted for. Today, following much more experimental activity in Deep Inelastic Scattering and precision low-energy elastic scattering, the situation has not changed in terms of the raw numbers: the best estimates still peg the quarks’ spin as constituting only about 30% of the total.

Remarkably, there are theoretical proposals to resolve the problem that were hinted at long before experiments were even conducted. As mentioned previously, although currently impossible to test experimentally, the quarks may carry orbital angular momentum (L) that could compensate for some of the missing spin. Furthermore, we have failed to mention the contribution of gluons to the proton spin. Gluons are spin-1 particles, and were thought to arrange themselves such that their total contribution to the proton spin was nearly non-existent.

The Brookhaven National Laboratory, where RHIC is based (seen as the circle, top right).


The Relativistic Heavy Ion Collider (RHIC) in New York is currently the only spin-polarised proton collider in the world. This gives it a unique sensitivity to the spin structure of the proton. In 2014, an analysis of the data collected at RHIC indicated that the gluons (whose spin contribution can be inferred from polarised proton-proton collisions) could potentially account for up to 30% of the missing 70% of proton spin! That is about the same as the quarks. This would bring the “missing” amount down to about 40%, which could be accounted for by the unmeasurable orbital angular momentum of both quarks and gluons.
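The running totals above can be summarised as a simple budget (the percentages are the rough figures quoted in this article, not precise measurements):

```python
# Rough bookkeeping of the proton spin budget described in the text:
budget = {
    "quark spin": 30,  # ~30%, from polarised deep inelastic scattering
    "gluon spin": 30,  # up to ~30%, inferred from the 2014 RHIC analysis
}
unaccounted = 100 - sum(budget.values())
print(unaccounted)  # 40 -- left for the (unmeasured) orbital angular momentum
```

The remaining ~40% is the share attributed to the orbital angular momentum of quarks and gluons, which cannot currently be measured directly.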

As 2016 kicks into gear, RHIC will be collecting data at a much faster rate than ever, after a recent technical upgrade that should double its luminosity (loosely speaking, the rate at which proton collisions occur). With the increased statistics, we should be able to get an even greater handle on the exact origin of proton spin.


The astute reader, provided they have not already wandered off, dizzy from all this talk of spinning protons, may be tempted to ask “Why on earth does it matter where the total spin comes from? Isn’t this just abstract accountancy?” This is a fair question and I think the answer is a good one. Protons, like all other hadrons (similar, composite particles made of quarks and gluons) are not very well understood at all. A peculiar feature of QCD called confinement binds individual quarks together so that they are never observed in isolation, only bound up in particles such as the proton. Understanding the spin structure of the proton can inform our theoretical models for understanding this phenomenon.

This has important implications, one being that 98% of the mass of all visible matter does not come from the Higgs boson: it comes from the binding energy of protons! And the exact nature of confinement and the precise properties of QCD have implications for the cosmology of the early universe. Finally, scattering experiments with protons have already revealed so much to fundamental physics, such as our understanding of one of the fundamental forces of nature. As one of our most reliable probes of nature, currently in use at the LHC, understanding protons better will almost certainly aid our attempts to unearth future discoveries.

With thanks to Sebastian Bending (UCL) for several suggestions (all mistakes are unreservedly my own).

 

[1] …excluding dark matter and dark energy which constitute the dark ~95% of the universe.

[2] To the best of our knowledge.

[3] Strictly speaking the component of velocity perpendicular to the radial direction.

[4] Sometimes, spins in a medium like water like to align against magnetic fields, causing an opposite magnetic moment (known as diamagnetism). Since frogs are mostly water, this effect can and has been used to levitate frogs.

[5] A lot of the information here has been summarised from this excellent article by Robert Jaffe, whose collaboration with John Ellis on the Ellis-Jaffe rule led to many of the predictions discussed here.

[6] Emlyn was actually the spokesperson for SLAC, though he is listed as one of the primary authors on the SLAC papers regarding the spin structure of the proton.


After a long-anticipated data run, LHC proton-proton running concludes in early November.  A mere six weeks later, on a mid-December afternoon, the ATLAS and CMS collaborations present their first results from the full dataset to a packed CERN auditorium, with people all over the world watching the live webcast.  Both collaborations see slight excesses in events with photon pairs; the CMS excess is quite modest, but the ATLAS data show something that could be interpreted as a peak.  If it holds up with additional data, it would herald a major discovery.  While the experimenters caution that the results do not have much statistical significance, news outlets around the world run breathless stories about the possible discovery of a new particle.

December 15, 2015? No — December 13, 2011, four years ago.  That seminar presented what we now know were the first hints of the Higgs boson in the LHC data.  At the time, everyone was hedging their bets, and saying that the effects we were seeing could easily go away with more data.  Yet now we look back and know that it was the beginning of the end for the Higgs search.  And even at the time, everyone was feeling pretty optimistic.  Yes, we had seen effects of that size go away before, but at this time four years ago, a lot of people were guessing that this one wouldn’t (while still giving all of the caveats).

But while both experiments are reporting an effect at 750 GeV — and some people are getting very excited about it — it seems to me that more caution is needed here than with the emerging evidence for the Higgs boson.  What’s different about what we’re seeing now compared to what we saw in 2011?

I found it instructive to look back at the presentations of four years ago.  Then, ATLAS had an effect in diphotons around an invariant mass of 125 GeV with a 2.8 standard deviation local significance, which was reduced to 1.5 standard deviations when the “look elsewhere effect” (LEE) was taken into account.  (The LEE exists because if there is a random fluctuation in the data, it might appear anywhere, not just the place you happen to be looking, and the statistical significance needs to be de-weighted for that.)  In CMS, the local significance was 2.1 standard deviations.  Let’s compare that to this year, when both experiments see an effect in diphotons around an invariant mass of 750 GeV.  At ATLAS, it’s a 3.6 standard deviation local effect, reduced to 2.0 standard deviations after the LEE.  For CMS the respective values are 2.6 and 1.2 standard deviations.  So the 2015 signals sound even stronger than the 2011 ones, although, on their own, they are still quite weak: five standard deviations is the usual standard to claim a discovery, because a background fluctuation of that size would be very unlikely.
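For readers curious how standard deviations translate into probabilities, the one-sided Gaussian tail probability can be computed directly (a standard statistics formula; the Gaussian assumption mirrors how local significances are usually quoted):

```python
import math

def p_value(sigma):
    """One-sided Gaussian tail probability for a significance of
    `sigma` standard deviations."""
    return 0.5 * math.erfc(sigma / math.sqrt(2))

for sigma in (2.0, 3.6, 5.0):
    print(f"{sigma} sigma -> p = {p_value(sigma):.2e}")
# 5 sigma corresponds to p of roughly 2.9e-07, which is why it is the
# conventional threshold for claiming a discovery.
```

Note that these are local p-values; the look-elsewhere effect discussed above further de-weights the significance.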

But the 2011 signals had some other things going for them.  The first were experimental.  There were simultaneous excesses in other channels that were consistent with what you’d expect from a Higgs decay.  This included in particular the ZZ channel, which had a low expected rate, but also very low backgrounds and excellent mass resolution.  In 2011, both experiments had the beginning of signals in ZZ too (although at a slightly different putative Higgs mass value) and some early hints in other decay channels.  There were multiple results supporting the diphotons, whereas in 2015 there are no apparent excesses in other channels indicating anything at 750 GeV.

And on top of that, there was something else going for the Higgs in December 2011: there was good reason to believe it was on the way.  From a myriad of other experiments we had indirect evidence that a Higgs boson ought to exist, and in a mass range where the LHC effects were showing up.  This indirect evidence came through the interpretation of the “standard model” theory that had done an excellent job of describing all other data in particle physics and thus gave us confidence that it could make predictions about the Higgs too.  And for years, both the Tevatron and the LHC had been slowly but surely directly excluding other possible masses for the Higgs.  If a Higgs were going to show up, it made perfect sense for it to happen right where the early effects were being observed, at just that level of significance with so little data.

Do we have any of that with the 750 GeV effect in 2015?  No.  There are no particular reasons to expect this decay with this rate at this mass (although in the wake of last week’s presentations, there have been many conjectures as to what kind of new physics could make this happen).  Thus, one can’t help but think that this is some kind of fluctuation.  If you look at enough possible new-physics effects, you have a decent chance of seeing some number of fluctuations at this level, and that seems to be the most reasonable hypothesis right now.

But really there is no need to speculate.  In 2016, the LHC should deliver ten times as much data as it did this year.  That’s even better than what happened in 2012, when the LHC exceeded its 2011 performance by a mere factor of five.  We can anticipate another set of presentations in December 2016, and by then we will know for sure if 2015 gave us a fluctuation or the first hint of a new physical theory that will set the research agenda of particle physics for years to come.  And if it is the latter, I will be the first to admit that I got it wrong.

Share

If, and really only if…

Wednesday, December 16th, 2015

If the LHC were a ladder and the long-sought new particles were boxes hidden on the highest shelves, raising the LHC’s operating energy amounts to getting a longer ladder that reaches the top shelves. At the end of 2012 the ladders were shorter, but we had ten times more of them, making it easier to explore the shelves within our reach. ATLAS and CMS have just taken a first look at a place never explored before, but will need more data to inspect it thoroughly.

On December 15, at the end-of-year seminar, the CMS and ATLAS experiments at CERN presented their first results based on the brand-new data accumulated in 2015 since the restart of the Large Hadron Collider (LHC) at 13 TeV, the highest operating energy ever reached. Although the dataset is only one tenth the size of what was collected at lower energy (namely 4 fb-1 for ATLAS and 2.8 fb-1 for CMS at 13 TeV, compared with 25 fb-1 at 8 TeV for each experiment), the increase in energy now puts more massive hypothetical particles within the experiments’ reach.

Both experiments first showed how their detectors performed after several major upgrades, including taking data at twice the rate used in 2012. Both groups checked from every angle how already-known particles behave at higher energy, finding no anomalies. But it is in the search for new, heavier particles that all hopes lie. The two groups explored dozens of different possibilities, sifting through billions of events.

Each event is a snapshot of what happened when two protons collided in the LHC. The energy released by the collision materializes as heavy, unstable particles that decay immediately, producing mini fireworks. By catching, identifying and regrouping all the particles escaping from the collision point, one can reconstruct the original particles that were produced.

Both CMS and ATLAS found small excesses when selecting events containing two photons. In several of these events, the two photons seem to come from the decay of a particle with a mass of about 750 GeV, that is, 750 times heavier than a proton or six times the mass of a Higgs boson. Since the two experiments looked at a multitude of different combinations, each time checking dozens of mass values per combination, such statistical fluctuations are always expected.

ATLAS-diphoton

Top: the combined mass, in GeV, of all pairs of photons found in the 13 TeV data collected by ATLAS. The red curve shows what is expected from random sources (commonly called background). The black dots are the data and the lines the experimental errors. The small bump at 750 GeV is what is now intriguing. The bottom panel shows the difference between the black dots (data) and the red curve (background), clearly showing a small excess of 3.6σ, i.e. 3.6 times the experimental error. When all possible fluctuations at all the mass values considered are taken into account, the excess is only 2.0σ.

What is intriguing is that the two teams found the same thing in exactly the same place, without consulting each other and using selection techniques designed not to bias the data. Nevertheless, both experimental groups are being extremely cautious, stating that a statistical fluctuation remains possible until more data become available to check everything with greater precision.

CMS-combined-p0

CMS has slightly less data than ATLAS at 13 TeV and consequently sees a much smaller effect. In their 13 TeV data alone, the excess at 760 GeV is 2.6σ, or 3.0σ when combined with the 8 TeV data. But instead of evaluating this probability only locally, physicists prefer to take into account the fluctuations at all the mass values considered. The probability then drops to only 1.2σ, nothing to write home about. This is the “look-elsewhere effect”: it accounts for the fact that one always ends up finding a fluctuation somewhere when looking in so many places.

Theorists are showing far less restraint. For decades, we have known that the Standard Model, the current theoretical model of particle physics, does not explain everything, yet no way forward has been found. Everyone is therefore hoping for a clue from experimental data. Many theorists must have worked hard all night, since eight new papers appeared this very morning, proposing various explanations for the possible nature of the new particle, if there is a new particle at all. Some think it could be a particle related to dark matter, others lean toward another type of Higgs boson as predicted by Supersymmetry, or even see the first signs of extra dimensions. Still others propose that such an effect could only occur if this particle comes with a second, heavier one. Everyone is suggesting something beyond the Standard Model.

Two things are certain: first, the number of theory papers in the coming weeks will explode. And second, we will not be able to tell whether there is a new particle without more data. With a little luck, we could know more as early as next summer, once the LHC has produced more data. Until then, it is all pure speculation.

That said, we should not forget that the Higgs boson made its appearance in a very similar way. The first signs of its existence were already visible in July 2011. With more data, those signs had strengthened by December 2011 at another end-of-year seminar, but its discovery could only be established once even more data had been collected and analysed, in July 2012. Opening one’s presents before Christmas is never a good idea.

Happy Holidays, Pauline Gagnon

To learn more about particle physics and what is at stake at the LHC, see my book: « Qu’est-ce que le boson de Higgs mange en hiver et autres détails essentiels ». To be notified of new postings, follow me on Twitter: @GagnonPauline or add your name to this mailing list to receive an e-mail notification.

Share

If, and really only if…

Wednesday, December 16th, 2015

If the LHC were a ladder and the sought-after new particles were boxes hidden on the top shelves, operating the LHC at higher energy is like having a longer ladder giving us access to the higher shelves. By the end of 2012, our ladders were shorter, but we had ten times more of them, making it easier to explore the shelves within reach. ATLAS and CMS have just had their first glimpse of a place never reached before, but more data are still needed to explore this space thoroughly.

On December 15, at the End-of-the-Year seminar, the CMS and ATLAS experiments at CERN presented their first results using the brand new data accumulated in 2015 since the restart of the Large Hadron Collider (LHC) at 13 TeV, the highest operating energy so far. Although the size of the data sample is still only one tenth of what was available at lower energy (namely 4 fb-1 for ATLAS and 2.8 fb-1 for CMS collected at 13 TeV, compared to 25 fb-1 at 8 TeV for each experiment), the higher energy puts more massive hypothetical particles within reach.

Both experiments showed how well their detectors performed after several major improvements, including collecting data at twice the rate used in 2012. The two groups made several checks on how known particles behave at higher energy, finding no anomalies. But it is in searches for new, heavier particles that everyone hopes to see something exciting. Both groups explored dozens of different possibilities, sifting through billions of events.

Each event is a snapshot of what happens when two protons collide in the LHC. The energy released by the collision materializes into some heavy and unstable particle that breaks apart mere instants later, giving rise to a mini firework. By catching, identifying and regrouping all particles that fly apart from the collision point, one can reconstruct the original particles that were produced.

Both CMS and ATLAS found small excesses when selecting events containing two photons. In several events, the two photons seem to come from the decay of a particle having a mass around 750 GeV, that is, 750 times heavier than a proton or 6 times the mass of a Higgs boson. Since the two experiments looked at dozens of different combinations, checking dozens of mass values for each combination, such small statistical fluctuations are always expected.

ATLAS-diphoton

Top part: the combined mass given in units of GeV for all pairs of photons found in the 13 TeV data by ATLAS. The red curve shows what is expected from random sources (i.e. the background). The black dots correspond to data and the lines, the experimental errors. The small bump at 750 GeV is what is now intriguing. The bottom plot shows the difference between black dots (data) and red curve (background), clearly showing a small excess of 3.6σ or 3.6 times the experimental error. When one takes into account all possible fluctuations at all mass values, the significance is only 2.0σ
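For context on how the “combined mass” in this plot is formed: each photon pair’s invariant mass is computed from the two photons’ measured energies and directions. A minimal sketch (the kinematic values below are illustrative, not ATLAS data):

```python
import math

def diphoton_mass(e1, eta1, phi1, e2, eta2, phi2):
    """Invariant mass (GeV) of two massless photons, given their
    energies (GeV) and directions (pseudorapidity eta, azimuth phi)."""
    def four_vec(e, eta, phi):
        # Massless particle: |p| = E, split into transverse and longitudinal parts.
        pt = e / math.cosh(eta)
        return (e, pt * math.cos(phi), pt * math.sin(phi), pt * math.sinh(eta))
    E, px, py, pz = [a + b for a, b in zip(four_vec(e1, eta1, phi1),
                                           four_vec(e2, eta2, phi2))]
    return math.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))

# Two back-to-back 375 GeV photons reconstruct to a 750 GeV parent:
print(round(diphoton_mass(375, 0.0, 0.0, 375, 0.0, math.pi), 1))  # → 750.0
```

The histogram above is just this quantity computed for every selected photon pair in the dataset.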

What’s intriguing here is that both groups found the same thing at exactly the same place, without having consulted each other and using selection techniques designed not to bias the data. Nevertheless, both experimental groups are extremely cautious, stating that a statistical fluctuation is always possible until more data is available to check this with increased accuracy.

CMS-combined-p0

CMS has slightly less data than ATLAS at 13 TeV and hence sees a much smaller effect. In their 13 TeV data alone, the excess at 760 GeV is about 2.6σ, or 3σ when combined with the 8 TeV data. But instead of evaluating this probability alone, experimentalists prefer to take into account the fluctuations in all the mass bins considered. The significance then drops to only 1.2σ, nothing to write home about. This “look-elsewhere effect” takes into account that one is bound to see a fluctuation somewhere when one looks in so many places.
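The look-elsewhere effect lends itself to a toy simulation: generate background-only pseudo-experiments, scan many independent mass bins, and record the largest “local significance” found anywhere. The bin and trial counts below are arbitrary illustrations, not the experiments’ actual search configuration:

```python
import random
import statistics

def max_local_significance(n_bins, trials, seed=1):
    """Background-only toy: treat each mass bin's fluctuation as an
    independent standard-normal 'local significance' and return the
    average of the maximum over all bins, across pseudo-experiments."""
    rng = random.Random(seed)
    maxima = [max(rng.gauss(0, 1) for _ in range(n_bins))
              for _ in range(trials)]
    return statistics.mean(maxima)

# Scanning ~50 independent mass bins typically turns up a ~2σ local
# excess somewhere, even though no signal was put in:
print(round(max_local_significance(50, 2000), 1))
```

This is why a 2.6σ local excess deflates to 1.2σ once the whole scanned mass range is accounted for.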

Theorists show less restraint. For decades, they have known that the Standard Model, the current theoretical model of particle physics, is flawed, and have been looking for a clue from experimental data to go further. Many of them have been hard at work all night: eight new papers appeared this morning, proposing different explanations for what the new particle could be, if anything proves to be there at all. Some think it could be a particle related to Dark Matter, others think it could be another type of Higgs boson predicted by Supersymmetry, or even a sign of extra dimensions. Others suggest that such a signal could only come from a second, heavier particle. All propose something beyond the Standard Model.

Two things are sure: first, the number of theoretical papers in the coming weeks will explode. Second, establishing the discovery of a new particle will require more data. With some luck, we could know more by next summer, after the LHC delivers more data. Until then, it remains pure speculation.

This being said, let’s not forget that the Higgs boson made its entry in a similar fashion. The first signs of its existence appeared in July 2011. With more data, they became clearer in December 2011 at a similar End-of-the-Year seminar. But it was only once enough data had been collected and analysed, in July 2012, that its discovery was established beyond doubt. Opening one’s gifts before Christmas is never a good idea.

Have a good Holiday Season, Pauline Gagnon

To learn more about particle physics and what might be discovered at the LHC, don’t miss my upcoming book: « Who cares about particle physics: Making sense of the Higgs boson, Large Hadron Collider and CERN ». To be alerted of new postings, follow me on Twitter: @GagnonPauline or sign up on this mailing list to receive an e-mail notification.

Share

It’s amazing that so much hard work (and such high levels of stress) can be condensed down so much… 5 pages, 3 plots and a table – and the new world-leading limit on the WIMP-nucleon spin-independent elastic scattering cross section, of course.
Yes, the LUX Run 3 reanalysis results are finally out. It’s been in the works for over a year, and it has been a genuinely wonderful experience to watch this paper grow – and seeing my own plot in there has felt like sending forth a child into the world!
As much as I worked to improve our signal efficiency at low energies, the real star of the LUX reanalysis show was the “D-D” calibration – D-D standing for deuterium-deuterium. We calibrated the detector’s response to nuclear recoils (which we expect WIMP dark matter to cause) with something that sounds like it is out of science fiction, a D-D generator. This generator uses the fusion of deuterium (think heavy hydrogen – one proton, one neutron) to generate neutrons that are focussed into a beam and sent into the detector.

Quick LUX 101 – LUX is a dark matter search experiment. Dark matter is that mysterious dark, massive substance that makes up 27% of our universe. How does LUX look for dark matter? Well, it is a ‘dual-phase xenon TPC’ detector, and it lives 4850 feet underground at the Sanford Underground Research Facility. It must be underground to shield it from as much cosmic radiation as possible, as it is looking for a very rare, weakly interacting dark matter particle called a WIMP.

LUX is basically a big tank of liquid xenon, with a gas layer on top. It is sensitive to particles that enter this xenon – photons and electrons cause what we call an electron recoil (think of them bouncing off an atomic electron) whilst neutrons cause a nuclear recoil (bouncing off a xenon nucleus). We expect that WIMPs will interact with the atomic nuclei too, just incredibly rarely – so understanding the detector response to these nuclear recoils is of utmost importance.

Both electron recoils and nuclear recoils inside the liquid xenon cause a flash of light, a signal we call “S1”, the scintillation signal. Any light in LUX is picked up by two arrays of photomultiplier tubes, 122 in total. Recoils can also cause ionisation: electrons are ‘knocked off’ their atoms by the collision. If you place an electric field over the xenon volume, you can push these electrons along instead of letting them recombine with their atoms. In LUX, the electrons are pushed all the way to the top, into the gaseous xenon layer. There they cause a second flash of light via scintillation in the gas, “S2”, the ionisation signal (as its source is the ionised electrons).

Two signals mean two things: first, discrimination between electron recoils (background) and nuclear recoils (possible dark matter signal!) thanks to the differing distribution of energy between S1 and S2 for each type of recoil; and second, 3D position reconstruction.
XY coordinates can be determined by looking at which photomultiplier tubes light up, whilst the time between the S1 and S2 tells us the depth of the interaction. This XYZ position is very important; we use the xenon to shield itself from radiation from the detector materials themselves and from the surrounding rock. Because we have the 3D position of all our events, we can look only in the very inner region of the detector, where it is very quiet, for those rare dark matter interactions.
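The XY-from-PMTs and depth-from-drift-time reconstruction described above can be sketched as follows; the drift speed and the four-PMT layout are illustrative placeholders, not LUX’s actual values:

```python
def reconstruct_position(pmt_xy, s2_hits, t_s1_us, t_s2_us,
                         drift_speed_mm_per_us=1.5):
    """Toy 3D position reconstruction for a dual-phase xenon TPC.
    XY: light-weighted centroid of the S2 signal over the top PMT array.
    Z:  drift time between S1 and S2, times an assumed drift speed."""
    total = sum(s2_hits)
    x = sum(px * h for (px, _), h in zip(pmt_xy, s2_hits)) / total
    y = sum(py * h for (_, py), h in zip(pmt_xy, s2_hits)) / total
    depth = (t_s2_us - t_s1_us) * drift_speed_mm_per_us
    return x, y, depth

# Four top PMTs at the corners of a 100 mm square; the brightest PMT
# pulls the centroid toward its corner:
pmts = [(-50, -50), (-50, 50), (50, -50), (50, 50)]
print(reconstruct_position(pmts, [10, 10, 10, 70], 0, 100))
# → (30.0, 30.0, 150.0): near the bright corner, 150 mm below the surface
```

Real detectors fit the full PMT light pattern rather than taking a simple centroid, but the principle is the same.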

Schematic of the LUX detector

Schematic of the LUX detector. On the left, it is demonstrated how the S1 and S2 signals can provide 3D position reconstruction. The right shows the inside of the detector, and the position of the photomultiplier tubes that collect light emitted by the scintillation of xenon.


Back to the deuterium-deuterium fusion neutron gun – it’s actually a wonderfully simple but extremely clever idea. We fire a beam of neutrons into our detector, all at the same energy (monoenergetic), at a set position. We then select events in our data along that beam, and look for those neutrons that scattered a second time in the detector. Because of that XYZ position reconstruction, if we have signals from two different scatters, we can determine the angle of scattering. As the initial energy is known, the energy of the recoil can then be calculated via simple kinematics. Matching the recoil energy with the size of the two signals allows us to calibrate the nuclear recoil response of the detector extremely well. The light yield (in S1), tougher to measure than the charge yield (in S2) as we are talking about individual photons, was measured down to 1.1 keV. (keV are kiloelectronvolts, or 1000 times the energy of a single electron moved across a potential difference of 1 V. In other words, a tiny quantity: 1 keV is only 1.6×10-16 joules!) The charge yield was measured below 1 keV. In the previous LUX results, we had assumed a conservative hard cut-off – i.e. we assumed we would measure no light for recoils below 3 keV. Now we know that isn’t the case, and we can extend our sensitivity to lower energies – which correspond to lighter WIMPs.
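The “simple kinematics” behind this calibration can be sketched numerically. The formula below is the textbook two-body elastic-scattering relation, not LUX’s analysis code; 2.45 MeV is the standard D-D fusion neutron energy:

```python
import math

def nuclear_recoil_keV(e_n_keV, theta_deg, a_nucleus=131, a_neutron=1):
    """Recoil energy for elastic neutron-nucleus scattering:
    E_R = E_n * 4*m*M/(m+M)^2 * sin^2(theta/2),
    with theta the centre-of-mass scattering angle (close to the lab
    angle for a nucleus as heavy as xenon). Masses in atomic units."""
    m, M = a_neutron, a_nucleus
    kinematic_factor = 4 * m * M / (m + M) ** 2
    return e_n_keV * kinematic_factor * math.sin(math.radians(theta_deg) / 2) ** 2

# 2450 keV D-D neutrons on xenon give recoils from ~0 keV (glancing)
# up to the full backscatter maximum:
print(round(nuclear_recoil_keV(2450, 180), 1))  # → 73.7
```

So measuring the angle between the two scatter vertices pins down the recoil energy of the first scatter, which is then matched against the observed S1 and S2 sizes.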

Screenshot 2015-12-14 20.28.48

Upper limits on the spin-independent elastic WIMP-nucleon cross section at 90% CL. Observed limit in black, with the 1- and 2-σ ranges of background-only trials shaded green and yellow.

This improvement in low energy calibration, as well as a more streamlined and improved analysis framework, has led to a huge improvement in LUX’s low WIMP mass limit. In the plot above, which shows the WIMP mass against the probability of interaction with a nucleus, everything above the black line is now ruled out. If you’ve been following the WIMP dark matter saga, you will know that a few experiments were claiming hints of signals in that low mass region, but this new result definitely lays those signals to rest.

Getting a paper ready for publication has turned out to be far harder work than I expected. It requires a lot of teamwork, perseverance, brain power and very importantly, the ability to take on criticism and use it to improve. I must have remade the LUX efficiency plot over 100 times, and a fair few of those times it was because someone didn’t quite like the way I’d formatted it. In a collaboration, you have to be willing to learn from others and compromise. In the last few days before we finished I did not benefit at all from my UK time zone, as I stayed up later and later to finish things off. But – it was worth it! Now, if I search for my name on arXiv, I come up 4 times (3 LUX papers and the LZ Conceptual Design Report). As pathetic as it sounds, this is actually quite exciting for me, and is what I hope to be the foundations of a long career in physics.

The new LUX results are obviously nowhere near as exciting as an actual WIMP discovery, but they’re another step on the way there. LUX Run 4 is in full swing, in which we will obtain over 3 times more data, increasing our sensitivity even further – and who knows, those WIMPs might just finally show their faces.

Share

Dark Matter: A New Hope

Monday, December 7th, 2015

[Apologies for the title, couldn’t resist the temptation to work in a bit of Star Wars hype]

To call the direct detection of dark matter “difficult” is a monumental understatement. To date, we have had no definite, direct detection on Earth of this elusive particle that we suspect should be all around us. This seems somewhat of a paradox when our best astronomical observations indicate that there’s about five times more dark matter in the universe than the ordinary, visible matter that appears to make up the world we see. So what’s the catch? Why is it so tricky to find?

An enhanced image of the “Bullet Cluster”: two colliding galaxy clusters are observed, with the ordinary “baryonic” matter (coloured red) interacting as expected and the dark matter from each cluster, inferred from gravitational lensing (coloured blue), passing straight through. Source: NASA Astronomy Picture of the Day 24/08/2006

The difficulty lies in the fact that dark matter does not interact with light (that is, electromagnetically) or noticeably with atoms as we know them (that is, with the strong force, which holds together atomic nuclei). In fact, the only reason we know it exists is because of how it interacts gravitationally. We see galaxies rotate much faster than they would without the presence of some unseen “dark matter”, amongst other things. Unfortunately, none of the particles we know from the Standard Model of particle physics are suitable candidates for explaining dark matter of this sort. There are, however, several attempts in the works to try and detect it via weak nuclear interactions on Earth and pin down its nature, such as the recently approved LUX-ZEPLIN experiment, which should be built and collecting data by 2020.

Direct detection, however, isn’t the only possible way physicists can get a handle on dark matter. In February 2014, an X-ray signal at 3.5 keV was detected by XMM-Newton, an X-ray observatory operated by the European Space Agency in orbit around Earth. Ever since, there’s been buzz amongst particle cosmologists that the signal may come from some kind of dark matter annihilation or decay process. One of the strongest candidates to explain the signal has been the sterile neutrino, a hypothetical cousin of the Standard Model neutrino. Neutrinos are ghostly particles that also interact incredibly rarely with ordinary matter* but, thanks to the remarkable work of experimentalists, were detected in the late 1950s. Their exact nature was later probed by two famous experiments, SNO and Super-Kamiokande, which demonstrated that neutrinos do in fact have mass, by observing a phenomenon known as neutrino oscillations. As reported on this blog in October, the respective heads of each collaboration were awarded the 2015 Nobel Prize in Physics for their efforts in this field.

“Handedness” refers to how a particle spins about the axis it travels along. Standard Model neutrinos (first observed in 1956) are all observed as left handed. Sterile neutrinos, a hypothetical dark matter candidate, would be right-handed, causing them to spin the opposite way along their axes. Image source: ysfine.com

The hope amongst some physicists is that as well as the neutrinos studied in detectors for the last half a century, there exists a sort of heavier “mirror image” of these particles that could act as a suitable dark matter candidate. Ordinary neutrinos are only found to “spin” one way relative to the axis of their propagation, while the hypothesised sterile neutrinos would spin the opposite way round (in more technical terms, they have opposite chirality). This difference might seem trivial, but in the mathematical structure underpinning the Standard Model, it fundamentally changes how often these new particles interact with known particles. Although predicted to interact incredibly rarely with ordinary matter, there are processes that would allow a sterile neutrino to decay and emit an X-ray photon carrying half the mass-energy of the original particle. Given the sheer number of them expected in dense places such as the centres of galaxies, from which XMM-Newton was collecting data, such a signal would in principle be measurable.

This all seems well and good, but how well does the evidence measure up? Since the announcement of the signal, the literature has gone back and forth on the issue, with the viability of sterile neutrinos as a dark matter candidate being brought into question. It is thought that the gravitational presence of dark matter played a crucial role in the formation of galaxies in the early universe, and the best description we have relies on dark matter being “cold”, i.e. with a velocity dispersion such that the particles don’t whizz around at speeds too close to the speed of light, at which point their kinematic properties become difficult to reconcile with cosmological models. However, neutrinos are notorious for having masses so small they have yet to be directly measured, and to explain the signal at 3.5 keV, the relevant sterile neutrino would have to have a relatively small mass of ~7 keV/c2, over ten million times lighter than the usual ~100 GeV/c2 prediction for dark matter. This means that under the energy predicted by cosmological models for dark matter production, our sterile neutrinos would be “luke-warm”, moving at speeds comparable to but not approaching the speed of light.
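A quick check of the numbers, assuming the decay of a ~7 keV/c2 sterile neutrino into a nearly massless active neutrino plus a photon (so the rest energy is shared evenly):

```python
# Two-body decay of a sterile neutrino into a (nearly massless)
# neutrino and a photon: the photon line sits at half the parent mass.
m_sterile_keV = 7.0
e_photon_keV = m_sterile_keV / 2
print(e_photon_keV)  # → 3.5, matching the observed X-ray line

# Mass ratio to a canonical ~100 GeV/c^2 WIMP (1 GeV = 1e6 keV):
m_wimp_keV = 100 * 1e6
ratio = m_wimp_keV / m_sterile_keV
print(f"{ratio:.1e}")  # → 1.4e+07, i.e. tens of millions of times lighter
```

That enormous mass gap is why a 7 keV candidate ends up “warmer” than standard cold dark matter scenarios assume.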

A further setback has been that the nature of the signal itself has been called into question, since the resolution of the initial measurements from XMM-Newton (and accompanying X-ray satellite experiments such as Chandra) was not sharp enough to definitively determine the signal’s origin. XMM-Newton built up a profile of X-ray spectra by averaging across measurements from 73 galaxy clusters, and it will take further measurements to fully rule out the possibility that the signal comes from the atomic spectra of potassium and sulphur ions found in hot cosmic plasmas.

But there remains hope.

A recent pre-print submitted to the Monthly Notices of the Royal Astronomical Society (MNRAS) by several leading cosmologists has outlined the compatibility of a 7 keV/c2 sterile neutrino with the development of galactic structure. To slow down the sterile neutrinos enough to bring them in line with cosmological observations, a “lepton asymmetry” (a breaking of the symmetry between particles and antiparticles) has to be introduced in the model. While this may initially seem like extra theoretical baggage, since lepton asymmetry has yet to be observed, there are theoretical frameworks that can introduce such an asymmetry with the addition of two much heavier sterile neutrinos at the GeV scale.

A Black Brant XII sounding rocket, similar to the type that could be used to carry microcalorimeters, capable of recording X-ray signals of the type XMM-Newton and Chandra have been observing in galactic nuclei. These rockets are used to conduct scientific experiments in sub-orbital flight, including attempts at dark matter detection. Source: NASA/Wallops

Under such a model, not only could our dark matter candidate be reconciled, but neutrino oscillations could also be explained. Finally, baryogenesis, the description of why there was slightly more matter than antimatter in the early universe, could also find an explanation in such a theory. This would resolve one of the largest puzzles in Physics; the Standard Model predicts nearly equivalent amounts of particles and antiparticles in the early universe which should have annihilated to leave nothing but radiation, rather than the rich and exciting universe we inhabit today. On the experimental side, there are a few proposed experiments to try and measure the X-ray signal more carefully to determine its shape and compare it with the prediction of such models, such as flying rockets around with calorimeters inside to try and pick up the signal by observing a broader section of the sky than XMM or Chandra did.

With the experts’ opinions divided and further research yet to be done, it would be premature to end this article with any sort of verdict on whether the signal can or will gather the support of the community and become verified as a fully fledged dark matter signal. At the time of writing, a paper has been released claiming the signal is better explained as an emission from the plasmas found in galactic nuclei. A further preprint to MNRAS, put on arXiv just days ago, claims the sterile neutrino hypothesis is incompatible with the signal but that axions (a dark matter model that supposes a totally different type of particle outside the Standard Model) remain a candidate to explain it. Perhaps sterile neutrinos are not the particles we’re looking for.

This kind of endeavour is just one of the hundreds of ways particle physicists and our colleagues in astrophysics are looking for evidence of new, fundamental physics. The appeal for me, as someone whose work will probably only have relevance to huge, Earth-bound experiments like the Large Hadron Collider, is the crossover between modelling the birth of colossal objects like galaxies and theories of subatomic particle production, using comparison between the two as a consistency check. Regardless of whether future rocket-based experiments can gather enough data to fully validate the signal in terms of theories produced by physicists here on Earth, it is a perfect example of the breadth of activity physicists are engaged in as we attempt to answer big questions, such as the nature of dark matter, through our research.

Kind regards to Piotr Oleśkiewicz (Durham University) for bringing this topic to my attention and for his insights on cosmology, and to Luke Batten (University College London) for a few corrections.

*The oft-quoted fact about neutrinos is that 65 billion solar neutrinos pass through just one square centimetre of area on earth every single second. The vast majority of these neutrinos will whizz straight through you without ever having noticed you were there, but by chance, in your entire lifetime, it’s likely that at least one or two will have the courtesy to notice you and bump off one of your atoms. The other interesting fact is that due to the decay of potassium in your bones, you actually emit about three hundred neutrinos a second.

Share

The Large Hadron Collider is almost done running for 2015.  Proton collisions ended in early November, and now the machine is busy colliding lead nuclei.  As we head towards the end-of-year holidays, and the annual CERN shutdown, everyone wants to know — what have we learned from the LHC this year, our first year of data-taking at 13 TeV, the highest collision energies we have ever achieved, and the highest we might hope to have for years to come?

We will get our first answers to this question at a CERN seminar scheduled for Tuesday, December 15, where ATLAS and CMS will be presenting physics results from this year’s run.  The current situation is reminiscent of December 2011, when the experiments had recorded their first significant datasets from LHC Run 1, and we saw what turned out to be the first hints of the evidence for the Higgs boson that was discovered in 2012.  The experiments showed a few early results from Run 2 during the summer, and some of those have already resulted in journal papers, but this will be our first chance to look at the broad physics program of the experiments.  We shouldn’t have expectations that are too great, as only a small amount of data has been recorded so far, much less than we had in 2012.  But what science might we hope to hear about next week?  

Here is one thing to keep in mind — the change in collision energy affects particle production rates, but not the properties of the particles that are produced.  Any measurement of a particle production rate at a new collision energy is inherently interesting, as is any measurement that has never been done before.  Thus any measurement of a production rate that is possible with this amount of data would be a good candidate for presentation.  (The production rates of top quarks at 13 TeV have already been measured by both CMS and ATLAS; maybe there will be additional measurements along these lines.)
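For a sense of what a production-rate measurement involves: a cross section is extracted by counting selected events, subtracting the expected background, and dividing by selection efficiency times integrated luminosity. A sketch with made-up numbers (not a real CMS or ATLAS measurement):

```python
def cross_section_pb(n_observed, n_background, efficiency, lumi_inv_fb):
    """sigma = (N_obs - N_bkg) / (efficiency * integrated luminosity).
    Luminosity given in fb^-1; 1 fb^-1 = 1000 pb^-1, so the result is
    a cross section in picobarns."""
    lumi_inv_pb = lumi_inv_fb * 1000.0
    return (n_observed - n_background) / (efficiency * lumi_inv_pb)

# Illustrative numbers only: 1000 selected events, 200 expected
# background, 50% selection efficiency, 2 fb^-1 of data:
print(cross_section_pb(1000, 200, 0.5, 2.0))  # → 0.8 (pb)
```

Repeating the same counting exercise at 13 TeV and comparing with the theoretical prediction for the new energy is exactly the kind of early measurement described above.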

We probably won’t hear anything new about the Higgs boson.  While the Higgs production rates are larger than in the previous run, the amount of data recorded is still relatively small compared to the 2010-12 period.  This year, the LHC has delivered about 4 fb-1 of data, which could be compared to the 5 fb-1 that was delivered in 2011.  At that time there wasn’t enough data to say anything definitive about the Higgs boson, so it is hard to imagine that there will be much in the way of Higgs results from the new data (not even the production rate at 13 TeV), and certainly nothing that would tell us anything more about its properties than we already know from the full Run 1 dataset of 30 fb-1.  We’ll all probably have to wait until sometime next year before we will know more about the Higgs boson, and if anything about it will disagree with what we expect from the standard model of particle physics.

If there is anything to hope for next week, it is some evidence for new, heavy particles.  Because the collision energy has been increased from 8 TeV to 13 TeV, the ability to create a heavy particle of a given mass has increased too.  A little fooling around with the “Collider Reach” tool (which I had discussed here) suggests that even the small dataset we have in hand now gives us a better chance of observing such particles than the entire Run 1 dataset did, as long as the particle masses are above about 3 TeV.  Of course there are many theories that predict the existence of such particles, the most famous of which is supersymmetry.  But so far there has been scant evidence of any new phenomena in previous datasets.  If we were to get even a hint of something at a very high mass, it would definitely focus our scientific efforts for 2016, when we might record about ten times as much data as we did this year.
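To get a feel for why the crossover sits near 3 TeV, here is a toy calculation in the same spirit. This is not the actual Collider Reach computation, which uses real parton distribution functions; it is a hypothetical power-law model in which the effective parton luminosity for producing a particle of mass M falls like (M/√s)⁻ᵖ. The exponent p, the luminosity figures, and the function name are all illustrative assumptions.

```python
# Toy trade-off between collision energy and dataset size when searching
# for a heavy particle.  Model (an assumption, not the real calculation):
# expected event count N ∝ lumi * (M / sqrt_s)**(-p), with p ~ 10
# mimicking a steeply falling parton luminosity.

def equivalent_mass_reach(m_old_tev, sqrt_s_old, sqrt_s_new,
                          lumi_old, lumi_new, p=10.0):
    """Mass probed at the new energy with the same expected event count
    as m_old_tev at the old energy, in the toy power-law model."""
    scale = sqrt_s_new / sqrt_s_old              # naive kinematic gain
    lumi_penalty = (lumi_new / lumi_old) ** (1.0 / p)  # cost of less data
    return m_old_tev * scale * lumi_penalty

# Illustrative inputs: ~20 fb-1 at 8 TeV in Run 1 vs ~4 fb-1 at 13 TeV.
reach = equivalent_mass_reach(3.0, 8.0, 13.0, 20.0, 4.0)
print(f"3 TeV reach at 8 TeV -> ~{reach:.1f} TeV at 13 TeV")
```

In this toy model, a search that reached 3 TeV with the full 8 TeV dataset reaches roughly 4 TeV at 13 TeV even with five times less data, which is the qualitative behavior the Collider Reach tool quantifies properly.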

Will we get that hint, like we did with the Higgs boson four years ago?  Tune in on December 15 to find out!


@CMSVoices

Thursday, October 29th, 2015

Building on the success of rotating Twitter accounts like @realscientists, which I participated in last year, the CMS experiment has a new account: @CMSVoices.  The idea is that it’s an account for talking to CMS members and hearing about their day-to-day work, in contrast with the official news from the @CMSexperiment account.  Of course, you can already hear from many individual CMS physicists on Twitter (I’m normally @sethzenz), but the account gives you the chance to interact with a new person each month, and it might even help us get some new tweeters started!  I also tried to explain things in more detail and start some more general discussions.

There haven’t been many discussions or many followers so far, but we’re just getting started, and I’m looking forward to others taking the account over and seeing what they do with it.  The next holder of the @CMSVoices account, starting in November, will be @matt_bellis.  Please welcome him next week, and let us know if you have any questions or ideas!
