Archive for the ‘Latest Posts’ Category

The Delirium over Beryllium

Thursday, August 25th, 2016

This post is cross-posted from ParticleBites.

Article: Particle Physics Models for the 17 MeV Anomaly in Beryllium Nuclear Decays
Authors: J.L. Feng, B. Fornal, I. Galon, S. Gardner, J. Smolinsky, T. M. P. Tait, F. Tanedo
Reference: arXiv:1608.03591 (Submitted to Phys. Rev. D)
Also featuring the results from:
— Gulyás et al., “A pair spectrometer for measuring multipolarities of energetic nuclear transitions” (description of detector; 1504.00489NIM)
— Krasznahorkay et al., “Observation of Anomalous Internal Pair Creation in 8Be: A Possible Indication of a Light, Neutral Boson”  (experimental result; 1504.01527PRL version; note PRL version differs from arXiv)
— Feng et al., “Protophobic Fifth-Force Interpretation of the Observed Anomaly in 8Be Nuclear Transitions” (phenomenology; 1604.07411; PRL)

Editor’s note: the author is a co-author of the paper being highlighted. 

Recently there’s some press (see links below) regarding early hints of a new particle observed in a nuclear physics experiment. In this bite, we’ll summarize the result that has raised the eyebrows of some physicists, and the hackles of others.

A crash course on nuclear physics

Nuclei are bound states of protons and neutrons. They can have excited states, analogous to the excited states of atoms, which are bound states of nuclei and electrons. The particular nucleus of interest is beryllium-8, which has four neutrons and four protons and which you may know from the triple alpha process. There are three nuclear states to be aware of: the ground state, the 18.15 MeV excited state, and the 17.64 MeV excited state.

Beryllium-8 excited nuclear states. The 18.15 MeV state (red) exhibits an anomaly. Both the 18.15 MeV and 17.64 MeV states decay to the ground state through a magnetic, p-wave transition. Image adapted from Savage et al. (1987).

Most of the time the excited states fall apart into a lithium-7 nucleus and a proton. But sometimes, these excited states decay into the beryllium-8 ground state by emitting a photon (γ-ray). Even more rarely, these states can decay to the ground state by emitting an electron–positron pair from a virtual photon: this is called internal pair creation and it is these events that exhibit an anomaly.

The beryllium-8 anomaly

Physicists at the Atomki nuclear physics institute in Hungary were studying the nuclear decays of excited beryllium-8 nuclei. The team, led by Attila J. Krasznahorkay, produced beryllium excited states by bombarding a lithium-7 nucleus with protons.

Preparation of beryllium-8 excited state

Beryllium-8 excited states are prepared by bombarding lithium-7 with protons.

The proton beam is tuned to very specific energies so that one can ‘tickle’ specific beryllium excited states. When the protons have around 1.03 MeV of kinetic energy, they excite lithium into the 18.15 MeV beryllium state. This has two important features:

  1. Picking the proton energy allows one to only produce a specific excited state so one doesn’t have to worry about contamination from decays of other excited states.
  2. Because the 18.15 MeV beryllium nucleus is produced at resonance, one has a very high yield of these excited states. This is very good when looking for very rare decay processes like internal pair creation.
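As a quick sanity check on these numbers (my own back-of-the-envelope sketch, assuming a 7Li + p → 8Be Q-value of about 17.25 MeV, the textbook 0.44 MeV resonance energy for the 17.64 MeV state, and simple non-relativistic kinematics; none of this code comes from the Atomki papers), one can verify that a ~1.03 MeV proton beam indeed lands on the 18.15 MeV state:

```python
# Rough check of the resonance energies quoted above (non-relativistic estimate).
Q_VALUE = 17.25       # MeV, approximate energy released in 7Li + p -> 8Be (ground state); assumed value
M_P, M_LI = 1.0, 7.0  # proton and lithium-7 masses in atomic mass units (rounded)

def excitation_energy(proton_kinetic_energy_mev):
    """Excitation energy of the beryllium-8 system for a given proton lab kinetic energy (MeV)."""
    # Only the center-of-mass share of the proton's kinetic energy is available to excite the nucleus.
    t_cm = proton_kinetic_energy_mev * M_LI / (M_P + M_LI)
    return Q_VALUE + t_cm

print(excitation_energy(1.03))  # ~18.15 MeV: the state with the anomaly
print(excitation_energy(0.44))  # ~17.64 MeV: the 'control' state discussed below
```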

What one expects is that most of the electron–positron pairs have a small opening angle, with the number of events decreasing smoothly at larger opening angles.

Expected distribution of opening angles for ordinary internal pair creation events. Each line corresponds to a nuclear transition that is electric (E) or magnetic (M) with a given orbital quantum number, l. The beryllium transitions that we’re interested in are mostly M1. Adapted from Gulyás et al. (1504.00489).

Instead, the Atomki team found an excess of events with large electron–positron opening angle. In fact, even more intriguing: the excess occurs around a particular opening angle (140 degrees) and forms a bump.

Number of events (dN/dθ) at different electron–positron opening angles, plotted for different proton bombarding energies (Ep). For Ep = 1.10 MeV, there is a pronounced bump at 140 degrees which does not appear to be explainable by ordinary internal pair conversion. This may be suggestive of a new particle. Adapted from Krasznahorkay et al., PRL 116, 042501.

Here’s why a bump is particularly interesting:

  1. The distribution of ordinary internal pair creation events is smoothly decreasing and so this is very unlikely to produce a bump.
  2. Bumps can be signs of new particles: if there is a new, light particle that can facilitate the decay, one would expect a bump at an opening angle that depends on the new particle mass.

Schematically, the new particle interpretation looks like this:

Schematic of the Atomki experiment and the new particle (X) interpretation of the anomalous events. In summary: protons of a specific energy bombard stationary lithium-7 nuclei and excite them to the 18.15 MeV beryllium-8 state. These decay into the beryllium-8 ground state. Some of these decays are mediated by the new X particle, which then decays into electron–positron pairs with a certain opening angle; these pairs are detected in the Atomki pair spectrometer. Image from 1608.03591.

As an exercise for those with a background in special relativity, one can use the relation (pe+ + pe−)² = mX², where the p’s are the four-momenta of the pair, to derive the result:

mX² ≈ 2 Ee+ Ee− (1 − cos θ)   (in the limit where the electron mass is negligible)

This relates the mass of the proposed new particle, X, to the opening angle θ and the energies E of the electron and positron. The opening angle bump would then be interpreted as a new particle with a mass of roughly 17 MeV. To match the observed number of anomalous events, the rate at which the excited beryllium decays via the X boson must be about 6×10⁻⁶ times the rate at which it goes into a γ-ray.
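For the curious, here is a minimal numerical sketch of that relation (my own illustration, not code from any of the papers), plugging in a roughly symmetric split of the transition energy and the observed bump angle:

```python
import math

M_E = 0.511  # electron mass in MeV

def pair_invariant_mass(e_plus, e_minus, theta_deg):
    """Invariant mass (MeV) of an e+e- pair with energies e_plus, e_minus (MeV) and opening angle theta."""
    p_plus = math.sqrt(e_plus**2 - M_E**2)
    p_minus = math.sqrt(e_minus**2 - M_E**2)
    cos_theta = math.cos(math.radians(theta_deg))
    m_squared = 2 * M_E**2 + 2 * (e_plus * e_minus - p_plus * p_minus * cos_theta)
    return math.sqrt(m_squared)

# A symmetric ~9 + 9 MeV split of the 18.15 MeV transition at the 140 degree bump:
print(pair_invariant_mass(9.0, 9.0, 140.0))  # ~17 MeV, in line with the quoted X mass
```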

The anomaly has a significance of 6.8σ. This means that it’s highly unlikely to be a statistical fluctuation, as the 750 GeV diphoton bump appears to have been. Indeed, the conservative bet would be some not-understood systematic effect, akin to the 130 GeV Fermi γ-ray line.

The beryllium that cried wolf?

Some physicists are concerned that beryllium may be the ‘boy that cried wolf,’ and point to papers by the late Fokke de Boer stretching from as early as 1996 all the way to 2001. de Boer made strong claims about evidence for a new 10 MeV particle in the internal pair creation decays of the 17.64 MeV beryllium-8 excited state. These claims didn’t pan out, and in fact the instrumentation paper by the Atomki experiment rules out that original anomaly.

The proposed evidence for “de Boeron” is shown below:

The de Boer claim for a 10 MeV new particle. Left: distribution of opening angles for internal pair creation events in an E1 transition of carbon-12. This transition has a similar energy splitting to the beryllium-8 17.64 MeV transition and shows good agreement with expectations, as shown by the flat “signal – background” in the bottom panel. Right: the same analysis for the M1 internal pair creation events from the 17.64 MeV beryllium-8 state. The “signal – background” now shows a broad excess across all opening angles. Adapted from de Boer et al. PLB 368, 235 (1996).

When the Atomki group studied the same 17.64 MeV transition, they found that a key background component that had not been included in the original de Boer analysis—subdominant E1 decays from nearby excited states—dramatically improved the fit. This is the last nail in the coffin for the proposed 10 MeV “de Boeron.”

However, the Atomki group also highlights how their new anomaly in the 18.15 MeV state behaves differently. Unlike the broad excess in the de Boer result, the new excess is concentrated in a bump. There is no known way for additional internal pair creation backgrounds to produce a bump in the opening angle distribution; as noted above, all of these distributions are smoothly falling.

The Atomki group goes on to suggest that the new particle appears to fit the bill for a dark photon: a reasonably well-motivated copy of the ordinary photon that differs in its overall interaction strength and has a non-zero (17 MeV?) mass.

Theory part 1: Not a dark photon

With the Atomki result published and peer reviewed in Physical Review Letters, the game was afoot for theorists to understand how it would fit into a theoretical framework like the dark photon. A group from UC Irvine, the University of Kentucky, and UC Riverside found that, actually, dark photons have a hard time fitting the anomaly simultaneously with other experimental constraints. In the visual language of this recent ParticleBite, the situation was this:

It turns out that the minimal dark photon model cannot explain the Atomki beryllium-8 anomaly without running afoul of other experimental constraints. Image adapted from this ParticleBite.

The main reason for this is that a dark photon with the mass and interaction strength needed to fit the beryllium anomaly would necessarily have been seen by the NA48/2 experiment. This experiment looks for dark photons in the decay of neutral pions (π0). These pions typically decay into two photons, but if there were a 17 MeV dark photon around, some fraction of those decays would go into dark photon–ordinary photon pairs. The non-observation of these distinctive decays rules out the dark photon interpretation.

The theorists then decided to “break” the dark photon theory in order to try to make it fit. They generalized the types of interactions that a new photon-like particle, X, could have, allowing protons, for example, to have completely different charges than electrons rather than exactly opposite ones. Doing this does gross violence to the theoretical consistency of a theory—but the goal was just to see what a new particle interpretation would have to look like. They found that if a new photon-like particle talked to neutrons but not protons—that is, if the new force were protophobic—then a theory might hold together.

Schematic description of how model-builders “hacked” the dark photon theory to fit the beryllium anomaly while staying consistent with other experiments. This hack isn’t pretty—and indeed comes at the cost of potentially invalidating the mathematical consistency of the theory—but the exercise demonstrates how a complete theory might have to behave. Image adapted from this ParticleBite.

Theory appendix: pion-phobia is protophobia

Editor’s note: what follows is for readers with some physics background interested in a technical detail; others may skip this section.

How does a new particle that is allergic to protons avoid the neutral pion decay bounds from NA48/2? Pions decay into pairs of photons through the well-known triangle diagrams of the axial anomaly. The decay into photon–dark photon pairs proceeds through similar diagrams. The goal is then to make sure that these diagrams cancel.

A cute way to look at this is to assume that at low energies, the relevant particles running in the loop aren’t quarks but rather nucleons (protons and neutrons). In fact, since only the proton talks to the photon, one only needs to consider proton loops. Thus if the new photon-like particle, X, doesn’t talk to protons, then there’s no diagram for the pion to decay into γX. This would be great if the story weren’t completely wrong.

Avoiding NA48/2 bounds requires that the new particle, X, is pion-phobic. It turns out that this is equivalent to X being protophobic. The correct way to see this is on the left, making sure that the contribution of up-quark loops cancels the contribution from down-quark loops. A slick (but naively completely wrong) calculation is on the right, arguing that effectively only protons run in the loop.

The correct way of seeing this is to treat the pion as a quantum superposition of an up–anti-up and down–anti-down bound state, and then make sure that the X charges are such that the contributions of the two states cancel. The resulting charges turn out to be protophobic.

The fact that the “proton-in-the-loop” picture gives the correct charges, however, is no coincidence. Indeed, this was precisely how Jack Steinberger calculated the correct pion decay rate. The key here is whether one treats the quarks/nucleons linearly or non-linearly in chiral perturbation theory. The relation to the Wess-Zumino-Witten term—which is what really encodes the low-energy interaction—is carefully explained in chapter 6a.2 of Georgi’s revised Weak Interactions.
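As a toy check of the “pion-phobia is protophobia” statement (a sketch of the quark-level bookkeeping only; the ε symbols below are illustrative placeholders for the X charges, not fitted values from the papers), note that the π0 → γX anomaly amplitude tracks Quεu − Qdεd, which is exactly one third of the proton’s X charge:

```python
import sympy as sp

eps_u, eps_d = sp.symbols('eps_u eps_d')          # hypothetical X charges of the up and down quarks
Q_u, Q_d = sp.Rational(2, 3), sp.Rational(-1, 3)  # electric charges of the up and down quarks

# pi0 ~ (u ubar - d dbar)/sqrt(2): the photon-X anomaly amplitude picks up a relative
# minus sign between the up-quark and down-quark loops (overall factors dropped).
pi0_to_gamma_X = Q_u * eps_u - Q_d * eps_d

# Nucleon X charges counted from their valence quarks.
eps_proton = 2 * eps_u + eps_d
eps_neutron = eps_u + 2 * eps_d

print(sp.simplify(pi0_to_gamma_X - eps_proton / 3))  # 0: switching off pi0 -> gamma X means eps_proton = 0
print(eps_neutron.subs(eps_d, -2 * eps_u))           # -3*eps_u: the neutron still couples
```

In other words, demanding that the π0 → γX amplitude vanish is the same as demanding a vanishing proton charge, while the neutron coupling survives, which is the protophobic pattern described above.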

Theory part 2: Not a spin-0 particle

The above considerations focus on a new particle with the same spin and parity as the photon (spin-1, parity odd). Another result of the UCI study was a systematic exploration of other possibilities. They found that the beryllium anomaly cannot be explained by spin-0 particles. For a parity-even, spin-0 particle (a “dark Higgs”), one cannot simultaneously conserve angular momentum and parity in the decay of the excited beryllium-8 state. (Parity-violating effects are negligible at these energies.)
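A compact way to see this selection rule (my own sketch, not from the paper): in 8Be*(JP = 1+) → 8Be(0+) + X, the orbital angular momentum L of the emitted X must combine with the X spin to give total J = 1, while parity conservation requires PX × (−1)^L = +1. Enumerating the low-spin options:

```python
# Which light X can be emitted in 8Be*(1+) -> 8Be(0+) + X ?
# Rules: L (orbital) and the X spin must be able to couple to J = 1,
#        and the intrinsic parity times (-1)**L must be +1.
candidates = {
    "scalar (0+), 'dark Higgs'":      (0, +1),
    "pseudoscalar (0-), axion-like":  (0, -1),
    "vector (1-), dark photon-like":  (1, -1),
    "axial vector (1+)":              (1, +1),
}

for name, (spin, parity) in candidates.items():
    allowed_L = [L for L in range(3)
                 if abs(L - spin) <= 1 <= L + spin   # triangle condition for total J = 1
                 and parity * (-1) ** L == +1]       # parity conservation
    print(f"{name:32s} allowed L: {allowed_L if allowed_L else 'none (forbidden)'}")
```

The parity-even scalar comes out with no allowed partial wave, the vector is emitted in a p-wave (matching the M1-like transitions above), and the pseudoscalar and axial vector pass this particular test, which is why the text falls back on axion bounds and nuclear matrix elements to deal with them.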

Parity and angular momentum conservation prohibit a “dark Higgs” (parity even scalar) from mediating the anomaly.

For a parity-odd pseudoscalar, the bounds on axion-like particles at 20 MeV suffocate any reasonable coupling. Measured in terms of the pseudoscalar–photon–photon coupling (which has dimensions of inverse GeV), this interaction is ruled out down to the inverse Planck scale.

Bounds on axion-like particles exclude a 20 MeV pseudoscalar with couplings to photons stronger than the inverse Planck scale. Adapted from 1205.2671 and 1512.03069.

Additional possibilities include:

  • Dark Z bosons, cousins of the dark photon with spin-1 but indeterminate parity. These are strongly constrained by atomic parity violation.
  • Axial vectors, spin-1 bosons with positive parity. These remain a theoretical possibility, though their unknown nuclear matrix elements make it difficult to write a predictive model. (See section II.D of 1608.03591.)

Theory part 3: Nuclear input

The plot thickens when one also includes results from nuclear theory. Recent results from Saori Pastore, Bob Wiringa, and collaborators point out a very important fact: the 18.15 MeV beryllium-8 state that exhibits the anomaly and the 17.64 MeV state that does not are actually closely related.

Recall (e.g. from the first figure at the top) that the 18.15 MeV and 17.64 MeV states are both spin-1 and parity-even. They differ in mass and in one other key aspect: the 17.64 MeV state carries isospin charge, while the 18.15 MeV state and the ground state do not.

Isospin is the nuclear symmetry that relates protons to neutrons and is tied to electroweak symmetry in the full Standard Model. At nuclear energies, isospin charge is approximately conserved. This brings us to the following puzzle:

If the new particle has mass around 17 MeV, why do we see its effects in the 18.15 MeV state but not the 17.64 MeV state?

Naively, if the emitted new particle, X, carries no isospin charge, then isospin conservation prohibits the decay of the 17.64 MeV state through emission of an X boson. However, the Pastore et al. result tells us that the isospin-neutral and isospin-charged states actually mix quantum mechanically, so that the observed 18.15 and 17.64 MeV states are mixtures of iso-neutral and iso-charged states. In fact, this mixing is rather large, with a mixing angle of around 10 degrees!

The result of this is that one cannot invoke isospin conservation to explain the non-observation of an anomaly in the 17.64 MeV state. In fact, the only way to avoid this is to assume that the mass of the X particle is on the heavier side of the experimentally allowed range. The rate for emission goes like the 3-momentum cubed (see section II.E of 1608.03591), so a small increase in the mass can strongly suppress the rate of emission from the lighter state.
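Here is a small numerical sketch of how strong that suppression is (my own estimate, ignoring nuclear recoil: the X momentum is approximated as k ≈ sqrt(ΔE² − mX²), and the emission rate is scaled as k³ following the scaling quoted from section II.E of 1608.03591):

```python
import math

def x_momentum(transition_energy_mev, m_x_mev):
    """Approximate momentum of an X boson emitted in a nuclear transition (recoil neglected)."""
    if m_x_mev >= transition_energy_mev:
        return 0.0  # emission is kinematically closed
    return math.sqrt(transition_energy_mev**2 - m_x_mev**2)

def rate_ratio_17_64_vs_18_15(m_x_mev):
    """Ratio of X-emission rates from the 17.64 MeV state to the 18.15 MeV state, using k^3 scaling."""
    k_low = x_momentum(17.64, m_x_mev)
    k_high = x_momentum(18.15, m_x_mev)
    return (k_low / k_high) ** 3

for m_x in (16.7, 17.0, 17.3, 17.6):
    print(f"m_X = {m_x:4.1f} MeV  ->  rate(17.64)/rate(18.15) ~ {rate_ratio_17_64_vs_18_15(m_x):.2f}")
```

For a 16.7 MeV X the two states would emit at comparable rates (a ratio of roughly one half), while pushing the mass up toward 17.6 MeV suppresses emission from the 17.64 MeV state by a factor of about fifty, which is the sense in which a heavier X hides in the lighter transition.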

The UCI collaboration of theorists went further and extended the Pastore et al. analysis to include a phenomenological parameterization of explicit isospin violation. Independent of the Atomki anomaly, they found that including isospin violation improved the fit for the 18.15 MeV and 17.64 MeV electromagnetic decay widths within the Pastore et al. formalism. The results of including all of the isospin effects end up changing the particle physics story of the Atomki anomaly significantly:

The rate of X emission (colored contours) as a function of the X particle’s couplings to protons (horizontal axis) versus neutrons (vertical axis). The best fit for a 16.7 MeV new particle is the dashed line in the teal region. The vertical band is the region allowed by the NA48/2 experiment. Solid lines show the dark photon and protophobic limits. Left: the case for perfect (unrealistic) isospin. Right: the case when isospin mixing and explicit violation are included. Observe that incorporating realistic isospin happens to have only a modest effect in the protophobic region. Figure from 1608.03591.

The results of the nuclear analysis are thus that:

  1. An interpretation of the Atomki anomaly in terms of a new particle tends to push for a slightly heavier X mass than the reported best fit. (Remark: the Atomki paper does not do a combined fit for the mass and coupling, nor does it report the difficult-to-quantify systematic errors associated with the fit. This information is important for understanding how far the X mass can be pushed to heavier values.)
  2. The effects of isospin mixing and violation are important to include, especially as one drifts away from the purely protophobic limit.

Theory part 4: towards a complete theory

The theoretical structure presented above gives a framework to do phenomenology: fitting the observed anomaly to a particle physics model and then comparing that model to other experiments. This, however, doesn’t guarantee that a nice—or even self-consistent—theory exists that can stretch over the scaffolding.

Indeed, a few challenges appear:

  • The isospin mixing discussed above means the X mass must be pushed to the heavier values allowed by the Atomki observation.
  • The “protophobic” limit is not obviously anomaly-free: simply asserting that known particles have arbitrary charges does not generically produce a mathematically self-consistent theory.
  • Atomic parity violation constraints require that the X couple in the same way to left-handed and right-handed matter. The left-handed coupling implies that X must also talk to neutrinos, and this opens up new experimental constraints.

The Irvine/Kentucky/Riverside collaboration first note the need for a careful experimental analysis of the actual mass ranges allowed by the Atomki observation, treating the new particle mass and coupling as simultaneously free parameters in the fit.

Next, they observe that protophobic couplings can be relatively natural. Indeed: the Standard Model Z boson is approximately protophobic at low energies—a fact well known to those hunting for dark matter with direct detection experiments. For exotic new physics, one can engineer protophobia through a phenomenon called kinetic mixing where two force particles mix into one another. A tuned admixture of electric charge and baryon number, (Q-B), is protophobic.
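A quick way to convince yourself of the Q − B claim (just a sketch of the charge bookkeeping, not code from any reference):

```python
from fractions import Fraction

# (electric charge Q, baryon number B) for the relevant particles
particles = {
    "up quark":   (Fraction(2, 3),  Fraction(1, 3)),
    "down quark": (Fraction(-1, 3), Fraction(1, 3)),
    "proton":     (Fraction(1),     Fraction(1)),
    "neutron":    (Fraction(0),     Fraction(1)),
    "electron":   (Fraction(-1),    Fraction(0)),
    "neutrino":   (Fraction(0),     Fraction(0)),
}

# Charge under the tuned combination Q - B
for name, (Q, B) in particles.items():
    print(f"{name:10s}  Q - B = {Q - B}")

# The proton ends up with zero charge (protophobic) while the neutron and electron
# still couple; note that the quark charges, 2*(1/3) + (-2/3) = 0, also satisfy the
# pion-phobia condition from the appendix above.
```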

Baryon number, however, is an anomalous global symmetry—this means that one has to work hard to make a baryon-number boson that mixes with the photon (see 1304.0576 and 1409.8165 for examples). Another alternative is for the photon to kinetically mix not with baryon number but with the anomaly-free combination of “baryon minus lepton number,” giving charges of the form Q-(B-L). This then forces one to apply additional model-building modules to deal with the neutrino interactions that come along with this scenario.

In the language of the ‘model building blocks’ above, the result of this process looks schematically like this:

A complete theory is mathematically self-consistent and satisfies existing constraints. The additional bells and whistles required for consistency make additional predictions for experimental searches. Pieces of the theory can sometimes be used to address other anomalies.

The theory collaboration presented examples of both cases and pointed out how the additional ‘bells and whistles’ required may provide additional experimental handles to test these hypotheses. These are simple existence proofs for how complete models may be constructed.

What’s next?

We have delved rather deeply into the theoretical considerations of the Atomki anomaly. The analysis revealed some unexpected features of the types of new particles that could explain the anomaly (dark photon-like, but not exactly a dark photon), the role of nuclear effects (isospin mixing and breaking), and the kinds of features a complete theory needs in order to fit everything (be careful with anomalies and neutrinos). The single most important next step, however, is and has always been experimental verification of the result.

While the Atomki experiment continues to run with an upgraded detector, what’s really exciting is that a swath of experiments that are either ongoing or under construction will be able to probe the exact interactions required by the new particle interpretation of the anomaly. This means that the result can be independently verified or excluded within a few years. A selection of upcoming experiments is highlighted in section IX of 1608.03591:

Other experiments that can probe the new particle interpretation of the Atomki anomaly. The horizontal axis is the new particle mass, the vertical axis is its coupling to electrons (normalized to the electric charge). The dark blue band is the target region for the Atomki anomaly. Figure from 1608.03591; assuming 100% branching ratio to electrons.

We highlight one particularly interesting search: recently a joint team of theorists and experimentalists at MIT proposed a way for the LHCb experiment to search for dark photon-like particles with masses and interaction strengths that were previously unexplored. The proposal makes use of LHCb’s ability to pinpoint the production position of charged particle pairs and the copious amounts of D mesons produced in Run 3 of the LHC. As seen in the figure above, the LHCb reach with this search thoroughly covers the Atomki anomaly region.

Implications

So where we stand is this:

  • There is an unexpected result in a nuclear experiment that may be interpreted as a sign for new physics.
  • The next steps in this story are independent experimental cross-checks; the threshold for a ‘discovery’ is if another experiment can verify these results.
  • Meanwhile, a theoretical framework for understanding the results in terms of a new particle has been built and is ready-and-waiting. Some of the results of this analysis are important for faithful interpretation of the experimental results.

What if it’s nothing?

This is the conservative take—and indeed, we may well find that in a few years, the possibility that Atomki was observing a new particle will be completely dead. Or perhaps a source of systematic error will be identified and the bump will go away. That’s part of doing science.

Meanwhile, there are some important take-aways in this scenario. First is the reminder that the search for light, weakly coupled particles is an important frontier in particle physics. Second, for this particular anomaly, there are some neat take-aways, such as a demonstration of how effective field theory can be applied to nuclear physics (see e.g. chapter 3.1.2 of the new book by Petrov and Blechman) and how tweaking our models of new particles can avoid troublesome experimental bounds. Finally, it’s a nice example of how particle physics and nuclear physics are not-too-distant cousins and how progress can be made in particle–nuclear collaborations—one of the Irvine group authors (Susan Gardner) is a bona fide nuclear theorist on sabbatical from the University of Kentucky.

What if it’s real?

This is a big “what if.” On the other hand, a 6.8σ effect is not a statistical fluctuation and there is no known nuclear physics to produce a new-particle-like bump given the analysis presented by the Atomki experimentalists.

The threshold for “real” is independent verification. If other experiments can confirm the anomaly, then this could be a huge step in our quest to go beyond the Standard Model. While this type of particle is unlikely to help with the Hierarchy problem of the Higgs mass, it could be a sign for other kinds of new physics. One example is the grand unification of the electroweak and strong forces; some of the ways in which these forces unify imply the existence of an additional force particle that may be light and may even have the types of couplings suggested by the anomaly.

Could it be related to other anomalies?

The Atomki anomaly isn’t the first particle physics curiosity to show up at the MeV scale. While none of these other anomalies are necessarily related to the type of particle required for the Atomki result (they may not even be compatible!), it is helpful to remember that the MeV scale may still have surprises in store for us.

  • The KTeV anomaly: The rate at which neutral pions decay into electron–positron pairs appears to be off from the expectations based on chiral perturbation theory. In 0712.0007, a group of theorists found that this discrepancy could be fit to a new particle with axial couplings. If one fixes the mass of the proposed particle to be 20 MeV, the resulting couplings happen to be in the same ballpark as those required for the Atomki anomaly. The important caveat here is that parameters for an axial vector to fit the Atomki anomaly are unknown, and mixed vector–axial states are severely constrained by atomic parity violation.

The KTeV anomaly interpreted as a new particle, U. From 0712.0007.

  • The anomalous magnetic moment of the muon and the cosmic lithium problem: much of the progress in the field of light, weakly coupled forces comes from Maxim Pospelov. The anomalous magnetic moment of the muon, (g-2)μ, has a long-standing discrepancy from the Standard Model (see e.g. this blog post). While this may come from an error in the very, very intricate calculation and the subtle ways in which experimental data feed into it, Pospelov (and also Fayet) noted that the shift may come from a light (in the 10s of MeV range!), weakly coupled new particle like a dark photon. Similarly, Pospelov and collaborators showed that a new light particle in the 1-20 MeV range may help explain another longstanding mystery: the surprising lack of lithium in the universe (APS Physics synopsis).

Could it be related to dark matter?

A lot of recent progress in dark matter has revolved around the possibility that in addition to dark matter, there may be additional light particles that mediate interactions between dark matter and the Standard Model. If these particles are light enough, they can change the way that we expect to find dark matter in sometimes surprising ways. One interesting avenue is called self-interacting dark matter and is based on the observation that these light force carriers can deform the dark matter distribution in galaxies in ways that seem to fit astronomical observations. A 20 MeV dark photon-like particle even fits the profile of what’s required by the self-interacting dark matter paradigm, though it is very difficult to make such a particle consistent with both the Atomki anomaly and the constraints from direct detection.

Should I be excited?

Given all of the caveats listed above, some feel that it is too early to be in “drop everything, this is new physics” mode. Others may take this as a hint that’s worth exploring further—as has been done for many anomalies in the recent past. For researchers, it is prudent to be cautious and paramount to be careful; but so long as one does both, being excited about a new possibility is part of what makes our job fun.

For the general public, the tentative hopes of new physics that pop up—whether it’s the Atomki anomaly, the 750 GeV diphoton bump, a GeV bump from the galactic center, γ-ray lines at 3.5 keV and 130 GeV, or penguins at LHCb—are signs that we’re making use of all of the data available to search for new physics. Sometimes these hopes fizzle away; often they leave behind useful lessons about physics and directions forward. Maybe one of these days an anomaly will stick and show us the way forward.

Further Reading

Here is some of the popular-level press coverage of the Atomki result. See the list at the top of this ParticleBite for references to the primary literature.

UC Riverside Press Release
UC Irvine Press Release
Nature News
Quanta Magazine
Quanta Magazine: Abstractions
Symmetry Magazine
Los Angeles Times

What is “Model Building”?

Thursday, August 18th, 2016

Hi everyone! It’s been a while since I’ve posted on Quantum Diaries. This post is cross-posted from ParticleBites.

One thing that makes physics, and especially particle physics, unique in the sciences is the split between theory and experiment. The role of experimentalists is clear: they build and conduct experiments, take data and analyze it using mathematical, statistical, and numerical techniques to separate signal from background. In short, they seem to do all of the real science!

So what is it that theorists do, besides sipping espresso and scribbling on chalk boards? In this post we describe one type of theoretical work called model building. This usually falls under the umbrella of phenomenology, which in physics refers to making connections between mathematically defined theories (or models) of nature and actual experimental observations of nature.

One common scenario is that one experiment observes something unusual: an anomaly. Two things immediately happen:

  1. Other experiments find ways to cross-check to see if they can confirm the anomaly.
  2. Theorists start to figure out the broader implications if the anomaly is real.

#1 is the key step in the scientific method, but in this post we’ll illuminate what #2 actually entails. The scenario looks a little like this:

An unusual experimental result (anomaly) is observed. One thing we would like to know is whether it is consistent with other experimental observations, but these other observations may not be simply related to the anomaly.

Theorists, who have spent plenty of time mulling over the open questions in physics, are ready to apply their favorite models of new physics to see if they fit. These are the models that they know lead to elegant mathematical results, like grand unification or a solution to the Hierarchy problem. Sometimes theorists are more utilitarian, and start with “do it all” Swiss army knife theories called effective theories (or simplified models) and see if they can explain the anomaly in the context of existing constraints.

Here’s what usually happens:

Usually the nicest models of new physics don’t fit! In the explicit example, the minimal supersymmetric Standard Model doesn’t include a good candidate to explain the 750 GeV diphoton bump.

Indeed, usually one needs to get creative and modify the nice-and-elegant theory to make sure it can explain the anomaly while avoiding other experimental constraints. This makes the theory a little less elegant, but sometimes nature isn’t elegant.

Candidate theory extended with a module (in this case, an additional particle). This additional model is “bolted on” to the theory to make it fit the experimental observations.

Now we’re feeling pretty good about ourselves. It can take quite a bit of work to hack the well-motivated original theory in a way that both explains the anomaly and avoids all other known experimental observations. A good theory can do a couple of other things:

  1. It points the way to future experiments that can test it.
  2. It can use the additional structure to explain other anomalies.

The picture for #2 is as follows:

A good hack to a theory can explain multiple anomalies. Sometimes that makes the hack a little more cumbersome. Physicists often develop their own sense of ‘taste’ for when a module is elegant enough.

Even at this stage, there can be a lot of really neat physics to be learned. Model-builders can develop a reputation for particularly clever, minimal, or inspired modules. If a module is really successful, then people will start to think about it as part of a pre-packaged deal:

A really successful hack may eventually be thought of as its own variant of the original theory.

Model-smithing is a craft that blends together a lot of the fun of understanding how physics works—which bits of common wisdom can be bent or broken to accommodate an unexpected experimental result? Is it possible to find a simpler theory that can explain more observations? Are the observations pointing to an even deeper guiding principle?

Of course—we should also say that sometimes, while theorists are having fun developing their favorite models, other experimentalists have gone on to refute the original anomaly.

Sometimes anomalies go away and the models built to explain them don’t hold together.

 

But here’s the mark of a really, really good model: even if the anomaly goes away and the particular model falls out of favor, a good model will have taught other physicists something really neat about what can be done within a given theoretical framework. Physicists get a feel for the kinds of modules that are out in the market (like an app store) and they develop a library of tricks to attack future anomalies. And if one is really fortunate, these insights can point the way to even bigger connections between physical principles.

I cannot help but end this post without one of my favorite physics jokes, courtesy of T. Tait:

A theorist and an experimentalist are having coffee. The theorist is really excited and tells the experimentalist, “I’ve got it—it’s a model that’s elegant, explains everything, and it’s completely predictive.” The experimentalist listens to her colleague’s idea and realizes how to test those predictions. She writes several grant applications, hires a team of postdocs and graduate students, trains them, and builds the new experiment. After years of design, labor, and testing, the machine is ready to take data. They run for several months, and the experimentalist pores over the results.

The experimentalist knocks on the theorist’s door the next day and says, “I’m sorry—the experiment doesn’t find what you were predicting. The theory is dead.”

The theorist frowns a bit: “What a shame. Did you know I spent three whole weeks of my life writing that paper?”

Giant breakthroughs are rare in physics. Research is instead marked by countless small advances, and that is what will come out of the International Conference on High Energy Physics (ICHEP), which opened yesterday in Chicago. A giant leap was hoped for, but today the CMS and ATLAS experiments both reported that the promising effect observed at 750 GeV in the 2015 data had disappeared. True, this kind of thing is not rare in particle physics, given the statistical nature of all the phenomena we observe.

On each plot, the vertical axis gives the number of events found containing a pair of photons whose combined mass appears on the horizontal axis in units of GeV. (Left) The black points represent the experimental data collected and analysed so far by the CMS Collaboration, namely 12.9 fb-1, to be compared with the 2.7 fb-1 available in 2015. The vertical bar associated with each point represents the experimental error margin. Taking these errors into account, the data are compatible with what is expected for the background, as indicated by the green curve. (Right) A new particle would have shown up as a peak like the red one if it had had the same properties as those suggested by the 2015 data at 750 GeV. Clearly, the experimental data (black points) simply reproduce the background. We must therefore conclude that what was glimpsed in the 2015 data was nothing but a statistical fluctuation.

But in this case it was particularly convincing, because the same effect had been observed independently by two teams that work without consulting each other and use different detectors and analysis methods. This had triggered a great deal of activity and optimism: to date, 540 scientific papers have been written about this hypothetical particle that never existed, so profound would the implications of its existence have been.

But theorists were not the only ones to harbour such hope. Many experimentalists believed in it and bet on its existence, one of my colleagues going as far as to stake a case of excellent wine.

While many physicists were hopeful or even convinced of the presence of a new particle, the two experiments nevertheless showed the greatest caution. In the absence of irrefutable proof of its presence, neither of the two collaborations, ATLAS and CMS, claimed anything. This is characteristic of scientists: one speaks of a discovery only when no doubt remains.

But many physicists, myself included, set aside some of their reservations, not only because the chances that this effect would disappear were very slim, but also because it would have been a much bigger discovery than that of the Higgs boson, generating a great deal of enthusiasm. Everyone suspects that other particles must exist beyond those already known and described by the Standard Model of particle physics. But despite years spent searching for them, we still have nothing to sink our teeth into.

Since CERN's Large Hadron Collider (LHC) began operating at higher energy, having gone from 8 TeV to 13 TeV in 2015, the chances of a major discovery are stronger than ever. Having more energy gives access to territories never explored before.

So far, the 2015 data have not revealed any new particles or phenomena, but the amount of data collected was really limited. This year, on the contrary, the LHC is outdoing itself, having already produced five times more data than last year. The hope is to eventually discover in them the first signs of a revolutionary effect. Dozens of new analyses based on these recent data will be presented at the ICHEP conference until August 10, and I will come back to them shortly.

It took 48 years to discover the Higgs boson after it was theoretically postulated, even though we knew what we wanted to find. But today, we do not even know what we are looking for. So it could take a little more time. There is something else out there, everyone knows it. But when we will find it, that is another story.

Pauline Gagnon

To find out more about particle physics and what is at stake at the LHC, check out my book: « Qu’est-ce que le boson de Higgs mange en hiver et autres détails essentiels ».

To be notified when new blogs come out, follow me on Twitter: @GagnonPauline or by e-mail by adding your name to this distribution list.

Giant leaps are rare in physics. Scientific research is rather a long process made of countless small steps and this is what will be presented throughout the week at the International Conference on High Energy Physics (ICHEP) in Chicago. While many hoped for a major breakthrough, today, both the CMS and ATLAS experiments reported that the promising effect observed at 750 GeV in last year’s data has vanished. True, this is not uncommon in particle physics given the statistical nature of all phenomena we observe.

On both plots, the vertical axis gives the number of events found containing a pair of photons with a combined mass given in units of GeV on the horizontal axis. (Left) The black dots represent all the data collected in 2016 and analysed so far by the CMS Collaboration, namely 12.9 fb-1, compared to the 2.7 fb-1 available in 2015. The vertical line associated with each data point represents the experimental error margin. Taking these errors into account, the data are compatible with what is expected from various backgrounds, as indicated by the green curve. (Right) A new particle would have manifested itself as a peak as big as the red one shown here if it had the same features as what had been seen in the 2015 data around 750 GeV. Clearly, the black data points pretty much reproduce the background. Hence, we must conclude that what was seen in the 2015 data was simply due to a statistical fluctuation.

What was particularly compelling in this case was that the very same effect had been observed by two independent teams, who worked without consulting each other and used different detectors and analysis methods. This triggered frantic activity and much expectation: to date, 540 theory papers have been written on a hypothetical particle that never was, so profound would the implications of the existence of such a new particle have been.

But theorists were not the only ones to be so hopeful. Many experimentalists had taken strong bets, one of my colleagues going as far as putting a case of very expensive wine on it.

While many physicists were hopeful or even convinced of the presence of a new particle, both experiments nevertheless remained very cautious. Without unambiguous signs of its presence, neither the ATLAS nor the CMS Collaboration made any claims. This is very typical of scientists: one should not claim anything until it has been established beyond any conceivable doubt.

But many theorists and experimentalists, including myself, threw some of our caution to the wind, not only because the chances it would vanish were so small but also because it would have been a much bigger discovery than that of the Higgs boson, generating much enthusiasm. As it stands, we all suspect that there are other particles out there, beyond the known ones described by the Standard Model of particle physics. But despite years spent looking for them, we still have nothing to chew on. In 2015, the Large Hadron Collider at CERN raised its operating energy, going from 8 TeV to the current 13 TeV, making the odds for a discovery stronger than ever, since higher energy means access to territories never explored before.

So far, the 2015 data have not revealed any new particles or phenomena, but the amount of data collected was really small. On the contrary, this year the LHC is outperforming itself, having already delivered five times more data than last year. The hope is that these data will eventually reveal the first signs of something revolutionary. Dozens of new analyses based on the recent data will be presented at the ICHEP conference until August 10, and I’ll present some of them later on.

It took 48 years to discover the Higgs boson after it was first theoretically predicted, and back then we knew what to expect. This time, we don’t even know what we are looking for. So it could still take a little longer. There is more to be found, we all know it. But when we will find it is another story.

Pauline Gagnon

To find out more about particle physics, check out my book « Who Cares about Particle Physics: making sense of the Higgs boson, the Large Hadron Collider and CERN ».

To be notified of new blogs, follow me on Twitter: @GagnonPauline or sign up on this distribution list.

 

Last month, Romania became the 22nd Member State of the European Organisation for Nuclear Research, or CERN, home to the world’s most powerful atom-smasher. But the hundred Romanian scientists working on experiments there have already operated under a co-operation agreement with CERN for the last 25 years. So why has Romania decided to commit the money and resources needed to become a full member? Is this just bureaucratic reshuffling, or the road to a more fruitful collaboration between scientists?

Image: CERN

On 18th July, Romania became a full member state of CERN. In doing so, it joined twenty-one other countries which, over the years, have created one of the largest scientific collaborations in the world. Last year, the two largest experimental groups at CERN, ATLAS and CMS, broke the world record for the total number of authors on a research article (detailing the mass of the Higgs boson).

To meet the requirements for becoming a member, Romania has committed $11 million USD towards the CERN budget this year, three times as much as neighbouring member Bulgaria and more than seven times as much as Serbia, which holds Associate Membership and aims to follow in Romania’s footsteps. In return, Romania now holds a place on the CERN Council, with a say in all the major research decisions of the ground-breaking organisation where the forces of nature are probed, antimatter is created and Higgs bosons are discovered.

Romania’s accession to the CERN convention marks another milestone in the organisation’s sixty-year history of international participation. In that time it has built bridges between nations whose diplomatic and international relations were less than favourable, uniting researchers from across the globe towards the goal of understanding the universe at its most fundamental level.

CERN was founded in 1954 with the acceptance of its convention by twelve European nations in a joint effort for nuclear research, in a year when “nuclear research” also included the largest thermonuclear detonation in US history and the USSR deliberately testing the effects of nuclear radiation from a bomb on 45,000 of its own soldiers. Despite the Cold War climate and the widespread use of nuclear physics as a means of creating apocalyptic weapons, CERN’s founding convention, drawn up alongside UNESCO and adhered to by member states today, states:

“The Organization shall provide for collaboration among European States in nuclear research of a pure scientific and fundamental character… The Organization shall have no concern with work for military requirements.”

The provisional Conseil Européen pour la Recherche Nucléaire (European Council for Nuclear Research) was dissolved and its legacy was carried by the labs built and operated under the convention it had laid down and the name it bore: CERN. Several years later, in 1959, the British director of the Proton Synchrotron division at CERN, John Adams, received a gift of vodka from Soviet scientist Vladimir Nikitin of the Dubna accelerator, just north of Moscow and at the time the most powerful accelerator in the world.

The vodka was to be opened in the event the Proton Synchrotron accelerator at CERN was successfully operated at an energy greater than Dubna’s maximum capacity: 10 GeV. It more than doubled the feat, reaching 24 GeV, and with the vodka dutifully polished off, the bottle was stuffed with a photo of the proton beam readout and sent back to Moscow.

John Adams, holding the empty vodka bottle in celebration of the Proton Synchrotron’s successful start (Image: CERN-HI-5901881-1 CERN Document Server)

Soviet scientists contributed more than vodka to the international effort in particle physics. Nikitin would later go on to work alongside other Soviet and US scientists in a joint effort at Fermilab in 1972. Over the next few decades, ten more member states would join CERN permanently, including Israel, its first non-European member. On top of this, researchers at CERN now come from four associate member nations and four observer states (India, Japan, the USA and Russia), and CERN holds a score of cooperation agreements with other non-member states.

While certainly the largest collaboration of this kind, CERN is no longer unique in being a collaborative effort in particle physics. Quantum Diaries is host to the blogs of many experiments, all of which comprise highly diverse, internationally sourced research cohorts. The synchrotron lab for the Middle East, SESAME, expected to begin operation next year, will involve both the Palestinian and Israeli authorities, with hopes it “will foster dialogue and better understanding between scientists of all ages with diverse cultural, political and religious backgrounds.” It was coordinated, in part, by CERN.

I have avoided speaking personally so far, but one needs to address the elephant in the room. As a British scientist, I speak from a nation where the dust is only just settling on the decision to cut ties with the European Union, against the wishes of the vast majority of researchers. Although our membership of CERN will remain secure, other projects and our relationship with European collaborators face uncertainty.

While I certainly won’t deign to give my view on the matter of a democratic vote, it is encouraging to look back at a fruitful history of unity between nations and to celebrate Romania’s new Member State status as a sign that the particle physics community is still, largely, an integrated and international one. In the short year that I have been at University College London, I have not yet attended any international conferences, yet I have had the pleasure of meeting and learning from visiting researchers from all over the globe. As this year’s International Conference on High Energy Physics kicks off this week (chock-full of 5σ BSM discovery announcements, no doubt*), there is something comforting in knowing I will be sharing my excitement, frustration and surprise with like-minded graduate students from the world over.

Kind regards to Ashwin Chopra and Daniel Quill of University College London for their corrections and contributions; all mistakes are unreservedly my own.
*this is, obviously, playful satire, except for the case of an announcement in which case it is prophetic foresight.

The Large Hadron Collider (LHC) at CERN has already delivered more high-energy data than it did in all of 2015. To put this in numbers, the LHC has produced 4.8 fb-1, compared to 4.2 fb-1 last year, where fb-1 represents one inverse femtobarn, the unit used to evaluate the data sample size. This was achieved in just one and a half months, compared to five months of operation last year.

With this data at hand, and the projected 20-30 fb-1 until November, both the ATLAS and CMS experiments can now explore new territories and, among other things, cross-check on the intriguing events they reported having found at the end of 2015. If this particular effect is confirmed, it would reveal the presence of a new particle with a mass of 750 GeV, six times the mass of the Higgs boson. Unfortunately, there was not enough data in 2015 to get a clear answer. The LHC had a slow restart last year following two years of major improvements to raise its energy reach. But if the current performance continues, the discovery potential will increase tremendously. All this to say that everyone is keeping their fingers crossed.

If any new particle were found, it would open the doors to bright new horizons in particle physics. Unlike the discovery of the Higgs boson in 2012, if the LHC experiments discover an anomaly or a new particle, it would bring a new understanding of the basic constituents of matter and how they interact. The Higgs boson was the last missing piece of the current theoretical model, called the Standard Model. This model can no longer accommodate new particles. However, it has been known for decades that this model is flawed; so far, theorists have been unable to predict which theory should replace it, and experimentalists have failed to find the slightest concrete sign of a broader theory. We need new experimental evidence to move forward.

Although the new data are already being reconstructed and calibrated, they will remain “blinded” until a few days prior to August 3, the opening date of the International Conference on High Energy Physics. This means that until then, the region where this new particle could be remains masked to prevent biasing the data reconstruction process. The same selection criteria that were used for last year’s data will then be applied to the new data. If a similar excess is still observed at 750 GeV in the 2016 data, there will be no doubt about the presence of a new particle.

Even if this particular excess turns out to be just a statistical fluctuation, the bane of physicists’ existence, there will still be enough data to explore a wealth of possibilities. Meanwhile, you can follow the LHC activities live or watch CMS and ATLAS data samples grow. I will not be available to report on the news from the conference in August due to hiking duties, but if anything new is announced, even I expect to hear its echo reverberating in the Alps.

Pauline Gagnon

To find out more about particle physics, check out my book « Who Cares about Particle Physics: making sense of the Higgs boson, the Large Hadron Collider and CERN », which can already be ordered from Oxford University Press. In bookstores after 21 July. Easy to read: I understood everything!

The total amount of data delivered in 2016 at an energy of 13 TeV to the experiments by the LHC (blue graph) and recorded by CMS (yellow graph) as of 17 June. One fb-1 of data is equivalent to 1000 pb-1.

Since April, CERN's Large Hadron Collider (LHC) has already produced more high-energy data than in all of 2015. To put numbers on it, the LHC has produced 4.8 fb-1 in 2016, to be compared with the 4.2 fb-1 of last year. The symbol fb-1 stands for one inverse femtobarn, the unit used to evaluate the size of data samples. All this in barely a month and a half, instead of the five months required in 2015.

With these data in hand and the 20-30 fb-1 projected by November, the ATLAS and CMS experiments can already push back the limits of the known and, among other things, check whether the strange events reported at the end of 2015 are still observed. If this effect were confirmed, it would reveal the presence of a new particle with a mass of 750 GeV, six times heavier than the Higgs boson. Unfortunately, in 2015 there was not enough data to get a clear answer. After two years of major work to extend its energy reach, the LHC resumed operation last year, but at a modest pace. If its current performance holds up, the chances of making new discoveries will be multiplied. So everyone is keeping their fingers crossed.

Any new particle would open the door to new horizons in particle physics. Unlike the discovery of the Higgs boson in 2012, if the LHC experiments were to reveal an anomaly or the existence of a new particle, it would change our understanding of the basic constituents of matter and the forces that govern them. The Higgs boson was the missing piece of the Standard Model, the current theoretical model. This model can no longer accommodate new particles. Yet we have known for decades that it is limited, although to date theorists have been unable to predict which theory should replace it and experimentalists have failed to find the slightest sign pointing to this new theory. Experimental evidence is therefore absolutely necessary to move forward.

Although the new data are already being reconstructed and calibrated, they will remain “blinded” until a few days before 3 August, the opening date of the main physics conference of the summer. Until then, the region where the new particle could lie is masked so as not to bias the data reconstruction process. At the last minute, the same selection criteria as those used last year will be applied to the new data. If these events are still observed at 750 GeV in the 2016 data, there will no longer be any doubt about the presence of a new particle.

But even if this turned out to be just a statistical fluctuation, which happens often in physics by its very nature, the amount of data accumulated will allow a host of other possibilities to be explored. In the meantime, you can follow the LHC activities live or watch the CMS and ATLAS data samples grow. Unfortunately, I will not be able to report on what is presented at the conference in August, as I will be off hiking in the mountains, but if any discovery is announced, even I expect to hear its echo resonate through the Alps.

Pauline Gagnon

To learn more about particle physics, don’t miss my book « Qu’est-ce que le boson de Higgs mange en hiver et autres détails essentiels », available in bookstores in Quebec and Europe, as well as from Éditions MultiMondes. Easy to read: I understood everything!


Cumulative graph showing the amount of data produced at 13 TeV in 2016 by the LHC (blue) and collected by the CMS experiment (yellow) as of 17 June.


Lab news

Friday, April 1st, 2016

Get the latest news from the world’s biggest science lab! All the facts, all the truth, totally verified and true beyond all reasonable doubt. 85% official news. Brought to you by the team that revealed Elvis landing on the moon.

ATLAS to install neutrino calorimeters

The ATLAS detector is currently the largest experiment on the CERN site, weighing over 7,000 tonnes, spanning 25 m across and almost 50 m long. It can detect nearly all the particles produced in the record-breaking high-energy collisions provided by the LHC. These particles have strange names like the electron, proton, pion, Ξ(1530)3/2+, photon, friton, demi-semi-lepton and Boris. But there is a big problem, which becomes more pressing as we reach higher and higher energies, and that is the neutrino. This is a tiny, neutral, almost massless particle that was predicted in 1930, and it comes in different flavours (the most popular being mint). The ATLAS Collaboration has an ambitious plan to extend the capabilities of its detector by being the first general-purpose detector to install neutrino calorimeters. At the moment a neutrino is seen only as “missing transverse energy”, and that makes it really hard to find new particles.
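Setting the joke aside for a second, “missing transverse energy” really is how neutrinos (and any other invisible particles) show up today: the colliding protons carry essentially no momentum transverse to the beam, so any imbalance in the transverse momenta of the visible particles is attributed to something that escaped undetected. A minimal sketch of that bookkeeping, with invented inputs rather than any real event:

```python
import math

# Toy example of missing transverse energy (MET):
# the negative vector sum of the visible particles' transverse momenta.
# The particle list below is invented purely for illustration.

def missing_et(particles):
    """particles: list of (pt, phi) pairs for the visible objects.
    Returns (MET magnitude, MET phi)."""
    px = -sum(pt * math.cos(phi) for pt, phi in particles)
    py = -sum(pt * math.sin(phi) for pt, phi in particles)
    return math.hypot(px, py), math.atan2(py, px)

visible = [(45.0, 0.3), (38.0, 2.8), (22.0, -1.9)]  # GeV, radians (made up)
met, met_phi = missing_et(visible)
print(f"MET = {met:.1f} GeV at phi = {met_phi:.2f}")
```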

ATLAS Spokesperson, Dave Charlton, said “Look, I really don’t have time for this, I have to go to a meeting!”. After reporters blocked his path and stole his CERN card he added “Fine, how about ‘This is a very exciting time for ATLAS and we are happy to be leading the field in this area. Detecting neutrinos will open up new parameter space and allow us to perform searches never attempted before.’ Now give me my CERN card, the Weekly meeting cannot start without me.” By seeing neutrinos directly, physicists would be able to observe the annoying neutrino backgrounds that get in the way of dark matter searches. They could count the neutrinos directly to see if they agree with long-standing predictions.

Proposals for the new ATLAS neutrino calorimeters

But not everyone is happy with the proposal. We spoke to a neutrino expert, and after she closed the door on us, we went to Wikipedia. Apparently neutrinos are so bad at interacting that they need about a light year of lead before they can be seen. This would have some impact on the local (and not so local) area. We spoke with a representative from Geneva Airport. He said “If the proposed plans are successful this would mean moving Geneva Airport. The people and businesses of Geneva rely on the airport for connections with the rest of the world. It would be very inconvenient and not very efficient to commute a light year to reach the airport. Most rental car contracts will not allow you to travel that far.”

It’s not yet clear where the supply of lead will come from. A light-year-wide sphere of solid lead would contain more than the global supply, even if every atom were liberated from the Earth’s crust. We would need some 38 orders of magnitude more lead than there is on the planet. That’s more than a million million million. It’s lots. There is also a problem with the sheer size of the proposal. “There are problems we still have to solve”, said an ATLAS physicist. “We have a Solar Passage Working Group, and NASA is helping us deal with other local astronomical bodies that might pose impact challenges. Trigger is an issue. Right now it takes about 100 milliseconds to trigger an event. With the new neutrino calorimeters it could take up to 3 years.”
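For the terminally curious: taking the joke’s geometry at face value, a light-year-scale sphere of lead really does come out a few dozen orders of magnitude beyond the lead locked in the Earth’s crust, with the exact figure depending on what you count as “the planet’s supply”. A back-of-envelope sketch, using rough round numbers for the crust mass and its lead abundance (both are my own approximate inputs, not from the article):

```python
import math

# Back-of-envelope check of the joke's numbers, taken at face value.
# Rough assumptions for illustration: a solid lead sphere of radius one
# light year, and crustal lead estimated from a ~2.8e22 kg crust with
# ~14 ppm lead by mass.

LIGHT_YEAR_M = 9.46e15          # metres
LEAD_DENSITY = 11_340.0         # kg per m^3
CRUST_MASS_KG = 2.8e22          # rough estimate
LEAD_ABUNDANCE = 14e-6          # ~14 ppm by mass in the crust (rough)

sphere_volume = 4.0 / 3.0 * math.pi * LIGHT_YEAR_M ** 3   # m^3
lead_needed = sphere_volume * LEAD_DENSITY                # kg
lead_in_crust = CRUST_MASS_KG * LEAD_ABUNDANCE            # kg

ratio = lead_needed / lead_in_crust
print(f"lead needed ~ {lead_needed:.1e} kg")
print(f"lead in crust ~ {lead_in_crust:.1e} kg")
print(f"shortfall: about {math.log10(ratio):.0f} orders of magnitude")
```

With these inputs the shortfall lands in the mid-thirties of orders of magnitude, comfortably in “it’s lots” territory.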

The proposals, if approved, will be implemented by 2600.

CMS developing “truth matching” for data

For decades the CMS Collaboration has used a common tool known as “truth matching” in its simulation studies. Every particle in a simulation has information associated with it, including its mass, energy, charge, momentum, spin, and favourite movies. All these quantities have to be estimated using measurements from the simulated detector, so they are never perfectly known. However, with a simulation you can match up the particles to what really happened in the so-called “truth record”, and this is what we call truth matching. If you have a particle travelling with a certain momentum in a certain direction, you can compare it to the truth record and find out exactly what kind of particle it is. That means you no longer need those tricky identification algorithms, and you can remove background processes trivially.
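For what it’s worth, the simulation-side version of truth matching is entirely real, and in practice it often boils down to a geometric match: each reconstructed object is paired with the nearest generator-level (“truth”) particle in ΔR = √(Δη² + Δφ²), up to some maximum distance. A generic sketch of the idea (not CMS code; the 0.1 cut and the example particles are illustrative choices of mine):

```python
import math

# Toy truth matching: match reconstructed objects to generator-level
# ("truth") particles by angular distance Delta R. Purely illustrative.

def delta_r(eta1, phi1, eta2, phi2):
    dphi = (phi1 - phi2 + math.pi) % (2 * math.pi) - math.pi  # wrap to [-pi, pi]
    return math.hypot(eta1 - eta2, dphi)

def truth_match(reco, truth, max_dr=0.1):
    """reco, truth: lists of dicts with 'eta' and 'phi'.
    Returns, for each reco object, the index of the closest truth particle
    within max_dr, or None if nothing is close enough."""
    matches = []
    for r in reco:
        best, best_dr = None, max_dr
        for i, t in enumerate(truth):
            dr = delta_r(r["eta"], r["phi"], t["eta"], t["phi"])
            if dr < best_dr:
                best, best_dr = i, dr
        matches.append(best)
    return matches

# Invented example: two reco objects, two truth particles
reco = [{"eta": 0.51, "phi": 1.20}, {"eta": -1.30, "phi": -2.95}]
truth = [{"eta": 0.50, "phi": 1.22}, {"eta": -1.28, "phi": -2.97}]
print(truth_match(reco, truth))  # -> [0, 1]
```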

“This makes my analysis super easy!” said one CMS student. “I might even graduate next week.” Truth matching has been applied to simulations for several decades, and it is unique in being the only method that has not also been applied to data. Everything else, from machine learning to Bayesian analysis, has been developed using simulation before being moved over to real data. By employing ouija boards, dowsing techniques, and Feng Shui, CMS psychics have reported initial success. “There are definitely a lot of protons in the LHC beam,” one said. The LHC beam does indeed contain about a hundred billion protons per bunch, and this has been seen by some as a sign of confirmation of the method. Others are more skeptical. “Those protons could have come from the magnets or the pipes. There’s a lot of matter in these tunnels. The results prove nothing.”

One of the first complete data events to be truth matched, a diphoton Higgs decay

If the truth matching of data is successful, it could lead to a revolution in particle physics. Detectors could be slimmed down, time could be saved in the analysis process, and the peer review process would be streamlined. “Rather than having to measure the levels of signal and background, a process that can take months, we can simply count how many electrons and bosons we have.” The initial findings are only the first step, and there are plans to extend the data truth matching to more complex final states. It’s expected that by 2019 the CMS Collaboration will be able to truth-match Higgs bosons, top quarks, and even new particles we’ve never seen before.

A tearful Polish professor, who pioneered the use of the famous ‘pseudorapidity’ variable, said “I have been waiting for this breakthrough my entire career. This will make the lives of so many scientists so much simpler.”

LHCb made a big blunder, and you won’t believe what it is!

Senior LHCb physicists were left red-faced today when they discovered a terrible blunder. “How could we not have seen this?” Spokesperson Guy Wilkinson said. “It’s been staring us in the face for years,” blurted Operations Coordinator Barbara Storaci.

LHCb, a huge science machine that lives underground on the Franco-Swiss border, is hiding a huge secret. Sources on reddit tell us “This kind of hting happens al the time. The Eiffel Tower was bilt up-side-down for the frist few weeks.” and “OMG! WTF? ORLY?”

Can you see what’s wrong with this picture? 98% of people can’t!

The LHCb schematic, with the approved geometry

It turns out that when LHCb was made, the engineers only built half a detector. “Now that I see it, I can’t unsee it!” exclaimed a postdoc, spilling crêpe on the table as he spoke.

“It may be true that we only built half a detector,” an anonymous researcher said, “but at least it was the forward half.” So far there are no plans to correct the problem, and the Collaboration has already produced hundreds of world-class papers with the current detector and shows no signs of stopping.

ALICE alchemists quit after years of research

A team of alchemists working on the ALICE Collaboration announced today that their research program will end. The collection of six pseudoscientists, a small minority of the total Collaboration, are hanging up their lab coats after declaring their research “unworkable” and “a total abysmal failure”. The ALICE Collaboration investigates the collisions of Lead ions with other particles in the LHC. The Collaboration has been responsible for a wide range of discoveries concerning the quark-gluon plasma, which is a form of primordial matter from the early universe.

The STAR experiment contained real Gold atoms

However, it is not the quark-gluon plasma that the small band of alchemists is studying. Instead, they want to turn the Lead into Gold, and they want to use the LHC to do it. Most of them came from the previous generation of ion collider experiments, based in Brookhaven, New York. At those facilities there was an abundance of Gold in the experimental apparatus, and the alchemists looked to replicate this success.

“I just don’t understand,” said Bob Bobbatrop, the Master Mage. “We had so much success with the RHIC accelerator! The LHC must be producing negative energy fields and the crystals in our detector must be misaligned.” ALICE Spokesperson, Paolo Giubellino, was quick to distance himself from the misfit alchemists. “They are not representative of the Collaboration as a whole, and frankly, I don’t know how they got in here in the first place. The RHIC facility in Brookhaven collided Gold ions, so of course these so-called alchemists found Gold. They’d have to be even stupider not to find it there! This is why we have a peer review process. We’ve even started to arrange pseudomeetings in a local coffee shop where they present their results, and they haven’t yet noticed that most of the people listening are tourists. Even the local barista rolls her eyes when they talk. Meanwhile we can get on with the real research.”

But like a gauge-violating wavefunction, Bob Bobbatrop is not phased. “We have vastly superior software! When we need a random number we don’t rely on a C++ library, we use a 20-sided die. You can’t get more serious than that.”

Cryogenics team start charity drive

Do you have any old, unwanted fridge magnets? You can send them to CERN! Last year the cryogenics team at CERN faced problems that led to the failure of some magnets. Now, a charity drive is starting where you can donate your old magnets, which will be attached to the outside of failing magnets to give them a boost. “We accept any magnets! That magnet you purchased on vacation? Yes, we’ll take it. Do you have magnetic letters? We will take those too.”

Donated magnets in the staging and testing area

Some magnets are more useful than others. Magnets with mini thermometers can help engineers keep track of the state of the supercooled LHC magnets. The resident artists at CERN have expressed an interest in the magnetic “fridge poetry” packs. Magnets that feature cats will be used in the RF cavity sector. So please, take a look at your fridge, and see if you really need that snow globe magnet from Oslo, or that hula girl magnet from Hawai’i. Why leave it sitting in your kitchen when it can be helping research on the world’s largest machine?

Creative solution to poster defacement row

In recent weeks the media has reported on the defacement of the LGBT CERN posters at the lab, with many being removed or subjected to graffiti. CERN Director General, Fabiola Gianotti, has taken these incidents very seriously. “The targeting of a single group of posters for abuse like this is unacceptable,” she said, “and so I have made the decision that from now on, all types of posters at CERN will be removed or defaced. CERN is a lab of equal opportunities, and it must be free from discrimination.”

Teams of administrators, including Gianotti herself, have been seen walking the corridors of CERN and instituting this new policy. Posters announcing a SUSY conference have had “NO MORE SYMMTRY BRAKING HERE!!1!” scrawled across them, and a poster advertising a symposium on solar neutrinos was subjected to “Go back to where you came from. The sun.” Even parking signs are not immune, with slogans such as “Parking? More like… splarking!”, and a fire exit sign was seen with a neatly written note underneath saying “They had fire in Hitler’s Germany too, you know”.

One of the many posters subject to the new policy

By attacking all signs and posters at the lab, the aim is to make nobody feel victimised or isolated. Staff are encouraged to use their own initiative and are recommended to mutter incoherently under their breath as they do so. “If nothing else” one technician said “it’s made the lab more surreal. I don’t even know how much a coffee is anymore. Apparently it’s now one ‘WHY ARE YOU READING THIS?!’, but it used to be 1.60 CHF.”

LIGO result explained

In February 2016, the LIGO experiment announced it had observed gravitational waves, predicted over a century ago by Albert Einstein’s theory of general relativity. The discovery is thought to have come from the merging of two massive black holes, from over a billion light years away. However, two students have come forward to say that they created the waves in their apartment, using a waffle iron, an iPhone, and the cluck of a chicken. “We’ve been working on this prank for weeks” said the first student, “and we had no idea it would be taken seriously!” The second student added “We had to eat so many Pringles to get enough tubes for the wave generator.”

Captain McNuggets, relaxing in the garden

The real hero of the story is their chicken, Captain McNuggets, who made the characteristic “chirp” sound. So did LIGO really detect gravitational waves? “Oh, absolutely!” the pair of students replied. The machine they made could produce gravitational waves of any frequency and amplitude desired, but it was only made “for a bit of a laugh” and is unlikely to see further research. The machine itself was dismantled in October to make space for their latest project, the “ballistic taco-launcher”.


There has been a lot of press about the recent DØ result on the possible \(B_s \pi\) state. This was also covered on Ricky Nathvani’s blog. At Moriond QCD, Jeroen Van Tilburg showed a few plots from LHCb which showed no signal in the same mass regions as explored by DØ. Tomorrow, there will be a special LHC seminar on the LHCb search for the purported tetraquark, where we will get the full story from LHCb. I will be live-blogging the seminar here! It kicks off at 11:50 CET, so tune in to this post for live updates.


Mar 22, 2016 – 12:23. Final answer: LHCb does not confirm the tetraquark. Waiting for CMS, ATLAS and CDF.


Mar 22, 2016 – 12:24. How did you get the result out so fast? A lot of work by the collaboration to get MC produced and to expedite the process.


Mar 22, 2016 – 12:21. Is the \(p_T\) cut on the pion too tight? The fact that you haven’t seen anything anywhere else gives you confidence that the cut is safe. Also, the cut is not relative to the \(B_s\).


Mar 22, 2016 – 12:18. Question: What is the fraction of multiple candidates that enter? Not larger than 1.2. Going back to the cuts: which selection killed the combinatoric background the most? The requirement that the \(\pi\) comes from the PV and the \(p_T\) cut on the pion kill the most. How strong is the PV cut? \(\chi^2\) less than 3.5 for the pion at the PV; the \(B_s\) and the pion are forced to come from the PV, and the \(B_s\) mass is constrained.


Mar 22, 2016 – 12:17: Can you go above the threshold? Yes.


Mar 22, 2016 – 12:16. Slide 9: Did you fit with a floating mass? Plan to do this for the paper.


Mar 22, 2016 – 12:15. Wouldn’t \(F_S\) be underestimated by 8%? Maybe maybe not.


Mar 22, 2016 – 12:13. Question: Will LHCb publish? Most likely yes, but a bit of politics. Shape of the background in the \(B_s\pi\) is different in LHCb and DØ. At some level, you expect a peak from the turn over. Also CMS is looking.


Mar 22, 2016 – 12:08-12:12. Question: did you try the cone cut to try to generate a peak? Answer: afraid that the cut can give a biased estimate of the significance. From the DØ seminar, it seems like this is the case. For DØ to answer. Vincenzo Vagnoni says that the DØ estimation of the significance is incorrect. We also don’t know if there’s something different between \(pp\) and \(p \bar{p}\).


Mar 22, 2016 – 12:08. No evidence of \(X(5568)\) state, set upper limit. “We look forward to hearing from ATLAS, CMS and CDF about \(X(5568)\)”


Mar 22, 2016 – 12:07. What if the production of the X were the same at LHCb? We should have seen a very large signal. Also, in many other spectroscopy plots, e.g. \(B^{*}\), one looks at “wrong-sign” plots for the B meson. All the searches LHCb has already performed would have been sensitive to such a state.


Mar 22, 2016 – 12:04. The analysis is redone in bins of rapidity. No significant signal is seen in any of them. This is done for all \(p_T\) ranges of the \(B_s\).


Mar 22, 2016 – 12:03. Look at \(B^0\pi^+\) as a sanity check. If the X(5568) were similar to the \(B^{**}\), then we would expect of order 1000 events.


Mar 22, 2016 – 12:02. Upper limits on the production are given.


Mar 22, 2016 – 12:02. Checks of systematics: changing the mass and width within the DØ range, and the effect of the efficiency dependence on the signal shape, are the dominant sources of systematics. All measurements are dominated by statistics.


Mar 22, 2016 – 12:00. Result of the fits all consistent with zero. The relative production is also consistent with zero.


Mar 22, 2016 – 11:59. 2 fits with and without signal components, no difference in pulls. Do again with tighter cut on the transverse momentum of the \(B_s\). Same story, no significant signal seen.


Mar 22, 2016 – 11:58. Fit model: S-wave Breit-Wigner, mass and width fixed to DØ result. Backgrounds: 2 sources. True \(B_s^0\) with random track, and fake \(B_s\).
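(Aside for readers following along at home: the “S-wave Breit-Wigner” quoted above is simply a resonance lineshape with fixed mass and width sitting on a smooth background. Below is a toy sketch of such a signal-plus-background model, plugging in the DØ central values of roughly 5568 MeV for the mass and 22 MeV for the width for illustration; the background parameterisation is a placeholder of mine, not the one LHCb actually uses.)

```python
import math

# Toy signal-plus-background model in the B_s pi invariant mass:
# a non-relativistic (S-wave) Breit-Wigner with fixed mass and width
# on top of a smooth background. Purely illustrative.

M0 = 5568.0   # MeV, roughly the mass reported by DØ
GAMMA = 22.0  # MeV, roughly the width reported by DØ

def breit_wigner(m, m0=M0, gamma=GAMMA):
    """S-wave Breit-Wigner lineshape (unnormalised, peaks at 1)."""
    return (gamma / 2.0) ** 2 / ((m - m0) ** 2 + (gamma / 2.0) ** 2)

def model(m, n_sig, n_bkg, slope=1e-3, threshold=5500.0):
    """Signal yield times the lineshape plus a simple threshold-like background."""
    bkg = max(m - threshold, 0.0) * math.exp(-slope * (m - threshold))
    return n_sig * breit_wigner(m) + n_bkg * bkg

# Evaluate the model across the mass window, e.g. to plot or feed to a fitter
for m in range(5500, 5701, 50):
    print(m, round(model(m, n_sig=100.0, n_bkg=1.0), 2))
```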


Mar 22, 2016 – 11:56.  No “cone cut” applied because it is highly correlated with reconstructed mass.


Mar 22, 2016 – 11:55. LHCb strategy: perform 3 independent searches, check that they qualitatively agree, then move forward with a single approach on the Run 1 dataset. A cut-based selection is used to match the DØ strategy. Take-home point: the statistics are 20x larger and the sample is much cleaner.


Mar 22, 2016 – 11:52. Review of the DØ result. What could it be? The molecular model is disfavored. Diquark-antidiquark models are popular, but the state could not be fit into any model. It could also be feed-down from radiative decays. All predictions have large uncertainties.


Mar 22, 2016 –  11:49. LHCb-CONF-2016-004 posted at cds.cern.ch/record/2140095/


Mar 22, 2016 – 11:47. Handing over to the speaker, Marco Pappagallo.


Mar 22, 2016 – 11:44. People have begun entering the auditorium for the talk, at the end of Basem Khanji’s seminar on \(\Delta m_d\).

 


Has CERN discovered a new particle or not? Nobody knows yet, although we are now two steps closer than in December, when the first signs of a possible discovery were revealed.

First step: both the ATLAS and CMS experiments showed yesterday at the Moriond conference that the signal remains after re-analyzing the 2015 data with improved calibrations and reconstruction techniques. The faint signal still stands, even slightly stronger (see the Table). CMS has added data not included earlier, collected during a period when its magnet was malfunctioning. Thanks to much effort and ingenuity, the reanalysis now includes 20% more data. Meanwhile, ATLAS showed that all the data collected at lower energy up to 2012 were also compatible with the presence of a new particle.

The table below shows the results presented by CMS and ATLAS in December 2015 and February 2016. Two hypotheses were tested, assuming different characteristics for the hypothetical new particle: the “spin 0” case corresponds to a new type of Higgs boson, while “spin 2” denotes a graviton.

The label “local” indicates how significant the new signal appears locally, at a mass of 750 or 760 GeV, while “global” refers to the probability of finding such an excess anywhere over a broad range of mass values. In physics, statistical fluctuations come and go. One is bound to find a small anomaly when looking all over the place, which is why it is wise to look at the bigger picture. So globally, the excess of events observed so far is still very mild, far from the 5σ criterion required to claim a discovery. The fact that both experiments found it independently is what makes it so compelling.
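“Local” and “global” can be made concrete: a local significance of Z sigma corresponds to a one-sided p-value, and looking in many independent mass windows inflates that p-value by roughly the number of places one looked, the so-called look-elsewhere effect. Here is a back-of-envelope sketch of the conversion; the significance and the trials factor below are illustrative numbers of my own, not the experiments’ actual values:

```python
import math

# Convert a local significance (in sigma) to a one-sided p-value,
# apply a crude trials factor for the look-elsewhere effect,
# and convert back to a global significance. Illustrative only:
# the real global significances are computed by the experiments.

def p_value(z):
    """One-sided Gaussian tail probability for a significance of z sigma."""
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def significance(p):
    """Inverse of p_value: significance in sigma for a one-sided p-value."""
    lo, hi = 0.0, 10.0          # simple bisection, fine for an illustration
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if p_value(mid) > p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

z_local = 3.4        # e.g. a ~3.4 sigma local excess (illustrative number)
trials = 50          # rough number of independent mass windows (made up)
p_global = min(1.0, trials * p_value(z_local))
print(f"local p = {p_value(z_local):.2e}, global ~ {significance(p_global):.1f} sigma")
```

The experiments compute their real global significances with far more care, but the qualitative lesson is the same: a striking local excess shrinks once you account for all the places it could have appeared.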

[Table: local and global significances reported by ATLAS and CMS under the spin-0 and spin-2 hypotheses]

 

But above all, and this is the second step, we are closer to potentially confirming the presence of a new particle simply because the restart of the Large Hadron Collider is now imminent. New data are expected in the first week of May. Within 2-3 months, both experiments will know.

We need more data to confirm or refute the existence of a new particle beyond any possible doubt. And that’s what experimental physicists are paid to do: state what is known about Nature’s laws when there is not even the shadow of a doubt.

That does not mean that, in the meantime, we are not dreaming: if this were confirmed, it would be the biggest breakthrough in particle physics in decades. Already, there is a frenzy among theorists. As of 1 March, 263 theoretical papers had been written on the subject, as everybody is trying to figure out what this could be.

Why is this so exciting? If this turns out to be true, it would be the first particle to be discovered outside the Standard Model, the current theoretical framework. The discovery of the Higgs boson in 2012 had been predicted and simply completed an existing theory. This was a feat in itself but a new, unpredicted particle would at long last reveal the nature of a more encompassing theory that everybody suspects exists but that nobody has found yet. Yesterday at the Moriond conference, Alessandro Strumia, a theorist from CERN, also predicted that this particle would probably come with a string of companions.

Theorists have spent years trying to imagine what the new theory could be, while experimentalists have deployed heroic efforts, sifting through huge amounts of data looking for the smallest anomaly. Needless to say, the excitement is tangible at CERN right now, as everybody is holding their breath, waiting for new data.

Pauline Gagnon

To learn more about particle physics and what might be discovered at the LHC, don’t miss my upcoming book: « Who Cares about Particle Physics: making sense of the Higgs boson, the Large Hadron Collider and CERN ».

To be alerted of new postings, follow me on Twitter: @GagnonPauline, or sign up on this mailing list to receive an e-mail notification.
