
Archive for September, 2013

CERN’s universe is ours!

Sunday, September 29th, 2013

This past weekend, CERN held its first open days for the public in about five years. This was a big, big deal. I haven’t heard any final statistics, but the lab was expecting about 50,000 visitors on each of the two days. (Some rain on Sunday might have held down attendance.) Thus, the open days were a huge operation — roads were shut down, and Transports Publics Genevois was running special shuttle buses amongst the Meyrin and Prévessin sites and the access points on the LHC ring. The tunnels were open to people who had reserved tickets in advance — a rare opportunity, and one that is only possible during a long shutdown such as the one currently underway.

A better CERN user than me would have volunteered for the open days. Instead, I took my kids to see the activities. We thought that the event went really well. I was bracing for it to be a mob scene, but in the end the Meyrin site was busy but not overrun. (Because the children were too small, we couldn’t go to any of the underground areas.) There were many eager orange-shirted volunteers at our service as we visited open areas around the campus. We got to see a number of demonstrations, such as the effects of liquid-nitrogen temperatures on different materials. There were hands-on activities for kids, such as assembling your own LHC and trying to use a scientific approach to guessing what was inside a closed box. Pieces of particle detectors and LHC magnets were on display for all to see.

But I have to say, what really got my kids excited was the Transport and Handling exhibit, which featured CERN’s heavy lifting equipment. They rode a scissor lift that took them to a height of several stories, and got to operate a giant crane. Such a thing would never, ever happen in the US, which has a very different culture of legal liability.

I hope that all of the visitors had a great time too! I anticipate that the next open days won’t be until the next long shutdown, which is some years away, but it will be well worth the trip.


This article originally appeared in Fermilab Today on Sept. 26, 2013.

Many new international partners officially joined LBNE during the collaboration meeting earlier this month. Photo courtesy of Norm Buchanan

LBNE is making headway toward becoming a truly global experiment.

Last week 16 institutions from Brazil, Italy and the UK joined the LBNE collaboration, based at Fermilab, significantly contributing to an overall membership increase of over 30 percent compared to a year ago.

The swelling numbers strengthen the case to pursue an LBNE design that will maximize its scientific impact, helping us understand how neutrinos fit into our understanding of matter, energy, space and time.

In mid-2012 an external review panel recommended phasing LBNE to meet DOE budget constraints. In December the project received CD-1 approval on its phase 1 design, which excluded both the near detector and an underground location for the far detector.

“Although LBNE was reconfigured for CD-1, our goal is still to deliver a full-scope, fully capable LBNE to enable world-leading physics,” Project Director Jim Strait told the LBNE collaboration earlier this month at its meeting in Fort Collins, Colo. “We have a well-developed design of such a facility, and we are working with new partners to move toward this goal.”

Fortunately, the CD-1 approval explicitly allows for an increase in design scope if new partners are able to bring additional resources. Under this scenario, goals for a new, expanded LBNE phase 1 bring back these excluded design elements, which are crucial for executing a robust and far-reaching neutrino, nucleon decay and astroparticle physics program.

Over the last few months, neutrino physicists from institutions in several countries have expressed interest in joining LBNE. Discussions are under way to identify areas of mutual interest and to understand the potential scale of collaboration.

“These groups bring a wealth of physics and technology expertise to the collaboration,” said Bob Wilson of Colorado State University, who, with fellow spokesperson Milind Diwan of Brookhaven National Laboratory and others, has been actively building these partnerships.

Physicist Ricardo Gomes of the Federal University of Goiás in Brazil, whose group is already a member of the MINOS+ and NOvA experiments, said that LBNE is a natural next step.

“LBNE is a great opportunity to work on an exciting experiment from the start, one that will help to answer important neutrino questions,” Gomes said. “We hope to work on simulation of background events from cosmic-ray muons and would like to contribute to the photon detector instrumentation.”

Fermilab Director Nigel Lockyer is pleased with LBNE’s recent growth.

“It’s incredibly encouraging that so many around the globe are signing on as official LBNE collaborators,” Lockyer said. “To get as much science as we can out of it, LBNE must be a global project.”

Anne Heavey


I asked if my IceCube and ARA colleague Mike Richman (University of Maryland) could write up something about the meeting he was attending. Thanks much, Mike!

During the last week of August, the annual TeV Particle Astrophysics, or
TeVPA, conference was held in Irvine, CA. IceCube and neutrino astrophysics
were very well represented at the conference. The opening talk of the
conference was “Results from IceCube”, presented by Albrecht Karle [1].
After introducing our detector and its objectives, the PeV events and the 26
other High Energy Starting Events (HESE) were highlighted in some detail.

The HESE are a set of very bright events in which neutrinos interacted in
the ice within the instrumented volume of IceCube itself. So far, IceCube is
cautiously reporting this result as “evidence” for high-energy
extraterrestrial neutrinos. We need to analyze more data to reach the
standard “discovery” threshold of about a 1 out of 2 million chance that our
data is the result of an especially sneaky terrestrial background.
Nevertheless, the HESE are generating a lot of excitement in the community.
Later on Monday, the events were described in more detail by Claudio Kopper
[2], while Joanna Kiryluk [3] helped put them in context with a discussion
of other IceCube searches for cascade-like events. Ranjan Laha [4], a
non-IceCube theorist, presented an outsider’s perspective on common
questions about the HESE.
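The “discovery” threshold mentioned above is the conventional 5-sigma standard. As a quick sanity check (an illustrative sketch, not IceCube’s actual analysis code), the corresponding probability can be computed from the Gaussian tail:

```python
# Quick sanity check of the 5-sigma "discovery" threshold quoted above
# (an illustrative sketch, not IceCube's actual analysis code).
from math import erfc, sqrt

def sigma_to_pvalue(n_sigma, two_sided=True):
    """Gaussian tail probability corresponding to a significance in sigmas."""
    one_sided = 0.5 * erfc(n_sigma / sqrt(2.0))
    return 2.0 * one_sided if two_sided else one_sided

p = sigma_to_pvalue(5.0)
print(f"5 sigma corresponds to p ~ {p:.2e}, i.e. about 1 in {1/p:,.0f}")
```

The two-sided 5-sigma p-value comes out near 6e-7, which matches the “about a 1 out of 2 million chance” quoted above.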

Also on Monday, Jake Feintzeig [5] presented results from IceCube searches
for neutrino point-sources, and I [6] presented results from searches for
neutrinos correlated with gamma-ray bursts. Ignacio Taboada [7] presented a
search for a correlation between gamma-ray bursts and the 28 HESE. No
neutrino signal was found in any of these searches; however, many models
suggest that if these neutrino sources exist, we will need to take data for
longer before we can see them.

On Tuesday, Shigeru Yoshida [8] and Aya Ishihara [9] discussed scenarios in
which the extremely high energy neutrino flux levels at IceCube can
constrain cosmological models. Serap Tilav [10] gave a provocative talk in
which she explored a fit for the cosmic ray composition using the spectrum
across all energies as measured by many experiments. On Thursday, Markus
Ahlers [11] discussed multi-messenger — gamma, neutrino and charged
particle — tests which could be used to identify the sources of the HESE.

During the conference, there were talks by IceCube and non-IceCube
researchers alike in which it was said that, in light of the HESE, we have
entered the era of neutrino astrophysics. And while IceCube is currently
leading the field, we are certainly not alone. TeVPA also featured talks on
ANTARES and KM3NeT [12], neutrino experiments in the Mediterranean Sea, as
well as ANITA, EVA and ARA [13] and ARIANNA [14], which search for radio
emission from the highest energy neutrinos interacting in the South Pole
ice. There was even a talk [15] on the possibility of using ultra high
energy neutrinos to constrain the depth of outer ice layers on moons like
Europa. It’s an exciting time to be working in neutrino astrophysics!

All material presented at TeVPA 2013 is available on the conference website [16].

Individual talk URLs:

[1] https://indico.cern.ch/contributionDisplay.py?contribId=0&confId=221841
[2] https://indico.cern.ch/contributionDisplay.py?sessionId=9&contribId=104&confId=221841
[3] https://indico.cern.ch/contributionDisplay.py?sessionId=9&contribId=113&confId=221841
[4] https://indico.cern.ch/contributionDisplay.py?sessionId=9&contribId=103&confId=221841
[5] https://indico.cern.ch/contributionDisplay.py?sessionId=9&contribId=99&confId=221841
[6] https://indico.cern.ch/contributionDisplay.py?sessionId=9&contribId=101&confId=221841
[7] https://indico.cern.ch/contributionDisplay.py?sessionId=9&contribId=102&confId=221841
[8] https://indico.cern.ch/contributionDisplay.py?sessionId=2&contribId=10&confId=221841
[9] https://indico.cern.ch/contributionDisplay.py?sessionId=10&contribId=34&confId=221841
[10] https://indico.cern.ch/contributionDisplay.py?sessionId=4&contribId=56&confId=221841
[11] https://indico.cern.ch/contributionDisplay.py?sessionId=6&contribId=96&confId=221841
[12] https://indico.cern.ch/contributionDisplay.py?sessionId=2&contribId=1&confId=221841
[13] https://indico.cern.ch/contributionDisplay.py?sessionId=2&contribId=11&confId=221841
[14] https://indico.cern.ch/contributionDisplay.py?sessionId=9&contribId=106&confId=221841
[15] https://indico.cern.ch/contributionDisplay.py?sessionId=9&contribId=108&confId=221841
[16] http://indico.cern.ch/conferenceDisplay.py?confId=221841



Zooming in on new particles

Friday, September 20th, 2013

The Large Hadron Collider (LHC) at CERN stopped running in the spring to undergo a major consolidation program, but this has not stopped the search for new physics. On the contrary, physicists are taking advantage of the interruption to finalise all analyses with the full dataset collected so far.

Dozens of new results have been presented by the four LHC experiments at several conferences since the end of operation. While only a handful of these results have made the headlines, a wealth of new information is now available, allowing theorists to refine their models.

Even with the discovery of a Higgs boson, physicists know that the Standard Model of particle physics cannot be the final answer since it has known shortcomings. For example, it fails to provide an explanation for dark matter or why the masses of fundamental particles such as electrons and muons are so different. Another theory called supersymmetry (or SUSY for short) is one of the most popular and most promising ways to extend the Standard Model, but it has yet to manifest itself.

One major difficulty when testing this new theory is the large number of parameters it introduces. To find the new particles predicted by SUSY, we must explore a vast territory spanned by 105 dimensions, corresponding to its 105 free parameters. Finding these new particles is like trying to spot a stranger in a crowd of millions.

Fortunately, theorists have attempted to give us experimentalists some guidance to constrain these parameters using theoretical or experimental considerations. One model that has gained popularity lately is called the phenomenological Minimal Supersymmetric Model or pMSSM and uses only 19 parameters. It takes into account information from all aspects of particle physics, incorporating constraints from the measured characteristics of the Z and Higgs bosons, b-quark physics, astrophysics as well as direct searches for dark matter at underground facilities and supersymmetric particles at the LHC.

Several groups of theorists and experimentalists have combined all these recent results to see which areas of the reduced but still huge parameter space of the pMSSM model are still allowed.

Their approach consists of generating millions of possible values for the masses and couplings of the hypothesised SUSY particles. The couplings are quantities related to the probability of producing these particles at the LHC.

Then they impose various constraints obtained from the many quantities measured by past and current experiments to see which points among all possibilities are still allowed.
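That generate-and-filter procedure can be caricatured in a few lines (a toy two-parameter model with made-up constraints, nothing like the real 19-parameter pMSSM machinery):

```python
# Toy sketch of a parameter-space scan (hypothetical 2-parameter model and
# invented constraints, not the actual 19-parameter pMSSM analysis).
import random

random.seed(42)

def sample_point():
    """Draw one random point: (neutralino mass, sbottom mass) in GeV."""
    return random.uniform(5, 500), random.uniform(5, 1000)

def passes_constraints(m_chi, m_sb):
    """Stand-ins for experimental constraints (illustrative numbers only)."""
    if m_sb <= m_chi:                    # require the sbottom to be heavier
        return False
    if not (100 < m_chi + m_sb < 800):   # a made-up combined-mass window
        return False
    return True

points = [sample_point() for _ in range(100_000)]
allowed = [p for p in points if passes_constraints(*p)]
print(f"{len(allowed)} of {len(points)} scan points survive the cuts")
```

The real scans work the same way in spirit: sample many candidate SUSY spectra, discard every point that conflicts with a measurement, and study what remains.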

Two theorists, Alex Arbey and Nazila Mahmoudi, and experimentalist Marco Battaglia went further than in their earlier work: they performed their latest scan assuming that the four positive results reported by direct dark-matter detection experiments were genuine dark-matter signals, to see whether those results could be explained within SUSY.

While attempts by other groups were not able to find SUSY scenarios in agreement with the parameters of the possible dark matter signal, their results were rather surprising: they found surviving scenarios pointing to a light neutralino, with a mass of only 10 GeV, twelve times lighter than the Higgs boson. The second lightest particle is the super partner of the bottom quark, called sbottom, at around 20 GeV.


The mass ranges predicted for different SUSY particles coming out of this study. The Higgs boson discovered last summer, h0, is assumed to be the lightest of the five Higgs bosons predicted by SUSY and the lightest SUSY particle is the neutralino, χ0.

If this scenario were correct, why would such a light particle have escaped detection? The reason is that most searches led by the CMS and ATLAS experiments have focused so far on events where a large amount of energy is missing.

This would be the case when some heavy but invisible SUSY particle escapes from our detectors. Such criteria are needed to reduce the overwhelming background and isolate the few events containing traces of SUSY particles. But a light neutralino would only carry a small quantity of energy and would have gone undetected.

While theorists are assessing which corners of the parameter space are still allowed, experimentalists are evaluating the impact of their selection criteria on detecting particles having the characteristics of the remaining allowed regions. New strategies are now being sought to explore this possibility.

Operating the LHC at higher energy and collecting larger datasets starting in 2015 should give definite answers to these questions. These combined efforts may soon pave the way to new discoveries.

Pauline Gagnon

To be alerted of new postings, follow me on Twitter: @GagnonPauline
or sign up on this mailing list to receive an e-mail notification.







Aces high

Thursday, September 19th, 2013

Much as I love living in Lincoln, Nebraska, having a long residence at CERN has some advantages. For instance, we do get much better traffic of seminar and colloquium speakers here. (I know, you were thinking about chocolate.) Today’s colloquium in particular really got me thinking about how we do, or don’t, understand particle physics today.

The speaker was George Zweig of MIT. Zweig has been to CERN before — almost fifty years ago, when he was a postdoctoral fellow. (This was his first return visit since then.) He had just gotten his PhD at Caltech under Richard Feynman, and was busy trying to understand the “zoo” of hadronic particles that were being discovered in the 1960’s. (Side note: Zweig pointed out today that at the time there were 26 known hadronic particles…19 of which are no longer believed to exist.) Zweig developed a theory that explained the observations of the time by positing a set of hadronic constituents that he called “aces”. (He thought there might be four of them, hence the name.) Some particles were made of two aces (and thus called “deuces”) and others were made of three (and called “trays”). This theory successfully explained why some expected particle decays didn’t actually happen in nature, and gave an explanation for differences in masses between various sets of particles.

Now, reading this far along, you might think that this sounds like the theory of quarks. Yes and no — it was Murray Gell-Mann who first proposed quarks, and his model made similar successful predictions. But there was a critical difference between the two theories. Zweig’s aces were meant to be true physical particles — concrete quarks, as he referred to them. Gell-Mann’s quarks, by contrast, were merely mathematical constructs whose physical reality was not required for the success of the theory. At the time, Gell-Mann’s thinking held sway. I’m no expert on the history of this period in theoretical particle physics, but my understanding is that the Gell-Mann approach was more in line with the theoretical fashions of the day — and besides, if you could have a successful theory that didn’t have to introduce new particles that were themselves sketchy (their electric charges had to be fractions of the electron charge, and they apparently couldn’t be observed anyway), why would you?

Of course, we now know that Zweig’s interpretation is more correct; this was even becoming apparent a few short years later, when deep-inelastic scattering experiments at SLAC in the late 1960’s discovered that nucleons had smaller constituents, but at that time it was controversial to actually associate those with the quarks (or aces). For whatever reason, Zweig left the field of particle physics and went on to a successful career as a faculty member at MIT, doing work in neurobiology that involved understanding the mechanisms of hearing.

I find it a fascinating tale of how science actually gets done. How might it apply to our science today? A theory like the standard model of particle physics has been so well tested by experiment that it is taken to be true without controversy. But theories of physics beyond the standard model, the sort of theories that we’re now trying to test at the LHC, are much less constrained. And, to be sure, some are more popular than others, because they are believed to have some certain inherent beauty to them, or because they fit well with patterns that we think we observe. I’m no theorist, but I’m sure that some theories are currently more fashionable than others. But in the absence of experimental data, we can’t know that they are right. Perhaps there are some voices that are not being heard as well as they need to be. Fifty years from now, will we identify another George Zweig?


A summer at CERN in pictures

Tuesday, September 17th, 2013

They say a picture is worth a thousand words – so here is my story of a summer at CERN in 12,000 words.


Welcome to CERN baby! – this year’s crop of summer students gather in front of the Globe for a group photo (first to spot me gets a postcard!)



The sun was shining and the weather sweet – the summer held much promise. And with CERN located in the midst of agricultural estates, there were a few nice strolls to be enjoyed too.



We got acquainted with the stars of CERN – a cross section of the Large Hadron Collider (LHC). Notice the two beam pipes through which particles circulate around the 27 km circular accelerator in different directions.



And its giants – the gargantuan particle detector, CMS. This 12,500 tonne beasty is located 80 metres underground at a point where the beams of the LHC cross and particles collide.



The weekends afforded a fine opportunity to soak up some Swiss culture – here a group of Swiss horn players in Gruyère.



Or to take a quiet Saturday morning stroll around CERN, where you never know what you’ll find – I stumbled across this old decommissioned detector.



Now there was of course some work to be done – my experiment coupling a laser into optical fibres and through a birefringent crystal.



And while most of us came to learn about the laws of Physics, some sought to defy them – Filip levitating.



But we did learn about the stranger goings-on at CERN – Michael “Antimatter” Doser himself explains the workings of the Antiproton Decelerator, which is used in the production of antimatter.



There was plenty of time for a few nights at the movies – a reveller caught in the spotlight at the outdoor film at Perle du Lac, Geneva.



And a few more down the pub – Cian and Donal of the Emigrants jamming in Charly O’Neills.



All in all, it was a rather memorable summer at…


You Learn Something New Every Day

Friday, September 13th, 2013
The new display in the CERN cafeteria


One of the remarkable things about working at CERN is that there’s always a lot going on, even in this “quiet” year when the LHC’s shut down – far too much for anyone to keep track of what everyone else is up to. This morning, I saw a new accelerator status page up on the big displays in the cafeteria. I didn’t recognize the beamlines in the picture – it’s definitely not the LHC or the accelerator chain leading up to it – so I tweeted about it, and rather quickly got a reply from a friend of mine in the accelerator division.

It turns out we’re looking at CTF3, a test facility for research toward the Compact Linear Collider (CLIC), a multi-TeV electron-positron machine that might be built in a few decades. I knew that technology for CLIC was being actively studied at CERN, but I never thought much about what sort of facilities were here or what they were called, until they showed up on the display at breakfast. This is a very good place to learn something new every day!

I have no idea why they put the CTF3 beamlines on the display today, but maybe it’s because something interesting will happen soon? You can watch and find out for yourself here.


The discovery of the Higgs boson is a major step forward in our understanding of nature at the most fundamental levels. In addition to being the last piece of the standard model, it is also at the core of the fine tuning problem — one of the deepest mysteries in particle physics. So it is only natural that our scientific methodology rise to the occasion to provide the most powerful and complete analysis of this breakthrough discovery.

This week the ATLAS collaboration has taken an important step forward by making the likelihood function for three key measurements about the Higgs available to the world digitally. Furthermore, this data is being shared in a way that represents a template for how particle physics operates in the fast-evolving world of open access to data. These steps are a culmination of decades of work, so allow me to elaborate.

Four interactions that can produce a Higgs boson at the LHC

Higgs production and decay measured by ATLAS.

First of all, what are the three key measurements, and why are they important? The three results were presented by ATLAS in this recent paper.  Essentially, they are measurements for how often the Higgs is produced at the LHC through different types of interactions (shown above) and how often it decays into three different force carrying particles (photons, W, and Z bosons).  In this plot, the black + sign at (1,1) represents the standard model prediction and the three sets of contours represent the measurements performed by ATLAS.  These measurements are fundamental tests of the standard model and any deviation could be a sign of new physics like supersymmetry!

Ok, so what is the likelihood function, and why is it useful? Here maybe it is best to give a little bit of history. In 2000, the first in a series of workshops was held at CERN where physicists gathered to discuss the details of the statistical procedures that lead to the final results of our experiments. Perhaps surprisingly, there is no unique statistical procedure, and there is a lot of debate about the merits of different approaches. After a long discussion panel, Massimo Corradi cut to the point:

It seems to me that there is a general consensus that what is really meaningful for an experiment is likelihood, and almost everybody would agree on the prescription that experiments should give their likelihood function for these kinds of results. Does everybody agree on this statement, to publish likelihoods?

And as Louis Lyons, chairing the session, put it…

Any disagreement? Carried unanimously.  That’s actually quite an achievement for this workshop.

So there you have it, the likelihood function is the essential piece of information needed for communicating scientific results.
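To make this concrete, here is a toy illustration (all numbers invented, and much simpler than any real LHC analysis) of what publishing a likelihood enables: given only the likelihood function for a simple Poisson counting experiment, anyone can reproduce the best-fit signal strength and an approximate 68% interval without ever touching the underlying data.

```python
import math

# Toy counting experiment: n observed events, b expected background events,
# s expected signal events at signal strength mu = 1.  These numbers are
# invented for illustration only.
N_OBS, B, S = 25, 10.0, 12.0

def log_likelihood(mu):
    """Poisson log-likelihood for signal strength mu (constant terms dropped)."""
    lam = mu * S + B
    return N_OBS * math.log(lam) - lam

def best_fit():
    """Analytic maximum: lam = N_OBS, so mu_hat = (N_OBS - B) / S."""
    return (N_OBS - B) / S

def interval_68(step=1e-4):
    """Scan -2 ln lambda(mu) over mu in [0, 5) and keep the region below 1."""
    ll_max = log_likelihood(best_fit())
    inside = [i * step for i in range(int(5 / step))
              if -2 * (log_likelihood(i * step) - ll_max) < 1.0]
    return min(inside), max(inside)
```

Anyone holding `log_likelihood` can redo this exercise, combine it with other channels, or reinterpret it under a different theory; that is exactly the point of publishing the function itself rather than just the interval.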

So what happened next?  Well… for years, despite unanimous support, experiments still did not publish their likelihood functions.  Part of the reason is that we lacked the underlying technology to communicate those likelihood functions efficiently.  In the run-up to the LHC we developed some technology (associated with RooFit and RooStats) for sharing very complicated likelihood functions internally.  This would be the ideal way to share our likelihood functions, but we aren’t quite there yet.  In January 2013, we had a conference devoted to the topic of publishing likelihood functions, which culminated in the paper “On the presentation of LHC Higgs results”.  This paper, written by theorists and experimentalists, singled out the likelihood associated with the plot above as the most useful way of communicating information about the Higgs properties.

An overlay of the original ATLAS result (filled contours) and those reproduced from the official ATLAS likelihood functions.

The reason that these specific Higgs plots are so useful is that more specific tests of the standard model can be derived from them.  For instance, one might want to consider beyond-the-standard-model theories in which the Higgs interacts with all the matter particles (fermions) or all the force-carrying particles (vector bosons) differently than in the standard model.  To do that, it is useful to group together all of the information in a particular way and take a special 2-d slice through the 6-d parameter space described by the three 2-d plots above.  To the left is the result of this test (where the axes are called κ_V and κ_F for the vector bosons and fermions, respectively).  What is special about this plot is that it overlays the original ATLAS result (filled contours) and the contours reproduced from the official ATLAS likelihood functions.  While my student Sven Kreiss made the comparison as part of a test, anyone can now reproduce this plot from the official ATLAS likelihood functions.  More importantly, the same procedure that was used to make this plot can be used to test other specific theories — and there are a lot of alternative ways to reinterpret these Higgs results.
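As a rough sketch of what such a reinterpretation involves: map a point in the (κ_V, κ_F) plane onto predicted signal strengths for each channel, evaluate the published likelihoods there, and scan.  To be clear, this is not the ATLAS procedure; every central value, uncertainty, and κ scaling below is invented, the Gaussians ignore the correlations carried by the real likelihoods, and the loop-induced γγ coupling and total-width dependence are deliberately left out.

```python
# Toy reinterpretation of three 2-d signal-strength measurements in terms of
# (kappa_V, kappa_F).  Each channel is approximated by an uncorrelated Gaussian
# in (mu for fermion-induced production, mu for vector-boson production);
# all numbers are made up for illustration.
CHANNELS = {
    # channel: ((mu_fermionic best fit, sigma), (mu_vector best fit, sigma))
    "gamgam": ((1.1, 0.3), (1.2, 0.6)),
    "WW":     ((1.0, 0.3), (1.1, 0.7)),
    "ZZ":     ((1.2, 0.4), (1.0, 0.8)),
}

def chi2(kappa_v, kappa_f):
    """Combined -2 ln L under a crude scaling: fermion-induced production
    scales as kappa_f^2, vector-boson production as kappa_v^2, and all three
    decays here as kappa_v^2 (a deliberate toy simplification)."""
    total = 0.0
    for (mu_f, sig_f), (mu_v, sig_v) in CHANNELS.values():
        pred_f = kappa_f ** 2 * kappa_v ** 2   # fermionic production x V decay
        pred_v = kappa_v ** 2 * kappa_v ** 2   # vector production x V decay
        total += ((pred_f - mu_f) / sig_f) ** 2 + ((pred_v - mu_v) / sig_v) ** 2
    return total

def scan(lo=0.7, hi=1.3, step=0.01):
    """Grid-scan the (kappa_V, kappa_F) plane; return (chi2_min, kV, kF)."""
    pts = [lo + i * step for i in range(int(round((hi - lo) / step)) + 1)]
    return min((chi2(kv, kf), kv, kf) for kv in pts for kf in pts)
```

Swap the toy Gaussians for the published likelihood functions and the same scan produces contours like the ones in the plot, which is exactly why having the functions themselves matters.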

Great! So where can you find these likelihood functions, and what does this have to do with open access?  I think this part is very exciting.  CERN is now famous for being the birthplace of the World Wide Web and for having a forward-looking vision for open access to our published papers.  The sheer volume and complexity of the LHC data make the conversation about open access to the raw data quite complicated.  However, having access to our published results is much less controversial.  While it is not done consistently, there are several examples of experiments putting the information that goes into tables and figures on HepData (a repository for particle physics measurements).  Recently, our literature system INSPIRE started to integrate with HepData so that the data are directly associated with the original publication (here is an example).  What is important is that these data are discoverable and citable.  If someone uses the data, we want to know exactly what is being used, and the collaborations that produced the data deserve some credit.  INSPIRE is now issuing a Digital Object Identifier (DOI) for the data, which is a persistent and trackable link to the data.

So now for the fun part, you can go over to the INSPIRE record for the recent Higgs paper (http://inspirehep.net/record/1241574) and you will see this:

The INSPIRE record for the recent ATLAS Higgs paper.


If you click on the HepData tab at the top, it will take you to a list of data associated with this paper.  Each of the three entries has a DOI associated with it (and lists all the ATLAS authors).  For example, the H→γγ result’s DOI is 10.7484/INSPIREHEP.DATA.A78C.HK44, and this is what should be cited by any result that uses this likelihood.  (Note: to get to the actual data, you click on the Files tab.)  INSPIRE is now working so that your author profile will include not only all of your papers, but also the data sets that you are associated with (and you can also see the data associated with your ORCID ID).

The INSPIRE record for the H→γγ likelihood function.

Now it’s time for me and my co-authors to update our paper “On the presentation of LHC Higgs results” to cite this data.  And next week, Salvatore Mele, head of Open Access at CERN, will give a keynote presentation to the DataCite conference entitled “A short history of the Higgs Boson. From a tenth of a billionth of a second after the Big Bang, through the discovery at CERN, to a DataCite DOI”.

I truly hope that this becomes standard practice for the LHC.  It is a real milestone for the information architecture of the field of high energy physics and a step forward in the global analysis of the Higgs boson discovered at the LHC!

Update (Sept. 17): The new version of our paper is out that has citations to the likelihoods.

Update (Sept. 18): The data record now has a citation tab as well, so you can distinguish citations to the data and citations to the paper.


Prioritizing the future

Monday, September 9th, 2013

As I’ve discussed a number of times, the United States particle physics community has spent the last nine months trying to understand what the exciting research and discovery opportunities are for the next ten to twenty years, and what sort of facilities might be required to exploit them.  But what comes next?  How do we decide which of these avenues of research are the most attractive and, perhaps most importantly, achievable, given that we work within finite budgets, need the right enabling technologies to be available at the right times, and must plan in partnership with researchers around the world?

In the United States, this is the job of the Particle Physics Project Prioritization Panel, or P5.  What is this big mouthful?  First, it is a sub-panel of the High Energy Physics Advisory Panel, or HEPAP.  HEPAP is the official body that advises the Department of Energy and the National Science Foundation (the primary funders of particle physics in the US, and also the sponsors of the US LHC blog) on the programmatic direction of the field in the US.  As an official Federal Advisory Committee, HEPAP operates in full public view, but it is allowed to appoint sub-panels that are under the control of and report to HEPAP but have more flexibility to deliberate in private.

This particular sub-panel, P5, was first envisioned in a report of a previous HEPAP sub-panel in 2001 that looked at, among other things, the long-term planning process for the field.  The original idea was that P5 would meet quite regularly and continually review the long-term roadmap for the field, adjusting it according to current conditions and scientific knowledge.  In reality, however, P5s have been short-lived and re-formed every few years.  The last P5 report dates from 2008, and obviously a lot has changed since then — in particular, we now know from the LHC that there is a Higgs boson that looks like the one predicted in the standard model, and there have been some important advances in our understanding of neutrino mixing.  Thus the time is ripe to take another look at the plan.

And so it is that a new P5 was formed last week, tasked with coming up with a new strategic plan for the field “that can be executed over a 10 year timescale, in the context of a 20-year global vision for the field.” P5 is supposed to be taking into account the latest developments in the field, and use the Snowmass studies as inputs. The sub-panel is to consider what investments are needed to fulfill the scientific goals, what mix of small, medium and large experiments is appropriate, and how international partnerships can fit into the picture. Along the way, they are also being asked to provide a discussion of the scientific questions of the field that is accessible to non-specialists (along the lines of this lovely report from 2004) and articulate the value of particle-physics research to other sciences and society. Oh, and the sub-panel is supposed to have a final report by May 1. No problem at all, right?

Since HEPAP’s recommendations will drive the plan for the field, it is very important that this panel does a good job!  Fortunately, there are two good things going for it.  First, the membership of the panel looks really great — talented and knowledgeable scientists who are representative of the demographics of the field and include representatives from outside the US.  Second, they are being asked to make their recommendations in the context of fairly optimistic budget projections.  Let us only hope that these come to pass!

Watch this space for more about the P5 process over the coming eight months.