Posts Tagged ‘LHC’

Tweeting the Higgs

Wednesday, January 23rd, 2013

Back in July, two seminars took place that discussed searches for the Higgs boson at the Tevatron and the LHC. After nearly 50 years of waiting, an announcement of a $$5\sigma$$ signal, enough to claim discovery, was made and all of a sudden the Twitter world went crazy. New Scientist presented an analysis of the Higgs-related tweets by Domenico et al. in their Short Sharp Science article Twitter reveals how Higgs gossip reached fever pitch. I don’t want to repeat what is written in the article, so please take a few minutes to read it and watch the video featured in it.

The distribution of tweets around the July 2nd and July 4th announcements (note the log scale)

Instead of focusing on the impressive number of tweets and how many people were interested in the news, I think it’s more useful for me as a blogger to focus on how this gossip was shared with the world. The Higgs discovery was certainly not the only exciting physics news to come out of 2012, yet it was the story that spread furthest, and the main reason for this is the jargon that was used. People were already familiar with acronyms such as CERN and LHC. The name “Higgs” was easy to remember (for some reason many struggled with “boson”, calling it “bosun”, or worse) and, much to physicists’ chagrin, “God particle” made quite a few appearances too. It seems that the public awareness was primed and ready to receive the message. There were many fellow bloggers who chose to write live blogs and live tweet the event (I like to think that I started a bit of a trend there, with the OPERA faster-than-light neutrinos result, but that’s probably just wishful thinking!) Following the experiences of December 2011, when the webcast failed to broadcast properly for many users, people had Twitter on standby, with tweets already composed, hungry for numbers. The hashtags were decided in advance and, after a little jostling for the top spot, it was clear which ones were going to be the most popular. Despite all the preparation we still saw huge numbers of #comicsans tweets. Ah well, we can’t win them all!

The point is that while the world learned about the Higgs results, I think it’s just as important that we (the physicists) learn about the world and how to communicate effectively. This time we got it right, and I’m glad to see that it got out of our control as well. Our tweets went out, some questions were asked and points clarified, and the news spread. I’m not particularly fond of the phrase “God particle”, but I’m very happy that it made a huge impact, carrying the message further and reaching more people than the less sensational phrase “Higgs boson”. Everyone knows who God is, but who is Higgs? I think that this was a triumph in public communication, something we should be building on. Social media technologies are changing more quickly each year, so we need to keep up.

A map of retweets on July 4th, showing the global spread.

But moving back to the main point, the Higgs tweets went global and viral because they were well prepared and the names were simple. Other news included things like the search for the $$B_s$$ meson decaying to two muons and the limits that result places on SUSY, but how does one make a hashtag for that? I would not want to put the hashtag #bs on my life’s work. It’s always more exciting to announce a discovery than an exclusion too. The measurement of $$\theta_{13}$$ was just as exciting in my opinion, but it suffered the same problem. How is the general public supposed to interpret a Greek character and two numbers? I should probably point out that this is all to do with finding the right jargon for the public, and not about the public’s capacity to understand abstract concepts (a capacity which is frequently underestimated.) Understanding how $$\theta_{13}$$ fits into the PMNS mixing matrix is no more difficult than understanding the Higgs mechanism (in fact it’s easier!) It’s just that there’s no nice nomenclature to help spread the news, and that’s something that we need to fix as soon as possible.

As a side note, $$\theta_{13}$$ is important because it tells us about how the neutrinos mix. Neutrino mixing is physics beyond the Standard Model, so we should be getting more excited about it! If $$\theta_{13}$$ is non-zero then we can put another term into the matrix, and this fourth parameter, a complex phase, is what allows matter-antimatter asymmetry in the lepton sector, helping to explain why we still have matter hanging around in the universe, why we have solid things instead of just heat and light. Put like that it sounds more interesting and newsworthy, but that can’t be squeezed into a tweet, let alone a hashtag. It’s a shame that result didn’t get more attention.
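For the curious, here is how that fits together in the standard parameterization of the PMNS matrix (textbook material rather than anything from the article, with $$c_{ij}=\cos\theta_{ij}$$ and $$s_{ij}=\sin\theta_{ij}$$):

$$U = \begin{pmatrix} 1 & 0 & 0 \\ 0 & c_{23} & s_{23} \\ 0 & -s_{23} & c_{23} \end{pmatrix} \begin{pmatrix} c_{13} & 0 & s_{13}e^{-i\delta} \\ 0 & 1 & 0 \\ -s_{13}e^{i\delta} & 0 & c_{13} \end{pmatrix} \begin{pmatrix} c_{12} & s_{12} & 0 \\ -s_{12} & c_{12} & 0 \\ 0 & 0 & 1 \end{pmatrix}$$

The CP-violating phase $$\delta$$ only ever appears multiplied by $$s_{13}$$, so a non-zero $$\theta_{13}$$ is exactly what makes CP violation observable in neutrino oscillations.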

It’s great fun and a fine challenge to be part of this whole process. We are co-creators, exploring the new media together. Nobody knows what will work in the near future, but we can look back at what has already worked, and see how people passed on the news. Making news no longer stops once I hit “Publish”; it echoes around the world, through your tweets and reblogs, and we can see its journey. If we’re lucky it gets passed on enough to go viral, and then it’s out of our control. It’s this kind of interactivity that is so rewarding and engaging.

You can read the New Scientist article or the original paper on the arXiv.

Gluon Walls: A New Form of Matter?

Tuesday, January 8th, 2013

Theoretical physicist Raju Venugopalan

We sat down with Brookhaven theoretical physicist Raju Venugopalan for a conversation about “color glass condensate” and the structure of visible matter in the universe.

Q. We’ve heard a lot recently about a “new form of matter” possibly seen at the Large Hadron Collider (LHC) in Europe — a state of saturated gluons called “color glass condensate.” Brookhaven Lab, and you in particular, have a long history with this idea. Can you tell me a bit about that history?

A. The idea for the color glass condensate arose to help us understand heavy ion collisions at our own collider here at Brookhaven, the Relativistic Heavy Ion Collider (RHIC)—even before RHIC turned on in 2000, and long before the LHC was built. These machines are designed to look at the most fundamental constituents of matter and the forces through which they interact—the same kinds of studies that a century ago led to huge advances in our understanding of electrons and magnetism. Only now instead of studying the behavior of the electrons that surround atomic nuclei, we are probing the subatomic particles that make up the nuclei themselves, and studying how they interact via nature’s strongest force to “give shape” to the universe today.

We do that by colliding nuclei at very high energies to recreate the conditions of the early universe so we can study these particles and their interactions under the most extreme conditions. But when you collide two nuclei and produce matter at RHIC, and also at the LHC, you have to think about the matter that makes up the nuclei you are colliding. What is the structure of nuclei before they collide?

We all know the nuclei are made of protons and neutrons, and those are each made of quarks and gluons. There were hints in data from the HERA collider in Germany and other experiments that the number of gluons increases dramatically as you accelerate particles to high energy. Nuclear physics theorists predicted that the ions accelerated to near the speed of light at RHIC (and later at LHC) would reach an upper limit of gluon concentration—a state of gluon saturation we call color glass condensate.* The collision of these super-dense gluon force fields is what produces the matter at RHIC, so learning more about this state would help us understand how the matter is created in the collisions. The theory we developed to describe the color glass condensate also allowed us to make calculations and predictions we could test with experiments.

Higgs update, HCP 2012

Thursday, November 22nd, 2012

Last week, Seth and I met up to discuss the latest results from the Hadron Collider Physics (HCP) Symposium and what they mean for the Higgs searches. We have moved past discovery and now we are starting to perform precision measurements. Is this the Standard Model Higgs boson, or some other Higgs boson? Should we look forward to a whole new set of discoveries around the corner, or is the Higgs boson the final word for new physics that the LHC has to offer? We’ll find out more in the coming months!

Mixing it up

Wednesday, November 14th, 2012

One of the other results presented at the Hadron Collider Physics Symposium this week was the result of a search for $$D^{0}–\bar{D}^{0}$$ mixing at LHCb.

Cartoon: If a $$D^0$$ is produced, at some time t later, it is possible that the system has "oscillated" into a $$\bar{D}^0$$. This is because the mass eigenstates are not the same as the flavor eigenstates.

Neutral meson mixing is predicted for any neutral meson system, and has been verified for the $$K^0–\bar{K}^0$$, $$B^0–\bar{B}^0$$ and $$B_s^0–\bar{B}_s^0$$ systems. However, for the $$D^0–\bar{D}^0$$ system, no single measurement had established with greater than $$5\sigma$$ significance that mixing actually occurs, until now.

The measured quantity is the ratio $$R(t)$$, which is effectively the Taylor expansion of the time-dependent ratio of $$D^0 \rightarrow K^+ \pi^-$$ (“Wrong Sign”) decays to $$D^0\rightarrow K^- \pi^+$$ (“Right Sign”) decays. Charge conjugates of these decays are also included. There is a “Wrong Sign” and a “Right Sign” because the Right Sign decays are much more probable, according to the Standard Model.

The mixing of the $$D^0–\bar{D}^0$$ system is described by the parameters $$x = \Delta m /\Gamma$$ and $$y = \Delta \Gamma / 2\Gamma$$, where $$\Delta m$$ is the mass difference between the $$D^0$$ and $$\bar{D}^0$$, $$\Delta \Gamma$$ is the difference of the widths of the mass peaks, and $$\Gamma$$ is the average width. What appears in the description of $$R$$, however, is $$x’$$ and $$y’$$, which are $$x$$ and $$y$$ rotated by the strong phase difference between the Right Sign and Wrong Sign decays. The important point about $$x’$$ and $$y’$$ is that they appear in the time-dependent terms of the Taylor expansion of $$R$$. If there were no mixing at all, then we would expect the ratio to remain constant and the higher order time dependence to vanish. If mixing does occur, however, then a clear, non-flat trend should be seen, allowing a measurement of $$x’$$ and $$y’$$. That is why the time-dependent analysis is so important.
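For reference, that expansion takes the standard form from the charm-mixing literature (quoted here from memory, with $$R_D$$ the ratio of doubly Cabibbo-suppressed to Cabibbo-favored decay rates and $$\tau$$ the $$D^0$$ lifetime):

$$R(t) \approx R_D + \sqrt{R_D}\, y’\, \frac{t}{\tau} + \frac{x’^2 + y’^2}{4}\left(\frac{t}{\tau}\right)^2$$

with $$x’ = x\cos\delta + y\sin\delta$$ and $$y’ = y\cos\delta - x\sin\delta$$, where $$\delta$$ is the strong phase difference mentioned above. A flat ratio means $$x’$$ and $$y’$$ are consistent with zero; a slope or curvature in time is the signature of mixing.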

Fit of ratio of WS and RS decays as a function of decay time of the D meson. Flat line would be no mixing, sloped line indicates mixing. From http://arxiv.org/pdf/1211.1230.pdf

Result of the mixing parameter fit of the neutral D meson system. 1, 3, and 5 standard deviation contours are shown, and the + represents no mixing. From http://arxiv.org/pdf/1211.1230.pdf

The result is 9.1 $$\sigma$$ evidence for mixing, in agreement with previous results from BaBar, Belle and CDF. On top of confirming that the neutral D meson system does mix, this result is of particular importance because, coupled with the evidence for CP violation in the charm system, it raises the question of whether there is much more interesting physics beyond the Standard Model involving charm just waiting to be seen. Stay tuned!

The coolest and hottest fluid

Friday, October 19th, 2012

In September, the Large Hadron Collider (LHC) operators at CERN attempted a new trick: colliding protons in one beam with lead ions in the other. Usually, the LHC operates with two beams of identical particles (protons or ions) circulating in opposite directions in the accelerator. Here is what is expected from this new setup.

These ions are atoms stripped of all their electrons, leaving only the nucleus. Lead ions contain 82 protons plus 126 neutrons, all held together by the nuclear force. Protons are also composite objects, made of three quarks bound together by “gluons”, the particles carrying the nuclear force.

So when two such heavy ions collide at nearly the speed of light, I dare anyone to describe where each quark and each gluon will end up. Already, trying to predict where fifteen billiard balls go after breaking the pack is tough enough. But when each projectile is made of hundreds of particles, it becomes impossible.

At first glance, it would seem all we could get out of this is just a mess. But this turns out to be the coolest and hottest mess one will ever see. From the most energetic collisions comes a new form of matter called the quark-gluon plasma.

There are three very well known states of matter: solid, liquid and gas. Lesser known is the fourth state of matter, called plasma. This is what one finds inside a neon tube when the electric current applied is strong enough to strip the gas of its electrons. Positively charged ions and negatively charged electrons float around freely, having enough energy not to recombine.

The quark-gluon plasma is just one step above this. Imagine there is enough energy around that not only the atoms but also the nucleons (the name given to protons and neutrons, the particles found inside the nucleus) break apart and coexist in some sort of extremely energetic fluid. This is as hot as matter got instants after the Big Bang. What is so cool about it, though, is that this plasma exhibits collective behavior, meaning quarks and gluons do not float freely but have collective properties. The most spectacular of these is that this fluid has no viscosity and behaves as a perfect fluid. If you try to confine it in a container, it just flows up the container’s walls and spreads all over the place.

The ALICE experiment is dedicated to the study of the quark-gluon plasma. Each year, the LHC operates for a few weeks with lead ions instead of protons. ALICE collects data both during proton-proton collisions and heavy-ion collisions. Even when only protons collide, the projectiles are not solid balls like on a billiard table but composite objects. By comparing what is obtained from heavy-ion collisions with proton collisions, the ALICE physicists must first disentangle what comes from having protons in a bound state inside the nucleus as opposed to “free protons”.

So far, it appears that the quark-gluon plasma only forms during heavy-ion collisions, since they provide the necessary energy density over a substantial volume (namely, the size of a nucleus). Some of the effects observed, such as the number of particles coming out of the collisions at different angles or momenta, depend in part on the final state created. When the plasma is formed, it reabsorbs many of the particles created, such that fewer particles emerge from the collision.

By colliding protons and heavy ions, scientists hope to discern what comes from the initial state of the projectile (bound or free protons) and what is caused by the final state (like the suppression of particles emitted when a quark-gluon plasma forms).

Already, with only one day of data taken in this new mode, the ALICE collaboration has released two papers. The first presents the measurement of the charged-hadron density produced in proton-ion collisions and compares the result with the same measurement, after proper normalization, performed in proton-proton and ion-ion collisions. The second compares the transverse momentum distributions of charged hadrons measured in proton-ion and proton-proton collisions.

The ultimate goal is to study the so-called “structure function”, which describes how quarks and gluons are distributed inside protons, when they are free or embedded inside the nucleus.

More will be studied during the two-month running period with protons colliding on heavy ions planned for the beginning of 2013.

A “snapshot” of the debris coming out of a proton-lead ion collision captured by the ALICE detector showing a large number of various particles created from the energy released by the collision.

Pauline Gagnon


“Snowmass” (Not Snowmass)

Saturday, October 13th, 2012

Every so often, perhaps once or twice a decade, particle physics in the United States comes to some kind of a crossroads that requires us to think about the long-term direction of the field. Perhaps there is new experimental data that is pointing in new directions, or technology developments that make some new facility possible, or we’re seeing the end of the previous long-term plan and it’s time to develop the next one. And when this happens, the cry goes up in the community — “We need a Snowmass!”

Snowmass refers to Snowmass Village in Colorado, just down the road from Aspen, the home of the Aspen Center for Physics, a noted haunt for theorists. During the winter, Snowmass is a ski resort. During the summer, it’s a mostly empty ski resort, where it’s not all that expensive to rent some condos and meeting rooms for a few weeks. Over the past few decades there have been occasional “summer studies” held at Snowmass, typically organized by the Division of Particles and Fields of the American Physical Society (and sponsored by a host of organizations and agencies). It’s a time for the particle-physics community to come together for a few weeks and spend some quality time focusing on long-range planning.

The last big Snowmass workshop was in 2001. At the time, the Fermilab Tevatron was just getting started on a new data run after a five-year shutdown for upgrades, and the LHC was under construction. The top quark had been discovered, but was not yet well characterized. We were just beginning to understand neutrino masses and mixing. The modern era of observational cosmology was just beginning. A thousand physicists came to Snowmass over the course of three weeks to plot the future of the field. (And I was a lot younger.) Flash forward eleven years: the Tevatron has been shut down (leaving the US without a major high-energy particle collider), the LHC is running like gangbusters, we’re trying to figure out what dark energy is, and just in the past year two big shoes have dropped — we have measured the last neutrino mixing angle, and, quite famously, observed what could well be the Higgs boson. So indeed, it is time for another Snowmass workshop.

This week I came to Fermilab for a Community Planning Meeting for next year’s Snowmass workshop. Snowmass 2013 is going to be a bit different than previous workshops in that it will not actually be at Snowmass! Budgetary concerns and new federal government travel regulations have made the old style of workshop infeasible. Instead, there will be a shorter meeting this summer hosted by our colleagues at the University of Minnesota (hats off to thee for having us), so this time we won’t have as much time during the workshop to chew over the issues, and more work will have to be done ahead of time. (But I suspect that we’re still going to call this workshop “Snowmass”, just as the ICHEP conference was “the Rochester conference” for such a long time, even if it’s now the “Community Summer Study”.)

This Snowmass is being organized along the three “frontiers” that we’re using to classify the current research efforts in the field — energy, intensity and cosmic. As someone who works at the LHC, I’m most familiar with what’s going on at the energy frontier, and certainly there are important questions that have only come into focus this year. Did we observe the Higgs boson at the LHC? What more do we have to know about it to believe that it’s the Higgs? What are the implications of not having observed any other new particles yet for particle physics and for future experiments? The Snowmass study will help us understand how we answer these questions, and specifically what experiments and facilities are needed to do so. There are lots of interesting ideas out there right now. Can the LHC tell us what we need to know, possibly with an energy or luminosity upgrade? Is this the time to build a “Higgs factory” that would allow us to measure Higgs properties precisely? If so, what’s the right machine for that? Or do we perhaps need an accelerator with even greater energy reach, something that will help us create new particles that would be out of reach of the LHC? What kind of instrumentation and computing technologies are needed to make sense of the particle interactions at these new facilities? The intensity and cosmic frontiers have equally big and interesting questions. I would posit that the scientific questions of particle physics have not been so compelling for a long time, and that it is a pivotal time to think about what new experiments are needed.

However, we also have the bracing reality that we are looking at these questions in a budget environment that is perhaps as constrained as it has ever been. Presentations from our champions and advocates at the Department of Energy and the National Science Foundation, the agencies that fund this research (and that sponsor the US LHC blog), were encouraging about the scientific opportunities but also noted the boundary conditions that arise from the federal budget as a whole, national research priorities, and our pre-existing facilities plan. It will continue to be a challenge to make the case for our work (compelling as it may be to us, and to someone who might be interested in looking at the Quantum Diaries site) and to envision a set of facilities that can be built and used given the funding available.

The first (non-native) settlers of Snowmass, Colorado, were miners, who were searching for buried treasure under adverse conditions. They were constrained by the technology of the time, and the facilities that were available for their work. I shouldn’t suggest that what we are doing is exactly like mining (it’s much safer, for one thing), but hopefully when we go to Snowmass (or really “Snowmass”) we will be figuring out how to develop the technology and facilities that are needed to extract an even greater treasure.

From Now Until Mid-December, Expect One Thing from the LHC: More Collisions

Thursday, September 20th, 2012

Figure 1: Integrated luminosity for LHC Experiments versus time. 8 TeV proton-proton collisions began in April 2012. Credit: CERN

Hi All,

Quick post today. The plot above shows the amount of 8 TeV data collected by the LHC experiments. As of this month, the ATLAS and CMS detector experiments have each collected 15 fb-1 of data. A single fb-1 (pronounced: inverse femtobarn) is equivalent to about 70 trillion proton-proton collisions. In other words, ATLAS and CMS have each observed about 1,050,000,000,000,000 proton-proton collisions. That is 1.05 thousand-trillion, or $$1.05\times 10^{15}$$.

To understand how gargantuan a number this is, consider that it took the LHC’s predecessor, the Tevatron, 24 years to deliver 12 fb-1 of proton-antiproton collisions*. The LHC has collected this much data in five months. Furthermore, proton-proton collisions will officially continue until at least December 16th, at which time CERN will shut off the collider for the holiday season. Near the beginning of the calendar year, we can expect the LHC to collide lead ions for a while before the long, two-year shutdown. During this time, the LHC magnets will be upgraded in order to allow protons to run at 13 or 14 TeV, and the detector experiments will get some much-needed tender loving care in the form of maintenance and upgrades.

To estimate how much more data we might get before the New Year, let’s assume that the LHC will deliver 0.150 fb-1 per day from now until December 16th. I consider this to be a conservative estimate, but I refer you to the LHC’s Performance and Statistics page. I also assume that the experiments operate at 100% efficiency (not so conservative, but good enough). Running 7 days a week puts us at a little over 1 fb-1 per week. According to the LHC schedule, there are about 10 more weeks of running (12 weeks until Dec. 16 minus 2 weeks for “machine development”).
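Spelling out that arithmetic (a quick back-of-the-envelope sketch in Python, using only the round numbers above):

# Back-of-the-envelope projection from the numbers quoted above
collected = 15.0                # fb-1 already recorded by each experiment
per_day = 0.150                 # assumed fb-1 delivered per day
weeks_left = 10                 # running weeks until Dec. 16, minus machine development
projected = collected + per_day * 7 * weeks_left
collisions = projected * 70e12  # roughly 70 trillion collisions per fb-1
print("%.1f fb-1, about %.2e collisions" % (projected, collisions))
# prints: 25.5 fb-1, about 1.79e+15 collisions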

By this estimation, both ATLAS and CMS will have at least 25 fb-1 of data each before the shutdown!

25 fb-1 translates to 1.75 thousand-trillion proton-proton collisions, more than four times as much 8 TeV data as was used to discover the Higgs boson in July**.

Fellow QDer Ken Bloom has a terrific breakdown of what all this extra data means for studying physics. Up-to-the-minute updates about the LHC’s performance are available via the LHC Programme Coordination page, @LHCstatus, and @LHCmode. There are no on-going collisions at the moment because the LHC is currently in a technical stop/beam recommissioning/machine development/scrubbing period, but things will be back to normal next week.

Happy Colliding

- richard (@bravelittlemuon)

* 10 fb-1 were recorded each by CDF and DZero, but to be fair, it also took Fermilab about 100 million protons to make 20 or so antiprotons.

** The Higgs boson discovery used 5 fb-1 of 7 TeV data and 5.5 fb-1 of 8 TeV data

Beyond the Higgs: Training PanDA to Tackle Astrophysics, Biology

Tuesday, September 18th, 2012

The art of data mining is about searching for the extraordinary within a vast ocean of regularity. This can be a painful process in any field, but especially in particle physics, where the amount of data can be enormous, and ‘extraordinary’ means a new understanding about the fundamental underpinnings of our universe. Now, a tool first conceived in 2005 to manage data from the world’s largest particle accelerator may soon push the boundaries of other disciplines. When repurposed, it could bring the immense power of data mining to a variety of fields, effectively cracking open the possibility for more discoveries to be pulled up from ever-increasing mountains of scientific data.

Advanced data management tools offer scientists a way to cut through the noise by analyzing information across a vast network. The result is a searchable pool that software can sift through and use for a specific purpose. One such hunt was for the Higgs boson, the last remaining elementary particle of the Standard Model that, in theory, endows other particles with mass.

With the help of a system called PanDA, or Production and Distributed Analysis, researchers at CERN’s Large Hadron Collider (LHC) in Geneva, Switzerland discovered such a particle by slamming protons together at relativistic speeds hundreds of millions of times per second. The data produced from those trillions of collisions—roughly 13 million gigabytes worth of raw information—was processed by the PanDA system across a worldwide network and made available to thousands of scientists around the globe. From there, they were able to pinpoint an unknown boson with a mass between 125 and 127 GeV, a characteristic consistent with the long-sought Higgs.

An ATLAS event with two muons and two electrons - a candidate for a Higgs-like decay. The two muons are picked out as long blue tracks, the two electrons as short blue tracks matching green clusters of energy in the calorimeters. ATLAS Experiment © 2012 CERN.

The sheer amount of data arises from the fact that each particle collision carries unique signatures that compete for attention with the millions of other collisions happening nanoseconds later. These must be recorded, processed, and analyzed as distinct events in a steady stream of information.

Understanding the Higgs search

Wednesday, August 15th, 2012

It’s been over a month since CERN hosted a seminar on the updated searches for the Higgs boson. Since then ATLAS and CMS have submitted papers showing what they found, and recently I got news that the ATLAS paper was accepted by Physics Letters B, a prestigious journal. For those keeping score, that means it took just over five weeks to go from the announcement to publication, and believe it or not, that’s actually quite fast.

Crowds watch the seminar from Melbourne, Australia (CERN)

However, all this was last month’s news. Within a week of finding this new particle, physicists started on the precision spin measurement, to see if it really is the Higgs boson or not. Let’s take a more detailed look at the papers. You can see both papers as they were submitted on the arXiv here: ATLAS / CMS.

The Higgs backstory

In order to fully appreciate the impact of these papers we need to know a little history, and a little bit about the Higgs boson itself. We also need to know some of the fundamentals of scientific thinking and methodology. The “Higgs” mechanism was postulated almost 50 years ago by several different theorists: Brout, Englert, Guralnik, Hagen, Higgs, and Kibble. For some reason Peter Higgs seems to have his name attached to this boson, maybe because his name sounds “friendliest” when you put it next to the word “boson”. The “Brout boson” sounds harsh, and saying “Guralnik boson” a dozen times in a presentation is just awkward. Personally I prefer the “Kibble boson”, because as anyone who owns a dog will know, kibble gets everywhere when you spill it. You can tidy it up all you like and you’ll still be finding bits of kibble months later. You may not find bits often, but they’re everywhere, much like the Higgs field itself. Anyway, this is all an aside; let’s get back to the physics.

It helps to know some of the history behind quantum mechanics. The field of quantum mechanics started around the beginning of the 20th century, but it wasn’t until 1927 that the various ideas started to get resolved into a consistent picture of the universe. Some of the greatest physicists from around the world met at the 1927 Solvay Conference to discuss the different ideas, and it turned out that the two main approaches to quantum mechanics, although they looked different, were actually the same. It was just a matter of making everything fit into a consistent mathematical framework. At that time the understanding of nature was that fields had to be invariant with respect to gauge transformations and Lorentz transformations.

The Solvay Conference 1927, where some of the greatest physicists of the 20th century met and formulated the foundations of modern quantum mechanics. (Wikipedia)

A gauge transformation is the result of the kind of mathematics we need to represent particle fields, and these fields must not introduce new physics when they get transformed. To take an analogy, imagine you have the blueprints for a building and you want to make some measurements of various distances and angles. If someone makes a copy of the blueprints, but changes the direction of North (so that the building faces another direction), then this must not change any of the distances or angles. In that sense the distances and angles in the blueprint are rotation-invariant. They are rotation-invariant because we need to use Euclidean space to represent the building, and a consequence of using Euclidean space is that any distances and angles described in the space must be invariant with respect to rotation. In quantum mechanics we use complex numbers to represent the field, and a gauge transformation is just a rotation of a complex number.
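As a minimal illustration (my notation, not anything from the papers): under a gauge transformation the field picks up a phase,

$$\psi \rightarrow e^{i\alpha}\psi,$$

but anything measurable depends on $$|\psi|^2 = \psi^{*}\psi$$, and since $$e^{-i\alpha}e^{i\alpha} = 1$$ the rotation drops out, just like the distances on the rotated blueprint.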

The Lorentz transformation is a bit simpler to understand, because it’s special relativity, which says that if you have a series of events, observers moving at different speeds and in different directions will agree on the causality of those events. The rest of special relativity is just a matter of details, and those details are a lot of fun to look at.

By the time all of quantum mechanics was coming together there were excellent theories that took these symmetries into account. Things seemed to be falling into place, and running the arguments backwards led to some very powerful predictions. Instead of observing a force and then requiring it to be gauge and Lorentz invariant, physicists found they could start with a gauge and Lorentz invariant model and use that to predict what forces can exist. Using plain old Euclidean space and making it Lorentz invariant gives us Minkowski space, which is perfect for making sure that our theories work well with special relativity. (To get general relativity we start with a space which is not Euclidean.) Then we can write the most general description of a field we can think of in this space, and as long as it is gauge invariant it’s a valid physical field. The only problem was that there were some interactions that seemed to involve a massive photon-like boson. Looking at the interactions gave us a good idea of the mass of this particle, the $$W$$ boson. In the next few decades new particles were discovered and the Standard Model was proposed to describe all these phenomena. There are three forces in the Standard Model, the electromagnetic force, the weak force, and the strong force, and each one has its own field.

Inserting the Higgs field

The Higgs field is important because it unifies two of the three fundamental fields in particle physics: the electromagnetic and weak fields. It does this by mixing all the fields up (and in doing so, it mixes the bosons up.) Flip Tanedo has tried to explain the process from a theorist’s point of view to me privately on more than one occasion, but I must admit I just ended up a little confused by some of the finer points. The system starts with three fields which are pretty much all the same as each other: the $$W_1$$, $$W_2$$, and $$W_3$$. These fields don’t produce any particles themselves because they don’t obey the relevant physical laws (it’s a bit more subtle in reality, but that’s a blog post in itself.) If they did produce their own particles then we would also get massless particles known as Goldstone bosons, and we haven’t seen these, so we know there is something else going on. Instead of making massless bosons they mix amongst themselves to create new fields, giving us massive bosons, and the Goldstone bosons get converted into extra degrees of freedom. Along comes the Higgs field and suddenly these fields separate and mix, giving us four new fields.

The Higgs field, about to break the symmetry and give mass (Flip Tanedo)

The $$W_1$$ and $$W_2$$ mix to give us the $$W^+$$ and $$W^-$$ bosons, and then the $$W_3$$ field meets the $$B$$ field to give us the $$Z$$ boson and the photon. What makes this interesting is that the photon behaves well on its own. It has no mass, and this means that its field is automatically gauge invariant. Nature could have decided to create just the electromagnetic field and everything would work out fine. Instead we have the photon and three massive bosons, and the fields of these massive bosons cannot be gauge invariant by themselves; they need something else to make it all balance out. By now you’ve probably guessed what this mystery object is: it’s the Higgs field, and with it, the Higgs boson! This field fixes it all up so that the fields mix, we get massive bosons, and all the relevant laws (gauge invariance and Lorentz invariance) are obeyed.
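In equations, the mixing looks like this (the standard electroweak bookkeeping, with $$\theta_W$$ the weak mixing angle; nothing here is specific to these papers):

$$W^{\pm} = \frac{1}{\sqrt{2}}\left(W_1 \mp i W_2\right), \qquad \gamma = B\cos\theta_W + W_3\sin\theta_W, \qquad Z = -B\sin\theta_W + W_3\cos\theta_W$$

The photon is the particular combination of $$B$$ and $$W_3$$ that stays massless after the Higgs field does its work; the orthogonal combination, the $$Z$$, picks up a mass.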

Before we go any further it’s worth pointing a few things out. The mass of the $$W$$ boson is so large in comparison to other particles that it slows down the interactions of a lot of particles, and this is one of the reasons that the sun burns so “slowly”. If the $$W$$ boson were massless then it could be produced in huge numbers and the rate of fusion in the sun would be much faster. The reason we have had a sun for billions of years, allowing the evolution of life on Earth (and maybe elsewhere), is because the Higgs field gives such a large mass to the $$W$$ boson. Just let that thought sink in for a few seconds and you’ll see the cosmic significance of the Higgs field. Before we get ahead of ourselves we should note that the Higgs field leads to unification of the electromagnetic and weak forces, but it says nothing about the strong force. Somehow the Higgs field has missed out one of the three fundamental forces of the Standard Model. We may one day unite the three fields, but don’t expect it to happen any time soon.

“Observation” vs “discovery”, “Higgs” vs “Higgs-like”

There’s one more thing that needs to be discussed before looking at the papers, and that’s a rigorous discussion of what we mean by “discovery” and whether we can claim discovery of the Standard Model Higgs boson yet. “Discovery” has come to mean a five sigma observation of a new resonance, or in other words that the probability that the Standard Model background in the absence of a new particle would bunch up like this is less than one part in several million. If we see five sigma we can claim a discovery, but we still need to be a little careful. Suppose we had a million mass points; what is the probability that there is at least one five sigma fluctuation in there? It’s roughly $$20\%$$, so looking at just the local probability is not enough; we need to look at the probability that takes all the data points into account. Otherwise we can increase the chance of seeing a fluctuation just by changing the way we look at the data. Both ATLAS and CMS have been conscious of this effect, known as the “Look Elsewhere Effect”, so every time they provide results they also provide the global significance, and that is what we should be looking at when we talk about the discovery.
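A quick numerical sketch of that look-elsewhere arithmetic (an idealization assuming a one-sided p-value and a million fully independent mass points, which real search bins are not):

from scipy.stats import norm

p_local = norm.sf(5)                      # one-sided p-value of a 5 sigma fluctuation
n_points = 1_000_000                      # independent mass points (an idealization)
p_global = 1 - (1 - p_local) ** n_points  # chance of at least one such fluctuation
print("local p = %.3g, global p = %.2f" % (p_local, p_global))
# prints: local p = 2.87e-07, global p = 0.25 (the same ballpark as the figure above)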

Regular readers might remember Flip’s comic about me getting worked up over the use of the word “discovery” a few weeks back. I got worked up because the word “discovery” had been misused. Whether an observation is $$4.9$$ or $$5.1$$ sigma doesn’t matter that much really, and I think everyone agrees about that. What bothered me was that some people decided to change what was meant by a discovery after seeing the data, and once you do that you stop being a scientist. We can set whatever standards we like, but we must stick to them. Burton, on the other hand, was annoyed by a choice of font. Luckily our results are font-invariant, and someone said “If you see five sigma you can present in whatever durn font you like.”

Getting angry over the change of goalposts. Someone has to say these things.

In addition to knowing what we mean by “discovery” we also need to take hypothesis testing into account. Anyone who claims that we have discovered the Higgs boson is at best misinformed, and at worst willingly untruthful. We have discovered a new particle, there’s no doubt about that, but now we need to eliminate the things that are not the Higgs until we’re confident that the only thing left is the Higgs boson. We have seen this new particle decay to two photons, and this tells us that it can only have spin 0 or spin 2. That’s eliminated spin 1, spin 3, spin 4… etc for us, all with a single measurement. What we are doing now is trying to exclude both the spin 0 and spin 2 possibilities. Only one of these will be excluded, and then we will know for sure what the spin is. And then we know it’s the Standard Model Higgs boson, right? Not quite! Even if we know it’s a spin 0 particle we would still need to measure its branching fractions to confirm that it is what we have been looking for all along. Bear this in mind when thinking about the papers: all we have seen so far is a new particle. Just because we’re searching for the Higgs and we’ve found something new, it does not mean that it’s the Higgs boson.

The papers

Finally we get to the papers. From the titles we can see that both ATLAS and CMS have been suitably agnostic about the particle’s nature. Neither claim it’s the Higgs boson and neither even claim anything more than an “observation”. The abstracts tell us a few useful bits of information (note that the masses quoted agree to within one sigma, which is reassuring) but we have to tease out the most interesting parts by looking at the details. Before the main text begins each experiment dedicates their paper to the memories of those who have passed away before the papers were published. This is no short list of people, which is not surprising given that people have been working on these experiments for more than 20 years. Not only is this a moving start to the papers, it also underlines the impact of the work.

Both papers were dedicated to the memories of colleagues who did not see the observation. (CMS)

Both papers waste no time getting into the heart of the matter, which is the nature of the Standard Model and how it’s been tested for several decades. The only undiscovered particle predicted by the Standard Model is the Higgs boson; we’ve seen everything else we expected to see. Apart from a handful of gauge couplings, just about every prediction of the Standard Model has been vindicated. In spite of that, the search for the Higgs boson has taken an unusually long time. Searches took place at LEP and the Tevatron long before the LHC collided beams, and the good news is that the LEP limit excluded the region that is very difficult for the LHC to rule out (less than $$114GeV$$). CDF and D0 both saw an excess in the favored region, but the significance was quite low, and personally I’m skeptical since we’ve already seen that CDF’s dijet mass scale might have some problems associated with it. Even so, we shouldn’t spend too long trying to interpret (or misinterpret) results; we should take them at face value, at least at first. Next the experiments tell us which final states they look for, and this is where things will get interesting later on. Before describing the detectors, each experiment pauses to remind us that the conditions of 2012 are more difficult than those of 2011. The average number of interactions per beam crossing increased by a factor of two, making all analyses more difficult to work with (but ultimately all our searches a little more sensitive.)

At this point both papers summarize their detectors, but CMS goes out of their way to show off how the design of their detector was optimized for general Higgs searches. Having a detector which can reconstruct high momentum leptons, low momentum photons and taus, and also tag b-jets is not an easy task. Both experiments do well to be able to search for the Higgs boson in the channels they look at. Even if we limit ourselves to where ATLAS looked, the detectors would still have to trigger on leptons and photons, and be able to reconstruct not only those particles, but also the missing transverse energy. That’s no easy task at a hadron collider with many interactions per beam crossing.

The two experiments have different overall strategies for the Higgs searches. ATLAS focused their attention on just two final states in 2012: $$\gamma\gamma$$ and $$ZZ^*$$, whereas CMS consider five final states: $$\gamma\gamma$$, $$ZZ^*$$, $$WW^*$$, $$\tau\tau$$, and $$b\bar{b}$$. ATLAS focus mostly on the most sensitive modes, the so-called “golden channel”, $$ZZ^*$$, and the fine mass resolution channel, $$\gamma\gamma$$. With a concerted effort, a paper that shows only these modes can be competitive with a paper that shows many more, and labor is limited on both experiments. CMS spread their effort across several channels, covering all the final states with expected sensitivities comparable to the Standard Model.

$$H\to ZZ^*$$

The golden channel analysis has been presented many times before because it is sensitive across a very wide mass range. In fact it spans the range $$110-600GeV$$, which is the entire width of the Higgs search program at ATLAS and CMS. (Constraints from other areas of physics tell us to look as high as $$1000GeV$$, but at high masses the Higgs boson would have a very large width, making it extremely hard to observe. Indirect results favor the low mass region, which is less than around $$150GeV$$.) Given the experience physicists have had with this channel it’s no surprise that the backgrounds are very well understood at this point. The dominant “irreducible” background comes from Standard Model production of $$Z/\gamma^*$$ bosons, where there is one real $$Z$$ boson and one “off-shell”, or virtual, boson. This is called irreducible because the background has the same final state as the signal, so we can’t remove further background without also removing some signal. The off-shell boson can be an off-shell $$Z$$ boson or an off-shell photon; it doesn’t really matter which, since they look the same as backgrounds. In the lower mass range there are also backgrounds from $$t\bar{t}$$, but fortunately these are well understood, with good control regions in the data. Using all this knowledge, the selection criteria for $$8TeV$$ were revisited to increase sensitivity as much as possible.

The invariant mass spectrum for ATLAS's H→ZZ* search (ATLAS)

Since this mode has a real $$Z$$ boson, we can look for two high momentum leptons in the final state, which makes things especially easy. The backgrounds are small, and the events are easy to identify, so the trigger is especially simple. Events are stored to disk if there is at least one very high momentum lepton, or two medium momentum leptons, which means that we don’t have to throw any events away. Some triggers fire so rapidly that we can only store some of the events from them; we call this prescaling. When we keep $$1$$ in $$n$$ events we have a prescale of $$n$$. For a Higgs search we want as high an efficiency as possible, so we usually require a prescale of $$1$$. Things are not quite so nice for the $$\gamma\gamma$$ mode, as we’ll see later.

The invariant mass spectrum for CMS's H→ZZ* search (CMS)

After applying a plethora of selections on the leptons and reconstructing the $$Z$$ and Higgs boson candidates, the efficiencies for the final states vary from $$15\%-37\%$$, which is actually quite high. No detector can cover the whole of the solid angle, and efficiencies vary with the detector geometry. The efficiency needs to be very high because the fraction of Higgs bosons that would decay to these final states is so small. At a mass of $$125GeV$$ the branching fraction to the $$ZZ^*$$ state is about $$2\%$$, the branching fraction of each $$Z$$ to two leptons is about $$6\%$$, and both $$Z$$ bosons must decay that way. Putting that all together means that only about $$1$$ in $$10,000$$ Higgs bosons would decay to this final state. At a mass of $$125GeV$$ the LHC would produce about $$15,000$$ Higgs bosons per $$fb^{-1}$$. So for $$10fb^{-1}$$ we could expect to have about $$11$$ Higgs bosons decaying to this final state, and we could expect to see about $$3$$ of those events reconstructed. This is a clean mode, but it’s an extremely challenging one.
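Spelling out that arithmetic (a back-of-the-envelope sketch in Python, using only the round numbers quoted above):

n_higgs = 15_000 * 10            # ~15,000 Higgs bosons per fb-1, times 10 fb-1
bf = 0.02 * 0.06 ** 2            # BF(H->ZZ*) times BF(Z->ll) for both Z bosons
produced = n_higgs * bf          # decays to the four-lepton final state
reconstructed = produced * 0.25  # a mid-range reconstruction efficiency (15-37%)
print("%.1f %.1f" % (produced, reconstructed))
# prints: 10.8 2.7, i.e. the ~11 produced and ~3 reconstructed quoted above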

The selection criteria are applied, the background is estimated, and the results are shown. As you can see there is a small but clear excess over background in the region around $$125GeV$$ and this is evidence supporting the Higgs boson hypothesis!

CMS see slightly fewer events than expected, but still see a clear excess (CMS)

$$H\to\gamma\gamma$$

Of the $$H\to ZZ^*$$ and $$H\to\gamma\gamma$$ modes, the $$\gamma\gamma$$ final state is the more difficult one to reconstruct. The triggers are inherently “noisy” because they must fire on anything that looks like a high energy photon, and there are many sources of background for this. As well as real photons from Standard Model processes (where the rate of photon production is not small), there are jets faking photons and electrons faking photons. This makes the mode dominated by backgrounds. In principle the mode should be easy: just reconstruct Higgs candidates from pairs of photons and wait. The peak will reveal itself in time. However, ATLAS and CMS are in the middle of a neck-and-neck race to find the Higgs boson, so both collaborations exploit any advantage they can, and suddenly these analyses become some of the most difficult to understand.

A typical H→γγ candidate event with a striking signature (CMS)

To get a handle on the background ATLAS and CMS each choose to split the mode into several categories, depending on the properties of the photons or the final state, and each one with its own sensitivity. This allows the backgrounds to be controlled with different strategies in each category, leading to increased overall sensitivity. Each category has its own mass resolution and signal-to-background ratio, each is mutually independent of the others, and each has its own dedicated studies. For ATLAS the categories are defined by the presence of two jets, whether or not the photon converts (produces an $$e^-e^+$$ pair) in the detector, the pseudorapidity of the photons, and a kinematic quantity called $$p_{T_T}$$, with similar categories for CMS.

When modelling the background, both experiments wisely chose to use the data. The background for the $$\gamma\gamma$$ final state is notoriously hard to predict accurately, because there are so many contributions from different backgrounds, from real and fake photon candidates, and many kinematic or detector effects to take into account. The choice of background model even varies on a category-by-category basis, and the choices range from simple polynomial fits to the data, to exponential and skewed Gaussian backgrounds. What makes these background models particularly troublesome is that the background has to be estimated using the signal region, so small deviations that are caused by signal events could be interpreted by the fitting algorithm as a weird background shape. The fitting mechanism must be robust enough to fit the background shapes without being fooled into thinking that a real excess of events is just a slightly different shape.

ATLAS's H→γγ search, where events are shown weighted (top) and unweighted (bottom) (ATLAS)

To try to squeeze even more sensitivity out of the data CMS use a boosted decision tree to aid signal separation. A boosted decision tree is a sophisticated statistical analysis method that uses signal and background samples to decide what looks like signal, and then uses several variables to return just one output variable. A selection can be made on the output variable that removes much of the background while keeping a lot of the signal. Using boosted decision trees (or any multivariate analysis technique) requires many cross checks to make sure the method is not biased or “overtrained”.
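To make the idea concrete, here is a toy sketch in Python with scikit-learn. This is only an illustration of the technique on made-up Gaussian “events”; the real CMS analysis uses its own input variables, training samples, and validation:

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 10_000
# Toy "signal" and "background" events, each with four kinematic-like variables
signal = rng.normal(loc=1.0, scale=1.0, size=(n, 4))
background = rng.normal(loc=0.0, scale=1.5, size=(n, 4))
X = np.vstack([signal, background])
y = np.concatenate([np.ones(n), np.zeros(n)])

# Hold out half of the sample to check for overtraining
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=0)
bdt = GradientBoostingClassifier(n_estimators=100, max_depth=3)
bdt.fit(X_train, y_train)

# Several input variables are reduced to one output variable per event
score = bdt.predict_proba(X_test)[:, 1]
keep = score > 0.5  # a cut on the output keeps signal and rejects background
print("signal efficiency:    %.2f" % keep[y_test == 1].mean())
print("background rejection: %.2f" % (1 - keep[y_test == 0].mean()))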

CMS's H→γγ search, where events are shown weighted (main plot) and unweighted (inset) (CMS)

After analyzing all the data, the spectra show a small bump. The results can seem a little disappointing at first; after all, the peak is barely discernible, and so much work has gone into the analyses. Both experiments show the spectra after weighting the events to take the uncertainties into account, and this makes the plots a little more convincing. Even so, what matters is the statistical significance of these results, and this cannot be judged by eye. The final results show a clear preference for a boson with a mass of $$125GeV$$, consistent with the Higgs boson. CMS see a hint at around $$135GeV$$, but this is probably just a fluctuation, given that ATLAS do not see anything similar.

ATLAS local significance for H→γγ (ATLAS)

(If you’ve been reading the blog for a while you may remember a leaked document from ATLAS that hinted at a peak around $$115GeV$$ in this invariant mass spectrum. That document used biased and non peer-reviewed techniques, but the fact remains that even without these biases there appears to be a small excess in the ATLAS data around $$115GeV$$. The significance of this bump has decreased as we have gathered more data, so it was probably just a fluctuation. However, you can still see a slight bump at $$115GeV$$ in the significance plot. Looking further up the spectrum, both ATLAS and CMS see very faint hints of something at $$140GeV$$ which appears in both the $$ZZ^*$$ and $$\gamma\gamma$$ final states. This region has already been excluded for a Standard Model Higgs, but there may be something else lurking out there. The evidence is feeble at the moment, but that’s what we’d expect for a particle with a low production cross section.)

$$H\to WW^*$$

One of the most interesting modes for a wide range of the mass spectrum is the $$WW^*$$ final state. In fact, this was the first mode to reach sensitivity in Standard Model Higgs boson searches, with exclusions seen at ATLAS, CMS, and the Tevatron experiments at around $$160GeV$$ (the mass of two on-shell $$W$$ bosons) before any other mass region. The problem with this mode is that it has two neutrinos in the final state. It would be nice to have an inclusive sample of $$W$$ bosons, including the hadronic final states, but the problems here are the lack of a good choice of trigger, and the irreducible and very large background. That means we must select events with two leptons and two neutrinos in them. As the favored region shrinks to lower masses this mode gets more challenging, first because we lose the mass constraint on the second $$W$$ boson (it must decay off-shell), and second because we must be sensitive in the low missing transverse energy region, which starts to approach our resolution for this variable.

While the signal region approaches our resolution from above, the resolution itself degrades from below, because the number of interactions per beam crossing increases, raising the overall noise in the detector. To make progress in this mode takes a lot of hard work for fairly little gain. Both papers mention explicitly how difficult the search is in a high pileup scenario, with CMS stating

“The analysis of the $$7TeV$$ data is described in [referenced paper] and remains unchanged, while the $$8TeV$$ analysis was modified to cope with more difficult conditions induced by the higher pileup of the 2012 data taking.”

and ATLAS saying

“The analysis of the $$8TeV$$ data presented here is focused on the mass range $$110<m_H<200GeV$$. It follows the procedure used for the $$7TeV$$ data described in [referenced paper], except that more stringent criteria are applied to reduce the $$W$$+jets background and some selections have been modified to mitigate the impact of the high instantaneous luminosity at the LHC in 2012.”

It’s not all bad news though, because the branching fraction to this final state is much higher than that of the $$ZZ^*$$ final state. The branching fraction for the Standard Model Higgs boson to $$WW^*$$ is about $$10$$ times higher than that for $$ZZ^*$$, and the branching fraction of the $$W$$ boson to leptons is also about $$3$$ times higher than the $$Z$$ boson to leptons, which gives another order of magnitude advantage. Unfortunately all these events must be smeared out across a large spectrum. There is one more trick we have up our sleeves though, and it comes from the spin of the parent. Since the Standard Model Higgs boson has zero spin the $$W$$ bosons tend to align their spins in opposite directions to make it all balance out. This then favors one decay direction over another for the leptons. The $$W^+$$ boson decays with a neutrino in the final state, and because of special relativity the neutrino must align its spin against its direction of motion. The $$W^-$$ boson decays with an anti-neutrino, which must align its spin with its direction of motion. This forces the two leptons to travel in the same direction with respect to the decay axis of the Higgs boson. The high momenta of the leptons smear things out a bit, but generally we should expect to see one high momentum lepton, and a second lower momentum lepton in roughly the same region of the detector.
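
For the curious, the transverse mass shown in the plot below is built from the dilepton system and the missing transverse momentum. Here’s a small sketch assuming the commonly used definition $$m_T = \sqrt{(E_T^{\ell\ell} + E_T^{miss})^2 - |\vec{p}_T^{\ell\ell} + \vec{p}_T^{miss}|^2}$$; the numbers in the example are made up.

```python
# A sketch of the transverse mass used in H->WW* searches, where the two
# neutrinos prevent a full invariant-mass reconstruction. Inputs are the
# dilepton system and the missing transverse momentum; values are invented.
import math

def transverse_mass(ptll_x, ptll_y, m_ll, met_x, met_y):
    """m_T = sqrt((E_T^ll + E_T^miss)^2 - |p_T^ll + p_T^miss|^2)."""
    et_ll = math.sqrt(ptll_x**2 + ptll_y**2 + m_ll**2)
    et_miss = math.sqrt(met_x**2 + met_y**2)
    sum_x, sum_y = ptll_x + met_x, ptll_y + met_y
    mt2 = (et_ll + et_miss)**2 - (sum_x**2 + sum_y**2)
    return math.sqrt(max(mt2, 0.0))

# Example: 45 GeV dilepton pT, 30 GeV dilepton mass, 50 GeV MET back-to-back
print(transverse_mass(45.0, 0.0, 30.0, -50.0, 0.0))  # ~104 (GeV)
```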

The transverse mass for ATLAS's H→WW* search (ATLAS)

ATLAS did not actually present results for the $$WW^*$$ final state on July 4th, but they did show it in the subsequent paper. CMS showed the $$WW^*$$ final state on July 4th, although it did somewhat reduce their overall significance. Both ATLAS and CMS devote parts of their papers to the background estimates for the $$WW^*$$ mode, but ATLAS seem to go to greater lengths to describe the cross checks they performed in data. In fact this may help to explain why ATLAS did not quite have the result ready for July 4th, whereas CMS did. There’s a trade-off between getting the results out quickly and spending some extra time to understand the background. This might have paid off for ATLAS, since they seem to be more sensitive in this mode than CMS.

The invariant mass for CMS's H→WW* search (CMS)

After looking at the data we can see that both ATLAS and CMS are right at the limits of their sensitivity in this mode. They are not limited by statistics, they are limited by systematic uncertainties, and the mass point $$125GeV$$ sits uncomfortably close to some very large uncertainties. The fact that this mode is sensitive at all is a tribute to the hard work of dozens of physicists who went the extra mile to make it work.

CMS's observed and expected limits for H→WW*, showing the dramatic degradation in sensitivity as the mass decreases (CMS)

$$H\to b\bar{b}$$

At a mass of $$125GeV$$ by far the largest branching fraction of the Standard Model Higgs boson is to $$b\bar{b}$$. CDF and D0 have both seen a broad excess in this channel (although personally I have some doubts about the energy scale of jets at CDF, given the dijet anomaly they see that D0 does not see) hinting at a Higgs boson of $$120-135GeV$$. The problem with this mode is that the background is many orders of magnitude larger than the signal, so some special tricks must be used to remove the background. What is done at all four experiments is to search for a Higgs boson produced in association with a $$W$$ or $$Z$$ boson, and this greatly reduces the background. ATLAS did not present an updated search in the $$b\bar{b}$$ channel, and taking a look at the CMS limits we can probably see why: the contribution is not as significant as in other modes. The way CMS proceed with the analysis is to use several boosted decision trees (one for each mass point) and to select candidates based on the output of each tree. The result is less than $$1$$ sigma of significance, about half of what is expected, but if this new boson is the Higgs boson then this significance will increase as we gather more data.

A powerful H→bb search requires a boosted decision tree, making the output somewhat harder to interpret (CMS)

It’s interesting to note that the $$b\bar{b}$$ final state is sensitive to both a spin 0 and a spin 2 boson (as I explained in a previous post) and it may have different signal strength parameters for different spin states. The signal strength parameter tells us how many events we see compared to how many we expect to see, and it is denoted by the symbol $$\mu$$. If there is no signal then $$\mu=0$$, if the signal is exactly as large as we expect then $$\mu=1$$, and any other value indicates new physics. It’s possible to have a negative value of $$\mu$$, which would indicate quantum mechanical interference of two or more states that cancel out. Such an interference term is visible in the invariant mass of two leptons, where the virtual photon and virtual $$Z$$ boson wavefunctions interfere with each other.
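
A counting-experiment caricature makes $$\mu$$ concrete: subtract the expected background from the observed events and divide by the expected signal. The numbers below are invented, only the statistical uncertainty is included, and the real extraction is a full likelihood fit.

```python
# A toy signal strength: mu = (observed - background) / expected signal,
# with a naive Poisson uncertainty. All counts are invented.
import math

def signal_strength(n_obs, n_bkg, n_sig_expected):
    """mu = 0 for no signal, mu = 1 for exactly the SM expectation."""
    mu = (n_obs - n_bkg) / n_sig_expected
    err = math.sqrt(n_obs) / n_sig_expected  # statistical uncertainty only
    return mu, err

mu, err = signal_strength(n_obs=520, n_bkg=450, n_sig_expected=50)
print(f"mu = {mu:.2f} +/- {err:.2f}")  # -> mu = 1.40 +/- 0.46
```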

$$H\to\tau\tau$$

Finally, the $$\tau\tau$$ mode is perhaps the most enlightening and the most exciting right now. CMS showed updated results, but ATLAS didn’t. CMS’s sensitivity was expected to approach the Standard Model expectation, but the observed results didn’t reach that far, and that is crucially important. CMS split their final states by the decay mode of the $$\tau$$, where the final states include $$e\mu 4\nu$$, $$\mu\mu 4\nu$$, $$\tau_h\mu 3\nu$$, and $$\tau_h e3\nu$$, where $$\tau_h$$ is a hadronically decaying $$\tau$$ candidate. This mode has at least three neutrinos in the final state, so like the $$WW^*$$ mode the events get smeared across a mass spectrum. There are irreducible backgrounds from $$Z$$ bosons decaying to $$\tau\tau$$ and from Drell-Yan $$\tau\tau$$ production, so the analysis must search for an excess of events over these backgrounds. In addition to the irreducible backgrounds there are efficiency penalties associated with the reconstruction of $$\tau$$ leptons, which make this a challenging mode to work with. There are dedicated algorithms for reconstructing hadronically decaying $$\tau$$ jets, and these have to balance signal efficiency for real $$\tau$$ leptons against background rejection.

CMS's H→ττ search, showing no signal (CMS)

After looking at the data CMS expected to see an excess of $$1.4$$ sigma, but they actually see $$0$$ sigma, indicating that there may be no Standard Model Higgs boson after all. Before we jump to conclusions it’s important to note a few things. First of all, statistical fluctuations happen, and they can go down just as easily as they can go up, so this could just be a fluke. It’s a $$1.5$$ sigma difference, so the probability of this being due to a fluctuation, if the Standard Model Higgs boson exists, is about $$8\%$$. On its own that might seem quite low, but we have $$8$$ channels to study, so the chance of this happening in any one of the channels is roughly $$50\%$$; it’s looking more likely that this is just a fluctuation. ATLAS also have a $$\tau\tau$$ analysis, so we should expect to see some results from them in the coming weeks or months. If they also don’t see a signal then it’s time to start worrying.
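
The arithmetic behind that “roughly 50%” is simple enough to show. Here’s the back-of-envelope version (the p-value-to-sigma conversion uses the one-sided convention):

```python
# Back-of-envelope trials factor: an 8% fluctuation in one channel is not
# rare once you look at 8 channels.
from scipy.stats import norm

p_single = 0.08   # probability of such a deficit in one channel
n_channels = 8
p_any = 1.0 - (1.0 - p_single) ** n_channels
print(f"chance of seeing this in at least one channel: {p_any:.0%}")  # ~49%

# Converting between p-values and sigmas (one-sided convention)
print(f"p = {p_single:.2f} corresponds to {norm.isf(p_single):.1f} sigma")
```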

CMS's limit for H→ττ actually shows a deficit at 125GeV, a warning sign of possible trouble for the Higgs search! (CMS)

Combining results

Both experiments combine their results and this is perhaps the most complicated part of the whole process. There are searches with correlated and uncorrelated uncertainties, there are two datasets at different energies to consider, and there are different signal-to-background ratios to work with. ATLAS and CMS combine their 2011 and 2012 searches, so they both show all five main modes (although only CMS show the $$b\bar{b}$$ and $$\tau\tau$$ modes in 2012.)

When combining the results we can check to see if the signal strength is “on target” or not, and there is some minor disagreement between the modes. For the $$ZZ^*$$ and $$WW^*$$ modes the signal strengths are about right, but for the $$\gamma\gamma$$ mode it’s a little high for both experiments, so there is tension between these modes. Since these are the most sensitive modes and we have more data on the way, this tension should either resolve itself or get worse before the end of data taking. The $$b\bar{b}$$ and $$\tau\tau$$ modes are lower than expected for both experiments (although for ATLAS the error bars are so large it doesn’t really matter), suggesting that this new particle may be a non-Standard-Model Higgs boson, or it could be something else altogether.
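
One crude way to quantify this kind of tension is to combine the per-mode signal strengths with inverse-variance weights and compute a chi-squared for their compatibility. The $$\mu$$ values and uncertainties below are invented for illustration, and the real combination is a likelihood fit with correlated uncertainties.

```python
# A crude compatibility check between per-mode signal strengths. The mu
# values and errors are invented placeholders, not the experiments' numbers.
import numpy as np
from scipy.stats import chi2

mu = np.array([1.6, 0.9, 1.0, 0.4, 0.1])   # gamma-gamma, ZZ*, WW*, bb, tautau
err = np.array([0.4, 0.4, 0.5, 0.7, 0.8])

w = 1.0 / err**2                            # inverse-variance weights
mu_comb = np.sum(w * mu) / np.sum(w)
chisq = np.sum(((mu - mu_comb) / err) ** 2)
p = chi2.sf(chisq, df=len(mu) - 1)          # large p => modes compatible
print(f"combined mu = {mu_comb:.2f}, chi2 = {chisq:.1f}, p = {p:.2f}")
```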

Evidence of tension between the γγ and fermionic final states (CMS)

While the signal strengths seem to disagree a little, the masses all seem to agree, both within experiments and between them. The mass of $$125GeV$$ is consistent with other predictions (e.g. the Electroweak Fit) and it sheds light on what to look for beyond the Standard Model. Many theories favor a lower mass Higgs as part of a multiplet of other Higgs bosons, so we may see some other bosons. In particular, the search for the charged Higgs boson at ATLAS has started to exclude regions of the $$\tan\beta$$ vs $$m_{H^+}$$ plane, and the search might cover the whole plane in the low mass region by the end of 2012 data taking. Although a mass of $$125GeV$$ is consistent with the Electroweak Fit, it is a bit higher than the most favored region (around $$90GeV$$) so there’s certainly space for new physics, given the observed exclusions.

The masses seem to agree, although the poor resolution of the WW* mode is evident when compared to the ZZ* and γγ modes (ATLAS)

To summarize the results: ATLAS see a $$5.9$$ sigma local excess, which is a $$5.1$$ sigma global excess, and technically this is a discovery. CMS see a $$5.0$$ sigma local excess, which is a $$4.6$$ sigma global excess, falling a little short of a discovery. The differences in results are probably due to good luck on the part of ATLAS and bad luck on the part of CMS, but we’ll need to wait for more data to see if this is the case. The results should “even out” if the differences are just due to fluctuations up for ATLAS and down for CMS.

ATLAS proudly show their discovery (ATLAS)

If you’ve read this far then you’ve probably picked up on the main message: we haven’t discovered the Standard Model Higgs boson yet! We still have a long road ahead of us and already we have moved on to the next stage. We need to measure the spin of this new boson, and if we exclude the spin 0 case then we know it is not a Higgs boson. If we exclude the spin 2 case then we still need to go a little further to show it’s the Standard Model Higgs boson. The spin analysis is rather complicated, because we need to measure the angles between the decay products and look for correlations. We need to take the detector effects into account, then subtract the background spectra. What is left after that are the signal spectra, and we’re going to be statistically limited in what we see. It’s a tough analysis, there’s no doubt about it.
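
As a cartoon of the spin measurement, here’s a toy that compares a flat $$\cos\theta^*$$ distribution (what a spin 0 parent gives) with a mock $$1+\cos^2\theta^*$$ shape standing in for one possible spin 2 scenario, using a simple KS test. The real analysis uses full angular correlations, background subtraction, and detector effects, so treat this purely as an illustration of the statistical question.

```python
# Toy spin discrimination: flat cos(theta*) for spin 0 versus a mock
# 1 + cos^2(theta*) shape for "spin 2". Both distributions are invented
# stand-ins for the real angular analysis.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(7)

def sample_one_plus_cos2(n):
    """Accept-reject sampling from f(c) ~ 1 + c^2 on [-1, 1]."""
    out = []
    while len(out) < n:
        c = rng.uniform(-1, 1, size=n)
        u = rng.uniform(0, 2, size=n)        # max of 1 + c^2 is 2
        out.extend(c[u < 1 + c**2].tolist())
    return np.array(out[:n])

spin0 = rng.uniform(-1, 1, size=2000)        # isotropic for a scalar
spin2 = sample_one_plus_cos2(2000)
stat, p = ks_2samp(spin0, spin2)
print(f"KS p-value = {p:.2g}")  # small p => the shapes are distinguishable
```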

We need to see the five main modes to confirm that this is what we have been looking for for so long. If we get the boson modes ($$ZZ^*$$, $$WW^*$$, $$\gamma\gamma$$) spot on relative to each other, but see no signal in the fermion modes, then we may have a fermiophobic Higgs boson, which is an interesting scenario. (A “normal” fermiophobic Higgs boson has already been excluded, so any fermiophobic Higgs boson we may see must be very unusual.)

There are also many beyond the Standard Model scenarios that must be studied. As more regions of parameter space are excluded, theorists tweak their models, and give us updated hints on where to search. ATLAS and CMS have groups dedicated to searching for beyond the Standard Model physics, including additional Higgs bosons, supersymmetry and general exotica. It will be interesting to see how their analyses change in light of the favored mass region in the Higgs search.

A favored Higgs mass has implications for physics beyond the Standard Model. Combined with the limits on new particles (shown in plot) many scenarios can be excluded (ATLAS)

2012 has been a wonderful year for physics, and it looks like it’s only going to get better. There are still a few unanswered questions and tensions to resolve, and that’s what we must expect from the scientific process. We need to wait a little longer to get to the end of the story, but the anticipation is all part of the adventure. We’ll know what is really happening by the end of Moriond 2013, in March. Only then can we say with certainty “We have proven/disproven the existence of the Standard Model Higgs boson”!

I like to say “We do not do these things because they are easy. We do them because they are difficult”, but I think Winston Churchill said it better:

“This is not the end. It is not even the beginning of the end, but it is perhaps the end of the beginning.” W. Churchill

References etc

Plots and photos taken from:
“Webcast of seminar with ATLAS and CMS latest results from ICHEP”, ATLAS Experiment, CERN, ATLAS-PHO-COLLAB-2012-014
Wikipedia
“Observation of a new particle in the search for the Standard Model Higgs boson with the ATLAS detector at the LHC”, ATLAS Collaboration, arXiv:1207.7214v1 [hep-ex]
“Observation of a new boson at a mass of 125 GeV with the CMS experiment at the LHC”, CMS Collaboration, arXiv:1207.7235v1 [hep-ex]
Flip Tanedo

It’s been a while since I last posted. Apologies. I hope this post makes up for it!