Posts Tagged ‘Tevatron’

This article first appeared in ISGTW Dec. 21, 2011.

A night-time view of the Tevatron. Photo by Reidar Hahn.

This is the first part of a two-part series on the contribution Tevatron-related computing has made to the world of computing. This part begins in 1981, when the Tevatron was under construction, and brings us up to recent times. The second part will focus on the most recent years, and look ahead to future analysis.

Few laypeople think of computing innovation in connection with the Tevatron particle accelerator, which shut down earlier this year. Mention of the Tevatron inspires images of majestic machinery, or thoughts of immense energies and groundbreaking physics research, not circuit boards, hardware, networks, and software.

Yet over the course of more than three decades of planning and operation, a tremendous amount of computing innovation was necessary to keep the data flowing and physics results coming. In fact, computing continues to do its work. Although the proton and antiproton beams no longer brighten the Tevatron’s tunnel, physicists expect to be using computing to continue analyzing a vast quantity of collected data for several years to come.

When all that data is analyzed, when all the physics results are published, the Tevatron will leave behind an enduring legacy. Not just a physics legacy, but also a computing legacy.

In the beginning: The fixed-target experiments

This image of an ACP system was taken in 1988. Photo by Reidar Hahn.

1981. The first Indiana Jones movie is released. Ronald Reagan is the U.S. President. Prince Charles makes Diana a Princess. And the first personal computers are introduced by IBM, setting the stage for a burst of computing innovation.

Meanwhile, at the Fermi National Accelerator Laboratory in Batavia, Illinois, the Tevatron has been under development for two years. And in 1982, the Advanced Computer Program (ACP) formed to confront key particle physics computing problems. ACP tried something new in high performance computing: building custom systems using commercial components, which were rapidly dropping in price thanks to the introduction of personal computers. For a fraction of the cost, the resulting 100-node system doubled the processing power of Fermilab’s contemporary mainframe-style supercomputers.

“The use of farms of parallel computers based upon commercially available processors is largely an invention of the ACP,” said Mark Fischler, a Fermilab researcher who was part of the ACP. “This is an innovation which laid the philosophical foundation for the rise of high throughput computing, which is an industry standard in our field.”

The Tevatron fixed-target program, in which protons were accelerated to record-setting speeds before striking a stationary target, launched in 1983 with five separate experiments. When ACP’s system went online in 1986, the experiments were able to rapidly work through an accumulated three years of data in a fraction of that time.

Entering the collider era: Protons and antiprotons and run one

1985. NSFNET (National Science Foundation Network), one of the precursors to the modern Internet, is launched. And the Tevatron’s CDF detector sees its first proton-antiproton collisions, although the Tevatron’s official collider run one won’t begin until 1992.

The experiment’s central computing architecture filtered incoming data by running Fortran-77 algorithms on ACP’s 32-bit processors. But for run one, they needed more powerful computing systems.

By that time, commercial workstation prices had dropped so low that networking them together was simply more cost-effective than a new ACP system. ACP had one more major contribution to make, however: the Cooperative Processes Software.

CPS divided a computational task into a set of processes and distributed them across a processor farm – a collection of networked workstations. Although the term “high throughput computing” was not coined until 1996, CPS fits the HTC mold. As with modern HTC, farms using CPS are not supercomputer replacements. They are designed to be cost-effective platforms for solving specific compute-intensive problems in which each byte of data read requires 500-2000 machine instructions.
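CPS itself predates modern toolkits, and its actual interface is not shown here. As a rough sketch of the processor-farm idea it pioneered (one coordinating process splitting a compute-heavy task across many cheap nodes, then gathering the results), here is a minimal, hypothetical master/worker example in Python:

from multiprocessing import Pool

def process_chunk(events):
    # Stand-in for a compute-intensive filter: in a real farm, each byte
    # of data read costs hundreds of machine instructions of physics work.
    return sum((e * e) % 7 for e in events)

if __name__ == "__main__":
    data = list(range(1_000_000))                  # one "run" of raw events
    chunks = [data[i:i + 10_000] for i in range(0, len(data), 10_000)]
    with Pool(processes=8) as farm:                # eight "farm nodes"
        results = farm.map(process_chunk, chunks)  # distribute, then gather
    print("combined result:", sum(results))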

CPS went into production-level use at Fermilab in 1989; by 1992 it was being used by nine Fermilab experiments as well as a number of other groups worldwide.

1992 was also the year that the Tevatron’s second detector experiment, DZero, saw its first collisions. DZero launched with 50 traditional compute nodes running in parallel, connected to the detector electronics; the nodes executed filtering software written in Fortran, E-Pascal, and C.

Gearing up for run two

"The Great Wall" of 8mm tape drives at the Tagged Photon Laboratory, circa 1990 - from the days before tape robots. Photo by Reidar Hahn.

1990. CERN’s Tim Berners-Lee launches the first publicly accessible World Wide Web server using his URL and HTML standards. One year later, Linus Torvalds releases Linux to several Usenet newsgroups. And both DZero and CDF begin planning for the Tevatron’s collider run two.

Between the end of collider run one in 1996 and the beginning of run two in 2001, the accelerator and detectors were scheduled for substantial upgrades. Physicists anticipated more particle collisions at higher energies, and multiple interactions that were difficult to analyze and untangle. That translated into managing and storing 20 times the data from run one, and a growing need for computing resources for data analysis.

Enter the Run Two Computing Project (R2CP), in which representatives from both experiments collaborated with Fermilab’s Computing Division to find common solutions in areas ranging from visualization and physics analysis software to data access and storage management.

R2CP officially launched in 1996. It was the early days of the dot-com era. eBay had existed for a year, and Google was still under development. IBM’s Deep Blue defeated chess master Garry Kasparov. And Linux was well-established as a reliable open-source operating system. The stage was set for experiments to get wired and start transferring their irreplaceable data to storage via Ethernet.

The high-tech tape robot used today. Photo by Reidar Hahn.

“It was a big leap of faith that it could be done over the network rather than putting tapes in a car and driving them from one location to another on the site,” said Stephen Wolbers, head of the scientific computing facilities in Fermilab’s computing sector. He added ruefully, “It seems obvious now.”

The R2CP’s philosophy was to use commercial technologies wherever possible. In the realm of data storage and management, however, none of the existing commercial software met their needs. To fill the gap, teams within the R2CP created Enstore and the Sequential Access Model (SAM, which later stood for Sequential Access through Meta-data). Enstore interfaces with the data tapes stored in automated tape robots, while SAM provides distributed data access and flexible dataset history and management.

By the time the Tevatron’s run two began in 2001, DZero was using both Enstore and SAM, and by 2003, CDF was also up and running on both systems.

Linux comes into play

The R2CP’s PC Farm Project targeted the issue of computing power for data analysis. Between 1997 and 1998, the project team successfully ported CPS and CDF’s analysis software to Linux. To take the next step and deploy the system more widely for CDF, however, they needed their own version of Red Hat Linux. Fermi Linux was born, offering improved security and a customized installer; CDF migrated to the PC Farm model in 1998.

The early computer farms at Fermilab, when they ran a version of Red Hat Linux (circa 1999). Photo by Reidar Hahn.

Fermi Linux enjoyed limited adoption outside of Fermilab until 2003, when Red Hat’s freely available distribution gave way to the commercial Red Hat Enterprise Linux. The Fermi Linux team rebuilt Red Hat Enterprise Linux into the prototype of Scientific Linux, and formed partnerships with colleagues at CERN in Geneva, Switzerland, as well as a number of other institutions; Scientific Linux was designed for site customizations, so that in supporting it they also supported Scientific Linux Fermi and Scientific Linux CERN.

Today, Scientific Linux is ranked 16th among open source operating systems; the latest version was downloaded over 3.5 million times in the first month following its release. It is used at government laboratories, universities, and even corporations all over the world.

“When we started Scientific Linux, we didn’t anticipate such widespread success,” said Connie Sieh, a Fermilab researcher and one of the leads on the Scientific Linux project. “We’re proud, though, that our work allows researchers across so many fields of study to keep on doing their science.”

Grid computing takes over

As both CDF and DZero datasets grew, so did the need for computing power. Dedicated computing farms reconstructed data, and users analyzed it using separate computing systems.

“As we moved into run two, people realized that we just couldn’t scale the system up to larger sizes,” Wolbers said. “We realized that there was really an opportunity here to use the same computer farms that we were using for reconstructing data, for user analysis.”

A wide-angle view of the modern Grid Computing Center at Fermilab. Today, the GCC provides computing to the Tevatron experiments as well as the Open Science Grid and the Worldwide Large Hadron Collider Computing Grid. Photo by Reidar Hahn.

Today, the concept of opportunistic computing is closely linked to grid computing. But in 1996 the term “grid computing” had yet to be coined. The Condor Project had been developing tools for opportunistic computing since 1988. In 1998, the first Globus Toolkit was released. Experimental grid infrastructures were popping up everywhere, and in 2003, Fermilab researchers, led by DZero, partnered with the US Particle Physics Data Grid, the UK’s GridPP, CDF, the Condor team, the Globus team, and others to create the Job and Information Management system, JIM. Combining JIM with SAM resulted in a grid-enabled version of SAM: SAMGrid.

“A pioneering idea of SAMGrid was to use the Condor Match-Making service as a decision making broker for routing of jobs, a concept that was later adopted by other grids,” said Fermilab-based DZero scientist Adam Lyon. “This is an example of the DZero experiment contributing to the development of the core Grid technologies.”
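As a loose illustration of the matchmaking idea in that quote (jobs advertise requirements, resources advertise properties, and a broker pairs them), here is a toy sketch; the attribute names and matching rule are invented for illustration and are not Condor’s actual ClassAd language:

jobs = [
    {"name": "reco-run42", "needs": {"os": "linux", "min_cpus": 8}},
    {"name": "b-tag-fit",  "needs": {"os": "linux", "min_cpus": 2}},
]
resources = [
    {"name": "farm-node-01", "os": "linux", "cpus": 4},
    {"name": "farm-node-02", "os": "linux", "cpus": 16},
]

def matches(job, res):
    # the "broker": pair each job with the first resource satisfying it
    return (res["os"] == job["needs"]["os"]
            and res["cpus"] >= job["needs"]["min_cpus"])

for job in jobs:
    target = next((r for r in resources if matches(job, r)), None)
    print(job["name"], "->", target["name"] if target else "queued")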

By April 2003, the SAMGrid prototype was running on six clusters across two continents, setting the stage for the transition to the Open Science Grid in 2006.

From the Tevatron to the LHC – and beyond

Throughout run two, researchers continued to improve the computing infrastructure for both experiments. A number of computing innovations emerged before the run ended in September 2011. Among these was CDF’s GlideCAF, a system that used the Condor glide-in system and Generic Connection Brokering to provide an avenue through which CDF could submit jobs to the Open Science Grid. GlideCAF served as the starting point for the subsequent development of a more generic glidein Work Management System. Today glideinWMS is used by a wide variety of research projects across diverse research disciplines.

Another notable contribution was the Frontier system, which was originally designed by CDF to distribute data from central databases to numerous clients around the world. Frontier is optimized for applications where there are large numbers of widely distributed clients that read the same data at about the same time. Today, Frontier is used by CMS and ATLAS at the LHC.
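Frontier’s real machinery is HTTP-based, with layers of web caches sitting between the central database and its clients. The toy below only illustrates the read-through caching pattern it exploits, where many clients asking for the same read-only record at about the same time are almost all served from cache; the names are invented for illustration:

class ReadThroughCache:
    def __init__(self, backend):
        self.backend = backend       # slow lookup against the central database
        self.store = {}
        self.hits = self.misses = 0

    def get(self, key):
        if key in self.store:
            self.hits += 1           # served locally, no round trip
        else:
            self.misses += 1
            self.store[key] = self.backend(key)
        return self.store[key]

def central_db(key):                 # pretend this is slow and far away
    return f"calibration-data-for-{key}"

cache = ReadThroughCache(central_db)
for client in range(1000):           # 1000 clients read the same record
    cache.get("run-42-alignment")
print(cache.hits, "hits,", cache.misses, "miss(es)")   # -> 999 hits, 1 miss(es)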

“By the time the Tevatron shut down, DZero was processing collision events in near real-time and CDF was not far behind,” said Patricia McBride, the head of scientific programs in Fermilab’s computing sector. “We’ve come a long way; a few decades ago the fixed-target experiments would wait months before they could conduct the most basic data analysis.”

One of the key outcomes of computing at the Tevatron was the expertise developed at Fermilab over the years. Today, the Fermilab computing sector has become a worldwide leader in scientific computing for particle physics, astrophysics, and other related fields. Some of the field’s top experts worked on computing for the Tevatron. Some of those experts have moved on to work elsewhere, while others remain at Fermilab where work continues on Tevatron data analysis, a variety of Fermilab experiments, and of course the LHC.

The accomplishments of the many contributors to Tevatron-related computing are noteworthy. But there is a larger picture here.

“Whether in the form of concepts, or software, over the years the Tevatron has exerted an undeniable influence on the field of scientific computing,” said Ruth Pordes, Fermilab’s head of grids and outreach. “We’re very proud of the computing legacy we’ve left behind for the broader world of science.”

— Miriam Boon


Real CMS proton-proton collision events in which 4 high energy electrons (green lines and red towers) are observed. The event shows characteristics expected from the decay of a Higgs boson but is also consistent with background Standard Model physics processes. Courtesy: CMS

Today physicists at CERN on the CMS and ATLAS experiments at the Large Hadron Collider announced an update on their search for the Higgs boson. That may make you wonder (I hope) what Fermilab’s role in all of this is. Well, glad you asked.

Fermilab supports the 1,000 US LHC scientists and engineers by providing office and meeting space as well as the Remote Operation Center. Fermilab helped design the CMS detector and a portion of the LHC accelerator, and is working on upgrades for both. About one-third of the members of each of the Tevatron’s experiments, CDF and DZero, are also members of the LHC experiments.

That means that a good portion of the LHC researchers are also looking for the Higgs boson with the Tevatron. Because the Tevatron and LHC accelerators collide different pairs of particles, the dominant way in which the experiments search for the Higgs at the two accelerators is different. Thus the two machines offer complementary search strategies.

If the Higgs exists and acts the way theorists expect, it is crucial to observe it in both types of decay patterns. Watch this video to learn how physicists search for the Higgs boson. These types of investigations might lead to the identification of new and unexpected physics.

Scientists from the CDF and DZero collaborations at Fermilab continue to analyze data collected before the September shutdown of the Tevatron in the search for the Higgs boson.

The two collaborations will announce their latest results for the Higgs boson search at an international particle physics conference in March 2012. This new updated analysis will have 20 to 40 percent more data than the July 2011 results as well as further improvements in analysis methods.

The Higgs particle is the last not-yet-observed piece of the theoretical framework known as the Standard Model of particles and forces. Watch this video to learn about the nature of the Higgs boson and how it works. According to the Standard Model, the Higgs boson explains why some particles have mass and others do not. The Higgs most likely has a mass between 114 and 137 GeV/c2, about 100 times the mass of a proton. This predicted mass range is based on stringent constraints established by earlier measurements made by the Tevatron and other accelerators around the world, and confirmed by the searches of LHC experiments presented so far in 2011. This mass range is well within reach of the Tevatron collider.

The Tevatron experiments have already demonstrated that they can ferret out the Higgs-decay pattern by applying well-established search techniques to an extremely rare but firmly expected physics signature. This signature consists of pairs of heavy bosons (WW or WZ) that decay into a pair of b quarks, a process that closely mimics the main signature that the Tevatron experiments use to search for the Higgs particle, i.e. the Higgs decaying to a pair of b quarks, which has by far the largest probability in this mass range. Thus, if a Standard Model Higgs exists, the Tevatron experiments will see it.

If the Standard Model Higgs particle does not exist, Fermilab’s Tevatron experiments are on track to rule it out this winter. The CDF and DZero experiments have excluded the existence of a Higgs particle in the 100-108 and the 156-177 GeV/c2 mass ranges and will have sufficient analysis sensitivity to rule out the mass region in between this winter.

While today’s announcement shows the progress that the LHC experiments have made in the last few months, all eyes will be on the Tevatron and on the LHC in March 2012 to see what they have to say about the elusive Higgs boson.

— Tona Kunz


Tevatronic Rhapsody

Friday, September 30th, 2011

The big news today is that one of the biggest accelerators is shutting down after 28 years of excellent service. When SLAC closed down PEP-II, I wrote a tribute to the tune of American Pie (called “The Day the Mesons Died”), so it’s only fair that I do the same for the Tevatron.

Although the Tevatron always happened to be on the wrong side of the continent or the wrong side of the Atlantic from me, I did start my particle physics career with the help of CDF. I worked on Monte Carlo simulations, trying to find a strategy for measuring Bs oscillations. That’s where it all began, and for that I’m grateful.

So here we are, Tevatronic Rhapsody, (hastily written in the gaps between work) to the tune of Queen’s Bohemian Rhapsody. If you really want to annoy your office mates, you can sing along with this backing track.

Tevatronic Rhapsody

Is this the data?
Is this Monte Carlo?
Caught in a meeting
No escape from the next slideshow
Open your eyes
Look up to the slides and see
Is that the top quark?
Is it the final piece?
Because it’s rarely come
Quickly go
Mass is high
Lifetime low
Any way it decays doesn’t really matter to me
(\ell b)

Bs just flipped its quarks
Put an s where there was b
Now it violates CP
Bs, you were too heavy
But not for D0 and CDF!
Bs, ooooh
You decayed into hadrons
And showed us how to violate CP
With a phase, complex phase
Because antimatter matters

Too late, our time has come
No funding for the beams
No more events to be seen
Goodbye antiprotons
We’ve got to go
Got to analyze the data for results

Bs, oooh (any way the fit goes)
You went to two muons
More often than in the Standard Model
[Celebratory guitar riff]

I see a little silhouette-o of a peak
Is it Higgs? Is it Higgs?!
Will we get a Nobel Prize?
ICHEP, Lepton-Photon,
Conferences they go on

Set the limits! (Set the limits!) Set the limits! (Set the limits!) Set the limits! Make a plot!
Upload the taaaaalk!

Here is a spectrum, with strange topology
It is a dijet, it’s an anomaly!
Technicolor, or a monstrosity?

Heavy top, lighter top, is it CPT?
CPT? No! It violates Lorentz!
CPT?
CPT? No! It violates Lorentz!
CPT?
CPT? No! It violates Lorentz!
Violates Lorentz!
(CPT?)
Violates Lorentz!
(CPT?)
Violates Lorentz-enz-enz-enz!

Oh bottom mesons, and charmed mesons, and baryons, we saw them!
The quark model had positions set aside for all, (except the X!)

So you think you can stop us and turn off our beams?
So you think we won’t be publishing reams?
Oh, data
We’ve still got loads of data!
Just gotta publish
Just gotta analyze data

[Time to rock out]

The universe, it matters, anyone can see
More than antimatters, to me

Have a great Friday! And congratulations to everyone who worked with the Tevatron. Today is the end of a glorious era and A Job Well Done (as we Brits say).


Tevatron past as LHC prologue

Wednesday, September 28th, 2011

This Friday, Fermilab will turn off the Tevatron for the last time after a 28-year run. It has been a constant in my life as a particle physicist, and indeed for a whole generation of particle physicists. I know some people who have managed to spend their entire careers involved with the Tevatron in some way. Not true for me; I was on hiatus at the Cornell Electron Storage Ring for five years as a graduate student. But the Tevatron was where I had my first experiences as an undergraduate researcher; as a college freshman, I was stunned to find myself with a Fermilab ID card in my pocket and suddenly in on the hunt for the top quark. (No dice; another six years, significant detector upgrades, and more than an order of magnitude more data had to come first.) And as a postdoctoral researcher, it was where I had my greatest triumphs (moderate as they may be) as a full-time researcher. (As a professor with many other things to juggle, it would be a stretch to call me a full-time researcher now.) I learned a tremendous amount along the way about physics and about how to be a physicist.

(But I will not be attending the shutdown ceremonies on September 30 — it’s Rosh Hashanah, the Jewish new year. What is it with the managers of particle physics laboratories who can’t read a calendar? So much for getting Fermilab Today to pick up this blog post….)

The Tevatron’s longevity surely puts it into a special category of scientific enterprises that have captured the public imagination because of their epic scope. The Voyager 1 spacecraft, for instance, has been chugging along since 1977, and barring unforeseen circumstances will continue to tell us about the nature of the universe. The Tevatron in its own way will keep chugging along too, as there is so much data yet to analyze that it will keep teaching us about the universe for some years to come.

I’m not going to tick off all of the accomplishments of the Tevatron and CDF and D0, its principal experiments; this has been done elsewhere, and also has been covered in excellent presentations at the DPF meeting by Steve Holmes and Paul Grannis, both of whom were there pretty much from the beginning. (Chris Quigg also provides a lovely summary of the physics achievements in the CERN Courier.) But what I would like to point out is that the Tevatron program of 2011 is not the program that was envisioned when the machine design was launched in the late 1970’s. The clear targets of the machine were the W and Z bosons and the top quark, and these are now understood in detail because of the Tevatron. But as far as I know, no one anticipated the program of bottom-quark physics that emerged, no one thought that precision measurements of masses could be done at a hadron collider, and even just a few years ago it would have been optimistic to suggest that the Tevatron experiments would have the capability to observe the Higgs boson. On the accelerator side, the final instantaneous luminosity was a factor of 400 better than design, meaning that there was an average 35% annual improvement over twenty years.

Since this is an LHC blog — what can we learn about the LHC from this? It is that we should not underestimate the potential that we have in front of us. The LHC will likely operate for as long as the Tevatron has, and we can realistically expect similar performance improvements along the way. We should also not underestimate how our experimental reach can be increased through advances in detector technology, and the just plain cleverness that physicists will bring to the table when given the chance to solve a challenging and important problem. In 2037, there will be a new generation of particle physicists for whom the LHC is a constant of life, and I expect that we will be looking back on an LHC legacy that is just as memorable as that of the Tevatron.


Really though, have you? To date, the Tevatron has not discovered the higgs boson, or Supersymmetry, or any kind of new physics. In fact, all the Tevatron has done since 1987 is find Standard Model physics. Though, that is my point.

Fig. 1: Aerial view of Fermilab’s Tevatron Accelerator Complex. These images were taken around that big pool of water, in the center of the Tevatron Ring. (Photo: Symmetry Mag.)

The Tevatron, for the past 24 years, has done everything to prove that the idiotic, nonsensical, and just plain weird idea that all of matter is composed of quarks & leptons (plus some bosons) is actually correct. Of course CERN’s Large Electron-Positron collider is due its respect for confirming the Standard Model first through precision measurements; however, the Tevatron set the thing in stone. Over the past decades, many, many clever physicists have tried to modify the Standard Model by introducing new particles, new interactions, new particles & new interactions, but one-by-one they have been shot down. In my opinion, the Tevatron will always be known as The Standard Model Factory.

The Tevatron: Past & Present

My history with the Tevatron dates back to the summer of 2007, when I was a physics undergraduate who was hired by my then advisor to do some summer research. Since then, I have spent quite a bit of time at Fermilab and have been present for quite a few events. So like many other physicists, I am saddened by the fact that the collider will be shutting down September 30th (next Friday!). Consequently, I decided to put together a little grocery list of Tevatron discoveries. In full disclosure: below is really just a summary of all of Fermilab’s press releases since 1995, which in its own right borders on being an encyclopedia of particle physics.

  • February 1995 – Discovery of the top quark. Not exactly sure where to begin with this one. I mean, the top quark’s existence is evidence of several things: (1) the quark structure of matter; (2) the universality of the Weak force, meaning all quarks & leptons have partners under Weak nuclear charge, e.g., up & down, charm & strange, top & bottom; (3) and also provides a tidy way of explaining some of the differences between matter & antimatter in something called the CKM matrix. The then head of the Dept. of Energy had this to say about the top quark: “This discovery serves as a powerful validation of federal support for science.” Below is the top quark, as imagined by the Particle Zoo.
  • March 1999 – Direct measurement that matter and antimatter behave differently (CP violation). The Kaons at the Tevatron (KTeV) experiment diverted protons from the Tevatron accelerator to produce a well-known particle called a Kaon, in order to measure its lifetime. The significantly larger-than-expected measurement of CP violation implied (1) CP violation was not negligibly small and (2) all particle theories have to accommodate this fact. An attractive and popular theory at the time, called the Superweak Theory, nicely explained a number of different phenomena but implied zero CP violation. You can guess why no one talks about that theory today.
  • March 2001 – Tevatron Run II begins. From this day on, the Tevatron began colliding protons & antiprotons at an impressive 1.96 TeV. It took the remainder of the decade for that record to be topped.
  • November 2001 – The Neutrinos at the Tevatron (NuTeV) Experiment discovers a worrisome discrepancy between theoretical predictions and experimental measurements of the quantity sin2θW, which can be thought of as the ratio between the mass of the W boson and the mass of the Z boson. The NuTeV Experiment, like KTeV, diverted Tevatron protons to produce a different particle. In this case, neutrinos were produced and then sequentially fired into 700 tons of steel. This anomaly had less than a 0.25% chance of being a random, statistical fluctuation (~3σ), and is now believed to be related to the superstructure of protons & neutrons in a nucleus.
  • June 2004 – Tevatron results set the first “modern” constraints on the higgs boson. Thanks to the top quark, the DZero Experiment was able to set a best estimate of the higgs boson’s mass (117 GeV/c2) and a definite upper bound (251 GeV/c2). Of course these numbers exclude new physics, but so began today’s hunt for the higgs boson.
  • April 2005 – Tevatron analyses go global. In order to cope with the huge amount of data being generated, the Tevatron detector experiments decided to connect their networks to The Grid, a global network of computers with the sole purpose of acting like one, giant computer, not unlike Deep Thought or planet Earth. This computing model is the heart and soul of the way CERN processes the LHC’s 15 petabytes a year.
  • September 2006 – Oscillations in the recently famed Bs meson are discovered! A Bs (pronounced: B-sub-s) is a bound state that occurs when a b-quark and an s-quark begin to orbit around each other, like an electron and a proton in a hydrogen atom. The “oscillations” refer to how often the two quarks exchange a W boson. This high precision measurement is considered a benchmark test of the Standard Model due to its sensitivity to new physics. The Bs mixing Feynman diagram is below (pulled from the QD image library).
  • October 2006 – The “Periodic Table of Particles” is fleshed out. Just like how the theory of electrons, neutrons, & protons implies the existence of the periodic table of elements, the theory of quarks implies the existence of a gigantic number of combinations. This is the point of no return: The Standard Model works. It may be incomplete, it may be missing attachments, but from here on out no one can say that it is wrong.
  • December 2006 – The production of individual top quarks is identified. Okay, this needs a bit of explanation. Top quarks are heavy, like really heavy. We are talking over 40 times heavier than the second heaviest quark and well over 300,000 times heavier than the electron; it weighs as much as 180 hydrogen atoms. According to the Standard Model, it is actually easier to produce a top quark and anti-top quark at the same time than individually. This is because individual top quark production involves the Weak nuclear force, which shrinks the chances of producing one. Like Bs oscillations, single top quark production is a Standard Model benchmark because it is very sensitive to new physics. Interestingly enough, single top quark production also provides a means for testing Supersymmetry, Technicolor, and different higgs boson models.
  • July 2008 – Diboson production is at long last discovered. The Standard Model predicts that it is possible to produce two Z bosons, simultaneously, from collisions. It is a very rare thing to see and most every addition to the Standard Model affects the rate two Z bosons are produced. There are plenty of ways to modify the oscillation rate of Bs or the rate of single top quark production and still maintain consistency with the Standard Model; modifying diboson production rates is a whole different behemoth… good luck with that. I was actually at the talk when this was announced; I remember that week very well because it was rumored that the higgs boson had been found. 🙂
  • August 2008 – “Tevatron Experiments Double-Team Higgs Boson.” The CDF & DZero Experiments combine their powers to call Captain Pla… I mean, for the first time, combine their independent higgs boson searches and begin directly excluding possible mass values for the boson. This juggernaut of an analysis (plot below) was quickly recognized for its level of sophistication and set expectations for the LHC experiments.
  • May 2010 – The infamous dimuon asymmetry is discovered. Remember how in “September 2006” I mentioned that B mesons, like Bs, are sensitive to new physics? Well, B mesons can decay into two muons or two anti-muons, plus some other things. When the number of muon pairs & anti-muon pairs were measured, it was discovered that more muon pairs were produced than anti-muon pairs. The LHC experiments still need more data to become sensitive enough to confirm this high precision measurement, but this might actually be the first detection of physics beyond the Standard Model at a collider. If a reader knows of an earlier collider experiment signal that hints at Beyond the Standard Model physics, I am happy to pass the title on to it.
  • August 2011 – The Tevatron updates its higgs boson mass exclusion with over 8 fb-1 of data. (Below)

The Standard Model Factory

You know, when I started writing this post I had an idea of how impressive the Tevatron is/was. Having systematically gone through each of Fermilab’s press releases in search of major milestones, and trust me I omitted a fair number, I do not really know what else to say. I am a bit star-struck. Yes, the Tevatron has been running since 1987 and I happily acknowledge that it just simply cannot compete with the LHC beyond 2012 projections. Just recently, the LHC reached the 3 fb-1 threshold, which translates to generating 1/3rd of the entire Tevatron data set in about 9 months; but really, the LHC has some pretty big shoes to fill.

Congratulations to the Tevatron Experiments, past & present, for undeniably establishing the Standard Model of Particle Physics.

More importantly, congratulations to the Tevatron Accelerator Division, for having repeatedly done the impossible because you could.

 

Happy Colliding.
– richard (@bravelittlemuon)

http://en.wikipedia.org/wiki/Beyond_the_Standard_Model

CDF (red) and DZero (yellow) recorded the Colorado earthquake. Image courtesy of Todd Johnson, AD

On Tuesday, Aug. 23, the Tevatron accelerator knew something none of the people operating it knew. It felt what employees didn’t, and it reported the news faster than the media could upload it to the Internet.

A 5.9-magnitude earthquake had struck the East Coast, and the super-sensitive Tevatron felt it as it happened about 600 miles away. It had also registered a similar quake in Colorado the night before.

The quakes were recorded by sensors on large underground focusing magnets that compress particle beams from the four-mile Tevatron ring into precision collisions at the CDF and DZero detectors. The sensors keep these areas, the ones most sensitive to misalignment, under constant surveillance. Quakes can jiggle small numbers of particles – less than one percent of the beam – out of alignment and force the shutdown of parts of the three-story detectors to avoid damage. Tevatron operators compare the sensor recordings with updates from the U.S. Geological Survey to rule out natural causes before having to spend time diagnosing machine trouble that caused beam movement.

Typically, two quakes occurring in this short a timeframe would cause headaches for those who run the Tevatron, but fortunately the machine didn’t have beam in the tunnels at the time.

CDF (red) and DZero (yellow) recorded the East Coast earthquake. Image courtesy of Todd Johnson, AD

The Tevatron has recorded more than 20 earthquakes from all over the globe, as well as the deadly tsunamis in Sumatra in 2005 and in Japan in March.

—Tona Kunz


What If It’s Not The Higgs?

Sunday, August 21st, 2011

Updated: Monday, 2011 August 29, to clarify shape of angular distribution plots.

It’s the $10 billion question: If experimentalists do discover a bump at the Large Hadron Collider, does it have to be the infamous higgs boson? Not. One. Bit. Plainly and simply, if the ATLAS & CMS collaborations find something at the end of this year, it will take a little more data to know we are definitely dealing with a higgs boson. Okay, I suppose I should back up a little and add some context. 🙂

The Standard Model of Particle Physics (or SM for short) is the name for the very well established theory that explains how almost everything in the Universe works, from a physics perspective at least. The fundamental particles that make up the SM, and hence our Universe, are shown in figure 1 and you can learn all about them by clicking on the hyperlink a sentence back. Additionally, this short Guardian article does a great job explaining fermions & bosons.

Fig 1. The Standard Model is composed of elementary particles, which are the fundamental building blocks of the Universe, and rules dictating how the particles interact. The fundamental building blocks are known as fermions and the particles which mediate interactions between fermions are called bosons. (Image: AAAS)

As great as the Standard Model is, it is not perfect. In fact, the best way to describe the theory is to say that it is incomplete. Three phenomena that are not fully explained, among many, are: (1) how do fermions (blue & green boxes in figure 1) obtain their mass; (2) why is there so little antimatter (or so much matter) in the Universe; and (3) how does gravity work at the nanoscopic scale? These are pretty big questions and over the years theorists have come up with some pretty good ideas.

The leading explanation for how fermions (blue & green boxes in figure 1) have mass is called the Higgs Mechanism, and it predicts that there should be a new particle called the higgs boson (red box at bottom of figure 1). Physicists believe that the Higgs Mechanism may explain the fermion masses because this same mechanism very accurately predicts the masses of the other bosons (red boxes in figure 1). It is worth noting that when using the Higgs Mechanism to explain the masses of the bosons, no new particle is predicted.

Unfortunately, the leading explanations for the huge disparity between matter & antimatter, as well as a theory of gravity at the quantum level, have not been as successful. Interestingly, all three types of theories (the Higgs Mechanism, matter/antimatter, and quantum gravity) generally predict the existence of a new boson, namely, the higgs boson, the Z’ boson (pronounced: zee prime), and the graviton. A key property that distinguishes each type of boson from the others is the intrinsic angular momentum they each carry. The higgs boson does not carry any, so we call it a “spin 0” boson; the Z’ boson carries a specific amount, so it is called a “spin 1” boson; and the graviton carries precisely twice as much angular momentum as the Z’ boson, so the graviton is called a “spin 2” boson. This will be really important in a few paragraphs but quickly let’s jump back to the higgs story.

Fig 2. Feynman Diagrams representing a higgs boson (left), Z’ boson (center), and graviton (right) decaying into a b quark (b) & anti-b quark (b̄).

In July, at the European Physics Society conference, the CDF & DZero Experiments, associated with the Tevatron Collider in Illinois, USA, and the CMS & ATLAS Experiments, associated with the Large Hadron Collider, in Geneva, Switzerland, reported their latest results in the search for the higgs boson. The surprising news was that it might have been found but we will not know for sure until the end of 2011/beginning of 2012.

This brings us all the way back to our $10/€7 billion question: If the experiments have found something, how do we know that it is the higgs boson and not a Z’ boson or a graviton? Now I want to be clear: It is insanely unlikely that the new discovery is a Z’ or a graviton, if there is a new discovery at all. If something has been discovered, chances are it is the higgs boson, but how do we know?

Now, here is where awesome things happen.

The Solution.

In all three cases, the predicted boson can decay into a b quark (b) & anti-b quark (b̄) pair, which you can see in the Feynman diagrams in figure 2. Thanks to the Law of Conservation of Momentum, we can calculate the angle between each quark and the boson. Thanks to the well-constructed detectors at the Large Hadron Collider and the Tevatron, we can measure the angle between each quark and the boson. The point is that the angular distribution (the number of quarks observed per angle) is different for spin 0 (higgs), spin 1 (Z’), and spin 2 (graviton) bosons!

To show this, I decided to use a computer program to simulate how we expect angular distributions for a higgs → bb̄, a Z’ → bb̄, and a graviton → bb̄ to look. Below are three pairs of plots: the ones to the left show the percentage of b (or b̄) quarks we expect at a particular angle, with respect to the decaying boson; the ones on the right show the percentage of quarks we expect at the cosine (yes, the trigonometric cosine) of the particular angle.

 

Figure 3. The angular distribution (left) and cosine of the angular distribution (right) for the higgs (spin-0) boson, mH = 140 GeV/c2. 50K* events generated using PYTHIA MSUB(3).

Figure 4. The angular distribution (left) and cosine of the angular distribution (right) for a Z’ (spin-1) boson, mZ’ = 140 GeV/c2. 50K* events generated using PYTHIA MSUB(141).

Figure 5. The angular distribution (left) and cosine of the angular distribution (right) for a graviton (spin-2) boson, mG = 140 GeV/c2. 40K* events generated using PYTHIA MSUB(391), i.e., RS Graviton.

Thanks to the Law of Conservation of Angular Momentum, the intrinsic angular momenta held by the spin 0 (higgs), spin 1 (Z’), and spin 2 (graviton) bosons force the quarks to decay preferentially at some angles and almost forbid other angles. Consequently, the angular distribution for the higgs boson (spin 0) will show one giant hump around 90°; for the Z’ boson (spin 1), two humps at 60° and 120°; and for the graviton (spin 2), three humps at 30°, 90°, and 150°. Similarly in the cosine distribution: the spin-0 higgs boson has no defining peak; the spin-1 Z’ boson has two peaks; and the spin-2 graviton has three peaks!

In other words, if it smells like a higgs, looks like a higgs, spins like a higgs, then my money is on the higgs.
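For readers who want to play with this at home, here is a small toy Monte Carlo in the spirit of the plots above. The three shapes are invented stand-ins chosen to mimic the qualitative features just described (flat for spin 0, peaks toward cos θ = ±1 for spin 1, extra structure for spin 2); they are not the PYTHIA matrix elements (MSUB 3, 141, 391) behind the real figures:

import numpy as np

rng = np.random.default_rng(42)

def sample(shape_fn, n=50_000, fmax=2.5):
    # Accept-reject sampling of cos(theta) on [-1, 1]; fmax must bound
    # every shape below (their maximum value here is 2, so 2.5 is safe).
    out = []
    while len(out) < n:
        c = rng.uniform(-1.0, 1.0, size=n)
        u = rng.uniform(0.0, fmax, size=n)
        out.extend(c[u < shape_fn(c)])
    return np.array(out[:n])

shapes = {
    "spin 0 (higgs-like)":    lambda c: np.ones_like(c),        # flat
    "spin 1 (Z'-like)":       lambda c: 1.0 + c**2,             # peaks at +-1
    "spin 2 (graviton-like)": lambda c: 1.0 - 3*c**2 + 4*c**4,  # assumed toy form
}

for name, fn in shapes.items():
    cos_t = sample(fn)
    hist, edges = np.histogram(cos_t, bins=10, range=(-1, 1))
    print(name)
    for lo, hi, n_bin in zip(edges[:-1], edges[1:], hist):
        print(f"  {lo:+.1f} to {hi:+.1f}: " + "#" * int(n_bin // 400))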

A Few Words About The Plots

I have been asked by a reader if I could comment a bit on the shape and apparent symmetry in the angular distribution plots, both of which are extremely well understood. When writing the post, I admittedly glossed over these really important features because I was pressed to finish the post before traveling down to Chicago for a short summer school/conference, so I am really excited that I was asked about this.

At the Large Hadron Collider, we collide protons head-on. Since the protons are nicely aligned (thanks to the amazing people who actually operate the collider), we can consistently and uniformly label the direction through which the protons travel. In our case, let’s have a proton that comes from the left be proton A and a proton that comes from the right be proton B. With this convention, proton A is traveling along what I call the “z-axis”; if proton A were to shoot vertically up toward the top of this page it would be traveling along the “x-axis”; and if it were to travel out of the computer screen toward you, the reader, the proton would be traveling in the “y direction” (or along the “y-axis”). The angle measured from the z-axis toward the x-axis (or from the z-axis toward the y-axis) is called θ (pronounced: theta). You can take a look at figure 6 for a nice little cartoon of the coordinate system I just described to you.

Figure 6: A coordinate system in which proton A (pA) is traveling along the z-axis and proton B (pB) in the negative z direction. The angle θ is measured from the z-axis toward the x-axis, or equally, from the z-axis toward the y-axis.

When the quarks (spin 1/2) inside a proton collide to become a higgs (spin 0), Z’ (spin 1), or graviton (spin 2), angular momentum must always be conserved. The simplest way for a quark in proton A and a quark in proton B to make a higgs boson is for the quarks to spin in opposite directions, while still traveling along the z-axis, so that their spins cancel out, i.e., spin 1/2 – spin 1/2 = spin 0. This means that the higgs boson (spin 0) does not have any angular momentum constraints when decaying into two b-quarks, and thus the cosine of the angle between the two b-quarks should be roughly flat and uniform. This is a little hard to see in figure 3 (right) because, as my colleague pointed out, the resolution in my plots is too small. (Thanks, Zhen!)

Turning to the Z’ boson (spin 1) case, protons A & B can generate a spin 1 particle most easily when their quarks, again while traveling along the z-axis, are spinning in the same direction, i.e., spin 1/2 + spin 1/2 = spin 1. Consequently, the spin 1 Z’ boson and its decay products, unlike the higgs boson (spin 0), are required to conserve 1 unit of angular momentum. This happens most prominently when the two b-quarks (1) push against each other in opposite directions or (2) travel in the same direction. Therefore, the cosine of the angle made by the b-quarks is dominantly -1 or +1. If we allow for quantum mechanical fluctuations, caused by Heisenberg’s Uncertainty Principle, then we should also expect b-quarks to sometimes decay with a cosine greater than -1 and less than +1. See figure 4 (right).

The spin 2 graviton can similarly be explained but with a key difference. The spin 2 graviton is special because like the Z’ boson (spin 1) it can have 1 unit of angular momentum, but unlike Z’ boson (spin 1) it can also have 2 units of angular momenta. To produce a graviton with 2 units of angular momenta, rarer processes that involve the W & Z bosons (red W & Z in figure 1) must occur. This allows the final-state b-quarks to decay with a cosine of 0, which explains the slight enhancement in figure 5 (right).

It is worth noting that the reason why I have been discussing the cosine of the angle between the quarks and not the angle itself is because the cosine is what we physicists calculate and measure. The cosine of an angle, or equally the sine of an angle, amplifies subtle differences between particle interactions and can at times be easier to calculate & measure.

The final thing I want to say about the angular distributions is probably the coolest thing ever, better than figuring out the spin of a particle. Back in the 1920s, when Quantum Mechanics was first proposed, people were unsure about a keystone of the theory, namely the simultaneous particle and wave nature of matter. We know bosons definitely behave like particles because they can collide and decay. That wavy/oscillatory behavior you see in the plots is exactly that: wavy/oscillatory behavior. No classical object will decay into particles with such a continuous, oscillating distribution; no classical object has ever been found to do so, nor do we expect to find one, at least according to our laws of classical physics. This wave/particle/warticle behavior is a purely quantum physics effect and would be an indicator that Quantum Mechanics is correct at the energy scale being probed by the Large Hadron Collider. 🙂

 

Happy Colliding.

– richard (@bravelittlemuon)

PS I apologize if some things are a little unclear or confusing. I am traveling this weekend and have not had time to fully edit this post. If you have a question or want me to clarify something, please feel free to write a comment.

PPS If you are going to be at the PreSUSY Summer School in Chicago next week, feel free to say hi!

*A note on the plots: I simulated several tens of thousands of events for clarity. According to my calculations, it would take four centuries to generate 40,000 gravitons, assuming the parameters I chose. In reality, physicists can make the same determination as we did with fewer than four years’ worth of data.


Paper vs. Protons (Pt. 2)

Tuesday, August 9th, 2011

Yup, it’s still summer conference season here in the Wonderful World of Physics. My fellow QD bloggers rocked at covering the European Physics Society meeting back in July, so check it out. Aside from the summer conferences, it is also summer school season for plenty of people (like me!). To clarify, I am not talking about repeating a class during the summer. Actually, it is quite the opposite: these are classes that are at most offered once a year and are taught in different countries, depending on the year.

To give you context, graduate students normally run out of courses to take in the second or third year of our PhD program; and although the purpose of a PhD is to learn how to conduct research, there will always be an information gap between our courses and our research. There is nothing wrong with that, but sometimes that learning curve is pretty big. In order to alleviate this unavoidable issue, university professors often will teach a one-time-only “topics” course on their research to an audience of three or four students during the regular academic year. Obviously, this is not always sustainable for departments, large or small, because of the fixed costs required to teach a course. The solution? Split the cost by inviting a hundred or so students from around the world to a university and cram an entire term’s worth of information into a 1- to 4-week lecture series, which, by the way, is taught by expert faculty from everywhere else in the world. 🙂

To be honest, it is like learning all about black holes & dark matter from the people who coined the names “black holes” & “dark matter.” So not only do graduate students get to learn about the latest & greatest from the people who discovered the latest & greatest, but we also get to hear all the anecdotal triumphs and setbacks that lead to the discoveries.

Fig. 1: Wisconsin’s state capitol in Madison, Wi., taken from one of the bike paths
that wrap around the city’s many lakes. (Photo: Mine)

This brings us to the point of my post. Back in July, I had the great opportunity to attend the 2011 CTEQ Summer School in Madison, Wi., where for 10 days we talked about this equation:

\sigma(pp \to e^+e^-) = \sum_q \int_0^1 dx_1 \int_0^1 dx_2
\left[ f_q(x_1) f_{\bar{q}}(x_2) + f_{\bar{q}}(x_1) f_q(x_2) \right] \; \hat{\sigma}(q\bar{q} \to \gamma \to e^+e^-)

Now, this is not just any ordinary equation; it is arguably the most important equation for any physicist working at the Large Hadron Collider, the Tevatron, or any of the other half-dozen atom smashers on this planet. In fact, this equation is precisely what inspired the name Paper vs. Protons.

Since quantum physics is inherently statistical, most calculations result in computing probabilities of things happening. The formula above allows you to compute the probability of what happens when you collide protons, something experimentalists can measure, by simply calculating the probability of something happening when you collide quarks, something undergraduates can do! Physicists love quarks very much because they are elementary particles and are not made of anything smaller, at least that is what we think. Protons are these messy balls of quarks, gluons, photons, virtual particles, elephant-anti-elephant pairs, and are just horrible. Those researchers studying the proton’s structure with something called “lattice QCD” have the eternal gratitude of physicists like me, who only deal with quarks and their kookiness.

Despite being so important, the equation only has three parts, which are pretty straightforward. The first part is the tail end of the second line:

\hat{\sigma}(q\bar{q} \to \gamma \to e^+e^-)

which is just the probability of this happening:

Fig. 2: Feynman diagram representing the qq-bar → γ → e+e- process.

If you read Paper vs. Protons (Pt. 1) you might recognize it. This Feynman diagram represents a quark (q) & an antiquark (q with a bar over it) combining to become a photon (that squiggly line in the center), which then decays into an electron (e-) & its antimatter partner, the positron (e+). Believe it or not, the probability of this “qq-bar → γ → e+e-” process happening (or cross section as we call it) is something that advanced college students and lower-level graduate students learn to calculate in a standard particle physics course. Trust me when I say that every particle physicist has calculated it, or at the very least a slight variation that involves muons. By coincidence, I actually calculated it (for the nth time) yesterday.
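For the curious, the textbook leading-order answer (quoted here rather than derived) for a quark of electric charge Q_q, in units of the positron charge, colliding at squared energy \hat{s}, is:

\hat{\sigma}(q\bar{q} \to \gamma \to e^+e^-) = \frac{4\pi \alpha^2 Q_q^2}{9 \hat{s}}

where \alpha is the fine-structure constant. The extra factor of 1/3 relative to the famous e+e- → μ+μ- cross section comes from averaging over the three possible color states of the incoming quarks.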

Okay, time for the second part of the equation. To help explain it, I am using a great image (below) from the LHC experiment ALICE. So you & I know that all matter is made from atoms (left). Atoms, in turn, consist of a nucleus of protons & neutrons (center) that are being orbited by electrons (white dots, left). A proton (right) is made up of three quarks (three fuzzy, white dots, right) that bathe in a sea of gluons (red-blue-green fuzziness, right). About 45% of a proton’s energy at the LHC is shared by the three quarks; the remaining 55% of the proton’s energy is shared by the gluons.

Fig. 3: An atom (left), an atom’s nucleus (center), and a free proton (right). (Image: ALICE Expt)

How do we know those numbers? Easy, with something called a “parton distribution function”, or p.d.f. for short! A p.d.f. gives us back the probability of finding, for example, a quark in a proton with 15% of the proton’s energy. Since we want to know the probability of finding a quark (q) in the first proton (with momentum fraction x1) and the probability of finding an anti-quark (q with a bar over its head) in the second proton (with momentum fraction x2), we need to use our p.d.f. (which we will call “f”) twice. Additionally, since the quark and anti-quark can come from either of the two protons, we need to use “f” a total of four times. Part 2 of our wonderful equation encapsulates the entire likelihood of finding the quarks we want to smash together:

f_q(x_1) f_{\bar{q}}(x_2) + f_{\bar{q}}(x_1) f_q(x_2)
Now the third (and final!) part is simple to understand because all it tells us to do is add: add together all the different ways a quark can share a proton’s energy. For example, a quark could have 5% or 55% of a proton’s energy, and even though either case might be unlikely to happen we still add together the probability of each situation happening. This is the third part of our wonderful equation:

\sum_q \int_0^1 dx_1 \int_0^1 dx_2
Putting everything together, we find that the probability of producing an electron (e-) and a positron (e+) when smashing together two protons is actually just the sum (part 3) of all the different ways (part 2) two quarks can produce an e+e- pair (part 1). Hopefully that made sense.
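To make the “plug in values, get a number” step concrete, here is a hedged numerical sketch of the whole formula. The p.d.f. below is a made-up toy shape (real analyses use fitted sets like CTEQ’s), the invariant-mass cut is an assumption I added to mimic a realistic selection, and the collider energy is simply set to an LHC-like 7 TeV, so the number it prints is only illustrative:

import numpy as np

ALPHA = 1.0 / 137.0          # fine-structure constant
S = 7000.0 ** 2              # collider energy squared in (GeV)^2: LHC-like 7 TeV
GEV2_TO_PB = 3.894e8         # converts a cross section in GeV^-2 to picobarns

def f(x):
    # toy parton distribution function (same shape assumed for every flavour);
    # a real calculation would use a fitted set such as CTEQ
    return x ** -0.5 * (1.0 - x) ** 3

def sigma_hat(s_hat, q_charge):
    # part 1: leading-order qqbar -> gamma* -> e+e- partonic cross section
    return 4.0 * np.pi * ALPHA ** 2 * q_charge ** 2 / (9.0 * s_hat)

quark_charges = [2 / 3, -1 / 3, -1 / 3, 2 / 3]   # u, d, s, c

rng = np.random.default_rng(1)
n = 200_000
x1 = rng.uniform(1e-4, 1.0, n)       # momentum fraction in proton 1
x2 = rng.uniform(1e-4, 1.0, n)       # momentum fraction in proton 2
s_hat = x1 * x2 * S
mask = s_hat > 60.0 ** 2             # assumed cut: electron pair mass above 60 GeV

total = 0.0
for q in quark_charges:
    # part 2: quark from either proton, f_q(x1) f_qbar(x2) + f_qbar(x1) f_q(x2),
    # which collapses to 2*f*f because the toy p.d.f. is flavour-blind
    flux = 2.0 * f(x1) * f(x2)
    # part 3: Monte Carlo estimate of the double integral (domain volume ~ 1)
    total += np.mean(np.where(mask, flux * sigma_hat(s_hat, q), 0.0))

print(f"toy Drell-Yan cross section: {total * GEV2_TO_PB:.3g} pb")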

It gets better, though. When we plug our values into the formula, we get a number. This number is literally what we try to measure at the Large Hadron Collider; this is how we discover new physics! If theory “A” predicts a number and we measure a number that is way different, beyond any statistical uncertainty, it means that theory “A” is wrong. This is the infamous Battle of Paper vs Protons. To date, paper and protons agree with one another. However, at the end of this year, when the LHC shuts down for routine winter maintenance, we will have enough data to know definitively if the paper predictions for the higgs boson match what the protons say. Do you see why I think this equation is so important now? This equation is how we determine whether or not we have discovered new physics. :p

Happy Colliding.

– richard (@bravelittlemuon)

PS. If you will be at the PreSUSY Summer School at the end of August, be sure to say hi.


The combined Tevatron results exclude the existence of a Higgs particle with a mass between 100-108 and 156-177 GeV/c2. For the range 110-155 GeV/c2, the experiments are now extremely close to the sensitivity needed (dotted line below 1) either to see a substantial excess of Higgs-like events or to rule out the existence of the particle. The small excess of Higgs-like events observed by the Tevatron experiments in the range from 120 to 155 (see solid curve) is not yet statistically significant.

Scientists of the CDF and DZero collaborations at Fermilab continue to increase the sensitivity of their Tevatron experiments to the Higgs particle and narrow the range in which the particle seems to be hiding. At the European Physical Society conference in Grenoble, Fermilab physicist Eric James reported today that together the CDF and DZero experiments now can exclude the existence of a Higgs particle in the 100-108 and the 156-177 GeV/c2 mass ranges, expanding exclusion ranges that the two experiments had reported in March 2011.

Last Friday, the ATLAS and CMS experiments at the European center for particle physics, CERN, reported their first exclusion regions. The two experiments exclude a Higgs particle with a mass of about 150 to 450 GeV/c2, confirming the Tevatron exclusion range and extending it to higher masses that are beyond the reach of the Tevatron. Even larger Higgs masses are excluded on theoretical grounds.

This leaves a narrow window for the Higgs particle, and the Tevatron experiments are on track to collect enough data by the end of September 2011 to close this window if the Higgs particle does not exist.

James reported that the Tevatron experiments are steadily becoming more sensitive to Higgs processes that the LHC experiments will not be able to measure for some time. In particular, the Tevatron experiments can look for the decay of a Higgs particle into a pair of bottom and anti-bottom quarks, which is the dominant, hard-to-detect decay mode of the Higgs particle. In contrast, the ATLAS and CMS experiments currently focus on the search for the decay of a Higgs particle into a pair of W bosons, which then decay into lighter particles.

This graph shows the improvement in the combined sensitivity of the CDF and DZero experiments to a Higgs signal over the last couple of years. When the sensitivity for a particular value of the Higgs mass, mH, drops below one, scientists expect the Tevatron experiments to be able to rule out a Higgs particle with that particular mass. By early 2012, the Tevatron experiments should be able to corroborate or rule out a Higgs particle with a mass between 100 and about 190 GeV/c2.

The LHC experiments reported at the EPS conference an excess of Higgs-like events in the 120-150 GeV/c2 mass region at about the 2-sigma level. The Tevatron experiments have seen a small, 1-sigma excess of Higgs-like events in this region for a couple of years. A 3-sigma level is considered evidence for a new result, but particle physicists prefer a 5-sigma level to claim a discovery. More data and better analyses are necessary to determine whether these excesses are due to a Higgs particle, some new phenomenon, or random data fluctuations.
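For readers wondering what those sigma levels mean in practice, a sigma value translates into the probability that the background alone fluctuates up at least that far. A small illustrative Python snippet (not part of the original article) using the one-sided Gaussian tail:

```python
# Translate a sigma level into the one-sided probability that a
# background-only fluctuation reaches at least that far above expectation.
from scipy.stats import norm

for sigma in (1, 3, 5):
    print(f"{sigma} sigma: p = {norm.sf(sigma):.2e}")
# 1 sigma: p = 1.59e-01  (fluctuations this size happen all the time)
# 3 sigma: p = 1.35e-03  ("evidence")
# 5 sigma: p = 2.87e-07  ("discovery")
```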

In early July, before the announcement of the latest Tevatron and LHC results, a global analysis of particle physics data by the GFitter group indicated that, in the simplest Higgs model, the Higgs particle should have a mass between approximately 115 and 137 GeV/c2.

“To have confidence in having found the Higgs particle that theory predicts, you need to analyze the various ways it interacts with other particles,” said Giovanni Punzi, co-spokesperson of the CDF experiment. “If there really is a Higgs boson hiding in this region, you should be able to find its decay into a bottom-anti-bottom pair. Otherwise, the result could be a statistical fluctuation, or some different particle lurking in your data.”

The CDF and DZero experiments will continue to take data until the Tevatron shuts down at the end of September.

“The search for the Higgs particle in its bottom and anti-bottom quark decay mode really has been the strength of the Tevatron,” said Dmitri Denisov, DZero co-spokesperson.

“With the additional data and further improvements in our analysis tools, we expect to be sensitive to the Higgs particle for the entire mass range that has not yet been excluded. We should be able to exclude the Higgs particle or see first hints of its existence in early 2012.”

The details of the CDF and DZero analysis are described in this note, which will be posted later today, as well as submitted to the arXiv.

—Kurt Riesselmann


This article first appeared in symmetry breaking July 22.

Editor’s note: The LHC experiments reported at the EPS meeting a tantalizing excess of Higgs-like events, short of claiming a discovery, but very intriguing nevertheless. See the Higgs search at the LHC section further below for more information on these results.

Experiments at Fermi National Accelerator Laboratory and the European particle physics center, CERN, are zooming in on the final remaining mass region where the Higgs particle might be lurking. Over the next seven days, Fermilab’s CDF and DZero collaborations and CERN’s ATLAS and CMS collaborations will announce their latest Higgs search results at the High-Energy Physics conference of the European Physical Society.

Scientists at Fermilab and CERN employ very similar methods to create the Higgs: accelerate particles to high energy using the world’s most powerful accelerators, the Tevatron (1 TeV beam energy) and the Large Hadron Collider (3.5 TeV), smash the particles together, and sift through the large number of new particles emerging from these collisions. But to find a Higgs particle among the many particles created, the teams of scientists are focusing on different signals (see below).

If the Higgs particle exists and has the properties predicted by the simplest Higgs model, named after British physicist Peter Higgs, then the colliders at Fermilab and CERN already must have produced Higgs particles. But finding the tell-tale sign of a Higgs boson among all other particle signatures is like searching for a drop of ink in an ocean. Only if the accelerators produce more and more collisions do scientists stand a chance of finding enough evidence for the Higgs particle.

Where to look

The Higgs mechanism, developed in the 1960s by several independent groups of theorists, explains why some fundamental particles have mass and others don’t. Its mathematical framework fits perfectly into one of the most successful theories in science: the Standard Model of elementary particles and forces.

Experimenters sifting through data from one experiment after another have come up empty-handed; instead they have ruled out larger and larger swaths of potential Higgs territory. An analysis by the GFitter group of precision measurements and the direct and indirect constraints on the Higgs mass indicates that, in the simplest Higgs model, the Higgs particle should have a mass between approximately 115 and 137 GeV/c2, or about 100 times the mass of a proton.

Higgs search at the Tevatron

At Fermilab’s Tevatron, scientists attempt to produce Higgs particles by smashing together protons and antiprotons, composite particles that comprise elementary building blocks. When a proton and antiproton hit each other at high energy, scientists observe the collisions and interactions of these components, such as quarks, antiquarks and gluons. Those subatomic collisions transform energy into new particles that can be heavier than the protons themselves, as predicted by Einstein’s famous equation E=mc2.

At the Tevatron, which makes protons and antiprotons collide, scientists focus on finding signs for the decay of the Higgs particle into a bottom quark and anti-bottom quark.

Tevatron scientists have carried out detailed simulations of such collisions and found that the best chance for producing, say, a 120-GeV Higgs boson at the Tevatron is a quark-antiquark collision that creates a high-energy W boson (see graphic). This W boson has a chance to spend its extra energy to generate a short-lived Higgs boson. The W boson and the Higgs boson would then decay into lighter particles that can be caught and identified by the CDF and DZero particle detectors, which surround the two proton-antiproton collision points of the Tevatron.

According to the Standard Model, such a 120-GeV Higgs boson will decay 68 percent of the time into a bottom quark and anti-bottom quark. But other collision processes and particle decays also produce bottom and anti-bottom quarks. Identifying an excess of these particles due to the decay of the Higgs boson is the best chance for Tevatron scientists to discover or rule out a Standard Model Higgs.

At the EPS conference, CDF and DZero will report (see press release) that, for the first time, the two collaborations have successfully applied well-established techniques used to search for the Higgs boson to observe extremely rare collisions that produce pairs of heavy bosons (WW or WZ) that decay into heavy quarks. This well-known process closely mimics the production of a W boson and a Higgs particle, with the Higgs decaying into a bottom quark and antiquark.

Higgs search at the LHC

At the LHC, located on the French-Swiss border, scientists smash protons into protons. Because the LHC operates at higher collision energies than the Tevatron, each collision produces on average many more particles than a collision at the Tevatron. In particular, the LHC floods its particle detectors with bottom and anti-bottom quarks created by many different types of subatomic processes. Hence it becomes more difficult than at the Tevatron to find this particular “ink in the ocean”—an excess of bottom and anti-bottom quarks in the LHC data due to the Higgs particle.

At the EPS conference, the ATLAS scientists showed that they should have been able to exclude a Higgs boson with mass between 130 and 200 GeV/c2, but instead the collaboration saw an excess of events in the 130 to 155 GeV/c2 range, as reported by ATLAS physicist Jon Butterworth in his blog at the Guardian. It could be a fluctuation, but it could also be the first hint of a Higgs signal. Geoff Brumfiel writes for Nature News that the CMS experiment also sees an excess in the 130 to 150 GeV/c2 range. (CMS physicist Tommaso Dorigo has posted the relevant CMS Higgs search plots in his blog.) Combined, the two LHC experiments should have enough data by the end of this summer to say whether this excess is real or not.

The Tevatron experiments are getting close to being sensitive to a Higgs particle near 150 GeV as well. Here is the new DZero result: the dotted line, which indicates sensitivity, is approaching 1 near 150 GeV, but the solid line, which is the actual observation, is significantly below 1, yet it differs from the expectation only at the 1 to 1.5 sigma level. Bottom line: DZero scientists cannot exclude a Higgs boson in this range. And here is the new CDF result: again, for a Higgs mass of about 150 GeV/c2, the sensitivity approaches 1, and the observed Higgs constraints agree well with the expectations. (Note that DZero shows 1-sensitivity and CDF shows sensitivity; that’s why the CDF curve is above 1.)

On Wednesday, July 27, CDF and DZero will present their combined results for this mass range at the EPS conference. The sensitivity of the combined CDF and DZero results will be even closer to 1 at 150 GeV/c2.

At the Large Hadron Collider, which smashes protons into protons, scientists focus on finding signs for the decay of the Higgs particle into two photons.

For a light Higgs boson, LHC scientists focus on a very different Higgs production and decay process, complementary to the Higgs search at the Tevatron. Detailed simulations of high-energy proton-proton collisions have shown that the best chance to catch, say, a 120-GeV Standard Model Higgs particle at the LHC is to look for a Higgs boson emerging from the collision of two gluons, followed by its decay into two high-energy gamma rays (see second graphic). This is an extremely rare process, since the Higgs boson doesn’t interact directly with the massless gluons and gamma rays. Instead, the Higgs production and decay occur through intermediate, massive quark-antiquark loops, which can temporarily appear in subatomic processes, in accordance with the laws of quantum mechanics. The intermediate loop, however, makes this process much rarer. In particular, the decay of a 120-GeV Standard Model Higgs boson into two gamma rays happens only once out of 500 times. Hence LHC scientists will need to gather a sufficiently large number of proton-proton collisions to observe this process.
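To get a feel for how rare that is, here is a back-of-the-envelope count (toy arithmetic only; it ignores detector efficiency and backgrounds, which real analyses must fold in, and the target event count is a made-up illustration):

```python
# With roughly a 1-in-500 decay rate to two gamma rays, how many 120-GeV
# Higgs bosons must be produced to expect a handful of such decays?
branching_ratio = 1 / 500   # from the text: once out of 500 times
wanted_decays = 50          # hypothetical target, chosen for illustration
print(wanted_decays / branching_ratio)  # 25000.0 Higgs bosons produced
```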

Why do physicists think that the Higgs particle exists?

The discovery in the 1980s of heavy, force-carrying particles, known as W and Z bosons, confirmed crucial predictions made by the Standard Model and the simplest Higgs model. Since then, further discoveries and precision measurements of particle interactions have confirmed the validity of the Standard Model many times. It now seems almost impossible to explain the wealth of particle data without the Higgs mechanism. But one crucial ingredient of this fabulous particle recipe—the Higgs boson itself—has remained at large. Does it exist? How heavy is it? Does it interact with quarks and other massive particles as expected? These questions will keep scientists busy for years to come.

Want to learn more about what the Higgs particle is and how it gives mass to some particles? Watch this 3-minute video.

Kurt Riesselmann
