Posts Tagged ‘Energy Frontier’

Each time news comes out about the Higgs boson I get questions from media, friends and family trying to grasp why this particle is so important. The following questions come up again and again. So with experimenters using Fermilab's Tevatron announcing new Higgs results Wednesday at a conference in Italy, I thought it was time to share answers to the questions that might pop into your mind.

Why should the average person care if the Higgs is found?

Understanding more about the building blocks of matter and the forces that control their interactions helps scientists to learn how to manipulate those forces to humankind’s benefit. For example, the study of the electron led to the development of electricity, the study of quantum mechanics made possible the creation of GPS systems and the study of the weak force led to an understanding of radioactive decay and nuclear power.

Now what?

The Tevatron experiments will continue to further analyze the Higgs boson data to wring out more information. In addition, the Tevatron and LHC experiments are working to combine their data for a release at an unspecified date.

Even if both teams find evidence of a Higgs boson in the same location, physicists will need to do more analysis to make sure the Higgs boson isn’t a non-Standard Model Higgs masquerading as a resident of the Standard Model. That will require physicists to measure several properties in addition to mass.

What would finding the Higgs boson mean for the field of physics?

Finding evidence of the Higgs boson would expand the following three areas of study:

• Pinpointing the mass range of the Higgs would help physicists narrow down the number of theories about the existence of undiscovered particles and the forces that act on them. For example, a Standard Model Higgs boson would rule out classic QCD-like versions of technicolor theory. A Higgs boson with a mass larger than 125 GeV would rule out the simplest versions of supersymmetry, or SUSY, which predict that every known particle has an unknown sibling particle with a different mass. Other theories would gain more support. One such SUSY theory predicts that a Standard Model Higgs boson would appear as the lightest of a group of five or more Higgs bosons. Whether the Higgs boson exists or not does not affect theories about the existence of extra dimensions.

• Knowing the mass of the Higgs boson would give physicists more data to plug into other equations about how our universe formed and about some of the least understood particle interactions, such as the muon magnetic anomaly.

• Finding evidence of a heavy mass Higgs boson (larger than 150 GeV) would require the existence of undiscovered particles and/or forces. Finding a light mass Higgs boson (less than 125 GeV) would not require the existence of new physics but doesn’t rule it out either.

What is the difference between the Higgs boson and the Higgs field?

The Higgs field exists like a giant vat of molasses spread throughout the universe. Particles that travel through it end up with globs of molasses sticking to them, slowing them down and making them heavier. You can think of the Higgs boson as one of those molasses globs: a particle manifestation of the energy field, akin to a ball of energy.

Physicists have different theories about how many Higgs bosons exist, akin to predicting whether the molasses would stick in one giant glob or several globlets.

How long have physicists been looking for the Higgs?

More than two decades. The search started with the LEP experiment at CERN in the 1990s, continued at the Tevatron and continues now at the LHC.

How do physicists create a Higgs boson?

A high-energy particle accelerator such as the Tevatron or LHC can recreate the energy levels that permeated the universe shortly after the Big Bang. Colliding particles at this energy level can set free the right amount of energy to produce particles, including a Higgs boson. The collision energy is localized in a small space and transforms from energy into the mass of the Higgs boson.
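
The conversion described above is just Einstein's E = mc². As a back-of-the-envelope sketch (the constants are standard values; the function name is ours and this is purely an illustration, not an analysis tool):

```python
# Sketch: how much mass a given collision energy can "buy" via E = m * c^2.
GEV_IN_JOULES = 1.602176634e-10  # 1 GeV expressed in joules
C = 2.99792458e8                 # speed of light in m/s

def mass_from_energy(energy_gev):
    """Mass (in kg) equivalent to an energy given in GeV."""
    return energy_gev * GEV_IN_JOULES / C**2

# A Higgs boson near 125 GeV corresponds to a vanishingly small everyday mass:
higgs_mass_kg = mass_from_energy(125)  # roughly 2.2e-25 kg
```

The point of the sketch is how little mass a "high" collision energy amounts to in everyday units, which is why the energy must be localized in such a tiny space.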

How is the Higgs boson related to the Big Bang theory?

The Big Bang occurred 13.7 billion years ago sending massless particles and radiation energy zooming through the universe like cars at rush hour. Shortly afterward, the Higgs field appeared, as if a truck carrying molasses overturned and leaked all over the highway. Particles such as light, which went through the puddle super fast, avoided having any molasses stick to them, similar to the way hydroplaning cars skim the surface of water. Particles that went through the molasses puddle more slowly had molasses globs cling to them, creating a drag that slowed them even more and made them more massive. How fast a particle made it through the puddle determined how much molasses clung to it, and thus how massive it became. When the universe began to cool, slow particles with mass began to bunch up like mini-traffic jams and form composite particles and then atoms.

How do we know this is where the Higgs is located?

Just as firemen sweep building floors to rule out the existence of trapped homeowners, physicists have used direct and indirect observations from experiments to rule out the existence of the Higgs boson in most energy ranges where the Standard Model predicts it could reside.

How does the mass of the Higgs compare to its weight?

Sort of. Non-physicists tend to think of mass as how much something weighs, but scientists treat the two as distinct. Weight changes with gravity, so you would weigh less on the moon than on Earth; mass remains constant throughout the universe. When talking about things on Earth, however, mass and weight are fairly interchangeable.
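
The distinction can be made concrete with the grade-school formula weight = mass × g. The surface gravities below are standard values, and the script is purely illustrative:

```python
# Mass is intrinsic; weight = mass * local gravitational acceleration.
G_EARTH = 9.81  # m/s^2 at Earth's surface
G_MOON = 1.62   # m/s^2 at the Moon's surface

def weight_newtons(mass_kg, g):
    """Weight (in newtons) of a given mass under acceleration g."""
    return mass_kg * g

astronaut_mass = 80.0                               # kg, the same everywhere
on_earth = weight_newtons(astronaut_mass, G_EARTH)  # ~785 N
on_moon = weight_newtons(astronaut_mass, G_MOON)    # ~130 N
```

The same 80 kg of mass weighs about six times less on the moon, which is exactly the distinction the answer above draws.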

How did the Higgs boson get the nickname “the God particle”?

Nobel laureate Leon Lederman, a Fermilab physicist, wrote a book in the early 1990s about particle physics and the search for the Higgs boson. His publisher coined the name as a marketable title for the book. Scientists dislike the nickname.

What countries are involved in the CDF and DZero experiments?

• CDF: US, Canada, France, Germany, Greece, Italy, Japan, Korea, UK, Russia, Slovakia, Spain and Taiwan

• DZero: Brazil, China, Colombia, Czech Republic, Ecuador, France, Germany, India, Ireland, Korea, Mexico, Netherlands, UK, Ukraine, US, Russia, Spain and Sweden.

What is the competitive relationship between the Tevatron and LHC experiments?

It is closer to sibling rivalry than the traditional business competition you would find in something such as the auto industry.

Fermilab supports about 1,000 US CMS scientists and engineers by providing computing facilities, office and meeting space as well as the LHC Remote Operation Center. Fermilab helped design and build the CMS detector as well as equipment for the LHC accelerator, and Fermilab scientists are working on upgrades for both and analyzing data. About one third of the members of each of the Tevatron’s experiments, CDF and DZero, are also members of the LHC experiments.

— Tona Kunz


If you saw the news last week, you saw that Fermilab announced a new precision measurement of the W boson mass. While this particle is a key building block for the structure of our world, it doesn't even get a mention in American high school textbooks. Because of that, the value of such results can be easily lost on the public. Yet the step-by-step, layered approach of science means that many potentially unheralded results pave the path to discovery. With that in mind, I hope to explain here why you should take notice of the W boson mass measurement.

The new CDF and DZero combined result for the W boson mass (vertical section of green oval), combined with the world's best value for the top quark mass (horizontal section of green oval), restricts the Higgs mass, requiring it to be less than 152 GeV/c2 with 95 percent probability. Direct searches have narrowed the allowed Higgs mass range to 115-127 GeV/c2. The grey bar shows the remaining region where the Higgs could reside.

If you’re seeking clues to the hiding place of the theorized Higgs boson and whether scientists understand how the basic building blocks of matter come together to create our world, you need to measure the mass of the W boson particle. This building block of nature gave rise to the Higgs theory and has the power to prove if the Standard Model is correct, or if physics textbooks need a rewrite. The Standard Model serves as the blueprint for our world, detailing the properties of the building blocks of matter and how they interact.

Last week the world's most precise measurement of the W boson mass was unveiled by scientists at the CDF and DZero collaborations using data gathered with the Tevatron accelerator at the U.S. Department of Energy's Fermi National Accelerator Laboratory. The precision of this joint measurement surpasses all previous measurements combined.

This measurement comes at a pivotal time, just before physicists from experiments at the Tevatron and the Large Hadron Collider at CERN will present their new results this week in the hunt for the Higgs at the annual conference on Electroweak Interactions and Unified Theories known as Rencontres de Moriond in Italy.

The Tevatron experiments found the mass for the W boson to be exactly as predicted. This indicates that if a Higgs boson exists in the Standard Model framework, it should be right where the Tevatron and LHC experiments are looking.

The Higgs boson is the last undiscovered component of the Standard Model and is theorized to give fundamental particles mass. Without mass, atoms would not exist.

Just as firemen sweep building floors to rule out the existence of trapped homeowners, physicists have used direct and indirect observations from experiments to rule out the existence of the Higgs boson in most mass ranges where the Standard Model predicts it could reside. This new W mass measurement, combined with the lower limit established at the LEP experiment at CERN many years ago and the latest measurement of the mass of the top quark, determines that a Standard Model Higgs boson cannot have a mass larger than 152 GeV, or giga-electronvolts. This is consistent with recent direct searches at the LHC that constrain the possible home of the Higgs to below 127 GeV. This measurement illustrates the power of the Tevatron and how its expertise in precision physics can point the way to discoveries and cross-check any future discoveries made at the LHC.

The CDF collaboration measured the W boson mass to be 80387 ± 19 MeV/c2. The DZero collaboration measured the particle's mass to be 80375 ± 23 MeV/c2. The two measurements, combined with previous data from the earliest operation of the Tevatron, produce a value of 80387 ± 17 MeV/c2, a precision of 0.02 percent.
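
For two independent measurements, the standard way to combine results is an inverse-variance weighted average. The sketch below treats the CDF and DZero numbers as uncorrelated, which is why it does not exactly reproduce the published 80387 ± 17 MeV/c2; the official combination also folds in earlier Tevatron data and correlated systematic uncertainties:

```python
import math

def weighted_average(values, errors):
    """Inverse-variance weighted average of independent measurements."""
    weights = [1.0 / e**2 for e in errors]
    mean = sum(w * v for w, v in zip(weights, values)) / sum(weights)
    sigma = math.sqrt(1.0 / sum(weights))  # uncertainty on the combination
    return mean, sigma

# CDF: 80387 +/- 19 MeV/c2; DZero: 80375 +/- 23 MeV/c2 (treated as uncorrelated)
m_w, err = weighted_average([80387.0, 80375.0], [19.0, 23.0])
```

Note how the combined uncertainty comes out smaller than either input, which is the whole point of combining measurements.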

These ultra-precise, rigorous measurements took up to five years for the collaborations to complete independently. The collaborations measured the particle’s mass in six different ways, which all match and combine for a result that is twice as precise as the previous measurement.

The new W mass measurement and the latest precision determination of the mass of the top quark from Fermilab triangulate the location of the Higgs particle and restrict its mass to less than 152 GeV/c2. This is in agreement with the latest direct searches at the LHC, which constrain the Higgs mass to less than 127 GeV/c2, and with direct-search limits from the Tevatron, which point to a Higgs mass of less than 156 GeV/c2, before the update of their results expected this week.

Measurements of the W boson mass provide critical stress tests of the accuracy of the Standard Model. If the mass were not as predicted, that would suggest that physicists' understanding of nature as explained by the Standard Model is wrong, and imply the existence of undiscovered particles or forces. In principle, the W mass might shine a light on all sorts of new physics models.

Physicists attach such importance to the mass of the W boson because the existence of the world we live in depends on it living up to predictions. The W boson is a carrier of the weak nuclear force, which drives processes such as radioactive decay; its large mass is what makes that force so short-ranged. The world would be vastly different from what we know if the W boson had no mass: ordinary atoms and matter as we know it could not exist, and we certainly wouldn't exist.

A need to explain why the mass of the W boson is about 85 times the mass of the proton gave rise to the Higgs theory, which explains that mass as something acquired when the W boson passes through a molasses-like Higgs energy field. As if the molasses were sticking to it, the particle slows down and, in accordance with Einstein's theory of special relativity, becomes more massive.

— Tona Kunz


Fermilab planning a busy 2012

Tuesday, January 3rd, 2012

This column by Fermilab Director Pier Oddone first appeared in Fermilab Today Jan. 3.

We have a mountain of exciting work coming our way!

In accelerator operations, we need to give enough neutrinos to MINERvA to complete their low-energy run, enough anti-neutrinos to MiniBooNE to complete their run and enough neutrinos to MINOS to enable their independent neutrino velocity measurement that will follow up on last year’s OPERA results. We need to provide test beams to several technology development projects and overcome setbacks due to an aging infrastructure to deliver beam to the SeaQuest nuclear physics experiment. And we need to do all of this in the first few months of the year before a year-long shutdown starts. During the shutdown, we will modify the accelerator complex for the NOvA era and begin the campaign to double the number of protons from the Booster to deliver simultaneous beams to various experiments.

In parallel with accelerator modifications, we will push forward on many new experiments. The NOvA detector is in full construction mode, and we face challenges in the very large number of detector elements and large mechanical systems. Any project of this scale requires a huge effort to achieve the full promise of its design. We have the resources in our FY2012 budget to make a lot of progress toward MicroBooNE, Mu2e and LBNE. We will continue to work with DOE to advance Muon g-2. All these experiments are at an important stage in their development and need to be firmly established this year.

At the Cosmic Frontier, we will commission and start operation of the Dark Energy Survey at the Blanco Telescope in Chile, where the camera has arrived and is being tested. In the dark matter arena we will commission and operate the 60 kg COUPP detector at Canada’s SNOLAB and continue the run of the CDMS 15 kg detector in the Soudan Mine while carrying out R&D on future projects. We continue to have a major role in the operation of the Pierre Auger cosmic-ray observatory. In addition we should complete the first phase of the Fermilab Holometer, which will study the properties of space-time at the Planck scale.

At the Energy Frontier, we play a major role in the LHC detector operations and analysis. It should be a fabulously exciting year at the LHC as we push on the hints that we already see in the data.

Beyond construction and operation of facilities we continue our R&D efforts on the superconducting RF technology necessary for Project X and other future accelerators. We will be building the Illinois Accelerator Research Center and moving forward to connect our advanced accelerator program with industry and universities. Our rich program on theory, computation and detector technology will continue to support our laboratory and the particle physics community.

If we accomplish all that is ahead of us for 2012, it will be a year to remember and celebrate when we hit New Year’s Day 2013!


This article first appeared in ISGTW Dec. 21, 2011.

A night-time view of the Tevatron. Photo by Reidar Hahn.

This is the first part of a two-part series on the contribution Tevatron-related computing has made to the world of computing. This part begins in 1981, when the Tevatron was under construction, and brings us up to recent times. The second part will focus on the most recent years, and look ahead to future analysis.

Few laypeople think of computing innovation in connection with the Tevatron particle accelerator, which shut down earlier this year. Mention of the Tevatron inspires images of majestic machinery, or thoughts of immense energies and groundbreaking physics research, not circuit boards, hardware, networks, and software.

Yet over the course of more than three decades of planning and operation, a tremendous amount of computing innovation was necessary to keep the data flowing and physics results coming. In fact, computing continues to do its work. Although the proton and antiproton beams no longer brighten the Tevatron’s tunnel, physicists expect to be using computing to continue analyzing a vast quantity of collected data for several years to come.

When all that data is analyzed, when all the physics results are published, the Tevatron will leave behind an enduring legacy. Not just a physics legacy, but also a computing legacy.

In the beginning: The fixed-target experiments

This image of an ACP system was taken in 1988. Photo by Reidar Hahn.

1981. The first Indiana Jones movie is released. Ronald Reagan is the U.S. President. Prince Charles makes Diana a Princess. And the first personal computers are introduced by IBM, setting the stage for a burst of computing innovation.

Meanwhile, at the Fermi National Accelerator Laboratory in Batavia, Illinois, the Tevatron has been under development for two years. And in 1982, the Advanced Computer Program (ACP) formed to confront key particle physics computing problems. ACP tried something new in high performance computing: building custom systems using commercial components, which were rapidly dropping in price thanks to the introduction of personal computers. For a fraction of the cost, the resulting 100-node system doubled the processing power of Fermilab's contemporary mainframe-style supercomputers.

“The use of farms of parallel computers based upon commercially available processors is largely an invention of the ACP,” said Mark Fischler, a Fermilab researcher who was part of the ACP. “This is an innovation which laid the philosophical foundation for the rise of high throughput computing, which is an industry standard in our field.”

The Tevatron fixed-target program, in which protons were accelerated to record-setting speeds before striking a stationary target, launched in 1983 with five separate experiments. When ACP’s system went online in 1986, the experiments were able to rapidly work through an accumulated three years of data in a fraction of that time.

Entering the collider era: Protons and antiprotons and run one

1985. NSFNET (National Science Foundation Network), one of the precursors to the modern Internet, is launched. And the Tevatron’s CDF detector sees its first proton-antiproton collisions, although the Tevatron’s official collider run one won’t begin until 1992.

The experiment’s central computing architecture filtered incoming data by running Fortran-77 algorithms on ACP’s 32-bit processors. But for run one, they needed more powerful computing systems.

By that time, commercial workstation prices had dropped so low that networking them together was simply more cost-effective than a new ACP system. ACP had one more major contribution to make, however: the Cooperative Processes Software.

CPS divided a computational task into a set of processes and distributed them across a processor farm – a collection of networked workstations. Although the term “high throughput computing” was not coined until 1996, CPS fits the HTC mold. As with modern HTC, farms using CPS are not supercomputer replacements. They are designed to be cost-effective platforms for solving specific compute-intensive problems in which each byte of data read requires 500-2000 machine instructions.

CPS went into production-level use at Fermilab in 1989; by 1992 it was being used by nine Fermilab experiments as well as a number of other groups worldwide.

1992 was also the year that the Tevatron’s second detector experiment, DZero, saw its first collisions. DZero launched with 50 traditional compute nodes running in parallel, connected to the detector electronics; the nodes executed filtering software written in Fortran, E-Pascal, and C.

Gearing up for run two

"The Great Wall" of 8mm tape drives at the Tagged Photon Laboratory, circa 1990 - from the days before tape robots. Photo by Reidar Hahn.

1990. CERN’s Tim Berners-Lee launches the first publicly accessible World Wide Web server using his URL and HTML standards. One year later, Linus Torvalds releases Linux to several Usenet newsgroups. And both DZero and CDF begin planning for the Tevatron’s collider run two.

Between the end of collider run one in 1996 and the beginning of run two in 2001, the accelerator and detectors were scheduled for substantial upgrades. Physicists anticipated more particle collisions at higher energies, and multiple interactions that were difficult to analyze and untangle. That translated into managing and storing 20 times the data from run one, and a growing need for computing resources for data analysis.

Enter the Run Two Computing Project (R2CP), in which representatives from both experiments collaborated with Fermilab’s Computing Division to find common solutions in areas ranging from visualization and physics analysis software to data access and storage management.

R2CP officially launched in 1996. It was the early days of the dot com era. eBay had existed for a year, and Google was still under development. IBM's Deep Blue defeated chess master Garry Kasparov. And Linux was well-established as a reliable open-source operating system. The stage was set for experiments to get wired and start transferring their irreplaceable data to storage via Ethernet.

The high-tech tape robot used today. Photo by Reidar Hahn.

“It was a big leap of faith that it could be done over the network rather than putting tapes in a car and driving them from one location to another on the site,” said Stephen Wolbers, head of the scientific computing facilities in Fermilab’s computing sector. He added ruefully, “It seems obvious now.”

The R2CP’s philosophy was to use commercial technologies wherever possible. In the realm of data storage and management, however, none of the existing commercial software met their needs. To fill the gap, teams within the R2CP created Enstore and the Sequential Access Model (SAM, which later stood for Sequential Access through Meta-data). Enstore interfaces with the data tapes stored in automated tape robots, while SAM provides distributed data access and flexible dataset history and management.

By the time the Tevatron’s run two began in 2001, DZero was using both Enstore and SAM, and by 2003, CDF was also up and running on both systems.

Linux comes into play

The R2CP’s PC Farm Project targeted the issue of computing power for data analysis. Between 1997 and 1998, the project team successfully ported CPS and CDF’s analysis software to Linux. To take the next step and deploy the system more widely for CDF, however, they needed their own version of Red Hat Enterprise Linux. Fermi Linux was born, offering improved security and a customized installer; CDF migrated to the PC Farm model in 1998.

The early computer farms at Fermilab, when they ran a version of Red Hat Linux (circa 1999). Photo by Reidar Hahn.

Fermi Linux enjoyed limited adoption outside of Fermilab, until 2003, when Red Hat Enterprise Linux ceased to be free. The Fermi Linux team rebuilt Red Hat Enterprise Linux into the prototype of Scientific Linux, and formed partnerships with colleagues at CERN in Geneva, Switzerland, as well as a number of other institutions; Scientific Linux was designed for site customizations, so that in supporting it they also supported Scientific Linux Fermi and Scientific Linux CERN.

Today, Scientific Linux is ranked 16th among open source operating systems; the latest version was downloaded over 3.5 million times in the first month following its release. It is used at government laboratories, universities, and even corporations all over the world.

“When we started Scientific Linux, we didn’t anticipate such widespread success,” said Connie Sieh, a Fermilab researcher and one of the leads on the Scientific Linux project. “We’re proud, though, that our work allows researchers across so many fields of study to keep on doing their science.”

Grid computing takes over

As both CDF and DZero datasets grew, so did the need for computing power. Dedicated computing farms reconstructed data, and users analyzed it using separate computing systems.

“As we moved into run two, people realized that we just couldn’t scale the system up to larger sizes,” Wolbers said. “We realized that there was really an opportunity here to use the same computer farms that we were using for reconstructing data, for user analysis.”

A wide-angle view of the modern Grid Computing Center at Fermilab. Today, the GCC provides computing to the Tevatron experiments as well as the Open Science Grid and the Worldwide Large Hadron Collider Computing Grid. Photo by Reidar Hahn.

Today, the concept of opportunistic computing is closely linked to grid computing. But in 1996 the term “grid computing” had yet to be coined. The Condor Project had been developing tools for opportunistic computing since 1988. In 1998, the first Globus Toolkit was released. Experimental grid infrastructures were popping up everywhere, and in 2003, Fermilab researchers, led by DZero, partnered with the US Particle Physics Data Grid, the UK’s GridPP, CDF, the Condor team, the Globus team, and others to create the Job and Information Management system, JIM. Combining JIM with SAM resulted in a grid-enabled version of SAM: SAMGrid.

“A pioneering idea of SAMGrid was to use the Condor Match-Making service as a decision making broker for routing of jobs, a concept that was later adopted by other grids,” said Fermilab-based DZero scientist Adam Lyon. “This is an example of the DZero experiment contributing to the development of the core Grid technologies.”

By April 2003, the SAMGrid prototype was running on six clusters across two continents, setting the stage for the transition to the Open Science Grid in 2006.

From the Tevatron to the LHC – and beyond

Throughout run two, researchers continued to improve the computing infrastructure for both experiments. A number of computing innovations emerged before the run ended in September 2011. Among these was CDF’s GlideCAF, a system that used the Condor glide-in system and Generic Connection Brokering to provide an avenue through which CDF could submit jobs to the Open Science Grid. GlideCAF served as the starting point for the subsequent development of a more generic glidein Work Management System. Today glideinWMS is used by a wide variety of research projects across diverse research disciplines.

Another notable contribution was the Frontier system, which was originally designed by CDF to distribute data from central databases to numerous clients around the world. Frontier is optimized for applications where there are large numbers of widely distributed clients that read the same data at about the same time. Today, Frontier is used by CMS and ATLAS at the LHC.

“By the time the Tevatron shut down, DZero was processing collision events in near real-time and CDF was not far behind,” said Patricia McBride, the head of scientific programs in Fermilab’s computing sector. “We’ve come a long way; a few decades ago the fixed-target experiments would wait months before they could conduct the most basic data analysis.”

One of the key outcomes of computing at the Tevatron was the expertise developed at Fermilab over the years. Today, the Fermilab computing sector has become a worldwide leader in scientific computing for particle physics, astrophysics, and other related fields. Some of the field’s top experts worked on computing for the Tevatron. Some of those experts have moved on to work elsewhere, while others remain at Fermilab where work continues on Tevatron data analysis, a variety of Fermilab experiments, and of course the LHC.

The accomplishments of the many contributors to Tevatron-related computing are noteworthy. But there is a larger picture here.

“Whether in the form of concepts, or software, over the years the Tevatron has exerted an undeniable influence on the field of scientific computing,” said Ruth Pordes, Fermilab’s head of grids and outreach. “We’re very proud of the computing legacy we’ve left behind for the broader world of science.”

— Miriam Boon


Christmas time brings not only presents and pretty cookies but an outpouring of media lists proffering the best science stories of the year and predicting those that will top the list in 2012.

While the lists varied wildly, everyone seemed excited by a few of the same things: upsetting Einstein's theory of special relativity, a hint of the 'god particle' and finding planets like our own.

Several of the stories that made nearly every media outlet’s list, though in various rankings, have a connection, directly or indirectly, to Fermilab. Here’s a sampling with the rankings from the publications.

Discover magazine had the largest list, picking the top 100 science stories.

1: A claim by researchers at the OPERA experiment at CERN that they had measured neutrinos traveling faster than the speed of light, something disallowed by Einstein’s Theory of Special Relativity. Now the scientific community is looking for another experiment to cross-check OPERA’s claim.

That brought renewed interest to a 2007 measurement by the MINOS experiment based at Fermilab that found neutrinos skirting the cosmic speed limit, but only slightly. The MINOS collaboration always planned to study this further when it upgrades its detector in early 2012 but the OPERA result added new urgency.

Look for MINOS to update the neutrino time-of-flight debate in three stages in 2012. First, MINOS is analyzing the data collected since its 2007 result to look for this phenomenon. Results should be ready in early 2012, and will likely improve the MINOS precision in this area by a factor of three over the 2007 result. Second, MINOS is upgrading its timing system within the next few months, using a system of atomic clocks to detect when the neutrinos arrive at the detector. The atomic clock system will progressively improve resolution, which is needed to make the MINOS analysis comparable to the OPERA result and improve precision over the 2007 MINOS result by as much as a factor of 10. That will tell us if OPERA was on the right track or not, but may not be the definitive answer. That answer will come with the upgrades to the MINOS experiment and a more powerful neutrino beam, producing a larger quantity of neutrino events to study. The upgraded MINOS experiment will be in many ways a more precise system than OPERA's and could produce a result comparable with OPERA's precision, likely by January 2014.

4: Kepler’s search for Earth-like planets that could sustain life produces a bounty of cosmic surprises, fueled, in part, by the computing skills of a Fermilab astrophysicist.
32: The on-again, off-again rumor of finding the Higgs boson particle. Physicists working with experiments at Fermilab's Tevatron and CERN's Large Hadron Collider expect to answer the question of whether a Standard Model version of the Higgs exists in 2012.
65: The shutdown of the Tevatron at Fermilab after 28 years and numerous scientific and technological achievements.
82: Fermilab physicist Jason Steffen’s frustration with slow airplane boarding drives him to figure out a formula to speed up the aisle crawl.

Nature’s year in review didn’t rank stories but started off by mentioning the Tevatron’s shutdown after 28 years and following up shortly with the puzzling particle news of potentially FTL neutrinos and a Higgs sighting.

For science — as for politics and economics — 2011 was a year of upheaval, the effects of which will reverberate for decades. The United States lost three venerable symbols of its scientific might: the space-shuttle programme, the Tevatron particle collider and blockbuster profits from the world’s best-selling drug all came to an end.

Cosmos magazine rankings:

The MINOS far detector in the Soudan Mine in Minnesota. Credit: Fermilab

1: Kepler’s exoplanet findings
2: FTL neutrinos
3: Higgs

Scientific American‘s choices:

3: FTL neutrinos
5: Higgs

ABC News asked physicist and science radio and TV host Michio Kaku for his top 10 picks. They include:

3: Hint of Higgs
5: Kepler’s exoplanet findings
10: Nobel Prize for the discovery that the expansion of the universe is accelerating, which laid the groundwork for today's search for dark energy. Fermilab has several connections to this work. The latest tool in dark energy survey experiments, the Dark Energy Camera, was constructed at Fermilab in 2011. One of the three prize winners, Saul Perlmutter, is a member of the group that will use the camera, the Dark Energy Survey collaboration. Adam Riess, another of the winners, is a member of the SDSS-II experiment, a predecessor to DES; Fermilab played a key role in building SDSS and later operating its computing system.

Live Science

5: FTL neutrinos
4: Kepler’s exoplanet findings
2: Higgs

If the Higgs boson’s mass is high, it is expected to decay predominantly into two W bosons. Plushies images from the Particle Zoo.

To make the Ars Technica list, stories had to be awe-inspiring in 2011 AND have a chance of making the 2012 list as well.

1: FTL neutrinos
2: Kepler’s exoplanet findings
6: Higgs hunt

Science magazine chose the best scientific breakthrough of the year. Kepler's exoplanet hunt made the runner-up list.

Tell us who you agree with or, better yet, give us your own top 10 science stories of the year.

— Tona Kunz

Share

Real CMS proton-proton collision events in which 4 high energy electrons (green lines and red towers) are observed. The event shows characteristics expected from the decay of a Higgs boson but is also consistent with background Standard Model physics processes. Courtesy: CMS

Today physicists at CERN on the CMS and ATLAS experiments at the Large Hadron Collider announced an update on their search for the Higgs boson. That may make you wonder (I hope): what is Fermilab's role in this? Well, glad you asked.

Fermilab supports the 1,000 US LHC scientists and engineers by providing office and meeting space as well as the Remote Operation Center. Fermilab helped design the CMS detector and a portion of the LHC accelerator, and is working on upgrades for both. About one-third of the members of each of the Tevatron's experiments, CDF and DZero, are also members of the LHC experiments.

That means that a good portion of the LHC researchers are also looking for the Higgs boson with the Tevatron. Because the Tevatron and LHC accelerators collide different pairs of particles, the dominant way in which the experiments search for the Higgs differs between the two accelerators. Thus the two machines offer complementary search strategies.

If the Higgs exists and acts the way theorists expect, it is crucial to observe it in both types of decay patterns. Watch this video to learn how physicists search for the Higgs boson. These types of investigations might lead to the identification of new and unexpected physics.

Scientists from the CDF and DZero collaborations at Fermilab continue to analyze data collected before the September shutdown of the Tevatron in the search for the Higgs boson.

The two collaborations will announce their latest results for the Higgs boson search at an international particle physics conference in March 2012. This new updated analysis will have 20 to 40 percent more data than the July 2011 results as well as further improvements in analysis methods.

The Higgs particle is the last not-yet-observed piece of the theoretical framework known as the Standard Model of particles and forces. Watch this video to learn about the nature of the Higgs boson and how it works. According to the Standard Model, the Higgs boson explains why some particles have mass and others do not. The Higgs most likely has a mass between 114 and 137 GeV/c2, about 100 times the mass of a proton. This predicted mass range is based on stringent constraints established by earlier measurements made by the Tevatron and other accelerators around the world, and confirmed by the searches of the LHC experiments presented so far in 2011. This mass range is well within reach of the Tevatron collider.

The Tevatron experiments already have demonstrated that they can ferret out the Higgs-decay pattern by applying the well-established techniques used to search for the Higgs boson to observe an extremely rare but firmly expected physics signature. This signature consists of pairs of heavy bosons (WW or WZ) that decay into a pair of b quarks, a process that closely mimics the main signature that the Tevatron experiments use to search for the Higgs particle, i.e., a Higgs decaying to a pair of b quarks, which is by far the most likely decay in this mass range. Thus, if a Standard Model Higgs exists, the Tevatron experiments will see it.

If the Standard Model Higgs particle does not exist, Fermilab's Tevatron experiments are on track to rule it out this winter. The CDF and DZero experiments have excluded the existence of a Higgs particle in the 100-108 and the 156-177 GeV/c2 mass ranges and will have sufficient analysis sensitivity this winter to rule out the mass region in between.

While today’s announcement shows the progress that the LHC experiments have made in the last few months, all eyes will be on the Tevatron and on the LHC in March 2012 to see what they have to say about the elusive Higgs boson.

— Tona Kunz

Share

CDF (red) and DZero (yellow) recorded the Colorado earthquake. Image courtesy of Todd Johnson, AD

On Tuesday, Aug. 23, the Tevatron accelerator knew something none of the people operating it knew. It felt what employees didn’t, and it reported the news faster than the media could upload it to the Internet.

A 5.9-magnitude earthquake had struck the East Coast, and the super-sensitive Tevatron felt it as it happened about 600 miles away. It had also registered a similar quake in Colorado the night before.

The quakes were recorded by sensors on large underground focusing magnets that compress particle beams from the four-mile Tevatron ring into precision collisions at the CDF and DZero detectors. The sensors keep these areas, which are the most sensitive to misalignment, under constant surveillance. Quakes can jiggle small numbers of particles – less than one percent of the beam – out of alignment and force the shutdown of parts of the three-story detectors to avoid damage. Tevatron operators compare the sensor recordings with updates from the U.S. Geological Survey to rule out natural causes before having to spend time diagnosing machine trouble as the cause of beam movement.

Typically, two quakes occurring in this short a timeframe would cause headaches for those who run the Tevatron, but fortunately the machine didn’t have beam in the tunnels at the time.

CDF (red) and DZero (yellow) recorded the East Coast earthquake. Image courtesy of Todd Johnson, AD

The Tevatron has recorded more than 20 earthquakes from all over the globe, including the quakes that triggered the deadly tsunamis in Sumatra in 2005 and in Japan in March.

—Tona Kunz

Share

This article appeared in ILC Newsline July 28.

Editor’s note: Fermilab has been working with national and international institutions to develop superconducting radio-frequency cavities and their encapsulating cryomodules for next-generation accelerators, including the proposed International Linear Collider and the proposed Project X.

CM1 with its recently installed RF distribution system ready for testing. Image: Jerry Leibfritz

SRF technology enables the acceleration of intense beams of particles to high energies more efficiently and at lower costs than other technologies. SRF technology could also be applied in the areas of clean nuclear energy and transmutation of radioactive waste.

Cryomodule 1 is now firing on all eight cavities.

Cryomodule 1, Fermilab’s test cryomodule for ILC-type accelerating cavities and superconducting radiofrequency (SRF) technology, was powered up as a complete, multi-cavity instrument earlier this month. Previously, researchers had delivered power only to the individual cavities inside it.

“We’ve operated superconducting cavities before, but this is the next step in scale,” said Sergei Nagaitsev of Fermilab’s Accelerator Division. “Operating a single cavity in its own cryostat is comparable, but with a full cryomodule, the complexity goes up by an order of magnitude.”

Since the cool-down of CM1 last November, scientists and engineers have been busy installing the plumbing for power distribution, called waveguides; upgrading the water skid, which helps with the cooling of the high-power RF equipment; and taking data on each cavity’s accelerating gradient and quality factor, or Q. Researchers completed the cavity tests in June.

“The big question now is how this module performs compared to when the cavities were at DESY,” said Fermilab’s Elvin Harms. The German physics lab DESY provided all eight CM1 cavities, which were tested before they came across the Atlantic. Over the coming weeks, researchers will continue to feed power into the cryomodule to gather data on how the cavities perform as a single unit rather than as individual elements. The hope is that their gradients and Q will be in reasonable agreement with DESY’s numbers.

To make sure the data that comes through is reliable, the CM1 team will work on calibrations, test RF power operation, and work the kinks out of the system. Then comes a multi-week programme where scientists will perform stability tests and beam studies for the ILC beam current programme, which includes tests that can be conducted without the presence of beam. Researchers will also use CM1 for tests for Project X, Fermilab’s proposed proton accelerator programme.

Not all sailing was smooth in the time since the November cool-down. Some cavities still have wrinkles that need to be ironed out.

“Nevertheless, the fact that the integration of it all into a single system worked is a tremendous boost for the Accelerator Division, the Technical Division and our collaborators,” Nagaitsev said.

Collaborators on CM1 include researchers from DESY, INFN in Italy and KEK in Japan.

“Many people have invested a lot of time in CM1,” Harms said. “They’ve been eagerly waiting to get this to this day.”

— Leah Hesla

Share

The combined Tevatron results exclude the existence of a Higgs particle with a mass between 100-108 and 156-177 GeV/c2. For the range 110-155 GeV/c2, the experiments are now extremely close to the sensitivity needed (dotted line below 1) either to see a substantial excess of Higgs-like events or to rule out the existence of the particle. The small excess of Higgs-like events observed by the Tevatron experiments in the range from 120 to 155 (see solid curve) is not yet statistically significant.

Scientists of the CDF and DZero collaborations at Fermilab continue to increase the sensitivity of their Tevatron experiments to the Higgs particle and narrow the range in which the particle seems to be hiding. At the European Physical Society conference in Grenoble, Fermilab physicist Eric James reported today that together the CDF and DZero experiments now can exclude the existence of a Higgs particle in the 100-108 and the 156-177 GeV/c2 mass ranges, expanding exclusion ranges that the two experiments had reported in March 2011.

Last Friday, the ATLAS and CMS experiments at the European center for particle physics, CERN, reported their first exclusion regions. The two experiments exclude a Higgs particle with a mass of about 150 to 450 GeV/c2, confirming the Tevatron exclusion range and extending it to higher masses that are beyond the reach of the Tevatron. Even larger Higgs masses are excluded on theoretical grounds.

This leaves a narrow window for the Higgs particle, and the Tevatron experiments are on track to collect enough data by the end of September 2011 to close this window if the Higgs particle does not exist.

James reported that the Tevatron experiments are steadily becoming more sensitive to Higgs processes that the LHC experiments will not be able to measure for some time. In particular, the Tevatron experiments can look for the decay of a Higgs particle into a pair of bottom and anti-bottom quarks, which is the dominant, hard-to-detect decay mode of the Higgs particle. In contrast, the ATLAS and CMS experiments currently focus on the search for the decay of a Higgs particle into a pair of W bosons, which then decay into lighter particles.

This graph shows the improvement in the combined sensitivity of the CDF and DZero experiments to a Higgs signal over the last couple of years. When the sensitivity for a particular value of the Higgs mass, mH, drops below one, scientists expect the Tevatron experiments to be able to rule out a Higgs particle with that particular mass. By early 2012, the Tevatron experiments should be able to corroborate or rule out a Higgs particle with a mass between 100 to about 190 GeV/c2.

The LHC experiments reported at the EPS conference an excess of Higgs-like events in the 120-150 GeV/c2 mass region at about the 2-sigma level. The Tevatron experiments have seen a small, 1-sigma excess of Higgs-like events in this region for a couple of years. A 3-sigma level is considered evidence for a new result, but particle physicists prefer a 5-sigma level to claim a discovery. More data and better analyses are necessary to determine whether these excesses are due to a Higgs particle, some new phenomena or random data fluctuations.
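The sigma levels quoted here correspond to the probability that a background fluctuation alone would produce an excess at least as large as the one observed. A minimal sketch of that conversion, assuming the standard one-sided Gaussian convention (collaborations state their exact convention in each result):

```python
import math

def one_sided_p_value(n_sigma: float) -> float:
    """Probability of a pure background fluctuation at least n_sigma above the mean."""
    # erfc gives the two-sided Gaussian tail; halve it for one side.
    return 0.5 * math.erfc(n_sigma / math.sqrt(2))

# 1 sigma is common, 3 sigma ("evidence") is rare, 5 sigma ("discovery") rarer still.
for sigma in (1, 3, 5):
    print(f"{sigma} sigma -> p = {one_sided_p_value(sigma):.2e}")
```

The numbers explain the field's caution: a 3-sigma excess still happens about once in 740 tries by chance, while 5 sigma corresponds to roughly one in 3.5 million.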

In early July, before the announcement of the latest Tevatron and LHC results, a global analysis of particle physics data by the GFitter group indicated that, in the simplest Higgs model, the Higgs particle should have a mass between approximately 115 and 137 GeV/c2.

“To have confidence in having found the Higgs particle that theory predicts, you need to analyze the various ways it interacts with other particles,” said Giovanni Punzi, co-spokesperson of the CDF experiment. “If there really is a Higgs boson hiding in this region, you should be able to find its decay into a bottom-anti-bottom pair. Otherwise, the result could be a statistical fluctuation, or some different particle lurking in your data.”

The CDF and DZero experiments will continue to take data until the Tevatron shuts down at the end of September.

“The search for the Higgs particle in its bottom and anti-bottom quark decay mode really has been the strength of the Tevatron,” said Dmitri Denisov, DZero co-spokesperson.

“With the additional data and further improvements in our analysis tools, we expect to be sensitive to the Higgs particle for the entire mass range that has not yet been excluded. We should be able to exclude the Higgs particle or see first hints of its existence in early 2012.”

The details of the CDF and DZero analysis are described in this note, which will be posted later today, as well as submitted to the arXiv.

—Kurt Riesselmann

Share

This article first appeared in symmetry breaking July 22.

Editor’s note:  The LHC experiments reported at the EPS meeting a tantalizing excess of Higgs-like events, short of claiming a discovery, but very intriguing nevertheless. See the Higgs search at the LHC section further below for more information on these results.


Experiments at Fermi National Accelerator Laboratory and the European particle physics center, CERN, are zooming in on the final remaining mass region where the Higgs particle might be lurking. Over the next seven days, Fermilab’s CDF and DZero collaborations and CERN’s ATLAS and CMS collaborations will announce their latest Higgs search results at the High-Energy Physics conference of the European Physical Society.

Scientists at Fermilab and CERN employ very similar methods to create the Higgs: accelerate particles to high energy using the world’s most powerful accelerators, the Tevatron (1 TeV beam energy) and the Large Hadron Collider (3.5 TeV), smash the particles together, and sift through the large number of new particles emerging from these collisions. But to find a Higgs particle among the many particles created, the teams of scientists are focusing on different signals (see below).
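For a symmetric head-on collider like the Tevatron or the LHC, the total collision energy is simply twice the beam energy, which is how the beam energies quoted above translate into collision energies. A quick sketch using the article's round numbers (the Tevatron's actual per-beam energy was slightly below 1 TeV):

```python
# Center-of-mass energy of a symmetric head-on collider: sqrt(s) = 2 * E_beam.
def collision_energy_tev(beam_energy_tev: float) -> float:
    return 2.0 * beam_energy_tev

# Beam energies as quoted in the article.
print(f"Tevatron:   ~{collision_energy_tev(1.0):.1f} TeV")
print(f"LHC (2011): ~{collision_energy_tev(3.5):.1f} TeV")
```

That factor-of-three-plus gap in collision energy is why the LHC can produce heavier particles and, eventually, far more Higgs candidates per year than the Tevatron.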

If the Higgs particle exists and has the properties predicted by the simplest Higgs model, named after Scottish physicist Peter Higgs, then the colliders at Fermilab and CERN already must have produced Higgs particles. But finding the tell-tale sign of a Higgs boson among all other particle signatures is like searching for a drop of ink in an ocean. Only if the accelerators produce more and more collisions do scientists stand a chance of finding enough evidence for the Higgs particle.

Where to look

The Higgs mechanism, developed in the 1960s by several independent groups of theorists, explains why some fundamental particles have mass and others don’t. Its mathematical framework fits perfectly into one of the most successful theories in science: the Standard Model of elementary particles and forces.

Experimenters sifting through data from one experiment after another have come up empty-handed; instead they have ruled out larger and larger swaths of potential Higgs territory. An analysis by the GFitter group of precision measurements and the direct and indirect constraints on the Higgs mass indicates that, in the simplest Higgs model, the Higgs particle should have a mass between approximately 115 and 137 billion electron volts (GeV/c2), or about 100 times the mass of a proton.

Higgs search at the Tevatron

At Fermilab’s Tevatron, scientists attempt to produce Higgs particles by smashing together protons and antiprotons, composite particles that comprise elementary building blocks. When a proton and antiproton hit each other at high energy, scientists observe the collisions and interactions of these components, such as quarks, antiquarks and gluons. Those subatomic collisions transform energy into new particles that can be heavier than the protons themselves, as predicted by Einstein’s famous equation E=mc2.

At the Tevatron, which makes protons and antiprotons collide, scientists focus on finding signs for the decay of the Higgs particle into a bottom quark and anti-bottom quark.

Tevatron scientists have carried out detailed simulations of such collisions and found that the best chance for producing, say, a 120-GeV Higgs boson at the Tevatron is quark-antiquark collisions that create a high-energy W boson (see graphic). This W boson has a chance to spend its extra energy to generate a short-lived Higgs boson. The W boson and the Higgs boson would then decay into lighter particles that can be caught and identified by the CDF and DZero particle detectors, which surround the two proton-antiproton collision points of the Tevatron.

According to the Standard Model, such a 120-GeV Higgs boson will decay 68 percent of the time into a bottom quark and anti-bottom quark. But other collision processes and particle decays also produce bottom and anti-bottom quarks. Identifying an excess of these particles due to the decay of the Higgs boson is the best chance for Tevatron scientists to discover or rule out a Standard Model Higgs.

At the EPS conference, CDF and DZero will report (see press release) that, for the first time, the two collaborations have successfully applied well-established techniques used to search for the Higgs boson to observe extremely rare collisions that produce pairs of heavy bosons (WW or WZ) that decay into heavy quarks. This well-known process closely mimics the production of a W boson and a Higgs particle, with the Higgs decaying into a bottom quark and antiquark.

Higgs search at the LHC

At the LHC, located on the French-Swiss border, scientists smash protons into protons. Because the LHC operates at higher collision energies than the Tevatron, each collision produces on average many more particles than a collision at the Tevatron. In particular, the LHC floods its particle detectors with bottom and anti-bottom quarks created by many different types of subatomic processes. Hence it becomes more difficult than at the Tevatron to find this particular “ink in the ocean”—an excess of bottom and anti-bottom quarks in the LHC data due to the Higgs particle.

At the EPS conference, the ATLAS scientists showed that they should have been able to exclude a Higgs boson with mass between 130 and 200 GeV/c2, but instead the collaboration saw an excess of events in the 130 to 155 GeV/c2 range, as reported by ATLAS physicist Jon Butterworth in his blog at the Guardian. It could be a fluctuation, but it could also be the first hint of a Higgs signal. Geoff Brumfiel writes for Nature News that the CMS experiment also sees an excess in the 130 to 150 GeV/c2 range. (CMS physicist Tommaso Dorigo has posted the relevant CMS Higgs search plots in his blog.) Combined, the two LHC experiments should have enough data by the end of this summer to say whether this excess is real or not.

The Tevatron experiments are getting close to being sensitive to a Higgs particle near 150 GeV as well. Here is the new DZero result: the dotted line, which indicates sensitivity, is approaching 1 near 150 GeV, but the solid line, which is the actual observation, is significantly below 1, yet it differs from the expectation only at the 1 to 1.5 sigma level. Bottom line: DZero scientists cannot exclude a Higgs boson in this range. And here is the new CDF result: again, for a Higgs mass of about 150 GeV/c2, the sensitivity approaches 1, and the observed Higgs constraints agree well with the expectations. (Note that DZero shows 1-sensitivity and CDF shows sensitivity; that’s why the CDF curve is above 1.)

On Wednesday, July 27, CDF and DZero will present their combined results for this mass range at the EPS conference. The sensitivity of the combined CDF and DZero results will be even closer to 1 at 150 GeV/c2.

At the Large Hadron Collider, which smashes protons into protons, scientists focus on finding signs for the decay of the Higgs particle into two photons.


For a light Higgs boson, LHC scientists focus on a very different Higgs production and decay process, complementary to the Higgs search at the Tevatron. Detailed simulations of high-energy proton-proton collisions have shown that the best chance to catch, say, a 120-GeV Standard Model Higgs particle at the LHC is to look for a Higgs boson emerging from the collision of two gluons, followed by its decay into two high-energy gamma rays (see second graphic). This is an extremely rare process since the Higgs boson doesn’t interact directly with the massless gluons and gamma rays. Instead, the Higgs production and decay occur through intermediate, massive quark-antiquark loops, which can temporarily appear in subatomic processes in accordance with the laws of quantum mechanics. The intermediate loop, however, makes the process much rarer. In particular, the decay of a 120-GeV Standard Model Higgs boson into two gamma rays happens only once out of 500 times. Hence LHC scientists will need to gather a sufficiently large number of proton-proton collisions to observe this process.

Why do physicists think that the Higgs particle exists?

The discovery in the 1980s of heavy, force-carrying particles, known as W and Z bosons, confirmed crucial predictions made by the Standard Model and the simplest Higgs model. Since then, further discoveries and precision measurements of particle interactions have confirmed the validity of the Standard Model many times. It now seems almost impossible to explain the wealth of particle data without the Higgs mechanism. But one crucial ingredient of this fabulous particle recipe—the Higgs boson itself—has remained at large. Does it exist? How heavy is it? Does it interact with quarks and other massive particles as expected? These questions will keep scientists busy for years to come.

Want to learn more about what the Higgs particle is and how it gives mass to some particles? Watch this 3-minute video.

Kurt Riesselmann

Share