
Posts Tagged ‘CDF’

Top Quarks… So Many Top Quarks

Wednesday, April 30th, 2014

Thousands of papers on top quarks exist. Why?

There are literally thousands of papers, collaboration notes, and conference notes with the words “Top” and “Quark” in the title. As of this post, there are 3,477 since 1979 listed on inSpires, and many, many more that omit the word “quark”. Sure, this is meager compared to the 5,114 papers with the words “Higgs Boson” written since ’74 (and, to be fair, many more that omit the word “boson”), but that is over 50,000 pages of top quarks (estimating 15 pages/paper). For further comparison, there are only 395 papers with a title including the words “Bottom Quark”, 211 with “Bottomonium”, and 125 with “Bottom Hadron”. So why are there so many papers written about the top quark? The answer is that the top quark is special.


A single top quark candidate event at the Collider Detector experiment at Fermilab. Credit: CDF Collaboration

The top quark is very heavy, about 185 times heavier than the proton, and ranks as the heaviest known elementary particle in all the particle kingdom. The second-heaviest quark, the bottom quark, is only 4 or 5 times heavier than the proton. If you or I were a proton, then a medium-to-large school bus (without any people) would be a top quark. In fact, the top quark is so heavy it can decay into a real (on-shell) W boson, which is roughly half its mass. The only other particle that can do this is the Higgs. Though it is rare, exceedingly rare, the top quark can decay into real Z and Higgs bosons as well. Not even the Higgs can top that last feat.
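As a quick back-of-the-envelope check of those ratios (a sketch only; the approximate masses below, roughly 173 GeV/c2 for the top, 4.2 GeV/c2 for the bottom, and 0.94 GeV/c2 for the proton, are my assumed inputs rather than numbers quoted in this post):

    # Rough mass ratios, using approximate values in GeV/c^2 (illustrative only).
    m_top, m_bottom, m_proton = 173.3, 4.2, 0.938
    print(f"top/proton    ~ {m_top / m_proton:.0f}")     # ~185, the figure quoted above
    print(f"bottom/proton ~ {m_bottom / m_proton:.1f}")  # ~4.5, i.e. "4 or 5 times"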

Top quark decaying into real, on-shell W boson and bottom quark. Credit: DZero Collaboration

However, the top quark is still a quark. It has an electric charge that is 2/3 as large as the proton’s. It has an intrinsic angular momentum (spin) equal to the proton’s or electron’s spin. The top quark is also colored, meaning that it interacts with gluons and is influenced by the strong nuclear force (QCD). When colored objects (quarks and gluons) are produced at collider and fixed-target experiments, they undergo a process called hadronization. Hadronization happens when two colored objects are pulled far away from one another and the strong nuclear attraction between them becomes so strong that a pair of colored objects is spontaneously produced in the space between them. These new colored particles then form bound states with the old ones. The process of hadronization therefore means that we only observe the bound states of colored objects and not the colored objects themselves. Physicists have to infer their properties from the physics of bound states… or do we?


Colored objects before (L), during (Center L and Center R), and after (R) hadronization.

The onset of hadronization typically occurs about 10⁻²⁴ seconds after the creation of a colored object. Yes, that is 0.000000000000000000000001 seconds. That is incredibly fast and well beyond anything that can be resolved at an experiment. The mean lifetime of the top quark, on the other hand, is about 10⁻²⁵ seconds. In other words, the top quark is much more likely to decay into a W boson and a bottom quark, its principal decay mode, than to hadronize. By looking at the decays of the W boson (for example to an electron and an electron neutrino), their angular distributions, and other kinematic properties, we can directly measure the top quark’s quantum numbers. The top quark is special because it is the only quark whose spin and charge quantum numbers we can measure directly.
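Where does that 10⁻²⁵ s come from? The lifetime follows from the top quark’s decay width, τ = ħ/Γ. A minimal sketch, assuming the commonly quoted Standard Model width of roughly Γ_t ≈ 1.4 GeV (an assumption on my part, not a number from this post):

    # Top quark lifetime from its decay width: tau = hbar / Gamma.
    hbar_GeV_s = 6.582e-25    # reduced Planck constant, in GeV*s
    gamma_top_GeV = 1.4       # approximate SM top quark width (assumed)
    tau = hbar_GeV_s / gamma_top_GeV
    print(f"top quark lifetime ~ {tau:.1e} s")  # ~5e-25 s, shorter than the ~1e-24 s hadronization timescale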


Top quark decaying into real, on-shell W boson and bottom quark. The W boson can subsequently decay into a charged lepton and a neutrino or into a quark and anti-quark. Credit: DZero Collaboration

The top quark tells us much about the Standard Model of particle physics, but it also may be a window to new physics. Presently, no one has any idea why the top quark is so much heavier than the bottom quark, or why both are orders of magnitude heavier than the electron and muon. This is called the “Mass Hierarchy Problem” of the Standard Model and stems from the fact that the quark and lepton masses in the theory are not predicted but are taken as input parameters. This does not mean that the Standard Model is “wrong”. On the contrary, the model works very, very well; it is simply incomplete. Of course there are new models and hypotheses that offer explanations, but none have been verified by data.

However, thanks to the 2012 discovery of the Higgs boson, there is a new avenue that may shed light upon the mass hierarchy problem. We now know that quarks and leptons interact with the Higgs boson in proportion to their masses. Since the top quark is ~40 times more massive than the bottom quark, it will interact with the Higgs boson about 40 times more strongly. There is suspicion that since the Higgs boson is sensitive to the different quark and lepton masses, it may somehow play a role in how masses are assigned.
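In the Standard Model that proportionality is encoded in the Yukawa coupling, y_f = √2·m_f/v with v ≈ 246 GeV; the sketch below (using the same approximate masses assumed earlier) is just to show where the factor of ~40 comes from:

    import math

    # Standard Model Yukawa couplings: y_f = sqrt(2) * m_f / v.
    v = 246.0                     # Higgs vacuum expectation value, GeV (approximate)
    m_top, m_bottom = 173.3, 4.2  # approximate quark masses, GeV (assumed)
    y_top = math.sqrt(2) * m_top / v
    y_bottom = math.sqrt(2) * m_bottom / v
    print(f"y_top ~ {y_top:.2f}, y_bottom ~ {y_bottom:.3f}, ratio ~ {y_top / y_bottom:.0f}")  # ratio ~ 41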

Happy Colliding

– richard (@BraveLittleMuon)


This article appeared in Fermilab Today on March 26, 2014.

Tom Wicks, rigging superintendent at Joliet Steel & Construction, stands next to the stripped-down CDF detector at Fermilab. His mother, Lois Anderson, helped build the detector as an ironworker nearly 30 years ago. Photo: Amanda Solliday

One day when Tom Wicks was a child, he biked over to see his mom, Lois Anderson, working at an office in Aurora, Ill. She was at the top of the building, welding and torching as ironworkers do.

“That’s when my son told me, ‘I want to do that,'” Anderson said.

Both mother and son have worked as ironworkers on Fermilab experiments throughout their careers. Anderson, known as “Sarge” during business hours and the only female on her crew for decades, began ironworking at CDF — one of two detectors located on the Tevatron ring — when it was “a hole in the ground” in the early 1980s. Anderson and Wicks, rigging superintendent at Joliet Steel & Construction, worked together on the last upgrade of the detector in 2001.

Now Wicks is dismantling much of the roughly 4,000-ton particle detector that he, his mother and his stepfather helped build.

“She likes to tease me about it. ‘All that work we’ve put into it, and now you’re tearing it apart?'” Wicks said.

CDF ran for more than two decades, collecting data from proton-antiproton collisions from 1985 until the Tevatron shut down in 2011. Scientists at CDF and its sister detector DZero discovered the last quark predicted by the Standard Model, the top quark. Both collaborations still analyze valuable data collected from the detectors.

In its heyday, the large orange and blue CDF detector drew crowds when upgrades required rolling the machine from the collision tunnel to an open assembly hall.

“During the last upgrade, it was like a football game,” Wicks said. “There were so many people watching, you couldn’t get a space along the rail to watch us do it.”

Wicks and his crew began working with Fermilab staff to remove equipment from the CDF detector in March 2013. They will likely finish next month, leaving intact the multimillion-dollar solenoid magnet at the core of the detector.

John Wackerlin, a fellow ironworker and foreman at Walbridge, led one of the teams tasked with decommissioning the experiment. Like Wicks, he’s laying to rest something his family helped build. His father, Bob Wackerlin, welded together the structure that houses the 30-foot-tall detector.

The elder Wackerlin’s work at Fermilab started even before CDF. When his wife was pregnant with John, Bob Wackerlin worked underground in the 4-mile Tevatron tunnel while it was still being dug. He retired after 42 years as an ironworker and said he’s proud of his family’s connection to the laboratory.

“I’ve worked in just about every building on this site,” Bob Wackerlin said. “Fermilab projects are some of the best jobs that come across our ironworkers union. It’s employed a lot of people over the years.”

His son added, “Working with physicists and the talent and brainpower here — it’s unreal.”

Although CDF is turned off and its many wires and cables scrapped, much of the detector will find a home in future experiments. The solenoid magnet, for example, could be reused in another particle experiment, said Fermilab scientist Jonathan Lewis. Scientists are recycling parts of the detector for other high-energy physics projects at Fermilab, and electronics, phototubes and assorted pieces of CDF have also been shipped to other labs and universities in the United States, Europe and Japan.

Both families see this as progress.

“Once you’ve learned something from one experiment, it makes way for new experiments,” John Wackerlin said. “So now we can go on to even bigger and better things. I’m excited about it.”

Amanda Solliday


This article appeared in symmetry on March 19, 2014.

An international team of scientists from Fermilab’s Tevatron and CERN’s Large Hadron Collider has produced the world’s best value for the mass of the top quark.

Scientists working on the world’s leading particle collider experiments have joined forces, combined their data and produced the first joint result from Fermilab’s Tevatron and CERN’s Large Hadron Collider. These machines are the past and current holders of the record for most powerful particle collider on Earth.

Scientists from the four experiments involved—ATLAS, CDF, CMS and DZero—announced their joint findings on the mass of the top quark today at the Rencontres de Moriond international physics conference in Italy.

Together the four experiments pooled their data analysis power to arrive at a new world’s best value for the mass of the top quark of 173.34 ± 0.76 GeV/c2.

Experiments at the LHC at the CERN laboratory in Geneva, Switzerland and the Tevatron collider at Fermilab in Illinois, USA are the only ones that have ever seen top quarks—the heaviest elementary particles ever observed. The top quark’s huge mass (more than 100 times that of the proton) makes it one of the most important tools in the physicists’ quest to understand the nature of the universe.

The new precise value of the top-quark mass will allow scientists to test further the mathematical framework that describes the quantum connections between the top quark, the Higgs particle and the carrier of the electroweak force, the W boson. Theorists will explore how the new, more precise value will change predictions regarding the stability of the Higgs field and its effects on the evolution of the universe. It will also allow scientists to look for inconsistencies in the Standard Model of particle physics—searching for hints of new physics that will lead to a better understanding of the nature of the universe.

“The combining together of data from CERN and Fermilab to make a precision top quark mass result is a strong indication of its importance to understanding nature,” says Fermilab director Nigel Lockyer. “It’s a great example of the international collaboration in our field.”

Courtesy of: Fermilab and CERN

A total of more than six thousand scientists from more than 50 countries participate in the four experimental collaborations. The CDF and DZero experiments discovered the top quark in 1995, and the Tevatron produced about 300,000 top quark events during its 25-year lifetime, completed in 2011. Since it started collider physics operations in 2009, the LHC has produced close to 18 million events with top quarks, making it the world’s leading top quark factory.

“Collaborative competition is the name of the game,” says CERN’s Director General Rolf Heuer. “Competition between experimental collaborations and labs spurs us on, but collaboration such as this underpins the global particle physics endeavor and is essential in advancing our knowledge of the universe we live in.”

Each of the four collaborations previously released their individual top-quark mass measurements. Combining them required close collaboration between the four experiments and a detailed understanding of each other’s techniques and uncertainties. Each experiment measured the top-quark mass using several different methods by analyzing different top quark decay channels, using sophisticated analysis techniques developed and improved over more than 20 years of top quark research beginning at the Tevatron and continuing at the LHC. The joint measurement has been submitted to the arXiv.
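The actual combination uses the BLUE technique, which properly accounts for correlations between the experiments’ systematic uncertainties, but the core idea can be sketched as an inverse-variance weighted average. The numbers below are placeholders for illustration, not the experiments’ real inputs:

    # Simplified sketch of combining measurements by inverse-variance weighting.
    # (The real Tevatron+LHC combination uses the BLUE method with correlated uncertainties.)
    measurements = [(173.2, 0.9), (172.9, 1.1), (173.5, 1.0)]  # (mass, uncertainty) in GeV/c^2, placeholder values
    weights = [1.0 / sigma**2 for _, sigma in measurements]
    combined = sum(w * m for (m, _), w in zip(measurements, weights)) / sum(weights)
    combined_unc = (1.0 / sum(weights)) ** 0.5
    print(f"combined mass ~ {combined:.2f} +/- {combined_unc:.2f} GeV/c^2")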

A version of this article was originally issued by Fermilab and CERN as a press release.


This Fermilab press release was published on February 24.

Matteo Cremonesi, left, of the University of Oxford and the CDF collaboration and Reinhard Schwienhorst of Michigan State University and the DZero collaboration present their joint discovery at a forum at Fermilab on Friday, Feb. 21. The two collaborations have observed the production of single top quarks in the s-channel, as seen in data collected from the Tevatron. Photo: Cindy Arnold

Scientists on the CDF and DZero experiments at the U.S. Department of Energy’s Fermi National Accelerator Laboratory have announced that they have found the final predicted way of creating a top quark, completing a picture of this particle nearly 20 years in the making.

The two collaborations jointly announced on Friday, Feb. 21, that they had observed one of the rarest methods of producing the elementary particle – creating a single top quark through the weak nuclear force, in what is called the s-channel. For this analysis, scientists from the CDF and DZero collaborations sifted through data from more than 500 trillion proton-antiproton collisions produced by the Tevatron from 2001 to 2011. They identified about 40 particle collisions in which the weak nuclear force produced single top quarks in conjunction with single bottom quarks.

Top quarks are the heaviest and among the most puzzling elementary particles. They weigh even more than the Higgs boson – as much as an atom of gold – and only two machines have ever produced them: Fermilab’s Tevatron and the Large Hadron Collider at CERN. There are several ways to produce them, as predicted by the theoretical framework known as the Standard Model, and the most common one was the first one discovered: a collision in which the strong nuclear force creates a pair consisting of a top quark and its antimatter cousin, the anti-top quark.

Collisions that produce a single top quark through the weak nuclear force are rarer, and the process scientists on the Tevatron experiments have just announced is the most challenging of these to detect. This method of producing single top quarks is among the rarest interactions allowed by the laws of physics. The detection of this process was one of the ultimate goals of the Tevatron, which for 25 years was the most powerful particle collider in the world.

“This is an important discovery that provides a valuable addition to the picture of the Standard Model universe,” said James Siegrist, DOE associate director of science for high energy physics. “It completes a portrait of one of the fundamental particles of our universe by showing us one of the rarest ways to create them.”

Searching for single top quarks is like looking for a needle in billions of haystacks. Only one in every 50 billion Tevatron collisions produced a single s-channel top quark, and the CDF and DZero collaborations only selected a small fraction of those to separate them from background, which is why the number of observed occurrences of this particular channel is so small. However, the statistical significance of the CDF and DZero data exceeds that required to claim a discovery.
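Putting the quoted numbers side by side gives a feel for just how rare this is (rough bookkeeping only, using the figures in this release):

    # Rough bookkeeping with the numbers quoted above (illustrative only).
    collisions = 500e12        # "more than 500 trillion" proton-antiproton collisions
    rate = 1.0 / 50e9          # about one s-channel single top per 50 billion collisions
    produced = collisions * rate
    observed = 40              # "about 40" selected candidate collisions
    print(f"s-channel single tops produced ~ {produced:,.0f}")              # ~10,000
    print(f"fraction surviving the selection ~ {observed / produced:.1%}")  # ~0.4%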

“Kudos to the CDF and DZero collaborations for their work in discovering this process,” said Saul Gonzalez, program director for the National Science Foundation. “Researchers from around the world, including dozens of universities in the United States, contributed to this important find.”

The CDF and DZero experiments first observed particle collisions that created single top quarks through a different process of the weak nuclear force in 2009. This observation was later confirmed by scientists using the Large Hadron Collider.

Scientists from 27 countries collaborated on the Tevatron CDF and DZero experiments and continue to study the reams of data produced during the collider’s run, using ever more sophisticated techniques and computing methods.

“I’m pleased that the CDF and DZero collaborations have brought their study of the top quark full circle,” said Fermilab Director Nigel Lockyer. “The legacy of the Tevatron is indelible, and this discovery makes the breadth of that research even more remarkable.”

Fermilab is America’s national laboratory for particle physics research. A U.S. Department of Energy Office of Science laboratory, Fermilab is located near Chicago, Illinois, and operated under contract by the Fermi Research Alliance, LLC. Visit Fermilab’s website at www.fnal.gov and follow us on Twitter at @FermilabToday.

The DOE Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.


Higgsdependence Day

Monday, July 2nd, 2012

On July 4th CERN will hold a seminar where ATLAS and CMS will present their latest findings on the search for the Higgs boson. There’s a reasonable chance that either or both experiments will see a 5 sigma excess, and this would be enough to claim a “discovery”. One of my US friends at CERN called this day Higgsdependence Day, and all over the USA people will be celebrating with fireworks and barbecues. (Okay, perhaps they will be celebrating something else. My boss tells me he might tar and feather me as the token British member of the group…)

CERN is not the only lab to be holding a seminar. Today at 09:00 CDT Fermilab will be announcing the latest results from CDF and D0. Rumors suggest a 3 sigma excess (technically “evidence”) in the interesting region. So if you can spare the time I’d recommend you listen in on the announcement. You can see the webcast information here.

In anticipation of the CERN seminar, when I came to my office this morning I found a bottle of champagne with a label hastily pasted to the back. It seems these might be placed alongside fire extinguishers in every office at CERN! (You can get your own label here.)

No Higgs seminar is complete without a bottle of Champagne, just in case!

No Higgs seminar is complete without a bottle of Champagne, just in case!

For those of us who can’t get enough of the Higgs boson and want to brush up on the basics I would recommend the following show, put out by the BBC. This contains the latest results from the 2011 searches and it goes into quite a bit of depth about why we think the Higgs boson exists and what to expect from the 2012 searches.

Finally for those keeping score I still have $20 riding on a non-discovery. If a 5 sigma excess is seen on Wednesday there is a bit more work that needs to be done to show that it is the Standard Model Higgs, and that would probably take until the end of 2012 running. So my $20 is safe… for now.


Each time news comes out about the Higgs boson I get questions from media, friends and family trying to grasp why this particle is so important. The following questions come up again and again. So with experimenters using Fermilab’s Tevatron announcing new Higgs results Wednesday at a conference in Italy, I thought it was time to share answers to the questions that might pop into your mind.

Why should the average person care if the Higgs is found?

Understanding more about the building blocks of matter and the forces that control their interactions helps scientists to learn how to manipulate those forces to humankind’s benefit. For example, the study of the electron led to the development of electricity, the study of quantum mechanics made possible the creation of GPS systems and the study of the weak force led to an understanding of radioactive decay and nuclear power.

Now what?

The Tevatron experiments will continue to further analyze the Higgs boson data to wring out more information. In addition, the Tevatron and LHC experiments are working to combine their data for a release at an unspecified date.

Even if both teams find evidence of a Higgs boson in the same location, physicists will need to do more analysis to make sure the Higgs boson isn’t a non-Standard Model Higgs masquerading as a resident of the Standard Model. That will require physicists to measure several properties in addition to mass.

What would finding the Higgs boson mean for the field of physics?

Finding evidence of the Higgs boson would expand the following three areas of study:

• Pinpointing the mass range of the Higgs would help physicists narrow down the number of theories about the existence of undiscovered particles and the forces that act on them. For example, a Standard Model Higgs boson would rule out classic QCD-like versions of technicolor theory. A Higgs boson with a mass larger than 125 GeV would rule out the simplest versions of supersymmetry, or SUSY, which predict that every known particle has an unknown sibling particle with a different mass. Other theories would gain more support. One such SUSY theory predicts that a Standard Model Higgs boson would appear as the lightest of a group of five or more Higgs bosons. Whether the Higgs boson exists or not does not affect theories about the existence of extra dimensions.

• Knowing the mass of the Higgs boson would give physicists more data to plug into other equations about how our universe formed and about some of the least understood particle interactions, such as the muon magnetic anomaly.

• Finding evidence of a heavy mass Higgs boson (larger than 150 GeV) would require the existence of undiscovered particles and/or forces. Finding a light mass Higgs boson (less than 125 GeV) would not require the existence of new physics but doesn’t rule it out either.

What is the difference between the Higgs boson and the Higgs field?

The Higgs field exists like a giant vat of molasses spread throughout the universe. Particles that travel through it end up with globs of molasses sticking to them, slowing them down and making them heavier. You can think of the Higgs boson as the molasses globs, or a particle manifestation of this energy field akin to a ball of energy.

Physicists have different theories about how many Higgs bosons exist, akin to predicting whether the molasses would stick in one giant glob or several globlets.

How long have physicists been looking for the Higgs?

More than a decade. It started with the LEP experiment at CERN in the 1990s, continued with the Tevatron and now with the LHC.

How do physicists create a Higgs boson?

A high-energy particle accelerator such as the Tevatron or LHC can recreate the energy levels that permeated the universe shortly after the Big Bang. Colliding particles at this energy level can set free the right amount of energy to produce particles, including a Higgs boson. The collision energy is localized in a small space and transforms from energy into the mass of the Higgs boson.

How is the Higgs boson related to the Big Bang theory?

The Big Bang occurred 13.7 billion years ago, sending massless particles and radiation energy zooming through the universe like cars at rush hour. Shortly afterward, the Higgs field appeared, as if a truck carrying molasses overturned and leaked all over the highway. Particles such as light, which went through the puddle super fast, avoided having any molasses stick to them, similar to the way hydroplaning cars skim the surface of water. Particles that went through the molasses puddle more slowly had molasses globs cling to them, creating a drag that slowed them even more and made them more massive. How fast a particle made it through the puddle determined how much molasses clung to it, and thus how massive it became. When the universe began to cool, slow particles with mass began to bunch up like mini traffic jams and form composite particles and then atoms.

How do we know this is where the Higgs is located?

Just as firemen sweep building floors to rule out the existence of trapped homeowners, physicists have used direct and indirect observations from experiments to rule out the existence of the Higgs boson in most energy ranges where the Standard Model predicts it could reside.

How does the mass of the Higgs compare to its weight?

Sort of. Non-physicists think of mass as how much something weighs. But scientists consider mass to take into account weight and other factors. Weight changes with gravity, so you would weigh less on the moon than on Earth. Mass remains constant throughout the universe. However, when talking about things on Earth, mass and weight are fairly interchangeable.

How did the Higgs boson get the nickname “the God particle”?

Nobel laureate Leon Lederman, a Fermilab physicist, wrote a book in the early 1990s about particle physics and the search for the Higgs boson. His publisher coined the name as a marketable title for the book. Scientists dislike the nickname.

What countries are involved in the CDF and DZero experiments?

• CDF: US, Canada, France, Germany, Greece, Italy, Japan, Korea, UK, Russia, Slovakia, Spain and Taiwan

• DZero: Brazil, China, Colombia, Czech Republic, Ecuador, France, Germany, India, Ireland, Korea, Mexico, Netherlands, UK, Ukraine, US, Russia, Spain and Sweden

What is the competitive relationship between the Tevatron and LHC experiments?

It is closer to sibling rivalry than the traditional business competition you would find in something such as the auto industry.

Fermilab supports about 1,000 US CMS scientists and engineers by providing computing facilities, office and meeting space as well as the LHC Remote Operation Center. Fermilab helped design and build the CMS detector as well as equipment for the LHC accelerator, and Fermilab scientists are working on upgrades for both and analyzing data. About one third of the members of each of the Tevatron’s experiments, CDF and DZero, are also members of the LHC experiments.

— Tona Kunz


This article first appeared in ISGTW Dec. 21, 2011.

A night-time view of the Tevatron. Photo by Reidar Hahn.

This is the first part of a two-part series on the contribution Tevatron-related computing has made to the world of computing. This part begins in 1981, when the Tevatron was under construction, and brings us up to recent times. The second part will focus on the most recent years, and look ahead to future analysis.

Few laypeople think of computing innovation in connection with the Tevatron particle accelerator, which shut down earlier this year. Mention of the Tevatron inspires images of majestic machinery, or thoughts of immense energies and groundbreaking physics research, not circuit boards, hardware, networks, and software.

Yet over the course of more than three decades of planning and operation, a tremendous amount of computing innovation was necessary to keep the data flowing and physics results coming. In fact, computing continues to do its work. Although the proton and antiproton beams no longer brighten the Tevatron’s tunnel, physicists expect to be using computing to continue analyzing a vast quantity of collected data for several years to come.

When all that data is analyzed, when all the physics results are published, the Tevatron will leave behind an enduring legacy. Not just a physics legacy, but also a computing legacy.

In the beginning: The fixed-target experiments

This image of an ACP system was taken in 1988. Photo by Reidar Hahn.

1981. The first Indiana Jones movie is released. Ronald Reagan is the U.S. President. Prince Charles makes Diana a Princess. And IBM introduces its first personal computer, setting the stage for a burst of computing innovation.

Meanwhile, at the Fermi National Accelerator Laboratory in Batavia, Illinois, the Tevatron has been under development for two years. And in 1982, the Advanced Computer Program (ACP) is formed to confront key particle physics computing problems. ACP tried something new in high performance computing: building custom systems using commercial components, which were rapidly dropping in price thanks to the introduction of personal computers. For a fraction of the cost, the resulting 100-node system doubled the processing power of Fermilab’s contemporary mainframe-style supercomputers.

“The use of farms of parallel computers based upon commercially available processors is largely an invention of the ACP,” said Mark Fischler, a Fermilab researcher who was part of the ACP. “This is an innovation which laid the philosophical foundation for the rise of high throughput computing, which is an industry standard in our field.”

The Tevatron fixed-target program, in which protons were accelerated to record-setting speeds before striking a stationary target, launched in 1983 with five separate experiments. When ACP’s system went online in 1986, the experiments were able to rapidly work through an accumulated three years of data in a fraction of that time.

Entering the collider era: Protons and antiprotons and run one

1985. NSFNET (National Science Foundation Network), one of the precursors to the modern Internet, is launched. And the Tevatron’s CDF detector sees its first proton-antiproton collisions, although the Tevatron’s official collider run one won’t begin until 1992.

The experiment’s central computing architecture filtered incoming data by running Fortran-77 algorithms on ACP’s 32-bit processors. But for run one, they needed more powerful computing systems.

By that time, commercial workstation prices had dropped so low that networking them together was simply more cost-effective than a new ACP system. ACP had one more major contribution to make, however: the Cooperative Processes Software.

CPS divided a computational task into a set of processes and distributed them across a processor farm – a collection of networked workstations. Although the term “high throughput computing” was not coined until 1996, CPS fits the HTC mold. As with modern HTC, farms using CPS are not supercomputer replacements. They are designed to be cost-effective platforms for solving specific compute-intensive problems in which each byte of data read requires 500-2000 machine instructions.
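CPS itself long predates today’s frameworks, but the pattern it established, farming an embarrassingly parallel, compute-heavy task out to many independent workers, is easy to sketch with modern tools. The snippet below uses Python’s multiprocessing module purely as an illustration of that high-throughput pattern; it is not CPS, and the per-event “work” is a placeholder:

    from multiprocessing import Pool

    def process_event(event_id):
        # Stand-in for a compute-heavy per-event task (filtering, reconstruction, ...).
        total = 0
        for i in range(100_000):      # placeholder work loop
            total += (event_id * i) % 7
        return event_id, total

    if __name__ == "__main__":
        event_ids = range(1_000)         # independent "events" to farm out
        with Pool(processes=8) as farm:  # the "farm" of worker processes
            results = farm.map(process_event, event_ids)
        print(f"processed {len(results)} events")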

CPS went into production-level use at Fermilab in 1989; by 1992 it was being used by nine Fermilab experiments as well as a number of other groups worldwide.

1992 was also the year that the Tevatron’s second detector experiment, DZero, saw its first collisions. DZero launched with 50 traditional compute nodes running in parallel, connected to the detector electronics; the nodes executed filtering software written in Fortran, E-Pascal, and C.

Gearing up for run two

"The Great Wall" of 8mm tape drives at the Tagged Photon Laboratory, circa 1990 - from the days before tape robots. Photo by Reidar Hahn.

1990. CERN’s Tim Berners-Lee launches the first publicly accessible World Wide Web server using his URL and HTML standards. One year later, Linus Torvalds releases Linux to several Usenet newsgroups. And both DZero and CDF begin planning for the Tevatron’s collider run two.

Between the end of collider run one in 1996 and the beginning of run two in 2001, the accelerator and detectors were scheduled for substantial upgrades. Physicists anticipated more particle collisions at higher energies, and multiple interactions that were difficult to analyze and untangle. That translated into managing and storing 20 times the data from run one, and a growing need for computing resources for data analysis.

Enter the Run Two Computing Project (R2CP), in which representatives from both experiments collaborated with Fermilab’s Computing Division to find common solutions in areas ranging from visualization and physics analysis software to data access and storage management.

R2CP officially launched in 1996. It was the early days of the dot-com era. eBay had existed for a year, and Google was still under development. IBM’s Deep Blue defeated chess master Garry Kasparov. And Linux was well-established as a reliable open-source operating system. The stage was set for experiments to get wired and start transferring their irreplaceable data to storage via Ethernet.

The high-tech tape robot used today. Photo by Reidar Hahn.

“It was a big leap of faith that it could be done over the network rather than putting tapes in a car and driving them from one location to another on the site,” said Stephen Wolbers, head of the scientific computing facilities in Fermilab’s computing sector. He added ruefully, “It seems obvious now.”

The R2CP’s philosophy was to use commercial technologies wherever possible. In the realm of data storage and management, however, none of the existing commercial software met their needs. To fill the gap, teams within the R2CP created Enstore and the Sequential Access Model (SAM, which later stood for Sequential Access through Meta-data). Enstore interfaces with the data tapes stored in automated tape robots, while SAM provides distributed data access and flexible dataset history and management.

By the time the Tevatron’s run two began in 2001, DZero was using both Enstore and SAM, and by 2003, CDF was also up and running on both systems.

Linux comes into play

The R2CP’s PC Farm Project targeted the issue of computing power for data analysis. Between 1997 and 1998, the project team successfully ported CPS and CDF’s analysis software to Linux. To take the next step and deploy the system more widely for CDF, however, they needed their own version of Red Hat Enterprise Linux. Fermi Linux was born, offering improved security and a customized installer; CDF migrated to the PC Farm model in 1998.

The early computer farms at Fermilab, when they ran a version of Red Hat Linux (circa 1999). Photo by Reidar Hahn.

Fermi Linux enjoyed limited adoption outside of Fermilab, until 2003, when Red Hat Enterprise Linux ceased to be free. The Fermi Linux team rebuilt Red Hat Enterprise Linux into the prototype of Scientific Linux, and formed partnerships with colleagues at CERN in Geneva, Switzerland, as well as a number of other institutions; Scientific Linux was designed for site customizations, so that in supporting it they also supported Scientific Linux Fermi and Scientific Linux CERN.

Today, Scientific Linux is ranked 16th among open source operating systems; the latest version was downloaded over 3.5 million times in the first month following its release. It is used at government laboratories, universities, and even corporations all over the world.

“When we started Scientific Linux, we didn’t anticipate such widespread success,” said Connie Sieh, a Fermilab researcher and one of the leads on the Scientific Linux project. “We’re proud, though, that our work allows researchers across so many fields of study to keep on doing their science.”

Grid computing takes over

As both CDF and DZero datasets grew, so did the need for computing power. Dedicated computing farms reconstructed data, and users analyzed it using separate computing systems.

“As we moved into run two, people realized that we just couldn’t scale the system up to larger sizes,” Wolbers said. “We realized that there was really an opportunity here to use the same computer farms that we were using for reconstructing data, for user analysis.”

A wide-angle view of the modern Grid Computing Center at Fermilab. Today, the GCC provides computing to the Tevatron experiments as well as the Open Science Grid and the Worldwide Large Hadron Collider Computing Grid. Photo by Reidar Hahn.

Today, the concept of opportunistic computing is closely linked to grid computing. But in 1996 the term “grid computing” had yet to be coined. The Condor Project had been developing tools for opportunistic computing since 1988. In 1998, the first Globus Toolkit was released. Experimental grid infrastructures were popping up everywhere, and in 2003, Fermilab researchers, led by DZero, partnered with the US Particle Physics Data Grid, the UK’s GridPP, CDF, the Condor team, the Globus team, and others to create the Job and Information Management system, JIM. Combining JIM with SAM resulted in a grid-enabled version of SAM: SAMGrid.

“A pioneering idea of SAMGrid was to use the Condor Match-Making service as a decision making broker for routing of jobs, a concept that was later adopted by other grids,” said Fermilab-based DZero scientist Adam Lyon. “This is an example of the DZero experiment contributing to the development of the core Grid technologies.”

By April 2003, the SAMGrid prototype was running on six clusters across two continents, setting the stage for the transition to the Open Science Grid in 2006.

From the Tevatron to the LHC – and beyond

Throughout run two, researchers continued to improve the computing infrastructure for both experiments. A number of computing innovations emerged before the run ended in September 2011. Among these was CDF’s GlideCAF, a system that used the Condor glide-in system and Generic Connection Brokering to provide an avenue through which CDF could submit jobs to the Open Science Grid. GlideCAF served as the starting point for the subsequent development of a more generic glidein Work Management System. Today glideinWMS is used by a wide variety of research projects across diverse research disciplines.

Another notable contribution was the Frontier system, which was originally designed by CDF to distribute data from central databases to numerous clients around the world. Frontier is optimized for applications where there are large numbers of widely distributed clients that read the same data at about the same time. Today, Frontier is used by CMS and ATLAS at the LHC.

“By the time the Tevatron shut down, DZero was processing collision events in near real-time and CDF was not far behind,” said Patricia McBride, the head of scientific programs in Fermilab’s computing sector. “We’ve come a long way; a few decades ago the fixed-target experiments would wait months before they could conduct the most basic data analysis.”

One of the key outcomes of computing at the Tevatron was the expertise developed at Fermilab over the years. Today, the Fermilab computing sector has become a worldwide leader in scientific computing for particle physics, astrophysics, and other related fields. Some of the field’s top experts worked on computing for the Tevatron. Some of those experts have moved on to work elsewhere, while others remain at Fermilab where work continues on Tevatron data analysis, a variety of Fermilab experiments, and of course the LHC.

The accomplishments of the many contributors to Tevatron-related computing are noteworthy. But there is a larger picture here.

“Whether in the form of concepts, or software, over the years the Tevatron has exerted an undeniable influence on the field of scientific computing,” said Ruth Pordes, Fermilab’s head of grids and outreach. “We’re very proud of the computing legacy we’ve left behind for the broader world of science.”

— Miriam Boon


Christmas time brings not only presents and pretty cookies but an outpouring of media lists proffering the best science stories of the year and predicting those that will top the list in 2012.

While the lists varied wildly, everyone seemed excited by a few of the same things: upsetting Einstein’s theory of special relativity, a hint of the ‘god particle’ and finding planets like our own.

Several of the stories that made nearly every media outlet’s list, though in various rankings, have a connection, directly or indirectly, to Fermilab. Here’s a sampling with the rankings from the publications.

Discover magazine had the largest list, picking the top 100 science stories.

1: A claim by researchers at the OPERA experiment at CERN that they had measured neutrinos traveling faster than the speed of light, something disallowed by Einstein’s Theory of Special Relativity. Now the scientific community is looking for another experiment to cross-check OPERA’s claim.

That brought renewed interest to a 2007 measurement by the MINOS experiment based at Fermilab that found neutrinos skirting the cosmic speed limit, but only slightly. The MINOS collaboration always planned to study this further when it upgrades its detector in early 2012 but the OPERA result added new urgency.

Look in 2012 for MINOS to update the neutrino time-of-flight debate in three stages. First, MINOS is analyzing the data collected since its 2007 result to look for this phenomenon. Results should be ready in early 2012. This likely will improve the MINOS precision in this area by a factor of three from its 2007 result. Second, MINOS is in the process of upgrading its timing system within the next few months, using a system of atomic clocks to detect when the neutrinos arrive at the detector. The atomic clock system will progressively improve resolution, which is needed to make the MINOS analysis comparable to the OPERA result and improve precision from the 2007 MINOS result by as much as a factor of 10. That will tell us if OPERA was on the right track or not, but may not be the definitive answer. That answer will come with the upgrades to the MINOS experiment and a more powerful neutrino beam, producing a larger quantity of neutrino events to study. The upgraded MINOS experiment will be in many ways a more precise system than OPERA’s and could produce a result comparable with OPERA’s precision, likely by January 2014.

4: Kepler’s search for Earth-like planets that could sustain life produces a bounty of cosmic surprises, fueled, in part, by the computing skills of a Fermilab astrophysicist.
32: The on-again, off-again rumor of finding the Higgs boson particle. Physicists working with experiments at Fermilab’s Tevatron and CERN’s Large Hadron Collider expect to answer the question of whether a Standard Model version of the Higgs exists in 2012.
65: The shutdown of the Tevatron at Fermilab after 28 years and numerous scientific and technological achievements.
82: Fermilab physicist Jason Steffen’s frustration with slow airplane boarding drives him to figure out a formula to speed up the aisle crawl.

Nature’s year in review didn’t rank stories but started off by mentioning the Tevatron’s shutdown after 28 years and following up shortly with the puzzling particle news of potentially FTL neutrinos and a Higgs sighting.

For science — as for politics and economics — 2011 was a year of upheaval, the effects of which will reverberate for decades. The United States lost three venerable symbols of its scientific might: the space-shuttle programme, the Tevatron particle collider and blockbuster profits from the world’s best-selling drug all came to an end.

Cosmos magazine rankings:

The MINOS far detector in the Soudan Mine in Minnesota. Credit: Fermilab

1: Kepler’s exoplanet findings
2: FTL neutrinos
3: Higgs

Scientific American‘s choices:

3: FTL neutrinos
5: Higgs

ABC News asked physicist and science radio and TV host Michio Kaku for his top 10 picks. They include:

3: Hint of Higgs
5: Kepler’s exoplanet findings
10: The Nobel Prize for the discovery that the expansion of the universe is accelerating, which laid the groundwork for today’s search for dark energy. Fermilab has several connections to this work. The latest tool in dark energy survey experiments, the Dark Energy Camera, was constructed at Fermilab in 2011. One of the three prize winners, Saul Perlmutter, is a member of the group that will use the camera, the Dark Energy Survey collaboration. Adam Riess, another of the winners, is a member of the SDSS-II experiment, a predecessor to DES that Fermilab was key in building and whose computing system it later operated.

Live Science

5: FTL neutrinos
4: Kepler’s exoplanet findings
2: Higgs

If the Higgs boson’s mass is high, it is expected to decay predominantly into two W bosons. Plushie images from the Particle Zoo.

To make the Ars Technica list, stories had to be awe-inspiring in 2011 AND have a chance of making the 2012 list as well.

1: FTL neutrinos
2: Kepler’s exoplanet findings
6: Higgs hunt

Science magazine chose the best scientific breakthrough of the year. Kepler’s exoplanet hunt made it into the runner-up list.

Tell us who you agree with or, better yet, give us your own top 10 science stories of the year.

— Tona Kunz


Real CMS proton-proton collision events in which 4 high energy electrons (green lines and red towers) are observed. The event shows characteristics expected from the decay of a Higgs boson but is also consistent with background Standard Model physics processes. Courtesy: CMS

Today physicists at CERN on the CMS and ATLAS experiments at the Large Hadron Collider announced an update on their search for the Higgs boson. That may make you wonder (I hope) what Fermilab’s role in this is. Well, glad you asked.

Fermilab supports the 1,000 US LHC scientists and engineers by providing office and meeting space as well as the Remote Operation Center. Fermilab helped design the CMS detector and a portion of the LHC accelerator, and is working on upgrades for both. About one-third of the members of each of the Tevatron’s experiments, CDF and DZero, are also members of the LHC experiments.

That means that a good portion of the LHC researchers are also looking for the Higgs boson with the Tevatron. Because the Tevatron and LHC accelerators collide different pairs of particles, the dominant way in which the experiments search for the Higgs at the two accelerators is different. Thus the two machines offer complementary search strategies.

If the Higgs exists and acts the way theorists expect, it is crucial to observe it in both types of decay patterns. Watch this video to learn how physicists search for the Higgs boson. These types of investigations might lead to the identification of new and unexpected physics.

Scientists from the CDF and DZero collaborations at Fermilab continue to analyze data collected before the September shutdown of the Tevatron in the search for the Higgs boson.

The two collaborations will announce their latest results for the Higgs boson search at an international particle physics conference in March 2012. This new updated analysis will have 20 to 40 percent more data than the July 2011 results as well as further improvements in analysis methods.

The Higgs particle is the last not-yet-observed piece of the theoretical framework known as the Standard Model of particles and forces. Watch this video to learn about the nature of the Higgs boson and how it works. According to the Standard Model, the Higgs boson explains why some particles have mass and others do not. The Higgs most likely has a mass between 114 and 137 GeV/c2, about 100 times the mass of a proton. This predicted mass range is based on stringent constraints established by earlier measurements made by the Tevatron and other accelerators around the world, and confirmed by the searches of the LHC experiments presented so far in 2011. This mass range is well within reach of the Tevatron collider.

The Tevatron experiments have already demonstrated their ability to ferret out the Higgs decay pattern by applying the same well-established search techniques to an extremely rare but firmly expected physics signature. This signature consists of pairs of heavy bosons (WW or WZ) that decay into a pair of b quarks, a process that closely mimics the main signature that the Tevatron experiments use to search for the Higgs particle, i.e. a Higgs decaying to a pair of b quarks, which has by far the largest probability of occurring in this mass range. Thus, if a Standard Model Higgs exists, the Tevatron experiments will see it.

If the Standard Model Higgs particle does not exist, Fermilab’s Tevatron experiments are on track to rule it out this winter. The CDF and DZero experiments have excluded the existence of a Higgs particle in the 100-108 and the 156-177 GeV/c2 mass ranges and will have sufficient analysis sensitivity this winter to rule out the mass region in between.

While today’s announcement shows the progress that the LHC experiments have made in the last few months, all eyes will be on the Tevatron and on the LHC in March 2012 to see what they have to say about the elusive Higgs Boson.

— Tona Kunz


The combined Tevatron results exclude the existence of a Higgs particle with a mass between 100-108 and 156-177 GeV/c2. For the range 110-155 GeV/c2, the experiments are now extremely close to the sensitivity needed (dotted line below 1) either to see a substantial excess of Higgs-like events or to rule out the existence of the particle. The small excess of Higgs-like events observed by the Tevatron experiments in the range from 120 to 155 (see solid curve) is not yet statistically significant.

Scientists of the CDF and DZero collaborations at Fermilab continue to increase the sensitivity of their Tevatron experiments to the Higgs particle and narrow the range in which the particle seems to be hiding. At the European Physical Society conference in Grenoble, Fermilab physicist Eric James reported today that together the CDF and DZero experiments now can exclude the existence of a Higgs particle in the 100-108 and the 156-177 GeV/c2 mass ranges, expanding exclusion ranges that the two experiments had reported in March 2011.

Last Friday, the ATLAS and CMS experiments at the European center for particle physics, CERN, reported their first exclusion regions. The two experiments exclude a Higgs particle with a mass of about 150 to 450 GeV/c2, confirming the Tevatron exclusion range and extending it to higher masses that are beyond the reach of the Tevatron. Even larger Higgs masses are excluded on theoretical grounds.

This leaves a narrow window for the Higgs particle, and the Tevatron experiments are on track to collect enough data by the end of September 2011 to close this window if the Higgs particle does not exist.

James reported that the Tevatron experiments are steadily becoming more sensitive to Higgs processes that the LHC experiments will not be able to measure for some time. In particular, the Tevatron experiments can look for the decay of a Higgs particle into a pair of bottom and anti-bottom quarks, which is the dominant, hard-to-detect decay mode of the Higgs particle. In contrast, the ATLAS and CMS experiments currently focus on the search for the decay of a Higgs particle into a pair of W bosons, which then decay into lighter particles.

This graph shows the improvement in the combined sensitivity of the CDF and DZero experiments to a Higgs signal over the last couple of years. When the sensitivity for a particular value of the Higgs mass, mH, drops below one, scientists expect the Tevatron experiments to be able to rule out a Higgs particle with that particular mass. By early 2012, the Tevatron experiments should be able to corroborate or rule out a Higgs particle with a mass between 100 and about 190 GeV/c2.

The LHC experiments reported at the EPS conference an excess of Higgs-like events in the 120-150 GeV/c2 mass region at about the 2-sigma level. The Tevatron experiments have seen a small, 1-sigma excess of Higgs-like events in this region for a couple of years. A 3-sigma level is considered evidence for a new result, but particle physicists prefer a 5-sigma level to claim a discovery. More data and better analyses are necessary to determine whether these excesses are due to a Higgs particle, some new phenomena or random data fluctuations.
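Those sigma levels correspond to one-sided Gaussian tail probabilities, which is the usual convention behind “evidence” (3 sigma) and “discovery” (5 sigma). A quick conversion, for reference only:

    import math

    # One-sided Gaussian tail probability for an n-sigma excess.
    def p_value(n_sigma):
        return 0.5 * math.erfc(n_sigma / math.sqrt(2))

    for n in (1, 2, 3, 5):
        print(f"{n} sigma -> p ~ {p_value(n):.2e}")
    # 3 sigma -> ~1.3e-03 ("evidence"); 5 sigma -> ~2.9e-07 ("discovery")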

In early July, before the announcement of the latest Tevatron and LHC results, a global analysis of particle physics data by the GFitter group indicated that, in the simplest Higgs model, the Higgs particle should have a mass between approximately 115 and 137 GeV/c2.

“To have confidence in having found the Higgs particle that theory predicts, you need to analyze the various ways it interacts with other particles,” said Giovanni Punzi, co-spokesperson of the CDF experiment. “If there really is a Higgs boson hiding in this region, you should be able to find its decay into a bottom-anti-bottom pair. Otherwise, the result could be a statistical fluctuation, or some different particle lurking in your data.”

The CDF and DZero experiments will continue to take data until the Tevatron shuts down at the end of September.

“The search for the Higgs particle in its bottom and anti-bottom quark decay mode really has been the strength of the Tevatron,” said Dmitri Denisov, DZero co-spokesperson.

“With the additional data and further improvements in our analysis tools, we expect to be sensitive to the Higgs particle for the entire mass range that has not yet been excluded. We should be able to exclude the Higgs particle or see first hints of its existence in early 2012.”

The details of the CDF and DZero analysis are described in this note, which will be posted later today, as well as submitted to the arXiv.

—Kurt Riesselmann
