
Posts Tagged ‘protons’

This post was written by Brookhaven Lab scientists Shigeki Misawa and Ofer Rind.

Run 13 at the Relativistic Heavy Ion Collider (RHIC) began one month ago today, and the first particles collided in the STAR and PHENIX detectors nearly two weeks ago. As of late this past Saturday evening, preparations are complete and polarized protons are colliding, with the machine and detectors operating in “physics mode,” which means gigabytes of data are pouring into the RHIC & ATLAS Computing Facility (RACF) every few seconds.

Today, we store data and provide the computing power for about 2,500 RHIC scientists here at Brookhaven Lab and institutions around the world. Approximately 30 people work at the RACF, which is located about one mile south of RHIC and connected to both the Physics and Information Technology Division buildings on site. There are four main parts to the RACF: computers that crunch the data, online storage containing data ready for further analysis, tape storage containing archived data from collisions past, and the network glue that holds it all together. Computing resources at the RACF are split about equally between the RHIC collaborations and the ATLAS experiment running at the Large Hadron Collider in Europe.

Shigeki Misawa (left) and Ofer Rind at the RHIC & ATLAS Computing Facility (RACF) at Brookhaven Lab

Where Does the Data Come From?

For RHIC, the data comes from heavy ions or polarized protons that smash into each other inside PHENIX and STAR. These detectors catch the subatomic particles that emerge from the collisions to capture information—particle species, trajectories, momenta, etc.—in the form of electrical signals. Most signals aren’t relevant to what physicists are looking for, so only the signals that trip predetermined triggers are recorded. For example, with the main focus for Run 13 being the proton’s “missing” spin, physicists are particularly interested in finding decay electrons from particles called W bosons, because these can be used as probes to quantify spin contributions from a proton’s antiquarks and different “flavors” of quarks.

Computers in the “counting houses” at STAR and PHENIX package the raw data collected from selected electrical signals and send it all to the RACF via dedicated fiber-optic cables. The RACF then archives the data and makes it available to experimenters running analysis jobs on any of our 20,000 computing cores.

Recent Upgrades at the RACF

Polarized protons are far smaller than heavy ions, so they produce considerably less data when they collide, but even so, when we talk about data at the RACF, we’re talking about a lot of data. During Run 12 last year, we began using a new tape library to increase storage capacity by 25 percent for a total of 40 petabytes—the equivalent of 655,360 of the largest iPhones available today. We also more than doubled our ability to archive data for STAR last year (in order to meet the needs of a data acquisition upgrade), so we can now sustain 700 megabytes of incoming data every second for both PHENIX and STAR. Part of this is due to new fiber-optic cables connecting the counting houses to the RACF, which provide both increased data rates and redundancy.
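That iPhone comparison checks out if you assume the 64 GB model (the largest available at the time) and binary units; a quick sanity check:

```python
# Sanity check of the tape-library comparison above, assuming the
# "largest iPhone" held 64 GB and that both capacities use binary units.
TAPE_CAPACITY_BYTES = 40 * 2**50        # 40 pebibytes
IPHONE_CAPACITY_BYTES = 64 * 2**30      # 64 gibibytes

iphones = TAPE_CAPACITY_BYTES // IPHONE_CAPACITY_BYTES
print(iphones)  # 655360, matching the figure quoted above
```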

With all this in place, along with those 20,000 processing cores (most computers today have two or four cores), certain operations that used to require six months of computer time can now often be completed in less than one week.

Looking Ahead

If pending budgets allow for the full 15-week run planned, we expect to collect approximately four petabytes of data from this run alone. During the run, we meet formally with liaisons from the PHENIX and STAR collaborations each week to discuss the amount of data expected in the coming weeks and to assess their operational needs. Beyond these meetings, we are in continual communication with our users, as we monitor and improve system functionality, troubleshoot, and provide first-line user support.

We’ll also continue to work with experimenters to evaluate computing trends, plan for future upgrades, and test the latest equipment—all in an effort to minimize bottlenecks that slow the data from getting to users and to get the most bang for the buck.

— Shigeki Misawa – Group Leader, RACF Mass Storage and General Services

— Ofer Rind – Technology Architect, RACF Storage Management


Paper vs. Protons (Pt. 2)

Tuesday, August 9th, 2011

Yup, it’s still summer conference season here in the Wonderful World of Physics. My fellow QD bloggers rocked at covering the European Physics Society meeting back in July, so check it out. Aside from the summer conferences, it is also summer school season for plenty of people (like me!). To clarify, I am not talking about repeating a class during the summer. Actually, it is quite the opposite: these are classes that are offered at most once a year and are taught in different countries, depending on the year.

To give you context, graduate students normally run out of courses to take in the second or third year of our PhD program; and although the purpose of a PhD is to learn how to conduct research, there will always be an information gap between our courses and our research. There is nothing wrong with that, but sometimes that learning curve is pretty big. In order to alleviate this unavoidable issue, university professors often will teach a one-time-only “topics” course on their research to an audience of three or four students during the regular academic year. Obviously, this is not always sustainable for departments, large or small, because of the fixed costs required to teach a course. The solution? Split the cost by inviting a hundred or so students from around the world to a university and cramming an entire term’s worth of information into a 1- to 4-week lecture series, which, by the way, is taught by expert faculty from everywhere else in the world. 🙂

To be honest, it is like learning all about black holes & dark matter from the people who coined the names “black holes” & “dark matter.” So not only do graduate students get to learn about the latest & greatest from the people who discovered the latest & greatest, but we also get to hear all the anecdotal triumphs and setbacks that led to the discoveries.

Fig. 1: Wisconsin’s state capitol in Madison, Wi., taken from one of the bike paths
that wrap around the city’s many lakes. (Photo: Mine)

This brings us to the point of my post. Back in July, I had the great opportunity to attend the 2011 CTEQ Summer School in Madison, Wi., where for 10 days we talked about this equation:
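(The equation itself was an image in the original post and has not survived here. From the three parts described below — a partonic probability, pdf factors used four times, and a sum over energy sharings — it is presumably the standard factorization formula for this process, reconstructed as:)

```latex
\sigma\!\left(pp \to e^{+}e^{-}\right)
  = \sum_{q} \int_{0}^{1}\!\mathrm{d}x_{1} \int_{0}^{1}\!\mathrm{d}x_{2}\,
    \left[ f_{q}(x_{1})\, f_{\bar{q}}(x_{2})
         + f_{\bar{q}}(x_{1})\, f_{q}(x_{2}) \right]
    \hat{\sigma}\!\left(q\bar{q} \to \gamma \to e^{+}e^{-}\right)
```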

Now, this is not just any ordinary equation; it is arguably the most important equation for any physicist working at the Large Hadron Collider, the Tevatron, or any of the other half-dozen atom smashers on this planet. In fact, this equation is precisely what inspired the name Paper vs. Protons.

Since quantum physics is inherently statistical, most calculations result in computing probabilities of things happening. The formula above allows you to compute the probability of what happens when you collide protons, something experimentalists can measure, by simply calculating the probability of something happening when you collide quarks, something undergraduates can do! Physicists love quarks very much because they are elementary particles and are not made of anything smaller, at least that is what we think. Protons are these messy balls of quarks, gluons, photons, virtual particles, elephant-anti-elephant pairs, and are just horrible. Those researchers studying the proton’s structure with something called “lattice QCD” have the eternal gratitude of physicists like me, who only deal with quarks and their kookiness.

Despite being so important, the equation only has three parts, which are pretty straightforward. The first part is the tail end of the second line:
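(The image of this term is missing from this copy; from the description it is presumably the partonic cross section, reconstructed here:)

```latex
\hat{\sigma}\!\left(q\bar{q} \to \gamma \to e^{+}e^{-}\right)
```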

which is just the probability of this happening:

Fig. 2: Feynman diagram representing the qq-bar → γ → e+e- process.

If you read Paper vs. Protons (Pt. 1) you might recognize it. This Feynman diagram represents a quark (q) & an antiquark (q with a bar over it) combining into a photon (that squiggly line in the center), which then decays into an electron (e-) & its antimatter partner, the positron (e+). Believe it or not, the probability of this “qq-bar → γ → e+e-” process happening (or cross section as we call it) is something that advanced college students and lower level graduate students learn to calculate in a standard particle physics course. Trust me when I say that every particle physicist has calculated it, or at the very least a slight variation that involves muons. By coincidence, I actually calculated it (for the nth time) yesterday.
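As a rough illustration (my own sketch, not the course calculation itself), the leading-order photon-exchange formula for that cross section fits in a few lines of Python; the function name and the 90 GeV example point are my own choices:

```python
import math

ALPHA = 1 / 137.035999    # fine-structure constant
GEV2_TO_PB = 3.893794e8   # unit conversion: 1 GeV^-2 = 3.894e8 picobarns

def drell_yan_xsec(e_q: float, s_hat: float) -> float:
    """Leading-order cross section for q qbar -> gamma -> e+ e-,
    in picobarns. e_q is the quark's electric charge in units of the
    proton charge; s_hat is the squared partonic collision energy in
    GeV^2. The factor 1/3 averages over the initial quark colors."""
    sigma = (4 * math.pi * ALPHA**2) / (3 * s_hat) * e_q**2 * (1 / 3)
    return sigma * GEV2_TO_PB

# An up quark and anti-up quark colliding at 90 GeV (s_hat = 8100 GeV^2):
print(round(drell_yan_xsec(2 / 3, 90.0**2), 2))  # ~1.59 pb
```

Note how the probability falls as 1/s_hat: the harder the quarks collide, the rarer this particular outcome becomes.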

Okay, time for the second part of the equation. To help explain it, I am using a great image (below) from the LHC experiment ALICE. So you & I know that all matter is made from atoms (left). Atoms, in turn, consist of a nucleus of protons & neutrons (center) that are being orbited by electrons (white dots, left). A proton (right) is made up of three quarks (three fuzzy, white dots, right) that bathe in a sea of gluons (red-blue-green fuzziness, right). About 45% of a proton’s energy at the LHC is shared by the three quarks; the remaining 55% of the proton’s energy is shared by the gluons.

Fig. 3: An atom (left), an atom’s nucleus (center), and a free proton (right). (Image: ALICE Expt)

How do we know those numbers? Easy, with something called a “parton distribution function”, or p.d.f. for short! A p.d.f. gives us back the probability of finding, for example, a quark in a proton with 15% of the proton’s energy. Since we want to know the probability of finding a quark (q) in the first proton (with momentum fraction x1) and the probability of finding an anti-quark (q with a bar over its head) in the second proton (with momentum fraction x2), we need to use our p.d.f. (which we will call “f”) twice. Additionally, since the quark and anti-quark can come from either of the two protons, we need to use “f” a total of four times. Part 2 of our wonderful equation encapsulates the entire likelihood of finding the quarks we want to smash together:
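(This image is also missing; from the “four uses of f” just described, the term is presumably:)

```latex
f_{q}(x_{1})\, f_{\bar{q}}(x_{2}) + f_{\bar{q}}(x_{1})\, f_{q}(x_{2})
```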

Now the third (and final!) part is simple to understand, because all it tells us to do is add: add together all the different ways a quark can share a proton’s energy. For example, a quark could have 5% or 55% of a proton’s energy, and even though either case might be unlikely to happen, we still add together the probability of each situation happening. This is the third part of our wonderful equation:
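(Again the image did not survive; the sum over quark flavors and energy fractions is presumably:)

```latex
\sum_{q} \int_{0}^{1}\!\mathrm{d}x_{1} \int_{0}^{1}\!\mathrm{d}x_{2}
```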

Putting everything together, we find that the probability of producing an electron (e-) and a positron (e+) when smashing together two protons is actually just the sum (part 3) of all the different ways (part 2) two quarks can produce an e+e- pair (part 1). Hopefully that made sense.
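To make the three parts concrete, here is a toy numerical version of the whole recipe (my own sketch: the pdf shape, the partonic probability, and the 60 GeV mass cut are invented stand-ins for illustration, not real physics inputs):

```python
def toy_pdf(x: float) -> float:
    """Invented stand-in for a real parton distribution function ("f"
    in the text): large at small x, vanishing as x -> 1. Real analyses
    use fitted sets (CTEQ, etc.), not this."""
    return x**-0.5 * (1 - x) ** 3

def parton_xsec(s_hat: float) -> float:
    """Toy stand-in for part 1, the qq-bar -> e+e- probability,
    falling with the squared collision energy s_hat (arbitrary units)."""
    return 1.0 / s_hat

def hadronic_xsec(s: float, m_min: float = 60.0, n: int = 400) -> float:
    """Part 3: add up (integrate over) every way the two protons can
    share their energy, weighting each split by the pdf factors
    (part 2) times the partonic probability (part 1)."""
    total, dx = 0.0, 1.0 / n
    for i in range(n):
        x1 = (i + 0.5) * dx          # midpoint rule avoids x = 0 and 1
        for j in range(n):
            x2 = (j + 0.5) * dx
            s_hat = x1 * x2 * s
            if s_hat < m_min**2:     # only count pairs above a minimum
                continue             # mass, as real measurements do
            # quark from proton 1 with antiquark from proton 2, plus
            # the swapped assignment: the four pdf factors from the text
            flux = toy_pdf(x1) * toy_pdf(x2) + toy_pdf(x2) * toy_pdf(x1)
            total += flux * parton_xsec(s_hat) * dx * dx
    return total

# Toy proton-proton collision at 8 TeV (s = 8000^2 GeV^2):
print(hadronic_xsec(8000.0**2))
```

The number it prints is in arbitrary units, but the structure is the point: sum over energy sharings, weight by pdfs, multiply by the quark-level probability.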

It gets better, though. When we plug our values into the formula, we get a number. This number is literally what we try to measure at the Large Hadron Collider; this is how we discover new physics! If theory “A” predicts a number and we measure a number that is way different, beyond any statistical uncertainty, it means that theory “A” is wrong. This is the infamous Battle of Paper vs Protons. To date, paper and protons agree with one another. However, at the end of this year, when the LHC shuts down for routine winter maintenance, we will have enough data to know definitively if the paper predictions for the Higgs boson match what the protons say. Do you see why I think this equation is so important now? This equation is how we determine whether or not we have discovered new physics. :p

Happy Colliding.

– richard (@bravelittlemuon)

PS. If you will be at the PreSUSY Summer School at the end of August, be sure to say hi.
