
Posts Tagged ‘data’

The Large Hadron Collider (LHC) at CERN has already delivered more high energy data than it did in all of 2015. To put this in numbers, the LHC has produced 4.8 fb-1, compared to 4.2 fb-1 last year, where fb-1 stands for one inverse femtobarn, the unit used to evaluate the data sample size. This was achieved in just one and a half months, compared to five months of operation last year.

With this data in hand, and 20-30 fb-1 projected by November, both the ATLAS and CMS experiments can now explore new territories and, among other things, cross-check the intriguing events they reported having found at the end of 2015. If this particular effect is confirmed, it would reveal the presence of a new particle with a mass of 750 GeV, six times the mass of the Higgs boson. Unfortunately, there was not enough data in 2015 to get a clear answer. The LHC had a slow restart last year following two years of major improvements to raise its energy reach, but if the current performance continues, the discovery potential will increase tremendously. All this to say that everyone is keeping their fingers crossed.

If any new particle were found, it would open the door to bright new horizons in particle physics. Unlike the discovery of the Higgs boson in 2012, if the LHC experiments discover an anomaly or a new particle, it would bring a new understanding of the basic constituents of matter and how they interact. The Higgs boson was the last missing piece of the current theoretical model, called the Standard Model, a model that can no longer accommodate new particles. It has been known for decades that this model is flawed, but so far theorists have been unable to predict which theory should replace it, and experimentalists have failed to find the slightest concrete sign of a broader theory. We need new experimental evidence to move forward.

Although the new data is already being reconstructed and calibrated, it will remain “blinded” until a few days before August 3, the opening date of the International Conference on High Energy Physics. This means that until then, the region where this new particle could be remains masked, to prevent biasing the data reconstruction process. The same selection criteria that were used for last year’s data will then be applied to the new data. If a similar excess is still observed at 750 GeV in the 2016 data, the presence of a new particle will be beyond doubt.
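For readers curious what “blinded” means in practice, here is a minimal sketch of the idea, with every name and number invented for illustration (this is not the experiments’ actual code): events whose reconstructed mass falls inside the sensitive window are simply hidden from view until the selection is frozen.

    import numpy as np

    # Toy illustration of blinding a signal region. The window and the toy
    # spectrum are made up; real analyses blind inside their own frameworks.
    rng = np.random.default_rng(seed=1)
    masses = 200.0 + rng.exponential(scale=300.0, size=100_000)  # toy mass spectrum in GeV

    BLIND_LO, BLIND_HI = 700.0, 800.0  # window bracketing the 750 GeV region

    def blinded(sample):
        """Return the sample with the signal window removed."""
        return sample[(sample < BLIND_LO) | (sample > BLIND_HI)]

    visible = blinded(masses)
    print(f"{masses.size - visible.size} events hidden until unblinding")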

Even if this particular excess turns out to be just a statistical fluctuation, the bane of physicists’ existence, there will still be enough data to explore a wealth of possibilities. Meanwhile, you can follow the LHC activities live or watch CMS and ATLAS data samples grow. I will not be available to report on the news from the conference in August due to hiking duties, but if anything new is announced, even I expect to hear its echo reverberating in the Alps.

Pauline Gagnon

To find out more about particle physics, check out my book « Who Cares about Particle Physics: making sense of the Higgs boson, the Large Hadron Collider and CERN », which can already be ordered from Oxford University Press. In bookstores after 21 July. Easy to read: I understood everything!

[Figure: CMS-lumi-17juin] The total amount of data delivered to the experiments by the LHC in 2016 at an energy of 13 TeV (blue) and recorded by CMS (yellow), as of 17 June. One fb-1 of data is equivalent to 1000 pb-1.


Since April, the Large Hadron Collider (LHC) at CERN has already produced more high-energy data than in all of 2015. To put numbers on it, the LHC delivered 4.8 fb-1 in 2016, compared with 4.2 fb-1 last year. The symbol fb-1 stands for one inverse femtobarn, the unit used to evaluate the size of data samples. All this in barely a month and a half, instead of the five months required in 2015.

With these data in hand, and the 20-30 fb-1 projected by November, the ATLAS and CMS experiments can already push back the limits of the known and, among other things, check whether the strange events reported at the end of 2015 are still being observed. If this effect were confirmed, it would reveal the presence of a new particle with a mass of 750 GeV, six times heavier than the Higgs boson. Unfortunately, in 2015 there were not enough data to get a clear answer. After two years of major work aimed at increasing its energy reach, the LHC resumed operations last year, but at reduced intensity. If its current performance holds, the chances of making new discoveries will be vastly increased. So everyone is keeping their fingers crossed.

Any new particle would open the door to new horizons in particle physics. Unlike the discovery of the Higgs boson in 2012, if the LHC experiments reveal an anomaly or the existence of a new particle, it would change our understanding of the basic constituents of matter and of the forces that govern them. The Higgs boson was the missing piece of the Standard Model, the current theoretical model. This model can no longer accommodate new particles. Yet it has been known for decades that this model is limited, although to date theorists have been unable to predict which theory should replace it, and experimentalists have failed to find the slightest sign revealing this new theory. Experimental evidence is therefore absolutely necessary to move forward.

Although the new data are already being reconstructed and calibrated, they will remain “blinded” until a few days before 3 August, the opening date of the main physics conference of the summer. Until then, the region where the new particle could lie is masked so as not to bias the data reconstruction process. At the last minute, the same selection criteria as those used last year will be applied to the new data. If these events are still observed at 750 GeV in the 2016 data, the presence of a new particle will no longer be in any doubt.

But even if this turned out to be a mere statistical fluctuation, which by its very nature happens often in physics, the amount of data accumulated will allow a host of other possibilities to be explored. In the meantime, you can follow the LHC’s activities live or watch the CMS and ATLAS data samples grow. Unfortunately I will not be able to report on what is presented at the conference in August, mountain hiking obliges, but if any discovery is announced, even I expect to hear its echo reverberating in the Alps.

Pauline Gagnon

To learn more about particle physics, don’t miss my book « Qu’est-ce que le boson de Higgs mange en hiver et autres détails essentiels », available in bookstores in Quebec and Europe, as well as from Éditions MultiMondes. Easy to read: I understood everything!

[Figure: CMS-lumi-17juin] Cumulative plot showing the amount of data produced at 13 TeV by the LHC in 2016 (blue) and collected by the CMS experiment (yellow), as of 17 June.


Ready-Set-Go: The LHC 2012 Schedule

Thursday, September 20th, 2012

From Now Until Mid-December, Expect One Thing from the LHC: More Collisions.

Figure 1: Integrated luminosity for LHC Experiments versus time. 8 TeV proton-proton collisions began in April 2012. Credit: CERN

 

Hi All,

Quick post today. That plot above represents the amount of 8 TeV data collected by the LHC experiments. As of this month, the ATLAS and CMS detector experiments have each collected 15 fb-1 of data. A single fb-1 (pronounced: inverse femto-barn) is equivalent to 70 trillion proton-proton collisions. In other words, ATLAS and CMS have each observed 1,050,000,000,000,000 proton-proton collisions. That is 1.05 thousand-trillion, or 1.05×10^15.
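As a sanity check, that conversion is just the number of collisions N = σ × ∫L dt. A quick sketch, assuming an inelastic proton-proton cross section of roughly 70 mb (my round number, not taken from the post):

    # N = cross section x integrated luminosity, with unit bookkeeping.
    MB_PER_B = 1e-3          # 1 mb = 1e-3 barn
    B_INV_PER_FB_INV = 1e15  # 1 fb^-1 = 1e15 b^-1

    sigma_b = 70 * MB_PER_B              # ~70 mb inelastic cross section (assumed)
    lumi_b_inv = 15 * B_INV_PER_FB_INV   # 15 fb^-1 recorded per experiment

    print(f"{sigma_b * lumi_b_inv:.3g} collisions")  # ~1.05e+15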

To understand how gargantuan a number this is, consider that it took the LHC’s predecessor, the Tevatron, 24 years to deliver 12 fb-1 of proton-antiproton collisions*. The LHC has collected this much data in five months. Furthermore, proton-proton collisions will officially continue until at least December 16th, at which time CERN will shut off the collider for the holiday season. Near the beginning of the calendar year, we can expect the LHC to collide lead ions for a while before the long, two-year shutdown. During this time, the LHC magnets will be upgraded to allow protons to run at 13 or 14 TeV, and the detector experiments will get some much-needed tender loving care: maintenance and upgrades.

To estimate how much more data we might get before the New Year, let’s assume that the LHC will deliver 0.150 fb-1 per day from now until December 16th. I consider this a conservative estimate; see the LHC’s Performance and Statistics page. I also assume that the experiments operate at 100% efficiency (not so conservative, but good enough). Running 7 days a week puts us at a little over 1 fb-1 per week. According to the LHC schedule, there are about 10 more weeks of running (12 weeks until Dec. 16 minus 2 weeks for “machine development”).
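Spelled out as a few lines of arithmetic (same assumptions as above, nothing more):

    fb_per_day = 0.150     # assumed steady delivery rate
    weeks_left = 10        # 12 weeks to Dec 16 minus 2 for machine development
    recorded = 15.0        # fb^-1 already on tape per experiment

    projected = recorded + fb_per_day * 7 * weeks_left
    print(f"~{projected:.1f} fb^-1 by shutdown")  # ~25.5 fb^-1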

By this estimate, both ATLAS and CMS will have at least 25 fb-1 of data each before shutdown!

25 fb-1 translates to 1.75 thousand-trillion proton-proton collisions, more than four times as much 8 TeV data as was used to discover the Higgs boson in July**.

Fellow QDer Ken Bloom has a terrific breakdown of what all this extra data means for studying physics. Up-to-the-minute updates about the LHC’s performance are available via the LHC Programme Coordination page, @LHCstatus, and @LHCmode. There are no ongoing collisions at the moment because the LHC is currently in a technical stop (beam recommissioning, machine development, scrubbing), but things will be back to normal next week.

 

Happy Colliding

– richard (@bravelittlemuon)

 

* 10 fb-1 were recorded by each of CDF and DZero, but to be fair, it also took Fermilab about 100 million protons to make 20 or so antiprotons.

** The Higgs boson discovery used 5 fb-1 of 7 TeV data and 5.5 fb-1 of 8 TeV data


Physicists did a lot of planning for data analysis before the LHC ever ran, and we’ve put together a huge number of analyses since it started. We’ve already looked for most of the things we’ll ever look for. Of course, many of the things we’ve been looking for haven’t shown up yet; in fact, in many cases, including the Higgs, we didn’t expect them to show up yet! We’ll have to repeat the analysis on more data. But that’s got to be easier than it was to collect and analyze the data the first time, right? Well, not necessarily. We always hope it will be easier the second or third time around, but the truth is that updating an analysis is a lot more complicated than just putting more numbers into a spreadsheet.

For starters, every batch of new data is collected under different conditions. For example, going from 2011 to 2012, the LHC beam energy is increasing. The number of collisions per crossing will be larger too, which means the triggers we use to collect our data are changing as well. All our calculations of what the pileup on top of each interesting collision looks like will change. Some of our detectors might work better as we fix glitches, or they might work worse as they are damaged in the course of running. All these details affect the calculations for the analysis and the optimal way to put the data together.

But even if we were running under completely stable conditions, there are other reasons an analysis has to be updated as you collect more data. When you have more events to look at, you might want to limit the events you look at to those you understand best. (In other words, if an analysis was previously limited by statistical uncertainties, then as those shrink, you want to get rid of your largest systematic uncertainties.) To get all the power out of the new data you’ve got, you might have to study new classes of events, or get a better understanding of questions where your understanding was previously “good enough.”
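The logic in that parenthetical can be made concrete. In a simple counting measurement the statistical uncertainty shrinks like 1/√N while a systematic uncertainty stays put, so past some sample size the systematics dominate. A toy sketch with invented numbers:

    from math import sqrt

    def total_uncertainty(n_events, syst_frac):
        """Fractional stat error (1/sqrt(N)) and a fixed systematic, in quadrature."""
        return sqrt(1.0 / n_events + syst_frac**2)

    for n in (100, 10_000, 1_000_000):
        print(n, round(total_uncertainty(n, syst_frac=0.05), 3))
    # 100 events: stat-dominated (~11% total); 1M events: the fixed 5% is everything.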

So analyzing LHC data is really an iterative process. Collecting more data always presents new challenges and new opportunities that require understanding things better than before. No analysis is ever the same twice.


Update: Section added to include Lepton-Photon 2011 results on Higgs boson exclusion (01 Sept 2011)

Expect bold claims at this week’s SUSY 2011 (#SUSY11 on Twitter, maybe) conference at Fermilab, in Batavia, Illinois. No, I do not have any secret information about some analysis that undoubtedly proves Supersymmetry’s existence, though it would be pretty cool if such an analysis did exist. I say this because I have just come back from a short summer school/pre-conference that gave a very thorough introduction to the mathematical framework behind a theory supposing that there exists a new and very powerful relationship between the particles that make up matter, like electrons & quarks (fermions), and the particles that mediate the forces in our universe, like photons & gluons (bosons). This theory is called “Supersymmetry”, or “SUSY” for short, and it might explain many of the shortcomings of our current description of how Nature works.

At this summer school, appropriately called PreSUSY 2011, we were also shown the amount of data that the Large Hadron Collider is expected to collect by the end of this year and by the end of 2012. This is where the game changer appeared. Back in June 2011, CERN announced that it had collected 1 fb-1 (1 inverse femtobarn) worth of data – the equivalent of 70,000 billion proton-proton collisions – a whole six months ahead of schedule. Yes, the Large Hadron Collider generated a year’s worth of data in half a year’s time. What is more impressive is that the ATLAS and CMS experiments may each end up collecting upwards of 5 fb-1 before the end of this year, a benchmark that many people called a “highly optimistic goal” for 2012. I cannot emphasize enough how crazy & surreal it is to be seriously discussing the possibility of having 10 fb-1, or even 15 fb-1, by the end of 2012.

Figure 1: Up-to-date record of the total number of proton collisions delivered to each of the Large Hadron Collider detector experiments. (Image: CERN)

What this means is that by the end of this year, not next year, we will definitely know whether or not the higgs boson, as predicted by the Standard Model, exists. It also means that by next year, experimentalists will be able to rule out the most basic versions of Supersymmetry, which were already disfavored by previous, high-precision measurements of known (electroweak) physics. Were we to find Supersymmetry at the LHC now, and not when the LHC reaches its design specifications, expected in 2014, then many physicists would be at a loss trying to reconcile why one set of measurements rules out SUSY while another set of measurements supports its existence.

What we can expect this week, aside from the usual higgs boson and SUSY exclusion plots, is a set of updated predictions as to where we expect to be this time next year. Now that the LHC has given us more data than we had anticipated, we can truly explore the unknown, so trust me when I say that the death of SUSY has been greatly exaggerated.

More on Higgs Boson Exclusion (Added 01 Sept 2011)

This morning a new BBC article came out on the possibility of the higgs being found by Christmas. So why not add some plots, shown at August’s Lepton-Photon 2011 Conference, that show this? These plots were taken from Vivek Sharma’s Higgs Searches at CMS talk.

If there is no Standard Model higgs boson, then the Compact Muon Solenoid detector, one of the two general-purpose LHC detectors, should be able to exclude the boson singlehandedly at the 95% confidence level. ATLAS, the second of the two general-purpose detectors, is similarly capable of such an exclusion.

Figure A: The CMS Collaboration’s projected sensitivity to excluding the higgs boson with 5 fb-1 at √s = 7 TeV; the black line gives the combined (total) sensitivity.

Things get less clear if there is a higgs boson, because physical & statistical fluctuations add to our uncertainty. If CMS does collect 5 fb-1 before the winter shutdown, then it is capable of claiming at least 3σ (three-sigma) evidence for a higgs boson with a mass anywhere between mH ≈ 120 GeV/c² and mH ≈ 550 GeV/c². For a number of (statistical/systematic) reasons, the range might shrink or expand with 5 fb-1 worth of data, but only by a few GeV/c². In statistics, “σ” (sigma) is the Greek letter that represents a standard deviation; a “3σ result” implies that there is only a 0.3% chance of its being a fluke. The threshold for discovery is set at 5σ, or a 0.00006% chance of being a random fluke.
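Those two percentages follow directly from the tails of a normal distribution. A quick check, using the two-sided convention (scipy assumed available):

    from scipy.stats import norm

    for n_sigma in (3, 5):
        p = 2 * norm.sf(n_sigma)  # sf(x) = 1 - CDF(x), the upper tail
        print(f"{n_sigma} sigma: p = {p:.1e} ({100 * p:.5f}%)")
    # 3 sigma: ~2.7e-3 (0.27%); 5 sigma: ~5.7e-7 (0.00006%)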

Figure B: The CMS Collaboration’s projected sensitivity to discovering the higgs boson with 1 (black), 2 (brown?), 5 (blue), and 10 (pink) fb-1 at √s = 7 TeV.

At the low-mass end, though, the CMS detector by itself is no longer sensitive. By combining their results, however, a joint ATLAS-CMS analysis can deliver the full 3σ discovery reach, and a 5σ one down to 128 GeV/c². The 114 GeV/c² benchmark that physicists like to throw around is the lower bound on the higgs boson mass set by CERN’s LEP Collider, which shut down in 2000 to make room for the LHC.

Figure C: The projected sensitivity of a joint ATLAS-CMS analysis for SM higgs exclusion & discovery for various benchmark data sets.

However, there are two caveats in all of this. The smaller one is that these results depend on another 2.5 fb-1 being delivered before the upcoming winter shutdown; if there are any more major halts in data collection, then the mark will be missed. The second, and more serious, caveat is that this whole time I have been talking about the Standard Model higgs boson, which comes with a pretty rigid set of assumptions. If there is new physics, then all these discovery/exclusion bets are off. 🙂

Nature’s Little Secrets

On my way to PreSUSY, a good colleague of mine & I decided to stop by Fermilab to visit a friend and explore the little secret nooks that make Fermilab, in my opinion, one of the most beautiful places in the world (keep in mind, I really love the Musée d’Orsay). What makes Fermilab such a gorgeous place is that it doubles as a federally sanctioned nature preserve! From bison to butterflies, the lab protects endangered or near-endangered habitats while simultaneously reaching back to the dawn of the Universe. Here is a little photographic tour of some of Nature’s best-kept secrets. All the photos can be enlarged by clicking on them. Enjoy!

Figure 2: The main entrance to the Enrico Fermi National Accelerator Laboratory, U.S. Dept. of Energy laboratory designation FNAL, nicknamed Fermilab. The three-way arch that does not connect evenly at the top is called Broken Symmetry and appropriately represents a huge triumph of Theoretical (Solid State & High Energy) Physics: Spontaneous Symmetry Breaking. Wilson Hall, nicknamed “The High-Rise,” can be seen in the background. (Image: Mine)

Figure 3: Wilson Hall, named after FNAL’s first director and Manhattan Project Scientist Robert Wilson, is where half of Fermilab’s magic happens. Aside from housing all the theorists & being attached to the Tevatron Control Room, it also houses a second control room for the CMS Detector called the Remote Operations Center. Yes, the CMS Detector can be fully controlled from Fermilab. The photo was taken from the center of the Tevatron ring. (Image: Mine)

Figure 4: A wetlands preserve located at the center of the Tevatron accelerator ring. The preservation effort has been so successful at restoring local fish populations that people with an Illinois fishing license (see FAQ) are actually allowed to fish. From what I have been told, the fish are exceptionally delicious the closer you get to the Main Ring. I wonder if it has anything to do with all that background neutrino rad… never mind. 🙂
Disclaimer: The previous line was a joke; the radiation levels at Fermilab are well within safety limits! (Image: Mine)

Figure 5: The Feynman Computing Center (left) and BZero (right), a.k.a. the CDF Detector Collision Hall. The Computing Center, named after the late Prof. Richard Feynman, cannot justly be compared to any other data center, except maybe CERN’s computing center. Really, there is so much experimental computing research, so much custom-built electronics, and such huge processing power that there are no benchmarks that allow it to be compared. Places like Fermilab and CERN set the benchmarks. The Collider Detector at Fermilab, or CDF for short, is one of two general-purpose detectors at Fermilab that collect and analyze the decay products of proton & anti-proton collisions. Magic really does happen in that collision hall. (Image: Mine)

Figure 6: The DZero Detector Collision Hall (blue building, back), Tevatron cooling river (center), and collision hall access road (foreground). Like CDF (Figure 5), DZero is one of two general-purpose detectors at Fermilab that collect and analyze the decay products of proton & anti-proton collisions. There is no question that the Tevatron generates a lot of heat. It was determined long ago that, by taking advantage of the area’s annual rainfall and temperature, the operating costs of running the collider could be drastically cut by using a naturally replenishable source of water to cool the collider. If there were ever a reason to invest in a renewable energy source, this would be it. The access road doubles as a running/biking track for employees and site visitors. If you run, one question often asked by other scientists is whether you are a proton or an anti-proton: the anti-protons travel clockwise in the Main Ring, hence you are called an anti-proton if you bike/run with the anti-protons; the protons travel counter-clockwise. FYI: I am an anti-proton. (Image: Mine)

Figure 7: The Barn (red barn, right) and American bison pen (fence, foreground). Fermilab was built on prairie land and so I find it every bit appropriate that the laboratory does all it can to preserve an important part of America’s history, i.e., forging the Great American Frontier. Such a legacy of expanding to the unknown drives Fermilab’s mantra of being an “Ongoing Pioneer of Exploring the Frontier of Discovery.” (Image: Mine)

Figure 8: American bison (Bison bison) in the far background (click to enlarge). At the time of the photo, a few calves had just recently been born. (Image: Mine)

 

Happy Colliding.

 

– richard (@bravelittlemuon)

 

 


During my brief time participating in the wide world of High Energy Physics (HEP) I have learned many, many things.   But above all, if there is one thing I’ve come to understand, it’s that there will never be enough:

 

Coffee

While some people may concern themselves with blood alcohol content, I spend my time thinking about blood caffeine content. I’ve become thoroughly addicted as a grad student, and without my daily (or sometimes hourly) “fix,” I doubt I would get anything done.

But caffeine isn’t just my own vice (or at least that’s the addict in me talking); I’ve come to think it’s a necessary evil within all fields of research. As an example, there are not one, not two, but four coffee pots on my floor of the Physics & Chemistry building, and I’m not even counting the chemistry side (or those that may be found in offices).

The coffee pot that I contribute to is filled twice a day (at least). We go through several containers of half & half every week, along with a tub of, say, Maxwell House coffee. We rely on everyone to contribute to keep this stream of liquid productivity flowing.

My own coffee mug has come to be known as “The Soup Bowl” among the grad students & professors on my floor. I maintain that it is a coffee mug; however, I’ve been fighting a losing battle ever since the start of last spring semester. But whether it’s a mug for drinking coffee or a bowl for holding chicken noodle soup, I would get a whole lot less done in a day without this beautiful piece of ceramic:

 

My coffee mug, compared with a "normal" coffee mug

 

And even though this mug holds a gigantic amount of coffee, I’ve come to think that it’s never enough.

 

Hours in a Day

While I need coffee to get through the hours of my day, I just really wish there were more of them.

My day begins between 8 and 10 am (usually depending on when I got home the night before); I usually end up having to work until as late as 8-9 pm (or sometimes even midnight) to accomplish what I need to for the day. I spend my time corresponding with other physicists via email, attending meetings, reading papers, and computer programming. It’s a lot of work, but I enjoy what I do. However, I am of the opinion that the sunrise and sunset should be a bit farther apart.

 

"Zed, don't you guys ever get any sleep around here?" - Jay, "The twins keep us on Centaurian time, standard thirty-seven hour day. Give it a few months. You'll get used to it... or you'll have a psychotic episode." -Zed (Men In Black, 1997)

 

Personnel

It’s been my experience that every single analysis in CMS can always benefit from more people becoming involved.

To give you an idea of what tasks are involved in an analysis, here’s a generic outline most conform to:

  1. Define experimental techniques
  2. Perform measurements
  3. Determine backgrounds
  4. Analyze experimental/theoretical uncertainties
  5. Obtain approval (each of the LHC’s Collaborations undergoes an internal peer-review process before submitting for publication in an external peer-reviewed journal).

 

These tasks take time, and above all, they need warm bodies (who sometimes have more in common with Zombies, sans coffee that is).

But HEP is a collaborative science. Within a given experiment (such as CMS or ATLAS) we all work together to make sure research is conducted precisely and promptly. Each individual within the CMS Collaboration is usually juggling a series of different analyses, and the time they invest in each of these varies. However, each researcher usually has one “pet project” which occupies the majority of their time.

But needless to say, HEP is a massive undertaking, and it seems like there are never enough Physicists/Grad Students involved.

 

Data

What’s the difference between one inverse femtobarn (fb-1) of data and, say, ten, or a hundred? Only a series of discoveries that will forever change our understanding of the universe. You know, nothing major.

Humor aside, the experiments at the LHC have collected over 1 fb-1 of data this past year. And there have been several times when we collected more data in a day than we did in all of 2010 (which I find astounding):

 

Integrated luminosity delivered to/recorded by the CMS Detector per day. Note the 2010 data set consisted of only ~43.3 pb^-1. (Image Courtesy of the CMS Collaboration)

Total integrated luminosity delivered to/recorded by the CMS Detector in 2011. (Image Courtesy of the CMS Collaboration)

 

 

But what’s the big deal? Well, one of the rules of thumb in particle physics says: to have a discovery, you need a statistical significance of five sigma over your current theory/background. Simply put, the chance that your discovery is a statistical fluke must be less than about 0.00006%.

While this may seem a bit ad hoc, it is actually necessary.  Three sigma effects come and go in particle physics.
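One way to see why, with an invented trials count: if you inspect many independent mass bins or analyses, the expected number of flukes is the per-look probability times the number of looks, and at three sigma that number is not small.

    from scipy.stats import norm

    n_looks = 1000               # hypothetical independent places to look
    for n_sigma in (3, 5):
        p = 2 * norm.sf(n_sigma)  # two-sided tail probability
        print(f"{n_sigma} sigma: expect {n_looks * p:.4f} flukes per {n_looks} looks")
    # ~2.7 three-sigma flukes, but only ~0.0006 five-sigma flukes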

But because of this stringent requirement we are always asking for more.  We always wish for our colliding beams to have a higher luminosity.  We always want the time between crossings of particles in the detector to be minimized.  In short, we always want more data, and there is never enough!

Who knows what is on the horizon of tomorrow’s proton collisions.  I for one have no idea, but I avidly look forward to the coming “more glorious dawn.”

 

CPUs

I’m sure my colleagues have differing opinions on what is and is not needed in high energy physics. But I adamantly believe there are two things all of us would agree on: we always need more data, and we always need more CPUs.

Cluster computing is the name of the game. There are rooms at HEP labs that can usually be heard from “miles away” (or at least a few meters). They literally hum with activity. To me it sounds like raw science. To someone more “normal,” it probably sounds like hundreds of fans all operating at once (which is exactly what it is). These rooms are filled with racks upon racks of computers, all linked in some fashion. Users all over the country/world submit hundreds of thousands of “jobs,” or research tasks, to these clusters. In each of these jobs, a piece of the cluster is given some software a researcher has developed, and uses this software to analyze data.

As an example, I perform a relatively small analysis (with respect to the scope of LHC Physics), but I run between 7.5-14K computing jobs a week.  Job number is a bit arbitrary though; this is because a user specifies how large each job is.  To be a bit more concrete, the size of all the data & simulated samples I need for my work is over 80 terabytes.

So how do I, and other physicists, analyze all this data?  With jobs!

And here’s how it works: one of my data sets has roughly 35 million events. If I attempt to process this data all at once, with one computer (even recent Jeopardy! champion Watson), it will take forever. Instead, I break the task of data analysis up into many, many smaller tasks (aka jobs). Each job will analyze 25-50K events. In this manner high energy physics makes use of “parallel computing,” and saves time.

But why do we need this job system? How long would it really take to process that data in one shot? Well, assuming a spherical cow, each of my jobs takes ~12 hours. To run over those 35 million events I mentioned, I need 3836 jobs. So at 12 hours a job, it would take Watson ~5.3 years to process all the data if it were done as one job.
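Using only the numbers quoted above (3836 jobs, ~12 hours each), the serial-versus-cluster arithmetic looks like this; the 600-slot figure anticipates the optimistic case quoted a couple of paragraphs below.

    import math

    n_jobs, hours_per_job = 3836, 12.0

    serial_hours = n_jobs * hours_per_job
    print(f"one machine: {serial_hours / (24 * 365.25):.1f} years")  # ~5.3 years

    slots = 600  # an optimistic share of the cluster
    waves = math.ceil(n_jobs / slots)  # jobs run in waves of concurrent slots
    print(f"{slots} slots: {waves * hours_per_job / 24:.1f} days")   # ~3.5 days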

So much for getting my degree in less than a decade (and heaven forbid I make a mistake!).

But the irony of having so many physicists participating in a HEP experiment is that not everyone will have all of their jobs running at a time. Each cluster has a finite number of CPUs, and a seemingly infinite number of jobs submitted to it (continually). What usually happens is that a person will have anywhere from 6 to 600 of their jobs running at a time (depending on who else is using the cluster).

So to analyze a data set, it could take anywhere from a night to a week. And in this regard, I believe we will never have enough CPUs.

 

 

Until next time,

-Brian


LHC is GO!!!

Sunday, March 13th, 2011

Hi, all!

It feels like forever since the LHC last delivered proton-proton collisions (… in early November). There was a very productive stretch of heavy-ion collisions followed by the usual winter shutdown, and then a few weeks of machine development that ended… just now.

Yes: The first stable beam p-p collisions of the year are happening at this very moment! As always, you can see the LHC status live: here.

The 2011 dataset promises to be EPIC. Stay tuned — lots of physics to come!

— Burton 😀


Exciting new results from CMS

Tuesday, September 21st, 2010

I’m giddy today because CMS just came out with some very exciting results.  I don’t think we understand what they mean at all – and as a scientist, there is nothing I love better than shocking data, data that challenge what we think we understand.  (For the technical audience, the slides from the talk at CERN are here and the paper is here.)  I might be biased because this topic is very closely related to my doctoral thesis, but I think it’s safe to say this is the first surprising result from the LHC, something that changes our paradigm.

In heavy ion collisions at the Relativistic Heavy Ion Collider we observed something called the ridge (from this paper):

We more or less understand the peak – called the “jet-like correlation” – but we don’t understand the broad structure the peak is sitting on. This broad structure is called the ridge. What I mean when I say we don’t understand the ridge is that we haven’t settled, as a field, on how this structure is formed or where it comes from. We have a lot of models that can produce something similar, but they can’t describe the ridge quantitatively.

Here’s what CMS saw:

It’s a slightly different type of measurement – I’ve put a box around the part with the ridge.  We see the same peak as we saw before – again, we pretty much understand where this comes from.  But there’s a broad structure beneath this peak.  It’s smaller than what we saw in heavy ion collisions above, but it’s there – the fact that it’s there is surprising.
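For the technically curious, the measurement behind plots like these is a two-particle correlation: for every pair of particles in an event, you histogram the difference in pseudorapidity (Δη) and azimuthal angle (Δφ). Here is a bare-bones sketch on toy data; everything in it is invented, and real analyses also divide out a mixed-event background, which this sketch skips.

    import numpy as np

    rng = np.random.default_rng(0)
    hist = np.zeros((48, 48))

    for _ in range(500):                         # toy events
        n = rng.poisson(60)                      # particles per event
        eta = rng.uniform(-2.4, 2.4, size=n)     # tracker-like acceptance
        phi = rng.uniform(-np.pi, np.pi, size=n)
        i, j = np.triu_indices(n, k=1)           # all distinct pairs
        deta = eta[i] - eta[j]
        dphi = (phi[i] - phi[j] + np.pi) % (2 * np.pi) - np.pi  # wrap to [-pi, pi)
        h, _, _ = np.histogram2d(deta, dphi, bins=48,
                                 range=[[-4.8, 4.8], [-np.pi, np.pi]])
        hist += h
    # The ridge shows up as an excess near dphi ~ 0 extending to large |deta|.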

In the models we have from heavy ion collisions the ridge is from:

  • A high energy quark or gluon losing energy in the Quark Gluon Plasma,
  • Collective motion of particles in the Quark Gluon Plasma, or
  • Remnants of the initial state (meaning the incoming particles)

In our current understanding of what goes on in a proton-proton collision, there is no Quark Gluon Plasma – so the conservative interpretation of these data would be that the ridge is somehow a remnant of the initial state. Even conservatively, this would severely constrain our models. Some physicists, such as Mike Lisa at Ohio State University, have proposed that there may be collective motion of particles in proton-proton collisions, similar to what we see in heavy ion collisions. This would imply that we also see a medium in proton-proton collisions. That would be a huge discovery. (Just to be clear, CMS is not making this claim, at least at this point.) It will take a while for the community to debate these data and come to a consensus on what they mean. But these data are definitely very exciting – this is the most exciting day for me since the first collisions!


Getting fired up again!

Sunday, January 31st, 2010

As the time approaches for the restart of LHC operations, we are starting to feel the excitement of this grandiose experiment again.

With the Tevatron’s first direct constraint on the mass of the Higgs boson beyond good old LEP’s this past week, physicists in all the LHC experiments are getting ready, and more excited, to restart operations and finally gather data that will allow them to search for new physics and, hopefully, quickly complement or surpass the astonishing Tevatron results. Meanwhile, LHC physicists and engineers are finalizing the improvements to the quench protection systems that will allow us to run at an energy of 3.5 TeV/beam, starting in mid-February.

My two cents, as always, consists of helping to put the CMS trigger system in the best condition possible to start taking good data. This time, though, we are using “real” data from last year’s operations as opposed to “simulated” data. No more relying entirely on Monte Carlo, no more tweaking and tuning and speculating about our computer simulations. This is the real deal, guys!!

What we do with the data is skim it off-line into a collection of good and interesting events, then feed those into our on-line system and run the trigger menu to check its performance. These data have all the information, event by event, that the detector collected (in the form of electronic signals) from the proton-proton collisions we had last year. For the past month or so, we have been able to touch nature’s primary constituents over and over, adapting and tuning our detectors to better sense the most fantastic petals of life: particles!
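As a cartoon of that replay loop: take skimmed real events and run each trigger path over them to measure accept rates. The event format, path names, and thresholds below are all invented for illustration; the real CMS trigger runs in its own dedicated framework.

    from collections import Counter

    def passes(path, event):
        """Stand-in trigger decisions with made-up thresholds."""
        if path == "SingleMu15":
            return any(pt > 15.0 for pt in event.get("muon_pt", []))
        if path == "DiJet50":
            return sum(pt > 50.0 for pt in event.get("jet_pt", [])) >= 2
        return False

    def run_menu(menu, events):
        counts = Counter()
        for event in events:
            for path in menu:
                counts[path] += passes(path, event)
        return {path: counts[path] / len(events) for path in menu}

    skim = [{"muon_pt": [22.0], "jet_pt": [60.0, 55.0]},
            {"muon_pt": [], "jet_pt": [30.0]}]
    print(run_menu(["SingleMu15", "DiJet50"], skim))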

Edgar Carrera (Boston University)


Listening to data

Thursday, January 29th, 2009

Darn it, Peter got to it first, but I too would like to call your attention to the interesting essay that Dennis Overbye wrote in The New York Times this week.  (I have to post more rapidly.)  It reflects upon President Obama’s call to “restore science to its rightful place,” and the interplay between science and democracy.  There is a shout-out to the LHC in there, as he remarks that people from a great variety of backgrounds have happily worked together (or at least happily enough) on these projects.

I agree with Overbye’s arguments, but the essay, which asserts that democracy is one of the values of science, got me thinking about what other values science gives us. I think that one of the most important values for me is one that Overbye touches on a little: the value of listening to what nature is telling us. In science, that means listening to the data that our experiments provide.

There are many ways to be creative in science — in my particular science, we create new acceleration technologies, devise new ways to detect particles, and find clever ways to analyze our data so that we can measure particle properties with the smallest possible uncertainty.  We have a healthy appreciation, and admiration, for ideas that we haven’t seen before that turn out to have a big payoff.  Practitioners of theoretical physics can build very creative theories that explain current measurements and make predictions for future results.  But there is one thing that we are never creative about, and that’s what the actual answers are.  Those we can only find by doing the experiments — we can’t make it up, we can’t guess, we can’t rely on the opinions of others, we can’t be superstitious.  All of the creativity we have must bump up against the realities that nature presents us with, and if our hypotheses disagree with the data we record, we must discard them.  It is a little humbling, in a way.

But on the other hand, it is also empowering.  So many answers may be out there, if we only open our eyes and look!  This is obviously true of something like particle physics, but I think it applies to a broader range of human problems.  What kind of programs are effective in reducing societal ills?  What economic policies might improve the lives of the largest number of people?  You can try them out and see what works, or analyze the results of previous attempts to implement them, and see if those worked.  We can do better than just following a philosophical ideal or notion — we can test our creativity against the real world.  Obviously these sorts of “experiments” have all sorts of complications that physics experiments don’t.  But we can still collect data and learn something from nature.  Perhaps that is one of the rightful places of science that Obama has in mind?
