Posts Tagged ‘luminosity’

The Large Hadron Collider (LHC) at CERN has already delivered more high energy data than it had in 2015. To put this in numbers, the LHC has produced 4.8 fb-1, compared to 4.2 fb-1 last year, where fb-1 represents one inverse femtobarn, the unit used to evaluate the data sample size. This was achieved in just one and a half months, compared to five months of operation last year.
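
To get a feel for what an inverse femtobarn buys you, the expected number of events for a given process is simply its production cross-section multiplied by the integrated luminosity. Here is a minimal sketch of that arithmetic in Python; the ~50 pb Higgs production cross-section at 13 TeV is an assumed round number used only for illustration.

```python
# Expected event count: N = cross-section x integrated luminosity.
# The ~50 pb Higgs production cross-section at 13 TeV is an assumed,
# approximate value used only for illustration.

def expected_events(cross_section_pb, integrated_lumi_fb):
    """Number of events produced for a cross-section in pb and an
    integrated luminosity in fb^-1 (1 fb^-1 = 1000 pb^-1)."""
    return cross_section_pb * integrated_lumi_fb * 1000.0

print(expected_events(50.0, 4.8))   # ~240,000 Higgs bosons in 4.8 fb^-1
```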

With this data at hand, and the 20-30 fb-1 projected by November, both the ATLAS and CMS experiments can now explore new territories and, among other things, cross-check the intriguing events they reported at the end of 2015. If this particular effect is confirmed, it would reveal the presence of a new particle with a mass of 750 GeV, six times the mass of the Higgs boson. Unfortunately, there was not enough data in 2015 to get a clear answer. The LHC had a slow restart last year following two years of major improvements to raise its energy reach. But if the current performance continues, the discovery potential will increase tremendously. All this to say that everyone is keeping their fingers crossed.

If any new particle were found, it would open the doors to bright new horizons in particle physics. Unlike the discovery of the Higgs boson in 2012, if the LHC experiments discover an anomaly or a new particle, it would bring a new understanding of the basic constituents of matter and how they interact. The Higgs boson was the last missing piece of the current theoretical model, called the Standard Model. This model can no longer accommodate new particles. However, it has been known for decades that this model is flawed, but so far, theorists have been unable to predict which theory should replace it and experimentalists have failed to find the slightest concrete sign of a broader theory. We need new experimental evidence to move forward.

Although the new data is already being reconstructed and calibrated, it will remain “blinded” until a few days prior to August 3, the opening date of the International Conference on High Energy Physics. This means that until then, the region where this new particle could be remains masked to prevent biasing the data reconstruction process. The same selection criteria that were used for last year’s data will then be applied to the new data. If a similar excess is still observed at 750 GeV in the 2016 data, the presence of a new particle will be beyond doubt.

Even if this particular excess turns out to be just a statistical fluctuation, the bane of physicists’ existence, there will still be enough data to explore a wealth of possibilities. Meanwhile, you can follow the LHC activities live or watch the CMS and ATLAS data samples grow. I will not be available to report on the news from the conference in August due to hiking duties, but if anything new is announced, I expect to hear its echo reverberating even in the Alps.

Pauline Gagnon

To find out more about particle physics, check out my book “Who Cares about Particle Physics: making sense of the Higgs boson, the Large Hadron Collider and CERN”, which can already be ordered from Oxford University Press. In bookstores after 21 July. Easy to read: I understood everything!


The total amount of data delivered in 2016 at an energy of 13 TeV to the experiments by the LHC (blue graph) and recorded by CMS (yellow graph) as of 17 June. One fb-1 of data is equivalent to 1000 pb-1.


Physicists did a lot of planning for data analysis before the LHC ever ran, and we’ve put together a huge number of analyses since it started. We’ve already looked for most of the things we’ll ever look for. Of course, many of the things we’ve looked for haven’t shown up yet; in fact, in many cases including the Higgs, we didn’t expect them to show up yet! We’ll have to repeat the analysis on more data. But that’s got to be easier than it was to collect and analyze the data the first time, right? Well, not necessarily. We always hope it will be easier the second or third time around, but the truth is that updating an analysis is a lot more complicated than just putting more numbers into a spreadsheet.

For starters, each new batch of data is collected under different conditions. For example, going from 2011 to 2012, the LHC beam energy will be increasing. The number of collisions per crossing will be larger too, and that means the triggers we use to collect our data are changing as well. All our calculations of what the pileup on top of each interesting collision looks like will change. Some of our detectors might work better as we fix glitches, or they might work worse as they are damaged in the course of running. All these details affect the calculations for the analysis and the optimal way to put the data together.

But even if we were running under completely stable conditions, there are other reasons an analysis has to be updated as you collect more data. When you have more events to look at, you might be interested in limiting the events you look at to those you understand best. (In other words, if an analysis was previously limited by statistical uncertainties, then as those shrink, you want to get rid of your largest systematic uncertainties.) To get all the power out of the new data you’ve got, you might have to study new classes of events, or revisit questions where your understanding was previously just “good enough.”
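
As a rough illustration of that trade-off, here is a toy counting-experiment sketch (not from the post): the relative statistical uncertainty falls like \(1/\sqrt{N}\), while an assumed 5% systematic uncertainty stays put, so beyond a few hundred events the systematics dominate.

```python
import math

def relative_uncertainties(n_events, syst_fraction=0.05):
    """Toy counting experiment: the statistical uncertainty shrinks as
    1/sqrt(N), while an assumed 5% systematic stays constant."""
    return 1.0 / math.sqrt(n_events), syst_fraction

for n in (100, 1_000, 10_000, 100_000):
    stat, syst = relative_uncertainties(n)
    print(f"N = {n:>7}: stat = {stat:.3f}, syst = {syst:.3f}")
# Once N is a few hundred, the fixed systematic already dominates,
# which is why more data pushes you to attack the systematics.
```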

So analyzing LHC data is really an iterative process. Collecting more data is always presenting new challenges and new opportunities that require understanding things better than before. No analysis is ever the same twice.


Can the LHC Run Too Well?

Friday, February 3rd, 2012

For CMS data analysis, winter is a time of multitasking. On the one hand, we are rushing to finish our analyses for the winter conferences in February and March, or to finalize the papers on analyses we presented in December. On the other, we are working to prepare to take data in 2012. Although the final decisions about the LHC running conditions for 2012 haven’t been made yet, we have to be prepared both for an increase in beam energy and an increase in luminosity. For example, the energy might go to 8 TeV center-of-mass, up from last year’s 7. That will make all our events a little more exciting. But it’s the luminosity that determines how many events we get, and thus how much physics we can do in a year. For example, if the Higgs boson exists, the number of Higgs-like events we’ll see will go up, and so will the statistical power with which we can claim to have observed it. If the hints we saw at 125 GeV in December are right, our ability to be sure of its existence this year depends on collecting several times more events in 2012 than we got in 2011.

We’d have many more events over 2012 if the LHC simply kept running the way it already was at the end of the year. That’s because for most of the year, the luminosity was increasing over and over as the LHC folks added more proton bunches and focused them better. But we expect that the LHC will do better, starting close to last year’s peak, and then pushing to ever-higher luminosities. The worst-case we are preparing for is perhaps twice as much luminosity as we had at the end of last year.

But wait, why did I say “worst-case”?

Well, actually, it will give us the most interesting events we can get and the best shot at officially finding the Higgs this year. But increased luminosity also gives more events in every bunch crossing, most of which are boring, and most of which get in the way. This makes it a real challenge to prepare for 2012 if you’re working on the trigger, because you have to sift quickly through events with more and more extra stuff (called “pileup”). As it happens, that’s exactly what I’m working on.

Let me explain a bit more of the challenge. One of the triggers I’m becoming responsible for is trying to find collisions containing a Higgs decaying to a bottom quark and anti-bottom quark and a W boson decaying to an electron and neutrino. If we just look for an electron — the easiest thing to trigger on — then we get too many events. The easy choice is to ask only for higher-energy electrons, but beyond a certain point we start missing the events we’re looking for! So instead, we ask for the other things in the event: the two jets from the Higgs, and the missing energy from the invisible neutrino. But now, with more and more extra collisions, we have random jets added in, and random fluctuations that contribute to the missing energy. We are more and more likely to get the extra jets and missing energy we ask for even though there isn’t much missing energy or a “Higgs-like” pair of jets in the core event! As a result, the event rate for the trigger we want can become too high.
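
To make the idea concrete, here is a toy sketch of that kind of selection in Python. The thresholds and the function itself are made up for illustration; they are not the actual CMS trigger menu.

```python
# Toy version of the electron + jets + missing-energy selection described
# above. All thresholds (in GeV) are invented for illustration only.

def wh_trigger_fires(electron_pt, jet_pts, missing_et,
                     min_electron_pt=27.0, min_jet_pt=30.0,
                     n_jets_required=2, min_missing_et=40.0):
    """Accept events with an energetic electron, at least two jets,
    and significant missing energy."""
    good_jets = [pt for pt in jet_pts if pt > min_jet_pt]
    return (electron_pt > min_electron_pt
            and len(good_jets) >= n_jets_required
            and missing_et > min_missing_et)

# Pileup adds extra soft jets and smears the missing energy, so more
# uninteresting events start to pass these cuts and the rate climbs.
print(wh_trigger_fires(35.0, [45.0, 38.0, 22.0], 55.0))  # True
```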

How do we deal with this? Well, there are a few choices:

1. Increase the amount of momentum required for the electron (again!)
2. Increase the amount of missing energy required
3. Increase the minimum energy of the jets being required
4. Get smarter about how you count jets, by trying to be sure that they come from the main collision rather than one of the extras
5. Check specifically if the jets come from bottom quarks
6. Find some way to allocate more bandwidth to the trigger

There’s a cost for every option. Increasing energies means we lose some events we might have wanted to collect — which means that even though the LHC has produced more Higgs bosons, it’s counterbalanced by us seeing fewer of the ones that were there. Being “smarter” about the jets means more time spent by our trigger processing software on this trigger, when it has lots of other things to look at. Asking for bottom quarks not only takes more processing, it also means the trigger can’t be shared with as many other analyses. And allocating more bandwidth means we’d have to delay processing or cut elsewhere.

And for all the options, there’s simply more work. But we have to deal with the potential for extra collisions as well as we can. In the end, the LHC collecting much more data is really the best-case scenario.


One Recorded Inverse Femtobarn!!!

Tuesday, October 18th, 2011

Last week I announced that LHC reached the 2011 milestone of delivering one inverse femtobarn of luminosity to LHCb. This week, LHCb reached the 2011 milestone of recording one inverse femtobarn of data.

There is a subtle difference between these two statements, which is better illustrated in the graph below, where you can see that the delivered and recorded integrated luminosities are different, and the difference seems to grow with time.

So what do “delivered” and “recorded” integrated luminosity actually mean? Surprisingly for physics, they mean exactly what they sound like. That is, delivered integrated luminosity refers to the integrated luminosity of proton-proton collisions which the LHC has delivered to LHCb, while recorded luminosity refers to the amount of data we have recorded on disk. We obviously want these to be as close to each other as possible, but as I’ve mentioned before, this is not possible due to the detector hardware.

You may all be wondering why LHCb is celebrating one inverse femtobarn while ATLAS and CMS are celebrating five inverse femtobarns. This is due to the fact that the design instantaneous luminosity for LHCb is much lower than for ATLAS and CMS. In fact, at the beginning of the year, the milestone of one inverse femtobarn seemed almost unachievable. This remarkable accomplishment has only been possible due to the excellent performance of both the LHC team in implementing luminosity leveling and the LHCb team in running the detector at a higher instantaneous luminosity than it was initially designed for.

In terms of physics, one inverse femtobarn of data corresponds to about seventy billion b quark pairs decaying in the LHCb detector. This huge amount of data allows us to significantly increase the accuracy of our results like \(\phi_s\). It also increases the statistics of various rare decays like \(B_{s}\rightarrow\mu\mu\). Stay tuned for more results!
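
That “seventy billion” figure is easy to check on the back of an envelope: multiply the integrated luminosity by the b-quark pair production cross-section. The ~75 microbarn cross-section within the LHCb acceptance at 7 TeV used below is an assumed, approximate value.

```python
# Back-of-the-envelope check of the "seventy billion b quark pairs" figure.
# The ~75 microbarn bbbar cross-section within the LHCb acceptance at 7 TeV
# is an assumed, approximate number.

INV_FB_IN_INV_CM2 = 1e39            # 1 fb^-1 = 1e39 cm^-2
SIGMA_BB_CM2 = 75e-6 * 1e-24        # 75 microbarn in cm^2 (1 barn = 1e-24 cm^2)

n_bb_pairs = INV_FB_IN_INV_CM2 * SIGMA_BB_CM2
print(f"{n_bb_pairs:.1e}")          # ~7.5e10, i.e. roughly seventy billion
```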


Last night, while most of the collaboration was sleeping, LHC reached the 2011 milestone of delivering one inverse femtobarn of luminosity to LHCb.

In the words of our spokesperson, Pierluigi Campana: This is a great achievement for us and for the scientific community, thanks to the excellent performance of the machine, to the skill and to the commitment of the whole LHC team.

This result is even more remarkable, considering that LHC was able to deliver high quality data at the same time, and with different luminosities, to the four LHC experiments.

This performance will allow us to push even further the search for new physics and for unexpected phenomena in the flavor sector.

I really have nothing to add to that except, congratulations to all involved. 🙂


Coming attractions at the LHC

Friday, September 2nd, 2011

It’s Labor Day weekend here in the US, but over at CERN it’s the end of the August technical stop for the LHC. To rework a common saying, this is the first day of the rest of the 2011 run. We have two months left of proton-proton collisions, followed by one month of lead-lead collisions, and then in December we’ll have the holiday “extended technical stop” that will probably extend to the spring.

We’re expecting an important change in running conditions once we return from the technical stop, and that is a change in how the beams are focused. This will lead to an increased rate of collisions. Remember that the proton beams are “bunched”; the beam is not a continuous stream of particles but bunches with a large separation between them. The change in the focusing will help make the bunches more compact, and that in turn will mean that there will be more proton collisions every time a pair of bunches pass through each other. When our detectors record data, they record an entire bunch crossing as a single event. Thus, each individual event will be busier, with more collisions and more particles produced.
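
For a rough sense of scale, the average number of extra collisions per crossing (“pileup”) is the instantaneous luminosity times the inelastic cross-section, divided by the number of colliding bunches and the revolution frequency. The numbers below are approximate, assumed values for late-2011 running, not figures from the post.

```python
# Rough pileup estimate:
#   mu = L * sigma_inelastic / (n_bunches * f_revolution)
# All inputs are approximate, assumed values for late-2011 running.

LUMI = 3e33            # instantaneous luminosity, cm^-2 s^-1
SIGMA_INEL = 70e-27    # ~70 mb inelastic pp cross-section, in cm^2
N_BUNCHES = 1380       # colliding bunch pairs
F_REV = 11245.0        # LHC revolution frequency, Hz

mu = LUMI * SIGMA_INEL / (N_BUNCHES * F_REV)
print(f"average interactions per crossing: {mu:.1f}")   # roughly 13-14
```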

This is good news from a physics perspective — the more collisions happen, the greater the chance that there will be something interesting coming out. But it’s a challenge from an operational perspective. We try to record as many “interesting” events as possible, but we’re ultimately limited by how quickly we can read out the detector and how much space we have to store the data. Given that we’re going to have more data coming into fixed resources, we’re going to have to limit our definition of “interesting” a little further. The busier events are also a greater strain on the software and computing for the experiments (which I focus on). Each event takes more CPU time to process and requires more RAM. Previous experience and simulations give us some guidance as to how all of this will scale up from what we’ve seen so far, but we can’t know for sure without actually doing it. (The original plan for the machine development studies period before the technical stop was supposed to include a small-scale test of this, so that we could put the computing and everything else through its paces. But that got cancelled. I had originally planned to blog about that. Oh well.)

However, all of this will be worth the trouble. Remember all of the excitement of the EPS conference? That was at the end of July, just a little more than a month ago. There is now about twice as much data that can be analyzed. With the increases in collision rate, we might well be able to double the dataset once again just in these next two months. Or, we might do even better. This will have a critical impact on our searches for new phenomena, and could allow the LHC experiments to discover or rule out the standard-model Higgs boson by the end of this year. Coming soon, to a theater near you.


Congratulations LHCb!!!

Saturday, May 28th, 2011

Just a quick post today to explain this LHC status from last night:

What was this about you ask? As I’ve mentioned previously, the target instantaneous luminosity for LHCb is \(2 \times 10^{32} cm^{-2} s^{-1}\) to \(3 \times 10^{32} cm^{-2} s^{-1}\).

LHCb started taking data within this target instantaneous luminosity on the 1st of May with 756 colliding bunches corresponding to an instantaneous luminosity of \(2.15 \times 10^{32} cm^{-2} s^{-1}\). Last night the experiment moved into unknown territory, collecting data at an instantaneous luminosity of \(3 \times 10^{32} cm^{-2} s^{-1}\).

Experts have been carefully monitoring the detector behaviour and data quality, but so far it would seem that everything is performing well. Congratulations are indeed in order. 🙂


The CERN Accelerator Complex

Sunday, April 24th, 2011

With all the buzz this past week regarding the breaking of the world instantaneous luminosity record, I thought it might be interesting for our readers to get an idea of how we as physicists achieved this goal.

Namely, how do we accelerate particles?

(This may be a review for some of our veteran readers due to this older post by Regina)


The Physics of Acceleration

Firstly, physicists rely on a principle many of us learn in our introductory physics courses, the Lorentz Force Law.  This result, from classical electromagnetism, states that a charged particle in the presence of external electric and/or magnetic fields will experience a force.  The direction and magnitude (how strong) of the force depend on the sign of the particle’s electric charge and its velocity (or the direction it’s moving, and with what speed).
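
For reference, the Lorentz Force Law can be written compactly as \(\vec{F} = q\,(\vec{E} + \vec{v}\times\vec{B})\), where \(q\) is the particle’s charge, \(\vec{E}\) and \(\vec{B}\) are the electric and magnetic fields, and \(\vec{v}\) is the particle’s velocity.  The electric term is what changes the particle’s energy in the accelerating cavities, while the magnetic term only bends its path.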

So how does this relate to accelerators?  Accelerators use radio frequency cavities to accelerate particles.  A cavity has several conductors that are hooked up to an alternating current source.  Between conductors there is empty space, but this space is spanned by a uniform electric field.  This field will accelerate a particle in a specific direction (again, depending on the sign of the particle’s electric charge).  The trick is to flip this current source such that as a charged particle goes through a succession of cavities it continues to accelerate, rather than be slowed down at various points.

A cool Java Applet that will help you visualize this acceleration process via radio frequency cavities can be found here, courtesy of CERN.

Now that’s the electric field portion of the Lorentz Force Law; what about the magnetic?  Well, magnetic field lines form closed loops, and as you get farther and farther away from their source the radii of these loops continually increase, whereas electric field lines are straight lines that extend out to infinity (and never intersect) in all directions from their source.  This makes the physics of magnetic fields very different from that of electric fields.  We can use magnetic fields to bend the track (or path) of charged particles.  A nice demonstration of this can be found here (or any of the other thousands of hits I got for Googling “Cathode Ray Tube + YouTube”).

Imagine, if you will, a beam of light; you can focus the beam (make it smaller) by using a glass lens, and you can also change the direction of the beam using a simple mirror.  Now, the LHC ring uses what are called dipole and quadrupole magnets to steer and focus the beam.  If you combine the effects of these magnets you can make what is called a magnetic lens, or more broadly termed “Magnetic Optics.”  In fact, the LHC’s magnetic optics currently focus the beam to a diameter of ~90 micro-meters (the diameter of a human hair is ~100 micro-meters, although it varies from person to person, and where on the body the hair is taken from).  However, the magnetic optics system was designed to focus the beam to a diameter of ~33 micro-meters.

In fact, the LHC uses 1232 dipole magnets and 506 quadrupole magnets.  These magnets have a peak magnetic field of 8.3 Tesla, or roughly 100,000 times stronger than Earth’s magnetic field.  An example of the typical magnetic field produced by the dipole magnets of the LHC ring is shown here [1]:

Image courtesy of CERN


The colored portions of the diagram indicate the magnetic flux, or the amount of magnetic field passing through a given area, whereas the arrows indicate the direction of the magnetic field.  The two circles (in blue) in the center of the diagram indicate the beam pipes for beams one and two.  Notice how the arrows (direction of the magnetic field) point in opposite directions!  This allows CERN accelerator physicists to control two counter-rotating beams of protons in the same beam pipe (Excellent Question John Wells)!

Thus, accelerator physicists at CERN use electric fields to accelerate the LHC proton/lead-ion beams and magnetic fields to steer and squeeze these beams (also, these “magnetic optics” systems are responsible for the “Lumi Leveling” discussed by Anna Phan earlier this week).

However, this isn’t the complete story; things like length contraction and synchrotron radiation affect the acceleration process, and the design of our accelerators.  But these are stories best left for another time.


The Accelerator Complex

But where does this process start?  Well, to answer this let’s start off with the schematic of this system:

Image courtesy of CERN

One of our readers (thanks GP!) has given us this helpful link that visualizes the acceleration process at the LHC (however, when this video was made, the LHC was going to be operating at design specifications…but more on that later).

A proton’s journey starts in a tank of research grade hydrogen gas (impurities are measured in parts per million, or parts per billion).  We first take molecular hydrogen (a diatomic molecule for those of you keeping track) and break it down into atomic hydrogen (individual atoms).  Next, we strip hydrogen’s lone electron from the atom (0:00 in the video linked above).  We are now left with a sample of pure protons.  These protons are then passed into the LINear ACcelerator 2 (LINAC2, 0:50 in the video linked above), which is the tiny purple line in the bottom middle of the above figure.

The LINAC 2 then accelerates these protons to an energy of 50 MeV, or to 31.4% of the speed of light [2].  The “M” stands for mega-, or times one million.  The “eV” stands for electron-volts, which is the conventional unit of high energy physics.  But what is an electron-volt, and how does it relate to everyday life?  Well, for that answer, Christine Nattrass has done such a good job comparing the electron-volt to a chocolate bar that any description I could give pales in comparison to hers.

Moving right along, now thanks to special relativity, we know that as objects approach the speed of light, they “gain mass.”  This is because energy and mass are equivalent currencies in physics.  An object at rest has a specific mass, and a specific energy.  But when the object is in motion, it has a kinetic energy associated with it.  The faster the object is moving, the more kinetic energy, and thus the more mass it has.  At 31.4% the speed of light, a proton’s mass is ~1.05 times its rest mass (or the proton’s mass when it is not moving).

So this is a cruel fact of nature.  As objects increase in speed, it becomes increasingly more difficult to accelerate them further!  This is a direct result of Newton’s Second Law.  If a force is applied to a light object (one with little mass) it will accelerate very rapidly; however, the same force applied to a massive object will cause a very small acceleration.

Now at an energy of 50 MeV, travelling at 31.4% the speed of light, and with a mass of 1.05 times its rest mass, the protons are injected into the Proton Synchrotron (PS) Booster (1:07 in the video).  This is the ellipse, labeled BOOSTER, in the diagram above.  The PS Booster then accelerates the protons to an energy of 1.4 GeV (where  the “G” stands for giga- or a billion times!), and a velocity that is 91.6% the speed of light [2].  The proton’s mass is now ~2.49 times its rest mass.

The PS Booster then feeds into the Proton Synchrotron (labeled as PS above, see 2:03 in video), which was CERN’s first synchrotron (and was brought online in November of 1959).  The PS then further accelerates the protons to an energy of 25 GeV, and a velocity that is 99.93% the speed of light [2].  The proton’s mass is now ~26.73 times its rest mass!  Wait, WHAT!?

At 31.4% the speed of light, the proton’s mass has barely changed from its rest mass.  Then at 91.6% the speed of light (roughly three times the previous speed), the proton’s mass was only two and a half times its rest mass.  Now, we increased the speed by barely 8%, and the proton’s mass increased by a factor of 10!?

This comes back to the statement earlier: objects become increasingly more difficult to accelerate the faster they are moving.  But this is clearly a non-linear effect.  To get an idea of what this looks like mathematically, take a look at this link here [3].  In this plot, the y-axis is in multiples of rest mass (or energy), and the x-axis is velocity, in multiples of the speed of light, c.  The red line is this relativistic effect that we are seeing; as we go from ~91% to 99% the speed of light, the mass increases gigantically!
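
The curve being described is just the relativistic factor \(\gamma = 1/\sqrt{1 - v^2/c^2}\).  As a quick sketch (assuming only the speeds quoted in this post), a few lines of Python reproduce the mass factors at each accelerator stage:

```python
import math

def gamma(beta):
    """Relativistic factor gamma = 1 / sqrt(1 - (v/c)^2)."""
    return 1.0 / math.sqrt(1.0 - beta**2)

# Speeds quoted in the post, as fractions of the speed of light
stages = [("LINAC2", 0.314), ("PS Booster", 0.916), ("PS", 0.9993),
          ("SPS", 0.999998), ("LHC, 3.5 TeV", 0.999999964)]

for name, beta in stages:
    print(f"{name:>13}: gamma ~ {gamma(beta):.2f}")
# Reproduces ~1.05, ~2.49, ~26.7, ~500 and ~3730 times the rest mass.
```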

But back to the proton’s journey, the PS injects the protons into the Super Proton Synchrotron (names in high energy physics are either very generic, and bland, or very outlandish, e.g. matter can be charming).  The Super Proton Synchrotron (SPS, also labeled as such in above diagram, 3:10 in video above) came online in 1976, and it was in 1983 that the W and Z bosons (mediators of the weak nuclear force) were discovered when the SPS was colliding protons with anti-protons.  In today’s world however, the SPS accelerates protons to an energy of 450 GeV, with a velocity of 99.9998% the speed of light [2].  The mass of the proton is now ~500 times its rest mass.

The SPS then injects the proton beams directly into the Large Hadron Collider.  This occurs at 3:35 in the video linked above; however, when this video was made the LHC was expected to operate at design energy, with each proton having an energy of 7 TeV (“T” for tera-, a million million times).  Presently, the LHC accelerates the protons to half of the design energy, and a velocity of 99.9999964% the speed of light.  The protons are then made to collide in the heart of the detectors.  At this point the protons have a mass that is ~3730 times their rest mass!


So, the breaking of the world instantaneous luminosity record was not the result of one single instrument, but of the combined might of CERN’s full accelerator complex, aided in no small part by the magnetic optics systems in these accelerators (I realize I haven’t gone into much detail regarding this; my goal was simply to introduce you to the acceleration process that our beams undergo before collisions).


Until next time,

-Brian


References:

[1] CERN, “LHC Design Report,” https://ab-div.web.cern.ch/ab-div/Publications/LHC-DesignReport.html

[2] CERN, “CERN faq: The LHC Guide,” http://cdsweb.cern.ch/record/1165534/files/CERN-Brochure-2009-003-Eng.pdf

[3] School of Physics, University of New South Wales, Sydney, Australia, http://www.phys.unsw.edu.au/einsteinlight/jw/module5_equations.htm


I’m interrupting my descriptions of LHCb to discuss something more relevant to the current status of the LHC. Namely this LHC status from just after midnight the other day:

Ken has already discussed the luminosity record in this post, and today I’ll be discussing luminosity leveling (LUMI LEVELING). You may be wondering what this has to do with LHCb. Well, interaction point 8 (IP8) is where LHCb is located, as can be seen in this image:

Aidan recently discussed what luminosity is in this post, where he explained that larger instantaneous luminosity means more events, so we want to do everything we can to increase it. However, if you’ve been looking at the LHC luminosity plots for 2011, like the one for peak instantaneous luminosity below, you might have noticed that the instantaneous luminosities of ALICE and LHCb are lower than those of ATLAS and CMS.

The reason for the difference between the experiments is that the design instantaneous luminosities for LHCb and ALICE are much lower than for ATLAS and CMS. The target instantaneous luminosity for LHCb is \(2 \times 10^{32} cm^{-2} s^{-1} \) to \(3 \times 10^{32} cm^{-2} s^{-1}\) and for ALICE is \(5 \times 10^{29} cm^{-2} s^{-1} \) to \(5 \times 10^{30} cm^{-2} s^{-1}\) while ATLAS and CMS are designed for an instantaneous luminosity of \(10^{34} cm^{-2} s^{-1}\).

This means that while the LHC operators are trying to maximise instantaneous luminosity at ATLAS and CMS, they are also trying to provide LHCb and ALICE with their appropriate luminosities.

As Aidan mentioned in his post, there are a couple of different ways to modify instantaneous luminosity: you can change the number of proton bunches in the beam or you can change the area of the proton bunches that collide.

Last year the LHC operators optimised the collision conditions and this year have been increasing instantaneous luminosity by increasing the number of proton bunches.

The varying instantaneous luminosity requirements of the experiments have so far been handled by having a different number of proton bunches colliding at each of the interaction points. For example, last week there were 228 proton bunches in the beam, 214 of which were colliding in ATLAS and CMS, 12 of which were colliding in ALICE and 180 of which were colliding in LHCb.

However, as more and more proton bunches are injected into the beam, it is not possible to continue to limit the instantaneous luminosity at ALICE and LHCb by limiting the number of colliding bunches. Instead, the LHC operators need to modify the collision conditions. This is what luminosity leveling refers to.

Luminosity leveling is performed by moving the proton beams relative to each other to modify the area available for interactions as the bunches pass through each other. This concept is much easier to explain diagrammatically: if the centres of the beams are aligned like on the left, there are more interactions than if they are offset from each other like on the right.
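
For two round Gaussian bunches of equal transverse size \(\sigma\), the standard result is that separating their centres by a distance \(d\) scales the luminosity by \(e^{-d^2/4\sigma^2}\).  Here is a small sketch of that formula with made-up numbers; the real leveling procedure is of course more involved.

```python
import math

def leveled_luminosity(l_head_on, offset, sigma):
    """Luminosity of two equal, round Gaussian beams whose centres are
    separated by `offset`: L = L0 * exp(-offset^2 / (4 sigma^2)).
    Toy formula for illustration; offset and sigma in the same units."""
    return l_head_on * math.exp(-offset**2 / (4.0 * sigma**2))

# Made-up numbers: start from a 4e32 head-on luminosity and see how the
# separation can hold LHCb near its 2-3e32 target.
for offset_in_sigma in (0.0, 1.0, 1.5, 2.0):
    lumi = leveled_luminosity(4e32, offset_in_sigma, 1.0)
    print(f"offset = {offset_in_sigma:.1f} sigma -> L = {lumi:.2e} cm^-2 s^-1")
```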

This luminosity leveling process can be seen in action in the graph below, from the nice long LHC fill last night. You can see the ATLAS and CMS luminosities slowly decreasing due to collisions, while the LHCb luminosity stays roughly constant at \(1.3 \times 10^{32} cm^{-2} s^{-1} \); the vertical red lines mark when the beam adjustments were made.


OK, the second part of the title isn’t actually true, but more on that in a moment….

The fill that is currently in the LHC started at an instantaneous luminosity over 4E32 cm^-2 s^-1:

Not only is this the highest collision rate ever achieved at the LHC, it’s also the highest ever at a hadron collider, exceeding the largest instantaneous luminosity ever recorded by Fermilab’s venerable Tevatron collider. As has been discussed by many of the US LHC bloggers, luminosity is key at this point — the larger it is, the more collisions we record, and the greater the chance that we can observe something truly new. In the four hours since the fill started, CMS has already recorded about one sixth of the useful data that was recorded in all of 2010!
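
That “one sixth” figure is easy to sanity-check: integrate the luminosity over the four hours and compare with the 2010 dataset. Both the assumption of a constant 4E32 luminosity and the ~36 pb-1 figure for 2010 are rough, assumed numbers, not values from the post.

```python
# Rough check of the "one sixth of 2010 in four hours" statement.
# Assumes a constant 4e32 cm^-2 s^-1 over four hours and takes ~36 pb^-1
# as the size of the 2010 dataset (both approximate, assumed values).

PEAK_LUMI = 4e32                                   # cm^-2 s^-1
SECONDS = 4 * 3600
delivered_pb = PEAK_LUMI * SECONDS / 1e36          # 1 pb^-1 = 1e36 cm^-2

print(f"~{delivered_pb:.1f} pb^-1 delivered in four hours")          # ~5.8
print(f"fraction of ~36 pb^-1 from 2010: {delivered_pb / 36:.2f}")   # ~0.16
```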

As for the Pulitzer, this week Mike Keefe of the Denver Post won the 2011 Pulitzer for editorial cartooning for a portfolio of twenty cartoons that included this one about the LHC. (I’d rather not actually run the cartoon here, as I’m not sure we have the rights to it.) Good to see that we are part of journalism history!
