The Large Hadron Collider (LHC) began a vast consolidation program in March 2013 that will last well into 2015. Everybody at CERN, whether working on the accelerators or the experiments, is now working hard to complete all the needed tasks in time.
The experimental collaborations are currently making huge efforts on many fronts. One major task is preparing to deal with the increased data volume that the revamped LHC will deliver in 2015.
The LHC will resume operation at higher energy and luminosity, i.e. with more intense beams. For the LHCb experiment, which operates at constant luminosity, the higher energy will translate into more tracks per event and almost twice the signal rate. The situation is similar for the other experiments, ALICE, CMS and ATLAS, but they will also run at higher luminosity, meaning they will have to cope with more collisions occurring simultaneously every time bunches of protons collide in the LHC, making it increasingly difficult to disentangle each recorded event.
To give you an idea, here are three snapshots captured by the ATLAS detector in successive years. The event on the left was recorded at low luminosity at the start of the LHC: very few collisions happened at the same time, yielding very few tracks per event, as seen in the picture.
Then in 2011, the average number of simultaneous collisions increased to around 12 (centre), and it reached up to 40 by the end of 2012 (right). In 2015, there will be between 60 and 80 superimposed collisions in each event, depending on which operating scheme is retained. The challenge will be to extract the collision of interest from the huge quantity of tracks in each event.
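Where do numbers like "60 to 80 superimposed collisions" come from? The average pile-up per bunch crossing follows from the luminosity, the proton-proton inelastic cross-section and the number of circulating bunches. Here is a minimal back-of-the-envelope sketch; all input values are illustrative assumptions on my part, not official LHC parameters.

```python
# Toy estimate of the average pile-up (simultaneous collisions per bunch
# crossing). Every number below is an assumed, illustrative value.

def mean_pileup(luminosity_cm2_s, inelastic_xsec_mb, n_bunches,
                rev_freq_hz=11245.0):
    """mu = L * sigma_inel / (n_bunches * f_rev)."""
    xsec_cm2 = inelastic_xsec_mb * 1e-27  # 1 millibarn = 1e-27 cm^2
    return luminosity_cm2_s * xsec_cm2 / (n_bunches * rev_freq_hz)

# Assumed 50 ns bunch-spacing scheme: fewer bunches, so more collisions
# pile up in each crossing.
mu = mean_pileup(luminosity_cm2_s=1.2e34, inelastic_xsec_mb=80.0,
                 n_bunches=1380)
print(round(mu))  # → 62, in the ballpark of the 60-80 quoted above
```

With more bunches (a 25 ns spacing scheme) the same luminosity is spread over more crossings, which is why the expected pile-up depends on the operating scheme.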
Hence, much effort is being spent improving the simulation, calibration and reconstruction of such events. Physicists are building on existing techniques to cope with the expected data volume.
The picture above shows a zoomed view of an event in the centre of the CMS detector where 78 proton-proton collisions took place simultaneously (the bright dots on the horizontal axis). The scale here is a few centimetres.
Here, each track corresponds to a charged particle, and each and every one of these tracks must be associated with one and only one vertex, namely the point in space where it was created in a proton collision. This way, only the tracks associated with the main collision point will be retained to reconstruct the event.
In the picture above, most tracks come from collisions where the protons barely grazed each other and can be ignored. Only the energetic collisions have a chance to produce the heavy and rare particles we are interested in.
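To make the idea of track-to-vertex association concrete, here is a deliberately simplified sketch. Real reconstruction software performs full three-dimensional fits with track uncertainties; in this toy version, each track is simply assigned to the nearest vertex along the beam axis, and all the positions are made-up illustrative numbers.

```python
# Highly simplified sketch of track-to-vertex association along the beam
# axis (z). Each track is assigned to the closest reconstructed vertex.
# All positions below are hypothetical values for illustration only.

def associate_tracks(track_z, vertex_z):
    """Return, for each track, the index of the nearest vertex in z."""
    return [min(range(len(vertex_z)), key=lambda i: abs(z - vertex_z[i]))
            for z in track_z]

vertices = [-3.2, 0.0, 4.1]           # z positions (cm) of three vertices
tracks = [-3.0, -0.1, 0.2, 3.9, 4.5]  # z positions (cm) of track origins
print(associate_tracks(tracks, vertices))  # → [0, 1, 1, 2, 2]
```

Once every track has a vertex, only those attached to the vertex of interest are kept, and the tracks from the grazing collisions are set aside.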
In parallel, all groups are using the opportunity of the shutdown to replace or repair electronic modules, power supplies and other components that failed or showed signs of deterioration during the past three years. New sub-detectors are even being added to increase the detectors' performance. For example, the CMS collaboration is extending its muon detector coverage and the ATLAS experiment is adding a fourth layer to its pixel detector. LHCb is replacing its beam pipe and ALICE is carrying out major upgrades to its calorimeters.
But the main effort for all LHC experiments is still to finalize all analyses using the full data set collected so far. Everyone seems to be following my mother's advice: we must tirelessly revisit our work until it is perfect ("Cent fois sur le métier, remettez votre ouvrage"). This is precisely what is happening right now. Each aspect of the data analysis is being revisited to reach the full potential of the current data set: calibration, particle identification, background evaluation and signal extraction.
Every collaboration already has dozens of new results ready for the upcoming major summer conferences, such as the European Physical Society meeting in mid-July.
To be alerted of new postings, follow me on Twitter: @GagnonPauline or sign up on this mailing list to receive an e-mail notification.