In particle physics, we’re often looking for very rare phenomena, which are highly unlikely to happen in any given particle interaction. Thus, at the LHC, we want to have the greatest possible proton collision rates; the more collisions, the greater the chance that something unusual will actually happen. What are the tools that we have to increase collision rates?
Remember that the proton beams are “bunched” — there isn’t a continuous current of protons in a beam, but a series of smaller bunches of protons, each only a few centimeters long, with gaps of many centimeters between bunches. The beams are then timed so that bunches from each beam pass through each other (“cross”) inside one of the big detectors. A given bunch can have about 10^11 protons in it, and when two bunches cross, perhaps a few tens of protons in each bunch — a tiny fraction! — will interact. This bunching is actually quite important for the operation of the detectors: because we know when bunches are crossing, and thus when collisions can happen, we know exactly when the detectors should be “on” to record the data.
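To see just how tiny that fraction is, here’s a back-of-the-envelope sketch in Python (the interaction count of 25 per crossing is an assumed, illustrative value standing in for “tens”):

```python
# Back-of-the-envelope: what fraction of a bunch's protons interact
# in a single crossing? Numbers are illustrative.
protons_per_bunch = 1e11          # a bunch can have ~10^11 protons
interactions_per_crossing = 25    # "tens" of interactions (assumed value)

# Each proton-proton interaction uses one proton from each bunch, so the
# fraction of a bunch's protons that interact in one crossing is:
fraction = interactions_per_crossing / protons_per_bunch
print(f"{fraction:.1e}")  # a few parts in 10^10 -- a truly tiny fraction
```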
Suppose you wanted to achieve a given total collision rate. The rate of collisions in a single bunch crossing scales as the product of the number of protons in the two bunches — every proton in one bunch has a chance to interact with every proton in the oncoming bunch — so the total rate scales as the number of bunches times the square of the protons per bunch. That means two configurations give the same number of collisions: N bunches per beam, each with M protons, or 2N bunches per beam, each with M/sqrt(2) protons. The more bunches in the beam, the more closely spaced they have to be, but that can be done. From the perspective of the detectors, the second scenario is much preferred. Because each bunch has fewer protons, you get half as many proton collisions per bunch crossing, and thus fewer particles streaming through the detectors at once. The collisions are much easier to interpret when there are fewer of them per crossing; among other things, you need less computer processing time to reconstruct each event, and you make fewer mistakes in the event reconstruction because there aren’t so many particles all on top of each other.
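The scaling argument can be checked with a few lines of Python. The bunch and proton counts below are illustrative placeholders, not actual LHC parameters:

```python
import math

# Total collision rate scales as (number of bunches) x (protons per bunch)^2,
# since each proton in one bunch can interact with each proton in the other.
# Numbers below are illustrative, not actual LHC parameters.

def relative_collision_rate(n_bunches, protons_per_bunch):
    """Relative total collision rate, up to a common geometric factor."""
    return n_bunches * protons_per_bunch ** 2

N = 1000        # bunches per beam (illustrative)
M = 1.0e11      # protons per bunch (illustrative)

scenario_a = relative_collision_rate(N, M)                     # N bunches of M
scenario_b = relative_collision_rate(2 * N, M / math.sqrt(2))  # 2N bunches of M/sqrt(2)

# Same total collision rate...
print(scenario_b / scenario_a)            # ~1.0
# ...but only half the collisions per individual crossing:
print((M / math.sqrt(2)) ** 2 / M ** 2)   # ~0.5
```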
In the previous LHC run (2010-12), the accelerator had “50 ns spacing” between proton bunches, i.e. bunch crossings took place every 50 ns. But over the past few weeks, the LHC has been working on running with “25 ns spacing,” which would allow the beam to be segmented into twice as many bunches, with fewer protons per bunch. It’s a new operational mode for the machine, and thus some amount of commissioning and tuning and so forth is required. A particular concern is “electron cloud” effects, in which stray particles in the beampipe strike the walls and eject more particles — an effect that grows as the bunch spacing shrinks. But from where I sit as one of the experimenters, it looks like good progress has been made so far, and as we go through the rest of this year and into next year, 25 ns spacing should be the default mode of operation. Stay tuned for what physics we’re going to be learning from all of this!
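Halving the spacing doubles the maximum possible crossing rate, as a quick conversion shows (this is an upper bound; real fill patterns leave gaps in the beam, so the actual number of filled bunch slots is lower):

```python
def max_crossing_rate_hz(bunch_spacing_ns):
    """Maximum bunch-crossing rate if every slot in the beam were filled.
    Real fill patterns have gaps, so the true rate is somewhat lower."""
    return 1.0 / (bunch_spacing_ns * 1e-9)

print(max_crossing_rate_hz(50))  # 50 ns spacing: ~2e7 crossings/s (20 MHz)
print(max_crossing_rate_hz(25))  # 25 ns spacing: ~4e7 crossings/s (40 MHz)
```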