Physicists did a lot of planning for data analysis before the LHC ever ran, and we’ve put together a huge number of analyses since it started. We’ve already looked for most of the things we’ll ever look for. Of course, many of the things we’ve looked for haven’t shown up yet; in fact, in many cases, including the Higgs, we didn’t expect them to show up yet! We’ll have to repeat the analysis on more data. But that’s got to be easier than collecting and analyzing the data the first time, right? Well, not necessarily. We always hope it will be easier the second or third time around, but the truth is that updating an analysis is a lot more complicated than just putting more numbers into a spreadsheet.
For starters, every time we add new data, it has been collected under different conditions. For example, going from 2011 to 2012, the LHC beam energy will be increasing, and the number of collisions per crossing will be larger too, which means the triggers we use to collect our data are changing as well. All our calculations of what the pileup on top of each interesting collision looks like will have to be redone. Some of our detectors might work better as we fix glitches, or they might work worse as they are damaged in the course of running. All these details affect the calculations for the analysis and the optimal way to put the data together.
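To give a flavor of what “redoing the pileup calculations” can mean in practice, here is a minimal sketch of pileup reweighting: weighting simulated events so that their pileup distribution matches the one seen in the newly recorded data. The histograms and numbers below are made up purely for illustration; a real analysis would take them from the experiment’s own luminosity and simulation records.

```python
import numpy as np

# Hypothetical pileup distributions (fraction of events vs. number of extra
# interactions per crossing): one for simulation made under the old running
# conditions, one for the newly collected data. The values are invented.
mc_pileup   = np.array([0.10, 0.30, 0.35, 0.20, 0.05, 0.00])
data_pileup = np.array([0.02, 0.10, 0.25, 0.33, 0.22, 0.08])

# Per-bin weights: how much more (or less) often each pileup value occurs
# in the new data than in the old simulation. Bins the simulation never
# populated get weight zero.
weights = np.divide(data_pileup, mc_pileup,
                    out=np.zeros_like(data_pileup), where=mc_pileup > 0)

def event_weight(n_interactions):
    """Weight applied to a simulated event with this many extra interactions."""
    return weights[n_interactions]

# Simulated events with 3 extra interactions, common in the new data but
# rarer in the old simulation, get weighted up (0.33 / 0.20 = 1.65).
print(event_weight(3))
```

Every time the running conditions shift, these distributions (and hence the weights applied to every simulated event) have to be recomputed, which is one reason “just add the new data” is never as simple as it sounds.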
But even if we were running under completely stable conditions, there are other reasons an analysis has to be updated as you collect more data. When you have more events to look at, you might want to limit yourself to the events you understand best. (In other words, if an analysis was previously limited by statistical uncertainties, then as those shrink, you want to get rid of your largest systematic uncertainties.) To get all the power out of the new data you’ve got, you might have to study new classes of events, or revisit questions where your old understanding was merely “good enough.”
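Here is a toy illustration of why shrinking statistical uncertainties change your priorities: for a simple counting measurement the relative statistical uncertainty falls like 1/√N, while a fractional systematic uncertainty stays put, so eventually the systematic is all that matters. The 5% systematic and the event counts below are invented for the example and don’t come from any real analysis.

```python
import math

def relative_uncertainties(n_events, frac_systematic):
    """Relative statistical and total uncertainty on a counting measurement
    with N observed events and a fixed fractional systematic uncertainty."""
    stat = 1.0 / math.sqrt(n_events)              # Poisson: sqrt(N) / N
    total = math.sqrt(stat**2 + frac_systematic**2)
    return stat, total

# With 100 events, a 5% systematic barely matters next to the 10% statistical
# uncertainty; with 10,000 events the statistical part is down to 1% and the
# systematic dominates, so reducing it (even by dropping less-well-understood
# events) is what actually improves the measurement.
for n in (100, 10_000):
    stat, total = relative_uncertainties(n, frac_systematic=0.05)
    print(f"N={n:6d}  stat={stat:.3f}  total={total:.3f}")
```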
So analyzing LHC data is really an iterative process. Collecting more data always presents new challenges and new opportunities that require understanding things better than before. No analysis is ever the same twice.
Tags: data, data analysis, LHC, luminosity, pileup, trigger