“Travel the world, visit exotic places, see the same hundred people at each of them” is how one of my colleagues describes his life in particle physics. There is some truth to it — given that my collaborators and I are distributed all over the world, we might as well meet up at places other than CERN now and then, and why not go somewhere interesting?
This week I went to the annual “all hands” meeting of the Open Science Grid. The OSG provides the underlying grid middleware that supports LHC computing in the United States, along with many other scientific collaborations that do large-scale, highly distributed computing. One of those collaborations is the Laser Interferometer Gravitational-Wave Observatory (LIGO), so this year the meeting was held at one of LIGO’s experimental sites in Livingston, LA (about a half hour’s drive east of Baton Rouge).
It’s always fun to go see someone else’s physics experiment! With apologies to the people who work on LIGO, here’s how I understand it to work, in just one paragraph. Einstein’s theory of gravity predicts that astronomical systems such as binary stars can emit gravitational waves, which are propagating variations in the fabric of space-time. As they pass through a region of space, they cause the distance between two points to change. So LIGO measures the distance between sets of mirrors that are separated by 4 km, in the hope of seeing that distance vary (and, better still, of correlating such an event with something that can be observed in the sky). But these effects are extremely small: the mirrors will only be displaced by a tiny fraction of the radius of the proton. That means you have to isolate the mirrors from the rest of the environment as well as possible, and then understand all sorts of environmental effects so that you can account for them in your measurements of the distances. This is an extremely difficult experiment, and with the current version of LIGO, the experimenters only expect to see one gravitational-wave event every ten years. They are currently implementing upgrades that will increase their sensitivity to the point where they can observe tens of events (or more) every year. If they can make it work, it will provide a new way to look at the sky that brings complementary information, just as every wavelength of light used in astronomy tells us something new. (OK, it was a long paragraph.)
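For scale, here is a rough back-of-the-envelope version of that “tiny fraction of a proton” claim, using ballpark numbers from public LIGO material rather than anything we were quoted at the meeting. The effect of a passing gravitational wave is usually described as a strain, the fractional change in length it produces:

h = ΔL / L ≈ 10⁻²¹
ΔL ≈ 10⁻²¹ × 4 km ≈ 4 × 10⁻¹⁸ m

Compared with a proton radius of about 10⁻¹⁵ m, that displacement really is only a few thousandths of the size of a proton.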
So we all got to see the two long perpendicular tunnels on the site, although we weren’t able to go inside to see the apparatus itself. We also got to hear about the particular computing challenges that LIGO faces. They produce about a terabyte of data each day, which is a factor of several less than what we would expect out of CMS. Only about 1% of that data is needed to make the distance measurements; the rest is all the information about the environment. What is most important for LIGO is turnaround — they want to be able to analyze the data in as close to real time as possible, so that if they observe something interesting, they can alert other astronomers who can try to make confirming observations.
Of course, this visit wasn’t all fun and games (or crawfish, which is none of the above) for me. We also had the annual get-together of the staff of the seven US CMS Tier-2 computing sites. We do a videoconference every two weeks to keep ourselves informed of what is going on, but it is nice to actually get everyone in the same room once a year; all those conversations that usually happen through email (see my previous complaints) or instant messaging (perhaps even worse!) can happen face to face. This year we spent a lot of time talking about what we can do to improve the reliability of site operations (if anything), our high-level plans for the next year, and what systems we may want to use to manage our disk storage in the future. This last item got a lot of attention; we chose our current system more than four years ago, and people had a lot of enthusiasm for some alternatives that have emerged since then. We’ll see how that turns out!