
Archive for December, 2010

Looking back: A good Year

Friday, December 31st, 2010

So, this is it. In a few hours, 2010 will be history (in quite a large part of the world, about a third to be precise, it already is). After a very long silence, it is high time for a few thoughts… I think it has been a very good year, and a fantastic one for particle physics.

First and foremost, there was the LHC, with fantastic performance. But not only that, the future is also looking good. Jonathan posted this a few days ago, and I guess many of you have heard about it via CERN mailings or the Interactions.org news: Italy is putting up substantial funding for the development of SuperB, a super flavor factory to be based in Italy. And for SuperKEKB, the Japanese super flavor factory that is already well on track, excellent news concerning funding by the Japanese government has been circulating on the mailing lists these past days. So, everything is pointing towards a start of SuperKEKB physics in 2014, less than four years down the road: Busy times ahead for us to get ready in time!

Of course flavor factories alone are not enough to keep particle physics vital in the long run; we thrive on international experiments at the energy frontier, at the very edge of technology. In that respect, too, 2010 has been a good year: A very successful workshop in Geneva in October united the two communities working on the next big project, a linear electron-positron collider. For both the ILC and CLIC, an active research and development phase is currently ongoing (and has seen big steps forward in 2010), with the goal of working out proposals in time for first discoveries at the LHC.

So, what remains? In terms of New Physics, we have entered a dark room that might hold many surprises, but we are still in the process of getting the flashlight going. The fact that we have not yet stubbed our toes on something big in the dark already tells us something, though… Two examples: There are no striking resonances (such as excited quarks) at masses below about 1.2 TeV (ATLAS, http://arxiv.org/abs/1008.2461). There are no simple microscopic black holes (probably an unlikely possibility, but to me at least one of the most exciting: adding gravity to the particle physics menu, hard to beat that!) below about 4 TeV (CMS, http://arxiv.org/abs/arXiv:1012.3375).

For 2011 I have high hopes – the best that could happen, also for future projects and in view of funding cuts threatening the field, is to move from limits and exclusions to evidence and discoveries. Who knows, fantastic surprises might be just around the corner!

With that, time to celebrate 2010 and to welcome 2011, a happy and successful New Year to all of you!


It’s been a busy year at the LHC and in heavy ion physics.

In ALICE we’ve gotten out nine papers on data, including:

The links above are to explanations of the papers. I haven't quite gotten to writing a post on Bose-Einstein correlations, but they are another way of measuring the size of the proton. The spectrum of charged hadrons is basically a measurement of how many particles are created in a collision and how fast they are going. It's been a good year for ALICE! And a good year to be a heavy ion physicist. ATLAS observed jet quenching, and CMS observed the ridge – a feature previously only seen in heavy ion collisions – in proton-proton collisions.

And all of our hard work hasn’t gone unnoticed – The Onion, while declaring Snooki to be one of the most important people of 2010, said that those of us working on the LHC would be more deserving.   And my favorite radio show Wait Wait Don’t Tell Me even mentioned the first lead-lead collisions in one of their Listener Limerick challenges.  (It’s the second time my field came up on the show – the first was when I was a contestant.)

So Merry Quark Mass and Happy Glue Year!


As described in the following press release (see here), the Italian government has confirmed that it will be moving forward with the construction of the high-intensity electron/positron collider known as SuperB.

Expected to produce thousands of B mesons and tau particles every second, it will allow physicists to study the very rare decays of these B mesons, as well as CP violation, to a much higher degree of accuracy than previously possible.

The U.S., meanwhile, is still waiting in the background to learn whether we will even have a say in the future of particle physics. The pending decision on the Tevatron's extended run (see the P5 report recommending the extension here), together with the fact that many of the components for SuperB will come from the short-lived PEP-II experiment at SLAC, only reinforces the impression that while the rest of the world is looking forward to the future of science, the US increasingly looks like it is playing a "wait and see" game.

At the least, this physicist may end up having to look for jobs in Europe and add to the potential brain drain (although in my case a very small drain) facing the US.


It’s clever, but is it peer review?

Saturday, December 25th, 2010

First off, happy holidays to all of our US LHC blog readers. We are finishing off a fabulous 2010 and expect only more exciting times in 2011. There are a lot of mysteries about the year ahead. What will be the center of mass energy of the LHC when the machine starts again in February? How much data will the experiments record? Will it be enough data to say anything about the infamously long-awaited Higgs boson? Or will we discover something totally unexpected? Watch this space to find out.

Now, here is something I’ve been wondering about lately. When I have submitted papers to journals for publication, or worked closely with people who have made submissions, I’ve had to wait a pretty long time, sometimes months, between submission and acceptance by the journal editors. There is some editorial process up front, then the paper has to go to reviewers, who need some time to read it and make thoughtful comments and suggestions (which are typically helpful for improving the paper), and then the authors make revisions before acceptance.

But things have been a little different around the LHC lately. Consider the CERN press release announcing the observation of jet quenching in heavy-ion collisions:

This result is reported in a paper from the ATLAS collaboration accepted for publication yesterday in the scientific journal Physical Review Letters.

The press release is dated November 26. If you look at the paper in PRL, you can see that it was received by the journal on November 25. So in fact the paper was accepted on the very same day that it was submitted.

LHC results have been making it through to publication pretty quickly. The jet-quenching observation was pretty dramatic, for sure, but consider the still significant but less dramatic first measurement of the top-antitop cross section at the LHC, from CMS. This paper made it from submission to acceptance in under a month, including a revision iteration. So what is going on here?

Nothing too crazy, actually. First, journal editors are certainly allowed to waive the usual rules for something that is truly important and timely. The editors of course have the expertise to make their own judgment about the topic described and the quality of the work, and in the case of jet quenching, made the very reasonable decision to publish the result right away. For other papers that still seem to be making it through pretty quickly, I would have to figure that many parties are motivated to get LHC results out. These measurements really are groundbreaking; we haven’t had such a sudden jump in our particle-physics capabilities in a generation. Thus, editors are inclined to process papers promptly, reviewers are inclined to read them quickly, and authors are inclined to make revisions just as soon as they can. It’s good that everyone is so excited about this work. (I suppose another possibility is that my own experience is totally atypical and the rest of the world gets their papers accepted in under a month. I never claimed to write great papers.)

And then one might ask — can one do quality editing and peer review in such a short amount of time? I would be inclined to say yes. Let’s remember that the measurements and the written papers go through an incredible amount of scrutiny within the collaborations before they are submitted to a journal. In CMS, for example, a paper goes through three layers of review (maybe four, depending on how you count) before it is submitted. Tens if not hundreds of (pairs of) eyes have looked at it already, and it is pretty unlikely that any huge mistakes are going to be missed. I believe that the journals are already receiving a quality product as input. And then external review can be done pretty quickly if the reviewers have a moment to focus on it; they are experts in the field, and can understand a paper fairly quickly. This all makes it possible for the latest results to get into your hands promptly.

I should note that these blog posts aren’t peer-reviewed at all. This one will get into your hands just as soon as I click on that button on the right side of my Web page.


Giro d’Italia

Tuesday, December 21st, 2010

Seafront in Naples


For the last week, we have been traveling in Italy, having first visited the University of Rome "Tor Vergata" and now the University "Federico II" in Naples.
While the cold seems to have followed us down from north of the Alps, and we've even seen snowflakes in Rome (rather unusual, we were assured), in Naples we finally met milder temperatures again.

The atmosphere at Italian universities is rather tense these days. At the beginning of the month, Rome saw violent clashes between police and students protesting against the proposed university reforms. As is often the case, the reforms include large budget cuts and are widely feared to endanger the quality of university education – a picture that, unfortunately, presents itself to researchers world-wide these days.
Tomorrow the final vote on the reform will take place; more protests are expected, and everyone is a bit nervous about what is to come.


The Paperless Physicist?

Wednesday, December 15th, 2010

For a few years now, I have been scanning all my more important handwritten calculations and notes. This way, they can be accessed online by my collaborators and myself whenever needed. Unlike some colleagues who proudly display a 30 cm thick layer of papers covering every available surface in their offices, I generally try to avoid mountains of paper in my workflow. I tend to only print out scientific articles I really need to work with, while papers I just want to look at I read on the screen.
Recently, I even eliminated several kilos of old lecture notes and calculations, dating back to my undergrad days. With the help of the fabulous ScanSnap by Fujitsu, it was feasible: this little document scanner has an automatic document feeder and can even scan duplex in one go! While I doubt I'll ever need my calculations from six years ago again, and scanning them was simply a maneuver to be able to let go of the physical objects, I think some of my old lecture notes might actually come in handy. As PDFs, I can just keep them on my laptop and have them always accessible.
Digitizing all my old notes and calculations helped me get rid of a lot of dead weight and made me more mobile, but of course a sound back-up strategy is a must!

Going through these piles of old paper, I naturally started wondering whether it wouldn't be easier to avoid the detour through dead-tree form altogether. Are handwritten notes still a good idea nowadays? Some of my colleagues with tablet PCs have been taking notes directly on their computers for years. These days, the iPad seems to be a good option: relatively small and light, you can use it like a note pad. I have to admit I am tempted. As a PDF reader, too, it seems very convenient.

Yet I wonder if for some serious calculations and problem solving, scribbling on paper is not a necessary step in the process.

Has anyone already tried to switch all note-taking and calculations to the computer? Is it working out for you? I'd like to hear about other people's experiences.

P.S. If you ever intend to scan all your paperwork, do yourself a favor and don’t staple stuff together. You’ll be thanking yourself one day…


Jet quenching

Monday, December 13th, 2010

There have been a lot of exciting results lately and I haven’t gotten a chance to write about them because I’ve been too busy.  Today I’ll tackle jet quenching, which Seth touched on in one of his posts.

You may have done absorption spectroscopy in a chemistry lab.  In absorption spectroscopy, light from a calibrated source passes through a sample and changes in the light after passing through the sample are used to determine the properties of the sample.  For example, you may have a liquid that absorbs blue light but lets orange light through.  This tells you something about the properties of the liquid.  We want something like that for studying the Quark Gluon Plasma (QGP).  Perhaps we could try shining light on the QGP to see what it does to the light, how much is absorbed?  The problem with that is that the QGP formed in a nucleus-nucleus collision doesn't live very long – about 10⁻²⁴ seconds.  Trying to aim light at the QGP would be like trying to hit a fighter plane at top speed with a Nerf gun – by the time you aimed, the plane would be long gone.

Fortunately, photons (light) are created in the lead-lead collisions.  Since they are produced in the collision, we know they went through the QGP, so we can study how they're affected by it to determine its properties.  This is analogous to determining what a store sells by looking at what people have in their shopping bags when they leave, rather than by going into the store yourself.  This is one of the measurements we'll see at some point.  But photons only interact through the electromagnetic force, and many of the features of the QGP we're trying to study come from the interaction of quarks and gluons through the strong force.  To study these properties, we need something like a photon, but that interacts through the strong force.  We can use quarks and gluons.

There are quarks and gluons in the incoming lead nuclei, and a quark or gluon in one nucleus can scatter off of a quark or gluon in the other nucleus.  We’re particularly interested in hard scatterings, where they hit each other and bounce off like billiard balls.  This process happens early in the collision, and then the partons travel through the medium, as shown below:


But there’s a complication.  We can’t see individual quarks and gluons – they’re always bound in hadrons, states made of two quarks (mesons) or three quarks (baryons), a property called confinement.  After the parton gets knocked out of the nucleon, it hadronizes – it breaks up into several mesons and baryons.  These are actually what we observe in our detector.  For each parton, we have a cone of hadrons called a jet.  This is an event display from the STAR experiment showing two jets in a proton-proton collision:

In a proton-proton collision, it’s easy to see jets, but in a heavy ion collision they’re in events like these:

So it’s not as easy to find jets in heavy ion collisions.  One thing we can do is look at very fast moving hadrons.  These are more likely to have come from jets.  This is the subject of the most recent ALICE paper.  This is the main result from that figure:

The x-axis is the momentum of the hadron perpendicular to the beam, called the transverse momentum.  The y-axis is something called R_AA, which is the ratio of the number of hadrons we measure in lead-lead collisions to the number we would expect if a lead-lead collision were just a bunch of nucleon-nucleon collisions.  We take what we measure in proton-proton collisions and scale it by the number of proton-proton, proton-neutron, and neutron-neutron collisions we would expect.  (Yes, I'm skipping lots of technical details about how that scaling is done.)  Another way of putting it is that it's what we get divided by what we expect.  If R_AA were exactly 1.0, it'd mean there's no physics in lead-lead collisions that isn't in proton-proton collisions.  An R_AA less than one means we see way fewer particles than we expect.  In the figure, the open points are what we measure for peripheral collisions, where the nuclei just barely graze each other.  The solid points show what we measure for central – head-on – collisions.  The big, obvious feature is the bump which peaks for particles with a transverse momentum of about 2 GeV/c.  There's a lot of physics in there and it's really interesting but it's not what I'm talking about today.  Look at what it does at higher momenta – above about 5 GeV/c.  This is where we trust our theoretical calculations the most.  (At lower momenta, there's much more theoretical uncertainty in what to expect.)  We see only about 15% of the number of particles we expect to see.  This was already observed at the Relativistic Heavy Ion Collider, but the effect is larger at the LHC.
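To make "what we get divided by what we expect" concrete, here is a minimal sketch in Python (the yields and the scaling factor n_coll below are invented for illustration; this is not the actual ALICE analysis code):

```python
# Minimal sketch of the R_AA definition: the measured Pb-Pb yield divided by
# the p-p yield scaled by the expected number of binary nucleon-nucleon
# collisions (n_coll). All numbers here are made up for illustration.

def r_aa(yield_pbpb, yield_pp, n_coll):
    """R_AA per transverse-momentum bin: (Pb-Pb yield) / (n_coll * p-p yield)."""
    return [pbpb / (n_coll * pp) for pbpb, pp in zip(yield_pbpb, yield_pp)]

# Hypothetical yields in three pT bins (arbitrary units):
pbpb_yield = [1200.0, 90.0, 6.0]    # central lead-lead collisions
pp_yield   = [8.0, 3.0, 0.25]       # proton-proton reference at the same energy
print(r_aa(pbpb_yield, pp_yield, n_coll=1600))  # values well below 1 mean suppression
```

The real measurement also has to propagate the uncertainties on both yields and on the scaling, which is where much of the work goes.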

This happens because the QGP is really, really dense.  It's harder for a parton to get through the QGP than it is to walk through a Target store on the day after Christmas.  The parton loses its energy in the QGP.  Imagine shooting a bullet into a block of lead – it'd just get stuck.

ATLAS’s recent paper exhibits this more directly.  Here’s a lead-lead event where the lead nuclei barely hit each other.  Here you can see two jets, like what you’d expect if neither parton got stuck in the QGP:

The φ axis is the angle around the beam pipe in radians, the η axis (pseudorapidity, η = −ln tan(θ/2), where θ is the polar angle from the beam) is a measure of the angle between the particle and the beam pipe, and the z axis is the amount of energy observed in the calorimeter.  Imagine rolling this plot up into a tube, connecting φ=π to φ=−π, and that would show you roughly where the energy is deposited.  The peaks are from jets, like in the event display from STAR above.  The amount of energy in each peak is about the same – if you added up each block in the peak for both peaks, they'd be about equal.  And here's a lead-lead event where one of the partons got stuck in the medium:

In this plot one of the peaks is missing.  One of the jets is quenched – it got absorbed by the QGP.  This is the first direct observation of jet quenching in a single event.  It’s causing quite a buzz in the field.


Double Chooz is full of it!

Monday, December 13th, 2010

At about 2:00AM in Chooz, France, a dedicated group of physicists finished filling the Double Chooz far detector with the neutrino detecting liquid scintillator.  The filling procedures were carefully executed over the last two months and everyone is very excited to be past this stage.  As a quick reminder, the Double Chooz detectors are concentric cylinders made of acrylic.  Each tank is filled with a different kind of liquid, and so special care needs to be taken in order to ensure that each tank is filled simultaneously.  Even a small difference in the liquid levels could put enough pressure on the acrylic cylinders to crack them.
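For a sense of scale (my own back-of-the-envelope estimate, not a number from the collaboration), the unbalanced pressure produced by a level mismatch Δh is just the hydrostatic difference:

```latex
\Delta P = \rho \, g \, \Delta h
```

With a liquid density of order 1 g/cm³, every 10 cm of mismatch corresponds to roughly a kilopascal pushing on one side of the acrylic wall, which is why the tanks have to be filled in parallel.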

Special thanks is owed to our filling team on site, and to those collaborators who helped monitor the detector remotely while our filling team slept.

There are many more installations needed for the detector to be fully functional, but this is an exciting step forward!


When Feynman Diagrams Fail

Saturday, December 11th, 2010

We’ve gone pretty far with our series of posts about learning particle physics through Feynman diagrams. In our last post we summarized the Feynman rules for all of the known particles of the Standard Model. Now it’s time to fess up a little about the shortcomings of the Feynman diagram approach to calculations; in doing so, we’ll learn a little more about what Feynman diagrams actually represent as well as the kinds of physics that we must work with at a machine like the LHC.

When one diagram isn’t enough

Recall that mesons are bound states of quarks and anti-quarks which are confined by the strong force. This binding force is very non-perturbative; in other words, the math behind our Feynman diagrams is not the right tool to analyze it. Let’s go into more detail about what this means. Consider the simplest Feynman diagram one might draw to describe the gluon-mediated interaction between a quark and an anti-quark:

Easy, right? Well, one thing that we have glossed over in our discussions of Feynman diagrams so far is that we can also draw much more complicated diagrams. For example, using the QCD Feynman rules we can draw something much uglier:

This is another physical contribution to the interaction between a quark and an anti-quark. It should be clear that one can draw arbitrarily many diagrams of this form, each more and more complicated than the last. What does this all mean?

Each Feynman diagram represents a term in a mathematical expression. The sum of these terms gives the complete probability amplitude for the process to occur. The really complicated diagrams usually give a much smaller contribution than the simple diagrams. For example, each additional internal photon line (edit Dec 11, thanks ChriSp and Lubos) gives a factor of roughly α=1/137 to the diagram's contribution to the overall probability. (There are some subtleties here that are mentioned in the comments.) Thus it is usually fine to just take the simplest diagrams and calculate those. The contributions from more complicated diagrams are then very small corrections that are only important to calculate when experiments reach that level of precision. For those with some calculus background, this should sound familiar: it is simply a Taylor expansion. (In fact, most of physics is about making the right Taylor expansion.)
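Schematically (my notation, not taken from the post), the full amplitude is a series in the coupling, and in QED every additional internal photon line costs another power of α:

```latex
\mathcal{M} \;\simeq\; \mathcal{M}_0 \left( 1 + c_1\,\alpha + c_2\,\alpha^2 + \cdots \right),
\qquad \alpha \approx \frac{1}{137}
```

Truncating the series after the first term or two works precisely because α is small.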

However, QCD defies this approximation: it turns out that the simplest diagrams do not give the dominant contribution! Both the simple diagram and the complicated diagram above give roughly the same contribution. One has to include many complicated diagrams to obtain a good approximate calculation. And by "many," I mean almost all of them… and "almost all" of an infinite number of diagrams is quite a lot.  For various reasons, these complicated diagrams are very difficult to calculate, and at the moment our normal approach is useless.

There’s a lot of current research pushing this direction (e.g. so-called holographic techniques and recent progress on scattering amplitudes), but let’s move on to what we can do.

QCD and the lattice

`Surely,’ said I, `surely that is something at my window lattice;
Let me see then, what thereat is, and this mystery explore –
— Edgar Allan Poe, "The Raven"

A different tool that we can use is called Lattice QCD. I can't go into much detail about this since it's rather far from my area of expertise, but the idea is that instead of using Feynman diagrams to calculate processes perturbatively—i.e. only taking the simplest diagrams—we can use computers to numerically solve for a related quantity. This related quantity is called the partition function and is a mathematical object from which one can straightforwardly calculate probability amplitudes. (I only mention the fancy name because it is completely analogous to an object of the same name that one meets in thermodynamics.)

The point is that the lattice techniques are non-perturbative in the sense that we don't calculate individual diagrams; we simultaneously calculate all diagrams. The trade-off is that one has to put spacetime on a lattice, so that the calculations are actually done on a four-dimensional hyper-cube. The accuracy of this approximation depends on the lattice size and spacing relative to the physics that you want to study.  (Engineers will be familiar with this idea from the use of Fourier transforms.) As usual, a picture is worth a thousand words; suppose we wanted to study the Mona Lisa:

The first image is the original. The second image comes from putting the image on a lattice: you see that we lose details about small things. Because things with small wavelengths have high energies, we call this an ultraviolet (UV) cutoff. The third image comes from having a smaller canvas size, so that we cannot see the big picture of the entire image. Because things with big wavelengths have low energies, we call this an IR cutoff. The final image is meant to convey the limitations imposed by the combination of the UV and IR cutoffs; in other words, the restrictions from using a lattice of finite size and finite lattice spacing. A toy version of the same idea is sketched below.
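If it helps to see the analogy in code, here is a toy version using a random 2-D array as the "image" (just an illustration of coarse-graining and cropping, not a piece of any real lattice QCD code):

```python
import numpy as np

# Toy illustration of the two lattice cutoffs on a 2-D "image":
#  - coarse-graining (bigger pixels) throws away small-scale detail -> UV cutoff
#  - cropping (smaller canvas) throws away large-scale structure    -> IR cutoff

rng = np.random.default_rng(0)
image = rng.random((128, 128))      # stand-in for the Mona Lisa

def uv_cutoff(img, block=8):
    """Average over block x block pixels: short-wavelength detail is lost."""
    h, w = img.shape
    trimmed = img[:h - h % block, :w - w % block]
    return trimmed.reshape(h // block, block, w // block, block).mean(axis=(1, 3))

def ir_cutoff(img, keep=32):
    """Keep only a small central window: long wavelengths no longer fit."""
    h, w = img.shape
    return img[(h - keep) // 2:(h + keep) // 2, (w - keep) // 2:(w + keep) // 2]

coarse  = uv_cutoff(image)                       # "fat" pixels only
cropped = ir_cutoff(image)                       # small canvas only
both    = uv_cutoff(ir_cutoff(image), block=4)   # finite spacing and finite size
print(coarse.shape, cropped.shape, both.shape)
```

The coarse-grained array has lost the short-wavelength detail, the cropped one has lost the long-wavelength structure, and a real lattice calculation has to live with both limitations at once.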

If you're interested in only the broad features of the Mona Lisa's face, then the lattice depiction above isn't so bad. Of course, if you are a fine art critic, then the loss of small and large scale information is unforgivable. Currently, lattice techniques have a UV cutoff of around 3 GeV and an IR cutoff of about 30 MeV; this makes them very useful for calculating information about transitions between charm quarks (mass = 1.2 GeV) and strange quarks (mass = 100 MeV).
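Roughly speaking (a standard estimate, not spelled out in the post), a lattice with spacing a and N sites per direction has box size L = Na, and the two cutoffs are set by

```latex
\Lambda_{\mathrm{UV}} \sim \frac{1}{a}, \qquad
\Lambda_{\mathrm{IR}} \sim \frac{1}{L} = \frac{1}{Na}
```

so the quoted 3 GeV and 30 MeV correspond to a spacing of very roughly 0.07 fm and a box of very roughly 7 fm (using ħc ≈ 197 MeV·fm).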

Translating from theory to experiment (and back)

When I was an undergraduate, I was always flummoxed that theorists would draw these deceptively simple looking Feynman diagrams on their chalkboards, while experimentalists had very complicated plots and graphs to represent the same physics. Indeed, you can tell whether a scientific paper or talk has been written by a theorist or an experimentalist based on whether it includes more Feynman diagrams or histograms. (This seems to be changing a bit as the theory community has made a concerted effort over the past decade to learn the lingo of the LHC. As Seth pointed out, this is an ongoing process.)

There's a reason for this: analyzing experimental data is very different from writing down new models of particle interactions. I encourage you to go check out the sample event displays from CMS and ATLAS on the Symmetry Breaking blog for a fantastic and accessible discussion of what it all means. I can imagine fellow bloggers Jim and Burton spending a lot of time looking at similar event displays! (Or maybe not; I suspect that an actual analysis focuses more on data accumulated over many events rather than individual events.) As a theorist, on the other hand, I seem to be left with my chalkboard, connecting squiggly lines to one another. 🙂

Once again, part of the reason why we speak such different languages is non-perturbativity. One cannot take the straightforward Feynman diagram approach and use it when there is all sorts of strongly-coupled gunk flying around. For example, here’s a diagram for electron–positron scattering from Dieter Zeppenfeld’s PiTP 2005 lectures:

The part in black, which is labeled "hard scattering," is what a theorist would draw. As a test of your Feynman diagram skills, see if you can "read" the following: this diagram represents an electron and positron annihilating into a Z boson, which then decays into a top–anti-top pair. The brown lines also show the subsequent decay of each top into a W and an (anti-)bottom quark.

Great, that much we've learned from our previous posts. The big question is: what's all that other junk?! That, my friend, is the result of QCD. You can see that the pink lines are gluons which are emitted from the final state quarks. These gluons can sprout off other gluons or quark–anti-quark pairs. All of these quarks and gluons must then hadronize into color-neutral hadron states, mostly mesons. These are shown as the grey blobs. These hadrons can in turn decay into other hadrons, depicted by yellow blobs. Almost all of this happens before any of the particles reach the detector. Needless to say, there are many, many similar diagrams which should all be calculated to give an accurate prediction.

In fact, for the LHC it's even more complicated, since even the initial states are colored and so they also spit off gluons ("hadronic junk"). Here's a picture just to show how ridiculous these processes look at a particle-by-particle level:

Let me just remark that the two dark gray blobs are the incoming protons. The big red blob represents all of the gluons that these protons emit. Note that the actual “hard interaction,” i.e. the “core process” is gluon-gluon scattering. This is a bit of a subtle point, but at very high energies, the actual point-like objects which are interacting are gluons, not the quarks that make up the proton!

All of this hadronic junk ends up being sprayed through the experiments' detectors. If some of the hadronic junk originates from a high-energy colored particle (e.g. a quark that came from the decay of a new heavy TeV-scale particle), then it is collimated into a cone of hadrons pointing in roughly the same direction, called a jet (image from Gavin Salam's 2010 lectures at Cargese):

Some terminology: parton refers to either a quark or gluon, LO means “leading-order”, NLO means “next-to-leading order.” The parton shower is the stage in which partons can radiate more low-energy partons, which then confine into hadrons. Now one can start to see how to connect our simple Feynman diagrams to the neat looking event reconstructions at the LHC: (image from Gavin Salam’s lectures again)

Everything except for the black lines is an example of what one would actually read off of an event display. This is meant to be a cross-section of the interaction point of the beamline. The blue lines come from a tracking chamber, basically layers of silicon chips that detect the passage of charged particles. The yellow and pink bars are readings from the calorimeters, which tell how much energy is deposited into chunks of dense material. Note how 'messy' the event looks experimentally: all of those hadrons obscure the so-called hard scattering (edit Dec 11, thanks to ChriSp), which is what we draw with Feynman diagrams.

So here's the situation: theorists can calculate the "hard scattering" or "underlying event" (the black lines in the two diagrams above), but all of the QCD-induced stuff that happens after the hard scattering is beyond our Feynman diagram techniques and cannot be calculated from first principles. Fortunately, most of the non-perturbative effects can again be accounted for using computers. The real question is: given a hard scattering (a Feynman diagram), how often do the final state particles turn into each of a range of different hadron configurations? This time one uses Monte-Carlo techniques, where instead of calculating the probabilities of each hadronic final state, the computer randomly generates these final states according to some pre-defined probability distribution. If we run such a simulation over and over again, then we end up with a simulated distribution of events which should match experiment relatively well.
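As a cartoon of that Monte-Carlo step (a toy with invented final states and probabilities; real generators such as PYTHIA are vastly more sophisticated), here is the idea in Python:

```python
import random
from collections import Counter

# Toy Monte-Carlo hadronization step: for each simulated hard scattering,
# draw a hadronic final state from a pre-defined probability distribution.
# The states and probabilities below are invented for illustration.
final_states = [
    ("2 jets, nothing else", 0.50),
    ("2 jets + soft hadronic junk", 0.35),
    ("3 jets", 0.15),
]

def generate_event():
    """Pick one final state according to the pre-defined probabilities."""
    r, cumulative = random.random(), 0.0
    for state, prob in final_states:
        cumulative += prob
        if r < cumulative:
            return state
    return final_states[-1][0]  # guard against floating-point rounding

# Repeat many times to build up the simulated distribution of events
# that gets compared with what the detector actually records.
sample = Counter(generate_event() for _ in range(100_000))
print(sample)
```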

One might wonder why this technique should work. It seems like we're cheating—where did these "pre-defined" probability distributions come from? Aren't these what we wanted to calculate in the first place? The answer is that these probability distributions come from experiments themselves. This isn't cheating, since the experiments reflect data about low-energy physics. This is well known territory that we really understand. In fact, everything in this business of hadronic junk is low-energy physics. The whole point is that the only missing information is the high-energy hard scattering (ed. Dec 11)—but fortunately that's the part that we can calculate! The fact that this works is a straightforward result of "decoupling," or the idea that physics at different scales shouldn't affect one another. (In this case physicists often say that the hadronic part of the calculation "factorizes.")

To summarize: theorists can calculate the hard scattering (ed. Dec 11) for their favorite pet models of new physics. This is not the whole story, since it doesn't reflect what's actually observed at a hadron collider. It's not possible to calculate what happens next from first principles, but fortunately this isn't necessary: we can just use well-known probability distributions to simulate many events and see what the model of new physics predicts for a large data set from an actual experiment. Now that we're working our way into the LHC era, clever theorists and experimentalists are working on new ways to go the other way around and use the experimental signatures to try to recreate the underlying model.

As a kid I remember learning over and over again how a bill becomes a law. What we’ve shown here is how a model of physics (a bunch of Feynman rules) becomes a prediction at a hadron collider! (And along the way we’ve hopefully learned a lot about what Feynman diagrams are and how we deal with physics that can’t be described by them.)


On the Road Again

Thursday, December 9th, 2010

Heisenberg's head

Just like when I first started blogging for the Quantum Diaries one year ago, I am again on my way traveling through Europe. After 12 hours of flight and a temperature drop of 15 degrees, I am slowly getting used to Europe again (even though the temperature shock seems to have taken its toll).
This week we're in Munich, where I spent my PhD years. I'm refreshing old friendships (even though most of the people who were there with me are now scattered around the globe) and enjoying the Christmas mood of the town (with real snow as decoration).
We're shuttling back and forth between the Arnold Sommerfeld Center of Ludwig Maximilian University and the Max Planck Institute for Physics, also known as the "Werner Heisenberg Institute". There, they have put us in the Heisenberg office. Maybe spending time in this room will transfer some scientific insight and inspiration to me? One can always hope…
