## Archive for July, 2013

### Giant electromagnet arrives at Fermilab

Tuesday, July 30th, 2013

The 50-foot-wide electromagnet for the Muon g-2 experiment has completed its five-week journey from New York to Illinois.

For the last three nights, a big rig has traveled slowly down the roads of suburban Illinois bearing an American flag and the warning sign “Oversize Load.” The warning may have been an understatement.

Its “load” was a 50-foot, 17-ton electromagnet that, for the last month, has voyaged by land and by sea from Brookhaven National Laboratory on Long Island. Early this morning, it reached its final destination: Fermi National Accelerator Laboratory outside of Chicago.

The electromagnet arrived accompanied by an impressive entourage: a dozen state trooper cars and more than a handful of county sheriffs and local police, plus crews from a company called Roadsafe, which was tasked with removing roadside signs ahead of the convoy and righting them after it passed. It will make its final move across the laboratory site this afternoon.

The logistics of the move have captured imaginations all along the way. But underneath the spectacle is important, potentially groundbreaking science.

The electromagnet is part of what is known as the Muon g-2 experiment. Scientists on the Muon g-2 experiment study short-lived particles called muons, which wobble when placed in a magnetic field due to an internal conflict between some of their characteristics.
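The "wobble" can be made slightly more precise with one standard textbook relation (a sketch from general physics, not drawn from the experiment's own write-up): the muon's spin precesses relative to its momentum at a rate set by its anomalous magnetic moment.

```latex
% Anomalous precession frequency of a muon in a magnetic field B
% (schematic form, neglecting electric-field and relativistic corrections):
\omega_a = a_\mu \, \frac{e B}{m_\mu},
\qquad
a_\mu \equiv \frac{g - 2}{2}
```

Measuring the wobble frequency and the magnetic field pins down $$a_\mu$$ — the "g-2" in the experiment's name.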

In 2001, Brookhaven scientists used the ring to measure that wobble. Using their current understanding of physics, scientists can predict what the wobble should be. If the measured value turns out to be different than expected, it could indicate the presence of new physics.

In the first iteration of this experiment, Brookhaven physicists found hints that the wobble was off. Relocating the experiment to Fermilab will allow it to run in a more intense particle beam (for less money than it would cost to build the experiment anew), giving a more precise answer.

“We’ve been trying for years to really determine whether we’ve discovered something new and exciting,” says Muon g-2 Spokesperson Lee Roberts, who began working on the experiment in 1984. “We’re all excited to see the answer. It’s exciting for me personally, and it’s exciting for science.”

To relocate the magnet, the Muon g-2 team worked for over a year with Emmert International, a company that specializes in moving big, unwieldy objects, to plan the journey, which involved constructing a bright red fixture to hold the ring in place (prompting many observers along the way to compare it to a UFO).

The ring is an exquisitely sensitive device; it cannot be bent or twisted by more than a few millimeters.

“This is one of the widest, most fragile, dimensionally unusual, temperamental projects we’ve done,” says Terry Emmert, owner of Emmert International. “It’s amazing to see it all come together.”

The ring first saw daylight in mid-June, when a team from Emmert slid it out of the building that had housed it since the 1990s. After a week’s rain delay, it traveled six miles along Long Island’s William Floyd Parkway—in about half the expected time of six hours—to the Smith Point Marina, where it was loaded by crane onto a 50-by-150-foot barge and pulled by a pair of tugboats out to sea.

“The crane is just enormous,” says Chris Polly, the Muon g-2 project manager at Fermilab. “It’s a 500-ton capacity crane that’s four or five stories high, so you can pick up this device that weighs 60 tons [with the support structure] and get it on the barge. The whole procedure went pretty smoothly.”

From there, the barge—essentially a giant floating plank—faced 3200 miles of ocean and river waters and a month’s worth of unpredictable summer weather. First it floated from Long Island down the East Coast and around Florida. The barge was forced to camp out for five nights in Norfolk, Virginia, to wait out a passing tempest. After that, it narrowly escaped the brewing Tropical Storm Chantal.

After rounding the tip of Florida, the barge was scheduled to move directly up the Mississippi. Due to heavy currents, which would have caused barges to back up at the locks, the team decided to take a back-roads alternative, maneuvering up the Tombigbee Waterway and the Tennessee River. On July 12, the ring made a stop in Mobile, Alabama, where Trident, the ocean-going tugboat, handed the barge off to Miss Katie, a white tugboat with red trim that coordinated nicely with the red support structure for the electromagnet wrapped in white plastic.

Miss Katie pushed the barge into the Mississippi and ultimately to Lemont, Illinois, where—after a brief pit stop in the wrong port—it greeted a cast of more than 100 scientists, family members and curious onlookers on July 20.

The ring will complete its cross-country trip later today, but its second chance at searching for new physics has just begun.

Laura Dattaro

### The Standard Model checked to the ninth decimal

Tuesday, July 30th, 2013

At the European Physical Society conference in Stockholm, two experiments operating at the Large Hadron Collider (LHC) at CERN, LHCb and CMS, reported solid evidence on July 19 that the Standard Model of particle physics still shows no sign of wear and tear, having checked one of its predictions to the ninth decimal place.

The Standard Model makes very accurate predictions but theorists know this theory has its limits. At higher energy, its equations start breaking down. Theorists are convinced that despite all the success of this model, it is not giving us the big picture. Hence, scientists have been trying to find a “secret passage” to the next level, a more encompassing and more robust theory.

One way to achieve this is to look for a small deviation in a measured quantity from the value predicted by the Standard Model and a good place to find such a deviation is in an extremely rare process. It is much easier to hear a faint noise in a quiet place than in the middle of traffic during rush hour.

Specifically, the scientists measured how often composite particles denoted Bs and Bd (pronounced “b sub s” and “b sub d”) mesons decay into a pair of muons (particles similar to electrons but about 200 times heavier). A Bs meson is a composite particle containing b and s quarks, while Bd mesons are made of b and d quarks. These heavy particles are unstable and quickly break apart into lighter particles.

The Standard Model predicts that Bs mesons decay into a pair of muons about three times in a billion while for Bd mesons, it occurs thirty times less often. This gives two excellent places to look for small deviations that could reveal the existence of new phenomena not foreseen within the Standard Model.

All theories going beyond the Standard Model come with new particles that would affect how other particles decay, i.e. how they break apart. Decays are very much like making change for a big coin. Imagine a one-euro coin. It can be broken into pieces of 1, 5, 10, 20 or 50 cents. Now, say a new 25-cent coin is introduced. A change machine would not give change for one euro in a particular way (say, with coins of 50, 20, 20 and 10 cents) as often as before, simply because new possibilities now exist.
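The change-machine analogy can be made concrete in a few lines of code. The sketch below is purely illustrative (the counting scheme is standard dynamic programming, not anything from the experiments): it counts the distinct ways to break one euro with and without the hypothetical 25-cent coin.

```python
def ways_to_change(total, coins):
    """Count the distinct ways to break `total` into the given coin
    denominations (order does not matter), via dynamic programming."""
    ways = [1] + [0] * total
    for coin in coins:
        for amount in range(coin, total + 1):
            ways[amount] += ways[amount - coin]
    return ways[total]

# One euro, with and without a (hypothetical) 25-cent piece.
without_25 = ways_to_change(100, [1, 5, 10, 20, 50])
with_25 = ways_to_change(100, [1, 5, 10, 20, 25, 50])
```

With the new coin available there are strictly more ways to make change, so any particular breakdown is picked less often — just as a new particle opening new decay possibilities would change how frequently each existing decay occurs.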

By measuring how often a Bs meson decays into muons, scientists were hoping to see the first deviations from the predictions of the Standard Model. Instead, the two experiments confirmed the prediction within experimental errors.

CMS, whose name stands for Compact Muon Solenoid, and LHCb, an experiment designed specifically to study particles containing b quarks, are particularly suited to these types of measurements. CMS measured (3.0 +1.0/−0.9) × 10^−9 and LHCb obtained (2.9 +1.1/−1.0) × 10^−9, while the Standard Model prediction stands at (3.5 ± 0.3) × 10^−9. The significances of the CMS and LHCb signals correspond to 4.3σ and 4.0σ, respectively, which means the observed excesses of events most likely come from signal and not from background. Two other experiments presented new results based on smaller data samples: ATLAS (using a partial data sample) and D0 (the final result with their full data sample) both obtained the same upper limit of 15 × 10^−9.
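As a rough illustration of what "consistent within errors" means here, one can form a naive pull: the difference between measurement and prediction divided by the combined uncertainty. This back-of-envelope version (errors symmetrized and assumed independent — not the experiments' actual likelihood analysis) uses the numbers quoted above, in units of 10^−9:

```python
import math

def pull(measured, err_meas, predicted, err_pred):
    """Difference between measurement and prediction, in units of the
    combined (symmetrized, independent) uncertainty."""
    return (measured - predicted) / math.hypot(err_meas, err_pred)

# CMS: (3.0 +1.0/-0.9) vs Standard Model (3.5 +/- 0.3), in units of 1e-9.
pull_cms = pull(3.0, 1.0, 3.5, 0.3)   # about -0.5: well within one sigma
```

A pull of about half a standard deviation is exactly what "no sign of wear and tear" looks like numerically.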

The results obtained by LHCb and CMS, as well as their combined value, are compared to the prediction from the Standard Model, shown by the vertical black line with its theoretical uncertainty (green band).

For Bd decays, 95% confidence level upper limits were set at 7.4 × 10^−10 by LHCb and 11 × 10^−10 by CMS. The Standard Model predicts this to be less than 1 × 10^−10.

All these values are consistent with the Standard Model predictions, but they do not yet rule out new physics. After the LHC resumes operation at higher energy in 2015, the LHC experiments will continue improving their Bs measurements. In particular, they will aim for a first measurement of the Bd decay rate instead of an upper limit, and then evaluate the ratio of the Bs and Bd rates, in which some of the experimental and theoretical uncertainties cancel out, yielding an even more precise result. Since no deviations were found at the ninth decimal place, the experiments will need to check the tenth.

More details can be found on the CMS and LHCb websites.

Pauline Gagnon

### The Standard Model checked to the ninth decimal place

Tuesday, July 30th, 2013

At the European Physical Society conference in Stockholm, two experiments at CERN’s Large Hadron Collider (LHC), LHCb and CMS, presented solid evidence that the Standard Model of particle physics still shows no sign of fatigue, by pushing the verification of one of the model’s predictions to the ninth decimal place.

The Standard Model makes very precise predictions, but theorists know this theory has its limits. At higher energy, its equations start to break down. Theorists are therefore convinced that despite all the model’s success, it gives us only an incomplete picture of the material world. Hence, scientists are looking for the entrance to a “secret passage” to the next level, a more encompassing and more robust theory.

One way to get there is to look for the slightest deviation from the theoretical predictions, and a good place to find a small deviation is among extremely rare processes. It is much easier to detect a faint murmur in a quiet place than in the middle of rush-hour traffic.

Specifically, scientists measured how often composite particles called Bs and Bd mesons decay into a pair of muons (particles similar to electrons but 200 times heavier). A Bs meson is a composite particle containing a b quark and an s quark, while Bd mesons are made of b and d quarks. These heavy particles are unstable and quickly decay into lighter particles.

The Standard Model predicts that Bs mesons break up into a pair of muons about three times in a billion decays, while for Bd mesons this should happen about thirty times less often. These are therefore two excellent places where new phenomena not foreseen by the Standard Model could create small deviations from the predictions.

All theories going beyond the Standard Model come with new particles. These particles would affect how other particles can decay, that is, how they break apart. A decay is much like making change for a large coin. Imagine a one-euro coin. It can be exchanged for coins of 1, 5, 10, 20 or 50 cents. But if 25-cent coins were introduced, a change machine would no longer give change for one euro as coins of 50, 20, 20 and 10 cents as often as before, because new possibilities would exist.

By measuring how often Bs and Bd mesons decay into muons, scientists hoped to see a deviation from the Standard Model predictions for the first time. Instead, the two experiments confirmed the prediction, at least within the margins of error.

CMS, which stands for Compact Muon Solenoid, and LHCb, an experiment designed specifically to study b quarks, are particularly well suited to this kind of measurement. CMS obtained (3.0 +1.0/−0.9) × 10^−9 and LHCb (2.9 +1.1/−1.0) × 10^−9, while the Standard Model prediction stands at (3.5 ± 0.3) × 10^−9. These correspond to significances of 4.3σ and 4.0σ, so the signals are much more likely to come from real decays than from a fluctuation of the background. Two other experiments presented new results based on smaller data samples: ATLAS (partial data) and D0 (final data) both measured the same upper limit, 15 × 10^−9.

The results obtained by LHCb and CMS for Bs mesons, together with the theoretical prediction of the Standard Model (vertical black line) and its theoretical uncertainty (green band).

For Bd meson decays, the LHCb and CMS collaborations set 95% confidence level upper limits of 7.4 × 10^−10 (LHCb) and 11 × 10^−10 (CMS). The Standard Model prediction lies below 1 × 10^−10.

All these results agree with the Standard Model predictions. After the LHC restarts at higher energy in 2015, the LHC experiments will refine their Bs measurements and try to obtain a first measurement for Bd mesons (and not just a limit). Eventually, they will be able to measure the ratio between the Bs and Bd rates, which will let some experimental and theoretical uncertainties cancel out, giving an even more precise measurement. Since no deviation was detected at the ninth decimal place, we will have to go and see what happens at the tenth.

Full details can be found on the CMS and LHCb websites (in English only).

Pauline Gagnon

To be notified when new posts appear, follow me on Twitter: @GagnonPauline, or by e-mail by adding your name to this mailing list.

### Snowmass: one big happy family

Monday, July 29th, 2013

Let me say this much about the Community Summer Study 2013, also known as “Snowmass on the Mississippi”: it feels like a family reunion. There are about 600 people registered for the meeting, and since in the end we are a small field, I know a lot of them. I’m surrounded by people I grew up with, people I’ve worked with before, people I work with now, and people with whom I’d really like to work someday. I find it a little overwhelming. Besides trying to learn some science, we’re all trying to catch up with each other’s lives and work.

As with any family, we have our differences on some issues. We know that there are diverse views on what the most important issues are and what are the most promising pathways to scientific discovery. But also, as with any family, there is a lot more that unites us than divides us. As Nigel Lockyer, the incoming director of Fermilab, put it, we will probably have little trouble finding consensus on what the important science questions are. Today’s speakers emphasized that we will need to approach these questions with multiple approaches, and there was mutual respect for the work being done in all of the study groups.

The challenge, of course, is how to accommodate all of these approaches within an envelope of finite resources, and how to strike a balance between near-term operations and long-term (if not very long-term) projects. As our speakers from the funding agencies pointed out, we are in a particularly challenging time for this due to national political and fiscal circumstances. Setting priorities will be a difficult job, and one that will only come after the Snowmass study has laid out all the possibilities.

The workshop continues for another eight days, and if you are interested in particle physics and the future of the field, I hope you’ll be keeping an eye on it. The agenda page linked above has a pointer to a live video stream, presentations are also being recorded for future viewing, and various people are tweeting their way along with hashtag #Snowmass or #Snowmass2013. There are a lot of exciting ideas being discussed this week, some of which can have a transformative effect on the field. Stay with us!

### Happy Higgsdependence Day

Sunday, July 28th, 2013

It’s the 4th of July 2013 – Happy Higgsdependence Day! It is exactly one year since the observation of a ‘Higgs-like particle’ was announced at CERN in the auditorium in which I now sit.

I remember stealthily watching the live stream of the announcement at my office desk last year. If you had suggested to me then that one year later I would be working at CERN I would have told you (in a broad Northern Irish accent) to ‘catch yourself on’.

Peter, meet François – Professors Higgs and Englert, whose independent work in the 1960s predicted the boson, meet for the first time on 4 July 2012. Notice the back of the room is filled with last year’s summer students, who camped overnight for seats at the Higgs announcement.

This of course means that the summer lecture series has started in earnest with a gentle introduction to the ‘Particle World’ from Dr Tara Shears. I understand the content of the lecture so am feeling confident, comfortable and optimistic.

Until lecture series Day 2 when Dr James Wells hits us with his lectures on the Standard Model – our best model for explaining the interaction of fundamental particles.

James is an eminent theorist and is clearly at the top of his game. He poetically weaves an intricate mathematical web to explain the subtleties of the Standard Model using an area of mathematics known as Group Theory. I nod along knowingly with the other students but am feeling a bit bamboozled by all the tensors, boosts and Lagrangians.

Next up is the über cool Prof. Cranmer and his jazzy neon shoe laces to teach us all about statistics. (See my last post “Die – electronics, Die” on the importance of statistics at CERN.)

“And that’s the difference between a quark and a fork” – me telling Dr Wells all about the Standard Model.

My favourite lectures however are on physics beyond the Standard Model. The no-nonsense Italian Dr Gian Giudice gives the clearest descriptions of the Higgs, Supersymmetry and dark energy that I have come across, while quirky Professor Gia Dvali reassuringly concludes that black holes are different from elephants. Phew.

After each morning’s three lectures there is a half hour discussion session where the students interrogate the lecturers. While some students nip off for an early lunch I find these sessions often provide the best insight into the day’s topics.

So, set yourself a mad old goal for this time next year – it just might happen…

(You can view the lecture programme and speakers’ slides for this and previous years’ CERN Summer Student Lectures at http://summer-timetable.web.cern.ch/summer-timetable/.)

### From accelerator to art

Wednesday, July 24th, 2013

Fermilab physicist Todd Johnson spends his work and vacation hours with accelerators. What he produces during each are two very different things. Photo: Todd Johnson

Twice a year, Todd Johnson drives 400 miles from the Fermilab campus in Illinois to a commercial polymer crosslinking facility in Ohio, which is generally used to prepare plastic tubing for uses like heating systems in houses. Johnson is there for its linear accelerator, something with which he is quite familiar, given his day job working in Fermilab’s Accelerator Division.

But on these two days a year, Johnson is not using the accelerator for science—although there is a lot of science involved. Johnson is making Lichtenberg figures, fractal patterns that result from the lightning-bolt-like movements of excited electrons. The hobby is a popular one among accelerator scientists, but Johnson says he and the friends he works with are working to explore the limits of the process.

“The end purpose is to do it as art,” Johnson says. “But we also do a lot of experiments to push it further. It’s a technical challenge involving physics and a little mad science, if you’ll pardon the expression. And you have art when you’re done.”

Every six months, Johnson arrives at the facility with stencils laser-cut from steel or handmade from sheet lead; clear acrylic hunks of varying sizes; and a lot of ideas. He sends his pieces of acrylic through the accelerator’s electron beam, which is designed to break chemical bonds in plastics. Because acrylic is an insulating material, the beam scatters through the material, losing momentum as it goes. Only areas of the acrylic not covered by a stencil are exposed to the beam, allowing Johnson to create shapes. Eventually the beam coalesces into a pool of electrons that desperately want to escape but can’t—an invisible puddle of potential energy.

Releasing that energy is a simple but arresting process. To do it, Johnson uses a hand-made tool reminiscent of a crude, oversized syringe. It works like a click pen—press on one end and the tip comes out the other with enough force to puncture the acrylic. The instant the tool punctures the surface, there’s a burst of white light as the pool of excited electrons escapes from the material, leaving trails of vaporized acrylic in its place.

On their way out of the acrylic, the electrons follow the same natural laws that govern all systems that flow—electricity snaking its way from a storm cloud to Earth, rivers branching into ever smaller creeks and streams, or the spidery web of veins that distributes blood throughout your body. Johnson used this property to his advantage when the husband of a pulmonologist contacted him to request a gift for his wife. He used his stencils to create the shape of a pair of lungs filled with electron trails that formed a lifelike system of capillaries.

Johnson, who has worked at Fermilab for three decades, first found out about Lichtenberg figures through a friend at the lab who builds Tesla coils. But the figures weren’t his first foray into the art world. In the 1990s, Johnson was interested in holography and built equipment in his basement to make three-dimensional photographs.

Johnson says he considers his current creative process vastly different from what most artists get to experience: bursts of inspiration, hours of freewheeling improvisation, the luxury of time. Instead, Johnson spends six months conceptualizing and preparing materials, all for the two days per year on which he can see his ideas come to fruition.

And, of course, sometimes they don’t. “We can’t just do something on a whim,” he says. “You really have to plan carefully and, the minute you shut the machine off at the end of the day, you think of things you want to do next time. You think, ‘Well that didn’t work at all, I was sure that was going to work. I’ll do it this way next time,’ but it’s time to go. It’s very frustrating. But it’s part of the excitement.”

Laura Dattaro

See more images of Johnson’s work in symmetry.

### Oh what a beautiful day

Tuesday, July 23rd, 2013

In case you hadn’t heard, the past few days have been big days for B physics, i.e. particle physics involving a b quark. On the 18th and 19th, three results were released in particular, two by LHCb and one by CMS. Specifically, on the 18th LHCb released their analysis of $$B_{(s)}\to\mu\mu$$ using the full 3 fb$$^{-1}$$ dataset, corresponding to 1 fb$$^{-1}$$ of 2011 data at 7 TeV and 2 fb$$^{-1}$$ of 2012 data at 8 TeV. CMS also released their result, using 5 fb$$^{-1}$$ of 7 TeV and 20 fb$$^{-1}$$ of 8 TeV data.

The decay $$B_{(s)}\to\mu\mu$$ cannot proceed via tree-level processes and must go through higher-order processes (shown below)

These analyses have huge implications for SUSY. The decay $$B_{(s)}\to\mu\mu$$ cannot proceed via tree-level processes, as these would involve flavor-changing neutral currents, which are not seen in the Standard Model (picture to the right). The process must therefore proceed at higher order than tree level. In the language of Feynman diagrams, the decay must go through either loop or penguin diagrams, shown below. The corresponding decay rates are then extremely small, about $$3\times10^{-9}$$. Any deviation from this extremely small rate could therefore be New Physics, and many SUSY models are strongly constrained by these branching fractions.

The results reported are:

| Experiment | $$\mathcal{B}(B_{s}\to\mu\mu)$$ | Significance | $$\mathcal{B}(B\to\mu\mu)$$ |
| --- | --- | --- | --- |
| LHCb | $$2.9^{+1.1}_{-1.0} \times 10^{-9}$$ | 4.0$$\sigma$$ | $$<7.4\times 10^{-10}$$ (95% CL) |
| CMS | $$3.0^{+1.0}_{-0.9}\times 10^{-9}$$ | 4.3$$\sigma$$ | $$<1.1\times 10^{-9}$$ (95% CL) |

Higher order diagrams

Both experiments saw an excess of events in the $$B_{s}\to\mu\mu$$ channel, corresponding to $$4.0\sigma$$ for LHCb (updated from the $$3.5\sigma$$ of last year) and $$4.3\sigma$$ for CMS. The combined results will, no doubt, be out very soon. Regardless, as tends to happen with standard model results, SUSY parameter space has continued to be squeezed. Just to get a feel for what’s happening, I’ve made a cartoon of the new results overlaid onto an older picture from D. Straub to see what the effect of the new result would be. SUSY parameter space is not necessarily looking so huge. The dashed line in the figure represents the old result; anything shaded in was therefore excluded. By adding the largest error on the branching fraction of $$B_s\to\mu\mu$$, I get the purple boundary, which moves in quite a bit. Additionally, I overlay the new boundary for $$B\to\mu\mu$$ from CMS in orange and from LHCb in green. An interesting observation is that if you take the lower error for LHCb, the result almost hugs the SM value. I won’t go into speculation, but it is interesting.

Cartoon of updated limits on SUSY from $$B\to\mu\mu$$ and $$B_s\to\mu\mu$$. Orange represents the CMS results and green represents the LHCb results for $$B_s\to\mu\mu$$. Purple is the shared observed upper limit on $$B\to\mu\mu$$. The dashed line is the old limit. Everything outside the box on the bottom left is excluded. Updated from D. Straub (http://arxiv.org/pdf/1205.6094v1.pdf)

Additionally, for a bit more perspective, see Ken Bloom’s Quantum Diaries post.

As for the third result, stay tuned and I’ll write about that this weekend!

### Die – electronics, Die

Monday, July 22nd, 2013

My first project as a CERN summer student was to assemble an electronic die. After a few hours of soldering and burnt fingers I produced this:

My creation

How does it work?

When you tap the die on a table a piezo sensor glued to the base ‘feels’ the impact and sends a small current through the printed circuit board and into a chip.

The chip converts the analog signal into a decimal value (for example a 1.24167 volt signal is converted to the number 1.24167) and reads the least significant decimal place to generate a random number. The LEDs then light up to show a number 1 to 6.
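A minimal sketch of that digit trick in code (the function names and the rejection scheme are my own illustration, not the actual firmware): take the least significant retained decimal digit of a reading, keep it if it is already a valid face, and otherwise move on to the next reading so that all six faces stay equally likely.

```python
def least_significant_digit(voltage, places=5):
    """Last retained decimal digit of an ADC-style voltage reading."""
    return round(voltage * 10**places) % 10

def roll_from_readings(readings):
    """Map noisy voltage readings to a die face.

    Digits 1-6 are used directly; 0, 7, 8 and 9 are rejected, so each
    face is equally likely (assuming the digit itself is uniform).
    """
    for v in readings:
        d = least_significant_digit(v)
        if 1 <= d <= 6:
            return d
    raise ValueError("ran out of readings without a usable digit")
```

For example, a 1.24167 V reading yields the digit 7, which is rejected; a following 1.24163 V reading yields a 3.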

What’s the point?

Over the course of a week, and much to the annoyance of my office mates, I tap the die on my desk 1500 times, compile a data set of the numbers thrown and go about analysing whether the die is fair or biased.

What’s this got to do with CERN then?

Well statistical analysis is key to what CERN does and the discovery of the Higgs is a pertinent example.

In the Large Hadron Collider a Higgs boson is produced by approximately 1 in every 10 million particle collisions. The boson then decays in a fraction of a microsecond while the other collisions produce an array of other particles.

To make the hunt even more tricky, scientists didn’t know the exact mass-energy at which the Higgs would show up, so they had to search across a wide range of energies.

A graph showing the energy at which the Higgs was found.

So searching for the Higgs was like looking for a tiny needle in a massive haystack full of other needles where your needle exists for a minute fraction of a second and you don’t really know what it looks like.

Scientists therefore had to statistically analyse trillions of collisions to be sure that the small bump at 125 GeV in the graph above was the signature of the Higgs and not just an unlikely random fluctuation.

Before announcing the discovery of a ‘Higgs-like particle’ in July 2012, scientists were 99.9999% sure they’d found their boson, i.e. there was only about a one-in-a-million chance that a random fluctuation of the background would produce a signal this strong.
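For reference, the conventional five-sigma discovery threshold corresponds to a one-sided Gaussian tail probability of roughly 3 × 10⁻⁷ — about 1 in 3.5 million — which the round one-in-a-million figure approximates. The conversion is a one-liner:

```python
import math

def one_sided_p(sigma):
    """One-sided Gaussian tail probability for a significance `sigma`."""
    return 0.5 * math.erfc(sigma / math.sqrt(2))

p_five_sigma = one_sided_p(5.0)   # about 2.9e-7
```

The same function gives the familiar 0.5 for zero sigma and about 0.0013 for three sigma, the "evidence" threshold.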

What’s that got to do with the die?

In figuring out whether the die was fair I produced a relatively large data sample then used statistical techniques to conclude with 95% confidence that any bias displayed wasn’t just a random fluke. So in a way the exercise was analogous to the search for the Higgs but on a much smaller scale.
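That fairness check can be sketched as a Pearson chi-square test against the fair-die expectation. The counts below are made up for illustration — they are not my actual 1500 throws. With five degrees of freedom, a statistic above about 11.07 rejects fairness at 95% confidence:

```python
def chi_square_uniform(counts):
    """Pearson chi-square statistic of observed face counts against a
    fair-die (uniform) expectation."""
    n = sum(counts)
    expected = n / len(counts)
    return sum((c - expected) ** 2 / expected for c in counts)

CHI2_CRIT_5DF_95 = 11.07  # 95% critical value for 5 degrees of freedom

counts = [240, 310, 245, 250, 255, 200]  # hypothetical tallies of faces 1-6
stat = chi_square_uniform(counts)
biased = stat > CHI2_CRIT_5DF_95
```

With these hypothetical counts the statistic comes out well above the critical value, so such a die would be declared biased (toward face 2, as it happens).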

Discover anything interesting?

After some intense number crunching, detailed analysis and complex modelling I concluded (drum roll please) – bet on the number 2.

It turned out my supervisor sabotaged my die. So the key lesson learned was – don’t dice with physicists.

### Bs on the frontiers

Monday, July 22nd, 2013

At this week’s EPS conference we have seen the release of new results from both CMS and LHCb on a search for a very rare decay of the Bs meson, to a muon-antimuon pair. I’ve written about this before (yikes, two years ago!); this decay turns out to be amazingly sensitive to possible new physics processes. This is in part because the decay (which violates flavor conservation, to leading order) is highly suppressed in the standard model, and thus the presence of additional particles could have a big impact on the decay rate. Just what impact depends on the particle; depending on the model, the rate for this decay could be either increased or decreased. It’s a somewhat unusual situation — you have something interesting to say either if you see this decay when you don’t expect to (because you don’t yet expect to have sensitivity due to insufficient data), or if you don’t see the decay when you do expect to.

But in this particular case, the standard model wins again. CMS and LHCb now have essentially identical results, both claiming observation of this process at the rate predicted by the standard model. (LHCb had shown the first clear evidence of this decay last November.) I’m not going to claim any great expertise on this topic, but this result should put stronger constraints on theories such as supersymmetry, as it will restrict the possible characteristics of possible SUSY particles. In addition, this observation is the culmination of years of searching for this decay. I reproduce the CMS plot of the history of the searches below; over the course of about 25 years, our ability to detect this decay has improved by a factor of about 10,000.

But here’s what’s really on my mind: I’m thinking about this measurement in the context of the Snowmass workshop, which begins one week from today in Minneapolis. The studies of the workshop have been divided up into categories of “frontiers”, where the physics can fall into Energy, Intensity or Cosmic Frontiers. This categorization arises from the 2008 report of a US HEP program planning committee. It is certainly a useful intellectual organization of the work that we do in particle physics that is easy to explain to people outside the field. The Department of Energy budget for particle physics is now also organized according to these frontiers.

But where exactly does this Bs measurement fit? The physics of quark flavors and the search for rare decays would be considered part of the Intensity Frontier. But the measurements are being done at the LHC, which is considered an Energy Frontier facility because it has the largest collision energy of any accelerator ever built, and the process is sensitive to the effects of putative particles of very high mass. This is just one example of physics measurements that cut across frontiers. Another that comes to mind is that the LHC experiments have sufficient sensitivity to the production of potential dark-matter particles that in some cases, they can be competitive with searches done in non-accelerator experiments that are classified as being in the Cosmic Frontier.

Heading into next week’s workshop, I am hoping that we will be cognizant of the interconnectedness of all the research that we do, regardless of how it might be classified for accounting purposes. We have many ways to explore each of our physics questions, and we need to figure out how to pursue as many of them as possible within the resources that are available.

### An old puzzle solved

Saturday, July 20th, 2013

This morning, at the European Physical Society conference in Stockholm, the LHCb experiment at CERN’s Large Hadron Collider (LHC) presented one more piece of evidence to close the chapter on a strange situation that had kept theorists puzzled for some twenty years.

LHCb presented the most precise measurement to date of the b baryon lifetime. Baryons are particles made of three quarks. For example, protons and neutrons consist of combinations of u and d quarks. What makes b baryons special is that they contain a b quark, a much heavier type of quark. All composite particles containing b quarks, such as B mesons (made of a b quark and either a u or a d quark) and b baryons, are unstable, meaning they have a short lifetime. About a picosecond after being created, they decay into lighter particles.

In theory, B mesons and b baryons should have roughly the same lifetime. But in the 1990s, when CERN was running the LHC’s predecessor, an accelerator called LEP (the Large Electron Positron collider), all the experiments consistently measured a shorter lifetime for b baryons than for B mesons, as can be seen in the plot below. Although the error margins were large, the general trend toward lower values was all the more surprising given that the four experiments (ALEPH, DELPHI, OPAL and L3) worked independently.

The various values measured for the b baryon lifetime over time, with the oldest at the bottom and the most recent, from the LHC, at the top. The measured lifetime is now very close to 1.5 picoseconds, the value measured for B mesons.

This situation prompted several theorists to re-examine their calculations and look for an overlooked effect that could have explained the difference. Despite all their efforts, it was practically impossible to reconcile the measured b baryon lifetime (somewhere between 1.1 and 1.3 picoseconds) with that of B mesons, which stood at about 1.5 picoseconds.

A decade later, D0 and CDF, two experiments at another accelerator, the Tevatron near Chicago, began to close the gap. But it took yet another decade for the LHC experiments to show that, in fact, there is not much difference between the lifetime of b baryons and that of B mesons.

Earlier this year, ATLAS and CMS had both measured values more consistent with the B meson lifetime. With this latest high-precision result from the LHCb experiment, there is now enough evidence to close the case after some twenty years of questioning. LHCb measured the b baryon lifetime to be 1.482 ± 0.018 ± 0.012 picoseconds. The ratio to the B meson lifetime is 0.976 ± 0.012 ± 0.006, a value very close to one, as predicted theoretically.
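As an aside, when a result is quoted with two uncertainties like this, the first is statistical and the second systematic, and they are conventionally combined in quadrature to judge how far the measurement sits from the expectation. A minimal sketch in Python, using the LHCb ratio quoted above and assuming the two uncertainties are uncorrelated:

```python
import math

# LHCb ratio of b baryon lifetime to B meson lifetime:
# R = 0.976 ± 0.012 (stat) ± 0.006 (syst)
r = 0.976
stat, syst = 0.012, 0.006

# Combine the uncorrelated uncertainties in quadrature
sigma = math.sqrt(stat**2 + syst**2)

# Deviation of the measured ratio from the theoretical expectation of 1
pull = (1.0 - r) / sigma
print(f"total uncertainty = {sigma:.4f}")
print(f"deviation from unity = {pull:.1f} sigma")
```

With these numbers the measured ratio lies within about two combined standard deviations of one, i.e. comfortably consistent with the theoretical expectation.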

One possible explanation would be that all the LEP experiments were affected by a common but still unknown systematic error. Or it was simply a statistical fluctuation (i.e. bad luck!). The exact cause may never be identified, but at least the problem is solved. It is a great achievement for the theorists, who now know that their calculations were right all along.

Pauline Gagnon

To be notified when new posts appear, follow me on Twitter: @GagnonPauline, or by e-mail by adding your name to this mailing list