Posts Tagged ‘CERN’

While everybody is excited by the coming “phase 2” of the LHC, someone else is already looking beyond it, thinking: “what are the possible future scenarios for our beloved Large Hadron Collider?”

The community of “phenomenologists”, theorists who like to play with data, collaborates closely with experimentalists to plan new experiments. We hope to get the most out of a given set-up while already thinking about future stages and improvements.

In recent months there has been a lot of interest in a proposal for a new experiment at the LHC: “AFTER@LHC”, namely A Fixed Target ExpeRiment at the LHC. This means that we would not have particles running in opposite directions within two rings (the collider setting) and crashing head-on; rather, there would be just one ring in which particles run coherently and are then extracted by means of a crystal and smashed against a fixed target, like hitting a wall.


You may actually wonder: “Why should I prefer this to the super nice, Nobel-prize-generating collider?”

In the LHC, protons are accelerated to approximately the speed of light and collide around the ring. Protons are made of quarks and gluons, so each proton-proton collision can be interpreted as a collision among their elementary constituents. In particular, since gluons are the most relevant constituents at LHC energies, the LHC can be thought of as a collider of gluons.
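To make the “collider of gluons” picture concrete: each parton carries only a fraction x of its proton’s momentum, so the effective parton-parton collision energy is sqrt(x1·x2)·sqrt(s), well below the nominal proton-proton energy. A minimal sketch, where the momentum fractions are illustrative values, not measured ones:

```python
import math

def parton_cm_energy(x1: float, x2: float, sqrt_s: float) -> float:
    """Effective centre-of-mass energy of a parton-parton collision:
    x1, x2 are the momentum fractions carried by the two partons and
    sqrt_s is the proton-proton collision energy (same units out)."""
    return math.sqrt(x1 * x2) * sqrt_s

# Two gluons, each carrying 1% of its proton's momentum, at the 8 TeV LHC:
print(parton_cm_energy(0.01, 0.01, 8000.0))  # ~80 GeV
```

This is why typical gluon-gluon collisions probe energy scales far below the nominal beam energy.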

As I partly discussed in a previous post, we can study the structure of the proton with 3D probability distributions (transverse-momentum-dependent distributions, TMDs), which give access to all the possible spin and momentum configurations of the constituents. For example, quarks and gluons can be investigated with or without their spin state, and the proton they live in can be polarized or not. There are several such combinations, and each one represents a fundamental piece in the puzzle of the proton’s structure.

The LHC currently runs with beams of unpolarized protons only, meaning we do not consider their spin in analyses. For those who want to investigate the puzzle of the proton’s structure, this is a limitation: we can access only two of the eight (under certain assumptions) polarization configurations, namely unpolarized and linearly polarized gluons. That leaves six options we don’t get to study!

In this table the eight available TMD (transverse-momentum-dependent) distributions shaping the physics of (un)polarized gluons inside (un)polarized protons are listed. At the LHC we can access the first row only; at AFTER, more combinations will be investigated.
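The counting behind the table can be sketched as simple bookkeeping: group the gluon distributions by the polarization state of the parent proton (unpolarized, longitudinal, transverse). The grouping below follows the standard leading-twist TMD classification; the specific labels are conventional shorthand, not taken from this post:

```python
# Leading-twist gluon TMDs grouped by proton polarization state:
# U = unpolarized, L = longitudinally polarized, T = transversely polarized.
# The names are conventional labels (e.g. "Sivers" for f1T_perp).
gluon_tmds = {
    "U": ["f1 (unpolarized gluons)", "h1_perp (linearly polarized gluons)"],
    "L": ["g1L (helicity)", "h1L_perp"],
    "T": ["f1T_perp (Sivers)", "g1T", "h1T", "h1T_perp"],
}

total = sum(len(row) for row in gluon_tmds.values())
print(total)                 # 8 distributions in all
print(len(gluon_tmds["U"]))  # 2: the only row an unpolarized beam can probe
```

With unpolarized LHC beams only the “U” row is accessible; a polarized fixed target would open up the “L” and “T” rows.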

And here is the answer to our question: the fixed target at AFTER could easily be polarized, allowing us to study the physics of gluons inside polarized protons, which is impossible at the present collider! There is only one other machine in the world where hadrons can be polarized: the Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Lab.

For this reason, AFTER could access novel phenomena intrinsically related to the polarization of hadrons and, at the same time, allow us to study processes already available at the LHC but in different physical regions. For example, there is the possibility of accessing the simple 1D probability distributions in a region where they are still poorly known.

A particularly interesting observable that AFTER could look at is the so-called “Sivers” distribution for gluons, namely the probability of extracting unpolarized gluons from a proton whose spin is transverse to the direction of the beam. Some of its core features cannot be calculated from first principles, so a good way to explore it is to extract it from experimental data. In recent years physicists have seen indications that the Sivers effect for gluons could be small, but experimental insight from AFTER would be really important.

As you can see, there could be a lot of cool physics going on. We are in the early stages, where all the possible constraints (including economic ones) need to be taken into account and where a strong scientific case is fundamental.

When you try to give birth to an experiment you face a lot of questions, like: “What is a realistic estimate of its scientific impact? Do we really need a new machine?” Some of these questions have already been addressed, and the answers are collected in scientific publications, which you can partly find here.

If everything goes according to plan and desires, AFTER@LHC will bring very good insight and contributions to the study of the proton structure: stay tuned for updates!


I don’t usually get to spill the beans on a big discovery like this, but this time, I DO!

CERN Had Dark Energy All Along!!

That’s right. That mysterious energy making up ~68% of the universe was being used all along at CERN! Being based at CERN now, I’ve had a first-hand glimpse into the dark underside of Dark Energy. It all starts at the Crafted Refilling of Empty Mugs Area (CREMA), pictured below.

One CREMA station at CERN


Researchers and personnel seem to stumble up to these stations at almost all hours of the day, looking very dreary and dazed. They place a single cup below the spouts, and out comes a dark and eerie looking substance, which is then consumed. Some add a bit of milk for flavor, but all seem perkier and refreshed after consumption. Then they disappear from whence they came. These CREMA stations seem to be everywhere, from control rooms to offices, and are often found with groups of people huddled around them. In fact, they seem to exert a force on all who use them, keeping them in stable orbits about the stations.

In order to find out a little bit more about this mysterious substance and its dispersion, I asked a graduating student, who wished to remain unnamed, a little bit about their experiences:

Q. How much of this dark stuff do you consume on a daily basis?

A. At least one cup in the morning to fuel up, I don’t think I could manage to get to lunchtime without that one. Then multiple other cups distributed over the day, depending on the workload. It always feels like they help my thinking.

Q. Do you know where it comes from?

A. We have a machine in our office which takes capsules. I’m not 100% sure where those capsules are coming from, but they seem to restock automatically, so no one ever asked.

Q. Have you been hiding this from the world on purpose?

A. Well, our stock is important to our group; if we just shared it with everyone around, we could run out. And none of us can make it through the day without it. We tried alternatives, but none are as effective.

Q. Do you remember the first time you tried it?

A. Yes, they hooked me on it in university. From then on nothing worked without!

Q. Where does CERN get so much of it?

A. I never thought about this question. I think I’m just happy that there is enough for everyone here, and physicists need quite a lot of it to work.

In order to gauge just how much of this Dark Energy is being consumed, I studied the flux of people leaving the cafeteria with cups of Dark Energy as a function of time. I’ve compiled the results into the Dark Energy Consumption As Flux (DECAF) plot below.

Dark Energy Consumption as Flux plot. Taken March 31, 2015. Time is given in 24h time. Errors are statistical.


As the DECAF plot shows, there is a large spike in consumption, particularly after lunch. There is a clear peak at times after 12:20 and before 13:10. Whether there is an even larger peak hiding above 13:10 is not known, as the study stopped due to my advisor asking “shouldn’t you be doing actual work?”

There is an irreducible background of Light Energy in the cups used for Dark Energy, particularly of the herbal variety. Fortunately, there is often a dangly tag hanging off the cup to indicate to others that they are not using the precious Dark Energy supply, providing a clear signal for this study to eliminate the background.

While illuminating, this study still does not uncover the exact nature of Dark Energy, though it is clear that it is fueling research here and beyond.


The Ties That Bind

Sunday, January 18th, 2015
Cleaning the ATLAS Experiment

Beneath the ATLAS detector – note the well-placed cable ties. IMAGE: Claudia Marcelloni, ATLAS Experiment © 2014 CERN.

A few weeks ago, I found myself in one of the most beautiful places on earth: wedged between a metallic cable tray and a row of dusty cooling pipes at the bottom of Sector 13 of the ATLAS Detector at CERN. My wrists were scratched from hard plastic cable ties, I had an industrial vacuum strapped to my back, and my only light came from a battery powered LED fastened to the front of my helmet. It was beautiful.

The ATLAS Detector is one of the largest, most complex scientific instruments ever constructed. It is 46 metres long, 26 metres high, and sits 80 metres underground, completely surrounding one of four points on the Large Hadron Collider (LHC), where proton beams are brought together to collide at high energies. It is designed to capture remnants of the collisions, which appear in the form of particle tracks and energy deposits in its active components. Information from these remnants allows us to reconstruct properties of the collisions and, in doing so, to improve our understanding of the basic building blocks and forces of nature.

On that particular day, a few dozen of my colleagues and I were weaving our way through the detector, removing dirt and stray objects that had accumulated during the previous two years. The LHC had been shut down during that time, in order to upgrade the accelerator and prepare its detectors for proton collisions at higher energy. ATLAS is constructed around a set of very large, powerful magnets, designed to curve charged particles coming from the collisions, allowing us to precisely measure their momenta. Any metallic objects left in the detector risk turning into fast-moving projectiles when the magnets are powered up, so it was important for us to do a good job.

ATLAS Big Wheel

ATLAS is divided into 16 phi sectors with #13 at the bottom. IMAGE: Steven Goldfarb, ATLAS Experiment © 2014 CERN

The significance of the task, however, did not prevent my eyes from taking in the wonder of the beauty around me. ATLAS is shaped somewhat like a large barrel. For reference in construction, software, and physics analysis, we divide the angle around the beam axis, phi, into 16 sectors. Sector 13 is the lucky sector at the very bottom of the detector, which is where I found myself that morning. And I was right at ground zero, directly under the point of collision.
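The sector bookkeeping is just modular arithmetic: phi is divided into 16 equal wedges of 2π/16 radians each. A toy mapping follows; the offset and direction of the official ATLAS numbering are assumptions here, chosen so that the wedge pointing straight down comes out as sector 13:

```python
import math

N_SECTORS = 16
WEDGE = 2 * math.pi / N_SECTORS  # ~0.39 rad per sector

def phi_sector(phi: float) -> int:
    """Map an azimuthal angle phi (radians) to a sector number 1..16.
    Illustrative numbering, not the official ATLAS convention."""
    return int((phi % (2 * math.pi)) // WEDGE) + 1

# A direction pointing roughly straight down (phi just past -pi/2):
print(phi_sector(-math.pi / 2 + 0.1))  # -> 13
```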

To get to that spot, I had to pass through a myriad of detector hardware, electronics, cables, and cooling pipes. One of the most striking aspects of the scenery is the ironic juxtaposition of construction-grade machinery, including built-in ladders and scaffolding, with delicate, highly sensitive detector components, some of which make positional measurements to micron (thousandth of a millimetre) precision. All of this is held in place by kilometres of cable trays, fixings, and what appear to be millions of plastic (sometimes sharp) cable ties.

Inside the ATLAS Detector

Scaffolding and ladder mounted inside the precision muon spectrometer. IMAGE: Steven Goldfarb, ATLAS Experiment © 2014 CERN.

The real beauty lies not in the parts themselves, but rather in the magnificent stories of international cooperation and collaboration that they tell. The cable tie that scratched my wrist secures a cable that was installed by an Iranian student from a Canadian university. Its purpose is to carry data from electronics designed in Germany, attached to a detector built in the USA and installed by a Russian technician.  On the other end, a Japanese readout system brings the data to a trigger designed in Australia, following the plans of a Moroccan scientist. The filtered data is processed by software written in Sweden following the plans of a French physicist at a Dutch laboratory, and then distributed by grid middleware designed by a Brazilian student at CERN. This allows the data to be analyzed by a Chinese physicist in Argentina working in a group chaired by an Israeli researcher and overseen by a British coordinator.  And what about the cable tie?  No idea, but that doesn’t take away from its beauty.

There are 178 institutions from 38 different countries participating in the ATLAS Experiment, which is only the beginning.  When one considers the international make-up of each of the institutions, it would be safe to claim that well over 100 countries from all corners of the globe are represented in the collaboration.  While this rich diversity is a wonderful story, the real beauty lies in the commonality.

All of the scientists, with their diverse social, cultural and linguistic backgrounds, share a common goal: a commitment to the success of the experiment. The plastic cable tie might scratch, but it is tight and well placed; its cable is held correctly and the data are delivered, as expected. This enormous, complex enterprise works because the researchers who built it are driven by the essential nature of the mission: to improve our understanding of the world we live in. We share a common dedication to the future, we know it depends on research like this, and we are thrilled to be a part of it.

ATLAS Collaboration Members

ATLAS Collaboration members in discussion. What discoveries are in store this year? IMAGE: Claudia Marcelloni, ATLAS Experiment © 2008 CERN.

This spring, the LHC will restart at an energy level higher than any accelerator has ever achieved before. This will allow the researchers from ATLAS, as well as the thousands of other physicists from partner experiments sharing the accelerator, to explore the fundamental components of our universe in more detail than ever before. These scientists share a common dream of discovery that will manifest itself in the excitement of the coming months. Whether or not that discovery comes this year or some time in the future, Sector 13 of the ATLAS detector reflects all the beauty of that dream.


United for peace

Monday, January 12th, 2015

The past week saw extremely sad events in Paris, reminding us that our society relies on a fragile equilibrium. This is just the most recent in a long list of such events around the world over the last years, including in Amsterdam, the city where I now live.

The mass media have flooded us with analyses, considerations, speeches and public actions. I don’t think it is necessary to add more here, because what we mostly need is time to think: about ourselves as individuals and as active parts of a complex society.

Nevertheless, I would like to remind myself – and everyone who will read these thoughts – about what we can do as men and women of science. Even though fear and anger may knock at our doors, we need to find what could keep us united across different countries, cultures, religions and faiths. And fight for it.

As scientists, we are privileged: our job is to generate knowledge, the common heritage of mankind. Science is a universal endeavor involving people from every country, social background and culture. No matter what we think and believe, we collaborate daily to reach a high goal. Science, like any other intercultural enterprise, is training for peace, and we are in extreme need of it, and of anything else that keeps us united in purity of interests, freedom and friendship.

The “tree of peace” in The Hague (NL), which carries people’s wishes for a better and peaceful world.

The quest for peace is not just a hand-waving argument, nor a fantasy of hopeful people: it is clearly stated even in the founding documents of CERN – the European Organization for Nuclear Research – signed by the founding members and shared by every single scientist working and studying there.

I. I. Rabi, an American scientist and one of the first supporters of CERN, marked the 30th anniversary of CERN’s founding with these words(*): “I hope all the scientists at CERN will remember to have more duties than just doing research in particle physics. They represent the results of centuries of research and study, showing the powers of the human mind. I hope they will not consider themselves technicians, but guardians of the European unity, so that Europe can protect peace in the world.”

Let’s build together a future of peace: we can do it.

(*) translated from the Italian version available here.


This Fermilab press release came out on Oct. 20, 2014.

ESnet to build high-speed extension for faster data exchange between United States and Europe. Image: ESnet

Scientists across the United States will soon have access to new, ultra-high-speed network links spanning the Atlantic Ocean thanks to a project currently under way to extend ESnet (the U.S. Department of Energy’s Energy Sciences Network) to Amsterdam, Geneva and London. Although the project is designed to benefit data-intensive science throughout the U.S. national laboratory complex, heaviest users of the new links will be particle physicists conducting research at the Large Hadron Collider (LHC), the world’s largest and most powerful particle collider. The high capacity of this new connection will provide U.S. scientists with enhanced access to data at the LHC and other European-based experiments by accelerating the exchange of data sets between institutions in the United States and computing facilities in Europe.

DOE’s Brookhaven National Laboratory and Fermi National Accelerator Laboratory—the primary computing centers for U.S. collaborators on the LHC’s ATLAS and CMS experiments, respectively—will make immediate use of the new network infrastructure once it is rigorously tested and commissioned. Because ESnet, based at DOE’s Lawrence Berkeley National Laboratory, interconnects all national laboratories and a number of university-based projects in the United States, tens of thousands of researchers from all disciplines will benefit as well.

The ESnet extension will be in place before the LHC at CERN in Switzerland—currently shut down for maintenance and upgrades—is up and running again in the spring of 2015. Because the accelerator will be colliding protons at much higher energy, the data output from the detectors will expand considerably—to approximately 40 petabytes of raw data per year, compared with 20 petabytes for all of the lower-energy collisions produced over the three years of the LHC’s first run between 2010 and 2012.
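A back-of-the-envelope conversion puts that data volume in network terms: 40 petabytes per year, spread evenly, corresponds to an average of roughly 10 gigabits per second, before any replication between sites. The arithmetic below is mine, not from the press release:

```python
# Convert a yearly raw-data volume into an average sustained network rate.
PB = 1e15                      # bytes in a (decimal) petabyte
SECONDS_PER_YEAR = 365.25 * 24 * 3600

yearly_bytes = 40 * PB         # expected LHC raw-data volume per year
avg_rate_gbps = yearly_bytes * 8 / SECONDS_PER_YEAR / 1e9

print(round(avg_rate_gbps, 1))  # -> 10.1 (Gbit/s, sustained average)
```

Peak rates, reprocessing passes and copies to Tier 1/2 centres multiply this baseline, which is why multi-100-Gbit/s links matter.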

The cross-Atlantic connectivity during the first successful run for the LHC experiments, which culminated in the discovery of the Higgs boson, was provided by the US LHCNet network, managed by the California Institute of Technology. In recent years, major research and education networks around the world—including ESnet, Internet2, California’s CENIC, and European networks such as DANTE, SURFnet and NORDUnet—have increased their backbone capacity by a factor of 10, using sophisticated new optical networking and digital signal processing technologies. Until recently, however, higher-speed links were not deployed for production purposes across the Atlantic Ocean—creating a network “impedance mismatch” that can harm large, intercontinental data flows.

An evolving data model
This upgrade coincides with a shift in the data model for LHC science. Previously, data moved in a more predictable and hierarchical pattern strongly influenced by geographical proximity, but network upgrades around the world have now made it possible for data to be fetched and exchanged more flexibly and dynamically. This change enables faster science outcomes and more efficient use of storage and computational power, but it requires networks around the world to perform flawlessly together.

“Having the new infrastructure in place will meet the increased need for dealing with LHC data and provide more agile access to that data in a much more dynamic fashion than LHC collaborators have had in the past,” said physicist Michael Ernst of DOE’s Brookhaven National Laboratory, a key member of the team laying out the new and more flexible framework for exchanging data between the Worldwide LHC Computing Grid centers.

Ernst directs a computing facility at Brookhaven Lab that was originally set up as a central hub for U.S. collaborators on the LHC’s ATLAS experiment. A similar facility at Fermi National Accelerator Laboratory has played this role for the LHC’s U.S. collaborators on the CMS experiment. These computing resources, dubbed Tier 1 centers, have direct links to the LHC at the European laboratory CERN (Tier 0).  The experts who run them will continue to serve scientists under the new structure. But instead of serving as hubs for data storage and distribution only among U.S.-based collaborators at Tier 2 and 3 research centers, the dedicated facilities at Brookhaven and Fermilab will be able to serve data needs of the entire ATLAS and CMS collaborations throughout the world. And likewise, U.S. Tier 2 and Tier 3 research centers will have higher-speed access to Tier 1 and Tier 2 centers in Europe.

“This new infrastructure will offer LHC researchers at laboratories and universities around the world faster access to important data,” said Fermilab’s Lothar Bauerdick, head of software and computing for the U.S. CMS group. “As the LHC experiments continue to produce exciting results, this important upgrade will let collaborators see and analyze those results better than ever before.”

Ernst added, “As centralized hubs for handling LHC data, our reliability, performance and expertise have been in demand by the whole collaboration, and now we will be better able to serve the scientists’ needs.”

An investment in science
ESnet is funded by DOE’s Office of Science to meet networking needs of DOE labs and science projects. The transatlantic extension represents a financial collaboration, with partial support coming from DOE’s Office of High Energy Physics (HEP) for the next three years. Although LHC scientists will get a dedicated portion of the new network once it is in place, all science programs that make use of ESnet will now have access to faster network links for their data transfers.

“We are eagerly awaiting the start of commissioning for the new infrastructure,” said Oliver Gutsche, Fermilab scientist and member of the CMS Offline and Computing Management Board. “After the Higgs discovery, the next big LHC milestones will come in 2015, and this network will be indispensable for the success of the LHC Run 2 physics program.”

This work was supported by the DOE Office of Science.
Fermilab is America’s premier national laboratory for particle physics and accelerator research. A U.S. Department of Energy Office of Science laboratory, Fermilab is located near Chicago, Illinois, and operated under contract by the Fermi Research Alliance, LLC. Visit Fermilab’s website at www.fnal.gov and follow us on Twitter at @FermilabToday.

Brookhaven National Laboratory is supported by the Office of Science of the U.S. Department of Energy.  The Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.  For more information, please visit science.energy.gov.

One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by the Research Foundation for the State University of New York on behalf of Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit applied science and technology organization.

Visit Brookhaven Lab’s electronic newsroom for links, news archives, graphics, and more at http://www.bnl.gov/newsroom, follow Brookhaven Lab on Twitter, http://twitter.com/BrookhavenLab, or find us on Facebook, http://www.facebook.com/BrookhavenLab/.


Media contacts:

  • Karen McNulty-Walsh, Brookhaven Media and Communications Office, kmcnulty@bnl.gov, 631-344-8350
  • Kurt Riesselmann, Fermilab Office of Communication, media@fnal.gov, 630-840-3351
  • Jon Bashor, Computing Sciences Communications Manager, Lawrence Berkeley National Laboratory, jbashor@lbnl.gov, 510-486-5849

Computing contacts:

  • Lothar Bauerdick, Fermilab, US CMS software computing, bauerdick@fnal.gov, 630-840-6804
  • Oliver Gutsche, Fermilab, CMS Offline and Computing Management Board, gutsche@fnal.gov, 630-840-8909

Even before my departure to La Thuile in Italy, results from the Rencontres de Moriond conference were already flooding the news feeds. This year’s Electroweak session, which ran from 15 to 22 March, started with the first “world measurement” of the top quark mass, a combination of the measurements published by the Tevatron and LHC experiments so far. The week went on to include a spectacular CMS result on the Higgs width.

Although nearing its 50th anniversary, Moriond has kept its edge. Despite the growing number of must-attend HEP conferences, Moriond retains a prime spot in the community. This is partly due to historic reasons: it has been around since 1966, making a name for itself as the place where theorists and experimentalists come to see and be seen. Let’s take a look at what the LHC experiments had in store for us this year…

New Results

Stealing the show at this year’s Moriond was, of course, the announcement of the best constraint yet of the Higgs width at < 17 MeV with 95% confidence reported in both Moriond sessions by the CMS experiment. Using a new analysis method based on Higgs decays into two Z particles, the new measurement is some 200 times better than previous results. Discussions surrounding the constraint focussed heavily on the new methodology used in the analysis. What assumptions were needed? Could the same technique be applied to Higgs to WW bosons? How would this new width influence theoretical models for New Physics? We’ll be sure to find out at next year’s Moriond…
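Why is the width such a prized number? Because it is tied to the Higgs lifetime through the uncertainty-principle relation τ = ħ/Γ. A quick conversion under the reported bound; the Standard Model width of roughly 4 MeV used for comparison is a commonly quoted value, my addition rather than something from this post:

```python
HBAR_MEV_S = 6.582e-22  # reduced Planck constant, in MeV*s

def lifetime_from_width(gamma_mev: float) -> float:
    """Mean lifetime (s) for a total decay width (MeV): tau = hbar / Gamma."""
    return HBAR_MEV_S / gamma_mev

# The CMS bound Gamma_H < 17 MeV implies a lower bound on the lifetime:
print(lifetime_from_width(17.0))  # ~3.9e-23 s
# The commonly quoted Standard Model width of ~4.1 MeV would give:
print(lifetime_from_width(4.1))   # ~1.6e-22 s
```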

The announcement of the first global combination of the top quark mass also generated a lot of buzz. Bringing together Tevatron and LHC data, the result is the world’s best value yet: 173.34 ± 0.76 GeV/c². Before the dust had settled, at the Moriond QCD session CMS announced a new preliminary result based on the full data set collected at 7 and 8 TeV. The precision of this result alone rivals the world average, clearly demonstrating that we have yet to see the ultimate attainable precision on the top mass.
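At its core, a world combination like this is an inverse-variance weighted average; the real Tevatron+LHC combination also treats correlated systematic uncertainties, which this sketch deliberately ignores. The input numbers below are made up for illustration, not the published inputs:

```python
def combine(measurements):
    """Inverse-variance weighted average of (value, uncertainty) pairs,
    assuming uncorrelated measurements (a simplification: the real
    combination accounts for correlations between experiments)."""
    weights = [1.0 / sigma ** 2 for _, sigma in measurements]
    value = sum(w * v for w, (v, _) in zip(weights, measurements)) / sum(weights)
    sigma = (1.0 / sum(weights)) ** 0.5
    return value, sigma

# Hypothetical top-mass inputs in GeV (illustrative, not the real inputs):
m, s = combine([(173.2, 0.9), (173.5, 1.1)])
print(round(m, 2), round(s, 2))  # -> 173.32 0.7
```

Note that the combined uncertainty is smaller than either input, which is the whole point of combining.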

This graphic shows the four individual top quark mass measurements published by the ATLAS, CDF, CMS and DZero collaborations, together with the most precise measurement obtained in a joint analysis.

Other news of the top quark included new LHC precision measurements of its spin and polarisation, as well as new ATLAS results on the single top-quark cross section in the t-channel, presented by Kate Shaw on Tuesday 25 March. Run II of the LHC is set to further improve our understanding of this particle.

A fundamental and challenging measurement that probes the nature of electroweak symmetry breaking, mediated by the Brout–Englert–Higgs mechanism, is the scattering of two massive vector bosons against each other. Although rare, in the absence of the Higgs boson the rate of this process would rise strongly with the collision energy, eventually violating unitarity. Evidence for electroweak vector boson scattering was detected for the first time by ATLAS in events with two leptons of the same charge and two jets exhibiting a large difference in rapidity.

With the rise of statistics and increasing understanding of their data, the LHC experiments are attacking rare and difficult multi-body final states involving the Higgs boson. ATLAS presented a prime example of this, with a new result in the search for Higgs production in association with two top quarks, and decaying into a pair of b-quarks. With an expected limit of 2.6 times the Standard Model expectation in this channel alone, and an observed relative signal strength of 1.7 ± 1.4, the expectations are high for the forthcoming high-energy run of the LHC, where the rate of this process is enhanced.

Meanwhile, over in the heavy flavour world, the LHCb experiment presented further analyses of the unique exotic state X(3872). The experiment provided unambiguous confirmation of its quantum numbers JPC to be 1++, as well as evidence for its decay into ψ(2S)γ.

Explorations of the Quark-Gluon Plasma continue in the ALICE experiment, with results from the LHC’s proton-lead (p-Pb) run dominating discussions. In particular, the newly observed “double-ridge” in p-Pb is being studied in depth, with explorations of its jet peak, mass distribution and charge dependence presented.

New explorations

Taking advantage of our new understanding of the Higgs boson, the era of precision Higgs physics is now in full swing at the LHC. As well as improving our knowledge of Higgs properties – for example, measuring its spin and width – precise measurements of the Higgs’ interactions and decays are well underway. Results from searches for Beyond Standard Model (BSM) physics were also presented, as the LHC experiments continue to invest strongly in searches for Supersymmetry.

In the Higgs sector, many researchers hope to detect the supersymmetric cousins of the Higgs and electroweak bosons, so-called neutralinos and charginos, via electroweak processes. ATLAS presented two new papers summarising extensive searches for these particles. The absence of a significant signal was used to set limits excluding charginos and neutralinos up to a mass of 700 GeV – if they decay through intermediate supersymmetric partners of leptons – and up to a mass of 420 GeV – when decaying through Standard Model bosons only.

Furthermore, for the first time, a sensitive search for the most challenging electroweak mode producing pairs of charginos that decay through W bosons was conducted by ATLAS. Such a mode resembles that of Standard Model pair production of Ws, for which the currently measured rates appear a bit higher than expected.

In this context, CMS has presented new results on the search for the electroweak pair production of higgsinos through their decay into a Higgs (at 125 GeV) and a nearly massless gravitino. The final state sports a distinctive signature of four b-quark jets compatible with the kinematics of a double Higgs decay. A slight excess of candidate events means the experiment cannot exclude a higgsino signal. Upper limits on the signal strength at the level of twice the theoretical prediction are set for higgsino masses between 350 and 450 GeV.

In several Supersymmetry scenarios, charginos can be metastable and could potentially be detected as long-lived particles. CMS has presented an innovative search for generic long-lived charged particles, mapping their detection efficiency as a function of the particle kinematics and energy loss in the tracking system. This study not only sets stringent limits on a variety of Supersymmetric models predicting chargino proper lifetimes (cτ) greater than 50 cm, but also gives the theory community a powerful tool to independently test new models predicting long-lived charged particles.

In the quest to be as general as possible in the search for Supersymmetry, CMS has also presented new results in which a large subset of the Supersymmetry parameters, such as the gluino and squark masses, is tested for statistical compatibility with different experimental measurements. The outcome is a probability map in a 19-dimensional space. Notable observations in this map are that models predicting gluino masses below 1.2 TeV and sbottom and stop masses below 700 GeV are strongly disfavoured.

… but no New Physics

Despite careful searches, the most heard phrase at Moriond was unquestionably: “No excess observed – consistent with the Standard Model”. Hope now lies with the next run of the LHC at 13 TeV. If you want to find out more about the possibilities of the LHC’s second run, check out the CERN Bulletin article: “Life is good at 13 TeV“.

In addition to the diverse LHC experiment results presented, Tevatron experiments, BICEP, RHIC and other experiments also reported their breaking news at Moriond. Visit the Moriond EW and Moriond QCD conference websites to find out more.

Katarina Anthony-Kittelsen


This is the last part of a series of three on supersymmetry, the theory many believe could go beyond the Standard Model. First I explained what the Standard Model is and showed its limitations. Then I introduced supersymmetry and explained how it would fix the main flaws of the Standard Model. I now review how experimental physicists are trying to discover “superparticles” at the Large Hadron Collider (LHC) at CERN.

If Supersymmetry (or SUSY for short) is as good as it looks, why has none of the new SUSY particles been found yet? There could be many reasons, the simplest being that this theory is wrong and supersymmetric particles do not exist. If that were the case, one would still need another way to fix the Standard Model.

SUSY can still be the right solution if supersymmetric particles have eluded us for some reason: we might have been looking in the wrong place or in the wrong way, or they could simply be out of the reach of current accelerators.

So how does one go looking for supersymmetric particles? One good place to start is at CERN with the Large Hadron Collider or LHC. The 27-km long accelerator is the most powerful in the world. It brings protons into collisions at nearly the speed of light, generating huge amounts of energy in the tiniest points in space.  Since energy and matter are two forms of the same essence, like water and ice, the released energy materializes in the form of fundamental particles. The hope is to create some of the SUSY particles.

One major problem is that nobody knows the mass of all these new particles. And without the mass, it is very much like looking for someone in a large city without knowing the person’s address. All one can do then is comb the city trying to spot that person. But imagine the task if you don’t even know what the person looks like, how she behaves or even in which city, let alone which country she lives in.

Supersymmetry is in fact a very loosely defined theory with a huge number of free parameters. These free parameters are quantities like the masses of the supersymmetric particles, or their couplings, i.e. quantities defining how often they decay into other particles. Supersymmetry does not specify which values these quantities can take.

Hence, theorists have to make educated guesses to reduce the zone where one should search for SUSY particles. This is how various models of supersymmetry have appeared. Each one is an attempt at circumscribing the search zone based on different assumptions.

One common starting point is to assume that a certain property called R-parity is conserved. This leads to a model called Minimal SUSY but this model still has 105 free parameters. But with this simple assumption, one SUSY particle ends up having the characteristics of dark matter. Here is how it works: R-parity conservation states that all supersymmetric particles must decay into other, lighter supersymmetric particles. Therefore, the lightest supersymmetric particle or LSP cannot decay into anything else. It remains stable and lives forever, just like dark matter particles do. Hence the LSP could be the much sought-after dark matter particle.
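R-parity can be written compactly as R = (−1)^(3(B−L)+2s), where B is baryon number, L is lepton number and s is spin: every Standard Model particle gets R = +1 and every superpartner R = −1. The formula is standard; the particle assignments below are just illustrative examples, not taken from the original post.

```python
def r_parity(B, L, s):
    """R-parity: +1 for Standard Model particles, -1 for superpartners."""
    return (-1) ** int(round(3 * (B - L) + 2 * s))

# (baryon number, lepton number, spin)
electron  = (0,   1, 0.5)   # Standard Model fermion
selectron = (0,   1, 0.0)   # its spin-0 superpartner
quark     = (1/3, 0, 0.5)
squark    = (1/3, 0, 0.0)

print(r_parity(*electron))   # → 1
print(r_parity(*selectron))  # → -1
print(r_parity(*quark))      # → 1
print(r_parity(*squark))     # → -1
```

Since any decay must conserve the product of R-parities, a superpartner (R = −1) can never decay into Standard Model particles (product +1) alone — which is exactly why the lightest supersymmetric particle is stable.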

[Image: a SUSY decay cascade. Credit: Fermilab]

How can the Large Hadron Collider help? Around the accelerator, large detectors act like giant cameras, recording how the newly created and highly unstable particles break apart like miniature fireworks. By taking a snapshot of it, one can record the origin, direction and energy of each fragment and reconstruct the initial particle.

Heavy and unstable SUSY particles would decay in cascade, producing various Standard Model particles along the way. The LSP would be the last possible step for any decay chain. Generally, the LSP is one of the electrically neutral mixed SUSY states called the neutralino. Hence, each of these events would contain a particle that is stable but does not interact with our detectors. In the end, there would be a certain amount of energy imbalance in all these events, indicating that a particle has escaped the detector without leaving any signal.
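This energy-imbalance signature can be illustrated with a toy calculation (the numbers are purely illustrative, not real data): sum the transverse momenta of all recorded fragments, and the negative of that sum is the transverse momentum carried away by anything invisible.

```python
import math

def missing_pt(fragments):
    """Magnitude of the missing transverse momentum: the negative of the
    vector sum of the visible fragments' transverse momenta (px, py)."""
    px = -sum(f[0] for f in fragments)
    py = -sum(f[1] for f in fragments)
    return math.hypot(px, py)

# A balanced dijet event: two back-to-back jets, nothing missing.
dijet = [(100.0, 0.0), (-100.0, 0.0)]

# A mono-jet event: one jet recoiling against an invisible particle.
monojet = [(100.0, 0.0)]

print(missing_pt(dijet))    # → 0.0
print(missing_pt(monojet))  # → 100.0
```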

At the LHC, both the CMS and ATLAS experiments have combed through billions of collisions looking for such events, but to no avail. Dozens of different approaches have been tested and new possibilities are constantly being explored. Each one corresponds to a different hypothesis, but nothing has been found so far.

Two events with jets as seen in the ATLAS detector. (Left) A very common event containing two jets of particles: the event is balanced, all fragments were recorded, and no energy is missing. (Right) A simulation of a mono-jet event, where a single jet recoils against something unrecorded by the detector. The imbalance in energy could be the signature of a dark matter particle like the lightest supersymmetric particle (LSP): something that carries energy away but does not interact with the detector, i.e. something we would not see.

One reason might be that all supersymmetric particles are too heavy to have been produced at the LHC. A particle can be created only if enough energy is available: you cannot buy something that costs more money than you have in your pocket. To create heavy particles, one needs more energy. It is still possible that all SUSY particles exist but are out of the current accelerator’s reach. This point will be settled in 2015, when the LHC resumes operation at higher energy, going from 8 TeV to at least 13 TeV.

If the SUSY particles are light enough to be created at 13 TeV, the chances of producing them will also increase roughly tenfold, making them that much easier to find. And if we still do not find them, new limits will be set, which will also greatly help narrow down the remaining possible models.

SUSY has not said its last word yet. The chances are good supersymmetric particles will show up when the LHC resumes. And that would be like discovering a whole new continent.

Pauline Gagnon

To be alerted of new postings, follow me on Twitter: @GagnonPauline
or sign up on this mailing list to receive an e-mail notification.




This article appeared in symmetry on March 19, 2014.

An international team of scientists from Fermilab’s Tevatron and CERN’s Large Hadron Collider has produced the world’s best value for the mass of the top quark.


Scientists working on the world’s leading particle collider experiments have joined forces, combined their data and produced the first joint result from Fermilab’s Tevatron and CERN’s Large Hadron Collider. These machines are the past and current holders of the record for most powerful particle collider on Earth.

Scientists from the four experiments involved—ATLAS, CDF, CMS and DZero—announced their joint findings on the mass of the top quark today at the Rencontres de Moriond international physics conference in Italy.

Together, the four experiments pooled their data analysis power to arrive at a new world’s best value for the mass of the top quark: 173.34 ± 0.76 GeV/c².
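Such combinations are, at their core, inverse-variance-weighted averages (the published combination also has to treat correlated systematic uncertainties between experiments in detail). A minimal sketch with made-up input numbers — illustrative only, not the actual measurements that enter the published result:

```python
import math

def combine(measurements):
    """Inverse-variance weighted average of (value, uncertainty) pairs,
    assuming uncorrelated uncertainties."""
    weights = [1.0 / sigma ** 2 for _, sigma in measurements]
    mean = sum(w * x for (x, _), w in zip(measurements, weights)) / sum(weights)
    sigma = 1.0 / math.sqrt(sum(weights))
    return mean, sigma

# Hypothetical top-quark mass measurements in GeV (illustrative only).
m, s = combine([(173.2, 1.0), (173.9, 1.5)])
print(round(m, 2), round(s, 2))  # → 173.42 0.83
```

The combined uncertainty is always smaller than the best single input, which is why pooling the four experiments sharpens the world average.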

Experiments at the LHC at the CERN laboratory in Geneva, Switzerland and the Tevatron collider at Fermilab in Illinois, USA are the only ones that have ever seen top quarks—the heaviest elementary particles ever observed. The top quark’s huge mass (more than 100 times that of the proton) makes it one of the most important tools in the physicists’ quest to understand the nature of the universe.

The new precise value of the top-quark mass will allow scientists to test further the mathematical framework that describes the quantum connections between the top quark, the Higgs particle and the carrier of the electroweak force, the W boson. Theorists will explore how the new, more precise value will change predictions regarding the stability of the Higgs field and its effects on the evolution of the universe. It will also allow scientists to look for inconsistencies in the Standard Model of particle physics—searching for hints of new physics that will lead to a better understanding of the nature of the universe.

“The combining together of data from CERN and Fermilab to make a precision top quark mass result is a strong indication of its importance to understanding nature,” says Fermilab director Nigel Lockyer. “It’s a great example of the international collaboration in our field.”

Courtesy of: Fermilab and CERN


A total of more than six thousand scientists from more than 50 countries participate in the four experimental collaborations. The CDF and DZero experiments discovered the top quark in 1995, and the Tevatron produced about 300,000 top quark events during its 25-year lifetime, completed in 2011. Since it started collider physics operations in 2009, the LHC has produced close to 18 million events with top quarks, making it the world’s leading top quark factory.

“Collaborative competition is the name of the game,” says CERN’s Director General Rolf Heuer. “Competition between experimental collaborations and labs spurs us on, but collaboration such as this underpins the global particle physics endeavor and is essential in advancing our knowledge of the universe we live in.”

Each of the four collaborations previously released its individual top-quark mass measurements. Combining them required close collaboration between the four experiments, including a detailed understanding of each other’s techniques and uncertainties. Each experiment measured the top-quark mass using several different methods by analyzing different top quark decay channels, using sophisticated analysis techniques developed and improved over more than 20 years of top quark research, beginning at the Tevatron and continuing at the LHC. The joint measurement has been submitted to the arXiv.

A version of this article was originally issued by Fermilab and CERN as a press release.


This is the second part of a series of three on supersymmetry, the theory that could go beyond the Standard Model. In the first part I explained what the Standard Model is and showed its limitations. Here I introduce supersymmetry and explain how it would fix several of the Standard Model’s shortcomings. Finally, I will review how physicists are trying to discover “superparticles” at CERN’s Large Hadron Collider (LHC).

Theorists often have to wait decades to see their ideas confirmed by experimental discoveries. That was the case for François Englert, Robert Brout and Peter Higgs, whose theory, developed in 1964, was only confirmed in 2012 with the discovery of the Higgs boson by the LHC experiments.

Today, many of the theorists who contributed to what is now known as supersymmetry are waiting to see what the LHC will reveal.

Supersymmetry first appeared as a mathematical symmetry in string theory in the early 1970s. Over time, several people added new elements to it, finally arriving at today’s most promising theory for going beyond the Standard Model. Among the pioneers, two Russian theorists, D. V. Volkov and V. P. Akulov, must be cited first. Then, in 1973, Julius Wess and Bruno Zumino wrote the first four-dimensional supersymmetric model, paving the way for future developments. The following year, Pierre Fayet generalised the Brout-Englert-Higgs mechanism to supersymmetry and introduced superpartners for the Standard Model particles for the first time.

All this work would have remained a purely mathematical exercise had it not been noticed that supersymmetry could solve some fundamental problems of the Standard Model.

As we have seen, the Standard Model contains two types of fundamental particles: the grains of matter, the fermions, with spin values of ½, and the force carriers, the bosons, with integer values of spin.

The simple fact that bosons and fermions do not have the same spin values makes them behave differently: each group obeys different statistical laws. For example, two identical fermions cannot exist in the same quantum state; one of their quantum numbers must differ. These quantum numbers characterise various properties: their position, charge, spin, or “colour” charge for quarks. Since everything else is identical, two electrons in the same atomic orbital must have two different spin orientations, one pointing up, the other down. This implies that at most two electrons can cohabit in the same atomic orbital, since there are only two possible orientations for their spin. Atoms therefore have several atomic orbitals to accommodate all their electrons.

On the contrary, there is no restriction on the number of bosons allowed to exist in the same state. This property explains the phenomenon of superconductivity. A pair of electrons forms a boson, since two spins of one half give a spin of 0 or 1, depending on whether or not they are aligned. In a superconductor, all the electron pairs can be identical, each pair having exactly the same quantum numbers, which is allowed for bosons. Any two pairs can therefore be exchanged freely, as in quicksand: all its grains of sand are identical in size and can change position freely, hence its instability. Likewise, in a superconductor, all the electron pairs can change position without any friction, and therefore without any electrical resistance.

Supersymmetry builds on the Standard Model and associates a “superpartner” with each fundamental particle. The fermions get bosons as superpartners, and the bosons are associated with fermions. This unifies the fundamental constituents of matter with the force carriers. Everything becomes more harmonious and more symmetric.


Supersymmetry builds on the Standard Model and comes with several new supersymmetric particles, represented here with a tilde (~) over their symbol. (Diagram taken from the film “Particle Fever” and reproduced with the permission of Mark Levinson.)

But there are other important consequences. The number of fundamental particles doubles, since supersymmetry associates a superpartner with each Standard Model particle. Moreover, several of these partners can mix, giving combined states such as charginos and neutralinos.

The implications are numerous. First major consequence: the two superpartners of the top quark, called stops, can cancel the top quark’s large correction to the mass of the Higgs boson. Second implication: the lightest supersymmetric particle (in general one of the electrically neutral mixed states called the neutralino) has just the properties that dark matter should have.
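Schematically — a textbook sketch, not part of the original post — the cancellation works because the top-quark and stop loops contribute to the Higgs mass-squared with opposite signs (here λ_t is the top Yukawa coupling and Λ the high-energy cutoff):

```latex
\delta m_H^2\big|_{\text{top}} \simeq -\frac{3\lambda_t^2}{8\pi^2}\,\Lambda^2,
\qquad
\delta m_H^2\big|_{\tilde{t}} \simeq +\frac{3\lambda_t^2}{8\pi^2}\,\Lambda^2,
```

so the dangerous Λ² pieces cancel, leaving only corrections that grow logarithmically with the stop masses.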

Not only would supersymmetry repair several major flaws of the Standard Model, it would also solve the dark matter problem, killing two birds with one stone. There is just one tiny little problem: if these supersymmetric particles exist, why have we not found them yet? I will address this question in the third and final part of this series.

Pauline Gagnon

To be alerted of new postings, follow me on Twitter: @GagnonPauline or sign up on this mailing list to receive an e-mail notification.