This article appeared in Fermilab Today on July 24, 2014.

Fermilab engineer Jim Hoff has invented an electronic circuit that can guard against radiation damage. Photo: Hanae Armitage

Fermilab engineer Jim Hoff has received patent approval on a very tiny, very clever invention that could have an impact on the aerospace, agriculture and medical imaging industries.

Hoff has engineered a widely adaptable latch — an electronic circuit capable of remembering a logical state — that suppresses a commonly destructive circuit error caused by radiation.

There are two radiation-based errors that can damage a circuit: total dose and single-event upset. In the former, the entire circuit is doused in radiation and damaged; in an SEU, a single particle of radiation delivers its energy to the chip and alters a state of memory, which takes the form of 1s and 0s. An altered state of memory is an unintentional flip from logical 1 to logical 0, or vice versa, and ultimately leads to loss of data or imaging resolution. Hoff’s design is essentially a chip immunization, preemptively guarding against SEUs.
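Hoff’s patented design isn’t described in detail here, but the basic problem and one standard way of guarding against it can be sketched in a few lines. The toy Python below is purely illustrative and is not a description of the SEUSS circuit: it stores a bit in three redundant copies and recovers it with a majority vote, so a single flipped copy no longer corrupts the stored value.

```python
import random

def store_redundant(bit):
    """Keep three independent copies of the same logical bit."""
    return [bit, bit, bit]

def single_event_upset(copies):
    """Model an SEU: one randomly chosen copy is flipped by a particle hit."""
    i = random.randrange(len(copies))
    copies[i] ^= 1
    return copies

def read_majority(copies):
    """A majority vote recovers the intended value despite one flipped copy."""
    return 1 if sum(copies) >= 2 else 0

stored = store_redundant(1)        # write a logical 1
hit = single_event_upset(stored)   # a single particle corrupts one copy
print(hit, "->", read_majority(hit))   # e.g. [1, 0, 1] -> 1 : the data survives
```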

“There are a lot of applications,” Hoff said. “Anyone who needs to store data for a length of time and keep it in that same state, uncorrupted — anyone flying in a high-altitude plane, anyone using medical imaging technology — could use this.”

Past experimental data showed that, at any given total-ionizing radiation dose, the latch reduces single-event upsets by a factor of about 40. Hoff suspects that the invention’s newer configurations will reduce single-event upsets by at least two orders of magnitude.

The invention is fondly referred to as SEUSS, which stands for single-event upset suppression system. It’s relatively inexpensive and designed to integrate easily with a multitude of circuits — all that’s needed is a compatible transistor.

Hoff’s line of work lies in chip development, and SEUSS is currently used in some Fermilab-developed chips such as FSSR, which is used in projects at Jefferson Lab, and Phoenix, which is used in the Relativistic Heavy Ion Collider at Brookhaven National Laboratory.

The idea of SEUSS was born out of post-knee-surgery, bed-ridden boredom. On strict bed rest, Hoff’s mind naturally wandered to engineering.

“As I was lying there, leg in pain, back cramping, I started playing with designs of my most recent project at work,” he said. “At one point I stopped and thought, ‘Wow, I just made a single-event upset-tolerant SR flip-flop!’”

While this isn’t the world’s first SEU-tolerant latch, Hoff is the first to create a single-event upset suppression system that is also a set-reset flip-flop, meaning it can take the form of almost any latch. As a flip-flop, the latch is enormously adaptable, far exceeding its pre-existing latch brethren.
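For readers unfamiliar with the jargon, a set-reset (SR) latch is the simplest circuit that “remembers” a logical state. Here is a minimal behavioural sketch in Python (again just an illustration of what an SR latch does, not Hoff’s design):

```python
class SRLatch:
    """Toy behavioural model of a set-reset latch: S=1 stores a 1, R=1 stores a 0,
    and with S=R=0 the latch holds whatever it last stored."""

    def __init__(self):
        self.q = 0  # stored logical state

    def step(self, s, r):
        if s and r:
            raise ValueError("S=R=1 is the forbidden input combination")
        if s:
            self.q = 1
        elif r:
            self.q = 0
        return self.q  # with s == r == 0 the previous state is simply held

latch = SRLatch()
print(latch.step(1, 0))  # set   -> 1
print(latch.step(0, 0))  # hold  -> 1  (this "remembering" is the latch's job)
print(latch.step(0, 1))  # reset -> 0
```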

“That’s what makes this a truly special latch — its incredible versatility,” says Hoff.

From a broader vantage point, the invention is exciting for more than just Fermilab employees; it’s one of Fermilab’s first big efforts in pursuing potential licensees from industry.

Cherri Schmidt, head of Fermilab’s Office of Partnerships and Technology Transfer, with the assistance of intern Miguel Marchan, has been developing the marketing plan to reach out to companies who may be interested in licensing the technology for commercial application.

“We’re excited about this one because it could really affect a large number of industries and companies,” Schmidt said. “That, to me, is what makes this invention so interesting and exciting.”

Hanae Armitage

Welcome to Thesisland

Tuesday, July 22nd, 2014

When I joined Quantum Diaries, I did so with trepidation: while it was an exciting opportunity, I was worried that all I could write about was the process of writing a thesis and looking for postdoc jobs. I ended up telling the site admin exactly that: I only had time to work on a thesis and job hunt. I thought I was turning down the offer. But the reply I got was along the lines of “It’s great to know what topics you’ll write about! When can we expect a post?”. So, despite the fact that this is a very different topic from any recent QD posts, I’m starting a series about the process of writing a physics PhD thesis. Welcome.

The main thesis editing desk: laptop, external monitor, keyboard, mouse; coffee, water; notes; and lots of encouragement.

There are as many approaches to writing a PhD thesis as there are PhDs, but they can be broadly described along a spectrum.

On one end is the “constant documentation” approach: spend some fixed fraction of your time documenting every project you work on. In this approach, the writing phase is completely integrated with the research work, and it’s easy to remember the things you’re writing about. There is a big disadvantage: it’s really easy to write too much, to spend too much time writing and not enough doing, or otherwise unbalance your time. If you keep a constant fraction of your schedule dedicated to writing, and that fraction is (in retrospect) too big, you’ve lost a lot of time. But you have documented everything, which everyone who comes after will be grateful for. If they ever see your work.

The other end of the spectrum is the “write like hell” approach (that is, write as fast as you can), where all the research is completed and approved before writing starts. This has the advantage that if you (and your committee) decide you’ve written enough, you immediately get a PhD! The disadvantage is that if you have to write about old projects, you’ll probably have forgotten a lot. So this approach typically leads to shorter theses.

These two extremes were first described to me (see the effect of thesis writing? It’s making my blog voice go all weird and passive) by two professors who were in grad school together and still work together. Each took one approach, and they both did fine, but the “constant documentation” thesis was at least twice (or was it three times?) as long as the “write like hell” thesis.

Somewhere between those extremes is the funny phenomenon of the “staple thesis”: a thesis primarily composed of all the papers you wrote in grad school, stapled together. A few of my friends have done this, but it’s not common in my research group because our collaboration is so large. I’ll discuss that in more detail later.

I’m going for something in the middle: as soon as I saw a light at the end of the tunnel, I wanted to start writing, so I downloaded the UW LaTeX template for PhD theses and started filling it in. It’s been about 14 months since then, with huge variations in the writing/research balance. To help balance the two approaches, I’ve found it helpful to keep at least some notes about all the physics I do, but nothing too polished: it’s always easier to start from some notes, however minimal, than to start from nothing.

When I started writing, there were lots of topics available that needed some discussion: history and theory, my detector, all the calibration work I did for my master’s project–I could have gone full-time writing at that point and had plenty to do. But my main research project wasn’t done yet. So for me, it’s not just a matter of balancing “doing” with “documenting”; it’s also a question of balancing old documentation with current documentation. I’ve almost, *almost* finished writing the parts that don’t depend on my work from the last year or so. In the meantime, I’m still finishing the last bits of analysis work.

It’s all a very long process. How many readers are looking towards writing a thesis later on? How many have gone through this and found a method that served them well? If it was fast and relatively low-stress, would you tell me about it?

This article appeared in Fermilab Today on July 21, 2014.

Members of the prototype proton CT scanner collaboration move the detector into the CDH Proton Center in Warrenville. Photo: Reidar Hahn

A prototype proton CT scanner developed by Fermilab and Northern Illinois University could someday reduce the amount of radiation delivered to healthy tissue in a patient undergoing cancer treatment.

The proton CT scanner would better target radiation doses to the cancerous tumors during proton therapy treatment. Physicists recently started testing with beam at the CDH Proton Center in Warrenville.

To create a custom treatment plan for each proton therapy patient, radiation oncologists currently use X-ray CT scanners to develop 3-D images of patient anatomy, including the tumor, to determine the size, shape and density of all organs and tissues in the body. To make sure all the tumor cells are irradiated to the prescribed dose, doctors often set the targeting volume to include a minimal amount of healthy tissue just outside the tumor.

Collaborators believe that the prototype proton CT, which is essentially a particle detector, will provide a more precise 3-D map of the patient anatomy. That would allow doctors to target beam delivery more precisely, reducing the amount of radiation to healthy tissue during both the CT process and treatment.

“The dose to the patient with this method would be lower than using X-ray CTs while getting better precision on the imaging,” said Fermilab’s Peter Wilson, PPD associate head for engineering and support.

Fermilab became involved in the project in 2011 at the request of NIU’s high-energy physics team because of the laboratory’s detector building expertise.

The project’s goal was a tall order, Wilson explained. The group wanted to build a prototype device, imaging software and computing system that could collect data from 1 billion protons in less than 10 minutes and then produce a 3-D reconstructed image of a human head, also in less than 10 minutes. To do that, they needed to create a device that could read data very quickly, since every second data from 2 million protons would be sent from the device — which detects only one proton at a time — to a computer.
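Those numbers hang together: at roughly 2 million protons per second, a scan of just under 10 minutes indeed accumulates about a billion protons. A quick back-of-the-envelope check, using only the figures quoted above:

```python
protons_per_second = 2_000_000      # readout rate quoted above
scan_time_seconds = 10 * 60         # "less than 10 minutes"

total_protons = protons_per_second * scan_time_seconds
print(f"{total_protons:.1e} protons per scan")   # 1.2e+09, i.e. about a billion
```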

NIU physicist Victor Rykalin recommended building a scintillating fiber tracker detector with silicon photomultipliers. A similar detector was used in the DZero experiment.

“The new prototype CT is a good example of the technical expertise of our staff in detector technology. Their expertise goes back 35 to 45 years and is really what makes it possible for us to do this,” Wilson said.

In the prototype CT, protons pass through two tracking stations, which track the particles’ trajectories in three dimensions. (See figure.) The protons then pass through the patient and finally through two more tracking stations before stopping in the energy detector, which is used to calculate the total energy loss through the patient. Devices called silicon photomultipliers pick up signals from the light resulting from these interactions and subsequently transmit electronic signals to a data acquisition system.

In the prototype proton CT scanner, protons enter from the left, passing through planes of fibers and the patient’s head. Data from the protons’ trajectories, including the energy deposited in the patient, is collected in a data acquisition system (right), which is then used to map the patient’s tissue. Image courtesy of George Coutrakon, NIU

Scientists use specialized software and a high-performance computer at NIU to accurately map the proton stopping powers in each cubic millimeter of the patient. From this map, visually displayed as conventional CT slices, the physician can outline the margins, dimensions and location of the tumor.

Elements of the prototype were developed at both NIU and Fermilab and then put together at Fermilab. NIU developed the software and computing systems. The teams at Fermilab worked on the design and construction of the tracker and the electronics to read the tracker and energy measurement. The scintillator plates, fibers and trackers were also prepared at Fermilab. A group of about eight NIU students, led by NIU’s Vishnu Zutshi, helped build the detector at Fermilab.

“A project like this requires collaboration across multiple areas of expertise,” said George Coutrakon, medical physicist and co-investigator for the project at NIU. “We’ve built on others’ previous work, and in that sense, the collaboration extends beyond NIU and Fermilab.”

Rhianna Wisniewski

This article appeared in symmetry on July 11, 2014.

Together, the three experiments will search for a variety of types of dark matter particles. Photo: NASA

Two US federal funding agencies announced today which experiments they will support in the next generation of the search for dark matter.

The Department of Energy and National Science Foundation will back the Super Cryogenic Dark Matter Search-SNOLAB, or SuperCDMS; the LUX-Zeplin experiment, or LZ; and the next iteration of the Axion Dark Matter eXperiment, ADMX-Gen2.

“We wanted to pool limited resources to put together the most optimal unified national dark matter program we could create,” says Michael Salamon, who manages DOE’s dark matter program.

Second-generation dark matter experiments are defined as experiments that will be at least 10 times as sensitive as the current crop of dark matter detectors.

Program directors from the two federal funding agencies decided which experiments to pursue based on the advice of a panel of outside experts. Both agencies have committed to working to develop the new projects as expeditiously as possible, says Jim Whitmore, program director for particle astrophysics in the division of physics at NSF.

Physicists have seen plenty of evidence of the existence of dark matter through its strong gravitational influence, but they do not know what it looks like as individual particles. That’s why the funding agencies put together a varied particle-hunting team.

Both LZ and SuperCDMS will look for dark matter particles called WIMPs, or weakly interacting massive particles. ADMX-Gen2 will search for a different kind of dark matter particle, the axion.

LZ is capable of identifying WIMPs with a wide range of masses, including those much heavier than any particle the Large Hadron Collider at CERN could produce. SuperCDMS will specialize in looking for light WIMPs with masses lower than 10 GeV. (And of course both LZ and SuperCDMS are willing to stretch their boundaries a bit if called upon to double-check one another’s results.)

If a WIMP hits the LZ detector, a high-tech barrel of liquid xenon, it will produce quanta of light, called photons. If a WIMP hits the SuperCDMS detector, a collection of hockey-puck-sized integrated circuits made with silicon or germanium, it will produce quanta of sound, called phonons.

“But if you detect just one kind of signal, light or sound, you can be fooled,” says LZ spokesperson Harry Nelson of the University of California, Santa Barbara. “A number of things can fake it.”

SuperCDMS and LZ will be located underground—SuperCDMS at SNOLAB in Ontario, Canada, and LZ at the Sanford Underground Research Facility in South Dakota—to shield the detectors from some of the most common fakers: cosmic rays. But they will still need to deal with natural radiation from the decay of uranium and thorium in the rock around them: “One member of the decay chain, lead-210, has a half-life of 22 years,” says SuperCDMS spokesperson Blas Cabrera of Stanford University. “It’s a little hard to wait that one out.”
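To see why a 22-year half-life is hard to wait out: the fraction of lead-210 remaining after t years is (1/2)^(t/22), so even a century of patience only removes about 96% of it. A quick check using the standard decay formula (illustrative arithmetic, not collaboration numbers):

```python
def fraction_remaining(t_years, half_life_years=22.0):
    """Standard radioactive decay: N(t)/N0 = (1/2) ** (t / t_half)."""
    return 0.5 ** (t_years / half_life_years)

for t in (22, 44, 100):
    print(f"after {t:3d} years: {fraction_remaining(t):.1%} of the lead-210 remains")
# after  22 years: 50.0% ...; even after a century about 4% is still there
```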

To combat this, both experiments collect a second signal, in addition to light or sound—charge. The ratio of the two signals lets them know whether the light or sound came from a dark matter particle or something else.
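As a cartoon of how the two-signal trick works: broadly speaking, the nuclear recoils expected from WIMPs produce less ionisation (charge) relative to the primary light or sound signal than the electron recoils caused by most radioactive backgrounds, so a cut on the charge-to-primary ratio separates the two populations. The numbers and the threshold in this toy sketch are made up; real analyses use carefully calibrated, energy-dependent discrimination variables.

```python
# Toy events: (name, primary signal, charge signal), all in arbitrary units.
events = [
    ("event A", 10.0, 2.0),   # low charge relative to the primary signal
    ("event B", 10.0, 8.0),   # high charge relative to the primary signal
]

RATIO_CUT = 0.4  # made-up threshold, purely for illustration

for name, primary, charge in events:
    ratio = charge / primary
    kind = ("nuclear-recoil-like (dark matter candidate)"
            if ratio < RATIO_CUT
            else "electron-recoil-like (ordinary background)")
    print(f"{name}: charge/primary = {ratio:.2f} -> {kind}")
```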

SuperCDMS will be especially skilled at this kind of differentiation, which is why the experiment should excel at searching for hard-to-hear low-mass particles.

LZ’s strength, on the other hand, stems from its size.

Dark matter particles are constantly flowing through the Earth, so their interaction points in a dark matter detector should be distributed evenly throughout. Quanta of radiation, however, can be stopped by much less significant barriers—alpha particles by a piece of paper, beta particles by a sandwich. Even gamma ray particles, which are harder to stop, cannot reach the center of LZ’s 7-ton detector. When a particle with the right characteristics interacts in the center of LZ, scientists will know to get excited.

The ADMX detector, on the other hand, approaches the dark matter search with a more delicate touch. The dark matter axions ADMX scientists are looking for are too light for even SuperCDMS to find.

If an axion passed through a magnetic field, it could convert into a photon. The ADMX team encourages this subtle transformation by placing their detector within a strong magnetic field, and then tries to detect the change.

“It’s a lot like an AM radio,” says ADMX-Gen2 co-spokesperson Gray Rybka of the University of Washington in Seattle.

The experiment slowly turns the dial, tuning itself to watch for one axion mass at a time. Its main background noise is heat.

“The more noise there is, the harder it is to hear and the slower you have to tune,” Rybka says.

In its current iteration, it would take around 100 years for the experiment to get through all of the possible channels. But with the addition of a super-cooling refrigerator, ADMX-Gen2 will be able to search all of its current channels, plus many more, in the span of just three years.
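The speed-up comes almost entirely from reducing noise. For a fixed sensitivity goal, the time needed to scan a frequency band grows roughly as the square of the system noise temperature (the standard radiometer scaling for this kind of search), so cooling the receiver by a factor of a few shortens a century-long scan to a few years. The temperatures in the sketch below are hypothetical, chosen only to show how the scaling reproduces a roughly 100-years-to-3-years improvement:

```python
def relative_scan_time(t_sys_new, t_sys_old):
    """Radiometer scaling: time to reach a fixed sensitivity grows as T_sys**2."""
    return (t_sys_new / t_sys_old) ** 2

# Hypothetical numbers: cutting the system noise temperature by a factor of ~6
# shortens the scan by a factor of ~36, comparable to the quoted 100 years -> 3 years.
print(relative_scan_time(1.0, 6.0))   # ~0.028
```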

With SuperCDMS, LZ and ADMX-Gen2 in the works, the next several years of the dark matter search could be some of its most interesting.

Kathryn Jepsen

Two anomalies worth noticing

Monday, July 14th, 2014

The 37th International Conference on High Energy Physics just finished in Valencia, Spain. This year, no big surprises were announced: no new boson, no signs from new particles or clear phenomena revealing the nature of dark matter or new theories such as Supersymmetry. But as always, a few small anomalies were reported.

Looking for deviations from the theoretical predictions is precisely how experimentalists try to reveal “new physics”. It would help uncover a more encompassing theory, since everybody realises the current theoretical model, the Standard Model, has its limits and must be superseded by something else. However, all physicists know that small deviations often come and go. All measurements made in physics follow statistical laws: deviations from the expected value by one standard deviation occur in three measurements out of ten, and larger deviations are less common but still possible. A two-standard-deviation fluctuation happens about 5% of the time. Then there are systematic uncertainties related to the experimental equipment. These are not purely statistical, but they can be reduced with a better understanding of our detectors. The total experimental uncertainty quoted with each result corresponds to one standard deviation. Here are two small anomalies reported at this conference that attracted attention this year.
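The percentages quoted above follow directly from the Gaussian distribution, and can be checked with nothing more than the Python standard library (a generic statistics check, not anything specific to these analyses):

```python
import math

def two_sided_tail(n_sigma):
    """Probability that a Gaussian measurement lands at least n_sigma standard
    deviations away from the true value, in either direction."""
    return math.erfc(n_sigma / math.sqrt(2))

for n in (1, 2, 3):
    print(f"{n} sigma: {two_sided_tail(n):.1%}")
# 1 sigma: 31.7%  (about three measurements in ten)
# 2 sigma:  4.6%  (the "5%" quoted above)
# 3 sigma:  0.3%
```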

The ATLAS Collaboration showed its preliminary result on the production of a pair of W bosons. Measuring this rate provides excellent checks of the Standard Model since theorists can predict how often pairs of W bosons are produced when protons collide in the Large Hadron Collider (LHC). The production rate depends on the energy released during these collisions. So far, two measurements can be made since the LHC operated at two different energies, namely 7 TeV and 8 TeV.

CMS and ATLAS had already released their results on their 7 TeV data. The measured rates slightly exceeded the theoretical prediction but were both well within their experimental errors, with deviations of 1.0 and 1.4 standard deviations, respectively. CMS had also published a result based on about 20% of all data collected at 8 TeV; it slightly exceeded the theoretical prediction, by 1.7 standard deviations. The latest ATLAS result, based on the full 8 TeV data sample, adds one more element to the picture: ATLAS now reports a slightly stronger excess for this rate at 8 TeV, 2.1 standard deviations above the theoretical prediction.

The four experimental measurements of the WW production rate (black dots) with the experimental uncertainty (horizontal bar), as well as the current theoretical prediction (blue triangle) with its own uncertainty (blue strip). One can see that all measurements are higher than the current prediction, suggesting that the theoretical calculation does not yet include everything.

The four individual measurements are each reasonably consistent with expectation, but the fact that all four lie above the prediction is intriguing. Most likely, this means that theorists have not yet taken into account all the small corrections required by the Standard Model to determine this rate precisely enough. It would be like forgetting a few small expenses in one’s budget, leading to an unexplained deficit at the end of the month. There could also be common factors in the experimental uncertainties, which would lower the overall significance of this anomaly. But if the theoretical predictions remain what they are even after adding all possible small corrections, it could indicate the existence of new phenomena, which would be exciting. It will be something to watch for when the LHC resumes operation in 2015 at 13 TeV.

The CMS Collaboration presented another intriguing result. They found a few events consistent with the decay of a Higgs boson into a tau and a muon. Such decays are prohibited in the Standard Model since they violate lepton flavour conservation. There are three “flavours” or types of charged leptons (a category of fundamental particles): the electron, the muon and the tau. Each one comes with its own type of neutrino. In all observations made so far, leptons are always produced either with their own neutrino or with their antiparticle. Hence, the decay of a Higgs boson into leptons should always produce a charged lepton and its antiparticle, but never two charged leptons of different flavours. Within the Standard Model, breaking this conservation law is simply not allowed.

This needs to be scrutinised with more data, which will be possible when the LHC resumes next year. But lepton flavour violation is allowed in various models beyond the Standard Model, such as models with more than one Higgs doublet, composite Higgs models or Randall-Sundrum models with extra dimensions. So if both ATLAS and CMS confirm this trend as a real effect, it would be a small revolution.

The results obtained by the CMS Collaboration, showing that six different decay channels all give a non-zero value, contrary to the Standard Model prediction, for the decay rate of the Higgs boson into pairs of tau and muon.

Pauline Gagnon

To be alerted of new postings, follow me on Twitter: @GagnonPauline or sign up on this mailing list to receive an e-mail notification.

 

ICHEP at a distance

Friday, July 11th, 2014

I didn’t go to ICHEP this year.  In principle I could have, especially given that I have been resident at CERN for the past year, but we’re coming down to the end of our stay here and I didn’t want to squeeze in one more work trip during a week that turned out to be a pretty good opportunity for one last family vacation in Europe.  So this time I just kept track of it from my office, where I plowed through the huge volume of slides shown in the plenary sessions earlier this week.  It was a rather different experience for me from ICHEP 2012, which I attended in person in Melbourne and where we had the first look at the Higgs boson.  (I’d have to say it was also probably the pinnacle of my career as a blogger!)

Seth’s expectations turned out to be correct — there were no earth-shattering announcements at this year’s ICHEP, but still a lot to chew on.  The Standard Model of particle physics stands stronger than ever.  As Pauline wrote earlier today, the particle thought to be the Higgs boson two years ago still seems to be the Higgs boson, to the best of our abilities to characterize it.  The LHC experiments are starting to move beyond measurements of the “expected” properties — the dominant production and decay modes — into searches for unexpected, low-rate behavior.  While there are anomalous results here and there, there’s nothing that looks like more than a fluctuation.  Beyond the Higgs, all sectors of particle physics look much as predicted, and some fluctuations, such as the infamous forward-backward asymmetry of top-antitop production at the Tevatron, appear to have subsided.  Perhaps the only ambiguous result out there is that of the BICEP2 experiment which might have observed gravitational waves, or maybe not.  We’re all hoping that further data from that experiment and others will resolve the question by the end of the year.  (See the nice talk on the subject of particle physics and cosmology by Alan Guth, one of the parents of that field.)

This success of the Standard Model is both good and bad news.  It’s good that we do have a model that has stood up so well to every experimental test that we have thrown at it, in some cases to startling precision.  You want models to have predictive power.  But at the same time, we know that the model is almost surely incomplete.  Even if it can continue to work at higher energy scales than we have yet explored, at the very least we seem to be missing some particles (those that make up the dark matter we know exists from astrophysical measurements) and it also fails to explain some basic observations (the clear dominance of matter over antimatter in the universe).  We have high hopes for the next run of the LHC, which will start in Spring 2015, in which we will have higher beam energies and collision rates, and a greater chance of observing new particles (should they exist).

It was also nice to see the conference focus on the longer-term future of the field.  Since the last ICHEP, every region of the world has completed long-range strategic planning exercises, driven by recent discoveries (including that of the Higgs boson, but also of various neutrino properties) and anchored by realistic funding scenarios for the field.  There were several presentations about these plans during the conference, and a panel discussion featuring leaders of the field from around the world.  It appears that we are having a nice sorting out of which region wants to host which future facility, and when, in such a way that we can carry on our international efforts in a straightforward way.  Time will tell if we can bring all of these plans to fruition.

I’ll admit that I felt a little left out by not attending ICHEP this year.  But here’s the good news: ICHEP 2016 is in Chicago, one of the few places in the world that I can reach on a single plane flight from Lincoln.  I have marked my calendar!

Happy birthday, dear boson!

Friday, July 11th, 2014

Singing happy birthday slightly off-key but in good spirits: this is how several hundred physicists gathered for the 37th International Conference on High Energy Physics in Valencia, Spain, closed the day on July 4th. Two years before, the CMS and ATLAS experiments had announced the discovery of the Higgs boson on the eve of the same conference, then held in Melbourne, Australia. Lots of people reminisced about that day and where they were when they heard the news, since many were traveling at the time.

Two years later, the two experiments have gathered an impressive amount of knowledge on the Higgs boson. Both groups have measured with high precision its mass, how it is produced and how it decays. ATLAS presented its published Higgs boson mass combination, namely 125.36 ± 0.41 GeV, in perfect agreement with the CMS measurement of 125.03 ± 0.30 GeV, presented for the first time at this conference.
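As an aside, the compatibility of the two numbers is easy to check with a naive inverse-variance weighted average. The few lines below ignore correlated systematic uncertainties, so they are only a back-of-the-envelope combination and not an official ATLAS+CMS result:

```python
def weighted_average(measurements):
    """Naive inverse-variance combination of independent (value, error) measurements."""
    weights = [1.0 / err ** 2 for _, err in measurements]
    mean = sum(w * val for (val, _), w in zip(measurements, weights)) / sum(weights)
    error = (1.0 / sum(weights)) ** 0.5
    return mean, error

atlas = (125.36, 0.41)   # GeV, as quoted above
cms = (125.03, 0.30)     # GeV
mass, err = weighted_average([atlas, cms])
print(f"naive combination: {mass:.2f} +/- {err:.2f} GeV")   # about 125.14 +/- 0.24 GeV
```

The two measurements differ by 0.33 GeV while their combined uncertainty is about 0.5 GeV, so they agree to well within one standard deviation.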

With the presentation of its final result on Higgs boson decays to two photons, the CMS Collaboration has now completed its analysis of all the data taken so far. The combined signal strength, i.e. the number of Higgs bosons observed compared to the number predicted by theory, is 1.00 ± 0.13. ATLAS measured 1.3 ± 0.18. Both results indicate that, within errors, this boson is compatible with what the Standard Model predicts.

Its spin and parity, two properties of fundamental particles, are also known. These are like fingerprints. Knowing them reveals the identity of a particle and that is how we know the boson discovered two years ago is really a Higgs boson.

The question is still open, though, whether this is the unique Higgs boson predicted by Robert Brout, François Englert and Peter Higgs in 1964 within the framework of the current theory, the Standard Model. It could also be the lightest of the five Higgs bosons predicted by a more encompassing theory such as Supersymmetry, which would fix some problems of the Standard Model and open the door to the so-called “new physics”.

Several measurements from ATLAS of the signal strength, i.e. how often Higgs bosons are produced in different ways and decay into different types of particles, compared to the theoretical predictions. The result should be equal to 1.0 if the theoretical predictions are right. The black “+” symbol indicates the predicted value, while the circles delimit the zone where the experiment expects the real value to lie at the 68% or 95% confidence level.

Nearly all the data collected up to the end of 2012 – before the Large Hadron Collider (LHC) was shut down to undergo a massive consolidation and maintenance program – were used for the many analyses presented at the conference. Everything measured so far agrees, within experimental uncertainties, with the predictions of the Standard Model. Not only have the experiments improved the precision of most measurements, but they are also looking at new aspects all the time. For example, CMS and ATLAS also showed the distribution of the momentum of the Higgs boson and of its decay products. All these measurements test the Standard Model with increasing precision. Experimentalists are looking for any deviation from the theoretical predictions in the hope of finding the key to the more encompassing theory lying beyond the Standard Model.

A series of results by the CMS Collaboration on the signal strength. With the current level of precision, all these measurements agree with a value of 1.0, as predicted by the Standard Model. A deviation would suggest the manifestation of something beyond the Standard Model.

But none of the numerous direct attempts to find particles related to this new physics has proved successful so far. Hundreds of different possibilities have been checked, each one corresponding to a particular scenario involving one of the hypothetical particles of Supersymmetry, but no sign of their presence has been detected yet.

However, this is quite similar to doing archaeology: one needs to shovel a lot of dirt before extracting something meaningful. Each analysis is like one bucket of dirt removed. And each small piece of information found helps get the bigger picture. Today, with the wealth of new results, theorists are in a much better position to draw general conclusions, eliminate wrong models and zoom in on the right solution.

The whole community is eagerly awaiting the restart of the LHC in early 2015 to collect more data at higher energy to open up a new world of opportunities. All hopes to discover this new physics will then be renewed.

Pauline Gagnon

To be alerted of new postings, follow me on Twitter: @GagnonPauline or sign up on this mailing list to receive an e-mail notification.

How Can We Hangout Better?

Wednesday, July 9th, 2014

Yesterday we had one of our regular Hangouts with CERN, live from the ICHEP 2014 conference, at which we took questions from around the Internet and updated everyone on the latest results. You can see a replay here:

I sent it to my wife, like I usually do. (“Look, I’m on ‘TV’ again!”) And she told me something interesting: she didn’t really get too much out of it. As we discussed it, it became clear that this was because we really had tried to give the latest news on the different analyses from ICHEP. Although we (hopefully) kept the level of the discussion general, the importance of the different things we look for would be tough to follow unless you keep up with particle physics regularly. We do tend to get more viewers and more enthusiasm when the message is more general, and a lot of the questions we get are quite general as well. Sometimes it seems like we get “Do extra dimensions really exist?” almost every time we have a hangout. We don’t want to answer that every time!

So the question is: how do we provide you with an engaging discussion while also covering new ground? We want people who watch every hangout to learn something new, but people who haven’t probably would prefer to hear the most exciting and general stuff. The best answer I can come up with is that every hangout should have a balance of the basics with a few new details. But then, part of the fun of the hangouts is that they’re unscripted and have specialist guests who can report directly on what they’ve been doing, so we actually can’t balance anything too carefully.

So are we doing the best we can with a tough but interesting format? Should we organize our discussions and the questions we choose differently? Your suggestions are appreciated!
