Archive for June, 2013


How do we know Dark Matter exists?

Wednesday, June 26th, 2013

First part in a series of four on Dark Matter

Some of you may have heard of dark matter, this mysterious type of matter that no one can see but which makes up 27% of the content of the Universe, while visible matter (you, me, all stars and galaxies) accounts for only 5%. How do we know it really exists? In fact, its existence is confirmed in many different ways.

[Image: dark matter disk]

Galactic clusters

Fritz Zwicky, a Swiss astronomer, was the first to suspect the existence of dark matter, in 1933. He set out to measure the mass of a galactic cluster (a group of several galaxies) using two different methods. First, he inferred this mass from the speed of the galaxies. Just like kids on a merry-go-round have to hold on to avoid being ejected, galaxies are held together in a spinning galactic cluster by the gravitational force provided by the matter it contains. If there were not enough matter to create this force, the galaxies would simply scatter.

He then compared this result with the mass evaluated from the light the galaxies emitted. He realised that there was far more matter in the cluster than what was visible. This matter of an unknown type generated a gravitational field without emitting light. Hence its name: dark matter.

Velocity curves of spinning galaxies

But it was not until the 1970s that an American astronomer, Vera Rubin, measured the speed of stars in rotating galaxies accurately enough to convince the scientific community. She observed that stars in spinning galaxies all rotate at roughly the same velocity, regardless of their distance from the galactic centre. This contradicts Kepler's law, which describes the rotation of planets around the Sun.

A planet located farther from the Sun orbits more slowly, following the curve labelled A in the graph below. However, Vera Rubin showed that stars in a spinning galaxy instead follow curve B. It was as if the stars were not rotating around the visible centre of the galaxy but around many unknown centres, all providing additional gravitational attraction. This could only happen if huge amounts of invisible matter filled the entire galaxy and extended beyond it.

[Image: velocity curves, showing the Keplerian curve A and the flat observed curve B]
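
To see numerically why curve B demands unseen matter, one can compare the Keplerian speed v = sqrt(GM/r) with the mass M(r) = v²r/G that must lie inside radius r to hold the speed constant. Below is a minimal sketch with toy numbers (invented for illustration, not fitted to any real galaxy):

```python
import numpy as np

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_central = 2e41   # toy central mass, kg (roughly 1e11 solar masses)

# Galactocentric radii from about 1.6 to 16 kpc, in metres
radii = np.linspace(5e19, 5e20, 5)

# Curve A: Keplerian fall-off, assuming all mass sits at the centre
v_kepler = np.sqrt(G * M_central / radii)

# Curve B: flat rotation curve at a typical 220 km/s
v_flat = 2.2e5

# Mass that must be enclosed within r to sustain the flat curve
M_enclosed = v_flat**2 * radii / G

for r, v, M in zip(radii, v_kepler, M_enclosed):
    print(f"r = {r:.1e} m: Keplerian v = {v/1e3:5.0f} km/s, "
          f"mass needed for 220 km/s = {M:.1e} kg")
```

The enclosed mass required for a flat curve grows linearly with radius, far beyond what the visible stars supply; that excess is the dark matter.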

Gravitational lensing

One striking dark matter detection technique is called “gravitational lensing”. It relies on the fact that large concentrations of matter (visible or dark) create gravitational fields strong enough to distort space.

Imagine tossing a ping-pong ball across a stretched bed sheet. The ball simply rolls along the surface of the sheet. But drop a heavy object in the middle of the sheet, and the ball, while still following the sheet's surface, will now travel on a curve.

[Image: a heavy mass deforming a stretched sheet]

Light behaves the same way in space. Empty space, devoid of any matter, is just like a stretched sheet: light moves through it in a straight line. In the presence of a massive object such as a star or a galaxy, space is deformed and light follows the curvature of the distorted space.

[Image: gravitational lensing diagram]

(Adapted from Pat Burchat’s TED talk)

Light coming from a distant galaxy will bend when passing near a massive clump of dark matter, as shown above. The galaxy will appear shifted, as if its light were coming from different places (the top and bottom images). In three dimensions, all the diverted light forms a ring, as seen in the photo below taken by the Hubble telescope. If the galaxy and the observer are not perfectly aligned, only small arcs appear.

[Image: horseshoe-shaped Einstein ring photographed by the Hubble telescope]

(Photo credit: NASA)
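
The size of such a ring directly encodes the mass of the lens. For a compact lens, the angular radius of the Einstein ring is given by the standard textbook expression (not quoted in the original post):

\[
\theta_E = \sqrt{\frac{4GM}{c^{2}}\,\frac{D_{LS}}{D_{L}\,D_{S}}}
\]

where M is the total lens mass (visible and dark) and D_L, D_S and D_LS are the distances to the lens, to the source, and from lens to source. Measuring the ring or arcs therefore weighs the lens, dark matter included.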

This technique is now powerful enough to produce maps of the dark matter distribution in the Universe.

Cosmic microwave background

Astrophysicists can even infer how much dark matter exists by studying the cosmic microwave background. This is relic radiation dating back to when the Universe was barely 380,000 years old. This fossil light has been travelling for more than 13 billion years and now reaches us from all directions and from no direction in particular.

The map of the Universe below was drawn using data taken by the Planck satellite. It shows hotter spots corresponding to where first dark matter, then visible matter, started forming lumps, providing the seeds for galaxies. Nowadays, scientists believe dark matter acted as a catalyst in galaxy formation.

[Image: Planck map of the cosmic microwave background]

(Photo credit: Planck experiment)

The microwave background radiation can be decomposed just as sound from a musical instrument can be broken into harmonics. From the features of its “power spectrum”, i.e. the amount of radiation associated with each frequency, astrophysicists can calculate the quantity of dark matter contained in the Universe.
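
As a rough analogy only (a real CMB analysis expands the temperature map in spherical harmonics on the sky, not in a one-dimensional Fourier series), the sketch below builds a toy signal from two harmonics and reads its power spectrum back off a discrete Fourier transform:

```python
import numpy as np

fs = 1000                      # sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)  # one second of samples

# Toy "signal": a strong 50 Hz harmonic, a weaker 120 Hz one, plus noise
signal = (1.0 * np.sin(2 * np.pi * 50 * t)
          + 0.3 * np.sin(2 * np.pi * 120 * t)
          + 0.1 * np.random.default_rng(0).standard_normal(t.size))

# Power spectrum: how much of the signal sits at each frequency
freqs = np.fft.rfftfreq(t.size, 1 / fs)
power = np.abs(np.fft.rfft(signal)) ** 2

# The two injected harmonics stand out as peaks
for f in (50, 120):
    i = np.argmin(np.abs(freqs - f))
    print(f"power near {f:3d} Hz: {power[i]:.0f}")
```

In the CMB's angular power spectrum, it is the relative heights of the peaks that pin down the dark matter density.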

Today we have numerous and convincing indications of the presence of dark matter, but we see it only indirectly, through its gravitational effects. How about direct evidence? That will be my next topic. But beware: there is hot debate in the scientific community on how to interpret the various direct detection results.

Second part in the Dark Matter series: Getting our hands on dark matter

Third part in the Dark Matter series: Cosmology and dark matter

Fourth part in the Dark Matter series: Can the LHC solve the Dark Matter mystery?

Pauline Gagnon

To be alerted of new postings, follow me on Twitter: @GagnonPauline or sign up on this mailing list to receive an e-mail notification.

For more information:

Hangout with CERN: The Dark Side of the Universe

TED Ed clip: Dark matter: The matter we can’t see

TED talk by Pat Burchat: Shedding light on dark matter

 


The Snowmass at Minnesota Community Summer Study Meeting is one month away, and the Young Physicists Group is planning for strong participation. Students, postdocs and other untenured scientists are all encouraged to attend.

In addition to a wealth of physics results, many contributed by some of our brightest young people, the program includes:

  • a plenary talk (7/29, 1:30 p.m.) presenting results of the Career and Science Prospects Survey, which has gathered 1000+ responses so far and remains open until 7/15
  • a parallel session (date and time TBD) to discuss and edit the paper summarizing our views

The information and registration page is: http://www.hep.umn.edu/css2013/

The accommodations page lists hotels with rooms available at a reduced rate for participants.

Those traveling to or from Fermilab might be interested in this carpooling option.

Important deadlines are 7/7 for hotel reservations and 7/15 for meeting registration and survey input.

If you have not yet completed our brief survey, please do so. If you have done it already — many thanks! — please encourage others to do the same. The link to the survey is: http://tinyurl.com/snowmassyoung

See you in Minnesota!

Snowmass Young Conveners


The Large Hadron Collider (LHC) started a vast consolidation program in March 2013 that will last well into 2015. Everybody at CERN, whether on the accelerators or the experiments, is now working hard to complete all the needed tasks in time.

The experimental collaborations are currently deploying huge efforts on many fronts. One major task is preparing to deal with the increased data volume the revamped LHC will bring in 2015.

The LHC will resume at higher energy and luminosity, i.e. with more intense beams. For the LHCb experiment, which operates at constant luminosity, higher energy will translate into more tracks per event and almost twice the signal rate. The same goes for the other experiments, ALICE, ATLAS and CMS, but they will also run at higher luminosity, meaning more collisions occurring simultaneously every time bunches of protons cross in the LHC, which makes each recorded event increasingly difficult to disentangle.

[Image: ATLAS event displays from 2010, 2011 and 2012]

To give you an idea, here are three snapshots captured by the ATLAS detector in successive years. The event on the left was recorded at low luminosity at the start of the LHC. Very few collisions happened at the same time, yielding very few tracks per event, as seen in the picture.

Then in 2011, the average number of simultaneous collisions increased to around 12 (centre) and reached up to 40 by the end of 2012 (right). In 2015, there will be between 60 and 80 superimposed collisions in each event, depending on the operating scheme that is chosen. The challenge will be to extract a collision of interest from the huge quantity of tracks in each event.

Hence, much effort is spent improving the simulation, calibration and reconstruction of such events. Physicists are building on the existing techniques to be able to cope with the expected data volume.

[Image: zoomed view of a CMS event with 78 simultaneous proton-proton collisions]

The picture above shows a zoomed view of an event in the centre of the CMS detector where 78 proton-proton collisions took place simultaneously (the bright dots on the horizontal axis). The scale here is a few centimetres.

Here, each track corresponds to a charged particle, and every one of these tracks must be associated with a single vertex, namely the point in space where it was created in a proton collision. This way, only the tracks associated with the main collision point are retained to reconstruct the event.

In the picture above, most tracks come from collisions where the protons barely grazed each other and can be ignored. Only the energetic collisions have a chance to produce the heavy and rare particles we are interested in.
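
As a deliberately simplified sketch of that association step (illustrative only; real reconstruction uses full three-dimensional track fits, momenta and fit qualities, and all numbers here are invented), tracks can be grouped with candidate vertices by their crossing point along the beam axis:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy candidate vertices: z positions (cm) along the beam axis
vertices_z = np.array([-3.2, -1.0, 0.4, 2.7])
hard_vertex = 2   # index of the energetic collision of interest

# Toy tracks: each crosses the beam axis at z0, smeared by a
# 0.05 cm resolution around the vertex that produced it
true_vertex = rng.integers(0, vertices_z.size, size=200)
track_z0 = vertices_z[true_vertex] + 0.05 * rng.standard_normal(200)

# Associate every track with its nearest vertex in z ...
assigned = np.argmin(np.abs(track_z0[:, None] - vertices_z[None, :]), axis=1)

# ... and retain only the tracks matched to the main collision point
kept = track_z0[assigned == hard_vertex]
print(f"kept {kept.size} of {track_z0.size} tracks for the hard collision")
```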

In parallel, all groups are using the opportunity of the shutdown to replace or repair electronic modules, power supplies and other components that failed or showed signs of deterioration during the past three years. New sub-detectors are even being added to increase the detectors' performance. For example, the CMS collaboration is extending its muon detector coverage and the ATLAS experiment is adding a fourth layer to its pixel detector. LHCb is replacing its beam pipe and ALICE is making major upgrades to its calorimeters.

But the main effort for all LHC experiments is still to finalize all analyses using the full data collected so far. Everyone seems to be following my mother’s advice: We must tirelessly revisit our work until it is perfect. (Cent fois sur le métier, remettez votre ouvrage). This is precisely what is happening right now. Each aspect of the data analysis is revisited to reach the full potential of the current data set: calibration, particle identification, background evaluation and signal extraction.

Every collaboration already has dozens of new results ready for the upcoming major summer conferences, such as the European Physical Society meeting in mid-July.

Pauline Gagnon

To be alerted of new postings, follow me on Twitter: @GagnonPauline or sign up on this mailing list to receive an e-mail notification.

 



This article appeared in symmetry on June 20, 2013.

[Image: Nigel Lockyer (photo: TRIUMF, 2013)]

Physicist Nigel Lockyer, head of TRIUMF, Canada’s national laboratory for particle and nuclear physics, will take a new role in September: director of Fermi National Accelerator Laboratory.

Lockyer will be a familiar face to many at Fermilab. He spent 22 years as a researcher on the Collider Detector at Fermilab, or CDF, experiment at the Tevatron particle collider, starting in 1984. He was co-leader of the 600-member experiment from 2002 to 2004.

Lockyer says he looks forward to reconnecting with former colleagues, to forging closer ties between laboratories in the United States and Canada, and to directing the only US national laboratory solely dedicated to particle physics research.

“The opportunity to lead one of the world’s most prestigious particle physics laboratories was too good to pass up,” Lockyer says. “The future of the field looks to be very exciting; it’s a great time to be a particle physicist.”

The Board of Directors of the Fermi Research Alliance, which operates Fermilab for the US Department of Energy, offered Lockyer the job at the conclusion of a nine-month, international search by a 15-member appointed committee. He will succeed Fermilab Director Pier Oddone, who announced in 2012 that he would retire this year after eight years as director.

“I’m happy to see Nigel will take the helm at Fermilab,” Oddone says. “He is quite familiar with the laboratory and will keep Fermilab at the forefront of particle physics research. He has shown immense competence in his job at TRIUMF, stepping out of his comfort zone as a faculty member at the University of Pennsylvania to lead and transform Canada’s particle and nuclear physics laboratory.”

The progress TRIUMF has made under Lockyer’s leadership bodes well for the future of the Fermilab program. Since he became director in 2007, TRIUMF has expanded its operations by 25 percent and has established Canada’s first accelerator-science cooperative agreements with Japan, India, China and Korea.

In 2008, TRIUMF established the nonprofit organization Advanced Applied Physics Solutions, dedicated to commercializing innovations from the physical sciences across Canada. In 2009, TRIUMF began operation of TIGRESS, a state-of-the-art gamma-ray spectrometer with which scientists conduct detailed studies of nuclear decays.

In 2011, TRIUMF started construction of the Advanced Rare IsotopE Laboratory, or ARIEL, the laboratory’s flagship facility to expand Canada’s capabilities to produce and study isotopes for medicine using next-generation superconducting radio-frequency technology.

This year, TRIUMF received support from the Canadian government to pursue new technology for producing Tc-99m, a medical isotope used to image diseases in about 40 million medical procedures annually. The laboratory developed new cyclotron-based technology that frees medical centers from their current dependence on nuclear reactors to produce the isotopes.

Lockyer boasts impressive scientific credentials as well. In 2006 he won the American Physical Society’s W.K.H. Panofsky Prize for measuring the lifetime of one of the smallest building blocks of the universe, the bottom quark.

Lockyer was born in Scotland and raised in Canada. He earned his undergraduate degree from York University in Toronto in 1975 and his PhD from The Ohio State University in 1980. After that, he spent four years at SLAC National Accelerator Laboratory as a postdoc working with Nobel Laureate Burton Richter, who directed SLAC from 1984 to 1999.

In 1984, Lockyer was hired as a professor of particle physics at the University of Pennsylvania and began work on the CDF experiment. In Pennsylvania, he partnered with the Penn Medical School to work on applications of particle physics in cancer treatment. Lockyer became director of TRIUMF in May 2007 and has worked as a professor at the University of British Columbia during his tenure.

Even before Lockyer gets started at Fermilab, he will have the chance to help shape the future of US particle physics. He will take part in the field’s long-term planning process, a major component of which will take place in Minneapolis in August. But he’s most looking forward to moving the science ahead, he says.

“I would like to help articulate the vision for the US particle physics community, get everyone on board and get started,” he says.

Kathryn Jepsen


Your summer travel options

Friday, June 14th, 2013

Now that summer is fully here, are you feeling that old wanderlust, the desire to hit the open road? Well then, there are a lot of interesting places to go on the physics conference circuit between now and Labor Day. There are many fabulous locations on the menu, and who knows, you might get to hear the first public presentation of an exciting new physics result. While it’s true that what many would consider the most glamorous stuff from the LHC has already been pushed out (at the highest priority), you can be assured that scientists are hard at work on new results, and of course there are many other particle-physics experiments that are doing important work. So, find your frequent-flyer card and make sure you’ve changed the oil, and let’s see where you might be headed this summer:

  • 2013 Lepton Photon Conference, San Francisco, CA, June 24-29, hosted by SLAC. This is definitely the most prestigious conference this year; it is the international conference that is the odd-numbered year complement to the ICHEP meetings that are held in even-numbered years. Last year’s ICHEP saw the announcement of the observation of the Higgs boson, and if someone wants to make a big splash this year, they will do it at Lepton Photon. I have previously discussed how ICHEP works; the Lepton Photon series has a similarly storied history, but is slightly different in format, in that there are only plenary overview talks rather than a series of shorter, more focused presentations. San Francisco is always a great destination, and a fine place to consider the physics of the cable car and plate tectonics.
  • 2013 European Physical Society Conference on High Energy Physics, Stockholm, Sweden, July 18-24. If results aren’t ready in time for Lepton Photon, they could be ready in time for EPS. This conference also appears in odd-numbered years, and with a format that has both parallel and plenary sessions, there are many opportunities for younger people to present their work. It is probably the premier particle-physics conference in Europe this year. Thanks to the tilted axis of the earth, and the position of Stockholm at 59 degrees north of the equator, you’ll be able to enjoy 17 hours and 40 minutes of daylight each day at this conference…starting at 4 AM each morning.
  • Community Summer Study 2013, aka Snowmass on the Mississippi, Minneapolis, MN, July 29-August 6. This isn't really a conference, but it is the culmination of the year-long effort of the US particle-physics community to define its long-range plan. With the discovery of the Higgs boson and important developments in neutrino physics, we have better clues on what we should be trying to study in the future. Now we have to understand what facilities are best for this science, and what the technical barriers are to building and exploiting them. But we have to realize that we're working with a finite budget, and we'll have to do some hard thinking to understand how to set priorities. You might think that Minneapolis doesn't have much on San Francisco or Stockholm, but my wife is from there, so I have traveled there many times and I think it's a great place to visit. You can contemplate the balancing forces and torques on the “Spoonbridge and Cherry” sculpture at the Walker Art Center, or the aerodynamics of Mary Tyler Moore's hat on the Nicollet Mall.
  • 2013 Meeting of the American Physical Society Division of Particles and Fields, Santa Cruz, CA, August 13-17. Like the EPS conference, DPF also meets in odd-numbered years and is a chance for the US particle physics community to gather. It's one of my favorite conferences, with a broad program of particle physics, neither too big nor too small. It is especially friendly to younger people presenting their own work. Measurements that weren't ready for the earlier conferences could still get a good audience here. Yes, you might have gone to nearby San Francisco in June, but Santa Cruz has a totally different feel, and you can study the hydrodynamics that power the redwood trees that are all over the campus.

And you might ask, where am I going this summer? I'd love to get to all of these, but I have another destination this summer: I will be moving my family to Geneva for a sabbatical year at CERN in July. It's a little disappointing to be missing some of the action in the US, but I'm looking forward to an exciting year. I will be returning to the US for the Snowmass workshop, where I'm co-leading a working group, but that's about it for conferences for me this summer. That will still be plenty exciting, and I'll do my best to report all the news about it here.


Politicians are faced with hard choices. How should they spend public money? Investing in science is an excellent choice, not only for the long term but also for immediate returns.

Of course, if you are asking what the Higgs boson will put on humanity's plate, the answer is easy: nobody knows. When the finance minister asked Michael Faraday about the practical value of electricity in 1850, he had no idea, but he replied: “One day, sir, you may tax it.”

At least the discovery of the Higgs boson means that we now have a complete theory to explain how visible matter works. Hence, humanity can go to bed knowing a little more about the Universe we live in.

But there are plenty of indirect returns stemming from all the research activities in particle physics. Many of them have just been summarised in a new brochure called “Accelerating science and innovation – Societal benefits of European research in particle physics”.

The brochure was presented by CERN to European science and technology ministers in the last week of May in Brussels, on the occasion of a special meeting of the CERN Council hosted by the European Commission.

The World Wide Web, invented at CERN more than 20 years ago, is estimated to have stimulated €1.5 trillion in annual commercial traffic. This is 1500 times larger than the billion CHF spent on research annually at CERN.

Around 10,000 accelerators using technology developed for particle physics are now in operation for medical use in hospitals worldwide.

Thanks to physics, X-rays and radiotherapy are used every day for cancer treatment and medical imaging. Hadron therapy, where protons or carbon ions are used instead of the photons of conventional radiotherapy, is a recently developed and promising technique that is set to greatly improve therapy for certain types of cancer. Accelerators developed in collaboration with CERN are already in use at MedAustron in Austria and CNAO in Italy.

[Image: the CNAO accelerator]

The accelerator developed for hadron therapy by CNAO in partnership with CERN provides a more efficient way to kill cancerous cells. (Photo courtesy of CNAO)

Even antimatter research is put to good use. The ACE experiment performed at CERN's antimatter facility showed that antiprotons could be effective in destroying tumours.

Particle physics at CERN has helped produce more efficient solar panels and is now contributing to the development of desk-top accelerators that will enable hospitals to produce their own single doses of radioactive isotopes locally, on demand.

CERN engineers are testing high-temperature superconducting cables made of magnesium diboride. This kind of research could eventually lead to electricity being carried over large distances without energy loss.

[Image: solar panels at Geneva airport]

The solar panels used by the Geneva airport for heating rely on a technology created to improve the vacuum in the beam pipes of CERN's accelerators.

Accelerator technology is also used in various industrial clean-up projects. In trials in Texas, electron beams converted highly infectious sewage sludge into safe-to-handle agricultural fertiliser. Efforts are also underway at the n-TOF facility at CERN to transmute highly radioactive nuclear waste into safe materials.

These are but a few of the many applications stemming from research conducted in particle physics facilities. And that is without counting the training of a supply of people ready for technological challenges, the stimulation of students' and teachers' interest, and the enthusiasm for physics ignited all over the world.

So it was great news last week that the CERN Council adopted the European Strategy for Particle Physics at its special meeting hosted by the European Commission. The benefits are multiplied when nations pool their efforts and resources in the pursuit of fundamental knowledge.

Pauline Gagnon

To be alerted of new postings, follow me on Twitter: @GagnonPauline or sign up on this mailing list to receive an e-mail notification.

     


Does God exist? This is one of the oldest questions in philosophy and is still much debated. The debate on the God particle is much more recent, but searching for it has cost a large fortune and inspired people's careers. Before we can answer the questions implied in the title, however, we have to decide what we mean when we say something exists. The approach here follows that of my previous essay, which defines knowledge in terms of models that make successful predictions.

Let us start with a simple question: what does it mean when we say a tree exists? The evidence for the existence of trees falls into two categories: direct and indirect. Every autumn, I rake the leaves in my backyard. From this I deduce that the neighbour has a tree. This is indirect evidence: I develop a model in which the leaves in my backyard come from a tree in the neighbour's yard. This model is tested by checking the prediction that the leaves come from the direction of the neighbour's yard. Observations have confirmed this prediction. Can I then conclude that a tree exists? Probably, but it would be useful to have direct evidence. To obtain it, I look into my neighbour's yard. Yup, there is a tree. But not so fast: what my eye perceives is a series of impressions of light. The brain uses that input to construct a model of reality, and that model includes the tree. The tree we see is so obvious that we frequently forget it is the result of model construction, subconscious model construction, but model construction nonetheless. The model is tested when I walk into the tree and hurt myself.

Now consider a slightly more sophisticated example: atoms. The idea of atoms, in some form or other, dates back to ancient India and Greece, but the modern idea of atoms dates to John Dalton (1766 – 1844). He used the concept of atoms to explain why elements always interact in ratios of small whole numbers. This is indirect evidence for the existence of atoms and was enough to convince the chemists, but not the physicists, of that time. Some, like Ernst Mach (1838 – 1916), refused to believe in what they could not see up until the beginning of the last century[1]. But then Albert Einstein's (1879 – 1955) famous 1905 paper[2] on Brownian motion (the motion of small particles suspended in a liquid) convinced even the most recalcitrant physicists that atoms exist. Einstein showed that Brownian motion could be easily understood as the result of the motion of discrete atoms. This was still indirect evidence, but convincing to almost everyone. Atoms were only seen directly after the invention of the scanning electron microscope, and even then there was model dependence in interpreting the results. As with the tree, we claim that atoms exist because, as shown by Dalton, Einstein and others, they form an essential part of models that have a strong track record of successful predictions.

Now on to the God particle. What a name! The God particle has little in common with God, but the name does sound good in the title of this essay. Then again, calling it the Higgs boson is not without problems, as people other than Peter Higgs[3] (1920 – ) have claimed to have been the first to predict its existence. Back to the main point: why do we say the God particle exists? First there is the indirect evidence. The standard model of particle physics has an enviable record of successful predictions. Indeed, many (most?) particle physicists would be happier if it had had some incorrect predictions. We could replicate most of the successful predictions of the standard model without the God particle, but only at the expense of making the model much more complicated. As with the recalcitrant physicists of old who rejected the atom, the indirect evidence for the God particle was not good enough for most modern-day particle physicists. Although few actually doubted its existence, like doubting Thomas, they had to see it for themselves. Thus, the Large Hadron Collider (LHC) and its detectors were built and direct evidence was found. Or was it? Would lines on a computer screen have convinced logical positivists like Ernst Mach? Probably not, but the standard model predicted bumps in the cross-sections, and the bumps were found. Given the accumulated evidence and its starring role in the standard model of particle physics, we confidently proclaim that the God particle, like the tree and the atom, exists. But remember that even for the tree our arguments were model dependent.

Having discussed the God particle, what about God? I would apply the same criteria to His/Her/Its existence as to the tree, the atom, or the God particle. As in those cases, the evidence can be direct or indirect. Indirect evidence for God's existence would be, for example, the argument from design attributed to William Paley (1743 – 1805). This argument makes an analogy between the design in nature and the design of a watch. The question then is: is this a good analogy? If we adopt the approach of science, this reduces to the question: can the analogy be used to make correct predictions for observations? If it can, the analogy is useful; otherwise it should be discarded. There is also the possibility of direct evidence: has God or His messengers ever been seen or heard? But as the previous examples show, nothing is ever really seen directly; everything depends on model construction. As optical illusions illustrate, what is seen is not always what is there. Even doubting Thomas may have been too ready to accept what he had seen. As with the tree, the atom or the God particle, the question comes back to: does God form an essential part of a model with a track record of successful predictions?

So does God exist? I have outlined the method for answering this question and given examples of the method for trees, atoms and the God particle. Following the accepted pedagogical practice in nuclear physics, I leave the task of answering the question of God's existence as an exercise for you, the reader.

To receive a notice of future posts follow me on Twitter: @musquod.


[1] Yes, 1905 was the last century. I am getting old.

[2] He had more than one famous 1905 paper.

[3] Why do we claim Peter Higgs exists? But, I digress.
