
Archive for October, 2012

Scorching hot and cool at the same time

Friday, October 19th, 2012

In September, the operators of the Large Hadron Collider (LHC) at CERN pulled off something new: colliding a beam of protons with a beam of lead ions. The LHC usually runs with two beams of identical particles (protons or ions) circulating in opposite directions around the accelerator. So why this new configuration?

These ions are atoms that have been stripped of all their electrons, leaving only the atomic nucleus. A lead ion contains 82 protons plus 126 neutrons, all held together by the nuclear force. Protons are themselves composite particles, since they are made of three quarks "glued" together by "gluons", the particles associated with the nuclear force.

So when such nuclei collide at nearly the speed of light, who could predict where each quark and each gluon will end up? Even with only fifteen billiard balls, it is practically impossible to guess where they will go after the break. If, on top of that, each projectile is made of hundreds of particles, it becomes completely unpredictable.

At first glance, it would seem that all that can come out of ion-ion collisions is an incredible mess. But in fact, these super-energetic collisions produce the hottest and coolest mess there is: a quark-gluon plasma.

Everyone knows the three states of matter: solid, liquid and gas. The fourth state, plasma, is far less familiar. It is what you find inside a neon tube when the applied voltage is strong enough to strip all the electrons off the gas. The positively charged ions and the electrons then float around freely, having enough energy not to recombine.

A quark-gluon plasma is just the next step. Imagine supplying enough energy to break apart not only the atoms but also the nuclei and even the nucleons (the generic name for the neutrons and protons inside atomic nuclei). What you get is an extremely energetic soup of quarks and gluons.

It does not get any hotter than that, and this would be the state of all matter immediately after the Big Bang. Astonishingly, the quark-gluon plasma behaves like a fluid with collective properties, not like a collection of independent particles. It is in fact a perfect fluid, with zero viscosity. If you tried to confine it in a container, the fluid would climb up the container's walls and spread out as far as it could. It does not get any cooler than that, either.

The ALICE experiment is dedicated precisely to the study of this plasma. Every year, the LHC runs for a few weeks with lead ions instead of protons. ALICE collects data during both the proton-proton collisions and the heavy-ion collisions. Even when only protons collide, the projectiles are not solid balls as in billiards but composite objects. When comparing what comes out of ion collisions with what comes out of proton collisions, the ALICE physicists must first distinguish which effects stem from the projectiles being protons bound inside a nucleus rather than free protons.

So far, it seems that the quark-gluon plasma forms only in ion collisions, since these alone provide the required energy density over a large enough volume (the volume of an atomic nucleus). Some of the observed effects, such as the number of particles emerging from the quark-gluon plasma at various angles or velocities, depend in part on the nature of the final state that is created. When a plasma forms, it reabsorbs some of the emitted particles, so that far fewer of them are seen coming out of these collisions.

Proton-lead collisions may make it possible to separate what is attributable to the initial state (protons that are free or bound inside a nucleus) from what is attributable to the final state (as when the plasma reabsorbs part of the emitted particles).

Already, with a single day of running in this mode, the ALICE collaboration has just published two scientific papers. The first gives the measurement of the density of charged hadrons produced in proton-ion collisions, compared with the same measurement performed in proton-proton and ion-ion collisions once everything has been properly normalized. The second compares the momentum distributions of charged hadrons in proton-proton and ion-ion collisions.
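For the technically inclined, the standard quantity behind this kind of normalized comparison (it is not named explicitly above) is the nuclear modification factor. As a sketch, for proton-lead collisions it is the per-collision yield in p-Pb divided by the proton-proton yield scaled by the average number of binary nucleon-nucleon collisions:

    R_{p\mathrm{Pb}}(p_T) \;=\; \frac{1}{\langle N_{\mathrm{coll}}\rangle}\,
    \frac{\mathrm{d}N^{p\mathrm{Pb}}/\mathrm{d}p_T}{\mathrm{d}N^{pp}/\mathrm{d}p_T}

A value close to 1 means a proton-lead collision behaves like a simple superposition of independent nucleon-nucleon collisions; sizeable deviations point to the initial-state or final-state effects discussed above.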

The ultimate goal is to study the structure functions of the projectiles used, that is, to describe how quarks and gluons are distributed inside protons when those protons are free or bound inside the nucleus of a lead ion.

Many more studies will follow in early 2013, during the two-month period devoted to proton-ion collisions.

A "snapshot" of the debris of a proton-lead ion collision captured by the ALICE detector, showing the large number of different particles created from the energy released.

 

Pauline Gagnon

 

To be notified when new posts appear, follow me on Twitter: @GagnonPauline, or by e-mail by adding your name to this mailing list.

 


I like talking about science. I like talking about religion. I even like talking about the relationship and boundaries between the two. These are all fascinating subjects, with many questions that are very much up for debate, so I am very pleased to see that CERN is participating in an event in which scientists, philosophers, and theologians talk together about the Big Bang and other questions.

But this quote, at least as reported by the BBC, simply doesn’t make any sense:

Co-organiser Canon Dr Gary Wilton, the Archbishop of Canterbury’s representative in Brussels, said that the Higgs particle “raised lots of questions [about the origins of the Universe] that scientists alone can’t answer”.

“They need to explore them with theologians and philosophers,” he added.

The Higgs particle does no such thing; it is one aspect of a model that describes the matter we see around us. If there is a God, CERN’s recent observations tell us that God created a universe in which the symmetry between the photon and the weak bosons is probably broken via the Higgs Mechanism. If there is not, they tell us that a universe exists anyway in which the symmetry between the photon and the weak bosons is probably broken via the Higgs Mechanism. It doesn’t raise any special questions about the origins of the universe, any more than the existence of the electron does.

There are many interesting philosophical questions to ask about the relationships between models of scientific observations on the one hand, and notions of absolute Truth on the other. You can also talk about what happened before the times we can make scientific observations about, whether there are “other universes” with different particles and symmetries, and so on. Theologians and philosophers have much to say about these issues.

But in regard to searches for the Higgs boson in particular, the people we need to explore questions with are mostly theoretical physicists and statisticians.


“Snowmass” (Not Snowmass)

Saturday, October 13th, 2012

Every so often, perhaps once or twice a decade, particle physics in the United States comes to some kind of a crossroads that requires us to think about the long-term direction of the field. Perhaps there is new experimental data that is pointing in new directions, or technology developments that make some new facility possible, or we’re seeing the end of the previous long-term plan and it’s time to develop the next one. And when this happens, the cry goes up in the community — “We need a Snowmass!”

Snowmass refers to Snowmass Village in Colorado, just down the road from Aspen, the home of the Aspen Center for Physics, a noted haunt for theorists. During the winter, Snowmass is a ski resort. During the summer, it's a mostly empty ski resort, where it's not all that expensive to rent some condos and meeting rooms for a few weeks. Over the past few decades there have been occasional "summer studies" held at Snowmass, typically organized by the Division of Particles and Fields of the American Physical Society (and sponsored by a host of organizations and agencies). It's a time for the particle-physics community to come together for a few weeks and spend some quality time focusing on long-range planning.

The last big Snowmass workshop was in 2001. At the time, the Fermilab Tevatron was just getting started on a new data run after a five-year shutdown for upgrades, and the LHC was under construction. The top quark had been discovered, but was not yet well characterized. We were just beginning to understand neutrino masses and mixing. The modern era of observational cosmology was just beginning. A thousand physicists came to Snowmass over the course of three weeks to plot the future of the field. (And I was a lot younger.) Flash forward eleven years: the Tevatron has been shut down (leaving the US without a major high-energy particle collider), the LHC is running like gangbusters, we’re trying to figure out what dark energy is, and just in the past year two big shoes have dropped — we have measured the last neutrino mixing angle, and, quite famously, observed what could well be the Higgs boson. So indeed, it is time for another Snowmass workshop.

This week I came to Fermilab for a Community Planning Meeting for next year’s Snowmass workshop. Snowmass 2013 is going to be a bit different than previous workshops in that it will not actually be at Snowmass! Budgetary concerns and new federal government travel regulations have made the old style of workshop infeasible. Instead, there will be a shorter meeting this summer hosted by our colleagues at the University of Minnesota (hats off to thee for having us), so this time we won’t have as much time during the workshop to chew over the issues, and more work will have to be done ahead of time. (But I suspect that we’re still going to call this workshop “Snowmass”, just as the ICHEP conference was “the Rochester conference” for such a long time, even if it’s now the “Community Summer Study”.)

This Snowmass is being organized along the three "frontiers" that we're using to classify the current research efforts in the field — energy, intensity and cosmic. As someone who works at the LHC, I'm most familiar with what's going on at the energy frontier, and certainly there are important questions that have only come into focus this year. Did we observe the Higgs boson at the LHC? What more do we have to know about it to believe that it's the Higgs? What are the implications of not having observed any other new particles yet for particle physics and for future experiments? The Snowmass study will help us understand how we answer these questions, and specifically what experiments and facilities are needed to do so. There are lots of interesting ideas that are out there right now. Can the LHC tell us what we need to know, possibly with an energy or luminosity upgrade? Is this the time to build a "Higgs factory" that would allow us to measure Higgs properties precisely? If so, what's the right machine for that? Or do we perhaps need an accelerator with even greater energy reach, something that will help us create new particles that would be out of reach of the LHC? What kind of instrumentation and computing technologies are needed to make sense of the particle interactions at these new facilities? The intensity and cosmic frontiers have equally big and interesting questions. I would posit that the scientific questions of particle physics have not been so compelling for a long time, and that it is a pivotal time to think about what new experiments are needed.

However, we also have the bracing reality that we are looking at these questions in a budget environment that is perhaps as constrained as it has ever been. Presentations from our champions and advocates at the Department of Energy and the National Science Foundation, the agencies that fund this research (and that sponsor the US LHC blog), were encouraging about the scientific opportunities but also noted the boundary conditions that arise from the federal budget as a whole, national research priorities, and our pre-existing facilities plan. It will continue to be a challenge to make the case for our work (compelling as it may be to us, and to someone who might be interested in looking at the Quantum Diaries site) and to envision a set of facilities that can be built and used given the funding available.

The first (non-native) settlers of Snowmass, Colorado, were miners, who were searching for buried treasure under adverse conditions. They were constrained by the technology of the time, and the facilities that were available for their work. I shouldn’t suggest that what we are doing is exactly like mining (it’s much safer, for one thing), but hopefully when we go to Snowmass (or really “Snowmass”) we will be figuring out how to develop the technology and facilities that are needed to extract an even greater treasure.


Until last evening I thought that The Simpsons game for iPhone and iPad was just another overrated little game. Your task is to rebuild Springfield after an explosion obliterates Homer’s power plant. The game is supposed to be an interesting, easy-going distraction, but it evolves at too slow a pace (unless you are willing to spend money on it, which I am not) and lacks the much-appreciated humorous punch that characterizes the show.

Or so was my impression until, out of the blue, Professor Frink shows up and asks for nothing less than a new … super collider.  Lisa, as a citizen reluctant to spend the taxpayers’ money, asks, “What about the LHC at CERN?”, but Frink argues that’s not powerful enough. So I decide that my Springfield will have its very own super collider and then I find out the new super collider’s building looks like Fermilab’s Wilson Hall. Now, THAT was a Simpsons punch!

Check out the screen shots below.

Marcelle Soares-Santos


Many years ago, I served on a committee responsible for recommending funding levels for research grants. After the awards were announced, a colleague commented that all we did was count the number of publications and award grants in proportion to that number. So, I checked and did a scatter plot. Boy, did they scatter. The correlation between the grant size and the number of publications was not that strong. I then tried citations; again a large scatter. Well, perhaps the results really were random—nah, that could not happen; I was on the committee after all.
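For anyone curious to repeat this kind of sanity check, here is a minimal sketch; the file name and column names are hypothetical stand-ins, since the committee's data were never public.

    import pandas as pd
    import matplotlib.pyplot as plt

    # Hypothetical table: one row per grant, with the indicators and the award size
    df = pd.read_csv("grants.csv")  # assumed columns: publications, citations, grant_size

    # Scatter plot of grant size against publication count
    df.plot.scatter(x="publications", y="grant_size")
    plt.savefig("grant_vs_publications.png")

    # Pearson correlations: a value near 1 would mean the committee effectively
    # just counted papers; a wide scatter corresponds to a much smaller value.
    print(df["grant_size"].corr(df["publications"]))
    print(df["grant_size"].corr(df["citations"]))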

I did not do a multivariable analysis, but there were no simple correlations between what might be called quantitative indicators and the size of the research grant. This supports the conclusions of the Expert Panel on Science Performance and Research Funding: Mapping research funding allocation directly to quantitative indicators is far too simplistic, and is not a realistic strategy[1]. Trying to do that is making the mistake of the logical positivists who wanted to attach significance directly to the measurements. As I have argued in previous essays, the meaning is always in the model and logical positivism leads to a dead end.

In deciding funding levels, the situation is too complicated for the use of a simple algorithm. Consider the number of publications. There are different types of publications: letters, regular journal articles, review articles, conference contributions, etc. Publications are of different lengths. Should one count pages rather than publications? Or is one letter worth two regular journal papers, letters being shorter and considered by some to be more important than regular articles? But, in reality, one wants to see a mix of the different types of publications. A review article might indicate standing in the field, but one also wants to see original papers. Is a paper in a prestigious journal worth more than one in a more mundane journal? What is a prestigious journal anyway? There is also the question of multi-author papers. One gets suspicious if all the papers are with more senior or well-known authors, but a record made up entirely of single-author papers is also a warning sign. Generally, co-authoring papers with junior collaborators is a good thing. In some fields, all papers include all members of the collaboration, so the number of coauthors carries very little information. The order of authors on a publication may or may not be important. And on it goes. Expert judgment is, as always, required to sort out what it all means.

Citations are an even bigger can of worms. Even in a field as small as sub-atomic theoretical physics there are distinct variations in the pattern of citations among the subfields: string theory, particle phenomenology or nuclear physics. For example, the lifetime for citations in particle phenomenology is significantly shorter than in nuclear physics. Then there is the question of self-citations, citations to one's own work or, more subtly, to close collaborators. And what about review articles? Is a citation to a review article as important as one to an article on original research? Review articles frequently collect more references. My most cited paper is a review article. A person can, with a bit of effort, sort this all out. Setting up an algorithm would be damn near impossible. A person could even, gasp, read some of the papers and form an independent opinion of their validity. But that could introduce biases. Hence, numbers are important but they must be interpreted. This leads to the conclusion: Quantitative indicators should be used to inform rather than replace expert judgment in the context of science assessment for research funding allocation.[2]

The other problem with simple algorithms is the feedback loop. With a simple algorithm, researchers naturally change their behaviour to maximize their grants. For example, if we judge on the number of publications, people split papers up, publish weak papers, or publish what is basically the same thing several times. I have done that myself. None of these improve the quality of the work being done. Expert judgment can generally spot these a mile away. After all, the experts have used these tricks themselves.

More generally, there is the problem of trying to reduce everything to questions that have nice quantitative answers. Far better an approximate answer to the right question, which is often vague, than an exact answer to the wrong question, which can always be made precise[3]. There seems to be this argument that since science normally uses quantitative methods, administration should follow suit so it can have the success of science. It is like the medieval argument that since most successful farmers had three cows, the way to make farmers successful was to give them all three cows. But the wrong question can never give the right answer. It is far better to ask the right question and then work on getting a meaningful answer. What we want to do at a science laboratory, or for funding science generally, is to advance our understanding of how the universe works to the maximum extent possible and use the findings for the benefit of society. The real question is: how do we do this? That is neither an easy question to answer nor one that can be easily quantified. Not being quantifiable does not make it a meaningless question. There are various metrics an informed observer can use to make intelligent judgments. But it is very important that administrators avoid the siren call of logical positivism and not try to attach meaning directly to a few simple measurements.


[1] Quote from: Informing research choices: indicators and judgment. Expert Panel on Science Performance and Research Funding. Council of Canadian Academies (2012).

[2] Ibid.

[3] Tukey, J. W. (1962). The future of data analysis. Annals of Mathematical Statistics 33(1), 1-67.


VERTEX 2012

Friday, October 5th, 2012

[Photo: Seth talking at the VERTEX 2012 conference]

Never mind my complaints about travel, VERTEX 2012 was a very nice conference. There were a lot of interesting people there, mostly much more expert than me on the subject of vertex detectors. (I’ve written before about how tracking works and how a pixel detector works. In general, a vertex detector is a high-precision tracker designed to measure exactly where tracks come from; a pixel detector is one type of vertex detector.) My talk was about the current operations of the CMS pixel detector; you can see me giving the talk at right, and the (very technical) slides are here. Other talks were about future development in on-detector chip and sensor technology; this work is likely to affect the next detectors we build, and the upgrades of our current detectors as well.

[Photo: VERTEX 2012 conference attendees at Sunrise Peak, Jeju]

The location of the conference — Jeju, Korea — was also very nice, and we got an afternoon off to see some of the island. The whole island is volcanic. The central mountain dominates the landscape, and there are lots of grass-covered craters. Sunrise Peak, at left, erupted as recently as 5,000 years ago, but it seemed pretty quiet when we were there.

Overall, the conference was a great opportunity to meet people from all over the world and learn from them. And that’s really why we have to travel so far for these things, because good people work everywhere.


Happy Birthday ATLAS!

Monday, October 1st, 2012

Today marks the 20th anniversary of the submission of the experiment’s Letter of Intent – the first published document officially using the name ATLAS.

Here’s to many exciting years to come!
More on the history: ATLAS News

Higgsdependence Day cake, celebrating the 5 sigma discovery of a Higgs-like boson


The Centre de spectrométrie nucléaire et de spectrométrie de masse (CSNSM) went through a period of intense excitement when the first high-temperature superconductors appeared. Between fruitless attempts and magic recipes, Louis Dumoulin, a researcher at CSNSM, recounts this episode with its air of mystery:

In 1987, CSNSM was starting the construction of the Aramis ion accelerator. A small study group made up of Harry Bernas, Jacques Chaumont and Jérôme Lesueur went on a mission to the United States to visit existing accelerators of this type. Among other stops, they passed through the prestigious Bell Labs.

When they came back, their eyes were shining and they had a strange air about them. They talked among themselves in a cryptic, knowing way… but not about ion beams! They finally explained: "There was a strange atmosphere at Bell Labs. Everyone was very busy and moved around furtively. You could often see well-known researchers who had deserted their high-tech consoles to painstakingly grind a black powder with a pestle and mortar. Even the theorists were affected!" Thanks to the friendship between Harry and Bob Dynes, the head of the department, our three emissaries were let in on the secret: a material had just been discovered that was superconducting at 92 K, that is, 15 K above the boiling point of liquid nitrogen. This was the Holy Grail of every physicist in the field, a scientific and technological bombshell… and still unpublished.

We pounce on the recipe scribbled in a notebook. It is incredibly simple! Make an intimate mixture, black in this case, of three compounds of yttrium, barium and copper, sinter it under pressure and anneal it at 800 °C under oxygen. The ingredients are quickly gathered, since they are common in chemistry. In our turn we catch the "black powder" fever, and some people must have wondered what had come over the team for us to bring out the pestles and mortars.
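As an aside for the chemically curious: the three compounds are not named in the anecdote, but the precursors conventionally used for this solid-state synthesis are Y2O3, BaCO3 and CuO, mixed in a 1:2:3 ratio of yttrium to barium to copper. Written only as an illustrative sketch, not as the lab's actual recipe, the nominal firing reaction is

    \tfrac{1}{2}\,\mathrm{Y_2O_3} + 2\,\mathrm{BaCO_3} + 3\,\mathrm{CuO}
    \;\longrightarrow\; \mathrm{YBa_2Cu_3O_{6.5}} + 2\,\mathrm{CO_2}

followed by the slow oxygen-loading step explained at the end of this story.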

For several days, it is failure after failure. The samples are more insulating than anything else. In despair, we hold a council of war, as it happens in the parking lot. Just then Philippe Monod rides past on his old motorcycle, coming from the Laboratoire de physique des solides on his way home. He stops. Very quickly we know that he knows, and vice versa:

– Is it working for you?
– Yes.
– Not for us!
– How do you do it?
– Well, after a few hours at 800°, we take the sample out and rush to measure it in liquid nitrogen.
– Right! Tonight, switch off the furnace and go to bed.

Then he rides off with a roar of his engine… We are stunned. We work in a multidisciplinary laboratory, we rely on the Standard Model, quantum mechanics and even, at times, general relativity, and we are supposed to follow some kind of magic ritual? Humble experimentalists that we are, we comply. The recipe has the advantage of sparing us yet another sleepless night.

The next day, the new sample is mounted on the test rig. Everyone involved is there, because everyone senses that something is about to happen. Even Pierre Lehmann, the director of IN2P3, who was passing by for another reason: great men are always there at great moments. Jean Paul Burger marks pencil crosses on a scrap of graph paper. I dictate the resistance-temperature pairs to him. 95 K… 93 K… then the resistance sags, then it drops irresistibly, if I may put it that way. At 90 K, it is zero. Sheer jubilation! We have done it! It is our first high-temperature superconducting sample. Probably the first in an IN2P3 laboratory. Superconducting accelerator magnets will run on liquid nitrogen, we will put photovoltaic panels and wind turbines in deserts where they will bother no one and carry the energy thousands of kilometres without losses. And nobody doubts that a room-temperature superconductor is just around the corner!

Twenty-five years later, critical temperatures top out at around 130 K, the magnets still run on liquid helium (the LHC reminds us of that today), the solar panels are on the rooftops and the wind turbines… But that day we were dreaming: a new era was opening up in physics, with fantastic applications.

But why did we have to wait a night before testing the sample? We got the answer a little later. Annealing at 800° under oxygen produces the required crystal structure, but leads to the compound YBa2Cu3O6, which is an insulator. The seventh oxygen atom, essential for superconductivity, can only be introduced below 400 °C, and with slow kinetics. The sample therefore has to spend time in that temperature range, which happens naturally as the furnace cools down, but not when the sample is pulled out abruptly.
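Written as a sketch of the chemistry just described, the slow low-temperature step is the uptake of that extra oxygen:

    \mathrm{YBa_2Cu_3O_6} + \tfrac{x}{2}\,\mathrm{O_2}
    \;\longrightarrow\; \mathrm{YBa_2Cu_3O_{6+x}} \qquad (x \to 1 \ \text{below} \sim 400\,^{\circ}\mathrm{C})

with compositions close to YBa2Cu3O7 being the superconducting phase.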

For the survivors of this adventure, our very modest (but oh so gratifying) success is forever associated with Philippe Monod riding past on his motorcycle. Scientific information sometimes travels along unexpected paths.

— an anecdote provided by the Centre de Spectrométrie Nucléaire et de Spectrométrie de Masse (CSNSM), a joint research unit of CNRS/IN2P3 and Université Paris Sud, on the occasion of the 40th anniversary of IN2P3.
