Archive for January, 2012

So what is the largest neutrino detector in the world? This discussion came up in regards to a very nice little educational video on YouTube that mentions the ANITA experiment:

(these minutephysics pieces are quite good!)

So, ANITA is the balloon-borne experiment mentioned in the video, and one on which I am a collaborator. But folks at IceCube claim that theirs is the world’s largest neutrino detector, and that’s a project I also work on. Furthermore, I was just at the South Pole working on a new neutrino detector called ARA (the Askaryan Radio Array), which has been described as the largest neutrino detector in the world even when only partially constructed. (See arXiv for a good ARA summary.)

So what’s the truth? Well, as in so many different endeavors, it comes down to the definition of largest. Or largest in what sense.

IceCube: This is an instrumented volume of a full cubic kilometer. Made up of over five thousand individual digital optical modules (DOMs), it is certainly the largest instrumented volume in the world. It uses the Cherenkov light emitted by neutrino-induced shower particles in the optically clear ice to image the shower and hence the neutrino.

ANITA: During an ANITA balloon flight, the payload observes a simply astonishing area: more than a million square kilometers at a time. Only for certain narrow angular ranges can events form in the ice, refract through the surface and reach the balloon floating at 120,000 feet, but it is the largest observed area. This uses the Askaryan effect, a Cherenkov-like radio pulse emitted by showers in dense materials.

ARA: The full ARA will cover hundreds of cubic kilometers of ice, but will have just 37 stations, each with four strings of four antennas. That is a much larger volume than IceCube, but much more sparsely instrumented, because radio waves have a longer attenuation length than optical photons in the cold polar ice. The engineering test station that has been running since January 2011 already has the largest volumetric acceptance of any neutrino detector in the world: several cubic kilometers. This also uses the radio technique.
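
To underline the punchline, here is a toy Python sketch (my own illustration; the values are only the rough figures quoted above, not official numbers) showing that each experiment wins by a different measure:

    # "Largest" depends on the metric. Rough figures from the text above.
    largest = {
        "instrumented volume":   ("IceCube",          "~1 km^3"),
        "observed area":         ("ANITA",            ">1e6 km^2"),
        "volumetric acceptance": ("ARA test station", "several km^3"),
    }

    for metric, (experiment, size) in largest.items():
        print(f"Largest by {metric}: {experiment} ({size})")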

So, largest neutrino detector in the world? Depends on your definition.

Read more about them: ANITA, IceCube (also on Facebook), the ARA homepage, and other radio neutrino efforts…RICE, ARIANNA, SalSA


When I first started dabbling in the dark side and told people I was working on the philosophy of science, the most common response from my colleagues was: Oh, the foundations of quantum mechanics? Actually not. For the most part, I find the foundations of quantum mechanics rather boring. Perhaps that is because my view of science has a strong instrumentalist tinge, but the foundations of quantum mechanics have always seemed to me to be trying to fit a quantum reality into a classical framework; the proverbial triangular peg in a hexagonal hole. Take wave-particle duality, for example. Waves and particles are classical idealizations. The classical point particle does not exist, even within the context of classical mechanics. It should come as no surprise that when the classical framework breaks down, the concepts from classical mechanics are no longer valid. What quantum mechanics is telling us is only that the classical concepts of waves and particles are no longer valid. Interesting, but nothing to get excited about.

The problem with the uncertainty principle is similar. This principle states that we cannot simultaneously measure the position and motion of a particle. Now, classically, the state of a particle is given by its location and motion (i.e. its momentum). Quantum mechanically, the state is given by the wave function or, if you prefer, by a distribution in the location-motion space[1]. Now the problem is not that the location and motion cannot be measured simultaneously, but that the particle does not simultaneously have a well-defined position and motion, since its state is given by a distribution. This causes realists, at least classical realists, to have fits. In quantum mechanics, the position is only known when it is directly measured, i.e. properties of the system only exist when they are being looked at. This is a distinctly antirealist point of view. Again, this is trying to force a classical framework on a quantum system. If anything is real in quantum systems, it is wave functions, not individual observables. But see below.

Quantum mechanics is definitely weird; it goes against our common sense, our intuition. The main problem is that, while classical mechanics is deterministic, quantum mechanics is probabilistic. To see why this is a problem, consider the classical-probability problem of rolling a die. I roll a fair die. The chance of it being 2 is 1/6; similarly for any value from 1 to 6. Now once I look at the die, the probability distribution collapses. Let’s say I see a 2. The probability is now 1 that the value is 2 and zero for the other values. But for Alice, who has not seen me check, the probabilities are still all 1/6. I now tell her that the number is even. This collapses her probability distribution so that it is 1/3 for 2, 4, 6 and zero for 1, 3, 5. Now for Bob, who did not hear me telling Alice, the probabilities are still 1/6 for each of the numbers. Two important points arise from this. First, classical probabilities change discontinuously when measurements are made and, second, classical probabilities depend not just on the system but on the observer, i.e. probabilities are observer dependent.
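
To make the die example concrete, here is a minimal Python sketch (my own illustration, not part of the original post) of how each observer’s distribution collapses only when that observer gains information:

    from fractions import Fraction

    def uniform(outcomes):
        # Uniform distribution over the given outcomes.
        p = Fraction(1, len(outcomes))
        return {o: p for o in outcomes}

    def collapse(dist, event):
        # Condition a distribution on an event (a Bayesian update).
        total = sum(p for o, p in dist.items() if event(o))
        return {o: (p / total if event(o) else Fraction(0))
                for o, p in dist.items()}

    die = [1, 2, 3, 4, 5, 6]
    me, alice, bob = uniform(die), uniform(die), uniform(die)

    me = collapse(me, lambda o: o == 2)            # I look and see a 2
    alice = collapse(alice, lambda o: o % 2 == 0)  # I tell Alice: "it's even"
    # Bob hears nothing, so his distribution stays uniform.

    print(me[2], alice[2], bob[2])  # prints: 1 1/3 1/6

The same roll is simultaneously “certain” for me, “1/3 likely” for Alice and “1/6 likely” for Bob; the distribution belongs to the observer, not to the die.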

We should expect the same quantum mechanically. We should expect measurements to discontinuously change the probability distribution, and the probability distribution to be observer dependent. The first is certainly true. Quantum mechanical measurements cause the wave function to collapse and consequently the probability distribution[2] also collapses. The second is not commonly realized or accepted, but it should be. The idea that the wave function is a property of the quantum system plus observer, not the quantum system in isolation, is not new. Indeed, it is a variant of the original Copenhagen interpretation of quantum mechanics. But frequently, it is denied. When this is done, one is usually forced to the conclusion that the mind or consciousness plays a large and mysterious role in the measurement process. Making the wave function, or the state description, observer dependent avoids this problem. The wave function is then just the information the observer has about the quantum system. As Niels Bohr (1885 – 1962), one of the founders of quantum mechanics, said: “It is wrong to think that the task of physics is to find out how nature is. Physics concerns what we can say about nature.”

Let us consider the wave function collapse in more detail. Consider an entanglement experiment. The idea is to have a system emit two particles such that if we know the properties of one, the properties of the other are also known. One of the two emitted particles is measured by Bob and the other by Alice.[3] Now, Alice is lazy, so she has her particle transported to her home laboratory. She also knows that once Bob has done his measurement, she does not have to measure her particle but only has to call Bob to get the answer. Bob is also lazy, but he does go to the lab and, if he feels like it, does the measurement and faithfully records it in his lab book. One day when Alice calls, she gets no answer. It turns out Bob has died between the time he would have made the measurement and when he would have recorded it in his lab book. Now Alice is very upset. Not that Bob has died—she never liked him anyway—but that she does not know if the momentous event of the wave function collapse has happened or not. Her particle has not arrived at her home yet, but there is no experiment she can do on it to determine if the wave function has collapsed or not. The universe may have split into many worlds but she can never know! Of course, if the wave function is a property of the observer-quantum system, there is no problem. The information Bob had on the wave function was lost when Bob died, and Alice’s wave function is as it always was. Nothing to see here, move along.

So what is the interpretation of quantum mechanics? An important part seems to be that wave functions are the information the observer has on the quantum system, and are not a property of the quantum system alone. If you do not like that, well, there is always instrumentalism,[4] i.e. shut up and calculate.

Additional posts in this series will appear most Friday afternoons at 3:30 pm Vancouver time. To receive a reminder follow me on Twitter: @musquod.


[1] Technically, the phase space.

[2] The probability is the absolute value of the wave function squared.

[3] By convention it has to be Bob and Alice. I believe this is a quantum effect.

[4] Instrumentalism has no problem with quantum mechanics or, indeed, any other scientific model.


Hi All,

Exciting news came out of the Japanese physics lab KEK (@KEK_jp, @KEK_en) last week about some pretty exotic combinations of quarks and anti-quarks. And yes, “exotic” is the new “tantalizing.” At any rate, I generally like assuming that people do not know much about hadrons, so here is a quick explanation of what they are. On the other hand, click to jump past “Hadrons 101” and straight to the news.

Hadrons 101: Meeting the Folks: The Baryons & Mesons

Hadrons are pretty cool stuff and are magnitudes more quirky than those quarky quarks. The two most famous hadrons (hadron being the name for any bound combination of quarks and anti-quarks) are undoubtedly the proton and the neutron:

According to our best description of hadrons (Quantum Chromodynamics), the proton is effectively* made up of two up-type quarks, each with an electric charge of +2/3 elementary charges**; one down-type quark, which has an electric charge of -1/3 elementary charges; and all three quarks are held together by gluons, which are electrically neutral. Similarly, the neutron is effectively composed of two down-type quarks, one up-type quark, and all the quarks are held strongly together by gluons. Specifically, any combination of three quarks or anti-quarks is called a baryon. Now just toss an electron around the proton and you have hydrogen, the most abundant element in the Universe! Bringing together two protons, two neutrons, and two electrons makes helium. As they say, the rest is Chemistry.

However, as the name implies, baryons are not the only type of hadrons in town. There also exist mesons, combinations of exactly one quark and one anti-quark. As an example, we have the pions (pronounced: pie-ons). The π+ (pronounced: pie-plus) has an electric charge of +1 elementary charges, and consists of an up-type quark & an anti-down-type quark. Its anti-particle partner, the π− (pronounced: pie-minus), has a charge of -1, and is made up of an anti-up-type quark & a down-type quark.
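
As a quick check of this charge bookkeeping, here is a small Python sketch (my own illustration) that adds up quark charges for the four hadrons mentioned so far; a trailing "~" marks an anti-quark, which carries the opposite charge:

    from fractions import Fraction

    # Quark charges in units of the elementary charge.
    CHARGE = {"u": Fraction(2, 3), "d": Fraction(-1, 3)}

    def charge(quarks):
        # Total charge of a hadron from its quark content.
        total = Fraction(0)
        for q in quarks:
            sign = -1 if q.endswith("~") else 1
            total += sign * CHARGE[q.rstrip("~")]
        return total

    print("proton  (u u d):", charge(["u", "u", "d"]))  # 1
    print("neutron (u d d):", charge(["u", "d", "d"]))  # 0
    print("pi+     (u d~):", charge(["u", "d~"]))       # 1
    print("pi-     (u~ d):", charge(["u~", "d"]))       # -1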


If we now include heavier quarks, like strange-type quarks and bottom-type quarks, then we can construct all kinds of baryons, mesons, anti-baryons, and anti-mesons. Interactive lists of all known mesons and all known baryons are available from the Particle Data Group (PDG)***. That is it. There is nothing more to know about hadrons, nor has there been any recent discovery of additional types of hadrons. Thanks for reading and have a great day!


* By “effectively,” I mean to ignore and gloss over the fact that there are tons more things in a proton, like photons and heavier quarks, but their aggregate influences cancel out.

** Here, an elementary charge is the magnitude of an electron’s electric charge. In other words, the electric charge of an electron is (-1) elementary charges (that is, “negative one elementary charges”). Sometimes an elementary charge is defined as the electric charge of a proton, but that is entirely tautological for our present purpose.

*** If you are unfamiliar with the PDG, it is arguably the most useful site to high energy physicists aside from CERN’s ROOT user guides and Wikipedia’s Standard Model articles.

The News: That’s Belle with an e

So KEK operates a super-high intensity electron-positron collider in order to study super-rare physics phenomena. It’s kind of super. Well, guess what. While analyzing collisions with the Belle detector experiment, researchers discovered the existence of two new hadrons, each made of four quarks! That’s right, count them: 1, 2, 3, 4 quarks! In each case, one of the four quarks is a bottom-type quark and another is an anti-bottom quark. (Cool bottom-quark stuff.) The remaining two quarks are believed to be an up-type quark and an anti-down type quark.

The two exotic hadrons have been named Zb(10610) and Zb(10650). Here, the “Z” implies that our hadrons are “exotic,” i.e., not a baryon or meson; the subscript “b” indicates that each contains a bottom quark; and the 10610/10650 tell us that our hadrons weigh 10,610 MeV/c² and 10,650 MeV/c², respectively. A proton’s mass is about 938 MeV/c², so both hadrons are about 11 times heavier than the proton (that is pretty heavy). The Belle Collaboration presser is really great, so I will not add much more.

Other Exotic Hadrons: When Barry met Sally.

For those keeping track, the Belle Collaboration’s recent finding of two new 4-quark hadrons makes it the twelfth-or-so “tetra-quark” discovery. What makes this one so special, however, is that all previous tetra-quarks have been limited to include a charm-type quark and an anti-charm-type quark. This is definitely the first case to include bottom-type quarks, and it therefore offers more evidence that the formation of such states is not a unique property of particularly charming quarks but rather a naturally occurring phenomenon affecting all quarks.

Furthermore, it suggests the possibility of 5-quark hadrons, called penta-quarks. Now these things take the cake. They are a sort of grand link between elementary particle physics and nuclear physics. To be exact, we know 6-quark systems exist: one is called deuterium, a stable isotope of hydrogen (thanks to @incognitoman for pointing out that deuterium is, in fact, stable). 9-quark systems definitely exist too, e.g., He-3 and tritium. Etc. You get the idea. Discovering the existence of five-quark hadrons empirically establishes a very elegant and fundamental principle: that in order to produce a new nuclear isotope, so long as all Standard Model symmetries are conserved, one must simply tack on quarks and anti-quarks. Surprisingly straightforward, right? Though sadly, history is not on the side of 5-quark systems.

Now go discuss and ask questions! 🙂

Run-of-the-mill hadrons that are common to everyday interactions involving the Strong Nuclear Force (QCD) are colloquially called “standard hadrons.” They include mesons (quark-anti-quark pairs) and baryons (three-quark/anti-quark combinations). Quark combinations consisting of more than three quarks are called “exotic hadrons.”

Happy Colliding.

– richard (@bravelittlemuon)


PS, I am always happy to write about topics upon request. You know, QED, QCD, OED, etc.


Getting layed

Friday, January 20th, 2012

On a past blog post I came across the most wonderful comment from Kelly, one of our readers:

Lay people are far smarter than it is supposed, they are also fickle and quick to get bored or offended if talked down to

This got me thinking about the last time I spoke to an expert in another field about their research, about the last time I got “layed”, if you’ll excuse the awful pun. I also hope you’ll excuse an excursion into biochemistry for one post.

Alex and Kia, relaxing in the sun

I was in Manchester for the weekend, spending the evening with Alex and Kia, a couple of friends from my undergraduate days, and we had a lot of catching up to do! They’re both biochemistry graduate students and they work in the same lab, although in different areas. We stayed up all night over tea and biscuits (how British), discussing our research, using analogies, looking at diagrams, and coming up with all sorts of thought experiments to try to understand what was happening. They had a lot of questions about how the detector works, how we reconstruct particles (including the Higgs!) and why it takes so long to find it.

Having a discussion about something technical with an expert is not only lots of fun, but it also tells you a lot about your own skills when it comes to explaining concepts. As Kelly mentioned in her comment, there’s a temptation to talk down to people, but I find it’s more rewarding for all involved if we match our discussion to the intelligence of the audience. I’d like to think that most people who read Quantum Diaries and US LHC Blogs are here because they’re intelligent, they’re not scared of nuance, they want to read more than what a press release will tell them, and they may even be a scientist too. Once we find the right level of discussion for a given audience things get much more rewarding!

From the outside biochemistry is such a wonderful field of research. Their work is instantly relevant to the fight against disease and cancer, the field is expanding so rapidly that what students learn one year may be out of date a couple of years later, and there’s no end to the range of different topics you can research. It’s about as fast paced as you can get! It must have its frustrations, like any area of research, but being a layperson I got a chance to appreciate the concepts without the hard work, and that made it sound amazing.

The watered down version of what they told me went something like this:

HIV and T cells

What HIV looks like (Telegraph)

The HIV virus is extremely dangerous for one reason: it infects the white blood cells (T cells) that fight disease in the body. This in itself isn’t a huge problem, but when a person with HIV has some infection then things become very serious. It’s not so much that the white blood cells don’t function anymore; it’s more that they use so many of their resources building more copies of the virus. The virus attaches itself via a protein, and a small percentage of the population have a different form of the protein, which has a different shape. In principle, if a person could get a complete blood transfusion then they could be given the white blood cells with the other protein and may become immune to HIV. An easier way to do this would be to have a bone marrow transplant from another person, as the bone marrow creates the white blood cells. Naturally there are dangers associated with any procedure like this, so it’s not something to be taken lightly. Still, in the course of an hour or so my friends gave me a wonderful insight into how HIV works and some of the discoveries in the fight against the disease.

Genetic diseases

While on the topic of diseases with risky treatments, we also discussed a family of genetic diseases (known as mucopolysaccharide diseases, a name I could not remember) which cause premature aging or degradation of the body. The diseases are associated with the failure of the body to break down certain sugars, so the cells get clogged up, do not function as well, and then part of the body ages. The exact type of disease manifests in different ways, and sometimes they can only be identified once the disease has progressed. So I asked why children aren’t just screened for this at birth, as they are for many other diseases. It turns out that the cost of the test isn’t low enough and the rate of incidence of the disease isn’t high enough for that to become a realistic option yet. Putting groundbreaking, life-saving research in that kind of context is rather chilling. I’m glad physicists don’t have to deal with those kinds of choices!

Kia was kind enough to link to one of the charities, so I could find out more about the disease and how it affects us: The MPS Society.

The immune system

But we weren’t done yet! We also talked about the immune system and cancer. Having heard so much about T cells, I was curious about where they came from and why they only attacked foreign objects in the body. It turns out that T cells spend much of their time in the thymus where they are trained to learn what cells in the body look like. When the T cells are produced there is some shuffling of genes and each T cell ends up a little different. If a T cell latches onto part of the thymus it gets destroyed and isn’t allowed into the rest of the body. Otherwise the T cell is let out into the bloodstream. If it finds a cell it “thinks” is attractive, it latches on and releases chemicals into the blood stream. Other T cells respond to the chemical gradient and they too latch on. After a short while the foreign body is overwhelmed and dies.

A red blood cell, a platelet and a T-cell, side by side (Wikipedia)

Well, that’s how it works in principle, and there are many ways in which it can go wrong. Some viruses are adept at mutating so that their appearance changes. (On the subject of mutations, my friends also treated me to a discussion of “frame shifts” and how you can get two proteins from one gene!) If one of these viruses gets identified and overwhelmed, one copy may mutate into another form, and the T cells are back to square one again. Another “nightmare scenario” is when a cancerous growth releases a different kind of chemical which essentially says “All fine over here! Carry on!” to the T cells. If that happens then things can go quite seriously wrong quite quickly. If all that wasn’t complicated enough, T cells can also get “confused” and latch onto cells from their host body, giving rise to autoimmune diseases. The immune system is so amazingly intricate that you could easily spend a whole evening just scratching the surface of the subject. At the same time it also seems immensely fragile and wonderfully robust. Although the apparatus for making an immune system is inherited, the good work it does fighting disease isn’t. If those ideas don’t blow your mind then I don’t know what will!

The PhD problem

To round off the evening we also discussed how our PhDs had progressed. Biochemistry seems less forgiving than physics: they told me that, between them and two other mutual friends, two had to find new topics, new funding and new institutions. Sometimes, when a research idea doesn’t work out and the funding disappears, even if it’s through no fault of the student, the student has no choice but to start again. I faced a similar situation with my own PhD, as funding for the experiment was cut short and I suddenly found myself with 18 months left, no research topic and no service task. My colleagues rallied round, asked questions, contacted people and helped me find a new topic and a new service task on the same experiment. I finished about 9 months later than expected (but still within four years!) with a decent thesis and some glowing letters of recommendation. Once again, I was glad to be in the cozy realm of physics! It’s differences like these that aren’t at all obvious, and they make us realize just how much we have to learn from each other. (My friends were also amazed to find I had about a hundred papers with my name on them!)

PhDs are elastic... (PhD Comics)

When did you last get layed?

So for a few hours I was a layperson with two experts at my disposal, and it was one of the most entertaining evenings I’ve had in a long time. So to the lay people reading this blog, if you don’t find the term “layperson” pejorative, it would be great to hear about your experiences. What discussions particularly excited you? How do you deal with being patronized or, perhaps worse, overwhelmed with ideas? Or for that matter, if you’re an expert in another area, what are your experiences telling other people about your work? In short, tell us what happened the last time you got “layed”.



2012: the year of the dragon

Friday, January 20th, 2012

I do not have a crystal ball but it is nevertheless possible to sketch what can be expected from the Large Hadron Collider (LHC) experiments at CERN this year.

Right now, the accelerator is stopped for the annual maintenance shutdown. This is the opportunity to fix all problems that occurred during the past year both on the accelerator and the experiments. The detectors are opened and all accessible malfunctioning equipment is being repaired or replaced.

In the 27-km long LHC tunnel, surveyors are busy getting everything realigned to a high precision, while various repairs and maintenance operations are under way. By early March, all magnets will have been cooled down again and prepared for operation.

The experimentalists are not only working on their detectors but also improving all aspects of their software: the detector simulations, event reconstruction algorithms, particle identification schemes and analysis techniques are all being revised.

By late March, the LHC will resume colliding protons with the goal of delivering about 16 inverse femtobarns of data, compared to 5 inverse femtobarns in 2011. This will enable the experiments to improve the precision of all measurements achieved so far, push all searches for new phenomena slightly further and explore areas not yet tackled. The hope is to discover particles revealing the existence of new physics phenomena. The CMS and ATLAS physicists are looking for dozens of hypothetical particles, the Higgs boson being the most publicized but only one of many.

When protons collide in the LHC accelerator, the energy released materializes in the form of massive but unstable particles. This is a consequence of the well-known equation E=mc², which simply states that energy (represented by E) and mass (m) are equivalent, and each can change into the other. The symbol c² represents the speed of light squared and acts like a conversion factor. This is why in particle physics we measure particle masses in units of energy like GeV (giga electronvolt) or TeV (tera electronvolt). One electronvolt is the energy acquired by an electron through a potential difference of one volt.
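
To make the units concrete, here is a small Python sketch (my own illustration, using rounded physical constants) that converts a mass in kilograms to its energy equivalent in GeV:

    # E = m c^2, expressed in electronvolt-based units.
    C = 299_792_458.0        # speed of light, m/s
    EV = 1.602176634e-19     # one electronvolt, in joules

    def mass_to_gev(mass_kg):
        # Energy equivalent of a mass, in GeV.
        return mass_kg * C**2 / EV / 1e9

    proton_kg = 1.6726219e-27
    print(f"proton mass: ~{mass_to_gev(proton_kg):.3f} GeV")  # ~0.938 GeV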

It is therefore easier to create lighter particles since less energy is required. Over the past few decades, we have already observed the lighter particles countless times in various experiments. So we know fairly well how many events containing them we should observe. We can tell when new particles are created when we see more events of a certain topology than what we expect from those well-known phenomena, which we refer to as the background.

We can claim that something additional and new is also occurring when we see an excess of events. Of course, the bigger the excess, the easier it is to claim something new is happening. This is the reason why we accumulate so many events, each one being a snapshot of the debris coming out of a proton-proton collision. We want to be sure the excess cannot be due to some random fluctuation.

Some of the particles we are looking for are expected to have a mass on the order of a few hundred GeV. This is the case for the Higgs boson, and we already saw possible signs of its presence last year. If the observed excess continues to grow as we collect more data in 2012, it will be enough to claim the Higgs boson discovery beyond any doubt, or to rule it out forever.

Other hypothetical particles may have masses as large as a few thousand GeV or equivalently, a few TeV. In 2011, the accelerator provided 7 TeV of energy at the collision point.  The more energy the accelerator has, the higher the reach in masses, just like one cannot buy a 7000 CHF car with 5000 CHF. So to create a pair of particles with a mass of 3.5 TeV (or 3500 GeV), one needs to provide at least 7 TeV to produce them. But since some of the energy is shared among many particles, the effective limit is lower than the accelerator energy.
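
In symbols (my own restatement of the argument above), pair-producing two particles of mass m requires at least their combined rest energy:

    E_{\text{available}} \ge 2\,mc^{2}
    \quad\Longrightarrow\quad
    m \le \frac{7\ \text{TeV}}{2c^{2}} = 3.5\ \text{TeV}/c^{2}

and, as noted, the practical reach is lower still because the collision energy is shared among all the outgoing particles.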

There are ongoing discussions right now to decide if the LHC will be operating at 8 TeV this year instead of 7 TeV as in 2011. The decision will be made in early February.

If CERN decides to operate at 8 TeV, the chances of finding very heavy particles will slightly increase, thanks to the extra energy available. This will be the case for searches for particles like the W’ or Z’, heavier versions of the well-known W and Z bosons. For these, collecting more data in 2012 will probably not be enough to push the current limits much farther; we will need to wait until the LHC reaches its full energy of 13 or 14 TeV in 2015 to push these searches well past the limits already placed around 1 TeV in 2011.

For LHCb and ALICE, the main goal is not to find new particles. LHCb aims at making extremely precise measurements to see if there are any weak points in the current theoretical model, the Standard Model of particle physics. For this, more data will make all the difference. Already in 2011, they saw the first signs of CP violation involving charm quarks and hope to confirm this observation. This measurement could shed light on why matter overtook antimatter as the universe expanded after the Big Bang, when matter and antimatter must have been created in equal amounts. They will also investigate new techniques and new channels.

Meanwhile, ALICE has just started analyzing the 2011 data taken in November with lead ion collisions. The hope is to better understand how the quark-gluon plasma formed right after the Big Bang. This year, a special run involving collisions of protons and lead ions should bring a new twist in this investigation.

Exploring new corners, testing new ideas, improving the errors on all measurements and, most likely, getting the final answer on the Higgs: that is what we are in for with the LHC in 2012. Let’s hope that in 2012 the oriental dragon, symbol of perseverance and success, will see our efforts bear fruit.

Pauline Gagnon

To be alerted of new postings, follow me on Twitter: @GagnonPauline or sign up on this mailing list to receive an e-mail notification.




Location, Location, Location

Thursday, January 19th, 2012

If I had to pick one thing that’s definitely better on my old experiment, ATLAS, than on my new experiment, CMS — and especially if I had to pick something I could write publicly without getting into trouble — it would be this: the ATLAS detector is across the street from the rest of CERN. I’m not sure how that was decided, but once you know that, you know where CMS has to be: on the other side of the ring, 5 or 6 miles away. That’s because the two detectors have the same goals and need the same beam conditions, and two diametrically opposite points on the LHC are where identical performance is easiest to provide. The pre-existing caverns from the LEP collider, whose tunnel the LHC now uses, probably also helped determine where the detectors are.

In any case, it used to be that when I wanted to work on my detector, I had only to go across the street. Now I have to drive out of Switzerland and several miles into France. Except, I don’t like driving. So I’ve been working on alternate means of transportation. A few months ago I walked. Last night I had to go to downtown Geneva, so I took the bus. It’s actually pretty good, although the bus stop is a mile away from CMS. There’s also the shift shuttle, which runs from the main CERN site to CMS every 8 hours via a rather roundabout route. And I can bike, once the weather gets better and I get myself a little more road-worthy. To be honest, every option for getting here is much slower than driving, but I enjoy figuring out ways to get places enough that I’m going to keep trying for a while.

I have plenty of chances to try, because I’ll be here in the CMS control room a lot of the time over the next few weeks. Right now, I’m learning and helping with the pixel detector calibration effort. (We’re changing the operating temperature, so all the settings have to be checked.) Soon I’ll be learning to take on-call shifts. So the more I stay here, the more I learn. I got here this morning, and I won’t leave tonight until about 11 pm. I could take the shift shuttle back — or maybe I’ll just get a ride.


Visiting LHCb!

Thursday, January 19th, 2012

Warning for those on slow internet connections: this post contains quite a few large images
which may take a long time to load. Just be patient, I promise they’ll be worth the wait!

I’ve been blogging about LHCb for about eight months now, telling you all about the detector and the physics. If you’ve been following my posts from the start, you might recall that as well as being new to Quantum Diaries, I was also new to LHCb.

Why do I bring that fact up now? Combined with the timetable of the LHC (which operated between March and November last year), this has meant that while I could read about the detector, monitor the data taking and start analysing the recorded data, I had never actually been underground and seen the detector.

So when I found out that Kathryn Grim, of USLHC Communications, was taking a pair of videographers and photographers down, I asked to be part of the visit. Luckily, there was space for me and I had already passed all the necessary training and had all the required access privileges.

I was pretty excited about the visit. In addition to finally getting to see the detector I work with, there was the fact that the last LHC detector I saw was ATLAS, back in 2009, before any serious data taking had begun. And before that, I visited ATLAS and CMS during construction way back in 2007.

Why is this history important? Well, visiting LHCb is a history lesson of sorts. Unlike ATLAS and CMS, which are located in caverns especially built for the experiments, as seen in the schematic map below, LHCb and ALICE reside in caverns which previously contained the LEP detectors DELPHI and L3.

As I’ve mentioned before, ALICE took advantage of that fact by incorporating the L3 magnet in its detector. LHCb took a different approach, simply disconnecting the DELPHI detector and moving it away from the beam line into an exhibition area behind concrete shielding. I didn’t have much time in the DELPHI part of the cavern as the videographers and photographers wanted to get straight to LHCb, but I was able to grab a couple of shots of the detector, one of which I include below…

So you may be wondering about the videographers and photographers Kathryn and I were accompanying underground (along with a couple of other LHCb colleagues). It was kind of confusing, actually: there were two separate crews, each of which contained one videographer and one photographer. However, the focus of one team was the videographer, and the focus of the other was the photographer.

Here on the left, I have a photo of the videographer, Steve Elkins, who was filming for a documentary. He had an accompanying crew member to assist with the filming and to take photos of the process for promotion. You can find out more about the upcoming documentary at his website.

In his words, “The film will be about questions, and the diverse routes to ask them. It will be about the struggles to lift the seemingly impenetrable veils of mystery from the intangible and transcendent, whether through bodies, machines, brains, or stars… It will involve the largest astronomy project in human history, Tuvan throat singers, a neuroscientist’s quest to actually photograph memories being formed in the brain, and the Kalacakra sand mandala ceremony overseen by the Dalai Lama in India, all told through the true story of a man running alone across Death Valley in average temperatures of 130 degrees fahrenheit.”

It sounds really intriguing and I look forward to seeing it.

Here on the right, I have a photo of the photographer, Enrico Sacchetti. You may be wondering why a photographer requires a videographer. It has to do with the camera he was using, a Phase One 645DF. From what I gathered, the company lent him the camera, on the condition that he film himself using it for promotional purposes.

You can find some of his previous photos of the LHC experiments on his website, which are quite nice. From what I saw on the preview screen on the camera though, the new ones will be spectacular.

That’s enough about the people on the visit; onto photos of the detector! I won’t bombard you with images of the whole detector, since they all look fairly similar, but instead, below, I’ll show you a few different unique views of certain components.

The top photo shows the view between the hadronic calorimeter and the muon system from below the detector, looking up towards the ceiling. You can see the beam pipe on the right of the photo. The left photo shows people working in the tracking system. The experiments use the LHC downtime to maintain their detectors. You can see that two of the tracking stations have been retracted, while one remains in position (the two left stations are the retracted ones). The right photo shows the dipole from the front, with a lot of safety tape and plastic covering the beam pipe. These are placed there during the maintenance period to protect the equipment. They will be removed before the start of data taking so they won’t interfere with the physics.

 

Pretty cool huh? I really enjoyed my visit and the unique opportunity to witness physics and art in action. I’ll leave you all now with the obligatory photo of me and the detector.


This column by Fermilab Director Pier Oddone appeared in Fermilab Today on Jan. 17.

Last week we hosted the US-UK Workshop on Proton Accelerators for Science and Innovation. The workshop brought together scientists from the United States and the United Kingdom who are working on high-intensity proton accelerators across a variety of fronts. The meeting included not only the developers of high-intensity accelerators but also the experimental users and those involved in the applications of such accelerators beyond particle physics.

At the end of the conference, John Womersley, CEO of the UK’s Science and Technology Facilities Council, and I signed a letter of intent specifying the joint goals and activities of our collaboration for the next five years. We plan to have another workshop in about a year to review progress and explore additional areas of collaboration.

Our collaboration with scientists from the United Kingdom in the area of high-intensity proton accelerators is already well established. We have a common interest in muon accelerators, both in connection with neutrino factories and muon colliders. Both of these future projects require multi-megawatt beams of protons to produce the secondary muons that are accelerated. We collaborate on the International Muon Ionization Cooling Experiment at the Rutherford Appleton Laboratory. MICE is the first muon cooling experiment and an essential step in the road to neutrino factories and muon colliders. We also collaborate on the International Scoping Study for neutrino factories.

In our current neutrino program we are very appreciative of this collaboration and U.K. expertise in the difficult mechanical design of high-power targets, in particular for the MINOS, NOvA and LBNE experiments. The design of these targets is quite challenging as the rapid deposition of energy creates shock waves that can destroy them. The Project X experimental program also depends on having appropriate megawatt-class targets relatively close to experimental set-ups.

One of the primary interests in applications outside of particle physics is the development of intense proton accelerators that could be used for the transmutation of waste or even the generation of electrical power in subcritical nuclear reactors. The accelerators necessary for such subcritical reactors could not have been built just a decade ago, but the advent of reliable superconducting linacs changed that. Several programs abroad are developing such accelerators coupled to reactors. While the United States has no explicit program on accelerator-driven subcritical systems, the technologies that we are developing for other applications, such as Project X, place us in a good position should the United States decide to develop such systems.

Overall, the workshop was very productive and the areas of potential collaboration seemed to multiply through the meeting. Each one of the five working groups is preparing a brief summary of the potential areas of collaboration as well as a specific and focused plan for the next year.


Shut Up and Calculate

Friday, January 13th, 2012

Andreas Osiander (1498 – 1552) was a Lutheran theologian who is best remembered today for his preface to Nicolaus Copernicus’s (1473 – 1543) book on heliocentric astronomy: De revolutionibus orbium coelestium. The preface, originally anonymous, suggested that the model described in the book was not necessarily true, or even probable, but was useful for computational purposes. Whatever motivated the Lutheran Osiander, it was certainly not keeping the Pope and the Catholic Church happy. It might have been theological, or it could have been the more general idea that one should not mix mathematics with reality.  Johannes Kepler (1571 – 1630), whose work provided a foundation for Isaac Newton’s theory of gravity, took Copernicus’s idea as physical and was criticized by no less than his mentor, Michael Maestlin (1550 – 1631) for mixing astronomy and physics. This was all part of a more general debate about whether or not the mathematical descriptions of the heavens should be considered merely mathematical tricks or if physics should be attached to them.

Osiander’s approach has been adopted by many others down through the history of science. Sir Isaac Newton—the great Sir Isaac Newton himself—did not like action at a distance and when asked about gravity said, “Hypotheses non fingo.” This can be roughly paraphrased into English as: shut up and calculate. He was following Osiander’s example. It was not until Einstein’s general theory of relativity that one could do better. Even then, one could take a shut up and calculate approach to the curved space-time of general relativity.

Although atoms were widely used in chemistry, they were not accepted by many in the physics community until after Einstein’s work on Brownian motion in 1905. Ernst Mach (1838 – 1916) opposed them because they could not be seen. Even in the early years of the twentieth century, Mach and his followers insisted that papers discussing atoms, published in some leading European physics journals, have an Osiander-like introduction. And so it continues: in his first paper on quarks, Murray Gell-Mann (b. 1929) introduced quarks as a mathematical trick. If Alfred Wegener (1880 – 1930) had used that approach for continental drift, it might not have taken fifty years for it to be accepted.

We see a trend: ideas that are considered heretical or at least unorthodox—heliocentrism, action at a distance, atoms, and quarks—are introduced first as mathematical tricks. Later, once people become used to the idea, they take on a physical reality, at least in people’s minds.

In one case, the trend went the other way. Maxwell’s equations describe electromagnetic phenomena very well. They are also wave equations. Now, physicists had encountered wave equations before, and every time there was a medium for the waves. Not being content to shut up and calculate, they invented the ether as the medium for the waves. Lord Kelvin (1824 – 1907) even proposed that particles of matter were vortices in the ether. High school textbooks defined physics in terms of vibrations in the ether. And then it all went poof when Einstein published the special theory of relativity. Sometimes, it is best to just shut up and calculate.

Of course, the expression shut up and calculate is applied most notably to quantum mechanics. In much the same vein as with the ether, physicists invented the Omphalos… oops, I mean the many-worlds interpretation of quantum mechanics, to try to give the mathematics a physical interpretation. At least Philip Gosse (1810 – 1888), with the Omphalos hypothesis, only had one universe pop into existence without any direct evidence of the pop. The proponents of the many-worlds interpretation have many universes popping into existence every time a measurement is made. Unless someone comes up with a subtle knife[1] so one can travel from one of these universes to another, they should not be taken any more seriously than the ether.

The shut up and calculate approach to science is known as instrumentalism—the idea that the models of science are only instruments that allow one to describe and predict observations. The other extreme is realism—the idea that the entities in the scientific models refer to something that is present in reality. Considering the history of science, the role of simplicity, and the implications of quantum mechanics[2] (a topic for another post), realism—at least in its naïve form—is not tenable. Every time there is a paradigm change or major advance in science, what changes is the nature of reality given in the models. For example, with the advent of special relativity, the fixed space-time that was a part of reality in classical mechanics vanished. But with an instrumentalist’s view, all that changes with a paradigm change is the range of validity of the previous models. Classical mechanics is still valid as an instrument to predict, for example, planetary motion. Indeed, even the caloric model of heat is still a good instrument to describe many properties of thermodynamics and the efficiency of heat engines. Instrumentalism thus circumvents one of the frequent charges against science: namely, that we claim to know how the universe works and then discover that we were wrong. This is only true if you take realism seriously and apply it to the internals of models.

The model building approach to science advocated in these posts is perhaps an intermediate between the extremes of instrumentalism and realism. The models are judged by their usefulness as instruments to describe past observations and make predictions for new ones; hence the tie-in to instrumentalism. The models are not reality any more than a model boat is, but they capture some not completely determined aspect of reality. Thus, the models are more than mere instruments, but less than complete reality.  In any event, one never goes wrong by shutting up and calculating.

Additional posts in this series will appear most Friday afternoons at 3:30 pm Vancouver time. To receive a reminder follow me on Twitter: @musquod


[1] The Subtle Knife, the second novel in the His Dark Materials trilogy, was written by the English novelist Philip Pullman

[2] In particular Bell’s inequalities.
