

Nature sometimes demands a lot of effort before it reveals its secrets, and particle physics obeys a very similar pattern: many, many events have to be analyzed in order to find a few that are really interesting. Take the case of the much-sought Higgs particle. The probability of generating a single Higgs is quite small. Assuming that the total period from the start of physics operation in 2010 until just before the July 4th, 2012 announcement comprises around 450 real LHC operation days, we had, on average, 480 Higgs per day (all numbers in this post are approximate). Quite unfortunately, however, most (~60%) of these Higgs decay via a mode (H->bb) which is very easy to confuse with other processes, or (~20%, H->WW) via a mode whose properties are not so easy to measure precisely. One of the cleanest modes in which to study the Higgs is its decay into two photons, which were detected with the ATLAS calorimeters (see our previous posts). But the number of events produced in this mode is much smaller: basically, around one per day! Sometimes the LHC produces more than 30 million collisions per second. Now, imagine that the LHC could only produce a tenth of this number of collisions (3 million): we would have to wait 10 days for a detectable Higgs.

Here comes (at last!) our central topic. Given the rarity of Higgs events, the LHC has to produce a ridiculously high number of events per second to yield a few interesting ones at a practical rate. Given also that Nature loves to produce other events which are very common and very well known, our detectors are filled up with events that are basically junk (background), at least for the Higgs search. If we recorded all of the events produced in ATLAS, more than 40 GB of data storage space would be necessary per second. That would be simply unmanageable!
So, the only way to keep a reasonable data flow and still do physics at a reasonable rate is to select events before recording them. That is what we call the “trigger”.
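The scale of the problem can be sketched with the round numbers quoted above. This is just back-of-envelope arithmetic; the recorded rate of 400 events per second below is an illustrative assumption for the sketch, not an official ATLAS figure.

```python
# Back-of-envelope trigger arithmetic with the post's approximate numbers.
collision_rate_hz = 30e6   # up to ~30 million collisions per second
raw_gb_per_s = 40.0        # storage needed if every event were recorded
seconds_per_day = 86_400

# Unfiltered data volume per day, in petabytes
raw_pb_per_day = raw_gb_per_s * seconds_per_day / 1e6   # ~3.5 PB/day

# Assume (hypothetically) the trigger writes out ~400 events per second
recorded_rate_hz = 400
rejection = collision_rate_hz / recorded_rate_hz        # events discarded per event kept

print(f"unfiltered data volume ~ {raw_pb_per_day:.1f} PB/day")
print(f"trigger keeps ~1 in {rejection:,.0f} events")
```

Even with generous assumptions, the trigger has to throw away all but roughly one event in tens of thousands, which is why its efficiency for the interesting ones matters so much.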

Two obsessions immediately appear in connection with the trigger system. First, reduce as much as possible the huge rate of events faking good signatures, or the data acquisition system of the detector will not handle the stress. Second, keep the highest possible efficiency: never lose a very good candidate for a given signature, or you lose the physics event that is exactly why one builds such gigantic machines! In the case of a lost Higgs, another working day will be necessary! As you will see in the next posts, the algorithms used in the trigger always operate at the highest rate at which they can still guarantee a very high efficiency (usually not too far from 100%).

Finding the Higgs

The complex work of finding the Higgs. Picture downloaded from: http://www.englishblog.com/2012/07/higgs-boson-cartoons.html#.UER8fkRhq6B

ATLAS Higgs to gamma gamma plot

ATLAS Higgs to gamma gamma mass plot. For more information, check the page: https://twiki.cern.ch/twiki/bin/view/AtlasPublic/HiggsPublicResults

The picture above (found in many places on the web – see the caption) is not very far from the truth. To find a Higgs, you have to search a lot. See, for instance, the official ATLAS plot for the Higgs detection. In this plot, the number of Higgs candidates showing up as an excess is quite small (around 230-250 events in the four bins between 122 and 130 GeV). Note, in the top plot, that this excess shows up on top of more than 8000 events (around 2000 in each of the four histogram bins). And this is after trigger and offline analysis selection, for a very narrow mass range, and only for events in which two photons were detected! In principle, we should expect more Higgs (around 400-450), but some are lost because their photons fly too close to the particle beam and ATLAS does not see them (or at least misses one of them, so the pair cannot be formed). Others are lost because the requirements to accept a photon as such are very restrictive, and the chance of losing at least one of the two photons is relatively high. So, losing a fraction of these events is unfortunate, but unavoidable given the experimental conditions. Another remarkable fact is that we cannot know which 250 events out of those 8250 are really related to the Higgs. We just know that the Higgs contributed by increasing the rate of events in that mass range. Researchers are always trying to find clever techniques to avoid the 8000 unnecessary events, but this is no trivial task. If a new technique were developed, it would certainly end up included as a trigger algorithm.
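As a rough sanity check on those numbers, a naive counting significance can be computed from the ~240 excess events over the ~8000 background events in the window. This ignores the mass-spectrum fit, the systematics, and the channel combination actually used in the analysis, so it is only an order-of-magnitude sketch:

```python
import math

signal = 240       # approximate excess events in the 122-130 GeV window
background = 8000  # smooth diphoton background in the same window

# Crude S/sqrt(B) counting estimate; the real result comes from a fit
significance = signal / math.sqrt(background)
print(f"~{significance:.1f} sigma in this channel alone")
```

A couple of sigma from one channel alone is consistent with the discovery requiring the combination of several decay channels to reach the 5-sigma standard.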

As for the “videos” section of this post, I’d like to give a little publicity to some sources of information. I recorded quick working sessions with two tools that are used in the ATLAS Control Room to visualize a small fraction of the acquired events while the experiment is running. The first comes directly from the atlas.ch web site: the so-called ATLAS live events (check the link!). It takes around 15 to 20 seconds to change the event; I made a short extract in the video below. Another very nice tool is Camelia, which can make 3D images from events coming directly from the ATLAS detector, and you can play with them. The important point here is that most of the displayed events (a random sample) carry very little interesting signal: lots of tracks with low momentum, but almost no significant activity such as straight, high-momentum tracks with calorimeter deposits. If you wait long enough, you will eventually see some interesting events. This demonstrates why we need to apply a strong selection to avoid wasting recording time and space on trivial events. You may want to watch these in full screen.

I also recommend the third video, where I tried to make a quick analysis using pre-recorded events. First, one finds two jets in a single event (they could be two photons), and later two muons. Summing the muons’ momenta, we can see that the pair mass (93 GeV/c²) is quite close to the Z boson mass (91 GeV/c²).
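That last step, combining the two muon momenta into a pair mass, is just the invariant-mass formula m² = (E₁+E₂)² − |p₁+p₂|². A minimal sketch, with made-up four-momenta chosen to land near the Z mass (these are not the values from the video):

```python
import math

MUON_MASS = 0.106  # GeV

def four_momentum(px, py, pz, m=MUON_MASS):
    """Build (E, px, py, pz) in GeV for a particle of mass m (c = 1)."""
    E = math.sqrt(px * px + py * py + pz * pz + m * m)
    return (E, px, py, pz)

def pair_mass(p1, p2):
    """Invariant mass: m^2 = (E1+E2)^2 - |p1+p2|^2."""
    E = p1[0] + p2[0]
    px, py, pz = (p1[i] + p2[i] for i in (1, 2, 3))
    return math.sqrt(E * E - px * px - py * py - pz * pz)

# Two roughly back-to-back muons (momenta invented for this example)
mu1 = four_momentum(25.0, 30.0, 25.0)
mu2 = four_momentum(-25.0, -30.0, -25.0)
print(f"pair mass ~ {pair_mass(mu1, mu2):.1f} GeV")  # ~92.7 GeV, near the Z at 91.2
```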

In the next post, we will see the three levels of the ATLAS trigger system, with increasing complexity and each level accessing more details of the detector. When we get to the software trigger levels, we will probably have a post about computing in such an environment (yes, the trigger software must be fast, even if it has to lose a bit of precision!). If you are interested in understanding how one of these detectors works, I suggest a look at my latest three posts (first, second and third). You will need that information to understand the trigger.



My Science Scout Badges

Thursday, November 10th, 2011

For translations, click or roll over each badge, or see the web page of the Order of the Science Scouts of Exemplary Repute and Above Average Physique.


 


Congratulations go out to fellow US LHC blogger Prof. Sarah Demers, who has just been awarded the Department of Energy’s Early Career Award.  The announcement is naturally featured prominently on the website of her home institution, the Yale University Physics Department.  This award recently replaced the long-standing DOE Outstanding Junior Investigator Award (OJI), which awarded grants to promising junior faculty members from 1978 to 2008, an impressive run!  The new Early Career Award brings the previous National Science Foundation Early Career Award and the DOE OJI award into a more similar format and award level.

These awards can mean a tremendous amount to a new faculty member in particle physics. I was fortunate enough to receive an OJI from the DOE, and fellow blogger Prof. Ken Bloom was fortunate to receive a Career Award from NSF, when we were both new junior professors.  This allowed us both to support perhaps a graduate student and part of a postdoc’s salary as well as our own summer salaries while we established our research programs as new faculty members.  Now Sarah has earned a peer-reviewed grant, which is a major milestone for a new professor, and which enables her to proceed with her successful research program without relying on university start-up funds (which eventually dry up).  Here’s to Sarah’s future success!

Photo by Waldo Jaquith


Tramtastic!

Friday, May 6th, 2011

CERN is the place to be if you’re a particle physicist! It has everything you could want: the most promising experiments, all kinds of experts on hand, some of the most powerful computing systems in the world, and fascinating seminars. It’s enough to draw people in from all over the world. The only downside is that it’s a bit tricky to get away from CERN for an evening in the city. Well, not anymore! This week the tram arrived at CERN, giving us an essential lifeline to Geneva, with all its services and nightlife.

CERN tram

The CERN tram!

The town of Meyrin saw the new tram as cause for a street party, with all kinds of entertainers, a jazz band, and free rides on an historic tram. So I went along to see what was on offer, and how people reacted to the new transport link. Everyone seemed to be very happy about it (except perhaps a few motorists!). “Great!” I thought, this gives us an easy way to get around. We can socialize more often, making it easier to meet people and enjoy ourselves, and making short trips to CERN all the more fun. Many people come to CERN for a few weeks or months at a time over the summer, and there’s pressure to cram as much into their time here as possible. Trimming some minutes off the journeys to and from Geneva makes things just that little bit easier for everyone!

People coming to explore CERN

People coming to explore CERN

What impressed me most was how CERN used this opportunity to reach out to the public. In retrospect it was silly that I didn’t realize the tram went to CERN as well as from CERN! The new service included a tram advertising CERN, taking people right up to the Microcosm and the Globe, where they were welcomed in to see what CERN has to offer. Presumably this is only the start of a new way of approaching CERN (literally and figuratively). This is the first time people can get directly from the heart of Geneva to the center of CERN’s public spaces. The icing on the cake is the tram itself, which is modern and spacious. First impressions matter, and no longer relying on the rickety number 56 bus for the final mile will make a big difference to people’s perceptions of CERN. It is a place which is modern, relevant, well connected and a vital part of the greater Geneva area. It has deserved a tram stop for years, and one has finally arrived!


A couple of weeks ago we met the Higgs boson and discussed its Feynman rules.

 

I had forgotten to put up the obligatory Particle Zoo plush Higgs picture in my last post, but US LHC readers will know that Burton has the best photos of the [plushy] Higgs. (It seems that the Higgs has changed color over at the Particle Zoo.)

We learned that the Higgs is a different kind of particle from the usual gauge boson “force” particles or the fermion “matter” particles: it’s a scalar particle which, for those who want to be sophisticated, means that it carries no intrinsic quantum mechanical spin. Practically for these posts, it means that we ended up drawing the Higgs as a dashed line. For the most part, however, the Feynman rules that we presented in the previous post were pretty boring…

Recall the big picture for how to draw Feynman diagrams:

  1. Different particles are represented by lines. We now have three kinds: fermions (solid lines with arrows), gauge bosons (wiggly lines), and scalars (dashed lines).
  2. When these particles interact, their lines intersect. The “rules” above tell us what kinds of intersections are allowed.
  3. If we want to figure out whether a process is possible, we have to decide whether or not we can use the rules to convert the initial set of particles into the final set of particles.

If you’ve been following our posts on Feynman diagrams, then you might already be bored of this process. We could see how electrons could turn into muons, or even how the Higgs boson might be produced at the LHC; but now we’ve arrived at the Higgs boson—one of the main goals of the LHC—where is the pizzazz? What makes it special, and how do we see it in our Feynman rules?

The Higgs is special

It turns out that the Higgs has a trick up its sleeve that the other particles in the Standard Model do not. In the language of Feynman diagrams, a Higgs line can terminate:

The “x” means that the line just ends; there are no other particles coming out. Very peculiar! We know that ordinary particles don’t do this… we don’t see matter particles disappearing into nothing, nor do we see force particles disappearing without being absorbed by other particles. We can think about what happens when matter and anti-matter annihilate, but there we usually release energy in the form of force particles (usually photons). The above rule tells us that a single Higgs line—happily doing its own thing—can suddenly be cut off. It shouldn’t be read as an initial state or final state particle. It’s just some intermediate line which happens to stop.

We’ll discuss the physical meaning of this in upcoming posts. Sometimes when people try to explain the physical meaning they can get caught up in their own analogies. Instead, let us use the Feynman diagrams as a crutch to see the effects of this weird Feynman rule. Recall that in the previous post we introduced a four-point Higgs self-interaction (“four-point” means four Higgs lines intersecting):

If we take one of the lines and terminate it, we end up with a three-point Higgs self interaction:

In fact, since the crossed out line isn’t doing anything, we might as well say that there is a new Feynman rule of the form

Now that’s somewhat interesting. We could have forgotten about the “crossed out Higgs line” rule and just postulated a three-point vertex. In fact, this is usually the way people write out Feynman rules (which is why our method has been “idiosyncratic”); however, for our purposes it’s important to emphasize that what people really mean is that there is implicitly a “crossed out Higgs line.” The significance is closely tied to what makes the Higgs so special.

We could play this game again and cross out one of these three lines. This would lead us to a two-point Higgs interaction.

Once again, we could just as well chop off the two terminated lines and say that there is a ‘new’ two-point Higgs Feynman rule. But this is really just a line, and we already knew that we could draw lines as part of our Feynman rules. In fact, we know that a line just means that a particle moves from one place to another. So it seems like this interaction with two crossed-out lines doesn’t give us anything new.

… except there’s more to it, and this is where we start to get a hint of the magic associated with the Higgs. Let me make the following statement without motivation:

Claim: the above Feynman rule is a contribution to the Higgs mass.

At this point, you should say something incredulous like, “Whaaaaaat?” Until now, we’ve said that particles have some particular mass. The number never really mattered that much, some particles are lighter than others, some particles have zero mass. Mass is just another property that each particle seems to have. Now, however, we’ve made a rather deep statement that puts us at the tip of a rather large iceberg: we’re now relating a particular Feynman rule to the mass of the particle, which we had previously assumed was just some number that we had to specify with our theory.

We’ll have to wait until my next post to really get into why such a relation should exist and really what we even mean by mass, but this should at least start to lend credence to the idea that the Higgs boson can give masses to particles. At this point this should still feel very mysterious and somewhat unsatisfying—that’s okay! We’ll get there. For now, I just want you to feel comfortable with the following string of ideas:

  1. The Higgs boson has a special Feynman rule where a line can terminate.
  2. This means we can take any interaction and effectively remove the Higgs line by terminating it immediately after the vertex.
  3. In particular, this means that we generate a vertex with just two lines.
  4. This vertex with two lines should—for reasons which are presently mysterious—be identified with mass.

Giving mass to the other particles

Now that we see how this game works, we should immediately go back to the first two Feynman rules we wrote down:

These are the interactions of the Higgs with fermions and gauge bosons. Here’s what you should be thinking:

Hm… I know that the Higgs boson line can terminate; I can just cross out the end points of a dashed line. And I just saw that when I do this to the Higgs self-interaction vertex enough times, I end up with a two-point interaction which Flip tells me is a mass for some weird reason. Now I see these two vertices representing the Higgs interaction with two matter particles or two force particles. Does terminating the Higgs line also give mass to these particles?

The answer is yes! We end up with vertices like this:

For aesthetic reasons (and really only for aesthetic reasons) we can shrink this diagram to:

We can even drop the “x” if you want to be even more of a purist… but for clarity we’ll leave it here to distinguish this from a normal line. These diagrams indeed represent a mass contribution to fermions and gauge bosons. Again, I’m just telling you this as a mysterious fact—we’ll explain why this interpretation is accurate later on. We’ll need to first understand what “mass” really is… and that will require some care.

Bumping up against the Higgs

In fact, instead of saying that particles “start out” with any masses, one can formulate our entire Feynman diagram program in terms of completely massless particles. In such a picture, particles like the top quark or Z boson undergo lots of the aforementioned two-point “mass” interactions and so are observed to have larger masses. Heuristically, heavy particles barrel along and have lots of these two-point interactions:

For comparison, a light particle like the electron would have fewer of these interactions. Their motion (again, heuristically) looks more like this:

We should remember that each of these crosses is really a terminated Higgs line. To use some fancy parlance which will come up in a later post, we say that the Higgs has a “vacuum expectation value” and that these particles are bumping up against it. The above pictures are just ‘cartoons’ of Feynman diagrams, but you can see how this seems to convey a sense of “inertia.” More massive particles (like the top quark) are harder to push around because they keep bumping up against the Higgs. Light particles, like the electron, don’t interact with the Higgs so much and so can be pushed more easily.

In this sense, we can think of all particles as being massless, but their interactions with the Higgs generates a two-point interaction which is effectively a mass. Particles which interact more strongly with the Higgs have more mass, while particles which interact weakly with the Higgs have less mass. In fact, once we assume this, we might as well drop all of the silly crosses on these lines—and then we’re left with the usual Feynman rules (with no terminating Higgs lines) that are usually presented.
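In the Standard Model this “interacts more strongly means more mass” statement is quantitative: a fermion’s mass is its coupling to the Higgs (the Yukawa coupling) times the Higgs vacuum expectation value of about 246 GeV, divided by √2. A sketch of that relation, with couplings chosen by hand to reproduce the known masses:

```python
import math

HIGGS_VEV = 246.0  # GeV, the Higgs vacuum expectation value

def fermion_mass(yukawa):
    """Standard Model tree-level relation m = y * v / sqrt(2), in GeV."""
    return yukawa * HIGGS_VEV / math.sqrt(2)

# Couplings picked to reproduce the measured masses (illustrative values)
print(f"top quark: {fermion_mass(0.99):.0f} GeV")        # ~172 GeV
print(f"electron:  {fermion_mass(2.94e-6)*1e6:.0f} keV")  # ~511 keV
```

The top quark’s coupling is close to one, while the electron’s is a few parts in a million; that factor of ~300,000 between couplings is exactly the factor between their masses.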

(A small technical note: the Higgs isn’t actually responsible for all mass. For example, bound states get masses from their binding energy. Just look up the mass of the proton and compare it to the mass of its constituent quarks. The proton has a mass of about 1 GeV, while the up/down quarks are only one thousandth of this. Most of the proton mass comes from the binding energy of QCD.)
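The arithmetic in that note is worth seeing explicitly. The quark masses below are rough current-quark values of the kind tabulated by the Particle Data Group:

```python
proton_mass_mev = 938.3
quark_masses_mev = [2.2, 2.2, 4.7]  # two up quarks and one down (rough values)

from_quarks = sum(quark_masses_mev)
binding_fraction = 1.0 - from_quarks / proton_mass_mev
print(f"quarks account for {from_quarks:.1f} MeV of {proton_mass_mev} MeV")
print(f"~{binding_fraction:.0%} of the proton mass is QCD binding energy")
```

So the Higgs mechanism directly accounts for only about one percent of the mass of ordinary matter; the rest is QCD at work.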

Some closing remarks

Before letting you ponder these things a bit more, let me make a few final remarks to whet your appetite for our next discussion.

  • The photon, as we know, is massless. We thus expect that the Higgs does not interact with the photon, or else we could have ‘terminated’ the Higgs lines in the interaction vertex and generated a photon mass.
  • On the other hand, the Higgs gives the W and Z bosons mass. This means that it costs energy to produce these guys, and so the weak force is only really effective over a short distance. Compare this to photons, which are massless and so can produce a long range force. (Gluons are also massless, but they produce a short range force due to confinement.) Thus the Higgs is responsible for the “weakness” of the weak force.
  • … on that note, it’s worth noting that the “weak” force isn’t really so weak—it only appears weak at long distances due to the mass of the W and Z. If you look at shorter distances—say on distances shorter than the distance between two Higgs crosses in the cartoon picture above—then you’d find that the weak force is actually quite potent compared to electromagnetism. Thus a more accurate statement is that the Higgs is responsible for the short-ranged-ness of the weak force.
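That “short-ranged-ness” can be made roughly numeric: a force carried by a boson of mass m has a characteristic Yukawa range of about ħc/(mc²). With the W at roughly 80.4 GeV:

```python
HBAR_C = 197.327  # MeV * fm (hbar times c)

def yukawa_range_fm(boson_mass_mev):
    """Rough range of a force mediated by a massive boson: hbar*c / (m*c^2)."""
    return HBAR_C / boson_mass_mev

w_range = yukawa_range_fm(80_400.0)  # W boson mass, ~80.4 GeV
print(f"weak force range ~ {w_range:.1e} fm")  # ~2.5e-3 fm; a proton is ~1 fm
```

A range several hundred times smaller than a proton is why, at everyday distances, the weak force looks so feeble compared to electromagnetism.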

There are also a few open questions that are worth pointing out at this point. We’ll try to wrap these up in the upcoming posts on this subject.

  • The big elephant in the room is the question of why the two-point interaction from terminating a Higgs line should be interpreted as a mass. We got a hint in the picture above of how “bumping off the Higgs” can at least heuristically appear to have something to do with inertia. We’d like to better understand what we really mean by mass.
  • We also very glibly talked about treating everything as massless and only generating ‘effective’ masses through such Higgs interactions. Special relativity tells us that there is a very big difference between a particle with exactly no mass and one with some mass… this has to do with whether or not it is possible, in principle, to catch up to a particle. How does this mesh with our picture above that masses can come from ‘bumping off the Higgs’?
  • What does it mean physically that the Higgs line can terminate? What do we mean by the “vacuum expectation value?” This will turn out to be related to the idea that all of our particles are manifested as quantum fields. What does this mean?
  • This whole business is related to something called electroweak symmetry breaking, and that is the phenomenon associated with the Higgs which is really, really magical.

How many different particles can you make from quarks? A lot. Every two years or so, the Particle Data Group puts out a catalog of the ones we know about. I always love getting mine in the mail. It’s as big as a phone book, with thin paper like a Bible. The compilation of all the particles and their properties represents a truly massive intellectual effort. Most of the hadrons are just labeled with Greek letters, but they’re festooned with all kinds of superscripts and asterisks, and their properties have names as colorful and idiosyncratic as their discoverers. For example, the neutral Ξ or “cascade” hyperon is a doubly-strange baryon with negative half-integer isospin. To my ear, most science fiction falls flat compared to real conversations between particle physicists.

By adding energy to hadrons, we can change their nature and put them into excited states called resonances. The idea is loosely analogous to exciting atoms in a laser or fluorescent lamp, except more relativistic: their mass can change. The humble proton, for example, can be excited into something called a Δ resonance, which is around 30% more massive, because some of the absorbed energy converts to mass. These resonances don’t hang around very long, but as you look at higher and higher masses, you see more and more of them. By the 1960s, the number of newly discovered particles and resonances had grown rapidly in step with the energy of the accelerators that produced them. This proliferation led to questions about how to explain such a large variety, and about what limitations, if any, there are on the number of states. When the quantum-mechanical rules governing properties like spin, charge, angular momentum, etc. were taken into account, the number of hadronic states was found to rise exponentially with mass. This plot is a fairly recent example:

 

Up to a certain mass, the number of hadrons rises exponentially. The red curve includes particles that weren't plotted in earlier references, represented by the green curve.
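A toy version of the spectrum behind that plot shows why it comes out straight on a semi-log scale: for a density of states ρ(m) ∝ exp(m/T_H), the logarithm is linear in mass with constant slope 1/T_H. Here T_H is the ~160 MeV Hagedorn temperature discussed below, and all normalization and prefactors are deliberately dropped:

```python
import math

T_H = 0.160  # Hagedorn temperature in GeV (~160 MeV)

def toy_density_of_states(m_gev):
    """Toy Hagedorn spectrum: rho(m) ~ exp(m / T_H), prefactors dropped."""
    return math.exp(m_gev / T_H)

masses = [0.5, 1.0, 1.5, 2.0]  # GeV
log_rho = [math.log(toy_density_of_states(m)) for m in masses]
slopes = [(log_rho[i + 1] - log_rho[i]) / (masses[i + 1] - masses[i])
          for i in range(len(masses) - 1)]
print(slopes)  # every slope equals 1 / T_H = 6.25 per GeV: a straight semi-log line
```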

When you see a straight line on a semi-log plot, it’s a dead giveaway for an exponential form. Why is that pattern followed? What’s even more interesting is that the number of particles rises with mass at the same rate as it falls with increasing (transverse) momentum, at least below a few GeV. Several creative ideas emerged as attempts to explain the hadron spectra, but a physicist named Rolf Hagedorn gets the credit for developing a theory using statistical mechanics. Remember, this was before the era of quarks: he referred to hadrons as “fireballs”, and considered that the heavy resonances were compositions of lighter ones, which were in turn composed of still lighter ones.

His mathematical line of reasoning implied that if you were to collect a bunch of hadrons together and treat them as a gas of particles, their energy would become infinite as the temperature approached a limiting value. He seems to have been quite a character. In one of his lively papers, he concluded:

It follows that T is the highest possible temperature—a kind of ‘boiling point of hadronic matter’ in whose vicinity particle creation becomes so vehement that the temperature cannot increase anymore, no matter how much energy is fed in.

And now we come to the point. Hagedorn’s argument implies a change in the number of fundamental degrees of freedom of the system. In other words, it has to break down into more fundamental building blocks. Instead of remaining a gas of hadrons, a superheated system would melt into a phase with simpler constituents at a temperature near what is now known as the Hagedorn temperature. Using the best data available, he extrapolated from the known spectra to obtain a value of the critical temperature near 160 MeV, or in more familiar units, nearly two trillion degrees.
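To make the limiting-temperature argument concrete, here is a short sketch (my own illustration, not Hagedorn’s calculation, taking the 160 MeV value quoted above as input): with an exponential density of states, the Boltzmann-weighted spectrum only falls off with mass when T is below T_H, and converting 160 MeV to everyday units via the Boltzmann constant lands near two trillion kelvin.

```python
import math

# Hagedorn temperature quoted in the text (assumed input for this sketch)
T_H = 160.0  # MeV

# Convert to kelvin: T[K] = E[MeV] / k_B, with k_B in MeV per kelvin
K_B = 8.617333e-11
print(f"T_H ~ {T_H / K_B:.2e} K")  # about 1.9e12 K, i.e. nearly two trillion kelvin

# Why T_H is a limit: with an exponential density of states
# rho(m) ~ exp(m / T_H), the Boltzmann-weighted spectrum behaves as
#   rho(m) * exp(-m / T) ~ exp(m * (1/T_H - 1/T)),
# which decays with mass only for T < T_H.  At or above T_H, the
# energy integral over m diverges: pumping in energy makes ever
# heavier resonances instead of raising the temperature.
def weighted_spectrum(m, T):
    """Relative weight of a state of mass m (MeV) in a gas at temperature T (MeV)."""
    return math.exp(m * (1.0 / T_H - 1.0 / T))

for T in (100.0, 150.0, 170.0):  # MeV
    ratio = weighted_spectrum(5000.0, T) / weighted_spectrum(1000.0, T)
    verdict = "converges" if ratio < 1 else "diverges"
    print(f"T = {T:5.1f} MeV: high-mass/low-mass weight ratio = {ratio:.3g} ({verdict})")
```

The sign flip in the exponent at T = T_H is the whole story: below it, heavy states are exponentially suppressed; above it, they dominate.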

With a more sophisticated understanding thanks to Quantum Chromodynamics (QCD), more tools have become available to check this number. It’s a tough job, because this physics lies in the so-called “non-perturbative” regime, where pencil-and-paper solutions to the QCD equations don’t work well. But that’s what supercomputers are for. The founders of QCD devised a way to crunch out the answers by dividing space-time itself into a grid of points called the lattice, “playing” the equations forward numerically in steps. It takes a lot of CPU cycles, but the answer seems to corroborate Hagedorn’s estimate.

So nuclear matter melts if you get it hot enough. It was suggested over 40 years ago, and theoretical innovations only seem to confirm it. So what happens then? And is this temperature achievable in the lab? I’ll post again soon to follow up on these questions.

 

Share

Why Frank loves SUSY

Tuesday, May 3rd, 2011

This week I’ve been in Arlington, Texas, attending the excellent southwestern ATLAS analysis jamboree. As a special treat, the jamboree dinner was held in conjunction with an event at Southern Methodist University, just north of Dallas.

The keynote speaker at this event was Frank Wilczek, a 2004 winner of the Nobel Prize in Physics. Frank won this prize for work he began during his Ph.D. studies (take note, all you students) concerning the nature of the strong force. Tonight, though, he did not talk about this; instead he focused on the LHC and on its ability to discover Supersymmetry (SUSY).

 

Me and Frank Wilczek


 

I’ve name-dropped SUSY before, and once again explaining SUSY is way beyond the scope of what I intend to say today. In brief, SUSY solves a number of problems present in the Standard Model by introducing a new symmetry to the theory which allows the transformation of force particles (bosons) into matter particles (fermions), essentially presenting these as two facets of the same thing.

SUSY has a lot of interesting and beautiful implications. It brings a greater level of symmetry to the Standard Model and by doing so explains all of the known particles and forces in a concise and elegant way.

Frank’s favourite property of SUSY is its ability to explain the strong, weak and electromagnetic forces each as manifestations of a single “grand-unified” force. These forces then only appear to be different to us as we’re forced to study them at the exceptionally low energies available in everyday life. However, if we were to look at these forces more closely, that is to say at much much higher energy, then SUSY predicts that we’d see that they are all one and the same thing.

The motivation for this grand-unification claim comes from, among other things, studying how the strengths of these forces change with increasing energy. The idea is that if they are all the same force, then at some energy their strengths should all be the same.

If the Standard Model is the final word, then this doesn’t happen. But if we throw SUSY into the equation then, miraculously, it does. Moreover, it happens at an energy that fits nicely(-ish) into our understanding of the universe.
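You can sketch this check yourself with the standard one-loop running of the inverse couplings, 1/α_i(μ) = 1/α_i(M_Z) − (b_i/2π) ln(μ/M_Z), using textbook beta coefficients and approximate measured inputs at the Z mass. The numbers below are illustrative round-offs, and two-loop and threshold effects are ignored:

```python
import math

# Approximate inverse couplings measured at the Z mass, in the GUT
# normalization for hypercharge (illustrative values, not a fit):
M_Z = 91.19  # GeV
inv_alpha_MZ = [59.0, 29.6, 8.5]  # 1/alpha_1, 1/alpha_2, 1/alpha_3

# One-loop beta coefficients for the running 1/alpha_i
B_SM = [41 / 10, -19 / 6, -7]  # Standard Model
B_MSSM = [33 / 5, 1, -3]       # Minimal Supersymmetric Standard Model

def run(inv_alpha0, b, mu):
    """One-loop running: 1/alpha_i(mu) = 1/alpha_i(M_Z) - (b/2pi) ln(mu/M_Z)."""
    return inv_alpha0 - (b / (2 * math.pi)) * math.log(mu / M_Z)

def spread_at(mu, coeffs):
    """Fractional spread of the three inverse couplings at scale mu."""
    vals = [run(a0, b, mu) for a0, b in zip(inv_alpha_MZ, coeffs)]
    return (max(vals) - min(vals)) / (sum(vals) / 3)

mu_gut = 2e16  # GeV, roughly where the MSSM couplings cross
print(f"SM   spread at 2e16 GeV: {spread_at(mu_gut, B_SM):.1%}")
print(f"MSSM spread at 2e16 GeV: {spread_at(mu_gut, B_MSSM):.1%}")
```

With these inputs the three MSSM couplings agree at the sub-percent level near 2×10^16 GeV, while the Standard Model ones miss each other by tens of percent, which is exactly the difference between the two plots below.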

 

The evolution of the strengths of the forces with energy in the Standard Model [1].

 

 

The evolution of the strengths of the forces with energy in the Minimal Supersymmetric Standard Model [1]. Gravity is also shown in red.

 

Unfortunately, even with the LHC, directly studying the unification energy is way, way out of reach. But if SUSY is able to provide grand unification, then we’ll certainly be able to see its new particles at the LHC.

Whether you buy this as a suitable motivation for SUSY or not is a matter of taste. Not everyone is convinced, one of the reasons being that to get to the unification scale you have to extrapolate the strengths of the various forces over thirteen orders of magnitude in energy. Yet, to date, we’ve only measured them over the first three.

Frank, however, doesn’t seem to feel this is an issue, and as he’s the one with the Nobel Prize, maybe you should listen to him.

References:

[1] Anticipating a New Golden Age, Frank Wilczek, arXiv:0708.4236v3.

Share

Hi everyone! Readers of this blog might enjoy some of the following recent multimedia by some well-known particle physicists.

  • First, a podcast from Jim Gates of the University of Maryland about his path in physics, Go Tell It on the Mountain (link to iTunes, link to mp3), from The Moth. The talk is from the 2008 World Science Festival, which will be held again this year in New York City in a month.
  • Next, a very nice animated discussion with Daniel Whiteson and Jonathan Feng from UC Irvine on PhD Comics. They discuss dark matter, particle physics, and the Large Hadron Collider.
  • Along the lines of dark matter and particle physics, here’s a mission briefing from NASA on AMS-2, the “particle detector in space,” featuring principal investigator (and Nobel laureate for the discovery of the J/ψ particle) Sam Ting. Matt mentioned AMS-2 in his inaugural post. A lot of particle physicists are excited about AMS due to recent anomalies in the spectra of cosmic positrons and anti-protons that may be a result of dark matter interactions.
  • Finally, some time ago I had a general-public-level post about Nima Arkani-Hamed’s (and collaborators’) work on scattering amplitudes. For those with a technical background who are interested in learning more, his informal lectures to the Cornell particle theory group are now posted online: part 1, part 2, part 3, part 4, part 5. For those who can’t get enough, there’s also an ongoing program at the KITP with lots of recorded talks. These links are at the level of theoretical physicists working in the field; for a general-public version, see Nima’s messenger lectures.
Share

The April Meeting

Tuesday, May 3rd, 2011

2011 American Physical Society April Meeting, Anaheim, CA


Hello from Anaheim, California!

Yes, it is that time of year: the April APS (American Physical Society) meeting. It has become tradition that each year in April, the membership of the APS Division of Particles and Fields meets together with the membership of somewhat related divisions: Astrophysics, Nuclear Physics, Computational Physics, Physics of Beams, and Plasma Physics. I find these meetings particularly broadening, as I can sometimes hear about topics that I do not necessarily get exposure to all of the time in my day-to-day work in hadron collider physics. In fact, some of the more entertaining session titles I have seen here include “Black Holes: Nature’s Ultimate Spinmeisters,” “Much Ado about Nothing: The Quantum Vacuum,” and “So Many Dynamos: Flow-Generated Magnetic Fields in Nature, in the Computer, and in the Lab.” (I believe the latter also wins for longest session title, barely beating out the more straightforward and understandable, for me at least, “Precision Measurements, Fundamental Symmetries, and Tests of the Standard Model.”)

Other interesting topics at this meeting, such as “Nuclear Weapons at 65”, “The Status of Arms Control”, and “Best Practices in K12 Physics Teacher Education Programs”, are a result of the inclusion of the Forum on Society, the Forum on Education, and other such broad-interest groups in this meeting. Yet in my opinion one of the most important roles that these APS (and the Divisional) meetings play is to provide a forum for students to give 10-15 minute parallel session talks on their own analyses. At other conferences it is rare to have single-result talks rather than summaries, and summaries are generally given to more senior people. This is often the first (and sometimes only) chance a graduate student has to prepare a talk for a non-expert (non-working-group) audience. With these talks they learn to prepare a summary of their work with an appropriate level of detail, omitting jargon, timing it properly, and most importantly, stating the big picture (the context) of their work, as well as the bottom line. When I was a graduate student I found the APS meetings to be valuable training in public presentations. For this reason I sent my student, David Cox, from Fermilab to Anaheim to present his own recent work on our searches for a massive top-like, perhaps fourth-generation, quark (“tprime”) at the Tevatron. He has had practice giving talks at other meetings, but this is still good experience for him. He is also attending useful career sessions for graduate students.

My own main purpose for attending this meeting has been to present results in an invited plenary talk on Top Quark Physics, which I delivered on Saturday morning during one of several plenary sessions. My talk focused on results from the Tevatron’s CDF and D0 experiments, not from the LHC. This was in fact a tall order for a 30-minute talk, since the large datasets from Run 2 of the Tevatron, together with the years of experience with these detectors and analysis tools, have meant a plethora of interesting and innovative results from CDF and D0 constantly being released to the public. Measurements of the top quark mass, for example, the all-important electroweak parameter, have reached a relative precision of better than one percent, much better than the Run 2 goal of 3 GeV. Yet some relatively new measurements, such as studies of the difference between the mass of the top quark and the mass of the anti-top quark (expected to be zero if CPT is conserved), still have very little statistical sensitivity due to the difficulty of the measurement.

The measurements of the forward-backward asymmetry AFB in top pair production have received attention earlier this year not only because both CDF and D0 continue to see a 2-sigma (or more) discrepancy with the theoretical predictions, but also because there appears to be a dependence on the invariant mass of the top pair system, which could imply the existence of new high-mass particles decaying to top quarks. (The original AFB measurement at the Tevatron was actually performed by my postdoc, Tom Schwarz, now CDF Top Group convener, when he was a thesis student at U. Michigan, and we’ve continued to study this anomaly with our collaborators from Michigan since then.) This measurement has generated quite a bit of theoretical interest, so I was happy to devote some time to it, along with many other interesting topics, such as whether the top quark really has an exotic -4/3 charge instead of the +2/3 charge of the Standard Model.
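For reference, the asymmetry itself is a simple counting measurement: AFB = (N_F − N_B)/(N_F + N_B), where N_F and N_B count events with the top quark emitted forward or backward. Here is a minimal sketch of the definition and its statistical uncertainty, using made-up event counts purely for illustration (not the actual CDF or D0 yields):

```python
import math

def afb(n_forward, n_backward):
    """Forward-backward asymmetry and its statistical uncertainty.

    A_FB = (N_F - N_B) / (N_F + N_B).  The uncertainty follows from
    binomial statistics on the forward fraction f = N_F / N:
    sigma_A = 2 * sqrt(f * (1 - f) / N).
    """
    n = n_forward + n_backward
    a = (n_forward - n_backward) / n
    f = n_forward / n
    sigma = 2 * math.sqrt(f * (1 - f) / n)
    return a, sigma

# Hypothetical counts, chosen only to show the arithmetic
a, sigma = afb(620, 520)
print(f"A_FB = {a:.3f} +/- {sigma:.3f}")
```

With a thousand-odd events the statistical error is already at the few-percent level, which is why the mass dependence of the asymmetry is so hard to pin down.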

While the Tevatron is producing spectacular results in the area of top quark physics (and many other areas), the reality is that even at half of the design energy, the LHC will soon outshine the Tevatron for most measurements. The production cross section (production rate) for top pairs at the 7 TeV LHC is much greater than at the ~2 TeV Tevatron due to the higher energy available. Measurements of things like the top-antitop mass difference, or the top quark charge, will soon have better sensitivity at the LHC. It may take a little longer for the LHC experiments to catch up in the area of the precision top mass measurement, mainly due to the complicated systematic uncertainties that need to be taken into account, but eventually the Tevatron will be bested there as well. The AFB measurement will be difficult to challenge or improve upon at the LHC, however, since the asymmetry is thought to result from quark-antiquark annihilation, which is much more dominant at the Tevatron’s proton-antiproton collider than at the proton-proton collider of the LHC. For that we will still have more to say from the Tevatron’s final datasets.

Giving this talk has been a nice way for me to pay tribute to the amazing results from dedicated analysts at the Tevatron over the ~16 years since the top quark was discovered there. Although the Tevatron is scheduled to close down later this year, I cannot help but be excited about the new projects I and many others are working on at the LHC. Some are topics that we could barely touch at the Tevatron, such as boosted top quarks, which I am currently working on at CMS. (See Flip Tanedo’s recent post on this subject from ATLAS.) Some, like our tprime searches, have shown hints of excess events on the tails of the distributions, so we are excited to see whether this excess grows at the LHC. Regardless of the particular topic, we are all approaching the LHC with the knowledge we have gained from the Tevatron, and we are excited to continue to explore the particle frontier with the greater rates and energies of the LHC. And we are definitely on the look-out for discoveries!

 

Share

Any large collaboration like ATLAS needs a process for allowing members to communicate their work to each other and to the public. There have been some recent questions about how this process works, so I’m going to address the topic in this post.

We particle physicists are a bit unusual, though not unique, among scientific disciplines in that our authors sign official papers in alphabetical order as opposed to being ranked by how much they contributed to the work. We are also famous for our long author lists, which for the large LHC experiments include up to a few thousand people since all members of the collaboration sign each paper unless individuals request that their names be removed.

There has been some debate in the field about whether our author lists should be more exclusive and include only those people who worked directly on the physics analysis being published. I have always appreciated the lack of squabbling over author lists, and the way our inclusive list gives a nod to the fact that our detector is incredibly complex and could only be built, maintained, and interpreted for physics results by a large team. There are also many people who have contributed to the upstream work of an analysis, which makes the final result possible. The counter-argument is that it is nearly impossible for people outside the field to know who did the actual analysis work for any particular result. I think that people inside the field can usually find out who did what, even at other experiments, pretty easily: by seeing who gives the related talks at conferences, from reference letters within the collaboration, or even just by asking around.

Regardless of where you come down on the author list debate, the fact that our author list is currently the entire collaboration puts a burden on our result approval process in that every author needs to be given the opportunity to comment on every result he/she will sign.

Before we worry about communicating our results to the world, we need to have a mechanism for communicating our work in real time to each other within the collaboration. This allows us to scrutinize the steps as they are taken, so we know that we are building a solid analysis. We achieve that by giving presentations at meetings and writing emails, but we communicate probably most efficiently by writing notes to each other to document snapshots of the early stages of an analysis. This documentation can have a much smaller list of authors, who are responsible for the specific set of ideas presented. Documents like this are simply labeled “COM” for “communication,” and they are not intended for public consumption. Any ATLAS member can write a COM note at any time, and authors do not necessarily list everyone whose work theirs relies on.

If you want your work to move toward official internal ATLAS approval, you can request that it be given the status “INT” for “internal”. At this point leaders of the relevant physics group appoint reviewers, and the authors have a chance to get feedback in a formal way from other collaboration members. A note that has gained INT status has undergone at least some peer review, though it stays internal to the collaboration.  The content of the INT note is often too technical for general public interest, but can be invaluable for other ATLAS collaborators who want to either reproduce a result or take the analysis to the next step with a good understanding of everything that has come before.

Some COM notes can also become public (i.e. available to everyone on the planet). Together with published papers, these public notes report the scientific output of the experiment. In order for a result to take the final step to become public, an editorial board is appointed, and often a new note is written (starting as a COM note) with an attempt to remove ATLAS-specific jargon and details that people outside the collaboration would not necessarily find useful. With the help of the editorial board, the note is brought to a stage where it is ready to receive feedback from the entire collaboration. If the note is approved by the collaboration, it will be posted to an archive that is available to the public, submitted for publication, and/or the results will be shown at conferences.
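The note lifecycle described in the last few paragraphs can be summarized as a small state machine. This is purely an illustration of the COM → INT / public flow as I've described it, not actual ATLAS tooling:

```python
from enum import Enum

class Status(Enum):
    COM = "communication"  # snapshot note; any ATLAS member can write one
    INT = "internal"       # peer-reviewed, but stays inside the collaboration
    PUBLIC = "public"      # approved by the collaboration, visible to all

# Allowed promotions, per the process above (simplified): a COM note
# may be reviewed into an INT note, or, via an editorial board and
# collaboration-wide review, become public.
ALLOWED = {
    Status.COM: {Status.INT, Status.PUBLIC},
    Status.INT: set(),      # INT notes stay internal
    Status.PUBLIC: set(),   # public is final
}

def promote(current: Status, target: Status) -> Status:
    """Promote a note, enforcing the allowed transitions."""
    if target not in ALLOWED[current]:
        raise ValueError(f"cannot promote {current.name} note to {target.name}")
    return target

status = promote(Status.COM, Status.PUBLIC)  # after collaboration approval
print(status.name)  # PUBLIC
```

The key point the toy model captures is that every public result has passed through the COM stage, where the whole collaboration had a chance to comment.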

There are, of course, many details that I haven’t described, but the end result is that an analysis that has been publicly approved by ATLAS will have come under scrutiny at many stages of the process. People work very hard to make sure that the results presented to the public are worthy of being signed by the collaboration. Our goal is to work as a team as quickly as we can to get these results out to the rest of the world while at the same time ensuring that we have not made mistakes.  Our scientific reputation is on the line.

Share