Posts Tagged ‘OPERA’

A Grumpy Note on Statistics

Tuesday, March 13th, 2012

Last week’s press release from Fermilab about the latest Higgs search results, describing the statistical significance of the excess events, said:

Physicists claim evidence of a new particle only if the probability that the data could be due to a statistical fluctuation is less than 1 in 740, or three sigmas. A discovery is claimed only if that probability is less than 1 in 3.5 million, or five sigmas.

This actually contains a rather common error — not in how we present scientific results, but in how we explain them to the public. Here’s the issue:

Wrong: “the probability that the data could be due to a statistical fluctuation”
Right: “the probability that, were there no Higgs at all, a statistical fluctuation that could explain our data would occur”

Obviously the first sentence fragment is easier to read — sorry![1] — but, really, what’s the difference? Well, if the only goal is to give a qualitative idea of the statistical power of the measurement, it likely doesn’t matter at all. But technically it’s not the same, and in unusual cases things could be quite different. My edited (“right”) sentence fragment is only a statement about what could happen in a particular model of reality (in this case, the Standard Model without the Higgs boson). The mistaken fragment implies that we know the likelihood of different possible models actually being true, based on our measurement. But there’s no way to make such a statement based on only one measurement; we’d need to include some of our prior knowledge of which models are likely to be right.[2]

Why is that? Well, consider the difference between two measurements, one of which observed the top quark with 5 sigma significance and the other of which found that neutrinos go faster than light with 5 sigma significance. If “5 sigma significance” really meant “the probability that the data could be due to a statistical fluctuation,” then we would logically find both analyses equally believable if they were done equally carefully. But that’s not how those two measurements were received, because the real interpretation of “5 sigma” is as the likelihood that we would get a measurement like this if the conclusion were false. We were expecting the top quark, so it’s a lot more believable that the excess is associated with the top quark than with an incredibly unlikely fluctuation. But we have many reasons to believe neutrinos can’t go faster than light, so we would sooner believe that an incredibly unlikely fluctuation had happened than that the measurement was correct.[3]

Isn’t it bad that we’d let our prior beliefs bias whether we think measurements are right or not? No, not as long as we don’t let them bias the results we present. It’s perfectly fair to say, as OPERA did, that they were compelled to publish their results but thought they were likely wrong. Ultimately, the scientific community does reach conclusions about which “reality” is more correct on a particular question — but one measurement usually can’t do it alone.
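As a footnote in code: the sigma-to-probability conversions quoted in the press release are just one-sided tail areas of a Gaussian, and are easy to check yourself. A quick sketch using only the Python standard library (my own illustration, not part of the release):

```python
import math

def one_sided_p(n_sigma):
    # Probability of a Gaussian fluctuation at or above n_sigma,
    # assuming the no-signal (null) model.
    return 0.5 * math.erfc(n_sigma / math.sqrt(2))

for n in (3, 5):
    p = one_sided_p(n)
    print(f"{n} sigma: p = {p:.2e}, i.e. about 1 in {1 / p:,.0f}")
```

Running this gives roughly 1 in 740 for three sigmas and 1 in 3.5 million for five, matching the numbers quoted above.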

———————————

[1] For what it’s worth, I actually spent a while thinking and chatting about how to make the second sentence fragment simpler, while preserving the essential difference between the two. In this quest for simplicity, I’ve left off any mention of gaussian distributions, the fact that we really give the chance of a statistical fluctuation as large or larger than our excess, the phrase “null hypothesis,” and doubtless other things as well. I can only hope I’ve hit that sweet spot where experts think I’ve oversimplified to the point of incorrectness, while non-expert readers still think it’s completely unreadable. ;)

[2] The consensus among experimental particle physicists is that it’s not wise to include prior knowledge explicitly in the statistical conclusions of our papers. Not everyone agrees; the debate is between Frequentist and Bayesian statistics, and a detailed discussion is beyond the scope of both this blog entry and my own knowledge. A wider discussion of the issues in this entry, from a Bayesian perspective, can be found in this preprint by G. D’Agostini. I certainly don’t agree with all of the preprint, but I do owe it a certain amount of thanks for help in clarifying my thinking.

[3] A systematic mistake in the result, or in the calculation of uncertainties, would be an even likelier suspect.


This week the OPERA experiment released a statement about their famous “faster than light” neutrino measurement. In September scientists announced that they had measured the speed of neutrinos traveling from CERN to Gran Sasso and they found that they arrived slightly sooner than they should do according to special relativity. There was a plethora of scientific papers, all kinds of rumors and speculation, and most physicists simply refused to believe that anything had traveled faster than light. After months of diligent study, OPERA announced that they may have tracked down two sources of experimental error, and they are doing their best to investigate the situation.

But until we get the results of OPERA’s proposed studies we can’t say for sure that their measurement is right or wrong. Suppose that they reduce the lead time of the neutrinos from 60ns to 40ns. That would still be a problem for special relativity! So let’s investigate how we can get faster than light neutrinos in special relativity, before we no longer have the luxury of an exciting result to play with.

The OPERA detector (OPERA Collaboration)

Special relativity was developed over a hundred years ago to describe how electromagnetic objects act. The electromagnetic interaction is transferred with electromagnetic waves and these waves were known to travel extremely quickly, and they seemed to travel at the same speed with respect to all objects, no matter how those objects were moving. What Einstein did was to say that the constancy of the speed of light was a fundamental law of nature. Taking this to its logical conclusion meant that the fastest speed possible was the speed of light. We can call the fastest possible speed \(s\) and the speed of light \(c\). Einstein then says \(c=s\). And that’s how things stood for over a century. But since 1905 we’ve discovered a whole range of new particles that could cast doubt on this conclusion.

When we introduce quantum mechanics to our model of the universe we have to take interference of different states into account. This means that if more than one interaction can explain a phenomenon then we need to sum the amplitudes for all these interactions, and this means we can expect some strange effects. A famous example of this is the neutral kaon system. The two lightest neutral kaons are called \(K^0\) and \(\bar{K}^0\) and the quark contents of these mesons are \(d\bar{s}\) and \(s\bar{d}\) respectively. Now from the “outside” these mesons look the same as each other. They’ve got the same mass, they decay to the same particles and they’re made in equal numbers in high energy processes. Since they look identical they interfere with each other, and this gives us clues about why we have more matter than antimatter in the universe.

Since we see interference all over the place in the Standard Model it makes sense to ask if we see interference with a photon. It turns out that we do! The shape of the Z mass peak is slightly asymmetric because of interference between virtual Z bosons and virtual photons. There are plenty of other particles that the photon can interfere with, including the \(J/\psi\) meson and the \(\rho\) meson. In fact, any neutral vector meson with no net flavor will do. Einstein didn’t know about any of these particles, and even if he did he never really accepted the conclusions of quantum mechanics, so it’s no surprise that his theory would require that the speed of light is the fastest speed (that is, \(c=s\)). But if the photon interferes with other particles then it’s possible that the speed of light is slightly lower than the fastest possible speed (\(c<s\)). Admittedly, the difference in speed would have to be very small!

In terms of quantum mechanics we would have something like this:
\[
|\text{light}\rangle_{\text{Einstein}} = |\gamma\rangle
\]
\[
|\text{light}\rangle_{\text{reality}} = a_\gamma |\gamma\rangle + a_{J/\psi} |J/\psi\rangle + a_Z |Z\rangle + \ldots
\]

As you can see there are a lot of terms in this second equation! The contributions would be tiny because of the large difference in mass between the massive particles and the photon. Even so, it could be enough to make sure that the speed of light is ever so slightly slower than the fastest possible speed.

At this point we need to make a few remarks about what this small change in speed would mean for experiments. It would not change our measurements of the speed of light, since the speed of light is still extremely fast and no experiment has ever shown a deviation from this extremely fast speed. Unless somebody comes up with an ingenious experiment to show that the difference between the speed of light and the fastest possible speed is non-zero, we would probably never notice any variation in the speed of light. It’s a bit unfortunate that since 1983 it’s been technically impossible to measure the speed of light, since it is used in the definition of our unit of length.

Now that we know photons can interfere with other particles, it makes sense to ask the same question about neutrinos. Do they interfere with anything? Yes, they can interfere, so of course they do! They mix with neutrinos of other flavors, but beyond that there are not many options. They can interfere with a W boson and a lepton, but there is a huge penalty to pay in the mass difference. The wavefunction looks something like this:
\[
|\nu_e\rangle(t) = a_{\nu_e}(t)|\nu_e\rangle + a_{\nu_\mu}(t)|\nu_\mu\rangle + a_{\nu_\tau}(t)|\nu_\tau\rangle + a_{We}(t)|We\rangle
\]
(I’ve had to add a time dependence due to neutrino mixing, but it’s essentially no more complicated than what we had for the photon.)

That means that the photon could get slowed down slightly by its interference with other particles (including particles in the vacuum) and that neutrinos could get slowed down by an even smaller amount by their interference terms with other particles. And that way we could get neutrinos traveling faster than the speed of light while special relativity remains intact. (In this description of the universe we can do what used to seem impossible: we can boost into the rest frame of a photon. What would it mean to do that? Well, I suppose it would mean that in this frame the photon would have to be an off-shell massive particle at rest.)

The SN 1987A supernova, a rich source of slower than light electron neutrinos (Hubble, ESA/NASA)

Now I’ll sit back and see people smarter than I am pick holes in the argument. That’s okay, this isn’t intended to be a serious post, just a bit of fun! There are probably predictions of all kinds of weird effects such as shock waves and time travel that have never been observed. And there are plenty of bits I’ve missed out such as the muon neutrinos traveling faster than electron neutrinos. It’s not often we get an excuse to exercise our analytic muscles on ideas like this though, so I think we should make the most of it and enjoy playing about with relativity.


New Information on “FTL Neutrinos”

Thursday, February 23rd, 2012

We have new information, but my position on the OPERA experiment’s FTL neutrino measurement hasn’t changed.

First, here’s what we know. Members of the OPERA experiment have been working diligently to improve their measurement, better understand their uncertainties, and look for errors. Yesterday, the discovery of some possible problems was leaked anonymously (and vaguely) in Science Insider. This compelled OPERA to release a statement clarifying the status of their work: there are two possible problems, which would have opposite effects on the results. (Nature News has a good summary here.)

The important thing to learn here, I think, is that the work is actually ongoing. The problems need further study, and their overall impact needs to be assessed. New measurements will be performed in May. What we’ve gotten is a status update whose timing was forced by the initial news article, not a definitive repudiation of the measurement.

Of course, we already knew with incredible confidence that the OPERA result is wrong. I wrote about that last October, but I also wrote that we still need a better understanding of the experiment. Good scientific work can’t be dismissed because we think it must have a mistake somewhere. I’m standing by that position: it’s worth waiting for the final analysis.


For what it’s worth, neutrinos are weird. They are probably the strangest bits of matter in the Universe, and I do not mean in the quark sense either. Assuming that neutrinos are not actually trans-dimensional beings in search of a new home, there is probably no particle in Physics Past, Present, & Future that has borne more of the brunt of physicists’ creativity. On the other hand, as far as I know, there is no other particle that has solved as many problems in physics as neutrinos. The higgs boson is a good contender, but I still think neutrinos take the cake because they have been around longer. Well, that and neutrinos have actually been found to exist.

Figure 1: The (Left) Electron-, (Center) Muon-, and (Right) Tau-Neutrino, in plushie representation, brought to you by ParticleZoo. [Images: ParticleZoo]

I am sure by now you are wondering, “What are you talking about?”, and in all fairness, that is a very good question. In physics, neutrinos have a long history of being either the particle that broke the mold or the particle that saved physics. In doing so, neutrinos have developed this reputation for being the go-to particle for a new theory. In all fairness though, neutrinos are not doing themselves any favors if experiments keep finding contradictions with known laws of physics *cough*. I am sure for every flavor of ice cream at Baskin-Robbins or Ben & Jerry’s, there is a neutrino that has either been discovered or hypothesized.

Figure 2: The (Left) Electron-, (Center) Muon-, and (Right) Tau-Antineutrino, in plushie representation, also brought to you by ParticleZoo. [Images: ParticleZoo]

For today’s post, I thought I would share with you a few of the many flavors of neutrinos. It is also my secret goal to mention “neutrinos” so often in this post that it will be at the top of Google’s queue. The table of contents is just below with the full list of today’s neutrino flavors. Believe it or not, there are still plenty of types omitted. I suppose I will have to write a future post to include these. :D

Happy Halloween & Happy Colliding!

- richard (@bravelittlemuon)

Table of Contents

  1. The First Neutrino: Pauli’s Neutron
  2. Chadwick’s Neutrino: The Neutron
  3. Fermi’s Neutrino: The Key to the Weak Nuclear Force
  4. Majorana’s Neutrino
  5. The Super Massive Neutrino
  6. The Extra, Extra Neutrino
  7. The Sterile Neutrino: Type I
  8. The Sterile Neutrino: Type II
  9. The Tachyon Neutrino


1. The First Neutrino: Pauli’s Neutron

Back in the days when particle physics was still a young field, about a decade before the discovery of Quantum Mechanics, experimentalists studying radioactive decay discovered something very startling: when a radioisotope decayed and emitted a high speed electron, energy & momentum appeared not to be conserved. This was a very worrisome result because these conservation laws were, and still are, pillars of physics. In 1930, Wolfgang Pauli, after whom the famed Pauli Exclusion Principle is named, made an audacious suggestion: perhaps radioactive decay involving electron emission also involved the production of an additional particle. Pauli stated that his neutrino, then named the “neutron” (different from today’s neutron), was (1) electrically neutral, (2) massless or nearly massless, (3) did not travel at the speed of light, and (4) virtually undetectable by contemporary experimental standards.

Figure 3. The Nobel Foundation’s official portrait of Prof. Pauli (Nobel 1945). Yes, this is the man responsible for suggesting the existence of the neutrino. As father of all hypothetical particles, Pauli would later come to regret proposing an undetectable object. [Image: Nobel Foundation]

At the end of the day Pauli was spot on with his suggestion. Radioactive decay involving electron emission does, indeed, require a very light, electrically neutral particle. In fact, the following generation of neutrino detectors were able to discover it without a problem. It turns out, all someone needed was a nuclear reactor and patience.

2. Chadwick’s Neutrino: The Neutron

Figure 4: The (real) neutron is composed of one up-flavor quark and two down-flavor quarks. [Image: Internet]

James Chadwick‘s discovery of the neutron proved one thing very, very well: that the Universe has an odd sense of humor and likes to confuse those who attempt to understand it. Having uses from nuclear power to cancer therapy, at the end of the day neutrons have been a boon for the scientific community and society as a whole. When first discovered, however, Chadwick initially misidentified the neutron as Pauli’s neutron (a.k.a. the real neutrino). Today, the names we have for many particles are really artifacts of the confusion in particle physics through the 1930s & 40s. (For those of the physics history persuasion, this is just like the discovery of the “μ” meson.) Here is a timeline of the discovery of Chadwick’s neutrino (a.k.a. the fake neutrino):

  • 1911 – The gold foil experiment is carried out, showing that the atom consists of a dense center. It is later found that an atom’s nucleus is too heavy to be composed only of protons. Fifty years later, gold foil is also discovered to be a source of unlimited amounts of chocolate.
  • 1911 – β-decay, the mechanism through which some radioisotopes decay, appears initially to violate the Law of Conservation of Energy.
  • 1930 – Pauli proposes, in his famous “Dear Radioactive Ladies and Gentlemen” letter, the existence of a massless (or nearly massless), electrically neutral particle, called the “neutron” (actually the electron-neutrino), to resolve the apparent energy non-conservation in radioactive β-decay.
  • 1932 – Chadwick claims possible discovery of a massive, electrically neutral particle within the nucleus of an atom. Believing it to be Pauli’s neutron (actually the electron-neutrino), he calls it the “neutron” (actually the real neutron).
  • 1934 – Enrico Fermi, using the newly created framework of Quantum Field Theory, proposes a simple four-particle interaction to describe β-decay (See 3. Fermi’s Neutrino). With known experimental results, Fermi was able to determine that Chadwick’s neutron (real neutron) was much too heavy to be Pauli’s neutron (fake neutron; real neutrino) and renamed Pauli’s neutron the “neutrino,” which is Italian for “little neutral one.” The only thing more impressive than the accuracy to which this model actually describes Nature is how short the paper is.
  • 1956 – Pauli’s neutrino is discovered. In full disclosure, the particle he proposed to solve the problems of β-decay, and what was actually discovered first, is really the anti-electron-neutrino.

The real neutron is not really a neutrino; it just stole the real neutrino’s name. That jerk (the neutron, not Chadwick).

[Note: It is really hard to write "neutrino," "neutron," and embed hyperlinks, all while focusing on the historical context.]

3. Fermi’s Neutrino: The Key to the Weak Nuclear Force

The mathematical and physical description of radioactive decay is, by far, one of the most beautiful things I have ever seen in either Mathematics or Physics. (The second is probably the metric structure in Special Relativity.) What is so amazing about it is how it changes at higher energies. On one end of the energy spectrum, you have everyday radioactive decay; somewhere near the middle, you have the restoration of electroweak symmetry and higgs boson production; and on the far end, you have the grand unification of all forces.

In an attempt to explain a type of radioactive decay known as β-decay, Enrico Fermi, in 1934, supposed that during this process a radioisotope will decay into a more stable isotope, a high speed electron (β-particle), and a hypothetical particle predicted to exist by Pauli, called the neutrino (See 2. Chadwick’s Neutrino). The Feynman diagram that illustrates this interaction is just below. I should note now that what Pauli really predicted is the neutrino’s antimatter equivalent, called the anti-neutrino.

Figure 5: Enrico Fermi’s 4-fermion interaction model to describe β-decay. n represents an incoming neutron, p represents an outgoing proton, e is an outgoing electron, and note the outgoing anti-electron-neutrino (νe). [Image: Mine]

Being a fermion, a neutrino has an antimatter partner called an anti-neutrino. Under the rules of Quantum Field Theory, one can then induce β-decay by directing a beam of neutrinos into a bunch of heavy nuclei, like a thick plate of steel. Such a process would be drawn like this:

Figure 6: Enrico Fermi’s 4-fermion interaction model to describe neutrino scattering. n represents an incoming neutron, p represents an outgoing proton, e is an outgoing electron, and note the incoming electron-neutrino (νe). [Image: Mine]

The probability of inducing β-decay is very small, but it becomes larger with higher energy. If you extrapolate this to very high energies, you find that eventually the probability of inducing β-decay becomes larger than 100%, which is total nonsense. You can never have 101% of your interactions result in something. In particle physics, the sum of all probabilities must add up to 100%; in cases where it does not, we say that “unitarity has been violated.” This terminology originates from the fact that the matrix containing all possible interaction outcomes is a unitary matrix, implying that total probability is (1) conserved and (2) identically equal to 1 (or 100%).
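To put a rough number on where Fermi’s theory gets into trouble, here is a back-of-envelope estimate. I am taking the 4-fermion cross-section to grow like \(G_F^2 s\) and the unitarity limit to fall like \(1/s\); both are crude dimensional-analysis stand-ins of my own, not a real calculation:

```python
import math

G_F = 1.166e-5  # Fermi constant in GeV^-2 (natural units)

def sigma_fermi(s):
    # 4-fermion cross-section, crude dimensional estimate: grows with energy
    return G_F**2 * s / math.pi

def sigma_unitarity(s):
    # s-wave unitarity limit, crude dimensional estimate: falls with energy
    return math.pi / s

# The two curves cross at s = pi / G_F, i.e. a sqrt(s) of a few hundred GeV:
sqrt_s_breakdown = math.sqrt(math.pi / G_F)
print(f"Fermi's theory violates unitarity near sqrt(s) ~ {sqrt_s_breakdown:.0f} GeV")
```

The crossing comes out around 500 GeV with these stand-ins; the exact number depends on the process and conventions, but more careful treatments also land at a few hundred GeV, which is why something like the W boson had to enter the picture well below that scale.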

How does Nature avoid breaking math at high energies? Well at around 100 GeV, rather than two particles smashing into each other to produce two different particles, a neutrino will radiate a W boson and become the high speed electron (β-particle). This W boson is then absorbed by a neutron (Chadwick’s neutron) and is turned into a proton, thereby transmuting one isotope into another isotope. Since producing a W boson (mW = 80.399 GeV/c2) is not cheap and requires a lot of energy, the probability of scattering a neutrino off a nucleus is driven down and prevents unitarity from being violated.

In summary, Fermi’s neutrino & Weak Nuclear Theory model is the foundation for the Electroweak component of the Standard Model.

Figure 7: Tree-level diagram of the neutrino scattering process in which (1) a neutrino will emit a W and become an electron, and is followed by (2) a down-type quark absorbing the W boson and becoming an up-type quark. The 4-fermion model is the low-energy approximation of this description. Color represents the QCD charge held by the quarks in a nuclei. Color also makes things look nicer. [Image: Mine]

4. Majorana’s Neutrino

Antimatter, the destroyer of basilicas, the stuff of warp drives, and just an all-around fascinating piece of science, was predicted to exist in 1928 by the great Paul Dirac, and discovered shortly thereafter (1932) by Caltech’s Carl Anderson. This is the same Anderson who discovered the muon, so he probably qualifies to be my hero. One way to describe antimatter is to imagine regular, ordinary matter, except that for each charge a piece of matter has, its antimatter partner has the opposite charge. For example, the top quark has a number of charges: a +2/3 electric charge; it can have a red, blue, or green charge from the Strong Nuclear force (QCD); and it also has a “topness” (or “truthfulness”) charge under the Weak Nuclear force. An anti-topquark then must have: a -2/3 electric charge; an anti-red, anti-blue, or anti-green “color” charge; and “anti-topness” (or “anti-truthfulness”… does that make anti-topquarks liars?).

Well, I suppose one has to wonder if it is possible for a particle to ever be its own anti-particle. The answer is yes. Such particles are called Majorana particles. The Italian physicist Ettore Majorana speculated about this and determined a number of constraints: namely, to conserve all the various types of charges (electric, color, weak), a Majorana particle must be neutral under all of them. To get this right, I need a particle that is electrically neutral, colorfully neutral, and weakly neutral. To me, this sounds just like a neutrino! If it smells like a neutrino, looks like a neutrino, and tastes like a neutrino, then clearly it must be a duck neutrino.

What is the problem? Well, if neutrinos are their own antiparticle then physicists expect to see something called neutrino-less double β-decay (or 0νββ for short). In an ordinary β-decay, a radioisotope emits a high speed electron and an anti-electron-neutrino. If neutrinos are indeed Majorana particles, then that anti-electron-neutrino is also an electron-neutrino, and instead of escaping it can be absorbed at a second vertex in the same nucleus, forcing the emission of a second high speed electron — two β-decays with no neutrinos in the final state.

To date, 0νββ has not been observed, but that does not mean it does not exist. If 0νββ does exist, it must just be a really, really rare process.

Figure 8: Feynman diagram demonstrating how neutrino-less double β can occur if neutrinos are also Majorana particles. [Image: Wikipedia]

5. The Super Massive Neutrino

According to the Standard Model of Particle Physics, there are only three “light” neutrinos. “Light” is defined as less than 1/2 the mass of the Z boson, which is mZ = 91.1876 GeV/c2. We have observed this empirically by producing Z bosons in copious amounts at the Large Electron-Positron (LEP) collider and looking at all possible ways a Z boson can decay. The total number of observed Z decays is then used to calculate the Z boson’s average lifetime (or rate of decay). The observed decay rate is subtracted from the Standard Model’s prediction for the total decay rate. The difference between the theoretical prediction and the experimental observation is then compared to the situation where the Z boson is able to decay into 1, 2, 3, … different pairs of particles that cannot be observed with our detectors. These sorts of decays are called “invisible decays” or “invisible decay modes.” From this data, all signs point to three different invisible decay modes, which correspond to the three neutrino flavors in the Standard Model (electron, muon, tau).
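The counting described above boils down to a few lines of arithmetic. The widths below are rounded, approximate values from memory, so treat them as illustrative rather than authoritative:

```python
# Approximate Z boson partial widths, in MeV (rounded, illustrative values):
gamma_total   = 2495.0  # total measured width
gamma_hadrons = 1744.0  # decays to quarks
gamma_lepton  = 84.0    # per charged-lepton pair (e, mu, tau)
gamma_nu_SM   = 167.0   # Standard Model prediction per neutrino species

# Whatever width is not accounted for by visible decays is "invisible":
gamma_invisible = gamma_total - gamma_hadrons - 3 * gamma_lepton

# Dividing by the per-species prediction counts the light neutrino flavors:
n_light_neutrinos = gamma_invisible / gamma_nu_SM
print(f"Number of light neutrino species: {n_light_neutrinos:.2f}")
```

The answer comes out very close to 3, which is exactly the LEP result.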

Time for caveat number 4,321: Z bosons can only decay into particles lighter than themselves, otherwise all sorts of bad things would happen. By bad things, I mean violations of conservation laws. If a particle were to decay into two (almost) identical particles, then at most each daughter particle could weigh half as much as the mother particle. This means, according to invisible decay searches of the Z boson, there are only three types of neutrinos with mass less than 1/2 the mass of the Z boson. It is fair game for neutrinos to be heavier than half the Z mass; in fact, it is possible for a neutrino to be as heavy as ten top quarks! (The top quark is currently the heaviest particle known to exist.)

The most recent experimental results have found that for a stable (non-decaying) neutrino, its mass must be at least 45.0 GeV/c2 (39.5 GeV/c2) for an ordinary (Majorana) neutrino. For a short-lived (decaying) neutrino, it must have a mass of at least 90.3 GeV/c2 (80.5 GeV/c2) for an ordinary (Majorana) neutrino.

6. The Extra, Extra Neutrino

Neutrinos can oscillate. What do I mean by that? Well, if you make a beam of neutrinos and look at the beam composition (% of electron-neutrinos vs. % of muon-neutrinos vs. % of tau-neutrinos) as a function of distance, then you will notice that the relative composition changes.

For example: If I measure the beam to be made of 100% electron-neutrinos & 0% muon-neutrinos, and a few football pitches away I find that it is now 50% electron-neutrinos, 50% muon-neutrinos, then a few football pitches away from that I can expect to see 100% electron-neutrinos & 0% muon-neutrinos once again. I made up the exact numbers, but I hope you get the idea. It has only been recently (1,2) that all oscillation permutations have been observed.

Figure 9: To measure neutrino oscillations, a neutrino beam is typically shot into the Earth (right), measured by a detector close to the beam’s origin (near detector), and then detected by a detector on the opposite side of the planet (left). Yes, we literally shoot a beam of particles into the Earth and wait for them to come out the other side. PHYSICS. IS. AWESOME. [Image: Interactions]

Well, back in 2001 (that was over 10 years ago, weird…) the Los Alamos experiment LSND (Liquid Scintillator Neutrino Detector) saw a signal that could be explained if neutrinos were also oscillating into a fourth type of neutrino. The MiniBooNE experiment at Fermilab tried to verify this result and was unable to make a conclusive determination. In other words, the jury is still out on the existence of a 4th type of neutrino.

7. The Sterile Neutrino: Type I

I like sterile neutrinos; they are fun. According to the Standard Model, all observed neutrinos are (1) colorless (no interactions via the Strong Nuclear Force), (2) electrically neutral (no interactions via Electromagnetism), and (3) left handed (Weak charge). This means that Standard Model neutrinos can only interact with the W bosons and sometimes with the Z boson. Well, suppose there were a right-handed neutrino (opposite Weak charge from the left-handed neutrino). It is still invisible to the Strong Nuclear Force, the Electromagnetic Force, and the W± bosons (because all W‘s are left-handed). Right-handed neutrinos can in principle interact with the Z boson, but trying to separate the corresponding signal from background data is like trying to find a needle, in a haystack, at a fair. Did I mention this fair is a tri-state fair?

Right-handed neutrinos and other neutrinos that are invisible to the Standard Model forces are examples of what physicists call “sterile neutrinos.” (Personally, I like to qualify these sorts of little tykes with the title “Type I.” See 8. The Sterile Neutrino: Type II for why I do so.) If right-handed neutrinos do exist, then there is no way to detect them given our current understanding of physics. However, this does not mean they cannot interact through some new, undiscovered force.

To date, there is no confirmed evidence, direct or indirect, of the existence of a right-handed or any other type of sterile neutrino. To date, there is no evidence for a new fundamental force either. Interestingly enough, since sterile neutrinos, in principle, cannot be detected, it is logical that there could be hundreds or even thousands of slightly different sterile neutrinos. Alternatively, the universe could be filled with a single type of sterile neutrino and we would not be able to detect it except through gravity (assuming it has mass), which brings me to mention that sterile neutrinos have even been proposed as a dark matter candidate. Neutrinos are resourceful, I will give them that.

Figure 10: A snow-covered hay bale at Fermilab. Imagine trying to find a needle in that field. [Image: FNAL]

 

8. The Sterile Neutrino: Type II

Sterile neutrino type II (again, I made up the “type” nomenclature) is very much like type I but with one glaring difference. Even if there are new forces in the Universe, these types of neutrinos will still not interact with anything. The only possible forces through which they might interact are gravity and whatever unified force produced these oddballs.

9. The Tachyon Neutrino

In September, the Italian neutrino experiment OPERA (Oscillation Project with Emulsion-tRacking Apparatus) shocked the world when the collaboration announced it had observed neutrinos traveling at a speed faster than that at which light travels. My colleagues have blogged about it here, here, here, and more recently here. This is a huge deal because, according to Special Relativity, the speed of light (numerically c = 299,792,458 m/s, or 983,571,056 ft/s) is pretty much a cosmic speed limit that no real particle can surpass. So I am not sure which makes me happier: the fact that tachyons are seriously being floated as an explanation for this claim, or that #FTLneutrinos is a thing. (“FTL” stands for “faster than light.”)

Metaphorically, tachyons are interesting sorts of creatures. I do not know too much about them beyond the fact that they have (in the mathematical sense) a purely imaginary mass. The last time I checked quantum mechanics, we cannot observe strictly imaginary quantities, but I digress. What I do know is that special relativity implies that having a purely imaginary mass should then enable tachyons to permanently travel at speeds faster than c. If neutrinos do travel at speeds faster than the speed of light, then they may also be tachyons. I think it is a perfectly reasonable argument. However, there is a very big elephant in the room that I have to address. Having imaginary mass means that all tachyons always travel at superluminal speeds. If some neutrinos are found to travel at subluminal speeds then the idea that neutrinos are tachyons is tossed out. End of story.
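For what it is worth, the superluminal kinematics follow directly from the relativistic energy formula. Here is a minimal toy sketch of my own (not anything from OPERA): for a tachyon with imaginary mass i|m|, the energy E = |m|c²/√(v²/c² − 1) is real only for v > c, blows up as v approaches c from above, and falls toward zero as v grows.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def tachyon_energy(m_abs, v):
    """Energy of a hypothetical tachyon with imaginary mass i*|m|.

    m_abs is |m| expressed in energy units (the factors of c**2 are
    absorbed); the result is real only for v > c.
    """
    beta = v / C
    if beta <= 1.0:
        raise ValueError("a tachyon's energy is real only for v > c")
    return m_abs / math.sqrt(beta ** 2 - 1.0)

# The faster the tachyon, the LOWER its energy -- the reverse of an
# ordinary particle, which needs ever more energy to approach c.
slow = tachyon_energy(1.0, 1.01 * C)   # just above c: large energy
fast = tachyon_energy(1.0, 10.0 * C)   # far above c: small energy
```

So within this picture a neutrino caught traveling below c can never be a tachyon, which is exactly the “End of story” caveat above.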

So in light of the considerable implications of any particle traveling faster than the speed of light, it is very appropriate to remain cautious and wait for OPERA to reproduce its results, and for independent verification, possibly by Fermilab’s MINOS experiment or KEK’s T2K experiment.

Figure 11: A real life tachyon. [Image: ParticleZoo]


Time for Reflection

Tuesday, October 25th, 2011

Our field of physics is living through a period of great ferment, rich and exciting. The recent award of the Nobel Prize in Physics to our colleagues Perlmutter, Schmidt and Riess, whom I congratulate here, is for me a reflection of the great movement of inquiry in which our whole community is engaged. Their discovery, a dozen years ago, of the accelerating expansion of the Universe stunned the world of cosmology, and the dark energy that could explain this observation has become a new grail for physicists, and for our Institute in particular, which has taken part in this work from the beginning.

At the same time, the whole world is turning its eyes toward the LHC (Large Hadron Collider), the world’s largest particle accelerator, awaiting new revelations about the innermost laws of matter. Our researchers are engaged in a frantic hunt for the Higgs boson, the missing link of the Standard Model of particle physics. On the nuclear physics side, construction of the future Spiral2 linear accelerator is officially under way; it will soon offer our community a first-rate international facility for studying the structure of the atomic nucleus in greater detail. The net is also tightening in our quest for dark matter, while neutrinos have unexpectedly sown confusion by casting doubt on some of the foundations of our theories.

Of course, it is much too early to speak of a discovery, and the result of the OPERA experiment will have to be reproduced or shown to be flawed. Skepticism about this baffling measurement of the speed of neutrinos is, indeed, perfectly healthy. But it is already extraordinary to see how strongly our community has mobilized to study this question, both experimentally and theoretically, in a tremendous effort of collective reflection.

So whatever surprises Nature holds in store for us, the coming months will without doubt be decisive for our research. It is also for this reason that the time has come to bring our community together and invite it to take part in another form of collective reflection: a forward-looking exercise covering all of our disciplines. At this pivotal moment, when major strategic discussions are being conducted in Europe and around the world to imagine the research of tomorrow, it is important that we, the researchers of IN2P3 and CEA/Irfu, gather to take the time for this reflection, which should allow France to remain a major partner in the great quest for knowledge in which we are engaged.

– Jacques Martino, Director of the CNRS National Institute of Nuclear and Particle Physics (IN2P3)

The IN2P3 and Irfu prospective meetings will take place in Giens, April 2–5, 2012. Institute staff can take part in the working groups: http://www.in2p3.fr/actualites/media/journees_prospective2012.pdf


To start, let me say that there are extremely strong reasons to believe that the OPERA experiment’s measurement of neutrinos travelling faster than light is flawed. We knew that from the moment it came out, because it contradicts General Relativity (GR), which is an extraordinarily well-tested theory. Not only that, but the most obvious ways to modify GR to allow the result to be true give you immediate problems that contradict other measurements. To my knowledge, there’s no complete theoretical framework that makes predictions consistent with existing tests of GR and allows the OPERA result to be right.

But in my view of how experimental physics is done, history has shown us that once in a great while, something is discovered that nobody thought of and nobody can fit into the existing theoretical mold. The measurements that led to the discovery of GR in the first place provide a good example of this. Such shifts are extremely rare, but I don’t like the idea of ignoring a result because it doesn’t fit with the theories we have.

No, we have to address the measurement itself, and satisfy ourselves that there really was a mistake. There are many ideas for what might have gone wrong, and as far as I know, the discussion is ongoing. I’m not an expert on it, but I know enough to disagree with some of the blogosphere discussion lately that has pronounced that the case is closed. There seem to be two categories of claims going around:

  1. Articles that point out that the OPERA result is inconsistent with other measurements, as in this piece by Tommaso Dorigo (who is, incidentally, my colleague now that I’ve joined CMS). These are of course correct within the context of GR or any straightforward modifications thereof, as I said right at the start of this post. The question is whether there’s some modification that can accommodate the results consistently, and that’s a very hard thing to exclude. (There is some good discussion of this in the comments of Tommaso’s post, in fact.)
  2. Articles claiming that the OPERA result has been refuted because someone posted an idea on the arXiv server. A current example is this preprint, which asserts that a 60 nanosecond delay might be explained by OPERA having made a relatively trivial mistake in their GPS calculations. Of course, it’s possible that a trivial mistake has been made. But I’m not inclined to consider it definitive, especially because the author has already partially backpedaled upon learning more about how GPS works.


It’s great that people are sending ideas for what might have gone wrong with the result, or how it might be explained. But let’s wait for the discussion to settle down — and, indeed, for OPERA to finalize their paper — before we conclude that the case is closed. I do expect the result to be disproven, but what I want to see is one of these things:

  1. OPERA finds that there really was a problem with their measurement, revises it, and the “superluminal” effect goes away.
  2. Another experiment makes the same measurement, and gets a result consistent with GR.


Either way, I’ll consider the case closed, but there’s no reason to get ahead of ourselves. Doing science usually doesn’t mean knowing the answer in time for tomorrow’s news.


Live blog: neutrinos!

Friday, September 23rd, 2011

This is a live blog for the CERN EP Seminar “New results from OPERA on neutrino properties“, presented by Dario Autiero. Live webcast is available. The paper is available on the arXiv.

The crowd in the auditorium (Thanks to Kathryn Grim)


15:39: So here I am at CERN, impatiently waiting for the Colloquium to start on the OPERA result. The room is already filling up and the chatter is quite loud. I’m here with my flatmate Sudan, and we have a copy of the paper on the desk in front of us. I just bumped into a friend, Brian, and wished him luck finding a chair! (He just ran to get me a coffee. Cheers Brian!)

15:53: Wow, the room is really crowded now! People are sitting on the steps, in the aisles, and more are coming in. The title slide is already up on the projector, and some AV equipment is being brought in. I was just chatting to Sudan and Brian, and we were commenting that this is probably the biggest presentation that the world’s biggest physics lab has seen in a long time! As Sudan says, “The whole world is going to be watching this man.”

15:55: Burton and Pauline are here too, getting some photos before the talk begins. Expect to see more (less hastily written) blog posts about this talk!

15:59: We’re not allowed to take photos of the talk itself, but there will be a video feed that you can watch. See this link for details about the live webcast.

16:03: The talk begins. A fairly straightforward start so far. As usual, the speaker introduces the OPERA Collaboration, and gives a bit of background. Nothing ground breaking so far!

16:06: The analysis was performed blind, which means that the physicists checked and double-checked their systematic uncertainties before looking at the data. This is a common best practice in these kinds of experiments and a good way to eliminate a lot of experimenter bias. The speaker is now discussing past results, some of which show no faster-than-light speeds, and one of which (from MINOS) shows a small effect of less than 2σ.

16:16: Autiero is currently discussing the hardware of the experiment. It looks like a standard neutrino observatory setup: large amounts of dense matter (Pb), scintillation plates, and tracking hardware for the muons which get produced when the neutrinos interact. By the time the beam reaches Gran Sasso it is about 2km wide! At CERN the neutrinos are produced by accelerating protons into a target, producing pions and kaons, which are then allowed to decay to muons and muon neutrinos. The hadrons are stopped with large amounts of Carbon and Iron, so that only the neutrinos and some muons survive. By the time the neutrino beam reaches Gran Sasso the muons have long since interacted and are no longer present in the beam. The neutrinos have 17GeV of energy when they leave CERN, so they are very energetic!

16:29: The discussion has moved onto the timing system, probably the most controversial aspect of the experiment. The timing challenge is probably the most difficult part of the whole analysis, and the part that particle physicists are least familiar with. Autiero points out that the same methods of timing are commonly used in metrology experiments. For OPERA, the location of each end of the experiment in space and time is determined using GPS satellites in the normal way, and then a “common view” is defined, leading to 1ns accuracy in synchronization. It looks like variations in the local clocks are corrected using the common view method. The time difference between CERN and Gran Sasso was found to be 2.3 ± 0.9 ns, consistent with the corrections.

16:36: Things are made trickier by the need to identify where in the “spill” of protons a given neutrino came from. For a given neutrino it’s pretty much impossible to get ns-precision timing, so probability density functions are used and the time interval for a given proton spill is folded into the distribution. We also don’t know where each neutrino is produced within the decay tube. The average uncertainty in this time is about 1.4ns. Autiero is now talking about the time of flight measurement in more detail, showing the proton spills and neutrino measurements overlaid.
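To get a feel for why averaging over the spill shape works, here is a toy version of the shift fit (entirely my own sketch: I use a Gaussian stand-in for the measured spill waveform, and the numbers are only illustrative). Each event’s arrival time follows the spill shape translated by the unknown offset, and with thousands of events the fitted offset is pinned down far more precisely than any single event’s timing:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy "proton spill" waveform: a unit-width Gaussian. (The real waveform
# is measured by a beam current transformer and is far from Gaussian.)
SPILL_SIGMA = 1.0  # microseconds

def log_likelihood(shift, arrivals):
    # Arrival times follow the spill shape translated by `shift`, so the
    # log-likelihood is the summed log of the shifted (Gaussian) pdf.
    t = arrivals - shift
    return -0.5 * np.sum((t / SPILL_SIGMA) ** 2)

true_shift = 0.0607  # microseconds, i.e. 60.7 ns -- the size of the OPERA offset
arrivals = rng.normal(true_shift, SPILL_SIGMA, size=16_000)

# Scan candidate shifts and keep the maximum-likelihood value.
grid = np.linspace(-0.1, 0.2, 3001)
fitted = grid[np.argmax([log_likelihood(s, arrivals) for s in grid])]
# With 16,000 events the statistical error is ~ sigma/sqrt(N) ~ 8 ns,
# even though each individual event is only localized to ~1 microsecond.
```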

16:39: Geodesy is important to this analysis. OPERA needs to know the distance between CERN and Gran Sasso to good precision (and they need to know the distances underground, which makes things more complicated). They get a precision of 20cm in 730km. Not bad! Autiero is now showing the position information, with evidence of continental drift and even an earthquake. This is very cool!
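The orders of magnitude here are easy to check yourself. A back-of-the-envelope sketch (using the round 730km figure from the talk): the light travel time over the baseline is about 2.4 ms, a 60ns offset corresponds to about 18 m of path, and a 20cm distance error contributes well under a nanosecond of timing error:

```python
C = 299_792_458.0      # speed of light, m/s
BASELINE = 730_000.0   # m, CERN to Gran Sasso (round figure from the talk)

tof_light = BASELINE / C    # light travel time: ~2.4e-3 s
equiv_dist = 60e-9 * C      # path length matching a 60 ns offset: ~18 m
timing_err = 0.20 / C       # timing error from 20 cm of distance: ~0.67 ns
```

So the 20cm geodesy precision is comfortably good enough for a 60ns effect.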

16:47: Two techniques are used to verify timing, using Caesium clocks and optical fibers. These agree to ns precision. The overall timing system is rather complicated, and I’m having trouble following it all!

16:48: I just got a message from a friend who saw this blog via Twitter. Hello Angela! Welcome to all the readers from Twitter!

16:52: Currently discussing event selection at Gran Sasso. Events must have a highly relativistic muon associated with them. (The speed of the muon and slight difference in direction of flight can only increase the measured time of flight.)

16:54: Autiero is telling us about how the analysis is blinded. They used very old calibrations, intentionally giving meaningless results. A novel approach to blinding!

16:56: No evidence of variation with respect to time of day or time of year. So that’s the “Earth moved!” theory sunk.

17:01: Unblinding: Δt = -987.8ns correction to the time of flight after applying corrections (i.e. using up-to-date calibrations). Total systematic uncertainty is 7.4ns. Time of flight obtained using maximum likelihood. The measured difference in time of flight between the speed of light and the speed of the neutrinos is

\[
\delta t \equiv \mathrm{TOF}_{c} - \mathrm{TOF}_{\nu} = (60.7 \pm 6.9\,\mathrm{(stat)} \pm 7.4\,\mathrm{(syst)})\ \mathrm{ns}
\]

\[
\frac{c - v_{\nu}}{c} = -(2.4 \pm 0.28\,\mathrm{(stat)} \pm 0.30\,\mathrm{(syst)}) \times 10^{-5}
\]

17:03: ~16,000 events observed. OPERA has spent six months checking and rechecking systematic uncertainties. Cannot account for discrepancy in terms of systematic uncertainties.
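For the record, combining the quoted statistical and systematic uncertainties in quadrature (the standard recipe when the two are independent) shows that the 60.7ns offset is a roughly 6σ effect:

```python
import math

delta_t = 60.7   # ns, measured early arrival
stat = 6.9       # ns, statistical uncertainty
syst = 7.4       # ns, systematic uncertainty

total = math.hypot(stat, syst)   # quadrature sum: ~10.1 ns
n_sigma = delta_t / total        # ~6 standard deviations
```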

17:04: “Thank you”. Huge ripple of applause fills the auditorium.

Questions

(These questions and answers are happening fast. I may well make an error or omission here and there. Apologies. Consult the webcast for a more accurate account or for any clarifications.)

17:05: Questions are to be organized. Questions about the distance interval, then the time interval, then the experiment itself. There will be plenty of questions!

17:08: Question: How can you be sure that the timing calibrations were not subject to the same systematic uncertainties whenever they were made? Answer: Several checks were made. One suggestion is to drill a direct hole. This was considered, but it has an associated uncertainty of the order of 5%, too large for this experiment.

17:12: Question: Geodesy measurements were taken at one time. There are tidal effects (for example, measured at LEP.) How can you be sure that there are no further deviations in the geodesy? Answer: Many checks made and many measurements checked.

17:14: Question: Looking for an effect of 1 part in 10⁵. Two measurements are not sufficient. Movement of the Moon could affect measurements, for example. Answer: Several measurements made. Data taken over three years; tidal forces should average out.

17:15: Question: Is the 20cm uncertainty in 730km common? Answer: Similar measurements performed elsewhere. Close to state of the art. They even had to stop traffic on half the highway to make the geodesy measurement!

17:16: Question: Do you take into account the rotation of the Earth? Answer: Yes, it’s a sub ns effect.

17:23: Question: Uncertainty at CERN is of the order of 10μs. How do you get uncertainty of 60ns at Gran Sasso? Answer: We perform a maximum likelihood analysis averaging over the (known shape) of the proton spill and use probability density functions.

(Long discussion about beam timings and maximum likelihood measurement etc.)

17:31: Question: Large uncertainty from internal timers at each site (the antenna gives a large uncertainty). Measurements of timing don’t all agree. How can you be sure of the calibration? Answer: There are advanced ways to calibrate measurements. Perform inclusive measurement using optic fibers. Comment from timing friends in the audience? Audience member: Your answer is fine. Good to get the opportunity to work on timing at CERN.

17:33: Question: What about variation with respect to time of day/year? Answer: Results show no variation in day/night or Summer vs Spring+Fall.

17:35: Question: How can you be sure of geodesy measurements if they do not agree? Answer: The measurements shown are for four different points, not the same point measured four times. Clocks are also continually resynchronized.

17:37: Question: Do temperature variations affect GPS signals? Answer: Local temperature does not affect GPS measurements. Two frequencies are used to get the position in ionosphere. 1ps precision possible, but not needed for OPERA.

17:41: Question: Can you show the tails of the timing distributions with and without the correction? Is selection biasing the shapes of the fitted distributions? Answer: Not much dependence on spatial position from BCT at CERN. (Colleague from audience): The fit is performed globally. More variation present than is shown in the slides, with more features to which the fit is sensitive.

17:43: Question: Two factors in the fit: delay and normalization. Do you take normalization into account? Answer: Normalization is fixed to number of events observed. (Not normalized to the cross section.)

17:45: Question: Do you take beam stretching/squeezing into account? Answer: Timing is measured on BCT. No correlation between position in Gran Sasso and at CERN.

17:47: Question: Don’t know where muons were generated (could be in rock.) How is that taken in to account? Answer: We look at events with and without selections on muons.

17:49: Question: Do you get a better fit if you fit to the whole range and different regions? What is the χ²/n for the fits? Answer: We perform the fit on the whole range and have the values of χ²/n, but I can’t remember what they are, and they are not on the slides.

17:50: Question: What about any energy dependence of the result? Answer: We don’t claim energy dependence or rule it out with our level of precision and accuracy.

17:52: Question: Is a near experiment possible? Answer: This is a side analysis. The main aim is to search for τ appearance. (Laughter and applause from audience.) We cannot compromise our main physics focus. E-mail questions welcome!

17:53: End, and lots of applause. Time for discussion over coffee! Thanks for reading!

The start of the neutrinos’ journey, taken from the OPERA paper. (http://arxiv.org/abs/1109.4897)

