
Posts Tagged ‘new physics’

A Little Bit of the Higgs Boson for Everyone

Hi All,

This post is long overdue but nonetheless I am thrilled to finally write it. We have discovered the… a… some(?) Higgs boson, and it is precisely my trouble writing this very sentence that inspires a new post. CERN's press office has keenly presented a new question in particle physics known as the Definite Article Problem:

Have we discovered “a” Higgs boson or “the” Higgs boson?

We can express the Article problem in another way:

Are there more Higgs bosons?

Before I touch upon that problem, I want to explain why the Higgs boson is important. In particular, I want to talk about the Sun! Yes, the Sun.


The Higgs Boson and Electroweak Symmetry Breaking Are Important Because the Sun Shines.

Okay, there is no way to avoid this: I really like the sun.

Slide Credit: Mine. Image Credit: GOES Collaboration

It shines. It keeps the planet warm. There is liquid water on Earth, and some very tasty plants too.

Slide Credit: Mine. Image Credit: NobelPrize.org

At the heart of the Sun is a raging nuclear furnace involving two types of processes: (1) those that involve the Strong nuclear force and (2) those that involve the Weak nuclear force (look for the neutrinos!). The two types of processes work together in a solar relay race to complete a circuit, only to do it over and over again for billions of years. And just like a real relay race, the speed at which the circuit is finished is set by the slowest member. In this case, the Weak force is the limiting factor and considerably slows down the rate at which the Sun could theoretically operate. If we made the Weak force stronger, then the Sun would shine more brightly. Conversely, if we made the Weak force even weaker, the Sun would be dimmer.
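
For reference, the rate-limiting step in that relay is the very first one of the proton-proton chain, the only step that needs the Weak force to convert a proton into a neutron (standard solar physics, not from the original slides):

\[
p + p \;\to\; {}^{2}\mathrm{H} + e^{+} + \nu_{e}.
\]

This Weak conversion is so slow that a typical proton in the solar core waits billions of years before fusing, which is precisely why the Sun burns gently for billions of years instead of going off like a bomb.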

Slide Credit: Mine. Image Credit: NobelPrize.org

From studying the decays of radioactive substances, we have learned that the rate of Weak nuclear processes is set by a physical constant called Fermi's Constant, represented by the symbol \(G_F\). From studying the Higgs boson and the Higgs Mechanism, we have learned that Fermi's Constant is literally just another constant, \(v\), in disguise. This second physical constant (\(v\)) is called the Higgs "vacuum expectation value", or "vev" for short, and is the amount of energy the Higgs field has at all times relative to the vacuum.

The point I want to make is this: If we increase the Higgs vev, Fermi’s Constant gets smaller, which reduces the rate of Weak nuclear interactions. In other words, a larger Higgs vev would make the sun shine less brightly. Going the other way, a smaller Higgs vev would make the sun shine more brightly. (This is really cool!)
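
The relation at work here is the standard one, though the post does not spell it out:

\[
\frac{G_F}{\sqrt{2}} \;=\; \frac{1}{2v^2} \qquad\Longrightarrow\qquad v \;=\; \left(\sqrt{2}\,G_F\right)^{-1/2} \;\approx\; 246~\mathrm{GeV},
\]

using the measured \(G_F \approx 1.166\times10^{-5}~\mathrm{GeV}^{-2}\). Since \(G_F\) scales like \(1/v^2\), nudging the vev up pushes the Weak interaction rate (and hence the Sun's brightness) down.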

Slide Credit: Mine. Image Credit: Jacky-Boi

The Higgs vev is responsible for some other things, too. It is a source of energy from which all elementary particles can draw. Through the Higgs Mechanism, the Higgs field provides mass to the elementary fermions and to the massive weak bosons. One would think that for such an important particle we would have a firm theoretical understanding of it, but we do not.

Credit: Mine

We have a very poor theoretical understanding of the Higgs boson. Among other things, according to our current understanding of the Higgs boson, the particle should be much heavier than what we have measured.

Credit: Mine

The Definite Article Problem

There are lots of possible solutions to the problems and theoretical inconsistencies we have discovered relating to the Standard Model Higgs boson. Many of these ideas hypothesize the existence of other Higgs bosons or particles that would interact like the Higgs boson. There are also scenarios where Higgses have identity crises: the Higgs boson we have observed could be a quantum mechanical combination (superposition) of several Higgs bosons.

I do not know if there are additional Higgses. Truthfully, there are many attractive proposals that require upping the number of Higgs bosons. What I do know is that our Higgs boson is interesting and merits much further study.

...

Credit: Mine

Happy Colliding

– richard (@bravelittlemuon)

PS In case anyone is wondering, yes, I did take screen shots from previous talks and turn them into a DQ post.


Huge impact from a tiny decay

Wednesday, November 14th, 2012

The Hadron Collider Physics Symposium opened on November 12 in Kyoto on a grand note. For the first time, the LHCb collaboration operating at the Large Hadron Collider (LHC) at CERN showed evidence for an extremely rare type of event, namely the decay of a \(B_s\) meson into a pair of muons (particles very similar to the electron but 200 times heavier). A meson is a composite particle formed from a quark and an antiquark; the \(B_s\) meson is made of a bottom quark \(b\) and a strange quark \(s\). This particle is very unstable and decays in about a picosecond (a millionth of a millionth of a second) into lighter particles.

Decays into two muons are predicted by the theory, the Standard Model of particle physics, which states that they should occur only about 3 times in a billion decays. In scientific notation, we write \((3.54\pm0.30)\times10^{-9}\), where the value of 0.30 represents the error margin on this theoretical calculation. Now, the LHCb collaboration proudly announced that they observed this decay at a rate of \((3.2^{+1.5}_{-1.2})\times10^{-9}\), a value very close to the theoretically predicted one, at least within the experimental error.
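
As a quick back-of-envelope check (my own arithmetic, not LHCb's), here is how one can gauge the agreement in units of standard deviations, treating the asymmetric errors crudely:

```python
from math import hypot

# LHCb measurement, in units of 1e-9: 3.2 with errors +1.5 / -1.2
meas, err_up = 3.2, 1.5
# Standard Model prediction in the same units: 3.54 +/- 0.30
pred, pred_err = 3.54, 0.30

# The measurement sits below the prediction, so the relevant
# measurement error is the upper one (+1.5).
tension = abs(meas - pred) / hypot(err_up, pred_err)
print(f"tension: {tension:.2f} sigma")  # ~0.22 sigma: fully compatible
```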

Here is the plot shown by the LHCb collaboration for the number of events found in data as a function of the combined mass of the two muons. The solid blue line represents the sum of all types of events from known phenomena containing two muons. The dashed curve in red shows the number of events coming from a \(B_s\) meson. With the current error margin on the measurement (shown by the vertical and horizontal bars on the data points), the data seem to agree with all expected contributions from known sources, leaving little room for new phenomena.

This represents a great achievement, not only because this is the rarest process ever observed, but because it puts stringent limits on new theories. Here is why.

Theorists are convinced that a theory far more encompassing than the Standard Model exists, even though we have not detected its presence yet. The Standard Model is to particle physics what the four basic operations (addition, subtraction, multiplication and division) are to mathematics: sufficient to tackle daily problems, but one needs algebra, geometry and calculus to solve more complex ones. And in particle physics, we do have problems we cannot solve with the Standard Model, such as explaining the nature of dark matter and dark energy.

A good place to catch the first signs of “new physics” is where the Standard Model predicts very faint signals such as in Bs mesons decaying into two muons. These decays occur extremely rarely because the Standard Model only has limited ways to produce them. But if an additional mechanism comes into play due to some new theory, we would observe these decays at a rate different from what is expected within the Standard Model.

This is a bit like using the surface of a lake to detect the presence of an invisible creature, hoping its breath would create a ripple on the water surface. It would only work if the lake were extremely calm or disturbed only by an occasional tiny fish.  Here the Standard Model acts like all known little animals creating ripples on the water surface.  The hope was to detect other ripples in the absence of known causes (fish, frogs or mosquitoes). The LHCb result reveals no extra ripples yet. So either the new creature does not breathe as expected or we need to find another method to see it. It will be easier to know once the error margin is reduced with more data.

This new result pushes the reach for new physics even further. Nevertheless, it will help theorists eliminate faulty models, as on the plot below, and eventually zoom in on the right solution. Meanwhile, experimentalists will have to devise yet more stringent tests to be able to discover the way to this new physics.

This plot shows how this measurement (horizontal axis), shown earlier this year, reduced the space where new physics could be seen. With this new result, the constraints will be even stronger.

(For more details, see LHCb website)

Pauline Gagnon

To be alerted of new postings, follow me on Twitter: @GagnonPauline or sign up on this mailing list to receive an e-mail notification.


How is new physics discovered?

Friday, September 28th, 2012

Finding an experimental anomaly is a great way to open the door to a new theory. It is such a good trick that many of us physicists are bending over backward trying to uncover the smallest deviation from what the current theory, the Standard Model of particle physics, predicts.

This is the approach the LHCb collaboration at CERN is pursuing when looking at very rare decays. A minute deviation can be more easily spotted for rare processes. One good place to look is in the decay rates of K mesons, particles made of one strange quark \(s\) and one anti-down quark \(\bar{d}\).

There are in fact two sorts of K mesons: short-lived ones, \(K^0_S\) (called "K-short"), and long-lived ones, \(K^0_L\) ("K-long"). In the early 1970s, scientists discovered that the \(K^0_L\) were decaying into a pair of muons 10,000 times less often than the theory predicted. At the time, the theory knew of only three quarks: u, d and s. This led three theorists, Sheldon Glashow, John Iliopoulos and Luciano Maiani, to propose a mechanism that required the existence of a new, unknown quark, the charm quark c, to explain how this rate could be so suppressed. This explanation is now called the GIM mechanism, an acronym based on their last names.

This major breakthrough on a theoretical level was soon confirmed by the discovery of the charm quark in 1974.

Recently, the LHCb collaboration has turned its attention to measuring the decay rate of the short-lived kaons, \(K^0_S\), the only K mesons decaying fast enough to be seen with precision in their detector.

To make this measurement, they had to select billions of muon pairs and see if any came from the decay of a \(K^0_S\). One can reconstruct the mass of a decaying particle by combining the energy and momentum of all its fragments. If these muons were coming from the decays of a \(K^0_S\), the reconstructed mass would be the \(K^0_S\) mass, and an accumulation of events would appear near this value in the distribution of all the recombined masses.
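
Here is a minimal sketch of that reconstruction (my own illustration in Python, not LHCb code); the invariant mass follows from energy-momentum conservation, \(m^2 = (\sum E)^2 - |\sum \vec{p}|^2\):

```python
import math

def invariant_mass(p4_list):
    """Invariant mass of a set of particles given their (E, px, py, pz)
    four-vectors in consistent units (here MeV)."""
    E  = sum(p[0] for p in p4_list)
    px = sum(p[1] for p in p4_list)
    py = sum(p[2] for p in p4_list)
    pz = sum(p[3] for p in p4_list)
    return math.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))

# Two back-to-back muons from a K0S decaying at rest:
# each carries E = 497.6/2 MeV and momentum sqrt(E^2 - m_mu^2).
E = 497.6 / 2
p = math.sqrt(E**2 - 105.66**2)
print(invariant_mass([(E, p, 0, 0), (E, -p, 0, 0)]))  # ~497.6 MeV, the K0S mass
```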

But as can be seen in the figure below, no such accumulation appears in the region around 500 MeV, the \(K^0_S\) mass value. This allowed the LHCb collaboration to estimate how often a \(K^0_S\) can decay into two muons, a quantity called the branching ratio. They placed a limit at less than 9 times in a billion, or in scientific notation, \(BR(K^0_S\to\mu\mu) < 9\times10^{-9}\) at 90% confidence level, using all of the 2011 data. Since no peak appears anywhere on this curve, it means the muon pairs were produced in a variety of decays where other particles were also produced.

They have a long way to go since this limit is still about 2000 times larger than what the Standard Model predicts, namely a branching ratio of \(5\times10^{-12}\). Nevertheless, LHCb is getting closer to the theoretical prediction and eventually, given enough data, they might be able to test it.

Not easy to get to the next layer of the theory when the current one makes predictions requiring thousands of billions of events to be tested.

Pauline Gagnon

To be alerted of new postings, follow me on Twitter: @GagnonPauline or sign up on this mailing list to receive an e-mail notification.


Theoretically, the Higgs boson solves a lot of problems. Theoretically, this Higgs boson is a problem.

Greetings from the good ol’ U.S. of A.

Now that Fall is here, classes are going, holidays are wrapping up, and research programs are in full steam. Unfortunately, all is not well in the Wonderful World of Physics. To refresh: back on the 4th of July, the LHC experiments announced the outstanding and historic discovery of a new particle with properties consistent with the Standard Model Higgs boson. No doubt, this is a fantastic feat by the experiments, a triumph and culmination of a decades-long endeavor. However, there is deep concern about the existence of a 125 GeV Higgs boson. Being roughly 130 times the proton's mass, this Higgs boson is too light. A full and formal calculation of the Higgs boson's mass, according to the theory that predicts it, places the Higgs mass pretty close to infinity. Obviously, the Higgs boson's mass is less than infinite. So let's talk mass, and why this is still a very good thing for particle physics.

For an introduction to the Higgs boson, click here, here, or here (This last one is pretty good).

The Standard Higgs

The Standard Model of Particle Physics (SM) is the theory that describes, well, everything with the exception of gravity (Yes, this is admittedly a pretty big exception).  It may sound pompous and arrogant, but the SM really does a good job at explaining how things work: things like the lights in your kitchen, or smoke detectors, or the sun.

Though if this “theory of almost-everything” can do all this, then when written out explicitly, it must be pretty big, right? Yes. The answer is yes. Undeniably, yes. When written out fully and explicitly, the “Lagrangian of the Standard Model” looks like this (click to enlarge):

Figure 1: The Standard Model Lagrangian in the Feynman Gauge. Credit: T.D. Gutierrez

This rather infamous and impressive piece of work is by Prof. Thomas Gutierrez of Cal Poly SLO. Today, however, we only care about two terms (look for the red circles):

Figure 2: The Standard Model Lagrangian in the Feynman Gauge with the Higgs boson tree-level mass and 4-Higgs vertex interaction terms circled. Original Credit: T.D. Gutierrez

The first term is pretty straightforward. It expresses the fact that the Higgs boson has a mass, and this can be represented by the Feynman diagram in Fig. 3 (below). As simple and uneventful as this line may appear, its existence has a profound impact on the properties of the Higgs boson. For example, because of its mass, the Higgs boson can never travel at the speed of light; this is the complete opposite of the massless photon, which can only travel at the speed of light. The existence of the diagram in Fig. 3 also tells us exactly how a Higgs boson (denoted by h) travels from one place in the Universe, let's call it x, to another place in the Universe, let's call it y. Armed with this information, and a few other details, we can calculate the probability that a Higgs boson will travel from point x to point y, or that it will decay at some point in between.

Figure 3: The tree-level Feynman diagram that represents a SM Higgs boson (h) propagating from a point x in the Universe to a point y somewhere else in the Universe. Credit: Mine

The second term is an interesting little fella. It expresses the way the Higgs boson can interact with other Higgs bosons, or even itself. The Feynman diagram associated with this second term is in Fig. 4. It implies that there is a probability that a Higgs boson (at position w) and a second Higgs boson (at position x) can collide at some point in the Universe, annihilate, and then produce two Higgs bosons (at points z and y). To recap: two Higgses go in, two Higgses go out.

Figure 4: The tree-level Feynman diagram that represents two SM Higgs bosons (h) at points w and x in the Universe annihilating and producing two new SM Higgs bosons at points z and y somewhere else in the Universe. Credit: Mine

This next step may seem a little out-of-the-blue and unmotivated, but let’s suppose that one of the incoming Higgs bosons was also one of the outgoing Higgs bosons. This is equivalent to supposing that w was equal to z. The Feynman diagram would look like Fig. 5 (below).

Figure 5: By making an incoming Higgs boson (h) the same as an outgoing Higgs boson in the 4-Higgs interaction term, we can transform the tree-level 4-Higgs interaction term into the 1-loop level correction to Fig. 3, the diagram that represents the propagation of a Higgs boson in the Universe. Credit: Mine

In words, this “new” diagram states that as a Higgs boson (h) at position x travels to position y, it will emit and absorb a second Higgs boson somewhere in between x and y. Yes, the Higgs boson can and will emit and absorb a second Higgs boson.

If you look carefully, this new diagram has the same starting point and ending point as our first diagram in Fig. 3, the one that described a Higgs boson traveling from position x to position y. According to the well-tested rules of quantum mechanics, if two diagrams have the same starting and ending conditions, then both contribute to all the same processes and both must be included in any calculation with those starting and ending points. In terms of Feynman diagrams, if we want to talk about a Higgs boson traveling from point x to point y, then we need to look no further than Fig. 6.

 

Figure 6: The tree-level (L) and 1-loop level (R) contributions to a Higgs boson (h) traveling from point x to point y. Credit: Mine

What Does This All Mean?

Now that I am done building things up, let me quickly get to the point. The second diagram can be considered a "correction" to the first diagram. The first diagram is present because the Higgs boson is allowed to have mass (\(m_H\)). In a very real sense, the second diagram is a correction to the Higgs boson's mass. In a single equation, the two diagrams in Fig. 6 imply

Equation 1: The theoretical prediction for the SM Higgs boson's observed mass, which includes the "tree-level" contribution ("free parameter"), and 1-loop level contribution ("cutoff"). Credit: Mine
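
Since the equation itself was an image, here is a schematic reconstruction of its form (my notation; the coefficient \(c\) is a dimensionless loop factor of order \(\lambda/16\pi^2\)):

\[
\underbrace{m_{h,\mathrm{obs}}^{2}}_{\text{what we measure}} \;=\; \underbrace{m_{0}^{2}}_{\text{free parameter}} \;+\; \underbrace{c\,\Lambda^{2}}_{\text{cutoff term}} \;+\;\cdots
\]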

In Eq. (1), the term on the far left is the Higgs boson's mass that has been experimentally measured, i.e., 125 GeV. Hence the label, "what we measure." The term just right of that (the "free parameter") is the mass of the Higgs boson associated with the first term in the SM Lagrangian (Figs. 2 and 3). When physicists say the Standard Model does not predict the mass of the Higgs boson, it is this term (the free parameter) that we are talking about. The SM makes no mention of what it should be; we have to get down, get dirty, and actually conduct an experiment to measure it. The term on the far right can be ignored. The term "Λ" (the "cutoff scale"), on the other hand, terrifies and mystifies particle physicists.

Λ is called the “cutoff scale” of the SM. Physically, it represents the energy at which the SM stops working. I mean it: we stop calculating things when we get to energies equal to Λ. Experimentally, Λ is at least a few hundred times the mass of the proton. If Λ is very LARGE, like several times larger than the LHC’s energy range, then the observed Higgs mass gets an equally LARGE bump. For example, if the SM were 100% correct for all energies, then Λ would be infinity. If this were true, then

(the Higgs boson’s mass) = (something not infinity) + (something infinity) ,

which comes out inevitably to be infinity. In other words, if the Standard Model of Physics were 100% correct, then the Higgs boson's mass is predicted to be infinite. The Higgs boson's mass is not infinite, obviously, and therefore the Standard Model is not 100% correct. Therefore, the existence of the Higgs boson is proof that there must be new physics somewhere. "Where, and at what energy?" is a whole different question and rightfully deserves its own post.

 

Happy Colliding

– Richard (@bravelittlemuon)


The biggest news at CIPANP 2012 for particle physicists seems to be coming from the "low" energy frontier, at energies in the ballpark of 10 GeV and lower. This may come as a surprise to some people; after all, we've had experiments working at these energies for a few decades now, and there's a tendency to think that higher energies mean more potential for discovery. But the lower energy experiments have a great advantage over the giants at the LHC and Tevatron, and that is a richer collection of analyses.

There's a big difference between discovering a new phenomenon and discovering new physics, which is something that most people (including physicists!) don't appreciate enough. Whenever a claim of new physics is made we need to look at the wider implications of the idea. For example, let's say that we see the decay of a \(\tau\) lepton to a proton and a \(\pi^0\) meson. The Feynman diagram would look something like this:

tau lepton decay to a proton and a neutral pion, mediated by a leptoquark

The “X” particle is a leptoquark, and it turns leptons into quarks and vice versa. Now for this decay to happen at an observable rate we need something like this leptoquark to exist. There is no Standard Model process for \(\tau\to p\pi^0\) since it violates baryon number (a process which is only allowed under very special conditions). So suppose someone claims to see this decay, does this mean that they’ve discovered new physics? The answer is a resounding “No”, because if they make a claim of new physics they need to look elsewhere for similar effects. For example, if the leptoquark existed the proton could decay with this process:

proton decay to an electron and a neutral pion, mediated by a leptoquark

We have very stringent tests on the lifetime of the proton, and the lower limits are currently about 20 orders of magnitude longer than the age of the universe. Just take a second to appreciate the size of that limit. The proton lasts for at least 20 orders of magnitude longer than the age of the universe itself. So if someone is going to claim that they have proven the leptoquark exists, we need to check that what they have seen is consistent with the proton lifetime measurements. A claim of new physics is stronger than a claim of a new phenomenon, because it must be consistent with all the current data, not just the part we're working on.
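
To put a number on "20 orders of magnitude" (my own arithmetic; the specific limit below is an assumed circa-2012 value for \(p\to e^+\pi^0\), not a figure quoted in the post):

```python
import math

age_universe_yr = 1.4e10   # age of the universe, ~13.8 billion years
proton_limit_yr = 8.2e33   # assumed lower limit on the proton lifetime (years)

print(math.log10(proton_limit_yr / age_universe_yr))  # ~23.8 orders of magnitude
```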

How does all this relate to CIPANP 2012 and the low energy experiments? Well, it turns out that there are a handful of large disagreements in this regime that all tend to involve the same particles. The \(B\) meson can decay to several lighter particles, and the BaBar experiment has seen that decays to the \(\tau\) lepton happen more often than they should. The disagreement with the Standard Model predictions for \(B\to D^{(*)}\tau\nu\) is more than \(3\sigma\), which is interesting because it involves the heaviest quarks in bound states, and the heaviest lepton. It suggests that if there is a new particle or process, it favors coupling to heavy particles.
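
For concreteness, this comparison is usually phrased as a ratio in which many common uncertainties cancel (standard notation, added here for clarity):

\[
R(D^{(*)}) \;=\; \frac{\mathcal{B}(B\to D^{(*)}\tau\nu)}{\mathcal{B}(B\to D^{(*)}\ell\nu)}, \qquad \ell = e,\,\mu.
\]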

Standard model decays of the B mesons to τν, Dτν, and D*τν final states

In another area of \(B\) physics we find that the branching fraction \(\mathcal{B}(B\to\tau\nu)\) is about twice as large as we expect from the Standard Model. You can see the disagreement in the following plot, which compares two measurements (\(\mathcal{B}(B\to\tau\nu)\) and \(\sin 2\beta\)) to what we expect given everything else. The distance between the data point and the most favored region (center of the colored region) is very large, about \(3\sigma\) in total!

The disagreement between B→τν, sin2β and the rest of the unitary triangle measurements (CKMFitter)

Theorists love to combine these measurements using colorful diagrams, and the best known example is the unitary triangle. If the CKM mechanism describes all the quark mixing processes then all of the measurements should agree, and they should converge on a single apex of the triangle (at the angle labeled \(\alpha\)). Each colored band corresponds to a different kind of process, and if you look closely you can see some small disagreements between the various measurements:

The unitary triangle after Moriond 2012 (CKMFitter)

The blue \(\sin 2\beta\) measurement is pulling the apex down slightly, and the green \(|V_{ub}|\) measurement is pulling it in the other direction. This tension shows some interesting properties when we try to investigate it further. If we remove the \(\sin 2\beta\) measurement and then work out what we expect based on the other measurements, we find that the new "derived" value of \(\sin 2\beta\) is far from what is actually measured. The channel used for the analysis of \(\sin 2\beta\) is often called the golden channel, and it has been the main focus of both the BaBar and Belle experiments since their creation. The results for \(\sin2\beta\) are some of the best in the world and they have been checked and rechecked, so maybe the problem is not associated with \(\sin 2\beta\).

Moving our attention to \(|V_{ub}|\), the theorists at CKMFitter decided to split up the contributions based on the semileptonic inclusive and exclusive decays, and from \(\mathcal{B}(B\to\tau\nu)\). When this happens we find that the biggest disagreement comes from \(\mathcal{B}(B\to\tau\nu)\) compared to the rest. The uncertainties get smaller when \(\mathcal{B}(B\to\tau\nu)\) is combined with the \(B\) mixing parameter, \(\Delta m_d\), which is well understood in terms of top quark interactions, but these results still disagree with everything else:

Disagreement between B→τν, Δmd and the rest of the unitary triangle measurements (CKMFitter)

What this seems to tell us is that there could be a new process that affects \(B\) meson interactions, enhancing decays with \(\tau\) leptons in the final state. If this is the case then we need to look at other processes that could be affected. The most obvious signal to look for at the LHC is something like the production of \(b\) quarks and \(\tau\) leptons. Third generation leptoquarks would be a good candidate, as long as they cannot mediate proton decay in any way. Searching for a new particle or a new effect is the job of the experimentalist, but creating a model that accommodates the discoveries we make is the job of a theorist.

That, in a nutshell, is the difference between discovering a new phenomenon and discovering new physics. Anyone can find a bump in a spectrum, or even discover a new particle, but forming a consistent model of new physics takes a long time and a lot of input from all different kinds of experiments. The latest news from BaBar, Belle, CLEO and LHCb is giving us hints that there is something new lurking in the data. I can't wait to see what our theorist colleagues do with these measurements. If they can create a model which explains the anomalously high branching fractions \(\mathcal{B}(B\to\tau\nu)\), \(\mathcal{B}(B\to D\tau\nu)\), and \(\mathcal{B}(B\to D^*\tau\nu)\), and which tells us where else to look, then we're in for an exciting year at the LHC. We could see something more exciting than the Higgs in our data!

(CKMFitter images kindly provided by the CKMfitter Group (J. Charles et al.), Eur. Phys. J. C41, 1-131 (2005) [hep-ph/0406184], updated results and plots available at: http://ckmfitter.in2p3.fr)


Update: I accidentally miscalculated the decay rate of K40 in a banana. There are 12 decays, per second, per banana, not 18.

Wimps, they are everywhere! They pervade the Universe to its furthest reaches; they help make this little galaxy of ours spin right round like a record (we think); and they can even be found with all the fruit in your local grocery store.

Figure 1: (L) Two colliding galaxy clusters (Image: NASA's Chandra X-Ray Observatory). (R) Bananas, what else? (Image: Google)

WIMP, for Weakly Interacting Massive Particle, is an all-encompassing term used to describe any particle that (1) has mass and (2) is unlikely to interact with other particles. This term is amazing: it describes particles we know exist, and it is a generic, blanket term that adequately describes many hypothetical particles.

Neutrinos: The Prototypical WIMP

Back in 1930, there was a bit of a crisis in the freshly established field of particle physics. The primary mechanism that mediates most nuclear reactions, known as β-decay (beta-decay), violated (at the time) one of the great pillars of experimental physics: The Law of Conservation of Energy. This law says that energy can NEVER be created or destroyed, ever. Period. Sure, energy can be converted from one type, like vibrational energy, to another type, like heat, but it can never just magically (dis)appear.

Figure 2: In β-decay, before 1930, neutrons were (erroneously) believed to decay into a high speed electron (β) and a proton (p+).

Before 1930, physicists thought that when an atom's nucleus decayed via β-decay, a very energetic electron (at the time called a β particle) would be emitted from the nucleus. From the Conservation of Energy, the energy of the emitted electron is exactly predicted. The experimental result was pretty much as far off from the prediction as possible and implied the terrifying notion that perhaps energy was not conserved in Quantum Mechanics. Then, in 1930, the Nobel Prize-winning physicist Wolfgang Pauli noticed that the experimental measurements of β-decay looked a bit like what one would expect if, instead of one particle being emitted by a radioactive nucleus, two particles were emitted.
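
To make the kinematic argument explicit (standard two-body decay kinematics, my addition): if the neutron decayed to just a proton and an electron, conservation of energy and momentum would fix the electron's total energy at

\[
E_e \;=\; \frac{m_n^2 - m_p^2 + m_e^2}{2\,m_n} \;\approx\; 1.29~\mathrm{MeV},
\]

a single sharp line (about 0.78 MeV of kinetic energy). What experiments saw instead was a continuous spectrum ending at that value, exactly what one expects if a third, invisible particle carries away a variable share of the energy.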

Prof. Pauli thought the idea of a radioactive nucleus emitting two particles, one visible (the electron) and one invisible, was horrible, silly, and unprofessional. Consequently, he decided to pen a letter to the physics community suggesting there existed such a particle. 🙂 Using this idea and what could only be described as a level of intuition beyond that of genius, Nobel Laureate Enrico Fermi suggested that perhaps nuclear decay was actually the manifestation of a new, weak force and aptly named it the Weak Nuclear Force (note the capitalization).

To recap: 1 hypothetical particle mediated by 1 hypothetical force.

Figure 3: Prof. Pauli proposed that β-decay actually included an electrically neutral particle with little mass (χ0), in addition to the final-state electron (β) & proton (p+). This once-hypothetical particle is now known as the anti-neutrino (ν).

Twenty-six years later, in 1956, Prof. Pauli's invisible particles (by then called neutrinos) were finally detected; the W and Z bosons that carry the Weak Force were discovered in 1983; and in 1998, neutrinos were found to have mass.

Since 1930, hundreds of theories have invoked the existence of new particles that (1) have mass and (2) interact weakly (note the lack of capitalization) with other particles, which may or may not involve the Weak Nuclear Force (note the capitalization, again). At some point in the 1980s, a generic term was finally coined to distinguish these particles from other large classes of particles that are, say, massless or readily interact with other particles, e.g., with photons or gluons.

Dark Matter: The Elephant in the Galaxy

Kepler’s Laws of Motion & General Relativity are phenomenal at predicting the orbits of planets and solar systems around immense sources of gravity, like stars & black holes. However, there are two known astronomical observations where our predictions do not readily match the experimental results.

The first has to do with how our galaxy spins like a top. Theoretically, the more distant you are from a galaxy's center, the slower you orbit around it; vice versa, the closer you are to the galaxy's center, the faster you orbit around it. Experimentally, astronomers have found that beyond a certain distance from the galaxy's center, an object's speed becomes roughly constant. In other words, if Earth were half as close to the galactic center as it is now, its speed would not appreciably change. See Figure 4 (below) for a nice little graph that compares what is observed (solid line) and what is predicted (dotted line). Furthermore, this is not just our galaxy; this is common to all galaxies. Weird, right?
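
The expected fall-off is easy to sketch (my illustration with round numbers, not real survey data): if essentially all of a galaxy's mass sat at its center, orbital speeds would drop like \(1/\sqrt{r}\):

```python
import math

G     = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
M_gal = 1.0e41      # illustrative central mass, ~5e10 solar masses
kpc   = 3.086e19    # one kiloparsec in meters

def v_kepler(r_m):
    """Orbital speed if (nearly) all mass sits at the center."""
    return math.sqrt(G * M_gal / r_m)

for r in (2, 8, 16, 32):  # galactocentric radius in kpc
    print(r, "kpc:", round(v_kepler(r * kpc) / 1000), "km/s")
# Speeds fall like 1/sqrt(r); observed rotation curves instead
# stay roughly flat at ~200 km/s out to large radii.
```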

Figure 4: (A) The theoretical prediction of how fast an object travels (velocity) around the galactic center, as a function of (radial) distance from the center. (B) The experimental observation. (Image: Penn State)

The second disagreement between theory and experiment comes from watching galaxies collide with one another. Yes, I literally mean watching galaxies collide into one another (and you thought the LHC was wicked). This is how it looks:

Figure 5: Chandra X-Ray image of two galaxy clusters colliding. The pink regions represent the visible portions of the clusters; the blue regions represent the invisible (dark matter) portions, as calculated from gravitational lensing. (Image: NASA)

Astronomers & astrophysicists can usually determine how massive galaxies & stars are by how bright they are; however, the mass can also be determined by a phenomenon called gravitational lensing (a triumph of General Relativity). When NASA's Chandra X-Ray telescope took this little snapshot of two galaxy clusters (pink) passing right through each other, it was discovered, rather surprisingly, that the mass deduced from their brightness was only a fraction of the mass deduced from gravitational lensing (blue). You can think of this as physically feeling more matter than can visibly be seen.

What is fascinating is that these problems (of cosmic proportion) wonderfully disappear if there exists in the universe a very stable (read: does not decay), massive, weakly-interacting particle. Sound familiar? It should, because this type of WIMP is commonly known as Dark Matter! Normally, if a theory does not work, it is just thrown out. What makes General Relativity different is that we know it works; it has made a whole slew of correct predictions that are pretty unique. Predicting the precession of the perihelion of the planet Mercury is not as easy as it sounds. I am probably a bit biased, but personally I think it is a very simple solution to two "non-trivial" problems.

Bananas: A Daily Source of K-40

Since I bought a bunch of bananas this morning, I thought I would add a WIMP-related fact about bananas. Like I mentioned earlier, β-decay occurs when a neutron decays into a proton by emitting an electron and an anti-neutrino. From a particle physics perspective, this occurs when a down-type quark emits a W boson (via the Weak Force) and becomes an up-type quark. The W boson, which by our definition is a WIMP itself, then decays into an electron (e) and an anti-neutrino (ν, a WIMP). This is how a neutron, which has two down-type quarks & one up-type quark, becomes a proton, which has one down-type & two up-type quarks.

Figure 6: The fully understood mechanism of β-decay, in which a neutron (n0) can decay into a proton (p+) when a d-type quark (d) in the neutron emits a W boson (W) and becomes a u-type quark (u). The W boson consequently decays into an electron (e) and an anti-neutrino (νe).

This type of nuclear transmutation often occurs when a light atom, like potassium (K), has too many neutrons. Potassium-40, which has 19 protons & 21 neutrons, makes up about 0.01% of all naturally occurring potassium. Bananas are an exceptionally great source of this vital element, about 450 mg worth, and consequently contain about 45 μg (or \(\sim6.8\times10^{17}\) atoms) of the radioactive K-40 isotope. This translates to roughly 12 nuclear decays (or 12 neutrinos), per second, per banana. Considering humans and bananas have coexisted for quite a while in peaceful harmony, minus the whole humans-eat-bananas thing, it is my professional opinion that bananas are perfectly safe. 🙂
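
For the curious, the corrected number in the update checks out (a quick estimate; the half-life and isotope mass are textbook values, and I count every K-40 decay, not just the β ones):

```python
import math

N_A       = 6.022e23           # Avogadro's number
m_K40     = 45e-6              # grams of K-40 per banana (from the post)
half_life = 1.25e9 * 3.156e7   # K-40 half-life, ~1.25 billion years, in seconds

atoms    = m_K40 / 40.0 * N_A              # ~6.8e17 atoms
activity = math.log(2) / half_life * atoms
print(round(activity), "decays per second per banana")  # ~12
```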

Dark Matter Detection: CRESST

Okay, I have to be honest: I have a secret agenda in writing about WIMPs. The Cryogenic Rare Event Search with Superconducting Thermometers (CRESST) Experiment Collaboration will be announcing some, uh… interesting results at a press conference tomorrow, as a part of the Topics in Astroparticle & Underground Physics Conference (TAUP 2011). I have no idea what will be said or shown aside from this press release that states the “latest results from the CRESST Experiment provide an indication of dark matter.”

 

With that, I bid you adieu & Happy Colliding.

– richard (@bravelittlemuon)


The CDF detector at Fermilab. Credit: Fermilab/ Reidar Hahn

Wednesday afternoon, the CDF collaboration announced that it has evidence of a peak in a specific sample of its data. The peak is an excess of particle collision events that produce a W boson accompanied by two hadronic jets. This peak showed up in a mass region where we did not expect one. The peak was observed in the 140 GeV/c² mass range, as shown in the plot above. It is the kind of peak in a plot that, if confirmed, scientists associate with the existence of a particle. The significance of this excess was determined to be 3.2 sigma, after accounting for the effect of systematic uncertainties. This means that there is less than a 1 in 1375 chance that the effect is mimicked by a statistical fluctuation. Particle physicists consider a result at 5.0 sigma to be a discovery.
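
For reference, here is how a significance in sigmas maps onto those odds (a one-sided Gaussian tail; my own sketch, and note the quoted 1-in-1375 corresponds to a slightly finer significance than the rounded 3.2):

```python
from math import erf, sqrt

def one_sided_p(z):
    """One-sided Gaussian tail probability for a z-sigma excess."""
    return 0.5 * (1.0 - erf(z / sqrt(2.0)))

for z in (3.2, 5.0):
    p = one_sided_p(z)
    print(f"{z} sigma -> p = {p:.2e} (about 1 in {1 / p:,.0f})")
# 3.2 sigma -> about 1 in 1,455
# 5.0 sigma -> about 1 in 3,487,000 (the discovery threshold)
```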

The excess might be explained by the production of a new, unknown particle that is not predicted by the Standard Model, the current standard theory of the fundamental laws of physics. The features of this excess exclude the possibility that this peak might be due to a Standard Model Higgs boson or a supersymmetric particle. Instead, we might see a completely new type of force or interaction. A few models proposed and developed in recent years postulate the existence of new fundamental interactions beyond those known today, which would create an excess similar to the one seen in the CDF data. That’s why everybody at CDF is excited about this result.

The di-jet invariant mass distribution for candidate events selected in an analysis of W+2 jet events. The black points represent the data. The red line plots the expected Standard Model background shape based on Monte Carlo modeling. The red shading shows the systematic and statistical uncertainty on this background shape. The blue histogram is the Gaussian fit to the unexpected peak centered at 144 GeV/c².

The alternative explanation for this excess would be that we need to reconsider the theory that is used to predict the background spectrum, which is based on standard particle physics processes. That possibility, albeit less glamorous, would still have important implications. Those calculations use theoretical tools that are generally regarded as reliable and well understood, and form the basis of many other predictions in particle physics. Questioning these tools would require us to challenge our understanding of the fundamental forces of nature, the foundation of particle physics.

The current analysis is based on 4.3 inverse femtobarns of data. The CDF collaboration will repeat the analysis with at least twice as much data to see whether the bump gets more or less pronounced. Other experiments, including DZero and the LHC experiments, will look for a particle of about 140 GeV/c² in their data as well. Their results will either refute or confirm our result. Our result has been submitted to Physical Review Letters. You can read the paper and watch the lecture online.

It remains to be seen whether this measurement is an important indication of long-awaited new physics beyond the Standard Model.

— Edited by Rob Roser and Giovanni Punzi

Several interesting articles have been written about the result. Media interest was generated after a thesis article was spotted in an academic journal. Gordon Watts has an intriguing blog post about how the release of scientific information is and could be affected by today's fast-paced, Internet-driven society. This could bring people into the scientific process before an analysis has been fully vetted or enough data has been compiled and analyzed to declare something a discovery by reaching the 5-sigma threshold. Do you think that is a good or bad thing?

Related stories:

USLHC blog: A hint of something new in “W+dijets” at CDF

New York Times: At Particle Lab, a Tantalizing Glimpse Has Physicists Holding Their Breaths

Nature:  The Tevatron claims possible glimpse of physics beyond the standard model

Jakarta Globe:  US atom smasher may have found new force of nature


Back to school!

Tuesday, January 12th, 2010

It’s the first week of spring classes at UNL, even if it doesn’t look much like spring. (Temperatures will break the freezing mark tomorrow for the first time in about three weeks.) Today was the first day of the course I’m teaching this semester — introduction to particle physics at the graduate level. Actually, this “introduction” to the field is the only graduate-level course that we offer in the subject (we’re a small program), so I consider it a great privilege to be teaching it, and it is certainly a great responsibility, as for many of the students this will be the last course they ever take on this topic.

This is my second time teaching the class, and I must admit that I learned a lot of physics on my first time around, two years ago. Yes, I took a course like this as a graduate student, but the way to really learn something is to be prepared to teach it. I have a much greater appreciation for the successes of our models, and the constraints that all the existing data place on the possible extensions to those models.

It’s a lot easier to teach a course for the second time than for the first time, since you’ve done the work to re-learn all the material relatively recently, and you have a good idea about how you want to structure the course, etc. But I actually wish it weren’t so easy this time! When I last taught the class, in Spring 2008, the LHC was scheduled to start up that fall, and we would have had a year’s worth of data under our belts at this point. Perhaps it would have been naive to expect that we could have made any significant discoveries by now, but at the very least we would have started mapping out the physics of the next energy scale. I was hoping that I might have to significantly change the course for 2010 in light of what we were learning from the LHC!

But it wasn’t to be. However, by the end of the semester in early May, we will have collected a good amount of collision data at 7 TeV, and I’m hoping that I’ll be able to share some of that experience with the students in the class. And I am expecting that I’ll be teaching this course again in Spring 2012 — let’s hope that I have a lot of prep work to do then!
