Kyle Cranmer | USLHC | USA


The Higgs Boson: A Natural Disaster!

The discovery of the Higgs boson was a triumph for particle physics: it completes the tremendously successful Standard Model. Of course, we know there are other phenomena — like dark matter, the dominance of matter over antimatter, and the mass of neutrinos — that aren't explained by the Standard Model. However, the Higgs itself is the source of one of the deepest mysteries of particle physics: the fine-tuning problem.

The fine-tuning problem is related to the slippery concept of naturalness, and it has driven the bulk of theoretical work in particle physics for the last several decades. Unfortunately, it is notoriously difficult to explain. I took on this topic recently for a public lecture and came up with an analogy that I would like to share.

Why we take our theory seriously

Before discussing the fine-tuning problem, we need a few prerequisites. The first thing to know is that the Standard Model (and most other theories we are testing) is based on a conceptual framework called Relativistic Quantum Field Theory (QFT). As you might guess from the name, it's based on the pillars of relativity, quantum mechanics, and field theory. The key point here is that relativistic quantum field theory goes beyond the initial formulation of quantum mechanics. To illustrate this difference, let's consider a property of the electron and muon called the "g-factor", which relates their magnetic moment and spin [more]. In standard quantum mechanics, the prediction is that g=2; however, with relativistic quantum field theory we expect corrections. Those corrections are shown pictorially in the Feynman diagrams below.

[Figure: Feynman diagrams for the quantum corrections to g]

It turns out that this correction is small — about one part in a thousand.  But we can calculate it to an exquisite accuracy (about ten digits).  Moreover, we can measure it to a comparable accuracy.  The current result for the muon is

g = 2.0023318416 ± 0.000000001

This is a real tour de force for relativistic quantum field theory and represents one of the most stringent tests of any theory in the history of science [more]. To put it into perspective, it's slightly better than hitting a hole in one from New York to China (that distance is about 10,000 km = 1 billion cm).
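
To see where that "one part in a thousand" comes from, here is a minimal sketch (my own illustration, not from the lecture) using Schwinger's famous one-loop result, a = α/2π:

```python
import math

# Schwinger's one-loop result: g = 2(1 + a) with a = alpha / (2*pi).
# This single Feynman diagram already accounts for the bulk of the
# "one part in a thousand" correction; the remaining digits come from
# higher-order diagrams.
alpha = 1 / 137.035999  # fine-structure constant (low-energy value)
a = alpha / (2 * math.pi)

print(f"correction a = {a:.7f}")            # ~0.0011614, i.e. about 1/1000
print(f"g = 2(1 + a) = {2 * (1 + a):.7f}")  # ~2.0023228 vs measured 2.0023318...
```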

It is because of tests like these that we take the predictions of this conceptual framework very seriously.

[Figure: the precision of the g−2 prediction and measurement]

The Higgs, fine tuning, and an analogy

It turns out that all quantities that we can predict receive similar quantum corrections, even the mass of the Higgs boson. In the Standard Model there is a free parameter that can be thought of as an initial estimate for the Higgs mass; let's call it M₀. There will also be corrections; let's call them ΔM (where Δ is pronounced "delta" and indicates "change to"). The physical mass that we observe is this initial estimate plus the corrections. [For the aficionados: usually physicists talk about the mass squared instead of the mass, but that does not change the essential message.]
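
In symbols, the statement above is simply

$$ M_{\text{physical}} = M_0 + \Delta M $$

(or, for the aficionados, $m_h^2 = m_0^2 + \delta m^2$ at the level of the mass squared).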

The funny thing about the mass of the Higgs is that the corrections are not small. In fact, the naive size of the corrections is enormously larger than the 126 GeV mass that we observe!

Confused?  Now is a good time to bring in the analogy.  Let’s think about the budget of a large country like the U.S.  We will think of positive contributions to the Higgs mass as income (taxes) and negative contributions to the Higgs mass as spending.  The physical Higgs mass that we measure corresponds to the budget surplus.

Now imagine that there is no coordination between the raising of taxes and government spending (maybe that's not so hard to imagine). Wouldn't you be surprised if a large economy of trillions of dollars had a budget balanced to better than a penny? Wouldn't it be unnatural to expect such a fine-tuning between income and spending if they are just independent quantities?
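
To put a rough number on it (my arithmetic, not a figure from the lecture): a three-trillion-dollar budget balanced to within a penny would be a cancellation of about one part in $10^{14}$,

$$ \frac{\$3 \times 10^{12}}{\$0.01} = 3 \times 10^{14}. $$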

This is exactly the situation we find ourselves in with the Standard Model… and we don't like it. With the discovery of the Higgs, the Standard Model is now complete. It is also the first theory we have had that can be extrapolated to very high energies (we say that it is renormalizable). But it has a severe fine-tuning problem and does not seem natural.

[Figure: the U.S. federal budget]

[Table: the correspondence between the budget analogy and the Higgs mass]

The analogy can be fleshed out a bit more. It turns out that the size of the corrections to the Higgs mass is related to something we call the cutoff, which is the energy scale where the theory is no longer a valid approximation because some other phenomena become important. For example, in a grand unified theory the strong force and the electroweak force would unify at approximately 10¹⁶ GeV (10 quadrillion GeV), and we would expect the corrections to be of a similar size. Another common energy scale for the cutoff is the Planck scale — 10¹⁹ GeV — where the quantum effects of gravity become important. In the analogy, the cutoff energy corresponds to the fiscal year: as time goes on, the budget grows, and the chance of balancing it so precisely seems more and more unnatural.
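
Here is a minimal sketch of how the required tuning grows with the cutoff, assuming only the schematic scaling described above (corrections to the mass squared of order the cutoff squared; the numbers are illustrative):

```python
# How finely must M0 cancel the corrections? Since the corrections enter at
# the level of the mass squared and scale like the cutoff squared, the
# required cancellation is roughly (cutoff / M_higgs)**2.
M_HIGGS = 126.0  # observed Higgs mass in GeV

for name, cutoff in [("LHC reach", 1e4), ("GUT scale", 1e16), ("Planck scale", 1e19)]:
    tuning = (cutoff / M_HIGGS) ** 2
    print(f"{name:>12}: cutoff = {cutoff:.0e} GeV -> tuned to ~1 part in {tuning:.0e}")
```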

Going even further, I can't resist pointing out that the analogy even offers a nice way to think about one of the most enigmatic concepts in quantum field theory: renormalization. We often use this term to describe how fundamental constants aren't really constant. For example, the charge of an electron depends on the energy you use to probe it. In the analogy, renormalization is like adjusting for inflation: we know that a dollar today isn't comparable to a dollar fifty years ago.
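
As a concrete illustration of a constant that "runs", here is the textbook one-loop QED formula with only the electron included in the loop (the full calculation includes every charged particle, so this understates the effect):

```python
import math

# One-loop QED running of the fine-structure constant, electron loop only:
# alpha(Q) = alpha(0) / (1 - (2*alpha(0)/(3*pi)) * ln(Q/m_e))
ALPHA_0 = 1 / 137.035999  # value at low energies
M_E = 0.000511            # electron mass in GeV
M_Z = 91.19               # Z boson mass in GeV

def alpha(Q):
    """Effective fine-structure constant at probe energy Q (in GeV)."""
    return ALPHA_0 / (1 - (2 * ALPHA_0 / (3 * math.pi)) * math.log(Q / M_E))

print(f"1/alpha at Q = m_e: {1 / alpha(M_E):.1f}")  # 137.0, the familiar value
print(f"1/alpha at Q = M_Z: {1 / alpha(M_Z):.1f}")  # ~134.5 in this electron-only toy
```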

Breaking down the budget

The first thing one wants to do before attempting to balance the budget is to find out where the money is going. In the U.S. the big budget items are the military and social programs like Social Security and Medicare. In the case of the Higgs, the biggest corrections come from the top quark (the top diagrams on the right). Of course the big budget items get most of the attention, and so it is with physics as well: most of the thinking that goes into solving the fine-tuning problem is related to the top quark.
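
For the aficionados, the standard leading expression for that top-quark contribution (a textbook formula, not derived in this post) is

$$ \delta m_h^2 \simeq -\frac{3 y_t^2}{8\pi^2}\,\Lambda^2, $$

where $y_t \approx 1$ is the top Yukawa coupling and $\Lambda$ is the cutoff; the top dominates precisely because its coupling to the Higgs is so large.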

[Figure: the breakdown of the corrections to the Higgs mass]

Searching for a principle to balance the budget

Maybe it's not a miraculous coincidence that the budget is balanced so well. Maybe there is some underlying principle. Maybe someone came to Washington, D.C. and passed a law to balance the budget, one that says that for every dollar of spending there must be a dollar of revenue. This is an excellent analogy for supersymmetry. In supersymmetry, there is an underlying principle — a symmetry — that relates two types of particles (fermions and bosons). These two types of particles give corrections to the Higgs mass with opposite signs. If this symmetry were perfect, the budget would be perfectly balanced, and it would not be unnatural for the Higgs mass to be 126 GeV.

That is one of the reasons that supersymmetry is so highly motivated, and there is an enormous effort to search for signs of supersymmetry in the LHC data. Unfortunately, we haven't seen any evidence for supersymmetry thus far. In the analogy, that is a bit like saying that if there is some sort of law to balance the budget, it allows for some wiggle room between spending and taxes. If the law allows for too much wiggle room, it may still be a law, but it isn't explaining why the budget is balanced as well as it is. The current state of the LHC experiments indicates that the budget is balanced about 10-100 times better than the wiggle room allows, which is better than we would expect, but not so much better that it seems unnatural. However, if we don't see supersymmetry in the next run of the LHC, the situation will get worse. And if we were to build a 100 TeV collider and not see evidence of supersymmetry, then the level of fine-tuning would be high enough that most physicists would probably consider the situation unnatural and abandon supersymmetry as the solution to the fine-tuning problem.
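
Here is a toy sketch of the "balanced budget law" and its wiggle room (the function is schematic and the masses illustrative; it is not the full supersymmetric calculation): the dangerous quadratic pieces from the top quark and its superpartner cancel exactly when their masses match, and a residual set by the mass splitting survives when they don't.

```python
import math

CUTOFF = 1e16  # GeV; a GUT-scale cutoff, for illustration

def residual_correction(m_top, m_stop):
    """Schematic leftover correction to the Higgs mass-squared (GeV^2):
    the cutoff**2 pieces cancel between the fermion and boson loops,
    leaving a log-enhanced term set by the superpartner mass splitting."""
    return (m_stop**2 - m_top**2) * math.log(CUTOFF / m_stop)

print(residual_correction(173.0, 173.0))            # 0.0 -- exact symmetry, perfect balance
print(f"{residual_correction(173.0, 1000.0):.2e}")  # ~3e7 GeV^2 of "wiggle room" for a 1 TeV stop
```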

[Figure: supersymmetric partners canceling the corrections]

Since the fine-tuning problem was first recognized, there have been essentially two proposed solutions. One of them is supersymmetry, which I discussed above. The second is often referred to as strong dynamics or compositeness. The idea there is that maybe the Higgs is not a fundamental particle; instead, it's a composite of some more fundamental particles. My colleague Jamison Galloway and I tried to think through the analogy in that situation. In that case, one must start to think of different kinds of currencies: say, dollars for the Higgs boson and some other currency, like bitcoin, for the more fundamental particles. You would imagine that as time goes on (energy increases) there is a transition from one currency to the other. At early times the budget is described entirely in terms of dollars, but at later times it is described in terms of bitcoin. That transition can be very complicated, but if it happened at a time when the total budget in dollars wasn't too large, then a well-balanced budget wouldn't seem too unnatural. Trying to explain the rest of the compositeness story took us from a simple analogy to the basis for a series of sci-fi fantasy books, and I will spare you from that.

There are a number of examples where this aesthetic notion of naturalness has been a good guide, which is partially why physicists hold it so dear. However, another avenue of thinking is that maybe the theory is unnatural; maybe it is random chance that the budget is balanced so well. That thinking is bolstered by the idea that there may be a huge number of universes that are part of a larger complex we call the multiverse. In most of these universes the budget wouldn't be balanced, and the Higgs mass would be very different. In fact, most universes would not form atoms, would not form stars, and would not support life. Of course, we are here to observe our universe, and the conditions necessary to support life select very special universes out of the larger multiverse. Maybe it is this requirement that explains why our universe seems so finely tuned. This reasoning is called the anthropic principle, and it is one of the most controversial topics in theoretical physics; many consider it giving up on a more fundamental theory that would explain why nature is as it is. The very fact that we are resorting to this type of reasoning is evidence that the fine-tuning problem is a big deal. I discuss this at the end of the public lecture (starting around the 30-minute mark) with another analogy for the multiverse, but maybe I will leave that for another post.

Nota bene: After developing this analogy, I learned of a similar one from Tommaso Dorigo. Both use the idea of money, but the budget analogy goes a bit further.


15 Responses to “The Higgs Boson: A Natural Disaster!”

  1. Lamont Granquist says:

So, what you're saying is that naturalness is just as terrible an idea as a balanced budget amendment?

    • Kyle Cranmer says:

I think the analogy would be that Supersymmetry is as bad an idea as a balanced budget amendment.

  2. Tienzen (Jeh-Tween) Gong says:

    @ Kyle Cranmer: “ …that Supersymmetry is as bad an idea as a balanced budget amendment.”

    Amen!

In addition to there being no direct evidence of SUSY in the LHC's first run, all vital signs of SUSY are dead.
    1. Being a useful dark matter candidate: very much ruled out by the LUX data.
    2. Being a hidden player (balancer) at the weak scale: almost totally ruled out by the LHCb data and the electric dipole moment (EDM) measurement.
    3. Having a SUSY-Higgs boson: almost totally ruled out by the Compact Muon Solenoid Experiment.

@ Kyle Cranmer: "And if we were to build a 100 TeV collider and not see evidence of supersymmetry, then the level of fine-tuning would be high enough that most physicists would probably consider the situation unnatural and abandon supersymmetry as the solution to the fine-tuning problem."

What meaningful part can a 10 or 50 TeV sparticle play in 'this' universe, especially in terms of the Standard Model?

@ Kyle Cranmer: "However, another avenue of thinking is that maybe the theory is unnatural; maybe it is random chance that the budget is balanced so well. … In most of these universes the budget wouldn't be balanced, and the Higgs mass would be very different. In fact, most universes … and would not support life. Of course, we are here to observe our universe, and the conditions necessary to support life select very special universes out of the larger multiverse."

    There is one argument against this multiverse idea, at http://snarxivblog.blogspot.com/2014/01/numerology-from-m-theory.html?showComment=1390518129054#c2231367787497872375 .

    • Sandro says:

Tienzen, your argument against the multiverse is not based on "falsification"; it's merely an argument of axiomatic parsimony, i.e., Occam's razor, along with the recognition that many such multiverse theories provide no additional predictive power, so they are scientifically weak at best. They may be metaphysically interesting, however, which is the point that Kyle was getting at.

  3. Peter Gerdes says:

    While many physicists worry a great deal over fine-tuning I haven’t yet seen a single one make a serious argument for the claim that the universe is fine-tuned.

Obviously, we can't use the fact that the universe happens to be well-suited to the creatures that evolved in it as evidence for fine-tuning. That's simply unsurprising. The mere fact that we can retrospectively come up with a fact about the universe that we would have assigned low prior probability is never evidence for fine-tuning. The only real observation we have that is evidence for fine-tuning is that the universe contains some kind of intelligent behavior, i.e., behavior that looks like complex but still orderly data processing, since it is the only observation that we can be confident any creatures, regardless of how their hypothetical universe started, would agree with us on.

    So for fine-tuning to be a problem it should turn out that it is intuitively very unlikely for the universe to have contained intelligent life unless the experimentally determined constants in our theory settled in a very narrow region.

So, yeah, before we worry over solving the fine-tuning problem, let's make sure it really exists. Simply noting that if we had significantly different values of the various physical constants we wouldn't get things like stars, or that planets wouldn't form, or whatever, isn't enough. After all, nothing requires the hypothetical intelligences in a hypothetical universe to look anything like us. Without a detailed analysis it is totally unjustified to assume that the intelligences in that other universe don't end up being far more spread out in time and space, or that perhaps in such a universe the behavior of particles (even if the atom is unstable) falling into black holes or in neutron stars doesn't undergo a distant analog of the complex behavior, driven by some kind of `entropy' gradient, that produced our own intelligence.

I put entropy in scare quotes because even that notion is relative to the creatures and universe in question. We talk like entropy is a real physical property, but in reality its definition depends on the fact that certain kinds of states (say, those in which I keep two kinds of gas separated and unmixed) admit a much more natural description than the state five minutes after they start to mix. But in calculating entropy we made a fairly arbitrary choice when we decided that it was the number of micro-states compatible with certain kinds of macroscopic observables of the system. One could certainly imagine some creature which was more diffuse in space and time, and only relative to some entirely different way of looking at their universe, natural to them, would a similar notion of entropy exist.

In other words, I'm suggesting that no matter how the universe starts out, even if it started out in what we would call a high entropy state, there is some way of describing that universe under which it `starts' (it always starts at the low entropy point, since that determines the arrow of time) and evolves towards a higher entropy state. This description may be totally meaningless to us except as a mathematical formalism. We can't even describe creatures for whom a single instant is spread out over all of what we would call time, and their forward evolution in time need not correspond to any of our physical dimensions, but as long as it is consistent with the laws of physics that doesn't matter.

As a toy analogy, consider all the crazy results we prove in physics, such as: if the universe is in X kind of equilibrium, there is an orderly statistical relation between the number of times a particle system absorbs a photon in the past or future, the maximum kinetic energy the system achieves, the maximum momentum it ever achieves, and various integrals over its total behavior. If these relations are not only orderly but are local (with respect to themselves), so they can be written as differential equations in terms of the max momentum, max energy, and number of absorptions, it might well be that the number of absorptions works out to be similar to our time, the two maxima correspond to spatial dimensions, and the remaining statistics give the observables.

    This isn’t actually true but it makes it clear that behaving sufficiently like our universe has nothing to do with being similar to our universe. It is a very tricky mathematical question about whether or not you can describe that universe with laws that have certain properties.

  4. The balanced budget analogy is great! I will surely use it the next time I have to explain this idea to non-physicists. Your writeup is very good, too.

  5. Xezlec says:

The question is, will we ever have the political will to build a 100 TeV collider if the LHC turns up nothing special in the next run? The LHC was already more costly than the politicians seem to be fully comfortable with.

Not that I think they're right to be uncomfortable, mind you — by a quick, rough calculation, I'd say the LHC cost about 36 dollars per EU taxpayer in total, or about 2 dollars per taxpayer per year of construction and operation. That seems insignificant. With American support, it should be even easier. But there doesn't seem to be any organization or individual who makes any effort to get those numbers out into the public eye. The media talks in terms of total costs, which are completely unintuitive to the average voter.

    • Kyle Cranmer says:

Hi Xezlec,

      I agree with your point, but I hope you aren’t under the impression that only European taxpayers contributed to the LHC. It’s an international effort, and the US made a very large contribution:
      http://www.uslhc.us/The_US_and_the_LHC/Fact_Sheets

I agree that people often focus on the total number and not the time scale of the investment or what it costs to an average person. Moreover, the huge benefit to society (both economic and intellectual) is often underappreciated.

      Thanks for the comment.

  6. Tienzen (Jeh-Tween) Gong says:

@Peter Gerdes: "So, yeah, before we worry over solving the fine-tuning problem, let's make sure it really exists."

I personally do not think that there is a fine-tuning problem for 'Nature'. But it is a true problem for physicists and physics theories, as those theories are not complete. Fifty years ago, Nature meant nature semantically. Now, many think that theory is nature, while Nature could be un-natural. This is a mentality issue. Then, this issue gets worse with the linguistic confusions among the words: preciseness vs. fine-tuning, evolution vs. axiom-expressions.

Evolution is clearly defined as the 'change' of one entity when it encounters an 'external' entity (force). On the other hand, an axiomatic system 'changes' by axiomatic expression, totally internally, without encountering any external entity. When we put the 'multiverse' issue aside, this universe changes via axiomatic expression, not evolution (although we always use the term evolution to describe the stage-changes of this universe). This semantic confusion of terminology is one of the big reasons for this fine-tuning issue.

Then, 'this' universe is anchored (rooted) in some natural constants, such as Alpha (the fine-structure constant, locking e (electric charge), c (light speed), and ħ (Planck constant)). With these roots firmly anchored (locked), this universe is allowed to express itself. These roots guarantee the 'precise' expression of this universe, which is now confused with fine-tuning.

When a theory can derive those roots, it will reveal the axiom-system of this universe, and this fine-tuning issue will be no more. More detailed discussion of this is available at http://tienzengong.wordpress.com/2013/12/17/nonsense-of-the-un-nature-nature/ .

  7. JC says:

Very well described, and a good analogy. Even I can understand quite a bit of it now!

  8. Gunn Quznetsov says:

All well-known elementary bosons are gauge bosons. Apparently, the 125-126 GeV particle found by the LHC represents some hadron multiplet.

Every physics event is interpreted in terms of particles similar to the well-known elementary particles – leptons, quarks, and gauge bosons. Therefore, if anybody claims to have found the Higgs, do not believe it – this is not the Higgs. http://arxiv.org/abs/physics/0302013v3 , Quznetsov G 2013 Logical foundation of fundamental theoretical physics (Lambert Academic Publ.)

  9. […] over at QuantumDiaries on an analogy about the fine tuning problem related to the Higgs boson. "The Higgs Boson: A Natural Disaster!" ‹ Enabling data sharing, citation, and discovery Posted in […]

  10. Kevin Dowd says:

Hi Kyle, great analogy. I would say that the unified budget, where Social Security is included, gives a (deliberately) distorted view of US finances. If we take out the FICA revenue and the SS payments (and the smaller Medicare tax portion), the resulting budget expense percentages look quite different.

Our military budget is then about a third of our expenses. When you add in hidden military spending such as nukes at DOE, the VA for war's results, its portion of the interest, NSA and CIA expenses, etc., you get an even larger share.

    Maybe we need a dedicated military tax like we have FICA for SS!

Great blog. I will bookmark it.

  11. Exomnium says:

    “Trying to explain the rest of the compositeness story took us from a simple analogy to the basis for a series of sci-fi fantasy books, and I will spare you from that.”

    Did you write it down? I think my friends in phenomenology would enjoy it.
