Kyle Cranmer | USLHC | USA


A particle detector in your pocket

Wednesday, February 4th, 2015

Do you love science and technology and sometimes wish you could contribute to a major discovery? I’ve got good news: “there’s an app for that.” With the Crayfis app, you can join a world-wide network of smartphones designed to detect ultra-high energy cosmic rays.

Cosmic rays were discovered by Victor Hess in 1912, for which he received the Nobel Prize in Physics in 1936. They are constantly raining down on us from space: typically they are atomic nuclei that hit the upper atmosphere, producing a huge shower of particles, some of which make it to the Earth’s surface.

Just last year a team of scientists published a result, based on data from the Fermi gamma-ray space telescope, indicating that lower-energy cosmic rays are associated with supernovae. However, the origin of the most energetic ones remains a mystery.

The highest-energy cosmic rays are amazing: they have about as much kinetic energy as a 60 mph (~100 km/h) baseball packed into a single atomic nucleus! This is much higher energy than what is probed by the LHC, but these kinds of ultra-high energy cosmic rays are very rare. To get a feel for the numbers, the Pierre Auger Observatory, which is about the size of Rhode Island or Luxembourg, observes one of these ultra-high energy cosmic rays roughly every four weeks. What could possibly be responsible for accelerating particles to such high energies?

Untangling the mystery of these ultra-high energy cosmic rays will require observing many more, which means either a very long-running experiment or a very large area. Current arrays with large, highly efficient devices like Auger cannot grow dramatically larger without becoming much more expensive. This motivates some out-of-the-box thinking.

Smartphones are perfect candidates for a global cosmic ray detector. Phones these days are high-tech gadgets. The camera sensor is a lot like the pixel detectors of ATLAS and CMS, so it is capable of detecting particles from cosmic ray showers (check out the video for a quick demo). In addition, most phones have GPS to tell them where they are, wifi connections to the internet, and significant processing power. Perhaps most importantly, there are billions of smartphones already in use.

Late last year a small team led by Daniel Whiteson and Michael Mulhearn put out a paper making the case for such a world-wide network of smartphones. The paper is backed up by lab tests of the smartphone cameras and simulations of ultra-high energy cosmic ray showers. The results indicate that if we can have roughly a thousand square-kilometer clusters, each with a thousand phones, the exposure would be roughly equivalent to that of the Pierre Auger Observatory. The paper quickly garnered attention, as indicated by the “altmetric” summary below.

After the initial press release, more than 50,000 people signed up for the Crayfis project! That’s a great start. The Crayfis apps for iOS and Android are currently in beta testing and should be ready soon. I’ve joined this small project by helping develop the iOS app and the website, which are both a lot of fun. All you have to do is plug your phone in and set it camera-down, probably at night while you are sleeping. If your phone thinks it has been hit by a cosmic ray, it will upload the data to the Crayfis servers. Later, we will look for groups of phones that were hit simultaneously, which indicates that it’s not just noise but a real cosmic ray shower.
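To make that last step concrete, here is a toy sketch of the coincidence idea. This is my own illustration, not the actual Crayfis pipeline: the time window, distance scale, and minimum number of phones below are made-up numbers chosen just to show the logic of grouping hits and rejecting isolated noise.

    from collections import namedtuple

    Hit = namedtuple("Hit", ["phone_id", "time_s", "lat", "lon"])

    def find_shower_candidates(hits, window_s=1e-3, max_km=20.0, min_phones=3):
        """Group hits that are close in time and space; keep groups with several phones."""
        groups = []
        for hit in sorted(hits, key=lambda h: h.time_s):
            for group in groups:
                ref = group[0]
                close_in_time = abs(hit.time_s - ref.time_s) < window_s
                # crude flat-earth distance in km -- fine for a toy example
                dist_km = 111.0 * ((hit.lat - ref.lat) ** 2 + (hit.lon - ref.lon) ** 2) ** 0.5
                if close_in_time and dist_km < max_km:
                    group.append(hit)
                    break
            else:
                groups.append([hit])
        # a shower candidate = several distinct phones hit essentially at once
        return [g for g in groups if len({h.phone_id for h in g}) >= min_phones]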

The image below shows a screenshot of the live monitor of the Crayfis network so far — check it out, it’s fun to play with. As you can see, Crayfis is already a world-wide network and may soon be able to claim the title of world’s largest detector.

Crayfis: A global network of smartphones

 

What’s stopping you? Turn your phone into a cosmic ray detector, join the search, and get the app!

 



Help the Crayfis project grow: like us on Facebook.

Kyle Cranmer is a Professor of physics and data science at New York University. His blog is at theoryandpractice.org.


The Realineituhedron

Tuesday, April 1st, 2014

Inspired by the deep insights revealed in the recent work around the Amplituhedron, a new and deeper mathematical principle has revealed itself. While the amplituhedron caused quite a buzz even outside the world of theoretical particle physics, thus far it is restricted to N=4 supersymmetry. In contrast, this new object is able to represent all known predictions for physical observables. The new object, outlined in a recent paper, is being called “The Realineituhedron”.

The key observation is that at the end of the day, everything we measure can be represented as a real number. The paper outlines a particular way of projecting these observations onto the realineituhedron, in which the “volume” Ω of the object represents the value of the observation.

In fact, the physically observable quantity must be a real number, a feature foreshadowed by the Hermitian postulate of quantum mechanics.

The paper is full of beautiful hand-drawn figures, such as the ones below:

Is it possible that some geometrical object is able to capture the Hermitian nature of these operators, and indeed to represent all fundamental observables?

This masterful work will take some time to digest — it was only released today! One of the most intriguing ideas is that of the “Master Realineituhedron”, denoted ℝ², in which all realineituhedrons can be embedded.

It would be interesting to see whether this larger space has any role to play in understanding the m = 1 geometry relevant to physics.

 

[This post was originally posted here]



The Higgs Boson: A Natural Disaster!

Saturday, February 1st, 2014

The discovery of the Higgs boson was a triumph for particle physics. It completes the tremendously successful Standard Model of particle physics.  Of course, we know there are other phenomena — like dark matter, the dominance of matter over anti-matter, the mass of neutrinos, etc. — that aren’t explained by the Standard Model.  However, the Higgs itself is the source of one of the deepest mysteries of particle physics: the fine tuning problem.

The fine-tuning problem is related to the slippery concept of naturalness, and has driven the bulk of theoretical work for the last several decades.  Unfortunately, it is notoriously difficult to explain.  I took on this topic recently for a public lecture and came up with an analogy that I would like to share.

Why we take our theory seriously

Before discussing the fine tuning, we need a few prerequisites.  The first thing to know is that the Standard Model (and most other theories we are testing) is based on a conceptual framework called Relativistic Quantum Field Theory (QFT).  As you might guess from the name, it’s based on the pillars of relativity, quantum mechanics, and field theory.  The key point here is that relativistic quantum field theory goes beyond the initial formulation of quantum mechanics.  To illustrate this difference, let’s consider a property of the electron and muon called the “g-factor”, which relates the magnetic moment to the spin [more].  In standard quantum mechanics, the prediction is that g=2; however, with relativistic quantum field theory we expect corrections.  Those corrections are shown pictorially in the Feynman diagrams below.

Feynman diagram corrections to g-2

It turns out that this correction is small — about one part in a thousand.  But we can calculate it to an exquisite accuracy (about ten digits).  Moreover, we can measure it to a comparable accuracy.  The current result for the muon is

g = 2.0023318416 ± 0.000000001

This is a real tour de force for relativistic quantum field theory and represents one of the most stringent tests of any theory in the history of science [more].  To put it into perspective, it’s slightly better than hitting a hole in one from New York to China (that distance is about 10,000 km = 1 billion cm).

It is because of tests like these that we take the predictions of this conceptual framework very seriously.

Precision of g-2

The Higgs, fine tuning, and an analogy

It turns out that all quantities that we can predict receive similar quantum corrections, even the mass of the Higgs boson.    In the Standard Model, there is a free parameter that can be thought of as an initial estimate for the Higgs mass, let’s call it M₀.  There will also be corrections, let’s call them ΔM (where Δ is pronounced “delta” and it indicates “change to”).   The physical mass that we observe is this initial estimate plus the corrections.  [For the aficionados: usually physicists talk about the mass squared instead of the mass, but that does not change the essential message].

The funny thing about the mass of the Higgs is that the corrections are not small.  In fact, the naive size of the corrections is enormously larger than the 126 GeV mass that we observe!

Confused?  Now is a good time to bring in the analogy.  Let’s think about the budget of a large country like the U.S.  We will think of positive contributions to the Higgs mass as income (taxes) and negative contributions to the Higgs mass as spending.  The physical Higgs mass that we measure corresponds to the budget surplus.

Now imagine that there is no coordination between the raising of taxes and government spending (maybe it’s not that hard). Wouldn’t you be surprised if a large economy of trillions of dollars had a budget balanced to better than a penny?  Wouldn’t it be unnatural to expect such fine tuning between income and spending if they are just independent quantities?

This is exactly the case we find ourselves in with the Standard Model… and we don’t like it.  With the discovery of the Higgs, the Standard Model is now complete.  It is also the first theory we have had that can be extrapolated to very high energies (we say that it is renormalizable). But it has a severe fine tuning problem and does not seem natural.

The budget analogy

The budget analogy, in table form

 

The analogy can be fleshed out a bit more.  It turns out that the size of the corrections to the Higgs mass is related to something we call the cutoff, which is the  energy scale where the theory is no longer a valid approximation because some other phenomena become important.  For example, in a grand unified theory the strong force and the electroweak force would unify at approximately 10¹⁶ GeV (10 quadrillion GeV), and we would expect the corrections to be of a similar size.  Another common energy scale for the cutoff is the Planck Scale — 10¹⁹ GeV — where the quantum effects of gravity become important.  In the analogy, the cutoff energy corresponds to the fiscal year.  As time goes on, the budget grows and the chance of balancing the budget so precisely seems more and more unnatural.
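To put rough numbers on this, here is a quick back-of-the-envelope calculation in the spirit of the analogy. It is my own illustration: it simply assumes the correction to the Higgs mass-squared is of order the cutoff squared, so the required cancellation is roughly (126 GeV / cutoff)².

    m_higgs = 126.0  # GeV, the observed Higgs mass

    for name, cutoff in [("GUT scale", 1e16), ("Planck scale", 1e19)]:
        tuning = (m_higgs / cutoff) ** 2   # fraction of the "budget" left over
        print(f"{name}: the budget must cancel to about 1 part in {1/tuning:.0e}")

    # Planck-scale cutoff: roughly 1 part in 10^34 -- like a multi-trillion-dollar
    # budget coming out balanced to a minuscule fraction of a penny.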

Going even further, I can’t resist pointing out that the analogy even offers a nice way to think about one of the most enigmatic concepts in quantum field theory called renormalization.  We often use this term to describe how fundamental constants aren’t really constant.  For example, the  charge of an electron depends on the energy you use to probe the electron.  In the analogy, renormalization is like adjusting for inflation.  We know that a dollar today isn’t comparable to a dollar fifty years ago.

Breaking down the budget

The first thing one wants to do before attempting to balance the budget is to find out where the money is going.  In the U.S. the big budget items are the military and social programs like social security and medicare.  In the case of the Higgs, the biggest corrections come from the top quark (the top diagrams on the right).  Of course the big budget items get most of the attention, and so it is with physics as well.  Most of the thinking that goes into solving the fine tuning problem is related to the top quark.

The budget of corrections to the Higgs mass

Searching for a principle to balance the budget

Maybe it’s not a miraculous coincidence that the budget is balanced so well.  Maybe there is some underlying principle.  Maybe someone came to Washington DC and passed a law to balance the budget that says that for every dollar of spending there must be a dollar of revenue.  This is an excellent analogy for supersymmetry.  In supersymmetry, there is an underlying principle — a symmetry — that relates two types of particles (fermions and bosons).  These two types of particles give corrections to the Higgs mass with opposite signs.  If this symmetry was perfect, the budget would be perfectly balanced, and it would not be unnatural for the Higgs to be 126 GeV.
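As a schematic illustration of that cancellation (a toy of my own, not a real loop calculation), the sketch below just encodes the sign structure: a fermion and its boson partner contribute with opposite signs, so the corrections cancel exactly when the symmetry forces their couplings to match, and a huge remainder reappears as soon as the couplings differ.

    def fermion_correction(coupling, cutoff):
        return -coupling**2 * cutoff**2   # the "spending" side of the budget

    def boson_correction(coupling, cutoff):
        return +coupling**2 * cutoff**2   # the "income" side of the budget

    cutoff = 1e19  # GeV, a Planck-scale choice just for illustration

    print(fermion_correction(1.0, cutoff) + boson_correction(1.0, cutoff))    # 0.0: perfectly balanced
    print(fermion_correction(1.0, cutoff) + boson_correction(0.999, cutoff))  # slightly broken symmetry: huge imbalance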

That is one of the reasons that supersymmetry is so highly motivated, and there is an enormous effort to search for signs of supersymmetry in the LHC data.  Unfortunately, we haven’t seen any evidence for supersymmetry thus far. In the analogy, that is a bit like saying that if there is some sort of law to balance the budget, it allows for some wiggle room between spending and taxes.  If the law allows for too much wiggle room between spending and taxes then it may still be a law, but it isn’t explaining why the budget is balanced as well as it is.  The current state of the LHC experiments indicates that the budget is balanced about 10-100 times better than the wiggle room allows — which is better than we would expect, but not so much better that it seems unnatural.  However, if we don’t see supersymmetry in the next run of the LHC the situation will be worse. And if we were to build a 100 TeV collider and not see evidence of supersymmetry, then the level of fine tuning would be high enough that most physicists probably would consider the situation unnatural and abandon supersymmetry as the solution to the fine tuning problem.

SUSY

Since the fine tuning problem was first recognized, there have been essentially two proposed solutions.  One of them is supersymmetry, which I discussed above.  The second is often referred to as strong dynamics or compositeness.  The idea there is that maybe the Higgs is not a fundamental particle, but instead is a composite of some more fundamental particles.  My colleague Jamison Galloway and I tried to think through the analogy in that situation. In that case, one must start to think of different kinds of currencies… say the dollar for the Higgs boson and some other currency, like bitcoin, for the more fundamental particles.  You would imagine that as time goes on (energy increases) there is a transition from one currency to another.   At early times the budget is described entirely in terms of dollars, but at later times the budget is described in terms of bitcoin.  That transition can be very complicated, but if it happened at a time when the total budget in dollars wasn’t too large, then a well-balanced budget wouldn’t seem too unnatural.  Trying to explain the rest of the compositeness story took us from a simple analogy to the basis for a series of sci-fi fantasy books, and I will spare you from that.

There are a number of examples where this aesthetic notion of naturalness has been a good guide, which is partially why physicists hold it so dear.  However, another line of thinking is that maybe the theory is unnatural; maybe it is random chance that the budget is balanced so well.  That thinking is bolstered by the idea that there may be a huge number of universes that are part of a larger complex we call the multiverse. In most of these universes the budget wouldn’t be balanced and the Higgs mass would be very different.  In fact, most universes would not form atoms, would not form stars, and would not support life.  Of course, we are here to observe our universe, and the conditions necessary to support life select very special universes out of the larger multiverse.  Maybe it is this requirement that explains why our universe seems so finely tuned.  This reasoning is called the anthropic principle, and it is one of the most controversial topics in theoretical physics. Many consider it to be giving up on a more fundamental theory that would explain why nature is as it is.  The very fact that we are resorting to this type of reasoning is evidence that the fine tuning problem is a big deal. I discuss this at the end of the public lecture (starting around the 30 min mark) with another analogy for the multiverse, but maybe I will leave that for another post.

Nota bene:  After developing this analogy I learned about a similar analogy from Tommaso Dorigo. Both use the idea of money, but the budget analogy goes a bit further.


Inspired by the Higgs, a step forward in open access

Thursday, September 12th, 2013

The discovery of the Higgs boson is a major step forward in our understanding of nature at the most fundamental levels. In addition to being the last piece of the standard model, it is also at the core of the fine tuning problem — one of the deepest mysteries in particle physics. So it is only natural that our scientific methodology rise to the occasion to provide the most powerful and complete analysis of this breakthrough discovery.

This week the ATLAS collaboration has taken an important step forward by making the likelihood function for three key measurements about the Higgs available to the world digitally. Furthermore, this data is being shared in a way that represents a template for how particle physics operates in the fast-evolving world of open access to data. These steps are a culmination of decades of work, so allow me to elaborate.

Four interactions that can produce a Higgs boson at the LHC

Higgs production and decay measured by ATLAS.

First of all, what are the three key measurements, and why are they important? The three results were presented by ATLAS in this recent paper.  Essentially, they are measurements of how often the Higgs is produced at the LHC through different types of interactions (shown above) and how often it decays into three different force-carrying particles (photons, W, and Z bosons).  In this plot, the black + sign at (1,1) represents the standard model prediction and the three sets of contours represent the measurements performed by ATLAS.  These measurements are fundamental tests of the standard model, and any deviation could be a sign of new physics like supersymmetry!

Ok, so what is the likelihood function, and why is it useful?  Here maybe it is best to give a little bit of history.  In 2000, the first in a series of workshops was held at CERN where physicists gathered to discuss the details of the statistical procedures that lead to the final results of our experiments.  Perhaps surprisingly, there is no unique statistical procedure, and there is a lot of debate about the merits of different approaches.  After a long discussion panel, Massimo Corradi cut to the point:

It seems to me that there is a general consensus that what is really meaningful for an experiment is likelihood, and almost everybody would agree on the prescription that experiments should give their likelihood function for these kinds of results. Does everybody agree on this statement, to publish likelihoods?

And Louis Lyons, who chaired the session, summed it up…

Any disagreement? Carried unanimously.  That’s actually quite an achievement for this workshop.

So there you have it, the likelihood function is the essential piece of information needed for communicating scientific results.
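For readers who haven’t met the term before, here is a textbook-style toy example (my own, not ATLAS’s actual model): for a counting experiment that observes n events when a signal strength mu times s plus a background b are expected, the likelihood is just the Poisson probability of the observed count, viewed as a function of mu. The numbers below are made up.

    from math import exp, factorial

    def likelihood(mu, n=25, s=10.0, b=20.0):
        """Poisson likelihood for observing n events when mu*s + b are expected."""
        lam = mu * s + b
        return lam**n * exp(-lam) / factorial(n)

    # The best-fit mu is the value that maximizes this function of mu.
    print(likelihood(0.0), likelihood(0.5), likelihood(1.0))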

So what happened next?  Well… for years, despite unanimous support, experiments still did not publish their likelihood functions.  Part of the reason is that we lacked the underlying technology to communicate these likelihood functions efficiently.  In the run-up to the LHC we developed some technology (associated with RooFit and RooStats) for being able to share very complicated likelihood functions internally.  This would be the ideal way to share our likelihood functions, but we aren’t quite there yet.  In January 2013, we had a conference devoted to the topic of publishing likelihood functions, which culminated in a paper “On the presentation of LHC Higgs results”.  This paper, written by theorists and experimentalists, singled out the likelihood associated with the plot above as the most useful way of communicating information about the Higgs properties.

An overlay of the original ATLAS result (filled contours) and those reproduced from the official ATLAS likelihood functions.

The reason that these specific Higgs plots are so useful is that more specific tests of the standard model can be derived from them.  For instance, one might want to consider beyond-the-standard-model theories where the Higgs interacts with all the matter particles (fermions) or all the force-carrying particles (vector bosons) differently than in the standard model.  To do that, it is useful to group together all of the information in a particular way and take a special 2-d slice through the 6-d parameter space described by the three 2-d plots above.  To the left is the result of this test (where the axes are called κ_V and κ_F for the vector bosons and fermions, respectively).  What is special about this plot is that there is an overlay of the original ATLAS result (filled contours) and those reproduced from the official ATLAS likelihood functions.  While my student Sven Kreiss made the comparison as part of a test, anyone can now reproduce this plot from the official ATLAS likelihood functions.  More importantly, the same procedure that was used to make this plot can be used to test other specific theories — and there are a lot of alternative ways to reinterpret these Higgs results.
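For the curious, here is a rough sketch of how one might evaluate such a profile likelihood with ROOT’s RooFit/RooStats via PyROOT, assuming the published files are RooFit workspaces (the technology mentioned above). The file name, workspace name, dataset name, and parameter names are placeholders of my own, not the actual ATLAS ones.

    import ROOT

    f = ROOT.TFile.Open("atlas_higgs_couplings.root")  # placeholder file name
    w = f.Get("combined")                               # placeholder workspace name
    mc = w.obj("ModelConfig")                           # RooStats ModelConfig convention
    data = w.data("obsData")                            # placeholder dataset name

    nll = mc.GetPdf().createNLL(data)
    kV = w.var("kappa_V")                               # placeholder parameter names
    kF = w.var("kappa_F")
    profile = nll.createProfile(ROOT.RooArgSet(kV, kF)) # minimize over everything else

    # Scan -2 log(profile likelihood ratio) on a small grid of (kappa_V, kappa_F)
    for v in (0.8, 1.0, 1.2):
        for fval in (0.8, 1.0, 1.2):
            kV.setVal(v)
            kF.setVal(fval)
            print(v, fval, 2.0 * profile.getVal())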

Great! So where can you find these likelihood functions, and what does this have to do with open access?  I think this part is very exciting.  CERN is now famous for being the birthplace of the World Wide Web and having a forward-looking vision for open access to our published papers. The sheer volume and complexity of the LHC data makes the conversation about open access to the raw data quite complicated.  However, having access to our published results is much less controversial.  While it is not done consistently, there are several examples of experiments putting the information that goes into tables and figures on HepData (a repository for particle physics measurements).  Recently, our literature system INSPIRE started to integrate with HepData so that the data are directly associated with the original publication (here is an example).  What is important is that this data is discoverable and citable.  If someone uses this data, we want to know exactly what is being used, and the collaborations that produced the data deserve some credit.  INSPIRE is now issuing a Digital Object Identifier (DOI) to this data, which is a persistent and trackable link to the data.

So now for the fun part, you can go over to the INSPIRE record for the recent Higgs paper (http://inspirehep.net/record/1241574) and you will see this:

The INSPIRE record for the recent ATLAS Higgs paper.

 

If you click on the HepData tab at the top, it will take you to a list of data associated with this paper.   Each of the three entries has a DOI associated with it (and lists all the ATLAS authors).  For example, the H→γγ result’s DOI is 10.7484/INSPIREHEP.DATA.A78C.HK44, and this is what should be cited for any result that uses this likelihood.    (Note: to get to the actual data, you click on the Files tab.)  INSPIRE is now working so that your author profile will not only include all of your papers, but also the data sets that you are associated with (and you can also see the data associated with your ORCID ID).
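Because these records are machine-readable, they can also be queried programmatically. As a small, hedged illustration, the snippet below asks the modern INSPIRE REST API (which postdates this post, and whose JSON layout I am assuming here) for the record linked above and prints any DOIs attached to it.

    import requests

    record = requests.get("https://inspirehep.net/api/literature/1241574").json()
    for doi in record["metadata"].get("dois", []):
        print(doi["value"])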

The INSPIRE record for the H→γγ likelihood function.

Now it’s time for me and my co-authors to update our paper “On the presentation of LHC Higgs results” to cite this data.  And next week, Salvatore Mele, head of Open Access at CERN, will give a keynote presentation to the DataCite conference entitled “A short history of the Higgs Boson. From a tenth of a billionth of a second after the Big Bang, through the discovery at CERN, to a DataCite DOI”.

I truly hope that this becomes standard practice for the LHC.  It is a real milestone for the information architecture associated with the field of high energy physics and a step forward in the global analysis of the Higgs boson discovered at the LHC!

Update (Sept. 17): The new version of our paper, which cites these likelihoods, is now out.

Update (Sept. 18): The data record now has a citation tab as well, so you can distinguish citations to the data and citations to the paper.


Visiting my high school in Arkansas

Wednesday, August 21st, 2013

This week I will be going to visit my high school in Arkansas.  It was 20 years ago that the school first opened its doors, and I was part of that charter class.  The Arkansas School for Mathematics, Science & the Arts is a bit unusual: it is “one of only fifteen public, residential high schools in the country specializing in the education of gifted and talented students who have an interest and aptitude for mathematics and science.”  Because it is a state-wide school, attending was a lot like leaving for college two years early.

Arkansas is not particularly well known for its educational system — as kids we would joke “thank god for Mississippi” whenever Arkansas came in 49th out of 50 in some educational ranking.  My brother attended Little Rock’s Central High, which is famous for its history in the civil rights movement and the desegregation of the school system.  I’m happy to see that Arkansas is doing better in the educational rankings, but there is still a long way to go.  For those of you not from the US, I’ve included a map showing this rural state in the southern part of the US.



Kyle Cranmer with Bill Clinton in Arkansas Governor’s office in 1991.

 

The school has an interesting history: it was created in 1991 by an act of the Arkansas Legislature.  Bill Clinton was Governor of Arkansas at the time, and I happened to get a photo with him that year in his office (wearing my friend’s hideous sweater, since my own clothes had gotten dirty while playing at his house).

 While the school is more closely modeled after the North Carolina School of Science and Mathematics, one of the other early schools of this type was the Illinois Mathematics and Science Academy.  Here’s a tidbit from Wikipedia:

“Nobel laureate Leon Lederman, director emeritus of nearby Fermi National Accelerator Laboratory in Batavia, Illinois, was among the first to propose the Illinois school in 1982, and together with Governor Jim Thompson led the effort for its creation. Thompson has noted with pride that he chose to build IMSA instead of competing for the ill-fated supercollider project.”

 

This school changed my life.  I learned calculus and calculus-based physics from Dr. Irina Lyublinskaya, a Russian-educated Ph.D. physicist who had left Russia due to religious persecution.  I took an organic chemistry course in high school with awesome labs where we extracted DNA from plants and ran gel electrophoresis.  I was frustrated by the lack of activities, so I got involved in school politics. But probably the most important aspect of my time there was learning from my friends and taking on all sorts of projects.  I learned some basic electronics from my electronics-guru friends Colin and Stephen (who made a TV from a scrap oscilloscope), my friend Thomas made a pretty nice Tesla coil, we used to get in trouble making potato guns, and I almost lost an eye in a rail gun trial.  I remember making a binary half adder out of some huge old telephone relay switches, and when I connected the current you could hear the simple computation proceed knock-knock-knock until the lights at the end of the big piece of plywood I was using lit up to confirm 1+2=3.  My friend Sean taught me about programming, and my friend Colin taught me about neural networks and fast Fourier transforms.  I spent weeks soldering together an EEG for my science fair project, which aimed to identify different classes of thought from brain waves by analyzing their characteristic frequency spectrum with a neural network — an idea I got while watching a documentary about Stephen Hawking.  And we were all on-line and exposed to the world wide web in its formative years (1993-95).

Tomorrow I leave to go visit the school 20 years later.  We will meet with legislators, parents, alumni, students, and supporters.  I look forward to telling the students about the tremendously exciting career I’ve had in particle physics, culminating in the discovery of the Higgs boson.


A fresh look for the standard model

Monday, August 19th, 2013

(Note: This is an updated version of a post that I originally made on my personal website theoryandpractice.org.)

Recently I’ve been more involved in communication, education, and outreach activities via the “Snowmass” Community Summer Study.  One of the goals we discussed was to get to the point where the public is more aware of the fundamental particles.  Ideally, we’d like something as iconic as the periodic table (which is rotated from Mendeleev’s original).

The periodic table

 

Our standard graphic for the standard model builds on this tabular format, which is not unreasonable, with the three generations of fermions for the columns and the rows pointing to the up/down pairing of the SU(2) symmetry in the weak force.  It’s a cute graphic, but it has a number of problems for communicating with the public:

  1. the Higgs is absent
  2. the 3-d effect is meaningless and is second only to our notorious use of Comic Sans in painting physicists as inept in the graphic design department

 

The standard graphic for the standard model

 

It seems easy enough to add the Higgs to this table, but there seems to be no agreement on where to put it, as you can see from Google’s image search.

A Google image search shows no agreement on where to put the Higgs

From a physicist’s point of view there are some other problems that actually hinder those starting to learn the standard model in detail:

  1. the symmetry for the strong force (the RGB colors of the quarks) is not reflected at all, leading to the idea that there is only one type of up quark
  2. the complications about the left- and right- handed parts of the leptons in the weak interaction
  3. the mixing between the quarks
  4. the rows and columns don’t mean anything for the force carriers, and any sort of group-theoretic structure for the gauge bosons is missing

In June, I went to the Sheffield Documentary Film Festival for the premiere screening of Particle Fever.  It’s a great film that humanizes fundamental particle physics in an emotional, funny, and romantic way.  It also has some great graphics.  One of my favorites was a new way of representing the fundamental particles.  During the after-party of the premiere, the director Mark Levinson gave me the back story (which I had forgotten about until he reminded me):

It was actually our brilliant editor, Walter Murch, who had been obsessing about finding an iconic representation for the Standard Model equivalent to the Periodic Table. He wanted something that was accurate, meaningful, elegant and simple. One morning he came into the edit room and told me he had had a “benzene ring” dream – an idea for a circular representation of the SM. I think David [Kaplan] and I may have suggested a couple of small modifications, but essentially it was the “artist” who trumped the physicists in devising what I hope becomes an iconic representation of the fundamental particles of physics!

 

Particle Fever Standard Model Graphic

 

Here’s what I like about it:

  1. it looks complete (which the standard model is, in a certain sense), unlike a table that can keep being appended with rows and columns
  2. it has a fresh, flat design that lends itself well to an iconic image (stickers, t-shirts, etc.)
  3. It’s round, which evokes notions of symmetry
  4. it is minimal, but it still has some basic structure
    1. rings of fermions, vector bosons, scalar (Higgs) boson
    2. quarks/leptons are top/bottom or red/green
    3. families are still there in the clockwise orientation
  5. the Higgs is central (I’m kind of kidding, but the Higgs is a unique, central part of the theory and it has gathered a huge amount of attention to the field)

Of course, the graphic is not perfect.  I’ve thought about variations.  For instance, rearranging the fermions from a clockwise-oriented flow to a left/right and top/bottom symmetry for the quark/lepton and weak-force (SU(2) doublet) structure.  One could play with color a bit so that the up/down-type quarks and leptons have a common coloring in some way.  However, all of these changes can also be given the same criticism I gave the standard standard model graphic at the top.  For instance, focusing on the weak interaction over the strong interaction.

After the original post I got a few comments on the graphic.  Some didn’t like the idea that it looked complete, because we know the standard model is not the full story (dark matter, baryogenesis, neutrino masses, etc.).  While it is certainly true that fundamental physics is not complete, the standard model is.  Near the end of this trailer for Particle Fever, you see this standard model graphic dressed up with a Penrose tiling and some supersymmetric friends.  The other complaint was that it suggested that the force carriers only interact with specific particles (g with d, s, b; γ with u, c, t; Z with neutrinos; and W with charged leptons).  I guess so, but that same kind of geometrical/semantic connection was also there in the standard graphic that we use.  Any graphic will be prone to these types of criticisms from the experts, so we must weigh those objections against the gain in communicating a more streamlined message.

In the end, I think it would behoove the physics community to popularize a fresh, iconic image for the standard model and use the public’s excitement about the Higgs discovery as impetus to educate the general public about the basics of fundamental particle physics.

 

EDIT: You can now buy a shirt or poster with the Particle Fever Standard Model graphic here:
http://particlefever.bigcartel.com/product/standard-model-t-shirt
