CERN’s universe is ours!

Ken Bloom
Sunday, September 29th, 2013

This past weekend, CERN held its first open days for the public in about five years. This was a big, big deal. I haven’t heard any final statistics, but the lab was expecting about 50,000 visitors on each of the two days. (Some rain on Sunday might have held down attendance.) Thus, the open days were a huge operation — roads were shut down, and Transports Publics Genevois was running special shuttle buses among the Meyrin and Prévessin sites and the access points on the LHC ring. The tunnels were open to people who had reserved tickets in advance — a rare opportunity, and one that is only possible during a long shutdown such as the one currently underway.

A better CERN user than me would have volunteered for the open days. Instead, I took my kids to see the activities. We thought that the event went really well. I was bracing for it to be a mob scene, but in the end the Meyrin site was busy but not overrun. (Because the children are too small, we couldn’t go to any of the underground areas.) There were many eager orange-shirted volunteers at our service, as we visited open areas around the campus. We got to see a number of demonstrations, such as the effects of liquid-nitrogen temperatures on different materials. There were hands-on activities for kids, such as assembling your own LHC and trying to use a scientific approach to guessing what was inside a closed box. Pieces of particle detectors and LHC magnets were on display for all to see.

But I have to say, what really got my kids excited was the Transport and Handling exhibit, which featured CERN’s heavy lifting equipment. They rode a scissors lift that took them to a height of several stories, and got to operate a giant crane. Such a thing would never, ever happen in the US, which has a very different culture of legal liability.

I hope that all of the visitors had a great time too! I anticipate that the next open days won’t be until the next long shutdown, which is some years away, but it will be well worth the trip.


Aces high

Ken Bloom
Thursday, September 19th, 2013

Much as I love living in Lincoln, Nebraska, having a long residence at CERN has some advantages. For instance, we do get much better traffic of seminar and colloquium speakers here. (I know, you were thinking about chocolate.) Today’s colloquium in particular really got me thinking about how we do, or don’t, understand particle physics today.

The speaker was George Zweig of MIT. Zweig has been to CERN before — almost fifty years ago, when he was a postdoctoral fellow. (This was his first return visit since then.) He had just gotten his PhD at Caltech under Richard Feynman, and was busy trying to understand the “zoo” of hadronic particles that were being discovered in the 1960s. (Side note: Zweig pointed out today that at the time there were 26 known hadronic particles…19 of which are no longer believed to exist.) Zweig developed a theory that explained the observations of the time by positing a set of hadronic constituents that he called “aces”. (He thought there might be four of them, hence the name.) Some particles were made of two aces (and thus called “deuces”) and others were made of three (and called “treys”). This theory successfully explained why some expected particle decays didn’t actually happen in nature, and gave an explanation for differences in masses between various sets of particles.

Now, reading this far along, you might think that this sounds like the theory of quarks. Yes and no — it was Murray Gell-Mann who first proposed quarks, and his model made similar successful predictions. But there was a critical difference between the two theories. Zweig’s aces were meant to be true physical particles — concrete quarks, as he referred to them. Gell-Mann’s quarks, by contrast, were merely mathematical constructs whose physical reality was not required for the success of the theory. At the time, Gell-Mann’s thinking held sway. I’m no expert on this period in the history of theoretical particle physics, but my understanding is that the Gell-Mann approach was more in line with the theoretical fashions of the day; besides, if you could have a successful theory without introducing new particles that were themselves sketchy (their electric charges had to be fractions of the electron charge, and they apparently couldn’t be observed anyway), why would you?

Of course, we now know that Zweig’s interpretation is more correct; this was already becoming apparent a few short years later, when deep-inelastic scattering experiments at SLAC in the late 1960s discovered that nucleons had smaller constituents, though at the time it was controversial to actually associate those constituents with the quarks (or aces). For whatever reason, Zweig left the field of particle physics and went on to a successful career as a faculty member at MIT, doing work in neurobiology that involved understanding the mechanisms of hearing.

I find it a fascinating tale of how science actually gets done. How might it apply to our science today? A theory like the standard model of particle physics has been so well tested by experiment that it is taken to be true without controversy. But theories of physics beyond the standard model, the sort of theories that we’re now trying to test at the LHC, are much less constrained. And, to be sure, some are more popular than others, because they are believed to have a certain inherent beauty to them, or because they fit well with patterns that we think we observe. I’m no theorist, but I’m sure that some theories are currently more fashionable than others. But in the absence of experimental data, we can’t know that they are right. Perhaps there are some voices that are not being heard as well as they need to be. Fifty years from now, will we identify another George Zweig?


Inspired by the Higgs, a step forward in open access

Kyle Cranmer
Thursday, September 12th, 2013

The discovery of the Higgs boson is a major step forward in our understanding of nature at the most fundamental levels. In addition to being the last piece of the standard model, it is also at the core of the fine tuning problem — one of the deepest mysteries in particle physics. So it is only natural that our scientific methodology rise to the occasion to provide the most powerful and complete analysis of this breakthrough discovery.

This week the ATLAS collaboration has taken an important step forward by making the likelihood function for three key measurements about the Higgs available to the world digitally. Furthermore, this data is being shared in a way that represents a template for how particle physics operates in the fast-evolving world of open access to data. These steps are a culmination of decades of work, so allow me to elaborate.

Four interactions that can produce a Higgs boson at the LHC.

Higgs production and decay measured by ATLAS.

First of all, what are the three key measurements, and why are they important? The three results were presented by ATLAS in this recent paper.  Essentially, they are measurements for how often the Higgs is produced at the LHC through different types of interactions (shown above) and how often it decays into three different force carrying particles (photons, W, and Z bosons).  In this plot, the black + sign at (1,1) represents the standard model prediction and the three sets of contours represent the measurements performed by ATLAS.  These measurements are fundamental tests of the standard model and any deviation could be a sign of new physics like supersymmetry!

Ok, so what is the likelihood function, and why is it useful?  Here maybe it is best to give a little bit of history.  In 2000, the first in a series of workshops was held at CERN where physicists gathered to discuss the details of the statistical procedures that lead to the final results of our experiments.  Perhaps surprisingly, there is no unique statistical procedure, and there is a lot of debate about the merits of different approaches.  After a long discussion panel, Massimo Corradi cut to the point:

It seems to me that there is a general consensus that what is really meaningful for an experiment is likelihood, and almost everybody would agree on the prescription that experiments should give their likelihood function for these kinds of results. Does everybody agree on this statement, to publish likelihoods?

And as Louis Lyons, chairing the session, put it…

Any disagreement? Carried unanimously.  That’s actually quite an achievement for this workshop.

So there you have it, the likelihood function is the essential piece of information needed for communicating scientific results.
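
To make this concrete with a toy example of my own (not anything from the workshop): imagine a counting experiment that observes n events on top of an expected background b, with a signal strength μ scaling an expected signal s. The numbers below are made up, but they show why the full curve L(μ) says more than any single quoted value like μ = 1.0 ± 0.4 ever could:

```python
import math

def likelihood(mu, n=15, s=10.0, b=5.0):
    """Poisson likelihood L(mu) for observing n events when mu*s + b are expected."""
    lam = mu * s + b
    return math.exp(-lam) * lam**n / math.factorial(n)

# The best-fit signal strength is simply where L(mu) peaks...
mus = [i / 100 for i in range(301)]
best = max(mus, key=likelihood)
print(best)  # the peak sits at (n - b)/s = 1.0

# ...but publishing the whole curve, not just the peak, lets anyone
# combine this measurement with others or reinterpret it later.
```

The same logic, with vastly more complicated likelihood functions, is what the workshop consensus was about.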

So what happened next?  Well… for years, despite unanimous support, experiments still did not publish their likelihood functions.  Part of the reason is that we lacked the underlying technology to communicate these likelihood functions efficiently.  In the run-up to the LHC we developed some technology (associated with RooFit and RooStats) for sharing very complicated likelihood functions internally.  This would be the ideal way to share our likelihood functions, but we aren’t quite there yet.  In January 2013, we had a conference devoted to the topic of publishing likelihood functions, which culminated in the paper “On the presentation of LHC Higgs results”.  This paper, written by theorists and experimentalists, singled out the likelihood associated with the plot above as the most useful way of communicating information about the Higgs properties.

An overlay of the original ATLAS result (filled contours) and those reproduced from the official ATLAS likelihood functions.

The reason that these specific Higgs plots are so useful is that more specific tests of the standard model can be derived from them.  For instance, one might want to consider beyond-the-standard-model theories where the Higgs interacts with all the matter particles (fermions) or all the force-carrying particles (vector bosons) differently than in the standard model.  To do that, it is useful to group together all of the information in a particular way and take a special 2-d slice through the 6-d parameter space described by the three 2-d plots above.  To the left is the result of this test (where the axes are called κ_F and κ_V, for the fermions and vector bosons, respectively).  What is special about this plot is that there is an overlay of the original ATLAS result (filled contours) and those reproduced from the official ATLAS likelihood functions.  While my student Sven Kreiss made the comparison as part of a test, anyone can now reproduce this plot from the official ATLAS likelihood functions.  More importantly, the same procedure that was used to make this plot can be used to test other specific theories — and there are a lot of alternative ways to reinterpret these Higgs results.
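
For the curious, the generic recipe behind a contour plot like this can be sketched in a few lines. This is purely an illustration with an assumed uncorrelated Gaussian likelihood and made-up best-fit values and widths, nothing like the real ATLAS likelihood, but the procedure is the same: evaluate −2 ln L on a grid in (κ_V, κ_F) and keep the points below the threshold that defines the 68% contour for two parameters:

```python
def nll(kV, kF, best=(1.05, 0.95), widths=(0.10, 0.20)):
    """-2 ln L for an assumed uncorrelated Gaussian likelihood (toy values)."""
    return ((kV - best[0]) / widths[0]) ** 2 + ((kF - best[1]) / widths[1]) ** 2

nll_min = nll(1.05, 0.95)  # the global minimum of -2 ln L

# For 2 parameters, the 68% confidence contour is where
# -2 ln(L / L_max) crosses 2.30.
inside_68 = [(kV / 100, kF / 100)
             for kV in range(80, 131)
             for kF in range(50, 151)
             if nll(kV / 100, kF / 100) - nll_min < 2.30]

# Plotting the points in inside_68 (or contouring the grid) traces out
# the familiar ellipse around the best-fit point.
print((1.05, 0.95) in inside_68)  # the best-fit point is inside: True
```

With the published likelihoods, one simply swaps the toy `nll` for the official one; everything else is bookkeeping.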

Great! So where can you find these likelihood functions, and what does this have to do with open access?  I think this part is very exciting.  CERN is now famous for being the birthplace of the World Wide Web and for having a forward-looking vision for open access to our published papers.  The sheer volume and complexity of the LHC data make the conversation about open access to the raw data quite complicated.  However, having access to our published results is much less controversial.  While it is not done consistently, there are several examples of experiments putting the information that goes into tables and figures on HepData (a repository for particle physics measurements).  Recently, our literature system INSPIRE started to integrate with HepData so that the data are directly associated with the original publication (here is an example).  What is important is that this data is discoverable and citable.  If someone uses this data, we want to know exactly what is being used, and the collaborations that produced the data deserve some credit.  INSPIRE is now issuing a Digital Object Identifier (DOI) for this data, which is a persistent and trackable link to the data.

So now for the fun part, you can go over to the INSPIRE record for the recent Higgs paper (http://inspirehep.net/record/1241574) and you will see this:

The INSPIRE record for the recent ATLAS Higgs paper.

If you click on the HepData tab at the top, it will take you to a list of data associated with this paper.  Each of the three entries has a DOI associated with it (and lists all the ATLAS authors).  For example, the H→γγ result’s DOI is 10.7484/INSPIREHEP.DATA.A78C.HK44, and this is what should be cited for any result that uses this likelihood.  (Note: to get to the actual data, you click on the Files tab.)  INSPIRE is now working so that your author profile will include not only all of your papers but also the data sets you are associated with (and you can also see the data associated with your ORCID ID).
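
For anyone wondering what citing such a data DOI might look like in practice, here is a sketch of a BibTeX entry built around the DOI above. The entry key, title wording, and choice of fields are my own guesses; check INSPIRE for the collaboration’s preferred citation format:

```bibtex
@misc{ATLAS:HggLikelihood2013,
  author = {{ATLAS Collaboration}},
  title  = {{H to gamma gamma likelihood function (HepData record)}},
  year   = {2013},
  doi    = {10.7484/INSPIREHEP.DATA.A78C.HK44},
  note   = {Dataset associated with the ATLAS Higgs couplings paper}
}
```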

The INSPIRE record for the H→γγ likelihood function.

Now it’s time for me and my co-authors to update our paper “On the presentation of LHC Higgs results” to cite this data.  And next week, Salvatore Mele, head of Open Access at CERN, will give a keynote presentation to the DataCite conference entitled “A short history of the Higgs Boson. From a tenth of a billionth of a second after the Big Bang, through the discovery at CERN, to a DataCite DOI”.

I truly hope that this becomes standard practice for the LHC.  It is a real milestone for the information architecture associated to the field of high energy physics and a step forward in the global analysis of the Higgs boson discovered at the LHC!

Update (Sept. 17): The new version of our paper is out that has citations to the likelihoods.

Update (Sept. 18): The data record now has a citation tab as well, so you can distinguish citations to the data and citations to the paper.


Prioritizing the future

Ken Bloom
Monday, September 9th, 2013

As I’ve discussed a number of times, the United States particle physics community has spent the last nine months trying to understand what the exciting research and discovery opportunities are for the next ten to twenty years, and what sort of facilities might be required to exploit them. But what comes next? How do we decide which of these avenues of research are the most attractive and, perhaps most importantly, achievable, given that we work within finite budgets, need the right enabling technologies to be available at the right times, and must plan in partnership with researchers around the world?

In the United States, this is the job of the Particle Physics Project Prioritization Panel, or P5. What is this big mouthful? First, it is a sub-panel of the High Energy Physics Advisory Panel, or HEPAP. HEPAP is the official body that can advise the Department of Energy and the National Science Foundation (the primary funders of particle physics in the US, and also the sponsors of the US LHC blog) on programmatic direction of the field in the US. As an official Federal Advisory Committee, HEPAP operates in full public view, but it is allowed to appoint sub-panels that are under the control of and report to HEPAP but have more flexibility to deliberate in private. This particular sub-panel, P5, was first envisioned in a report of a previous HEPAP sub-panel in 2001 that looked at, among other things, the long-term planning process for the field. The original idea was that P5 would meet quite regularly and continually review the long-term roadmap for the field and adjust it according to current conditions and scientific knowledge. However, in reality P5s have been short-lived and have been re-formed every few years. The last P5 report dates from 2008, and obviously a lot has changed since then — in particular, we now know from the LHC that there is a Higgs boson that looks like the one predicted in the standard model, and there have been some important advances in our understanding of neutrino mixing. Thus the time is ripe to take another look at the plan.

And so it is that a new P5 was formed last week, tasked with coming up with a new strategic plan for the field “that can be executed over a 10 year timescale, in the context of a 20-year global vision for the field.” P5 is supposed to take into account the latest developments in the field and use the Snowmass studies as inputs. The sub-panel is to consider what investments are needed to fulfill the scientific goals, what mix of small, medium and large experiments is appropriate, and how international partnerships can fit into the picture. Along the way, they are also being asked to provide a discussion of the scientific questions of the field that is accessible to non-specialists (along the lines of this lovely report from 2004) and articulate the value of particle-physics research to other sciences and society. Oh, and the sub-panel is supposed to have a final report by May 1. No problem at all, right?

Since HEPAP’s recommendations will drive the plan for the field, it is very important that this panel does a good job! Fortunately, there are two good things going for it. First, the membership of the panel looks really great — talented and knowledgeable scientists who are representative of the demographics of the field and include representatives from outside the US. Second, they are being asked to make their recommendations in the context of fairly optimistic budget projections. Let us only hope that these come to pass!

Watch this space for more about the P5 process over the coming eight months.


Where the Future Lies – 30 Years in the Making

James Faulkner
Monday, September 9th, 2013

Last week, the CMS and ATLAS experiments hosted a party for all collaborators of the respective groups to celebrate receiving the 2013 EPS High Energy and Particle Physics Prize. This was an opportunity to celebrate the past 30 years of hard work in the planning and execution of an international effort to discover the next level of high-energy physics. During the gathering, it was pointed out that now is the time to start planning and executing the next generation of particle colliders and high-energy physics searches. This is quite true, given that we must not fall into the mindset of, “If we can discover the Higgs particle, then why build a better detector?” It is very much a reality that the LHC still offers much to be discovered, as we have yet to reach its full potential. But if we wait until we have exhausted the potential of the LHC before planning for the next experiments, it would create a gap in progress for future generations and ourselves.

Another great moment from the evening was when the awards were displayed — symbolically communicating that we all received this award and we can all share in the moment. Individual and personal achievement will always be a highlight in one’s own life, but this moment of gratitude and humility for a job well done through an international collaboration was certainly inspiring.

A physicists’ party could be imagined as a bunch of nerds reciting equations to the beat of techno music, but that’s not quite what happened. Working in an international collaboration usually means we have irregular work hours, with swarms of emails sent from all over the world at any hour of the day (or night). We usually try to maintain a normal workday as well as a life outside of work, but you can still spot at least a few glowing computer screens as work carries on into the dead of night. To have a dedicated party for both collaborations, where we were able to unwind, talk about sports or vacations, and dance, highlights how worthwhile our experiences are. The work is difficult, but how many careers offer the chance to literally travel across the world on a regular basis and work on the largest particle physics experiment ever?


Visiting my high school in Arkansas

Kyle Cranmer
Wednesday, August 21st, 2013

This week I will be going to visit my high school in Arkansas.  It was 20 years ago that the school first opened its doors, and I was part of that charter class.  The Arkansas School for Mathematics, Science & the Arts is a bit unusual: it is “one of only fifteen public, residential high schools in the country specializing in the education of gifted and talented students who have an interest and aptitude for mathematics and science.”  And this was a state-wide school, so it was a lot like leaving for college two years early.

Arkansas is not particularly well known for its educational system — as kids we would joke “thank god for Mississippi” when Arkansas would come in 49th or 50th in some educational ranking.  My brother attended Little Rock’s Central High, which is famous for its history in the civil rights movement and the desegregation of the school system.  I’m happy to see that Arkansas is doing better in the educational rankings, but there is still a long way to go.  For those of you not from the US, I’ve included a map showing this rural state in the southern part of the US.



Kyle Cranmer with Bill Clinton in Arkansas Governor’s office in 1991.

The school has an interesting history: it was created in 1991 by an act of the Arkansas Legislature.  Bill Clinton was Governor of Arkansas at the time, and I happened to get a photo with him that year in his office (wearing my friend’s hideous sweater, since my clothes had all gotten dirty playing at his house).

While the school is more closely modeled after the North Carolina School of Science and Mathematics, one of the other early schools of this type was the Illinois Mathematics and Science Academy.  Here’s a tidbit from Wikipedia:

“Nobel laureate Leon Lederman, director emeritus of nearby Fermi National Accelerator Laboratory in Batavia, Illinois, was among the first to propose the Illinois school in 1982, and together with Governor Jim Thompson led the effort for its creation. Thompson has noted with pride that he chose to build IMSA instead of competing for the ill-fated supercollider project.”

 

This school changed my life.  I learned calculus and calculus-based physics from Dr. Irina Lyublinskaya, a Russian-educated Ph.D. physicist who had left Russia due to religious persecution.  I took organic chemistry in high school with awesome labs where we extracted DNA from plants and ran gel electrophoresis.  I was frustrated by the lack of activities, so I got involved in school politics. But probably the most important aspect of my time there was learning from my friends and taking on all sorts of projects.  I learned some basic electronics from my electronics guru friends Colin and Stephen (who made a TV from a scrap oscilloscope), my friend Thomas made a pretty nice Tesla coil, we used to get in trouble making potato guns, and I almost lost an eye testing a rail gun.  I remember making a binary half adder out of some huge old telephone relay switches, and when I connected the power you could hear the simple computation proceed, knock-knock-knock, until the lights at the end of the big piece of plywood I was using lit up to confirm 1+2=3.  My friend Sean taught me about programming, and my friend Colin taught me about neural networks and fast Fourier transforms.  I spent weeks soldering together an EEG for my science fair project to identify different classes of thought by using brain waves and identifying them by analyzing their characteristic frequency spectrum with a neural network — an idea I got while watching a documentary about Stephen Hawking.  And we were all on-line and exposed to the world wide web in its formative years (93-95).
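
For readers who have never met one, that relay contraption maps onto just a few lines of modern code. This is only an illustrative sketch of the standard logic (sum = XOR, carry = AND), with a half adder chained into a two-bit ripple-carry adder to reproduce the 1+2=3 moment, minus the knocking relays:

```python
def half_adder(a, b):
    """One half adder: sum bit is XOR, carry bit is AND."""
    return a ^ b, a & b

def full_adder(a, b, carry_in):
    """Two half adders chained to also accept a carry-in bit."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2

def add_2bit(x, y):
    """Add two 2-bit numbers bit by bit, the way the relays did."""
    s0, c0 = half_adder(x & 1, y & 1)
    s1, c1 = full_adder((x >> 1) & 1, (y >> 1) & 1, c0)
    return (c1 << 2) | (s1 << 1) | s0

print(add_2bit(1, 2))  # 3
```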

Tomorrow I leave to go visit the school 20 years later.  We will meet with legislators, parents, alumni, students, and supporters.  I look forward to telling the students about the tremendously exciting career I’ve had in particle physics, culminating in the discovery of the Higgs boson.


A fresh look for the standard model

Kyle Cranmer
Monday, August 19th, 2013

(Note: This is an updated version of a post that I originally made on my personal website theoryandpractice.org.)

Recently I’ve been more involved in communication, education, and outreach activities via the “Snowmass” Community Summer Study.  One of the goals we discussed was to get to the point that the public is more aware of the fundamental particles.  Ideally, we’d like something as iconic as the periodic table (which is rotated from Mendeleev’s original).

The periodic table

Our standard graphic for the standard model builds on this tabular format, which is not unreasonable, with the three generations of fermions as the columns and the rows pointing to the up/down pairing of the SU(2) symmetry of the weak force.  It’s a cute graphic, but it has a number of problems for communicating with the public:

  1. the Higgs is absent
  2. the 3-d effect is meaningless and is second only to our notorious use of Comic Sans for painting physicists as being inept in the graphic design department

 

It seems easy enough to add the Higgs to this table, but there seems to be no agreement on where to put it, as you can see from Google’s image search.


From a physicist’s point of view, there are some other problems that actually harm those starting to learn the standard model in detail:

  1. the symmetry for the strong force (the RGB colors of the quarks) is not reflected at all, leading to the idea that there is only one type of up quark
  2. the complications about the left- and right- handed parts of the leptons in the weak interaction
  3. the mixing between the quarks
  4. the rows and columns don’t mean anything for the force carriers, and any sort of group-theoretic structure for the gauge bosons is missing

In June, I went to the Sheffield Documentary Film Festival for the premiere screening of Particle Fever.  It’s a great film that humanizes fundamental particle physics in an emotional, funny, and romantic way.  It also has some great graphics.  One of my favorite graphics was a new way of representing the fundamental particles.  During the after party of the premiere, the director Mark Levinson gave me the back story (which I had forgotten about until he reminded me):

It was actually our brilliant editor, Walter Murch, who had been obsessing about finding an iconic representation for the Standard Model equivalent to the Periodic Table. He wanted something that was accurate, meaningful, elegant and simple. One morning he came into the edit room and told me he had had a “benzene ring” dream – an idea for a circular representation of the SM. I think David [Kaplan] and I may have suggested a couple of small modifications, but essentially it was the “artist” who trumped the physicists in devising what I hope becomes an iconic representation of the fundamental particles of physics!

 

Particle Fever Standard Model Graphic

 

Here’s what I like about it:

  1. it looks complete (which the standard model is, in a certain sense), unlike a table that can keep being appended with rows and columns
  2. it has a fresh, flat design that lends itself well to an iconic image (stickers, t-shirts, etc.)
  3. it’s round, which evokes notions of symmetry
  4. it is minimal, but it still has some basic structure
    1. rings of fermions, vector bosons, scalar (Higgs) boson
    2. quarks/leptons are top/bottom or red/green
    3. families are still there in the clockwise orientation
  5. the Higgs is central (I’m kind of kidding, but the Higgs is a unique, central part of the theory and it has gathered a huge amount of attention to the field)

Of course, the graphic is not perfect.  I’ve thought about variations.  For instance, rearranging the fermions from a clockwise flow to a left/right and top/bottom symmetry for the quark/lepton and weak-force (SU(2) doublet) structure.  One could play with color a bit so that the up/down-type quarks and leptons have a common coloring in some way.  However, all of these changes are open to the same criticism I gave the standard standard model graphic at the top: for instance, they favor the weak interaction over the strong interaction.

After the original post I got a few comments on the graphic.  Some didn’t like the idea that it looked complete, because we know the standard model is not the full story (dark matter, baryogenesis, neutrino masses, etc.).  While it is certainly true that fundamental physics is not complete, the standard model is.  Near the end of this trailer for Particle Fever, you see this standard model graphic dressed up with a Penrose tiling and some supersymmetric friends.  The other complaint was that it suggested that the force carriers only interact with specific particles (g with d,s,b; γ with u,c,t; Z with neutrinos; and W with charged leptons).  I guess so, but that same kind of geometrical/semantic connection was also there in the standard graphic that we use.  Any graphic will be prone to these types of criticisms from the experts, so we must weigh those objections against the gain in communicating a more streamlined message.

In the end I think it would behoove the physics community to popularize a fresh, iconic image for the standard model and use the public’s excitement of the Higgs discovery as impetus to educate the general public about the basics of fundamental particle physics.


Snowmass: in Frontierland

Ken Bloom
Tuesday, August 6th, 2013

What an interesting but exhausting week it has been here at the Snowmass workshop in Minneapolis. I wrote last week about the opening of the workshop. In the following days, we followed what seemed to me like a pretty original schedule for a workshop. Each morning, we bifurcated (or multi-furcated, if that’s a word) into overlapping parallel sessions in which the various working groups were trying to finalize their studies. There were joint sessions between groups, in which, for instance, people studying some physics frontier were interacting with the people studying the facilities or instrumentation needed to realize the physics goals. Every afternoon we gathered for plenary (or semi-plenary) sessions, featuring short talks on the theory and experimental work undergirding some physics topic, followed by a discussion of “tough questions” about the topic that challenged its importance in the grand scheme of things and the value of pursuing an experimental program on it. We would close each day with a panel discussion on broader policy questions, such as what is the proper balance between domestic and off-shore facilities, or how to make the case for long-term science.

It is a lot of work to put together and participate in a program like this, and overall everyone did a great job of giving well-prepared and thoughtful presentations. I should also take this opportunity to thank our hosts at the University of Minnesota for their successful management of a complicated and ever-evolving program that involved 700 physicists, most of whom registered at the last minute. (And special personal thanks to my Minneapolis in-laws, who made my visit easy!)

We’ve now gotten through the closing sessions, in which we heard summary reports from all the “frontier” working groups. I’m still digesting what everyone had to say, but here is one thing I think I know: there is general agreement that the frontiers that we are organizing our science around are not themselves science topics but approaches that can tell us about many different topics in different ways. For instance, I was quite taken with the news that cosmology can help us set bounds on the total mass of the different kinds of neutrinos; this will help us understand the neutrino spectrum with complementary information to that provided by accelerator-based neutrino experiments. Everyone is really looking to the other “frontiers” to see how we can create a program of research that can attack important physics questions in the most comprehensive possible way. And I think that a number of speakers have gone out of their way to point out that discoveries on someone else’s “frontier” may fundamentally change our understanding of the world.

(On a related note, it is also clear that we are all bothered by the tyranny of Venn diagrams. I am hoping to find time to write again about how many times a graphic of three intersecting circles appeared over the course of the week, and what amount of irony was implied each time.)

Since this is the US LHC blog, I should also mention that the LHC came out well in the discussions. It is clear that there is a lot of potential for understanding and discovery at this machine, both when we increase the energy in 2015 and when we (hopefully) run in a high-luminosity mode later on, in which we will attempt to increase the size of the dataset by a factor of ten. We expect to learn a tremendous amount about the newly-discovered, very strange Higgs boson, and hope to discover TeV-scale particles that make it possible for the Higgs to be what it is. From a more practical point of view, it is currently the only high-energy particle collider operating in the world, and it will stay this way for at least a decade. We must do everything we can to exploit the capabilities of this unique facility.

Where do we go from here? The results of the workshop, the handiwork of hundreds of physicists working over the course of a year, will get written up as a report that is meant to inform future deliberations. It is quite clear that we have more projects that have great physics potential, and that we really want to execute, than we have the resources to execute. In some ways, it is a good problem to have. But some hard choices will have to be made, and it won’t be long until we have convened a Particle Physics Project Prioritization Panel that will be charged with making recommendations on how we do this. I’m in no position to guess the outcome, but whatever it turns out to be, I suspect that our entire field is going to have to stand behind it and advocate it if we are to realize any, if not all, of our visions of the frontiers of particle physics.


Snowmass: one big happy family

Ken Bloom
Monday, July 29th, 2013

Let me say this much about the Community Summer Study 2013, also known as “Snowmass on the Mississippi”: it feels like a family reunion. There are about 600 people registered for the meeting, and since in the end we are a small field, I know a lot of them. I’m surrounded by people I grew up with, people I’ve worked with before, people I work with now, and people with whom I’d really like to work someday. I find it a little overwhelming. Besides trying to learn some science, we’re all trying to catch up with each other’s lives and work.

As with any family, we have our differences on some issues. We know that there are diverse views on what the most important issues are and what are the most promising pathways to scientific discovery. But also, as with any family, there is a lot more that unites us than divides us. As Nigel Lockyer, the incoming director of Fermilab, put it, we will probably have little trouble finding consensus on what the important science questions are. Today’s speakers emphasized that we will need to approach these questions with multiple approaches, and there was mutual respect for the work being done in all of the study groups.

The challenge, of course, is how to accommodate all of these approaches within an envelope of finite resources, and how to strike a balance between near-term operations and long-term (if not very long-term) projects. As our speakers from the funding agencies pointed out, we are in a particularly challenging time for this due to national political and fiscal circumstances. Setting priorities will be a difficult job, and one that will only come after the Snowmass study has laid out all the possibilities.

The workshop continues for another eight days, and if you are interested in particle physics and the future of the field, I hope you’ll be keeping an eye on it. The agenda page linked above has a pointer to a live video stream, presentations are also being recorded for future viewing, and various people are tweeting their way along with hashtag #Snowmass or #Snowmass2013. There are a lot of exciting ideas being discussed this week, some of which can have a transformative effect on the field. Stay with us!


Oh what a beautiful day

Adam Davis
Tuesday, July 23rd, 2013

In case you hadn’t heard, the past few days have been big days for B physics, i.e. particle physics involving a b quark. On the 18th and 19th, three results in particular were released: two by LHCb and one by CMS. Specifically, on the 18th LHCb released their analysis of \( B_{(s)}\to\mu\mu\) using the full 3 fb\(^{-1}\) dataset, corresponding to 1 fb\(^{-1}\) of 2011 data at 7 TeV and 2 fb\(^{-1}\) of 2012 data at 8 TeV. CMS also released their result, using 5 fb\(^{-1}\) of 7 TeV and 20 fb\(^{-1}\) of 8 TeV data.

The decay \(B_{(s)}\to\mu\mu\) cannot proceed at tree level and must go through higher-order processes (shown below).

These analyses have huge implications for SUSY. The decay \( B_{(s)}\to\mu\mu\) cannot proceed via tree-level processes, as that would involve flavor-changing neutral currents, which do not occur in the Standard Model (picture to the right). The process must therefore proceed at higher order than tree level: in the language of Feynman diagrams, the decay goes through the loop and penguin diagrams shown below. The corresponding decay rate is then extremely small, about \(3\times10^{-9}\). Any deviation from this extremely small rate could therefore be a sign of New Physics, and many SUSY models are strongly constrained by these branching fractions.

The results reported are:

Experiment | \(\mathcal{B}(B_{s}\to\mu\mu)\) | Significance | \(\mathcal{B}(B\to\mu\mu)\), 95% CL
LHCb | \(2.9^{+1.1}_{-1.0}\times 10^{-9}\) | \(4.0\sigma\) | \(<7.4\times 10^{-10}\)
CMS | \(3.0^{+1.0}_{-0.9}\times 10^{-9}\) | \(4.3\sigma\) | \(<1.1\times 10^{-9}\)
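Since the combined result isn’t out yet, here is a back-of-the-envelope feel for what it might look like: a naive inverse-variance-weighted average of the two numbers in the table. This is purely my own illustrative sketch, not the official combination, which will use the full likelihoods and shared systematics; I simply symmetrize the asymmetric errors.

```python
# Naive illustrative combination of the two Bs -> mu mu measurements quoted
# above (values in units of 1e-9). NOT the official combination: asymmetric
# errors are simply symmetrized, and correlations are ignored.

def combine(measurements):
    """Inverse-variance weighted average of (value, sigma) pairs."""
    weights = [1.0 / sigma**2 for _, sigma in measurements]
    value = sum(w * v for w, (v, _) in zip(weights, measurements)) / sum(weights)
    sigma = (1.0 / sum(weights)) ** 0.5
    return value, sigma

lhcb = (2.9, 0.5 * (1.1 + 1.0))  # 2.9 +1.1 -1.0, symmetrized
cms = (3.0, 0.5 * (1.0 + 0.9))   # 3.0 +1.0 -0.9, symmetrized
value, sigma = combine([lhcb, cms])
print(f"naive combined B(Bs->mumu) = ({value:.2f} +/- {sigma:.2f}) x 10^-9")
```

For these inputs the naive average comes out near \((2.95\pm0.70)\times10^{-9}\), nicely consistent with both experiments and with the Standard Model expectation of a few \(\times10^{-9}\).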
Higher-order diagrams for \(B_{(s)}\to\mu\mu\)

Both experiments saw an excess of events in the \(B_{s}\to\mu\mu\) channel, corresponding to \(4.0\sigma\) for LHCb (updated from \(3.5\sigma\) last year) and \(4.3\sigma\) for CMS. The combined results will, no doubt, be out very soon. Regardless, as tends to happen with Standard Model results, SUSY parameter space has continued to be squeezed. Just to get a feel for what’s happening, I’ve made a cartoon of the new results overlaid onto an older picture from D. Straub to see what the effect of the new results would be. SUSY parameter space is not necessarily looking so huge. The dashed line in the figure represents the old result; anything shaded in was therefore excluded. By adding the largest error on the branching fraction of \(B_s\to\mu\mu\), I get the purple boundary, which moves in quite a bit. Additionally, I overlay the new boundary for \(B\to\mu\mu\) from CMS in orange and from LHCb in green. An interesting observation: if you take the lower error for LHCb, the result almost hugs the SM result. I won’t go into speculation, but it is interesting.
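For the curious, the arithmetic behind those cartoon boundaries is nothing deeper than shifting each central value by its quoted errors. Here is a tiny sketch of that bookkeeping (my own illustration of the arithmetic, not the script used to make the figure), in units of \(10^{-9}\):

```python
# Rough bands for the cartoon overlay (units of 1e-9): each boundary comes
# from shifting the central value down by its lower error and up by its
# upper error. My own illustration, not an official limit calculation.
results = {
    "LHCb": (2.9, 1.1, 1.0),  # central, +err, -err
    "CMS": (3.0, 1.0, 0.9),
}
for name, (central, up, down) in results.items():
    print(f"{name}: band [{central - down:.1f}, {central + up:.1f}] x 10^-9")
```

Adding the largest error to the LHCb central value gives an upper edge of \(4.0\times10^{-9}\), while subtracting the lower error gives \(1.9\times10^{-9}\); those are the kinds of numbers behind the boundaries drawn in the cartoon.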

Cartoon of updated limits on SUSY from \(B\to\mu\mu\) and \(B_s\to\mu\mu\). Orange represents the CMS results and green represents the LHCb results for \(B_s\to\mu\mu\); purple is the shared observed upper limit on \(B\to\mu\mu\). The dashed line is the old limit; everything outside the box on the bottom left is excluded. Updated from D. Straub (http://arxiv.org/pdf/1205.6094v1.pdf).


Additionally, for a bit more perspective, see Ken Bloom’s Quantum Diaries post.

As for the third result, stay tuned and I’ll write about that this weekend!
