Open days

Particle physics laboratories around the world regularly host open days, when they throw their doors wide and invite the public in to learn about ongoing projects and experiments. This month, Nikhef in the Netherlands hosted hundreds of visitors as part of the Amsterdam Science Park Open Day, offering hands-on activities, talks and a glimpse into how a large science lab works. If you missed Nikhef’s Open Day, there’s sure to be another near you soon!
Many people have a sore throat this week at CERN. Not too surprising given the 70,000 inquisitive visitors we welcomed over the weekend! It was amazing to see so much interest from the public and the enthusiasm of the 2,300 volunteers.
Last Saturday was the Open Day of our institute, which takes place every two years. That is an excellent opportunity to show what we are doing to the public. This year was particularly successful, with a record attendance.
On Saturday, 4 October, Nikhef – the Dutch National Institute for Subatomic Physics, where I spend my days (and much effort) – opened its doors, labs and facilities to the public. In addition to Nikhef, all the other institutes located in the so-called “Science Park” – the scientific district in the eastern part of Amsterdam – welcomed people all day long.
It’s the second “Open Day” I’ve attended, as both a guest and a guide. Together with my fellow theoreticians, I answered people’s questions and curiosities from the “Big Bang Theory Corner” of the main hall. Each department at Nikhef arranged its own stand and activities, and there were plenty of things to marvel at for the entire day.
The research institutes in Science Park (and outside it) offer a good overview of the concept of research, looking for what is beyond the current status of knowledge. “Verder kijken”, or looking further, is the motto of Vrije Universiteit Amsterdam, my Dutch alma mater.
I deeply like this attitude of research, the willingness to investigate what’s around the corner. As they like to define themselves, Dutch people are “future oriented”: this is manifest in several things, from the way they read the clock (“half past seven” becomes “half before eight” in Dutch) to some peculiarities of the city itself, like the presence of so many cultural and research institutes.
This abundance of institutes, museums, exhibitions, public libraries, music festivals, art spaces and independent cinemas makes this city feel like a cultural place to me. People interact with culture in its many manifestations and are connected to it in a more dynamic way than if they were surrounded only by historical and artistic heritage.
Back to the Open Day and Nikhef, I was pleased to see lots of people, families with kids running here and there, checking out delicate instruments with their curious hands, and groups of guys and girls (also someone who looked like he had come straight from a skate-park) stopping by and looking around as if it were their own courtyard.
The following pictures give some examples of the ongoing activities:
We had a model of the ATLAS detector built with Legos: amazing!
And not only toy models: we also had real detectors, like a cloud chamber that allowed visitors to see the tracks of particles passing by!
The majority of people here (not me) are blond and/or tall – but no one is tall enough to see cosmic rays with the naked eye… So, please ask the experts!
I think I can summarize the huge impact and the benefit of such a cool day with the words of one man who stopped by one of the experimental setups. He listened to the careful (if a bit fuzzy) explanation provided by one of the students and said, “Thanks. Now I feel it mine too.”
Many more photos are available here: enjoy!
It’s been a little while since I’ve posted anything, but I wanted to write a bit about some of the testbeam efforts at CERN right now. In the middle of July this year, the Proton Synchrotron, or PS, the second ring in the chain of accelerators used to get protons up to speed for collisions in the LHC, saw its first beam since the shutdown at the end of Run I of the LHC. In addition to providing beam to experiments like CLOUD, the beam can also be used to create secondary particles of up to 15 GeV/c momentum, which are then used for studies of future detector technology. Such a beam is called a testbeam, and all I can say is WOOT, BEAM! I must say that being able to take accelerator data is amazing!
The next big milestone is the testbeam from the SPS, which started on the 6th of October. This is the last ring before the LHC. If you’re unfamiliar with the process used to get protons up to LHC energies, a great video can be found at the bottom of the page.
Just to be clear, test beams aren’t limited to CERN. Keep your eyes out for a post by my friend Rebecca Carney in the near future.
I was lucky enough to be part of the test beam effort of LHCb, which was testing both new technology for the VELO and for the upgrade of the TT station, called the Upstream Tracker, or UT. I worked mainly with the UT group, testing a sensor technology which will be used in the 2019 upgraded detector. I won’t go too much into the technology of the upgrade right now, but if you are interested in the nitty-gritty of it all, I will instead point you to the Technical Design Report itself.
I just wanted to take a bit to talk about my experience with the test beam in July, starting with walking into the experimental area itself. The first sight you see upon entering the building is a picture reminding you that you are entering a radiation zone.
Then, as you enter, you see a large wall of radioactive concrete.
This is where the beam is dumped. Following along here, you get to the control room, which is where all the data taking stuff is set up outside the experimental area itself. Lots of people are always working in the control room, focused and making sure to take as much data as possible. I didn’t take their picture since they were working so hard.
Then there’s the experimental area itself.
There are actually 4 setups here, but I think only three were being used at this time (click on the picture for a larger view). We occupied the area where the guy with the hardhat is.
Now the idea behind a tracker testbeam is pretty straightforward. A charged particle flies by, and a set of very sensitive detector planes records where it passed. Together, these planes form what’s called a “telescope.” The setup is completed by adding the detector to be tested, either in the middle of the telescope or at one end.
From the timing information and the signals from these detectors, the trajectory of the particle can be determined. You then compare the position your telescope gives you to the position recorded in the detector under test, and voila, you have a way to understand the resolution and abilities of your tested detector. After that, the game is statistics. Ideally, your test detector sits in the middle of the telescope, so that you have information on where the charged particle passed on either side of it; this gives the best resolution, but the setup can also work with the detector at one end.
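As a toy illustration of the idea (this is not the actual LHCb analysis code; the plane positions and hit coordinates below are invented), you can fit a straight line through the telescope hits and compare the interpolated track position with the hit recorded in the device under test (DUT):

```python
import numpy as np

# Hypothetical telescope: z positions (mm) of six sensitive planes
# and the x positions (mm) they recorded for one charged-particle track.
z_planes = np.array([0.0, 20.0, 40.0, 80.0, 100.0, 120.0])
x_hits   = np.array([1.02, 1.11, 1.19, 1.42, 1.49, 1.61])

# Least-squares straight-line fit: x(z) = slope * z + intercept.
slope, intercept = np.polyfit(z_planes, x_hits, 1)

# Interpolate the fitted track to the device under test, placed
# in the middle of the telescope.
z_dut = 60.0
x_track = slope * z_dut + intercept

# The residual between the telescope's prediction and the DUT's own
# measurement is what characterises the DUT's resolution (after the
# telescope's pointing resolution is subtracted off).
x_dut_hit = 1.33
residual = x_dut_hit - x_track
print(f"track x at DUT: {x_track:.3f} mm, residual: {residual:.3f} mm")
```

Collect residuals like this over many tracks, and the width of their distribution tells you the resolution of the tested sensor, which is why statistics becomes the name of the game.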
This is the setup which we have been using for the testbeam at the PS. We’ll be using a similar setup for the testbeam at the SPS next week! I’ll try to write a follow up post on that when we finish!
And finally, here is the promised video.
This article appeared in symmetry on Oct. 15, 2014.
“What happens to a quark deferred?” the poet Langston Hughes may have asked, had he been a physicist. If scientists lost interest in a particle after its discovery, much of what it could show us about the universe would remain hidden. A niche community of scientists therefore stays dedicated to intimately understanding its properties.
Case in point: Top 2014, an annual workshop on top quark physics, recently convened in Cannes, France, to address the latest questions and scientific results surrounding the heavyweight particle discovered in 1995 (early top quark event pictured above).
Top and Higgs: a dynamic duo?
A major question addressed at the workshop, held from September 29 to October 3, was whether top quarks have a special connection with Higgs bosons. The two particles, weighing in at about 173 and 125 billion electronvolts, respectively, dwarf other fundamental particles (the bottom quark, for example, has a mass of about 4 billion electronvolts and a whole proton sits at just below 1 billion electronvolts).
Prevailing theory dictates that particles gain mass through interactions with the Higgs field, so why do top quarks interact so much more with the Higgs than do any other known particles?
Direct measurements of top-Higgs interactions depend on recording collisions that produce the two side-by-side. This hasn’t happened yet at high enough rates to be seen; these events theoretically require higher energies than the Tevatron or even the LHC’s initial run could supply. But scientists are hopeful for results from the next run at the LHC.
“We are already seeing a few tantalizing hints,” says Martijn Mulders, staff scientist at CERN. “After a year of data-taking at the higher energy, we expect to see a clear signal.” No one knows for sure until it happens, though, so Mulders and the rest of the top quark community are waiting anxiously.
A sensitive probe to new physics
Top and antitop quark production at colliders, measured very precisely, has started to reveal some deviations from expected values. But in the last year, theorists have responded by calculating an unprecedented layer of mathematical corrections, which refine the expectations and promise to realign the slightly rogue numbers.
Precision is an important, ongoing effort. If researchers aren’t able to reconcile such deviations, the logical conclusion is that the difference represents something they don’t know about — new particles, new interactions, new physics beyond the Standard Model.
The challenge of extremely precise measurements can also drive the formation of new research alliances. Earlier this year, the first Fermilab-CERN joint announcement of collaborative results set a world standard for the mass of the top quark.
Such accuracy hones methods applied to other questions in physics, too, the same way that research on W bosons, discovered in 1983, led to the methods Mulders began using to measure the top quark mass in 2005. In fact, top quark production is now so well controlled that it has become a tool itself to study detectors.
With the upcoming restart in 2015, the LHC will produce millions of top quarks, giving researchers troves of data to further physics. But scientists will still need to factor in the background noise and data-skewing inherent in the instruments themselves, called systematic uncertainty.
“The CDF and DZero experiments at the Tevatron are mature,” says Andreas Jung, senior postdoc at Fermilab. “It’s shut down, so the understanding of the detectors is very good, and thus the control of systematic uncertainties is also very good.”
Jung has been combing through the old data with his colleagues and publishing new results, even though the Tevatron hasn’t collided particles since 2011. The two labs combined their respective strengths to produce their joint results, but scientists still have much to learn about the top quark, and a new arsenal of tools to accomplish it.
“DZero published a paper in Nature in 2004 about the measurement of the top quark mass that was based on 22 events,” Mulders says. “And now we are working with millions of events. It’s incredible to see how things have evolved over the years.”
Management done properly satisfies Sir Karl Popper’s (1902 – 1994) demarcation criteria for science, i.e. using models that make falsifiable or at least testable predictions. That was brought home to me by a book by Douglas Hubbard on risk management where he advocated observationally constrained (falsifiable or testable) models for risk analysis evaluated through Monte Carlo calculations. Hmm, observationally constrained models and Monte Carlo calculations, sounds like a recipe for science.
Let us take a step back. The essence of science is modeling how the universe works and checking the assumptions of the model and its predictions against observations. The predictions must be testable. According to Hubbard, the essence of risk management is modeling processes and checking the assumptions of the model and its predictions against observations. The predictions must be testable. What we are seeing here is a common paradigm for knowledge in which modeling and testing against observation play a key role.
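As a minimal sketch of the kind of Monte Carlo risk model Hubbard advocates (the 10% event probability and the lognormal cost parameters below are invented for illustration, not taken from his book), each uncertain input is drawn from an observationally constrained distribution and the risk is read off the simulated outcomes:

```python
import random

random.seed(1)

N = 100_000  # number of Monte Carlo trials
losses = []
for _ in range(N):
    # Does the adverse event occur this year? (assumed 10% probability)
    if random.random() < 0.10:
        # If it occurs, the cost is itself uncertain: drawn here from a
        # lognormal with a median around 100k (units and values invented).
        cost = random.lognormvariate(mu=11.5, sigma=0.5)
    else:
        cost = 0.0
    losses.append(cost)

expected_loss = sum(losses) / N
prob_big_loss = sum(l > 200_000 for l in losses) / N
print(f"expected annual loss: {expected_loss:,.0f}")
print(f"P(loss > 200k): {prob_big_loss:.3%}")
```

The key point is that the model's outputs are testable predictions: the simulated loss distribution can later be checked against the actual loss record, and the input assumptions revised accordingly, which is exactly the scientific loop described above.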
The knowledge paradigm is the same in project management. A project plan, with its resource loaded schedules and other paraphernalia, is a model for how the project is expected to proceed. To monitor a project you check the plan (model) against actuals (a fancy euphemism for observations, where observations may or may not correspond to reality). Again, it reduces back to observationally constrained models and testable predictions.
The foundations of science and good management practices are tied even closer together. Consider the PDCA cycle for process management that is present, either implicitly or explicitly, in essentially all the ISO standards related to management. It was originated by Walter Shewhart (1891 – 1967), an American physicist, engineer and statistician, and popularized by Edwards Deming (1900 – 1993), an American engineer, statistician, professor, author, lecturer and management consultant. Engineers are into everything. The actual idea of the cycle is based on the ideas of Francis Bacon (1561 – 1626) but could equally well be based on the work of Roger Bacon (1214 – 1294). Hence, it should probably be called the Double Bacon Cycle (no, that sounds too much like a breakfast food).
But what is this cycle? For science, it is: plan an experiment to test a model, do the experiment, check the model results against the observed results, and act to change the model in response to the new information from the check stage or devise more precise tests if the predictions and observations agree. For process management replace experiment with production process. As a result, you have a model for how the production process should work and doing the process allows you to test the model. The check stage is where you see if the process performed as expected and the act stage allows you to improve the process if the model and actuals do not agree. The key point is the check step. It is necessary if you are to improve the process; otherwise you do not know what is going wrong or, indeed, even if something is going wrong. It is only possible if the plan makes predictions that are falsifiable or at least testable. Popper would be pleased.
There is another interesting aspect of the ISO 9001 standard. It is based on the idea of processes. A process is defined as an activity that converts inputs into outputs. Well, that sounds rather vague, but the vagueness is an asset, kind of like degrees of freedom in an effective field theory. Define them as you like, but if you choose them incorrectly you will be sorry. The real advantage of effective field theory and the flexible definition of process is that you can study a system at any scale you like. In effective field theory, you study processes that operate at the scale of the atom, the scale of the nucleus or the scale of the nucleon and tie them together with a few parameters. Similarly with processes: you can study the whole organization as a process or drill down and look at sub-processes at any scale you like; for CERN or TRIUMF that would be down to the last magnet. It would not be useful to go further and study accelerator operations at the nucleon scale. At a given scale, different processes are tied together by their inputs and outputs, and these also tie together processes at different scales.
As a theoretical physicist who has gone over to the dark side and into administration, I find it amusing to see the techniques and approaches from science being borrowed for use in administration, even Monte Carlo calculations. The use of similar techniques in science and administration goes back to the same underlying idea: all true knowledge is obtained through observation and its use to build better testable models, whether in science or other walks of life.
 The Failure of Risk Management: Why It’s Broken and How to Fix It by Douglas W. Hubbard (Apr 27, 2009)
 Roger Bacon described a repeating cycle of observation, hypothesis, and experimentation.
Dark matter – it’s essential to our universe, it’s mysterious and it brings to mind cool things like space, stars, and galaxies. I have been fascinated by it since I was a child, and I feel very lucky to be a part of the search for it. But that’s not actually what I’m going to be talking about today.
I am a graduate student just starting my second year in the High Energy Physics group at UCL, London. Ironically, as a dark matter physicist working in the LUX (Large Underground Xenon detector) and LZ (LUX-ZEPLIN) collaborations, I’m actually dealing with very low energy physics.
When people ask what I do, I find myself saying different things, to differing responses:
- “I’m doing a PhD in physics” – reaction: person slowly backs away
- “I’m doing a PhD in particle physics” – reaction: some interest, mention of the LHC, person mildly impressed
- “I’m doing a PhD in astro-particle physics” – reaction: mild confusion but still interested, probably still mention the Large Hadron Collider
- “I’m looking for dark matter!” – reaction: awe, excitement, lots of questions
This obviously isn’t true in all cases, but it has been the general pattern. Admittedly, I enjoy that people are impressed, but sometimes I struggle to find a way to explain to people outside physics what I actually do day to day. Often I just say, “it’s a lot of computer programming; I analyse data from a detector to help towards finding a dark matter signal”, but that still induces a panicked look in a lot of people.
Nevertheless, last week I came across a group of people who didn’t ask anything about what I actually do, and I found myself going right back to basics in terms of the physics I think about daily. Term has just started, and that means one thing: undergraduates. The frequent noise they make as they stampede past my office going the wrong way to labs makes me wonder if the main reason for sending them away for so long is to give the researchers the chance to do their work in peace.
Nonetheless, somehow I found myself in the undergraduate lab on Friday. I had to ask myself why on earth I had chosen to demonstrate – I am, almost by definition, terrible in a lab. I am clumsy and awkward, and even the most simple equipment feels unwieldy in my hands. During my own undergrad, my overall practical mark always brought my average mark down for the year. My masters project was, thank god, entirely computational. But thanks to a moment of madness (and the prospect of earning a little cash, as London living on a PhD stipend is hard), I have signed up to be a lab demonstrator for the new first year physicists.
Things started off awkwardly as I was told to brief them on the experiment and realised I didn’t have a great deal to say. I got more into the swing of things as time went by, but I still felt like I’d been thrown in at the deep end. I told the students I was a second year PhD student; one of them got the wrong end of the stick and asked if I knew a student who was a second year undergrad here. I told him I was postgraduate and he looked quite embarrassed, whilst I couldn’t help but laugh at the thought of the chaos that would ensue if a second year demonstrated the first year labs.
None of them asked what my PhD was in. They weren’t interested – somehow I had become a faceless authority who told them what to do and had no other purpose. I am not surprised – they are brand new to university, and more importantly, they were pretty distracted by the new experience of the laboratory. That’s not to say they particularly enjoyed it; they seemed to have very little enthusiasm for the experiment. It was a very simple task: measuring the speed of sound in air using a frequency generator, an oscilloscope and a ruler. For someone now accustomed to dealing with data from a high-tech dark matter detector, it was bizarre! I do find that the more advanced physics I learn, the worse I become at the basics, and I had to step aside for a moment with a pen and paper to reconcile the theory in my head – it was embarrassing, to say the least!
Their frustration at the task was evident – there were frequent complaints over the length of time they had been writing, over the experimental ‘aims’ and ‘objectives’, over the fact they needed to introduce their diagrams before drawing them, etc. Eyes were rolling at me. I was going to have to really try to drill it in that this was indeed an important exercise. The panic I could sense from them was a horrible reminder of how I used to feel in my own labs. It’s hard to understand at that point that this isn’t just some form of torture; you are actually learning some very valuable and transferable skills about how to conduct a real experiment. Some examples:
- Learn to write EVERYTHING down, you might end up in court over something and some tiny detail might save you.
- Get your errors right. You cannot claim a discovery without an uncertainty; that’s just physics. It’s difficult to grasp, but you can never fully prove a hypothesis, only provide solid evidence for it.
- Understand the health and safety risks – they seem pointless and stupid when the only real risk seems to be tripping over your bags, but speaking as someone who has worked down a mine with pressurised gases, high voltages and radioactive sources, they are extremely important and may be the difference between life and death.
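To tie the errors point back to the speed-of-sound exercise: with v = f·λ, the uncertainties on the measured frequency and wavelength propagate into the final result (the readings below are plausible invented numbers, not the students' actual data):

```python
from math import sqrt

# Invented readings: resonance frequency from the generator, and the
# wavelength from ruler measurements of the standing-wave pattern.
f, df = 2000.0, 10.0      # frequency in Hz, with its uncertainty
lam, dlam = 0.171, 0.002  # wavelength in metres, with its uncertainty

v = f * lam  # speed of sound: v = f * lambda

# Standard error propagation for a product:
# relative uncertainties add in quadrature.
dv = v * sqrt((df / f) ** 2 + (dlam / lam) ** 2)
print(f"v = {v:.1f} +/- {dv:.1f} m/s")
```

A result quoted with its uncertainty, like this one, is exactly what lets you say whether your measurement agrees with the accepted value of about 343 m/s at room temperature, which is the whole point of the exercise.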
In the end, I think my group did well. They got the right number for the speed of sound and their lab books weren’t a complete disaster. A few actually thanked me on their way out.
It was a bit of a relief to get back to my laptop where I actually feel like I know what I am doing, but the experience was a stark reminder of where I was 5 years ago and how much I have learned. Choosing physics for university means you will have to struggle to understand things, work hard and exhaust yourself, but in all honesty it was completely worth it, at least for me. Measuring the speed of sound in air is just the beginning. One day, some of those students might be measuring the quarks inside a proton, or a distant black hole, or the quantum mechanical properties of a semiconductor.
I’m back in the labs this afternoon, and I am actually quite looking forward to seeing how they cope this week, when we study that essential pillar of physics, conservation of momentum. I just hope they don’t start throwing steel ball-bearings at each other. Wish me luck.
In a short while, starting at 11:00 CEST / 10:00 BST, ATLAS will announce some new Higgs results:
“New Higgs physics results from the ATLAS experiment using the full Run-1 LHC dataset, corresponding to an integrated luminosity of approximately 25 fb-1, of proton-proton collisions at 7 TeV and 8 TeV, will be presented.” [seminar link]
I don’t expect anything earth-shattering, because ATLAS already has preliminary analyses for all the major Higgs channels. They have also submitted final publications for LHC Run I on Higgs decaying to two photons, two b quarks, two Z bosons – so it’s reasonable to guess that Higgs decaying to taus or W’s is going to be covered today.
(Parenthetically, CMS has already published final results for all of the major Higgs decays, because we are faster, stronger, smarter, better looking, and more fun at parties.)
I know folks on ATLAS who are working on things that might be shown today, and they promise they have some new tricks, so I’m hoping things will be fairly interesting. But again, nothing earth-shattering.
I’ll update this very page during the seminar. You should also be able to watch it on the Webcast Service.
10:55 I have a front row seat in the CERN Council Chamber, which is smaller than the main auditorium that you might be more familiar with. Looks like it will be very, very full.
11:00 Here we go! (Now’s a good time to click the webcast, if you plan to.)
11:03 Yes, it turns out it will be taus and W’s.
11:06 As an entrée, look how fabulously successful the Standard Model, including the Higgs, has been:
11:10 Good overview right now over overall Higgs production and decay and the framework we used to understand it. Have any questions I can answer during the seminar? Put them in the comments or write something at me on Twitter.
11:18 We’re learning about the already-released results for Higgs to photons and ZZ first.
11:24 Higgs to bb, the channel I worked on for CMS during Run I. These ATLAS results are quite new and have a lot of nice improvements from their preliminary analysis. Very pretty plot of improved Higgs mass resolution when corrections are made for muons produced inside b-jets.
11:30 Now to Higgs to tau tau, a new result!
11:35 Developments since preliminary analysis include detailed validation of techniques for estimating from data how isolated the taus should be from other things in the detector.
11:36 I hope that doesn’t sound too boring, but this stuff’s important. It’s what we do all day, not just counting sigmas.
11:37 4.5 sigma evidence (only 3.5 expected) for the Higgs coupling to the tau lepton!
11:39 Their signal is a bit bigger than the SM predicts, but still very consistent with it. And now on to WW, also new.
11:41 In other news, the Nobel Prize in Physics will be announced in 4 minutes: It’s very unlikely to be for anything in this talk.
11:44 Fixed last comment: “likely” –> “unlikely”. Heh.
11:48 When the W’s decay to a lepton and an invisible neutrino, you can’t measure a “Higgs peak” like we do when it decays to photons or Z’s. So you have to work very carefully to make sure that a misunderstanding of your background (i.e. non-Higgs processes) doesn’t produce something that looks like a Higgs signal.
11:50 Background-subtracted result does show a clear Higgs excess over the SM backgrounds. This will be a pretty strong result.
11:51 6.1 sigma for H –> WW –> lvlv. 3.2 sigma for VBF production mechanism. Very consistent with the SM again.
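(For readers wondering what those sigmas mean: a significance of n sigma corresponds to the one-sided tail probability of a standard Gaussian, which you can compute directly. This is a generic statistics aside, not the experiments' own code.)

```python
from math import erfc, sqrt

def p_value(n_sigma: float) -> float:
    """One-sided Gaussian tail probability for an n-sigma excess."""
    return 0.5 * erfc(n_sigma / sqrt(2.0))

# The significances quoted in the seminar:
for s in (3.2, 4.5, 6.1):
    print(f"{s} sigma -> p = {p_value(s):.1e}")
```

So a 6.1 sigma excess corresponds to well under a one-in-a-billion chance of the background alone fluctuating that high, which is why it comfortably clears the 5 sigma discovery convention.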
11:52 Lots of very nice, detailed work here. But the universe has no surprises for us today.
11:54 We can still look forward to the final ATLAS combination of all Higgs channels, but we know it’s going to look an awful lot like the Standard Model. Congratulations to my ATLAS colleagues on their hard work.
11:56 By the way, you can read the slides on the seminar link.
12:02 The most significant result here might actually be the single-channel observation of the Vector Boson Fusion production mechanism. The Higgs boson really is behaving the way the Standard Model says it should! Signing off here – time for lunch!
This Fermilab press release came out on Oct. 6, 2014.
It’s the most powerful accelerator-based neutrino experiment ever built in the United States, and the longest-distance one in the world. It’s called NOvA, and after nearly five years of construction, scientists are now using the two massive detectors – placed 500 miles apart – to study one of nature’s most elusive subatomic particles.
Scientists believe that a better understanding of neutrinos, one of the most abundant and difficult-to-study particles, may lead to a clearer picture of the origins of matter and the inner workings of the universe. Using the world’s most powerful beam of neutrinos, generated at the U.S. Department of Energy’s Fermi National Accelerator Laboratory near Chicago, the NOvA experiment can precisely record the telltale traces of those rare instances when one of these ghostly particles interacts with matter.
Construction on NOvA’s two massive neutrino detectors began in 2009. In September, the Department of Energy officially proclaimed construction of the experiment completed, on schedule and under budget.
“Congratulations to the NOvA collaboration for successfully completing the construction phase of this important and exciting experiment,” said James Siegrist, DOE associate director of science for high energy physics. “With every neutrino interaction recorded, we learn more about these particles and their role in shaping our universe.”
NOvA’s particle detectors were both constructed in the path of the neutrino beam sent from Fermilab in Batavia, Illinois, to northern Minnesota. The 300-ton near detector, installed underground at the laboratory, observes the neutrinos as they embark on their near-light-speed journey through the Earth, with no tunnel needed. The 14,000-ton far detector — constructed in Ash River, Minnesota, near the Canadian border – spots those neutrinos after their 500-mile trip and allows scientists to analyze how they change over that long distance.
For the next six years, Fermilab will send tens of thousands of billions of neutrinos every second in a beam aimed at both detectors, and scientists expect to catch only a few each day in the far detector, so rarely do neutrinos interact with matter.
From this data, scientists hope to learn more about how and why neutrinos change between one type and another. The three types, called flavors, are the muon, electron and tau neutrino. Over longer distances, neutrinos can flip between these flavors. NOvA is specifically designed to study muon neutrinos changing into electron neutrinos. Unraveling this mystery may help scientists understand why the universe is composed of matter and why that matter was not annihilated by antimatter after the big bang.
Scientists will also probe the still-unknown masses of the three types of neutrinos in an attempt to determine which is the heaviest.
“Neutrino research is one of the cornerstones of Fermilab’s future and an important part of the worldwide particle physics program,” said Fermilab Director Nigel Lockyer. “We’re proud of the NOvA team for completing the construction of this world-class experiment, and we’re looking forward to seeing the first results in 2015.”
The far detector in Minnesota is believed to be the largest free-standing plastic structure in the world, at 200 feet long, 50 feet high and 50 feet wide. Both detectors are constructed from PVC and filled with a scintillating liquid that gives off light when a neutrino interacts with it. Fiber optic cables transmit that light to a data acquisition system, which creates 3-D pictures of those interactions for scientists to analyze.
The NOvA far detector in Ash River saw its first long-distance neutrinos in November 2013. The far detector is operated by the University of Minnesota under an agreement with Fermilab, and students at the university were employed to manufacture the component parts of both detectors.
“Building the NOvA detectors was a wide-ranging effort that involved hundreds of people in several countries,” said Gary Feldman, co-spokesperson of the NOvA experiment. “To see the construction completed and the operations phase beginning is a victory for all of us and a testament to the hard work of the entire collaboration.”
The NOvA collaboration comprises 208 scientists from 38 institutions in the United States, Brazil, the Czech Republic, Greece, India, Russia and the United Kingdom. The experiment receives funding from the U.S. Department of Energy, the National Science Foundation and other funding agencies.
For more information, visit the experiment’s website: http://www-nova.fnal.gov.
Note: NOvA stands for NuMI Off-Axis Electron Neutrino Appearance. NuMI is itself an acronym, standing for Neutrinos from the Main Injector, Fermilab’s flagship accelerator.
Fermilab is America’s premier national laboratory for particle physics and accelerator research. A U.S. Department of Energy Office of Science laboratory, Fermilab is located near Chicago, Illinois, and operated under contract by the Fermi Research Alliance, LLC. Visit Fermilab’s website at www.fnal.gov and follow us on Twitter at @FermilabToday.
The DOE Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.
While the LHC experiments are surely turning their attention towards the 2015 run of the collider, at an energy nearly double that of the previous run, we’re also busy trying to finalize and publish measurements using the data that we already have in the can. Some measurements just take longer than others, and some it took us a while to get to. And while I don’t like tooting my own horn too much here at the US LHC blog, I wanted to discuss a new result from CMS that I have been working on with a student, Dan Knowlton, here at the University of Nebraska-Lincoln, along with collaborators from a number of other institutions. It’s been in the works for so long that I’m thrilled to get it out to the public!
(This is one of many CMS results that were shown for the first time last week at the TOP 2014 conference. If you look through the conference presentations, you’ll find that the top quark, which has been around for about twenty years now, has continued to be a very interesting topic of study, with implications for searches for new physics and even for the fate of the universe. One result that’s particularly interesting is a new average of CMS top-quark mass measurements, which is now the most accurate measurement of that quantity in the world.)
The LHC experiments have studied the Higgs boson through many different decay modes and many different production mechanisms. Here is a plot of the expected cross sections for different Higgs production mechanisms as a function of Higgs mass; of course we know now that the Higgs has a mass of 125 GeV:
The most common production mechanism has a Higgs being produced with nothing else, but it can also be produced in association with other particles. In our new result, we search for a Higgs production mechanism that is so much rarer that it doesn’t even appear on the above plot! The mechanism is the production of a Higgs boson in association with a single top quark, and in the standard model, the cross section is expected to be 0.018 pb, about an order of magnitude below the cross section for Higgs production in association with a top-antitop pair. Why even bother to look for such a thing, given how rare it is?
The answer lies in the reason for why this process is so rare. There are actually two ways for this particular final state to be produced. Here are the Feynman diagrams for them:
In one case, the Higgs is radiated off the virtual W, while in the other it comes off the real final-state top quark. Now, this is quantum mechanics: if you have two different ways to connect an initial and final state, you have to add the two amplitudes together before you square them to get a probability for the process. It just so happens that these two amplitudes largely destructively interfere, and thus the production cross section is quite small. There isn’t anything deep at work (e.g. no symmetries that suppress this process), it’s just how it comes out.
At least, that’s how it comes out in the standard model. We assume certain values for the coupling factors of the Higgs to the top and W particles that appear in the diagrams above. Other measurements of Higgs properties certainly suggest that the coupling factors do have the expected values, but there is room within the constraints for deviations. It’s even possible that one of the two coupling values has the exact opposite sign from what we expect. In that case, the destructive interference between the two amplitudes would become constructive, and the cross section would be almost a factor of 13 larger than expected!
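The effect of that sign flip can be seen in a toy calculation. This is not a real matrix-element computation: the two amplitudes below are made-up numbers, chosen so that the destructive-to-constructive ratio comes out near the factor of 13 quoted above.

```python
# Toy illustration of amplitude interference (hypothetical numbers, not
# actual matrix elements). Two contributions to the same final state:
a_W   = 1.0     # amplitude with the Higgs radiated off the virtual W
a_top = -0.566  # amplitude off the top quark; relative sign gives cancellation
                # (magnitude chosen so the ratio below lands near 13)

# Quantum mechanics: add amplitudes first, THEN square to get a rate.
rate_sm      = (a_W + a_top) ** 2  # destructive interference (SM-like)
rate_flipped = (a_W - a_top) ** 2  # one coupling's sign flipped: constructive

print(f"enhancement factor ~ {rate_flipped / rate_sm:.1f}")
```

The point of the toy is the ordering of operations: squaring each amplitude separately and adding would miss the interference entirely; adding first and then squaring is what makes the rate so sensitive to the relative sign of the couplings.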
The new result from CMS is a search for this anomalous production of the Higgs in association with a single top quark. CMS already has a result for a search in which the Higgs decays to a pair of photons; this new result describes a search in which the Higgs decays to bottom quarks. That is a much more common Higgs decay mode, so there ought to be more events to see, but at the same time the backgrounds are much higher. The production of a top-antitop pair along with an extra jet of hadrons that is mis-identified as arising from a bottom quark looks very much like the targeted Higgs production mechanism. The top-antitop cross section is about 1000 times bigger than that of the anomalous production mechanism that we are looking for, and thus even a tiny bottom mis-identification rate leads to a huge number of background events. A lot of the work in the data analysis goes into figuring out how to distinguish the (putative) signal events from the dominant background, and then verifying that the estimates of the background rates are correct.
The analysis is so challenging that we predicted that even by throwing everything we had at it, the best we could expect to do was to exclude the anomalous Higgs production process at a level of about five times the predicted rate for it. When we looked at the data, we found that we could exclude it at about seven times the anomalous rate, roughly in line with what we expected. In short, we do not see an anomalous rate for anomalous Higgs production! But we are able to set a fairly tight limit, at around 1.8 pb.
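As a back-of-envelope consistency check of the numbers quoted above (purely arithmetic, with the enhancement and exclusion factors rounded as in the text):

```python
# Rough consistency check of the cross sections quoted in the post.
sigma_sm = 0.018                 # pb, SM cross section for Higgs + single top
enhancement = 13                 # approximate boost if one coupling's sign flips
sigma_anomalous = sigma_sm * enhancement   # ~0.23 pb

exclusion_factor = 7             # "about seven times the anomalous rate"
limit_pb = exclusion_factor * sigma_anomalous
print(f"limit ~ {limit_pb:.2f} pb")  # ~1.6 pb, consistent with the quoted ~1.8 pb
```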
What do I like about this measurement? First, it’s a very different way to try to measure the properties of the Higgs boson. The measurements we have are very impressive given the amount of data that we have so far, but they are not very constraining, and there is enough wiggle room for some strange stuff to be going on. This is one of the few ways to probe the Higgs couplings through the interference of two processes, rather than just through the rate for one dominant process. All of these Higgs properties measurements are going to be much more accurate in next year’s data run, when we expect to integrate more data and all of the production rates will be larger due to the increase in beam energy. (For this anomalous production process, the cross section will increase by about a factor of four.) In this particular case, we should be able to exclude anomalous Higgs couplings through this measurement…or, if nature surprises us, we will actually observe them! There is a lot of fun ahead for Higgs physics (and top physics) at the LHC.
I’ve also really enjoyed working with my CMS colleagues on this project. Any measurement coming out of the experiment is truly the work of thousands of people who have built and operated the detector, gotten the data recorded and processed, developed and refined the reconstruction algorithms, and defined the baselines for how we identify all kinds of particles that are produced in the proton collisions. But the final stages of any measurement are carried out by smaller groups of people, and in this case we worked with colleagues from the Catholic University of Louvain in Belgium, the Karlsruhe Institute of Technology in Germany, the University of Malaya in Malaysia, and the University of Kansas (in Kansas). We relied on the efforts of a strong group of graduate students with the assistance of harried senior physicists like myself, and the whole team did a great job of supporting each other and stepping up to solve problems as they arose. These team efforts are one of the things that I’m proud of in particle physics, and that make our scientists so successful in the wider world.
With my first post on Quantum Diaries I will not address a technical topic; instead, I would like to talk about the act (or art) of “studying” itself. In particular, why do we care about fundamental research, pure knowledge without any practical purpose or immediate application?
In 1939, A. Flexner wrote a contribution to Harper’s Magazine (issue 179) titled “The Usefulness of Useless Knowledge”. He opens the discussion with an interesting question: “Is it not a curious fact that in a world steeped in irrational hatreds which threaten civilization itself, men and women – old and young – detach themselves wholly or partly from the angry current of daily life to devote themselves to the cultivation of beauty, to the extension of knowledge […] ?”
This question is still with us today, and the need for a satisfactory answer is probably even stronger.
From a pragmatic point of view, we can argue that many important applications and spin-offs of theoretical investigations into the deep structure of Nature did not arise immediately after the scientific discoveries. This is the case, for example, with QED and antimatter: the theories date back to the 1920s and are nowadays exploited in hospitals for imaging purposes (as in PET, positron emission tomography). The most important discoveries affecting our everyday life, from electricity to the energy bound in the atom, came from completely pure and theoretical studies: electricity and magnetism, summarized in Maxwell’s equations, and quantum mechanics are shining examples.
It may seem that it is just a matter of time: “Wait long enough, and something useful will eventually pop out of these abstract studies!” True. But that is not the most important answer. To me, it is this: “Pure research is important because it generates knowledge and education.” It is our own contribution to the understanding of Nature, a short but important step in a marvelous challenge set up by the human mind.
Personally, I find that research into the as-yet-unknown aspects of Nature responds to desires that are partly conscious and partly unconscious. Intellectual achievements provide a genuine ‘spiritual’ satisfaction, peculiar to the art of studying. In all honesty, there are also plenty of dark sides: frustration, stress, the bouts of depression familiar to many graduate students, geographical and economic instability, and so on. But leaving all these troubles aside for a while, I think I am pretty lucky to be doing this job.
During economically difficult times, it is also legitimate to ask, “Why spend a lot of money on expensive experiments like the Large Hadron Collider?” or “Why fund abstract research in labs and universities instead of investing in more socially useful studies?”
We could answer by stressing again that many of the best innovations came from the most abstract studies. But to my mind the ultimate answer lies, once and for all, in the power of research to generate culture, and to spread education through its diffusion. Everything occurs within our possibilities and limitations. A willingness to learn, a passion for teaching, blackboards, books and (super)computers: these are our tools.
Citing Flexner’s paper again: “The mere fact that spiritual and intellectual freedoms bring satisfaction to an individual soul bent upon its own purification and elevation is all the justification that they need. […] A poem, a symphony, a painting, a mathematical truth, a new scientific fact, all bear in themselves all the justification that universities, colleges and institutes of research need or require.”
Last but not least, it is remarkable to think about how many people from different parts of the world may have met and collaborated while questing together after knowledge. This may seem a drop in the ocean, but research contributes daily to building a culture of peace and cooperation among people with different cultural backgrounds. And that is surely one of the most important practical spin-offs.
This article appeared in Fermilab Today on Sept. 30, 2014.
As an eighth grader, Paul Nebres took part in a 2012 field trip to Fermilab. He learned about the laboratory’s exciting scientific experiments, said hello to a few bison and went home inspired.
Now a junior at the Illinois Mathematics and Science Academy (IMSA) in Aurora, Nebres is back at Fermilab, this time actively contributing to its scientific program. He’s been working on the Muon g-2 project since the summer, writing software that will help shape the magnetic field that guides muons around a 150-foot-circumference muon storage ring.
Nebres is one of 13 IMSA students at Fermilab. The high school students are part of the academy’s Student Inquiry and Research program, or SIR. Every Wednesday over the course of a school year, the students use these weekly Inquiry Days to work at the laboratory, putting their skills to work and learning new ones that advance their understanding in the STEM fields.
The program is a win for both the laboratory and the students, who work on DZero, MicroBooNE, MINERvA and electrical engineering projects, in addition to Muon g-2.
“You can throw challenging problems at these students, problems you really want solved, and then they contribute to an important part of the experiment,” said Muon g-2 scientist Brendan Kiburg, who co-mentors a group of four SIR students with scientists Brendan Casey and Tammy Walton. “Students can build on various aspects of the projects over time toward a science result and accumulate quite a nice portfolio.”
This year roughly 250 IMSA students are in the broader SIR program, conducting independent research projects at Argonne National Laboratory, the University of Chicago and other Chicago-area institutions.
IMSA junior Nerione Agrawal, who started in the SIR program this month, uses her background in computing and engineering to simulate the potential materials that will be used to build Muon g-2 detectors.
“I’d been to Fermilab a couple of times before attending IMSA, and when I found out that you could do an SIR at Fermilab, I decided I wanted to do it,” she said. “I’ve really enjoyed it so far. I’ve learned so much in three weeks alone.”
The opportunities for students at the laboratory extend beyond their particular projects.
“We had the summer undergraduate lecture series, so apart from doing background for the experiment, I learned what else is going on around Fermilab, too,” Nebres said. “I didn’t expect the amount of collaboration that goes on around here to be at the level that it is.”
In April, every SIR student will create a poster on his or her project and give a short talk at the annual IMSAloquium.
Kiburg encourages other researchers at the lab to advance their projects while nurturing young talent through SIR.
“This is an opportunity to let a creative person take the reins of a project, steward it to completion or to a point that you could pick up where they leave off and finish it,” he said. “There’s a real deliverable outcome. It’s inspiring.”