Hot Topics

Open days

Particle physics laboratories around the world regularly host open days, when they throw their doors wide and invite the public in to learn about ongoing projects and experiments. This month, Nikhef in the Netherlands hosted hundreds of visitors as part of the Amsterdam Science Park Open Day, offering hands-on activities, talks and a glimpse into how a large science lab works. If you missed Nikhef’s Open Day, there’s sure to be another near you soon!

I feel it mine

By Andrea Signori | October 21, 2014
The Nikhef Open Day had a huge impact, the benefit of which can be summarized with the words of one man who stopped by. He listened to the careful explanation provided by one of the students, and said, “Thanks. Now I feel it mine too.”

Open days and sore throats

By CERN | October 2, 2013
Many people have a sore throat this week at CERN. Not too surprising given the 70,000 inquisitive visitors we welcomed over the weekend! It was amazing to see so much interest from the public and the enthusiasm of the 2,300 volunteers.

The open day

By Frank Simon | October 21, 2009
Last Saturday was the Open Day of our institute, which takes place every two years. That is an excellent opportunity to show what we are doing to the public. This year was particularly successful, with a record attendance.

Latest Posts

Have we detected Dark Matter Axions?

Wednesday, October 22nd, 2014

An interesting headline caught my eye when I was browsing the social networking and news website Reddit the other day. It simply said:

“The first direct detection of dark matter particles may have been achieved.”


Well, that was news to me! Obviously, the key word here is “may”. Nonetheless, I was intrigued, not being aware of any direct detection experiments publishing such results around this time. As a member of LUX, I’m used to collaboration-wide emails being sent out when a big paper is published by a rival group, most recently the DarkSide-50 results. Often an email like this is followed by a chain of comments, both good and bad, from the senior members of our group. I can’t imagine being able to read a paper and instantly have intelligent criticisms to share like those guys do – but maybe when I’ve been in the dark matter business for 20+ years I will!

It is useful to look at other work similar to our own. We can learn from the mistakes and successes of the other groups within our community, and most of the time the rivalry is friendly and professional. So obviously I took a look at this claimed direct detection. Note that there are three methods of dark matter detection, shown in the figure below. To summarise quickly:

The three routes to dark matter detection

  • Direct detection is the observation of an interaction of a dark matter particle with a standard model one.
  • Indirect detection is the observation of annihilation products that have no apparent standard model source and so are assumed to be the products of dark matter annihilation.
  • Production is the measurement of missing energy and momentum in a particle interaction (generally a collider experiment) that could signify the creation of dark matter (one must be very careful with this method, as this is also how neutrinos are measured in collider experiments).

So I was rather surprised to find that the linked article was about a space telescope – the XMM-Newton observatory. These sorts of experiments are usually for indirect detection. The replies on the Reddit link reflected my own doubt – aside from the personification of x-rays, this comment was also my first thought:

“If they detected x-rays who are produced by dark matter axions then it’s not direct detection.”

These x-rays supposedly come from a particle called an axion – a dark matter candidate. But to address the comment, I considered LUX, a direct dark matter detector, where what we are actually detecting is photons. These are produced by the recoil of a xenon nucleus that interacted with a dark matter particle, and yet we call it direct – because the dark matter has interacted with a standard model particle, the xenon. So to determine whether this possible axion detection is direct, we need to understand the effect producing the x-rays. And for that, we need to know about axions.

I haven’t personally studied axions much at all. At the beginning of my PhD, I read a paper called “Expected Sensitivity to Galactic/Solar Axions and Bosonic Super-WIMPs based on the Axio-electric Effect in Liquid Xenon Dark Matter Detectors” – but I couldn’t tell you a single thing from that paper now, without re-reading it. After some research I have a bit more understanding under my belt, and for those of you that are physicists, I can summarise the idea:

  • The axion is a light boson, proposed by Roberto Peccei and Helen Quinn in 1977 to solve the strong CP problem (why does QCD not break CP-symmetry when there is no theoretical reason it shouldn’t?).
  • The introduction of the particle causes the strong CP violation to go to zero (by some fancy maths that I can’t pretend to understand!).
  • It has been considered as a cold dark matter candidate because it is neutral and very weakly interacting, and could have been produced with the right abundance.
Conversion of an axion to a photon within a magnetic field (Yamanaka, Masato et al)


For non-physicists, the key thing to understand is that the axion is a particle predicted by a separate theory (nothing to do with dark matter) that solves another problem in physics. It just so happens that its properties make it a suitable candidate for dark matter. Sounds good so far – the axion kills two birds with one stone. We could detect a dark matter axion via an effect that converts an axion to an x-ray photon within a magnetic field. The XMM-Newton observatory orbits the Earth and looks for x-rays produced by the conversion of axions within the Earth’s magnetic field. Although there is no interaction with an existing standard model particle (one is produced instead), the axion is not annihilating to produce the photons, so I think it is fair to call this direct detection.

What about the actual results? What has actually been detected is a seasonal variation in the cosmic x-ray background. The conversion signal is expected to be greater in summer due to the changing visibility of the magnetic field region facing the sun, and that’s exactly what was observed. In the paper’s conclusion the authors state:

“On the basis of our results from XMM-Newton, it appears plausible that axions – dark matter particle candidates – are indeed produced in the core of the Sun and do indeed convert to soft X-rays in the magnetic field of the Earth, giving rise to a significant, seasonally-variable component of the 2-6 keV CXB”

Conversion of solar axions into photons within the Earth’s magnetic field (University of Leicester)
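The claimed modulation is a textbook seasonal signal. As a toy illustration of the statistics involved (this is not the XMM-Newton analysis; the rates, amplitude and noise below are all invented), one could fit a one-year sinusoid to a time series of x-ray rates and ask how significant the fitted modulation is:

```python
import numpy as np
from scipy.optimize import curve_fit

# Toy model: constant background plus a one-year sinusoidal modulation
# peaking in summer. All numbers are invented for illustration only.
def rate(t_days, mean, amplitude, peak_day):
    return mean + amplitude * np.cos(2 * np.pi * (t_days - peak_day) / 365.25)

rng = np.random.default_rng(42)
t = np.linspace(0, 4 * 365.25, 200)                      # four years of observations
truth = rate(t, mean=10.0, amplitude=0.8, peak_day=172)  # peak in late June
observed = rng.normal(truth, 0.5)                        # Gaussian measurement noise

popt, pcov = curve_fit(rate, t, observed, p0=[10.0, 1.0, 150.0])
amp, amp_err = popt[1], np.sqrt(pcov[1, 1])
print(f"fitted amplitude: {amp:.2f} +/- {amp_err:.2f}")
print(f"naive significance of the modulation: {amp / amp_err:.1f} sigma")
```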

Note the language used – “it appears plausible”. This attitude of physicists, to always be cautious and hold back from bold claims, is a wise one – look what happened to BICEP2. It is something I am personally becoming familiar with, having come across a lovely LUX event last week that passed my initial cuts and looked very much like it could have been a WIMP. My project partner from my master’s degree at the University of Warwick is now a new PhD student at UCL – and he takes great joy in embarrassing me in whatever way he can. So after I shared my findings with him, he told everyone we came across that I had found WIMPs. When we ran into my supervisor, he even asked, “Have you seen Sally’s WIMP?”. I was not pleased – that is not a claim I want to make as a mere second year PhD student. Sadly, but not unexpectedly, my “WIMP” has now been cut away. But not for one second did I truly believe it could have been one – surely there’s no way I’m going to be the one that discovers dark matter! (Universe, feel free to prove me wrong.)

These XMM-Newton results are nice, but tentative – they need confirming by more experiments. I can’t help but wonder how many big discoveries end up delayed or even discarded due to the cautiousness of physicists, who can scarcely believe they have found something so great. I look forward to the time when someone actually comes out and says, “We did it – we found it,” with certainty. It would be extra nice if it were LUX. But realistically, to really convince anyone that dark matter has been found, detection via several different methods and in several different places is needed. There is a lot of work to do yet.

It’s an exciting time to be in this field, and papers like the XMM-Newton one keep us on our toes! LUX will be starting up again soon for what we hope will be a 300 day run, and an increase in sensitivity to WIMPs of around 5x. Maybe it’s time for me to re-read that paper on the axio-electric effect in liquid xenon detectors!


This Fermilab press release came out on Oct. 20, 2014.

ESnet to build high-speed extension for faster data exchange between United States and Europe. Image: ESnet

Scientists across the United States will soon have access to new, ultra-high-speed network links spanning the Atlantic Ocean thanks to a project currently under way to extend ESnet (the U.S. Department of Energy’s Energy Sciences Network) to Amsterdam, Geneva and London. Although the project is designed to benefit data-intensive science throughout the U.S. national laboratory complex, heaviest users of the new links will be particle physicists conducting research at the Large Hadron Collider (LHC), the world’s largest and most powerful particle collider. The high capacity of this new connection will provide U.S. scientists with enhanced access to data at the LHC and other European-based experiments by accelerating the exchange of data sets between institutions in the United States and computing facilities in Europe.

DOE’s Brookhaven National Laboratory and Fermi National Accelerator Laboratory—the primary computing centers for U.S. collaborators on the LHC’s ATLAS and CMS experiments, respectively—will make immediate use of the new network infrastructure once it is rigorously tested and commissioned. Because ESnet, based at DOE’s Lawrence Berkeley National Laboratory, interconnects all national laboratories and a number of university-based projects in the United States, tens of thousands of researchers from all disciplines will benefit as well.

The ESnet extension will be in place before the LHC at CERN in Switzerland—currently shut down for maintenance and upgrades—is up and running again in the spring of 2015. Because the accelerator will be colliding protons at much higher energy, the data output from the detectors will expand considerably—to approximately 40 petabytes of raw data per year, compared with 20 petabytes for all of the previous lower-energy collisions produced over the three years of the LHC’s first run between 2010 and 2012.
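A rough back-of-the-envelope conversion (my own arithmetic, not a figure from the release) shows why dedicated high-capacity links matter: even spread perfectly evenly over a year, 40 petabytes is a substantial sustained rate, and real transfers come in bursts and get replicated to multiple sites.

```python
# Average network rate implied by 40 PB of raw data per year.
# Back-of-the-envelope only; not an ESnet figure.
petabyte = 1e15                      # bytes (decimal petabytes)
seconds_per_year = 365.25 * 24 * 3600

avg_bits_per_second = 40 * petabyte * 8 / seconds_per_year
print(f"average rate ~ {avg_bits_per_second / 1e9:.0f} Gbit/s")  # ~10 Gbit/s
```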

The cross-Atlantic connectivity during the first successful run for the LHC experiments, which culminated in the discovery of the Higgs boson, was provided by the US LHCNet network, managed by the California Institute of Technology. In recent years, major research and education networks around the world—including ESnet, Internet2, California’s CENIC, and European networks such as DANTE, SURFnet and NORDUnet—have increased their backbone capacity by a factor of 10, using sophisticated new optical networking and digital signal processing technologies. Until recently, however, higher-speed links were not deployed for production purposes across the Atlantic Ocean—creating a network “impedance mismatch” that can harm large, intercontinental data flows.

An evolving data model
This upgrade coincides with a shift in the data model for LHC science. Previously, data moved in a more predictable and hierarchical pattern strongly influenced by geographical proximity, but network upgrades around the world have now made it possible for data to be fetched and exchanged more flexibly and dynamically. This change enables faster science outcomes and more efficient use of storage and computational power, but it requires networks around the world to perform flawlessly together.

“Having the new infrastructure in place will meet the increased need for dealing with LHC data and provide more agile access to that data in a much more dynamic fashion than LHC collaborators have had in the past,” said physicist Michael Ernst of DOE’s Brookhaven National Laboratory, a key member of the team laying out the new and more flexible framework for exchanging data between the Worldwide LHC Computing Grid centers.

Ernst directs a computing facility at Brookhaven Lab that was originally set up as a central hub for U.S. collaborators on the LHC’s ATLAS experiment. A similar facility at Fermi National Accelerator Laboratory has played this role for the LHC’s U.S. collaborators on the CMS experiment. These computing resources, dubbed Tier 1 centers, have direct links to the LHC at the European laboratory CERN (Tier 0).  The experts who run them will continue to serve scientists under the new structure. But instead of serving as hubs for data storage and distribution only among U.S.-based collaborators at Tier 2 and 3 research centers, the dedicated facilities at Brookhaven and Fermilab will be able to serve data needs of the entire ATLAS and CMS collaborations throughout the world. And likewise, U.S. Tier 2 and Tier 3 research centers will have higher-speed access to Tier 1 and Tier 2 centers in Europe.

“This new infrastructure will offer LHC researchers at laboratories and universities around the world faster access to important data,” said Fermilab’s Lothar Bauerdick, head of software and computing for the U.S. CMS group. “As the LHC experiments continue to produce exciting results, this important upgrade will let collaborators see and analyze those results better than ever before.”

Ernst added, “As centralized hubs for handling LHC data, our reliability, performance and expertise have been in demand by the whole collaboration, and now we will be better able to serve the scientists’ needs.”

An investment in science
ESnet is funded by DOE’s Office of Science to meet networking needs of DOE labs and science projects. The transatlantic extension represents a financial collaboration, with partial support coming from DOE’s Office of High Energy Physics (HEP) for the next three years. Although LHC scientists will get a dedicated portion of the new network once it is in place, all science programs that make use of ESnet will now have access to faster network links for their data transfers.

“We are eagerly awaiting the start of commissioning for the new infrastructure,” said Oliver Gutsche, Fermilab scientist and member of the CMS Offline and Computing Management Board. “After the Higgs discovery, the next big LHC milestones will come in 2015, and this network will be indispensable for the success of the LHC Run 2 physics program.”

This work was supported by the DOE Office of Science.
Fermilab is America’s premier national laboratory for particle physics and accelerator research. A U.S. Department of Energy Office of Science laboratory, Fermilab is located near Chicago, Illinois, and operated under contract by the Fermi Research Alliance, LLC. Visit Fermilab’s website at www.fnal.gov and follow us on Twitter at @FermilabToday.

Brookhaven National Laboratory is supported by the Office of Science of the U.S. Department of Energy.  The Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.  For more information, please visit science.energy.gov.

One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by the Research Foundation for the State University of New York on behalf of Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit applied science and technology organization.

Visit Brookhaven Lab’s electronic newsroom for links, news archives, graphics, and more at http://www.bnl.gov/newsroom, follow Brookhaven Lab on Twitter, http://twitter.com/BrookhavenLab, or find us on Facebook, http://www.facebook.com/BrookhavenLab/.

The DOE Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

Media contacts:

  • Karen McNulty-Walsh, Brookhaven Media and Communications Office, kmcnulty@bnl.gov, 631-344-8350
  • Kurt Riesselmann, Fermilab Office of Communication, media@fnal.gov, 630-840-3351
  • Jon Bashor, Computing Sciences Communications Manager, Lawrence Berkeley National Laboratory, jbashor@lbnl.gov, 510-486-5849

Computing contacts:

  • Lothar Bauerdick, Fermilab, US CMS software computing, bauerdick@fnal.gov, 630-840-6804
  • Oliver Gutsche, Fermilab, CMS Offline and Computing Management Board, gutsche@fnal.gov, 630-840-8909

I feel it mine

Tuesday, October 21st, 2014

On Saturday, 4 October, Nikhef – the Dutch National Institute for Subatomic Physics, where I spend long days and much effort – opened its doors, labs and facilities to the public. In addition to Nikhef, all the other institutes located in the so-called “Science Park” – the scientific district located in the east part of Amsterdam – welcomed people all day long.

It’s the second “Open Day” that I’ve attended, both as a guest and as a guide. Together with my fellow theoreticians, I provided answers and explanations to people’s questions and curiosities, standing in the “Big Bang Theory Corner” of the main hall. Each department in Nikhef arranged its own stand and activities, and there were plenty of things to be amazed by over the entire day.

The research institutes in Science Park (and outside it) offer a good overview of what research is about: looking for what lies beyond the current state of knowledge. “Verder kijken”, or looking further, is the motto of Vrije Universiteit Amsterdam, my Dutch alma mater.

I deeply like this attitude of research, the willingness to investigate what’s around the corner. As they like to define themselves, Dutch people are “future oriented”: this is manifest in several things, from the way they read the clock (“half past seven” becomes “half before eight” in Dutch) to some peculiarities of the city itself, like the presence of a lot of cultural and research institutes.

This abundance of institutes, museums, exhibitions, public libraries, music festivals, art spaces, and independent cinemas makes this city feel like a cultural place to me. People interact with culture in its many manifestations and are connected to it in a more dynamic way than if they were only surrounded by historical and artistic heritage.

Back to the Open Day and Nikhef, I was pleased to see lots of people, families with kids running here and there, checking out delicate instruments with their curious hands, and groups of guys and girls (also someone who looked like he had come straight from a skate-park) stopping by and looking around as if it were their own courtyard.

The following pictures give some examples of the ongoing activities:

We had a model of the ATLAS detector built with Legos: amazing!


Copyright Nikhef

And not only toy models: we also had real detectors, like a cloud chamber that let visitors see the tracks of particles passing by!


Copyright Nikhef

Weak force and anti-matter are also cool, right?


Copyright Nikhef

The majority of people here (not me) are blond and/or tall, but not tall enough to see cosmic rays with just their eyes… So, please ask the experts!


Copyright Nikhef

I think I can summarize the huge impact and the benefit of such a cool day with the words of one man who stopped by one of the experimental setups. He listened to the careful (but a bit fuzzy) explanation provided by one of the students, and said “Thanks. Now I feel it mine too.”

Many more photos are available here: enjoy!


Let there be beam!

Wednesday, October 15th, 2014

It’s been a little while since I’ve posted anything, but I wanted to write a bit about some of the testbeam efforts at CERN right now. In the middle of July this year, the Proton Synchrotron, or PS, the second ring in the chain of boosters and colliders used to get protons up to speed to collide in the LHC, saw its first beam since the shutdown at the end of Run I of the LHC. In addition to providing beam to experiments like CLOUD, the beam can also be used to create secondary particles of up to 15 GeV/c momentum, which are then used for studies of future detector technology. Such a beam is called a testbeam, and all I can say is WOOT, BEAM! I must say that being able to take accelerator data is amazing!

The next biggest milestone is the testbeams from the SPS, which started on the 6th of October. This is the last ring before the LHC. If you’re unfamiliar with the process used to get protons up to the energies of the LHC, a great video can be found at the bottom of the page.

Just to be clear, test beams aren’t limited to CERN. Keep your eyes out for a post by my friend Rebecca Carney in the near future.

I was lucky enough to be part of the test beam effort of LHCb, which was testing both new technology for the VELO and for the upgrade of the TT station, called the Upstream Tracker, or UT. I worked mainly with the UT group, testing a sensor technology which will be used in the 2019 upgraded detector. I won’t go too much into the technology of the upgrade right now, but if you are interested in the nitty-gritty of it all, I will instead point you to the Technical Design Report itself.

I just wanted to take a bit to talk about my experience with the test beam in July, starting with walking into the experimental area itself. The first sight you see upon entering the building is a picture reminding you that you are entering a radiation zone.

ps_entrance

The Entrance!!

Then, as you enter, you see a large wall of radioactive concrete.

the_wall

Don’t lick those!

This is where the beam is dumped. Following along here, you get to the control room, which is where all the data taking stuff is set up outside the experimental area itself. Lots of people are always working in the control room, focused and making sure to take as much data as possible. I didn’t take their picture since they were working so hard.

Then there’s the experimental area itself.

the_setup

The Setup! To find the hardhat, look for the orange and green racks, then follow them towards the top right of the picture.

Ah, beautiful. :)

There are actually four setups here, but I think only three were being used at this time (click on the picture for a larger view). We occupied the area where the guy with the hardhat is.

Now the idea behind a tracker testbeam is pretty straightforward. A charged particle flies by, and many very sensitive detector planes record where the charged particle passed. These planes together form what’s called a “telescope.” The setup is completed when you add a detector to be tested either in the middle of the telescope or at one end.

Cartoon of a test beam setup. The blue indicates the “telescope”, the orange is the detector under test, and the red is the trajectory of a charged particle.

From timing information and from the signals in these detectors, the trajectory of the particle can be determined. Now, you compare the position which your telescope gives you to the position you record in the detector you want to test, and voila, you have a way to understand the resolution and abilities of your tested detector. After that, the game is statistics. Ideally, you want the tested detector to be in the middle of the telescope, so that you have information on where the charged particle passed on either side of it; this gives the best resolution. But it can work if you’re at one end or the other, too.
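Conceptually, the reconstruction can be as simple as a least-squares straight line through the telescope hits, interpolated to the plane of the device under test (DUT); the spread of the residuals there measures the DUT’s resolution. Here is a minimal toy sketch of that idea, with the geometry, resolutions and hits all invented for illustration:

```python
import numpy as np

# Toy telescope: fit a straight line through hits on the telescope planes,
# then compare the interpolated track position to the hit recorded by the
# device under test (DUT). All numbers are invented for illustration.
z_planes = np.array([0.0, 50.0, 100.0, 200.0, 250.0, 300.0])  # mm
z_dut = 150.0                                                 # DUT mid-telescope

rng = np.random.default_rng(1)
true_slope, true_intercept = 1e-3, 0.2   # a nearly parallel track, x in mm
hits = true_intercept + true_slope * z_planes + rng.normal(0, 0.005, z_planes.size)

# Least-squares straight-line fit: x(z) = slope * z + intercept
slope, intercept = np.polyfit(z_planes, hits, 1)

track_at_dut = slope * z_dut + intercept
dut_hit = true_intercept + true_slope * z_dut + rng.normal(0, 0.05)  # coarser DUT
print(f"residual at DUT: {(dut_hit - track_at_dut) * 1e3:.1f} um")
# Repeated over many tracks, the spread of these residuals (after
# subtracting the telescope's own pointing error) gives the DUT resolution.
```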

This is the setup which we have been using for the testbeam at the PS.  We’ll be using a similar setup for the testbeam at the SPS next week! I’ll try to write a follow up post on that when we finish!

And finally, here is the promised video.

Top quark still raising questions

Wednesday, October 15th, 2014

This article appeared in symmetry on Oct. 15, 2014.

Why are scientists still interested in the heaviest fundamental particle nearly 20 years after its discovery? Photo: Reidar Hahn, Fermilab

“What happens to a quark deferred?” the poet Langston Hughes may have asked, had he been a physicist. If scientists lost interest in a particle after its discovery, much of what it could show us about the universe would remain hidden. A niche group of scientists, therefore, stays dedicated to intimately understanding its properties.

Case in point: Top 2014, an annual workshop on top quark physics, recently convened in Cannes, France, to address the latest questions and scientific results surrounding the heavyweight particle discovered in 1995 (early top quark event pictured above).

Top and Higgs: a dynamic duo?
A major question addressed at the workshop, held from September 29 to October 3, was whether top quarks have a special connection with Higgs bosons. The two particles, weighing in at about 173 and 125 billion electronvolts, respectively, dwarf other fundamental particles (the bottom quark, for example, has a mass of about 4 billion electronvolts and a whole proton sits at just below 1 billion electronvolts).

Prevailing theory dictates that particles gain mass through interactions with the Higgs field, so why do top quarks interact so much more with the Higgs than do any other known particles?

Direct measurements of top-Higgs interactions depend on recording collisions that produce the two side-by-side. This hasn’t happened yet at high enough rates to be seen; these events theoretically require higher energies than the Tevatron or even the LHC’s initial run could supply. But scientists are hopeful for results from the next run at the LHC.

“We are already seeing a few tantalizing hints,” says Martijn Mulders, staff scientist at CERN. “After a year of data-taking at the higher energy, we expect to see a clear signal.” No one knows for sure until it happens, though, so Mulders and the rest of the top quark community are waiting anxiously.

A sensitive probe to new physics

Top and antitop quark production at colliders, measured very precisely, started to reveal some deviations from expected values. But in the last year, theorists have responded by calculating an unprecedented layer of mathematical corrections, which refined the expectations and promise to realign the slightly rogue numbers.

Precision is an important, ongoing effort. If researchers aren’t able to reconcile such deviations, the logical conclusion is that the difference represents something they don’t know about — new particles, new interactions, new physics beyond the Standard Model.

The challenge of extremely precise measurements can also drive the formation of new research alliances. Earlier this year, the first Fermilab-CERN joint announcement of collaborative results set a world standard for the mass of the top quark.

Such accuracy hones methods applied to other questions in physics, too, the same way that research on W bosons, discovered in 1983, led to the methods Mulders began using to measure the top quark mass in 2005. In fact, top quark production is now so well controlled that it has become a tool itself to study detectors.

Forward-backward synergy

With the upcoming restart in 2015, the LHC will produce millions of top quarks, giving researchers troves of data to further physics. But scientists will still need to factor in the background noise and data-skewing inherent in the instruments themselves, called systematic uncertainty.

“The CDF and DZero experiments at the Tevatron are mature,” says Andreas Jung, senior postdoc at Fermilab. “It’s shut down, so the understanding of the detectors is very good, and thus the control of systematic uncertainties is also very good.”

Jung has been combing through the old data with his colleagues and publishing new results, even though the Tevatron hasn’t collided particles since 2011. The two labs combined their respective strengths to produce their joint results, but scientists still have much to learn about the top quark, and a new arsenal of tools to accomplish it.

“DZero published a paper in Nature in 2004 about the measurement of the top quark mass that was based on 22 events,” Mulders says. “And now we are working with millions of events. It’s incredible to see how things have evolved over the years.”

Troy Rummler


Good Management is Science

Friday, October 10th, 2014

Management done properly satisfies Sir Karl Popper’s (1902 – 1994) demarcation criterion for science, i.e. using models that make falsifiable or at least testable predictions. That was brought home to me by a book[1] by Douglas Hubbard on risk management, in which he advocates observationally constrained (falsifiable or testable) models for risk analysis evaluated through Monte Carlo calculations. Hmm, observationally constrained models and Monte Carlo calculations, sounds like a recipe for science.
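To make that concrete, here is a minimal sketch of the kind of Monte Carlo risk calculation Hubbard advocates. It is my own toy example, not one from the book: each risk gets an explicit probability and cost distribution, and the simulation turns those assumptions into predictions that can later be checked against what actually happens.

```python
import numpy as np

# Toy Monte Carlo risk model. Each risk has an annual probability of
# occurring and a lognormal cost if it does. All inputs are invented;
# the point is that they are explicit and therefore testable.
risks = [
    {"name": "schedule slip",  "prob": 0.30, "median_cost": 100e3, "sigma": 0.5},
    {"name": "magnet failure", "prob": 0.05, "median_cost": 500e3, "sigma": 0.8},
    {"name": "staff turnover", "prob": 0.20, "median_cost": 50e3,  "sigma": 0.4},
]

rng = np.random.default_rng(0)
n_trials = 100_000
total = np.zeros(n_trials)
for r in risks:
    occurs = rng.random(n_trials) < r["prob"]
    cost = rng.lognormal(np.log(r["median_cost"]), r["sigma"], n_trials)
    total += occurs * cost

print(f"expected annual loss: {total.mean():,.0f}")
print(f"90th-percentile loss: {np.percentile(total, 90):,.0f}")
# The output is a falsifiable prediction about yearly losses, which
# observation (the actuals) can then confirm or refute.
```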

Let us take a step back. The essence of science is modeling how the universe works and checking the assumptions of the model and its predictions against observations. The predictions must be testable. According to Hubbard, the essence of risk management is modeling processes and checking the assumptions of the model and its predictions against observations. The predictions must be testable. What we are seeing here is a common paradigm for knowledge in which modeling and testing against observation play a key role.

The knowledge paradigm is the same in project management. A project plan, with its resource loaded schedules and other paraphernalia, is a model for how the project is expected to proceed. To monitor a project you check the plan (model) against actuals (a fancy euphemism for observations, where observations may or may not correspond to reality). Again, it reduces back to observationally constrained models and testable predictions.

The foundations of science and good management practices are tied even closer together. Consider the PDCA cycle for process management that is present, either implicitly or explicitly, in essentially all the ISO standards related to management. It was originated by Walter Shewhart (1891 – 1967), an American physicist, engineer and statistician, and popularized by W. Edwards Deming (1900 – 1993), an American engineer, statistician, professor, author, lecturer and management consultant. Engineers are into everything. The actual idea of the cycle is based on the ideas of Francis Bacon (1561 – 1626) but could equally well be based on the work of Roger Bacon[2] (1214 – 1294). Hence, it should probably be called the Double Bacon Cycle (no, that sounds too much like a breakfast food).

But what is this cycle? For science, it is: plan an experiment to test a model, do the experiment, check the model results against the observed results, and act to change the model in response to the new information from the check stage, or devise more precise tests if the predictions and observations agree. For process management, replace experiment with production process. As a result, you have a model for how the production process should work, and doing the process allows you to test the model. The check stage is where you see if the process performed as expected, and the act stage allows you to improve the process if the model and actuals do not agree. The key point is the check step. It is necessary if you are to improve the process; otherwise you do not know what is going wrong or, indeed, even if something is going wrong. It is only possible if the plan makes predictions that are falsifiable or at least testable. Popper would be pleased.

There is another interesting aspect of the ISO 9001 standard. It is based on the idea of processes. A process is defined as an activity that converts inputs into outputs. Well, that sounds rather vague, but the vagueness is an asset, kind of like degrees of freedom in an effective field theory. Define them as you like, but if you choose them incorrectly you will be sorry. The real advantage of effective field theory and the flexible definition of process is that you can study a system at any scale you like. In effective field theory, you study processes that operate at the scale of the atom, the scale of the nucleus or the scale of the nucleon and tie them together with a few parameters. Similarly with processes: you can study the whole organization as a process or drill down and look at sub-processes at any scale you like; for CERN or TRIUMF that would be down to the last magnet. It would not be useful to go further and study accelerator operations at the nucleon scale. At a given scale, different processes are tied together by their inputs and outputs, and these are also used to tie together processes at different scales.

As a theoretical physicist who has gone over to the dark side and into administration, I find it amusing to see the techniques and approaches from science being borrowed for use in administration, even Monte Carlo calculations. The use of similar techniques in science and administration goes back to the same underlying idea: all true knowledge is obtained through observation and its use to build better testable models, whether in science or other walks of life.

[1] The Failure of Risk Management: Why It’s Broken and How to Fix It by Douglas W. Hubbard (Apr 27, 2009)

[2] Roger Bacon described a repeating cycle of observation, hypothesis, and experimentation.


Physics Laboratory: Back to Basics

Friday, October 10th, 2014

Dark matter – it’s essential to our universe, it’s mysterious, and it brings to mind cool things like space, stars, and galaxies. I have been fascinated by it since I was a child, and I feel very lucky to be a part of the search for it. But that’s not actually what I’m going to be talking about today.

I am a graduate student just starting my second year in the High Energy Physics group at UCL, London. Ironically, as a dark matter physicist working in the LUX (Large Underground Xenon detector) and LZ (LUX-ZEPLIN) collaborations, I’m actually dealing with very low energy physics.
When people ask what I do, I find myself saying different things, to differing responses:

  1. “I’m doing a PhD in physics” – reaction: person slowly backs away
  2. “I’m doing a PhD in particle physics” – reaction: some interest, mention of the LHC, person mildly impressed
  3. “I’m doing a PhD in astro-particle physics” – reaction: mild confusion but still interested, probably still mention the Large Hadron Collider
  4. “I’m looking for dark matter!” – reaction: awe, excitement, lots of questions

This obviously isn’t true in all cases, but it has been the general pattern. Admittedly, I enjoy that people are impressed, but sometimes I struggle to find a way to explain to people outside physics what I actually do day to day. Often I just say, “it’s a lot of computer programming; I analyse data from a detector to help towards finding a dark matter signal”, but that still induces a panicked look in a lot of people.

Nevertheless, last week I actually came across a group of people who didn’t ask anything about what I do, and I found myself going right back to basics in terms of the physics I think about daily. Term has just started, and that means one thing: undergraduates. The frequent noise they make as they stampede past my office, going the wrong way to labs, makes me wonder if the main reason for sending them away for so long is to give the researchers the chance to do their work in peace.

Nonetheless, somehow I found myself in the undergraduate lab on Friday. I had to ask myself why on earth I had chosen to demonstrate – I am, almost by definition, terrible in a lab. I am clumsy and awkward, and even the simplest equipment feels unwieldy in my hands. During my own undergrad, my practical mark always brought my average down for the year. My master’s project was, thank god, entirely computational. But thanks to a moment of madness (and the prospect of earning a little cash, as London living on a PhD stipend is hard), I signed up to be a lab demonstrator for the new first-year physicists.

Things started off awkwardly, as I was told to brief them on the experiment and realised I had not a great deal to say. I got more into the swing of things as time went by, but I still felt like I’d been thrown in at the deep end. I told the students I was a second year PhD student; one of them got the wrong end of the stick and asked if I knew a student who was a second year undergrad here. I told him I was a postgraduate and he looked quite embarrassed, whilst I couldn’t help but laugh at the thought of the chaos that would ensue if a second year demonstrated the first year labs.


The oscilloscope: the nemesis of physics undergrads in labs everywhere

None of them asked what my PhD was in. They weren’t interested – somehow I had become a faceless authority who told them what to do and had no other purpose. I am not surprised – they are brand new to university, and more importantly, they were pretty distracted by the new experience of the laboratory. That’s not to say they particularly enjoyed it; they seemed to have very little enthusiasm for the experiment. It was a very simple task: measuring the speed of sound in air using a frequency generator, an oscilloscope and a ruler. For someone now accustomed to dealing with data from a high-tech dark matter detector, it was bizarre! I do find that the more advanced physics I learn, the worse I become at the basics, and I had to go aside for a moment with a pen and paper to reconcile the theory in my head – it was embarrassing, to say the least!

Their frustration at the task was evident – there were frequent complaints about the length of time they spent writing, about the experimental ‘aims’ and ‘objectives’, about the fact they needed to introduce their diagrams before drawing them, etc. Eyes were rolling at me. I was going to have to really try to drill it in that this was indeed an important exercise. The panic I could sense from them was a horrible reminder of how I used to feel in my own labs. It’s hard to understand at that point that this isn’t just some form of torture; you are actually learning some very valuable and transferable skills about how to conduct a real experiment. Some examples:

  1. Learn to write EVERYTHING down, you might end up in court over something and some tiny detail might save you.
  2. Get your errors right. You cannot claim a discovery without an uncertainty, that’s just physics. It’s difficult to grasp, but you can never fully prove a hypothesis, only provide solid evidence towards it. (A worked sketch follows this list.)
  3. Understand the health and safety risks – they seem pointless and stupid when the only real risk seems to be tripping over your bags, but speaking as someone who has worked down a mine with pressurised gases, high voltages and radioactive sources, they are extremely important and may be the difference between life and death.
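As the worked sketch promised in point 2: the students’ own measurement is v = fλ, and for a simple product the fractional uncertainties add in quadrature. The numbers below are invented, not taken from any real lab script:

```python
import numpy as np

# Toy speed-of-sound estimate with basic error propagation: v = f * lambda.
f, df = 2000.0, 10.0       # frequency and its uncertainty (Hz) -- invented
lam, dlam = 0.171, 0.002   # wavelength and its uncertainty (m) -- invented

v = f * lam
# For a product, fractional uncertainties add in quadrature:
dv = v * np.sqrt((df / f) ** 2 + (dlam / lam) ** 2)
print(f"v = {v:.0f} +/- {dv:.0f} m/s")  # ~342 +/- 4 m/s
```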

In the end, I think my group did well. They got the right number for the speed of sound and their lab books weren’t a complete disaster. A few actually thanked me on their way out. 

It was a bit of a relief to get back to my laptop, where I actually feel like I know what I am doing, but the experience was a stark reminder of where I was 5 years ago and how much I have learned. Choosing physics for university means you will have to struggle to understand things, work hard and exhaust yourself, but in all honesty it was completely worth it, at least for me. Measuring the speed of sound in air is just the beginning. One day, some of those students might be measuring the quarks inside a proton, or a distant black hole, or the quantum mechanical properties of a semiconductor.

I’m back in the labs this afternoon, and I am actually quite looking forward to seeing how they cope this week, when we study that essential pillar of physics, conservation of momentum. I just hope they don’t start throwing steel ball-bearings at each other. Wish me luck.


Liveblog: New ATLAS Higgs Results

Tuesday, October 7th, 2014

In a short while, starting at 11:00 CEST / 10:00 BST, ATLAS will announce some new Higgs results:

“New Higgs physics results from the ATLAS experiment using the full Run-1 LHC dataset, corresponding to an integrated luminosity of approximately 25 fb-1, of proton-proton collisions at 7 TeV and 8 TeV, will be presented.” [seminar link]

I don’t expect anything earth-shattering, because ATLAS already has preliminary analyses for all the major Higgs channels. They have also submitted final publications for LHC Run I on Higgs decaying to two photons, two b quarks, and two Z bosons – so it’s reasonable to guess that Higgs decaying to taus or W’s is going to be covered today.

(Parenthetically, CMS has already published final results for all of the major Higgs decays, because we are faster, stronger, smarter, better looking, and more fun at parties.)

I know folks on ATLAS who are working on things that might be shown today, and they promise they have some new tricks, so I’m hoping things will be fairly interesting. But again, nothing earth-shattering.

I’ll update this very page during the seminar. You should also be able to watch it on the Webcast Service.

10:55 I have a front row seat in the CERN Council Chamber, which is smaller than the main auditorium that you might be more familiar with. Looks like it will be very, very full.

11:00 Here we go! (Now’s a good time to click the webcast, if you plan to.)

11:03 Yes, it turns out it will be taus and W’s.

11:06 As an entree, look how fabulously successful the Standard Model, including the Higgs, has been:

11:10 Good overview right now of overall Higgs production and decay and the framework we use to understand it. Have any questions I can answer during the seminar? Put them in the comments or write something at me on Twitter.

11:18 We’re learning about the already-released results for Higgs to photons and ZZ first.

11:24 Higgs to bb, the channel I worked on for CMS during Run I. These ATLAS results are quite new and have a lot of nice improvements from their preliminary analysis. Very pretty plot of improved Higgs mass resolution when corrections are made for muons produced inside b-jets.

11:30 Now to Higgs to tau tau, a new result!

11:35 Developments since preliminary analysis include detailed validation of techniques for estimating from data how isolated the taus should be from other things in the detector.

11:36 I hope that doesn’t sound too boring, but this stuff’s important. It’s what we do all day, not just counting sigmas.

11:37 4.5 sigma evidence (only 3.5 expected) for the Higgs coupling to the tau lepton!
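For anyone wondering what those sigmas mean in probability terms: they convert to one-sided p-values, the chance of the background alone fluctuating up at least that much. A quick conversion using the standard normal distribution:

```python
from scipy.stats import norm

# Convert a significance in sigma to a one-sided p-value, the usual
# convention for quoting signal significances.
for sigma in (3.5, 4.5, 5.0):
    print(f"{sigma} sigma -> p = {norm.sf(sigma):.1e}")
# 4.5 sigma is p ~ 3.4e-06; 5 sigma, the traditional discovery
# threshold, is p ~ 2.9e-07.
```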

11:39 Their signal is a bit bigger than the SM predicts, but still very consistent with it. And now on to WW, also new.

11:41 In other news, the Nobel Prize in Physics will be announced in 4 minutes: It’s very unlikely to be for anything in this talk.

11:44 Fixed last comment: “likely” –> “unlikely”. Heh.

11:48 When the W’s decay to a lepton and an invisible neutrino, you can’t measure a “Higgs peak” like we do when it decays to photons or Z’s. So you have to do very careful work to make sure that a misunderstanding of your background (i.e. non-Higgs processes) doesn’t produce what looks like a Higgs signal.

11:50 Background-subtracted result does show a clear Higgs excess over the SM backgrounds. This will be a pretty strong result.

11:51 6.1 sigma for H –> WW –> lvlv. 3.2 sigma for VBF production mechanism. Very consistent with the SM again.

11:52 Lots of very nice, detailed work here. But the universe has no surprises for us today.

11:54 We can still look forward to the final ATLAS combination of all Higgs channels, but we know it’s going to look an awful lot like the Standard Model. Congratulations to my ATLAS colleagues on their hard work.

11:56 By the way, you can read the slides on the seminar link.

12:02 The most significant result here might actually be the single-channel observation of the Vector Boson Fusion production mechanism. The Higgs boson really is behaving the way the Standard Model says it should! Signing off here, time for lunch.


This Fermilab press release came out on Oct. 6, 2014.

With construction completed, the NOvA experiment has begun its probe into the mysteries of ghostly particles that may hold the key to understanding the universe. Image: Fermilab/Sandbox Studio

It’s the most powerful accelerator-based neutrino experiment ever built in the United States, and the longest-distance one in the world. It’s called NOvA, and after nearly five years of construction, scientists are now using the two massive detectors – placed 500 miles apart – to study one of nature’s most elusive subatomic particles.

Scientists believe that a better understanding of neutrinos, one of the most abundant and difficult-to-study particles, may lead to a clearer picture of the origins of matter and the inner workings of the universe. Using the world’s most powerful beam of neutrinos, generated at the U.S. Department of Energy’s Fermi National Accelerator Laboratory near Chicago, the NOvA experiment can precisely record the telltale traces of those rare instances when one of these ghostly particles interacts with matter.

Construction on NOvA’s two massive neutrino detectors began in 2009. In September, the Department of Energy officially proclaimed construction of the experiment completed, on schedule and under budget.

“Congratulations to the NOvA collaboration for successfully completing the construction phase of this important and exciting experiment,” said James Siegrist, DOE associate director of science for high energy physics. “With every neutrino interaction recorded, we learn more about these particles and their role in shaping our universe.”

NOvA’s particle detectors were both constructed in the path of the neutrino beam sent from Fermilab in Batavia, Illinois, to northern Minnesota. The 300-ton near detector, installed underground at the laboratory, observes the neutrinos as they embark on their near-light-speed journey through the Earth, with no tunnel needed. The 14,000-ton far detector — constructed in Ash River, Minnesota, near the Canadian border – spots those neutrinos after their 500-mile trip and allows scientists to analyze how they change over that long distance.

For the next six years, Fermilab will send tens of thousands of billions of neutrinos every second in a beam aimed at both detectors, and scientists expect to catch only a few each day in the far detector, so rarely do neutrinos interact with matter.
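Taking the round numbers in this release at face value (the estimate below is mine, not Fermilab’s), you can see just how rarely a neutrino interacts on its way through the far detector:

```python
# Back-of-the-envelope interaction probability per neutrino, using only
# the round numbers quoted in this press release.
neutrinos_per_second = 1e13   # "tens of thousands of billions" per second
seconds_per_day = 86400
detected_per_day = 5          # "only a few each day"

p_interaction = detected_per_day / (neutrinos_per_second * seconds_per_day)
print(f"interaction probability per neutrino ~ {p_interaction:.0e}")
# ~6e-18: roughly one interaction per 10^17 neutrinos crossing the detector.
```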

From this data, scientists hope to learn more about how and why neutrinos change between one type and another. The three types, called flavors, are the muon, electron and tau neutrino. Over longer distances, neutrinos can flip between these flavors. NOvA is specifically designed to study muon neutrinos changing into electron neutrinos. Unraveling this mystery may help scientists understand why the universe is composed of matter and why that matter was not annihilated by antimatter after the big bang.

Scientists will also probe the still-unknown masses of the three types of neutrinos in an attempt to determine which is the heaviest.

“Neutrino research is one of the cornerstones of Fermilab’s future and an important part of the worldwide particle physics program,” said Fermilab Director Nigel Lockyer. “We’re proud of the NOvA team for completing the construction of this world-class experiment, and we’re looking forward to seeing the first results in 2015.”

The far detector in Minnesota is believed to be the largest free-standing plastic structure in the world, at 200 feet long, 50 feet high and 50 feet wide. Both detectors are constructed from PVC and filled with a scintillating liquid that gives off light when a neutrino interacts with it. Fiber optic cables transmit that light to a data acquisition system, which creates 3-D pictures of those interactions for scientists to analyze.

The NOvA far detector in Ash River saw its first long-distance neutrinos in November 2013. The far detector is operated by the University of Minnesota under an agreement with Fermilab, and students at the university were employed to manufacture the component parts of both detectors.

“Building the NOvA detectors was a wide-ranging effort that involved hundreds of people in several countries,” said Gary Feldman, co-spokesperson of the NOvA experiment. “To see the construction completed and the operations phase beginning is a victory for all of us and a testament to the hard work of the entire collaboration.”

The NOvA collaboration comprises 208 scientists from 38 institutions in the United States, Brazil, the Czech Republic, Greece, India, Russia and the United Kingdom. The experiment receives funding from the U.S. Department of Energy, the National Science Foundation and other funding agencies.

For more information, visit the experiment’s website: http://www-nova.fnal.gov.

Note: NOvA stands for NuMI Off-Axis Electron Neutrino Appearance. NuMI is itself an acronym, standing for Neutrinos from the Main Injector, Fermilab’s flagship accelerator.

Fermilab is America’s premier national laboratory for particle physics and accelerator research. A U.S. Department of Energy Office of Science laboratory, Fermilab is located near Chicago, Illinois, and operated under contract by the Fermi Research Alliance, LLC. Visit Fermilab’s website at www.fnal.gov and follow us on Twitter at @FermilabToday.

The DOE Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.


Teaming up on top and Higgs

Monday, October 6th, 2014

While the LHC experiments are surely turning their attention towards the 2015 run of the collider, at an energy nearly double that of the previous run, we’re also busy trying to finalize and publish measurements using the data that we already have in the can.  Some measurements just take longer than others, and some it took us a while to get to.  And while I don’t like tooting my own horn too much here at the US LHC blog, I wanted to discuss a new result from CMS that I have been working on with a student, Dan Knowlton, here at the University of Nebraska-Lincoln, along with collaborators from a number of other institutions.  It’s been in the works for so long that I’m thrilled to get it out to the public!

(This is one of many CMS results that were shown for the first time last week at the TOP 2014 conference.  If you look through the conference presentations, you’ll find that the top quark, which has been around for about twenty years now, has continued to be a very interesting topic of study, with implications for searches for new physics and even for the fate of the universe.  One result that’s particularly interesting is a new average of CMS top-quark mass measurements, which is now the most accurate measurement of that quantity in the world.)

The LHC experiments have studied the Higgs boson through many different Higgs decay modes, and many different production mechanisms also.  Here is a plot of the expected cross sections for different Higgs production mechanisms as a function of Higgs mass; of course we know now that the Higgs has a mass of 125 GeV:

The most common production mechanism has a Higgs being produced with nothing else, but it can also be produced in association with other particles.  In our new result, we search for a Higgs production mechanism that is so much more rare that it doesn’t even appear on the above plot!  The mechanism is the production of a Higgs boson in association with a single top quark, and in the standard model, the cross section is expected to be 0.018 pb, about an order of magnitude below the cross section for Higgs production in association with a top-antitop pair.  Why even bother to look for such a thing, given how rare it is?

The answer lies in the reason for why this process is so rare.  There are actually two ways for this particular final state to be produced. Here are the Feynman diagrams for them:

   

In one case, the Higgs is radiated off the virtual W, while in the other it comes off the real final-state top quark.  Now, this is quantum mechanics: if you have two different ways to connect an initial and final state, you have to add the two amplitudes together before you square them to get a probability for the process.  It just so happens that these two amplitudes largely destructively interfere, and thus the production cross section is quite small.  There isn’t anything deep at work (e.g. no symmetries that suppress this process), it’s just how it comes out.

At least, that’s how it comes out in the standard model.  We assume certain values for the coupling factors of the Higgs to the top and W particles that appear in the diagrams above.  Other measurements of Higgs properties certainly suggest that the coupling factors do have the expected values, but there is room within the constraints for deviations.  It’s even possible that one of the two coupling values has the exact opposite sign from what we expect.  In that case, the destructive interference between the two amplitudes would become constructive, and the cross section would be almost a factor of 13 larger than expected!
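To see numerically how a sign flip turns destructive interference constructive, here is a toy calculation. The two amplitude coefficients are invented purely for illustration (the real values come from the full matrix-element calculation) and are chosen only so that the standard-model point interferes destructively:

```python
# Toy illustration of amplitude interference in single-top-plus-Higgs
# production: sigma ~ |kappa_W * a_W + kappa_t * a_t|^2. The coefficients
# a_W and a_t are invented; they are NOT the real matrix elements.
a_W, a_t = 3.0, -2.5   # chosen so the SM point (1, 1) interferes destructively

def xsec(kappa_W, kappa_t):
    return (kappa_W * a_W + kappa_t * a_t) ** 2

sm = xsec(1.0, 1.0)          # (3.0 - 2.5)^2 = 0.25
flipped = xsec(1.0, -1.0)    # (3.0 + 2.5)^2 = 30.25
print(f"flipped / SM cross-section ratio: {flipped / sm:.0f}")
# A factor of ~120 in this toy; the real calculation gives roughly a
# factor of 13, but the mechanism -- destructive terms turning
# constructive -- is the same.
```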

The new result from CMS is a search for this anomalous production of the Higgs in association with a single top quark. CMS already has a result for a search in which the Higgs decays to a pair of photons; this new result describes a search in which the Higgs decays to bottom quarks. That is a much more common Higgs decay mode, so there ought to be more events to see, but at the same time the backgrounds are much higher. The production of a top-antitop pair along with an extra jet of hadrons that is mis-identified as arising from a bottom quark looks very much like the targeted Higgs production mechanism. The top-antitop cross section is about 1000 times bigger than that of the anomalous production mechanism that we are looking for, and thus even a tiny bottom mis-identification rate leads to a huge number of background events. A lot of the work in the data analysis goes into figuring out how to distinguish the (putative) signal events from the dominant background, and then verifying that the estimations of the background rates are correct.

The analysis is so challenging that we predicted that even by throwing everything we had at it, the best we could expect to do was to exclude the anomalous Higgs production process at a level of about five times the predicted rate for it.  When we looked at the data, we found that we could exclude it at about seven times the anomalous rate, roughly in line with what we expected.  In short, we do not see an anomalous rate for anomalous Higgs production!  But we are able to set a fairly tight limit, at around 1.8 pb.

What do I like about this measurement?  First, it’s a very different way to try to measure the properties of the Higgs boson.  The measurements we have are very impressive given the amount of data that we have so far, but they are not very constraining, and there is enough wiggle room for some strange stuff to be going on.  This is one of the few ways to probe the Higgs couplings through the interference of two processes, rather than just through the rate for one dominant process.  All of these Higgs properties measurements are going to be much more accurate in next year’s data run, when we expect to integrate more data and all of the production rates will be larger due to the increase in beam energy.  (For this anomalous production process, the cross section will increase by about a factor of four.)  In this particular case, we should be able to exclude anomalous Higgs couplings through this measurement…or, if nature surprises us, we will actually observe them!  There is a lot of fun ahead for Higgs physics (and top physics) at the LHC.

I’ve also really enjoyed working with my CMS colleagues on this project.  Any measurement coming out of the experiment is truly the work of thousands of people who have built and operated the detector, gotten the data recorded and processed, developed and refined the reconstruction algorithms, and defined the baselines for how we identify all kinds of particles that are produced in the proton collisions.  But the final stages of any measurement are carried out by smaller groups of people, and in this case we worked with colleagues from the Catholic University of Louvain in Belgium, the Karlsruhe Institute of Technology in Germany, the University of Malaya in Malaysia, and the University of Kansas (in Kansas).  We relied on the efforts of a strong group of graduate students with the assistance of harried senior physicists like myself, and the whole team did a great job of supporting each other and stepping up to solve problems as they arose.  These team efforts are one of the things that I’m proud of in particle physics, and that make our scientists so successful in the wider world.
