Anadi Canepa | TRIUMF | Canada


Experimentalists and theorists meet

Monday, May 11th, 2009

High energy proton collisions are finally expected this Fall at the world’s largest particle accelerator, the LHC.
Our community is eager to start exploring the data produced at the energy frontier.

What will the data tell us? What secrets will they unveil? Why do we expect new physics, new phenomena, new particles at the energy scale of the LHC? One compelling argument is the following. The Standard Model does not explain the scattering of W bosons at large energies. To prevent the probability of that process from exceeding 100%, new particles must exist with a mass of roughly 100 GeV (about 100 times the mass of the proton). If Dark Matter is composed of slow-moving, heavy, neutral particles, their mass should be of order 100 GeV as well to explain the current abundance. These and many other experimental and theoretical facts convince us that New Physics will manifest itself at the LHC.

Which theory will provide a complete description of Nature?

Several theories of New Physics have been proposed in the past decades. Not all of them are viable, though: the current experiments constrain the models, and any new framework must reproduce the predictions of the mathematics built so far (tested to 0.1% precision in some cases!). At the same time, the creativity of physicists, who broke barriers of knowledge using imagination and evidence, is still alive. Other bloggers mentioned the funniest names for experiments; how about the ones for new theories? The two most popular visions of New Phenomena are Extra Dimensions and SUSY (the nickname for SUperSYmmetry). And lately theorists have proposed Hidden Valleys, Little Higgs, Quirks, and more!

The math behind them is fascinating, and each model is able to explain phenomena the Standard Model fails to account for. However, from an experimental point of view, their signatures can be strikingly similar. Once a discovery is made, i.e. a signal is observed at the LHC, it may take decades, if not another type of accelerator, to identify the model and solve the puzzle. The strategy will be to progress step by step, proposing new models and testing their validity. This process requires experimentalists who know how to operate the detector and carry out a data analysis, and theorists capable of providing feedback and ready to modify their models according to what the data indicate. See this page about experimentalists versus theorists: http://public.web.cern.ch/public/en/People/Experimentalists-en.html

It will be an intense process of interaction between the two communities. To improve the efficiency of our interactions, conferences and workshops are organized all around the world. A few are major conferences where the experiments present their latest results, but most are smaller conferences or workshops intended as a working table for the physicists attending them. Very informal and oriented towards maximizing the exchange of information, they usually last several days, with plenary and parallel sessions. Smaller conferences can host fewer than 100 physicists, while major ones can register several hundred!

Recently there has been a push towards combining theory and experimental workshops, to build the open discussion and cooperation between the two communities that is crucial for our understanding of Nature. A good example was a workshop held at TRIUMF at the beginning of the month, which I had the pleasure to attend (a very good opportunity for me to spend time at my own institution).

It was a very sunny week in Vancouver!


Data analysis in simple words

Tuesday, May 5th, 2009

And here I am, back at CERN! In the past weeks I spent most of my time traveling and attending conferences, which is one exciting component of our job. IFAE (the Italian high-energy physics conference) was held in Bari, in the south of Italy.

Its fame in the world comes not only from the wonderful cathedral, but mainly from the “orecchiette”, Puglia’s traditional ear-shaped pasta. As is well known, you can get “orecchiette” while walking through the narrow and charming streets of the old town!

(These nice pictures are from good friends of mine whom I had the pleasure to see again)


The conference covered a broad range of physics results, spanning from astrophysics to nuclear physics and finally to particle physics. Let me spend some time now explaining how we actually carry out a data analysis and produce our results. The process is long and complex, involving data taking, high-level programming and, finally, extensive data understanding.

First the events are collected on tape, i.e. all signals from all sub-detectors are saved and used for “reconstructing” the objects in the event (from a set of hits recorded by the muon chamber we can infer that a muon passed through that chamber, for instance). At this stage we know the nature of each object (whether it is a muon or an electron, etc.) with a high level of confidence. Once we have such pictures of all events, we select those which resemble the event we are looking for. If the particle we hunt for decayed into two muons and two neutrinos, we would select only events with two muons and missing transverse energy (neutrinos translate into missing transverse energy in detector language).

However, the particle we look for is not the only one that decays into two muons and two neutrinos; many other (non-interesting) processes do as well. And generally the non-interesting processes happen at a higher rate than the interesting ones! We might be left with millions of possible (candidate) events while we expect our particle to contribute just a few hundred events (or fewer). How do we dig these events out? For each given event, we don’t know which process it corresponds to; we only know the rates of the processes, so our approach needs to be probabilistic. In this framework, we look for deviations from the rates we expect. Typically we measure the rates of the background processes that populate our pool of candidate events. The rates are known within some uncertainty, and in most of the current searches the uncertainty is larger than the signal itself. The plot below gives you an example.
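To make the selection step concrete, here is a toy sketch in Python; the event format, variable names and the 25 GeV threshold are all invented for illustration, since real analyses run inside dedicated software frameworks:

```python
# A toy sketch of the event-selection step described above
# (hypothetical event format; real analyses use dedicated frameworks).

events = [
    {"muons": 2, "electrons": 0, "missing_et": 45.0},  # energies in GeV
    {"muons": 1, "electrons": 1, "missing_et": 10.0},
    {"muons": 2, "electrons": 0, "missing_et": 3.0},
]

def select(event, min_missing_et=25.0):
    """Keep events with two muons and sizable missing transverse energy,
    the signature of a decay into two muons and two neutrinos."""
    return event["muons"] == 2 and event["missing_et"] > min_missing_et

candidates = [e for e in events if select(e)]
print(f"{len(candidates)} candidate event(s) out of {len(events)}")
```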

Candidate events observed in data compared to the expected background

The histogram presents the number of candidate events we observe in data (black markers) compared to the number of events we expect from our background model (the meaning of the x-axis is not crucial now). The dashed area indicates the uncertainty on the prediction. If we focus on the first bin, we expect a number of events varying between 2800 and 4000, and we observe 3600. If the signal caused a deviation of, say, 50 events, we would not be able to see it by simple counting.
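In numbers, taking the first bin of the plot as an example and a hypothetical 50-event signal:

```python
# Rough arithmetic behind "a 50-event signal is invisible by counting alone".
# Numbers taken from the first bin of the plot above.

expected = 3400.0     # background prediction (centre of the 2800-4000 band)
uncertainty = 600.0   # half-width of the uncertainty band
observed = 3600.0
signal = 50.0         # hypothetical signal contribution

print((observed - expected) / uncertainty)  # ~0.33 sigma: consistent with background
print(signal / uncertainty)                 # ~0.08 sigma: far below any sensitivity
```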

To overcome these experimental limitations, advanced analysis techniques have been studied and finally, after careful consideration, deployed in the searches. These techniques are not new, but they were imported into the field fairly recently. They are machine learning tools, ranging from Neural Networks to Boosted Decision Trees. Let me steal a concise description from Wikipedia: “machine learning is the sub-field of artificial intelligence that is concerned with the design and development of algorithms that allow computers to improve their performance over time based on data, such as from sensor data or databases. A major focus of machine learning research is to automatically produce (induce) models, such as rules and patterns, from data. Hence, machine learning is closely related to fields such as data mining, statistics, inductive reasoning, pattern recognition, and theoretical computer science.”

The basic idea is to teach an artificial brain to distinguish the signal from the background at levels that the experiments could not otherwise reach. This gave a boost to the sensitivity of the current experiments at the Tevatron. The top quark is mainly produced in pairs from gluons at a high rate; however, it can also be singly produced in processes involving the exchange of a W boson. While in the first case we end up with high-energy events containing a large number of jets (the experimental manifestation of quarks, in this case) and leptons, in the second case the amount of energy is smaller and the number of objects is reduced (one top quark decays instead of two). As a consequence, the second process is extremely challenging from an experimental point of view. The Tevatron experiments, CDF and D0, invested the past years in looking for that process!

Teams of physicists analyzed the data produced in proton-antiproton collisions to build the background modeling and construct a solid “single top” search. Depending on the mode in which the single top decays, they could look for electrons, or muons, or jets and missing transverse energy. Each decay mode needs to be distinguished from a different background source, due to other uninteresting processes or detector mis-measurements. Finally, the separate analyses are combined into a single sensitive search using machine learning techniques. The observation was announced in March, 15 years after the pair production of top quarks was first observed! The measured rate of single top production is in agreement with the Standard Model expectations.
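The Tevatron analyses used their own dedicated tools; just to convey the idea, here is a minimal sketch of training a boosted decision tree with scikit-learn on invented toy data (the two features and their distributions are assumptions for illustration):

```python
# A minimal sketch of the machine-learning step: train a boosted decision
# tree to separate "signal" from "background" on invented toy data.
# Real searches use carefully modelled samples and many input variables.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(seed=1)
n = 5000
# Toy distributions of two kinematic features (say, a jet energy and the
# missing transverse energy): the signal tends to be harder.
background = rng.normal(loc=[30.0, 20.0], scale=[10.0, 10.0], size=(n, 2))
signal = rng.normal(loc=[45.0, 35.0], scale=[10.0, 10.0], size=(n, 2))

X = np.vstack([background, signal])
y = np.array([0] * n + [1] * n)  # 0 = background, 1 = signal

bdt = GradientBoostingClassifier(n_estimators=100, max_depth=3)
bdt.fit(X, y)

# The classifier output, rather than any single kinematic variable,
# becomes the final discriminant in which we look for an excess.
print(f"training accuracy on the toy sample: {bdt.score(X, y):.2f}")
```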



Collisions in the Universe

Thursday, April 23rd, 2009

Last week astronomers observed the most crowded collision in the Universe! Four clusters of galaxies poured into a crowded 13-million-light-year-long stream of galaxies. (Eventually our Milky Way will merge with the neighboring Andromeda galaxy as well!) These galactic events can probe the existence of so-called “Dark Matter”. Particle physicists have developed a superb model with predictions confirmed at the per-mille level. But how much of the Universe does the Standard Model explain? Just 4%! The rest is out there to be discovered.

I am not an expert in astronomical measurements, but these events do grab my attention. Let’s start from a simple definition of Dark Matter: Dark Matter is matter undetectable by its emitted radiation; it is not visible. As of today, astronomers measure its contribution to the total mass of the Universe to be ~25%.

How did we infer the existence of Dark Matter in the first place? Just as the Earth rotates around the Sun due to gravitational attraction, stars in galaxies rotate around the center of the galaxy. However, the amount of visible mass is not enough to explain the rotational velocities, so a large component of non-visible mass must exist. This is one of the proofs, along with the orbital velocities of galaxies in clusters of galaxies, and gravitational lensing.
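A quick back-of-the-envelope calculation shows the problem; the visible mass below is a round illustrative number, roughly 10^11 solar masses:

```python
# Why flat rotation curves imply unseen mass: with only the visible mass
# concentrated near the centre, Newtonian gravity predicts v ~ 1/sqrt(r),
# yet measured velocities stay roughly constant at large radii.
import math

G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
M_visible = 2e41  # rough visible mass of a galaxy, kg (illustrative)

for r_kpc in (5, 10, 20, 40):
    r = r_kpc * 3.086e19              # kiloparsecs to metres
    v = math.sqrt(G * M_visible / r)  # Keplerian prediction
    print(f"r = {r_kpc:>2} kpc -> v = {v / 1000:.0f} km/s")

# The prediction falls with radius, while observations show a velocity
# that stays roughly flat, so the enclosed mass must keep growing.
```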

Gravitational Lensing

The presence of this mass can in fact be revealed by the fascinating phenomenon of “gravitational lensing”. Imagine a star far from the Earth. If the light from the star travels without encountering obstacles on its way to the Earth, we see a spot of light. However, if there is a large amount of mass (say, a galaxy) between us and the star, the light from the star changes its path (see the picture on the left). The gravity of the intervening galaxy acts like a lens that redirects the light rays: it bends the light. The gravitational lens does not create one single image of the star, but multiple ones. It can also distort the disk-like shape of the star into an ellipse. If the intervening galaxy were perfectly symmetric with respect to the line between the star and the Earth, we would see a ring of stars!
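The size of the bending can be estimated from the standard deflection formula of general relativity, alpha = 4GM/(c^2 b), where b is the distance of closest approach; for light grazing the Sun this gives the famous 1.75 arcseconds:

```python
# Deflection angle of light passing a mass M at impact parameter b:
# alpha = 4GM / (c^2 b), twice the naive Newtonian value.
import math

G = 6.674e-11     # m^3 kg^-1 s^-2
c = 2.998e8       # speed of light, m/s
M_sun = 1.989e30  # solar mass, kg
R_sun = 6.96e8    # solar radius, m (light grazing the Sun)

alpha = 4 * G * M_sun / (c**2 * R_sun)                  # radians
print(f"{math.degrees(alpha) * 3600:.2f} arcseconds")   # ~1.75, as measured
                                                        # in the 1919 eclipse
```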

Image of gravitational lensing

What happens when two clusters of galaxies collide? By now we know that a cluster of galaxies is a gravitationally bound object, and the densest part of the Universe. Stars constitute ~2% of its total mass, while the so-called “intergalactic gas” contributes ~15%. The remaining mass is still in the Dark.
The clusters collide at speeds of millions of miles per hour, and several observatories take pictures of these titanic events.

The Hubble Space Telescope, the Magellan Telescopes and the Very Large Telescope provide a map of the total mass (dark and ordinary) using visible light, with gravitational lensing indicating the location of the Dark Matter component (blue). As an example, you can see the pictures of the well-known “Bullet Cluster”, observed in 2006. The Chandra data enabled astronomers to accurately map the position of the ordinary matter by itself, mostly in the form of hot gas, which glows brightly in X-rays (shown in pink).

Image of Dark Matter (above); Image of visible mass (below)

As the clusters travel in opposite directions, they eventually collide. The picture below shows the mass distribution after the collision. The ordinary matter slowed down compared to the Dark Matter, and the two components separated. This is due to the different forces exerted on the dark and visible mass: Dark Matter particles interact with each other only very weakly or not at all, apart from the pull of gravity, while ordinary matter experiences a larger “friction” and therefore slows down during the collision.

The separation provides observational evidence for dark matter.

The "Bullet cluster" collision

The "Bullet cluster" collision

What’s the Nature of Dark Matter?

A variety of cosmological data suggests that Dark Matter may consist of relics of particles present in the early Universe. Currently the best theory to explain the origin of Dark Matter is Supersymmetry (SUSY), which predicts the existence of a “superpartner” for each Standard Model particle. The lightest superpartner of the neutral bosons (the Z and the Higgs bosons), called the “neutralino”, is an excellent candidate for this elusive form of matter. Observing the SUSY particles would be crucial for a deep understanding of the Universe. Superparticles could be generated in proton-antiproton collisions at the Tevatron and in proton-proton collisions at the LHC.
The experiments at the Tevatron accelerator, CDF and D0, are desperately seeking a sign of SUSY in the collisions stored on tape; however, these particles, if they exist, might be heavier than 100 times the proton mass. ATLAS and CMS are tuning their tools to be ready for the incoming LHC collisions!


How does a detector for high energy physics work?

Saturday, April 11th, 2009

Last week I visited two of the LHC caverns, one hosting the ATLAS detector and the other one the CMS detector.

Front view of the ATLAS detector (not fully assembled yet)

Both ATLAS and CMS are so-called “multi-purpose” detectors, as the measurements they take are suitable for a broad physics programme (some of which will be discoveries!). This is the reason why the underlying design is similar in both cases, and for that matter similar to that of CDF and D0, the Tevatron experiments at Fermilab.

To get a better grip on how we actually study a physics process, it is important to know what we actually observe when a proton-proton collision happens in the LHC; this, in turn, requires some knowledge of matter and forces.

Myself at CMS!

Our current understanding of Nature is summarized in the “Standard Model of Particles and Fields” (SM). The SM is very elegant and sophisticated from a mathematical point of view. It describes any form of matter we observe in terms of 12 fundamental particles.

Particle content of the Standard Model

In fact, our bodies, the planets, the stars … are all formed by combinations of six leptons (green) and six quarks (red). These fundamental particles interact by means of force-carrying particles called bosons (violet). Every phenomenon observed in Nature can be understood as the interplay of these fundamental particles and forces (“g” stands for gluon; gravity is not in the picture yet!). During a collision at the LHC, the large amount of energy available will be converted into mass, generating not only fundamental particles but also a zoo of composite ones. However, all (known) particles formed of the t, b and c quarks, along with the tau lepton, the W, and the Z, are heavy and decay immediately.
Surprisingly, the only particles flying through our detectors will be the light leptons (e, mu, and the three neutrinos), the photon (gamma) and various combinations of light quarks (u, d, s) with gluons. How do we measure their identity and properties?
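Before that, here is the particle content just described as a small lookup table (names only; a mnemonic, not a physics library):

```python
# The particle content of the Standard Model, as described above.
standard_model = {
    "quarks":  ["up", "down", "charm", "strange", "top", "bottom"],
    "leptons": ["electron", "muon", "tau",
                "electron neutrino", "muon neutrino", "tau neutrino"],
    "bosons":  ["photon", "gluon", "W", "Z"],  # the force carriers
}

for family, members in standard_model.items():
    print(f"{family}: {', '.join(members)}")
```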

Since I belong to the ATLAS Collaboration, let me present its detector as an example. ATLAS is composed of several sub-detectors (a sort of cylindrical “onion” around the collision point), each one capable of measuring a given property of the particles passing through. Only after combining the information provided by all the sub-detectors are we able to understand which particle was actually produced.

Computer model of ATLAS

Note the size of ATLAS, compared to the people on the left! The protons travel in the central pipe in both directions and collide in the center of the detector.

Particle interaction with material

The innermost part extends radially from a few centimeters to 1.2 meters, and is 7 meters long along the beam pipe. It is the ATLAS tracking system, designed and built to measure the track, i.e. the path followed by a particle flying out of the collision. The basic idea is to have several cylindrical layers of active material (silicon or gas) surrounding the interaction point. When a charged particle crosses any of these layers, it interacts and creates a signal which can be detected; the track is obtained from this set of points. The entire tracking system is immersed in a magnetic field which bends the particle trajectories, and the bending indicates the charge and the momentum of the particle. The system has three components: the Pixel detector, the SCT and the TRT. The first two are detectors built with tiny silicon (semiconductor) elements, while the TRT is a set of straws filled with gas. In total the Pixel detector has 80 million readout channels, i.e. elements capable of measuring a signal; the SCT has 6.2 million readout channels and a total area of 61 square meters; finally, the TRT consists of 351,000 straws!
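The link between bending and momentum is captured by a textbook rule of thumb, pT [GeV/c] ≈ 0.3 · B [T] · R [m], where R is the radius of curvature of the track; using the ~2 T field of the ATLAS inner-detector solenoid:

```python
# Transverse momentum from track curvature in a solenoidal field:
# pT [GeV/c] ~ 0.3 * B [T] * R [m]  (rule of thumb for unit charge).

B = 2.0  # tesla, roughly the ATLAS inner-detector solenoid field

for radius_m in (1.0, 10.0, 100.0):
    pt = 0.3 * B * radius_m
    print(f"curvature radius {radius_m:>5} m -> pT ~ {pt:.1f} GeV/c")

# Stiffer (higher-momentum) tracks bend less, so measuring the curvature
# becomes harder as the momentum grows.
```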

The next step is to measure the energy of the particles. This is achieved by placing “calorimeters” around the tracking system. A calorimeter is made of metal sheets (“absorbers”) interleaved with a detection medium. Whenever a particle meets the absorber, it interacts with the material and produces a shower of secondary particles which are detected in the detection medium. The interaction depends on the nature of the particle: for instance, electrons interact by exchanging photons, while particles composed of quarks can also exchange gluons. For this reason, the calorimeter has two sectors: the “electromagnetic” sector to detect electrons and photons, and the “hadronic” sector to measure the energy of the other particles (except muons). In the “electromagnetic” calorimeter the absorbers are made of lead and the detection medium is liquid argon; the latter needs to be maintained under an intense electric field (2000 V over 2 mm) at -180 degrees Celsius. The “hadronic” compartment uses either liquid argon or scintillating tiles as the active medium, and either steel or copper as the absorber. The liquid-argon component by itself has 170,000 readout channels.
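Why does a calorimeter become relatively more precise at higher energies? A common parameterisation of the resolution is a stochastic term falling like 1/sqrt(E) plus a constant term, added in quadrature; the coefficients below are illustrative, roughly of the order quoted for electromagnetic calorimeters of this type:

```python
# Typical calorimeter resolution: sigma/E = a/sqrt(E) (+) b in quadrature.
# Illustrative coefficients: ~10%/sqrt(E) stochastic, sub-percent constant.
import math

a = 0.10   # stochastic term (E in GeV)
b = 0.007  # constant term

for E in (10.0, 100.0, 1000.0):  # GeV
    rel = math.sqrt((a / math.sqrt(E)) ** 2 + b ** 2)
    print(f"E = {E:>6.0f} GeV -> sigma/E = {rel * 100:.1f}%")
```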

By the time we get outside the calorimeter, almost all particles have been absorbed. The only ones capable of traversing so much material without being stopped are muons. To detect them, a muon spectrometer surrounds the calorimeters and measures the muon trajectories, charge and momentum. This happens inside a volume of magnetic field produced by superconducting toroidal magnets. The detection elements are thousands of metal tubes equipped with a central wire and filled with gas. As a muon passes through these tubes, it leaves a trail of ions and electrons which drift to the walls and the center of the tube. By measuring the time it takes for these charges to drift from their starting point, it is possible to determine the position of the muon as it passed through.
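The time-to-position idea in one line of arithmetic; a constant drift velocity is assumed here purely for illustration, while real chambers use a calibrated, non-linear time-to-distance relation:

```python
# Distance of the muon track from the wire ~ drift velocity * drift time.
# The ~20 micron/ns drift velocity is an assumed, illustrative value.

v_drift_um_per_ns = 20.0

for t_ns in (100, 350, 700):
    r_mm = v_drift_um_per_ns * t_ns / 1000.0
    print(f"drift time {t_ns:>3} ns -> track {r_mm:.1f} mm from the wire")
```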

Computer simulation of tracks emerging from a collision in the ATLAS detector. Tracks can be associated to energy deposits.

Finally, when all the pieces are glued together, a collision might look like this! We can also measure the so-called “missing transverse energy”; more on this next time!


“Vague but exciting”

Thursday, April 2nd, 2009

The WWW turns 20! And CERN celebrates the birthday by inviting the father of the web, Tim Berners-Lee. On several occasions during the event, Berners-Lee and the CERN Director General had the opportunity to point out the role of CERN in the birth and success of the web. Their words summarize well what CERN, like TRIUMF or Fermilab, really is: a pool of highly motivated people coming together to push forward the understanding of Nature. And not only that: by the late ’90s, CERN was already the largest and most geographically distributed scientific community. But it is not only size and diversity which lead to great advancements in the field. What drives us, as particle physicists, is a culture of sharing knowledge, of “openness”.

Thousands of people are actively contributing just to build one of the experiments around the ring (be it ATLAS or CMS; ALICE and LHCb are slightly smaller collaborations). Building an experiment is a whole experience by itself. We start from designing and testing the detectors. Each component will have to operate in extreme experimental conditions (high radiation, for instance) and to provide excellent performance (fast response, good accuracy, etc.). After the “R&D” phase is completed and the production starts, groups of experts sit at the same table to design the test procedures for the hardware: most of our sub-detectors won’t be accessible once assembled in the cavern, and sophisticated and exhaustive tests need to guarantee the long-term reliability of the hardware. The procedure is similar for the electronics reading the data coming out of the detector. And this is just the beginning of the chain! The data will be transferred and analyzed; a complicated forest of databases is put in place to store the information about run conditions, calibration and alignment. Data understanding has to be carried out online (“live”) to provide immediate feedback, while detailed studies might take years. Finally, when the tools are ready, our enormous amount of data is distributed worldwide using the “grid” and the searches can start.

The creativity of the whole collaboration springs from the free interaction among its members. The management holding the community together for the accomplishment of its goals does not have a top-down structure. It is rather formed by colleagues with scientific skills and a strong capability for building consensus, as voted (directly or indirectly) by the collaboration. This makes the world of particle physics free from any influence but the striving for scientific achievement. The laboratory becomes an optimal place for a spark, for a brilliant idea to be born and to spread. Of course the world would be different if, for instance, any royalty had been imposed on the web. Similarly, after an extremely rigorous scrutiny, all our scientific results are published and represent the common foundation for the next step in research. While the world is going through the so-called “Globalization 3.0” (you can read “The World is Flat” by T.L. Friedman, a debatable but interesting view), the British Library (and the New York Times, for that matter) is undertaking a collecting and archiving project to preserve information stored solely on short-lived web sites, to prevent a “black hole for future historians and writers”!


Geography of ATLAS, one of the experiments at the LHC


Welcome

Sunday, March 29th, 2009

This is the first time I blog, and I expect it to be a very enjoyable experience, for me and, hopefully, for the reader! While thinking of the first post, I recalled a “colloquium” held at CERN a couple of weeks ago. (For Quantum Diaries newcomers, CERN is the world’s largest particle physics laboratory. Located on the border between Switzerland and France, it hosts particle accelerators and was the site of several milestone discoveries in our field.) The invited speaker was the Intel CEO. Despite his familiarity with high tech, he did not hide a deep fascination when exposed to the technological challenges we face to design, build, commission, operate, and maintain our instruments.

The idea of the Large Hadron Collider (LHC) dates back to the early 1980s, while Spring 1992 marked its real beginning. The machine was finally turned on on September 10th, 2008, but activities had to be interrupted soon after due to an electrical failure. Operations will resume in late 2009. But what is the LHC?

The LHC accelerator

It is a circular particle accelerator sitting 100 m underground, with a circumference of 27 km. At full power, trillions of protons will race around the LHC ring 11,245 times a second, traveling at 99.99% of the speed of light. Two beams of protons will each travel at a maximum energy of 7 TeV (tera-electronvolts), corresponding to head-on collisions at 14 TeV. Altogether, some 600 million collisions will take place every second. The LHC is also the emptiest space in the Solar System: to avoid collisions with gas molecules inside the accelerator, the beams travel in an ultra-high-vacuum pipe with a pressure ten times lower than that on the Moon. And the LHC is a machine of extreme hot and cold: when two beams of protons collide, they will generate temperatures more than 100,000 times hotter than the heart of the Sun, concentrated within a minuscule space, while the machine itself is kept at a temperature of -271.3 C (1.9 K), even colder than outer space. The numbers speak for themselves: the LHC is one of the greatest endeavours in science!
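You can check some of these numbers yourself (the 26,659 m circumference is the published figure):

```python
# Checking the quoted numbers: a proton at 99.99% of the speed of light
# going around the 27 km ring.

c = 299_792_458.0          # speed of light, m/s
circumference = 26_659.0   # LHC circumference, m

v = 0.9999 * c
turns_per_second = v / circumference
print(f"{turns_per_second:.0f} turns per second")  # ~11245, as quoted

# Two 7 TeV beams meeting head-on give the full collision energy:
print(f"{7 + 7} TeV centre-of-mass energy")
```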

Come and visit CERN, if you can: you will have a unique opportunity before the door is shut for the years to come. Otherwise, you can learn more at http://cdsmedia.cern.ch/img/CERN-Brochure-2008-001-Eng.pdf. This gigantic adventure is, however, just the beginning. When the protons collide, they fragment into constituents which subsequently combine to form new particles. Each such “event” is recorded by detectors sitting along the ring. Detectors to come in the next posts!
