Anadi Canepa | TRIUMF | Canada



Friday, December 18th, 2009

On December 17 at 6 PM, the machine operators switched off the beam in the LHC, sending us greetings through our communication channel (you can follow the status of the beam in real time here: http://op-webtools.web.cern.ch/op-webtools/vistar/vistars.php?usr=LHC1).

It has been an extraordinary 2009. The LHC is back, and it has performed beyond expectations in the past couple of weeks (yes, it has been just a couple of weeks). The machine was able to provide stable beams (which means good-quality beams with no risk of damaging the detectors) and the highest-energy collisions in the world, breaking the Tevatron record with collisions at 2.36 TeV. ATLAS successfully collected hundreds of thousands of candidate collisions. What you see in the beautiful visual representation of a reconstructed event is the production of “jets”.


Even though the LHC and the experiments were built with the aim of discovering new phenomena, standard “strong processes” are the ones happening most frequently, and thus the ones produced and observed right from the beginning of data taking. What does “strong” stand for? The protons in the two circulating beams are made of quarks (the name “quark” was taken by Murray Gell-Mann from the book “Finnegans Wake” by James Joyce).

Quarks are – based on today’s knowledge – elementary particles (they are not composed of other particles), and there are six of them, grouped in pairs by common properties. The lightest quarks are called “up” and “down”, slightly heavier ones are the “charm” and “strange”, and finally come the “bottom” and “top”.

The mass of the quarks ranges from tiny (about 0.001 times the mass of the proton) to the largest mass observed in particle physics so far (the top quark, about 180 times the mass of the proton).

Each quark is accompanied by an antiquark with the opposite charge. What is unusual is their fractional electric charge: for instance, the top has charge +2/3 while the bottom has −1/3. The youngest quark is the top, discovered at the Tevatron in 1995. Quarks interact via quantum chromodynamics (QCD), the theory of the “strong interaction”. In the same way as electrons interact via quantum electrodynamics (QED) thanks to their electric charge, quarks interact via their “color” charge. In fact, quarks carry colors (red, green and blue), which, however, have nothing to do with our everyday concept of color. One important characteristic of QCD is called “confinement”: the force between quarks does not decrease as the quarks separate. As a result, we can never see quarks separately, but only bound into composite particles (the top is exceptional in this respect). While travelling through the detectors, quarks generate showers of such composite particles, called “jets”.

Our collaborators in ATLAS scanned the data collected so far looking for signs of light particles, composed of quarks, being produced. In the first runs of data taking, we could already re-discover three of the quarks. We observed a bound state of the light u and d quarks, and once more data had been collected, the strange quark – in the form of a resonance (a peak) – appeared on our screens!

Finally, let me conclude this fantastic year with the words of the CERN Director General: “It has been a fantastic year for the LHC […] I want to underline the fact that it has been made possible by the unique global collaboration that is particle physics. It has been truly heart-warming to see the community pulling together to achieve its goals”.

See you in 2010!


Beam … beams in the LHC Tunnel

Sunday, November 29th, 2009

The past 10 days have been extremely exciting for everyone at CERN or working for the CERN experiments. The accelerator injected first one beam, then two beams into the tunnel; bunches of protons circulated in each direction and were made to collide in the centres of the different experiments. During these first tests, the energy of the beams in the LHC was the same as at injection (450 GeV); nevertheless, it proved to be a very successful performance of the machine and of the experiments. Because of the low energy, no new particles can be produced, but the data collected are crucial for understanding the detector timing. The electronics sitting in the detector reads the signal produced by the particles flying through. When should the readout start? The accelerator propagates a signal (the “clock”) to indicate when collisions are likely to happen. However, if the detector is not properly timed, the electronics won’t read any signal out of the different sensors. Roughly 10 days ago, so-called “splash events” were provided by the accelerator. In this case, the beam collimators at the entrance of the experiments are closed; one beam hits them, producing a splash of particles that illuminates the detector. Understanding this type of event is challenging, because it is geometrically different from what the experiment is designed to record: the particles fly in from the side of the detector, not radially from the centre. The electronics and the software needed to be adjusted to use these events as “calibration” tools.



The appearance of signal in the detector caused quite some excitement in the ATLAS Control Room! Even though this is the second time I have seen an experiment start up (after my CDF experience), I was extremely happy to feel the same empathy and the same joy. People in operations have devoted years (decades?) of their lives to seeing the apparatus working. Being in the same room with experts staring at their systems taking data was extraordinary.




The tests and the fast processing of the splash events were successful. ATLAS was finally ready for the first LHC collisions of 2009! The LHC and the four experiments joined in a celebration on Thursday: during a very crowded seminar held in the CERN auditorium, the status of the machine and of the experiments and the lessons learnt from the first events were shared … and, above all, the excitement for the new beginning.


A closer look at the LHC progress …

Thursday, November 12th, 2009

More than one year has passed since the accident, which required an extremely careful repair job and a new evaluation of the possible risks associated with running at high energy. The machine experts have completed their activity, and now particles are back in the LHC. Thousands of people belonging to the four LHC experiments (ALICE, ATLAS, CMS, LHCb) keep clicking on the LHC web page to follow the news, and meet and discuss at the CERN cafeteria and around the world. Indeed, the machine is following its schedule. All tests carried out so far were successful, making collisions possible by the end of the year. Here is a bullet summary of what happened in the past months:

  • The sectors were cooled down during the first half of October; they reached the cryogenic temperature of 1.9 K. This marked an important step towards the final commissioning of the machine. At the same time, experts started powering the magnets, so the machine should be fully powered soon after the cool-down is completed.
  • On Friday 23 October, a first beam of lead ions entered the clockwise beam pipe of the LHC. The beam entered at point 2, where the ALICE detector is sitting, and was dumped before point 3. One more step proved successful.
  • The following day, the equivalent test was carried out on the anticlockwise beam pipe of the LHC. The first proton beam was injected close to the LHCb experiment and dumped before point 7.
  • Operators are also progressively increasing the sector current up to 2,000 amperes. This current allows the steering of beams at about 1.2 TeV (this is called “phase 2” of the LHC).
  • Bird or raccoon? Nature decided to interfere with operations. On Tuesday 3 November, a bird carrying a piece of baguette caused a short circuit in an electrical installation supporting two sectors, causing an interruption to the cryogenics system. The community is not new to these “interventions”. Here is an old report from the Tevatron collider (2006/06/19): “Operations reported a raccoon attack in the Linac gallery. It seemed to be a coordinated effort. Fortunately, by 1:53 AM, a joint force of operators and Pbar experts managed to drive the raccoons out of their hastily made fortifications.”
  • Finally, particles travelled half of the LHC ring. On Saturday evening, November 7th, after passing through the LHCb detector, protons almost reached the CMS experiment. Even though they were dumped in a collimator just upstream of CMS, splashes illuminated the detector. Champagne bottles popped in the CMS control room!


The short period of collisions at the injection energy of 450 GeV per beam will occur soon; the four detectors will then tune their tools for the exciting new era ahead of us.


The new LHC start-up is getting closer and closer

Saturday, October 3rd, 2009

We are one month away from the beginning of a new era in High Energy Physics! Commissioning of the LHC is progressing, and the new start-up is scheduled for November. The LHC consists of eight so-called “sectors” which can be tested separately, and six out of eight are already at the operating temperature of a few kelvin.


Why is such a low temperature needed? Let’s follow the path of the particles within the accelerator chain. It all begins with hydrogen atoms, stripped of their electrons and injected into LINAC2, a linear accelerator where an electric field accelerates the positive particles, the protons, to roughly 1/3 of the speed of light. The following stage consists in splitting the particle packets and feeding them into the Booster, a circular accelerator where the packets are squeezed and their speed increased to 92% of the speed of light. A magnetic field bends the trajectory while the pulsating electric field raises the energy. The packets are then recombined and sent to the PS (Proton Synchrotron), a circular accelerator of about 600 m circumference, where the protons make a transition: their speed reaches 99% of the speed of light, and since it cannot increase much further, the additional energy acquired in the PS converts into mass. The proton mass is now 25 times larger than its mass at rest! The energy, measured in electron volts, corresponds to 25 GeV (1 GeV = 10^9 eV). The protons are then ready to be channeled into the 7 km circumference SPS (Super Proton Synchrotron), which pushes the energy up to 450 GeV. The next transfer leads the protons into the gigantic LHC, nestled between the Jura and the Alps, 100 m deep underground, with a circumference of 27 km. Additional energy is added such that each proton reaches 7 TeV (1 TeV = 10^12 eV), leading to a total energy per collision of 14 TeV. (During 2009–2010 the machine will operate at a maximum of 7 TeV in total.) The bending is possible because about 1,200 dipole magnets are powered along the path. The only way to have magnets strong enough to bend the high-energy beam is to have them cold, which means superconducting magnets.
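As a rough check of these numbers: the Lorentz factor (how many times heavier than its rest mass a proton effectively becomes) is just the total energy divided by the rest energy of 0.938 GeV, and the speed follows from it. A small sketch using the stage energies quoted above:

```python
import math

M_P = 0.938  # proton rest energy in GeV

def gamma_and_beta(total_energy_gev):
    """Lorentz factor (effective mass increase) and speed as a fraction of c
    for a proton of the given total energy."""
    gamma = total_energy_gev / M_P
    beta = math.sqrt(1.0 - 1.0 / gamma**2)
    return gamma, beta

for stage, energy in [("PS", 25.0), ("SPS", 450.0), ("LHC", 7000.0)]:
    g, b = gamma_and_beta(energy)
    print(f"{stage:4s} {energy:7.0f} GeV  gamma = {g:7.1f}  v/c = {b:.9f}")
```

At 25 GeV the proton is indeed roughly 25 times heavier than at rest and moves just above 99% of the speed of light, as stated above; at 7 TeV the factor grows to several thousand while the speed barely changes.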

One of the superconducting magnets is lowered into the LHC tunnel via a specially constructed pit


In total, 1600 superconducting magnets are installed in the LHC with most weighing over 27 tonnes. “96 tonnes of liquid helium is needed to keep the magnets at their operating temperature of 1.9 K, making the LHC the largest cryogenic facility in the world at liquid helium temperature.” After the beam accident of last year, accelerator experts fixed the magnet inter-connections which, in normal superconducting state, should exhibit negligible electrical resistance.

15 m-long dipole magnets are seen lined up. To the right of the magnets the test hall can be seen.


Over 10000 high-current superconducting electrical connections were examined and conditions are now safe to start operating the machine!

A screenshot showing lead ions in the transfer line from the Super Proton Synchrotron (SPS) to the LHC



Discovery or … ?

Wednesday, September 9th, 2009

If you are a frequent visitor of blogs on High Energy Physics, you might know that our calendar is filled with a plethora of different meetings. We even have meetings to discuss how to reduce the number of meetings! This structure guarantees constant communication between experts and non-experts and allows for constant improvement in one’s knowledge about the experiment, the physics, etc. Besides meetings, ATLAS also hosts the “ATLAS e-News” (Google can find it for you), which has a somewhat different goal: the idea is to keep all ATLAS members informed about what goes on in the experiment, the latest news, etc. Here is a summary of what we’ve just published about our data quality control.

The assessment of data quality is an integral and essential part of any High Energy Physics experiment. It is even more so for ATLAS, given the extreme detector complexity and the challenging experimental environment. Ultimately, these a priori checks assure the scientific validity of the data. The status of ATLAS data taking is evaluated based on information from the data acquisition and trigger systems, and on the analysis of events reconstructed online and offline; together these constitute the Data Quality Assessment. Let me explain a little what we mean by this. Data are recorded by the ATLAS detector: when a particle flies through the calorimeter, for instance, it interacts with the material and produces a signal which gets read out by the calorimeter electronics. If the event is interesting, it fires the trigger system, which causes the event to be saved and stored for future data analysis. What about the jargon words “online” and “offline”? By “online” we mean the data processing carried out while collisions happen (almost in real time), while “offline” indicates any study performed once the data are stored. While taking data, quality control is essential to make sure you are actually recording meaningful data. We should always bear in mind that millions of channels are read out, and any of these could have an intermittent failure at any level. In the almost-real-time analysis, raw detector data is accessed in the event flow and examined for hardware and configuration failures. Over 10 million histograms are produced by more than 150 monitoring applications, and the ATLAS software automatically checks 50,000 histograms every minute. It also visualizes the results in graphical form, adopting a hardware view to facilitate their interpretation. Once the data are stored, a more detailed validation is carried out in computer farms where the full event gets reconstructed; this happens within an hour of the beginning of data taking. An even deeper study provides results within 24 hours. All this will become routine as soon as the LHC turns on!
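The idea behind those automatic histogram checks can be pictured with a toy sketch. This is not the actual ATLAS data-quality framework — just a minimal illustration, with made-up numbers, of comparing an observed distribution against a reference and flagging a dead read-out channel:

```python
import random

def chi2_per_bin(observed, reference):
    """Average per-bin chi-square between an observed histogram and a
    reference one; a large value signals a problem worth a human look."""
    chi2, ndf = 0.0, 0
    for obs, ref in zip(observed, reference):
        if ref > 0:
            chi2 += (obs - ref) ** 2 / ref
            ndf += 1
    return chi2 / max(ndf, 1)

random.seed(1)
reference = [100] * 20                               # expected flat occupancy
good_run = [random.gauss(100, 10) for _ in range(20)]  # normal fluctuations
bad_run = good_run[:]
bad_run[7] = 0                                       # a dead channel: empty bin

print("good run:   ", round(chi2_per_bin(good_run, reference), 2))
print("dead channel:", round(chi2_per_bin(bad_run, reference), 2))
```

The good run gives a value near 1 (statistical fluctuations only), while the empty bin makes the score jump, which is exactly the kind of pattern an automated monitor can flag among tens of thousands of histograms per minute.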


Travel to the future …

Wednesday, August 5th, 2009

At the end of TSI 2009 (the Summer Institute at TRIUMF, dedicated to particle physics every 3 years) we had an interesting open discussion. The theme was “Is fundamental research more useful to society than applied research?”. Studies are ongoing from an economic standpoint, and it has been shown that gain factors are slightly higher for fundamental research than for applied science. Nevertheless, this is the perennial question asked by non-science-oriented friends, young students, and above all the agencies supporting our research. A good example of applications is the development of semiconductor technology (for instance, chips for computers), which is based on quantum phenomena. In the medical field, MRI and the particle accelerators used for cancer treatment are now everyday techniques made possible by the fundamental research of the past 30 years.
What about Einstein’s theory of special relativity? One of the two central assumptions of special relativity is Galileo’s principle: the laws of physics are the same in any system moving at constant speed. If you are on a ship at rest and drop an object, it falls straight down. If the ship moves at constant speed, the object falls at exactly the same position on the ship. If people on the ship observe the falling object, they cannot tell whether they are really at rest or moving with the ship: they cannot distinguish their state of rest from the ship’s state by observing motion that takes place within the “reference frame” of the ship. Galileo proved that absolute space does not exist.

However, he maintained the assumption of absolute time. Special relativity instead relies on the breakdown of absolute time. This concept is more difficult to grasp. The fundamental notion here is that the speed of light is constant in any system (you should read about the extremely elegant experiment carried out by Michelson and Morley!). This seems to contradict our everyday experience. If you throw a ball towards a friend at a given velocity, your friend will perceive the ball as standing still if he/she is running at the same velocity and in the same direction. This does not hold for light. If light shines out of a flashlight, regardless of how fast a person runs, he/she will always see the light moving at 299,792,458 m/s. The genius of Einstein lies in accepting a proposition which, at the time, seemed unreasonable, and building a theory on it. That theory has proved to be correct and crucial to further discoveries and advancements in science.

What is the main implication of assuming the speed of light constant? Think again of a train, and let’s assume for the moment that it is standing in the station. We place two mirrors inside, one metre apart, and shine light from one side so that it gets reflected back and forth. The light travels 2 m per round trip; since the speed of light is 299,792,458 m/s, it shines back on the first mirror after 6.67 nanoseconds (a tiny amount of time: 0.00000000667 s). What happens if the train starts moving at some speed v? We can apply Pythagoras’s theorem and calculate the distance travelled by the light between the first and the second mirror, and find that the distance is larger than before. But the light travels at the same speed. What does this imply? That the clock on the train must take longer to tick. Time ticks at different rates depending on the speed at which we move relative to an observer; in other words, it is stretched on the train. Absolute time does not exist.
If you travel on a train moving at 300 km/h for 100 years, the station clock will run ahead of the train clock by about a tenth of a millisecond (0.0001 s). It is not a large difference, but a real effect nonetheless!
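The train numbers are easy to verify. A small sketch, assuming the usual time-dilation factor gamma = 1/sqrt(1 − v²/c²):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def extra_station_time(v, elapsed_seconds):
    """How far ahead a stationary clock runs, in seconds, compared with a
    clock moving at speed v for the given elapsed (station) time."""
    gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)
    return elapsed_seconds * (gamma - 1.0)

# One tick of the light clock: a 2 m round trip between mirrors 1 m apart
print(f"one tick = {2.0 / C * 1e9:.2f} ns")

# Train at 300 km/h for 100 years
v_train = 300e3 / 3600.0            # 300 km/h in m/s
century = 100 * 365.25 * 24 * 3600  # 100 years in seconds
print(f"station clock ahead by {extra_station_time(v_train, century) * 1e3:.2f} ms")
```

At such a slow speed gamma − 1 is only about 4 × 10⁻¹⁴, so even over a century the clocks drift apart by roughly a tenth of a millisecond, as stated above.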
Are there applications of this principle? The answer is yes, and one of the most common is the GPS (Global Positioning System). Originally developed for military use, the system is based on an array of satellites orbiting the Earth, each carrying a precise atomic clock. Using a GPS receiver that detects the radio emissions from the satellites, we can determine latitude, longitude, altitude and local time with good accuracy. The satellite clocks are moving at about 14,000 km/h, and Einstein’s theory of special relativity says that rapidly moving clocks tick more slowly, by about seven microseconds (millionths of a second) per day. Also, the orbiting clocks are 20,000 km above the Earth, and experience gravity that is four times weaker than that on the ground. Einstein’s theory of general relativity says that gravity curves space and time, which makes the orbiting clocks tick faster by about 45 microseconds per day. The net result is that time on a GPS satellite clock ticks faster than on a clock on the ground by about 38 microseconds per day. Without correcting for these special and general relativistic effects, GPS would fail in its navigational functions within about 2 minutes!
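Those microsecond figures can be reproduced from first principles. A back-of-the-envelope sketch, assuming a circular GPS orbit at about 20,200 km altitude and the standard weak-field approximations (this is an illustration, not an actual GPS correction algorithm):

```python
import math

C = 299_792_458.0           # speed of light, m/s
GM = 3.986004418e14         # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6           # mean Earth radius, m
R_ORBIT = R_EARTH + 2.02e7  # GPS orbit radius (~20,200 km altitude), m
DAY = 86400.0

# Orbital speed of a circular orbit: v = sqrt(GM / r)
v_sat = math.sqrt(GM / R_ORBIT)

# Special relativity: a moving clock runs slow by roughly (v/c)^2 / 2
sr_per_day = -0.5 * (v_sat / C) ** 2 * DAY

# General relativity: a clock higher in the gravity well runs fast by the
# difference in gravitational potential divided by c^2
gr_per_day = (GM / C**2) * (1.0 / R_EARTH - 1.0 / R_ORBIT) * DAY

net = sr_per_day + gr_per_day
print(f"special relativity: {sr_per_day * 1e6:+.1f} microseconds/day")
print(f"general relativity: {gr_per_day * 1e6:+.1f} microseconds/day")
print(f"net drift:          {net * 1e6:+.1f} microseconds/day")
```

The special-relativistic term slows the satellite clock by about 7 µs/day, the gravitational term speeds it up by about 45 µs/day, and the net drift comes out near +38 µs/day, matching the figures quoted above.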


ATLAS went cosmic

Wednesday, July 22nd, 2009

The world of particle physics is waiting for beam injection into the LHC, scheduled for late 2009. In the meantime, the collaborations working at the experiments sitting around the 27 km long ring are tuning their tools. How do we work in the absence of beam? We detect very energetic particles coming from the sky!

Cosmic rays are mainly protons with very large energy (orders of magnitude larger than the energy of particles produced in accelerators), originating from the Sun or from yet-to-be-understood events in the far Universe. When a cosmic ray enters the atmosphere, it interacts with air molecules and creates a shower of particles. These particles, called pions and kaons, eventually decay into muons. Muons are what ATLAS can see, as they penetrate the 100 m of rock under which our detector is installed.


From the detection point of view, the main differences between the signals produced by muons from the sky and by particles from beam interactions are the nature of the particle itself (during collisions we’ll produce electrons, photons and jets of hadrons, along with muons) and its direction (cosmic rays come vertically from the top down, while collision particles will stem from the interaction point and fly outwards).

Nevertheless, large collections of cosmic rays are crucial for detector commissioning, calibration and alignment. Besides testing almost all 100 million channels constituting the ATLAS detector, the cooling systems, the read-out electronics, etc., the so-called “cosmic run” (a period of time dedicated to recording cosmic events) is designed to verify how robust the data acquisition system is. Let me step back for some necessary introduction.

When the LHC is operating, bunches of protons will collide 40 million times a second (each bunch contains 10^11, i.e. 100,000,000,000 protons). We expect an average of 1 billion collisions a second in the detector. Not all of these are interesting events, and not all of these could possibly be recorded! ATLAS designed a smart system for selecting interesting events: the trigger system. It is a three-stage system, where the first so-called “level” is implemented in hardware, while the second and third are applications running on ~500 and ~2,000 dual-processor PCs, respectively. The reduction leaves 100,000 events a second surviving the first trigger level, and 3,000 and 200 surviving the second and third levels, respectively. The data need to be reconstructed (all patterns of signal combined together to understand what type of physics process occurred in a given event) as soon as they are accepted by the full system.
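The cascade of rates quoted above can be laid out in a few lines (the numbers are the approximate ones from the text, not the exact ATLAS design values):

```python
# Approximate event rates quoted above, in events per second
bunch_crossing_rate = 40_000_000  # bunches collide 40 million times a second
level1_rate = 100_000             # survivors of the hardware Level-1 trigger
level2_rate = 3_000               # survivors of the software Level-2 trigger
level3_rate = 200                 # survivors of the third level, written to disk

stages = [
    ("bunch crossings", bunch_crossing_rate),
    ("after Level 1 (hardware)", level1_rate),
    ("after Level 2 (software)", level2_rate),
    ("after Level 3 (software)", level3_rate),
]

previous = None
for name, rate in stages:
    note = "" if previous is None else f"  (1 kept out of {previous // rate})"
    print(f"{name:26s} {rate:>12,} Hz{note}")
    previous = rate

print(f"overall: 1 event recorded out of every {bunch_crossing_rate // level3_rate:,}")
```

So out of every 200,000 bunch crossings, only one event survives all three levels and is written out for reconstruction.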

TRIUMF ATLAS team members with one of the control stations for the TRIUMF Tier 1 (Vancouver, Canada).


The “Tier 0”, a parallel computer network with 100,000 CPUs, is set up to immediately store and manage the raw data (the 1s and 0s of binary code) produced by ATLAS. The Tier 0 then distributes the data via fiber optic lines to 11 “Tier 1” sites across North America, Asia and Europe. “To put this into perspective, an average household computer with a very good connection may be able to download data at a rate of one or two megabytes per second (if you are very lucky! I get 500 kilobytes/second). So, LHC engineers have designed a new kind of data handling method that can store and distribute petabytes (million-gigabytes) of data to LHC collaborators worldwide (without getting old whilst waiting for a download).” (Ian O’Neill)
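To put numbers on that remark, here is a quick comparison of how long one petabyte would take at a household download rate versus a fast grid link (the 10 Gb/s figure is an illustrative assumption, not a quoted LHC specification):

```python
PETABYTE = 1e15  # bytes (roughly the "million gigabytes" quoted above)

def transfer_time_years(total_bytes, rate_bytes_per_s):
    """How long a transfer takes, in years, at a constant rate."""
    return total_bytes / rate_bytes_per_s / (365.25 * 24 * 3600)

for label, rate in [("home broadband at 2 MB/s", 2e6),
                    ("an assumed 10 Gb/s grid link", 10e9 / 8)]:
    print(f"1 PB over {label}: {transfer_time_years(PETABYTE, rate):.2f} years")
```

At 2 MB/s a single petabyte would take roughly 16 years to download; a dedicated multi-gigabit link brings that down to days, which is why purpose-built optical connections to the Tier 1 sites are essential.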

The cosmic run provides a full practice run for the entire system. Data from the detectors are accepted by the trigger system and assembled online. The quality of the recorded data is assessed by detailed monitoring systems capable of reporting failure modes on a short time scale (collision data are very precious! This is the reason why dozens of experts sit in the ATLAS Control Room and at remote sites 24/7!). The offline processing of the data at the Tier 0 starts no later than one hour after the data are recorded. The ATLAS Collaboration, for the second time in 2009, turned the full detector on and recorded millions of cosmic events in almost two weeks of data taking. The activity was intense (and tense!), but the detector performed well. ATLAS is ready for beam … and physics!


Speech on science

Tuesday, June 2nd, 2009

Thanks to Dominique, one of the two postdocs in our ATLAS group at TRIUMF, I had the chance to listen to US President Obama addressing the National Academy of Sciences. Like Dominique, I found the speech really inspirational. You can find it linked here.

From a scientific point of view, we might be living through what has been called a “quiet crisis”. The expression was chosen by Shirley Ann Jackson some years ago to define a time when a steady “erosion of […] scientific and engineering base, source of […] innovation and rising standard of living” occurs. The reasons can be several, ranging from the choices of past administrations to the economic crisis.

In the speech, Obama draws a parallel with the American Civil War. It was during that time, when the US was building its identity in the midst of the war, that the Academy was actually founded. It was understood that a solid education in science was the key to progress in the years to come. Later, stimulated by the launch of Sputnik by the Soviet Union, the US government promoted education for young people to become scientists and engineers. The trend intensified when Kennedy focused on the space programme.

Let’s keep focusing on the status of science and engineering in the US as an example. According to NSF statistics, half of American scientists and engineers are forty years old or older. As a matter of fact, the number of graduate students in physics increased by almost 30% from 1999 to 2006, but the number of staff and faculty positions increased by just 15%. The new administration has approved a budget which seems to improve the situation. In the new S&E policy, the preparation of the science and engineering workforce and research community is seen as a vital arena for the future.

This step has already been taken by countries like China and South Korea. S&E now constitutes 60% of all bachelor’s degrees in China and 33% in South Korea. Asian countries are now setting the pace in advanced science and math. All countries wanting to contribute to the world’s development should follow this lead. International and national organizations are already active in providing common criteria to assess progress; they can serve as panels for recommendations. Science and technology is a career choice which goes beyond pure interest in the field: it is an active contribution to the development of society. Because it takes years to create a community, we should not leave any educational gap.


The “Wow” effect

Sunday, May 24th, 2009

A couple of days ago I read an interesting article in the Wall Street Journal reporting on the HR strategy of Google. The driving principle at Google is constant change and innovation, which, in turn, should keep the employees engaged. However, a steady flow of engineers, sales representatives, etc. is taking place from Google to Facebook or Twitter. Why does this happen? To tackle the problem, Google is developing a “happiness algorithm”. Data extracted from progress reports, evaluation forms, etc. are crunched by a machine which is supposed to output the level of commitment and satisfaction each employee has. Not surprisingly, the approach was not well received, as it reduces the human brain and emotions to a set of data points which are then compared to some kind of template, with conclusions drawn from there. Many Google employees stressed that the main reason for applying to and working for Google was the “Wow effect”, and that since the beginning the excitement has faded and is now being sought elsewhere. Google should not invest in algorithms to predict the level of engagement, but rather in the new products and R&D which attracted good minds in the first place. This is the bottom line expressed by its employees in response to the company’s decision.

Should we learn anything from the Google experience? On one side, the size of the LHC experiments suggests a similarity between them and companies like Google. On the other hand, our aim is fundamental research, and the management itself does not have profit as its goal. Nevertheless, we do encounter bumps in the road. In particular, the life of a graduate student is very challenging now. Besides the student’s role of learning and producing results, the student is frequently a member of a larger group, learns how to interact and how to share the work and the reward, and is given the opportunity to present her/his work in front of the Collaboration and at international conferences. We should not forget that learning her/his subject means learning the theory behind the data analysis, the computational tools, the experimental probes available and the statistical interpretation of the results, all this along with hardware activities in most cases. Given the big challenge in front of each student, mentors have the responsibility of making the environment pleasant, of supporting students when needed and of giving appropriate guidance. No, algorithms should not be used!
It is a critical moment for all of us belonging to the LHC experiments. Great are the expectations, both in terms of discovery and in terms of personal interest given how sophisticated analyses will be carried out in this environment. Because of the delay in the past years and the accident of last September, the enthusiasm might be reduce. On top, student might be close to graduation, for which they planned of having a data analysis completed.
The management is doing a great job in holding the Collaboration together, keeping it engaged. Participation to meetings and understanding of what activities are on-going is in fact the best reward to the effort of their colleagues. Group conveners and mentors follow the lead. It is crucial that whoever experienced the beginning of an experiment share what it means, if unhappiness shows up along the way.  The LHC is the discovery machine once operational and the physics we’ll discover might change our view of the world. Yes, I know. This by itself does not help if you spend a couple of days debugging your code and you feel frustrated, I understand. Let me tell you how I felt when I arrived to Fermilab. I was freshly graduated in Italy (the system is different than US, our “laurea” is equivalent to the  master) and I was given the opportunity to spend three months at Fermilab. It was extremely exciting for me. And all my expectations were met, and what I found was even beyond that. As soon as I landed, I got engaged in some testing, cabling, services.  These activities are naturally not as intellectual as carrying out an analysis, but they do transfer ownership of the experiment. If you have the key to go into the pit at night (that was the case at CDF, the LHC experiments require more security checks), climb into the detector
and sit with a handful of colleagues inside the detector, that detector will be yours. And no bumps in the road will mislead you: you will always be excited when it comes to keeping it operational and analyzing the data it produces. When the first data event was recorded at CDF, people were emotional. As with the LHC, physicists can work for decades on the same project, and bringing it to completion is a great satisfaction. We should all encourage the continuous effort in education, but we should also allow young collaborators to experience such participation. Passion stems from understanding that you are contributing to an important project, that your effort is crucial to its progress
and success. Hardware activities and operations are a natural place where this can happen. Even if the LHC experiments will mainly operate remotely and most people will be based at their home institutions, visiting CERN and devoting time to activities closely related to the detector is a powerful draw. Many students in fact decide to join High Energy Physics soon after they spend time at the laboratory. The exciting field of study and the idea of participating in a unique and large project such as the LHC are a real motivation. The picture below was taken when beam circulated in the LHC last year and ATLAS recorded events. It's happening soon!



CERN in the movie theaters

Sunday, May 17th, 2009

The movie "Angels & Demons", based on a novel by Dan Brown, was released this weekend. Like its predecessor, "The Da Vinci Code", both the book and the movie have raised quite some interest.

Publicity poster for "Angels and Demons," a Ron Howard Film, starring Tom Hanks, to open on May 15, 2009. Credit: 2009 Columbia Pictures Industries, Inc. All Rights Reserved.

The plot follows the Harvard symbologist Robert Langdon. The story begins with the murder of a highly respected CERN physicist, supposedly by a secret society called the Illuminati. The physicist has newly discovered antimatter, which is thought of as a weapon of destruction. That antimatter is made using the Large Hadron Collider (LHC) and is stolen from CERN by the Illuminati, whose evil goal is to destroy Vatican City. The idea behind the book is the conflict between science (the Illuminati abusing CERN discoveries) and the Roman Catholic Church.

Despite the fact that I am not inclined towards such reading, I did buy the book on my way back from Vancouver.
It was a way to relax during the long flight, but my prejudice did not change much!

Nevertheless, the novel and the movie draw the attention of the general public towards CERN and its activities.
(Parts of the movie were actually filmed at CERN.)


This is extremely important for our field and for the general education of society. I do believe that each
community of experts should invest time and resources in spreading knowledge to non-experts. Given that
such activity is not foreseen in profit-oriented companies, laboratories and universities should take the
opportunity to pursue this goal. Public lectures, organized visits, etc. are already pretty common in our
field. The message we want to convey is twofold. On one hand, the rigorous mathematical foundations of
science as it stands today and its predictive power have to be presented; on the other hand, the amazing
advancement of experimental physics, with strong emphasis on the complexity of our tools, can be used to
raise interest. Our aim is to educate people who are not in science in general, and physics in particular; besides this, we want to convince young and enthusiastic students to join our field by showing the depth at which phenomena are understood, along with the fascinating open questions we want to answer.

Embracing the opportunity of the movie launch to discuss the real science of antimatter, CERN has opened an exhibition starting this Sunday (May 16th), where all the questions the novel and the movie might have generated can be answered. "You can learn how antimatter is made, whether it could be a new source of energy, and what the science of the movie has in common with current research – and much more". Events are also planned at particle physics institutions across Europe, Asia, and North, Central, and South America. If you want to know more, go here: http://angelsanddemons.cern.ch/expo

When the task is spreading science, care must be taken to validate each statement. Strong internal and external reviews make sure that the results of experiments are solid and can be reproduced; similarly, scientists presenting their research or giving broader talks on the current status of research must not abuse the fascination that the current state of the art can induce in non-experts. In the same way, books and movies should go through a process of review. The novel "Angels and Demons" was found incorrect in a few parts, and the errors were pointed out. CERN has created a web page to lead the reader through common misunderstandings: http://angelsanddemons.cern.ch/