
Archive for March, 2011

–by Nigel S. Lockyer, Director

Last week I gave a colloquium at the University of Pennsylvania (Penn), my former institution of 22 years. I talked about the on-again off-again worldwide medical isotope crisis and the proposed Canadian solutions. It was not what they really expected, since after all I am a particle physicist and Director of TRIUMF, a particle and nuclear physics laboratory. But that was part of the appeal!

Lots of friends and former colleagues from the department as well as quite a few from the medical school came to hear about the problem and our proposed solutions. Medical isotopes are used in roughly 40 million nuclear medicine procedures around the world and about half of these are in the U.S. The problem is a worldwide shortage of technetium-99m, or Tc-99m, the most popular medical isotope. It is used for heart perfusion imaging (which shows blood flow in the heart muscle, something a doctor wishes to see after a possible heart attack) and to identify bone metastasis. Tc-99m is made primarily in two ageing reactors, the NRU in Canada and the Petten in the Netherlands, both over 50 years old and showing signs of wear. About two years ago they managed to both go offline at the same time for an extended period for needed repairs. The ensuing medical isotope shortage made headlines around the world.

The good news is that my audience seemed to stay awake. Better still, after the lecture, I asked one of the students what they learned and the answer was simply that accelerators were the answer. I was pleased. The message was apparently clear to at least one person…I must say, at least one highly perceptive person. My experience tells me that the best colloquia are the ones where the listeners go home that evening and share the story with their families. The day after the colloquium, one faculty member told me that he had discussed the issues with his kids that evening. Mission accomplished…sort of!

Although there are several solutions to the medical isotope crisis being proposed with accelerators in Canada and now the U.S., and reactors in the U.S. and Europe, I focused in my talk primarily on the approach using small medical cyclotrons, an area of expertise of TRIUMF and Canadian industry (for example, ACSI in Vancouver, which grew out of a collaboration with TRIUMF two decades ago and manufactures small medical cyclotrons with the prefix “TR”). Another company, BEST, located near Ottawa, is also starting to manufacture small medical cyclotrons. In a nutshell, we think that a small cyclotron, a TR-19 for example, running with a few hundred micro-amps of beam current should be capable of supplying enough Tc-99m for a city of roughly 2 million people (think Vancouver). Much work is ongoing at the BC Cancer Agency, Lawson Health Centre in London, Ontario, the Centre for Probe Development at McMaster University, and TRIUMF. With luck, we will have a positive answer in about 15 months. The world will pay attention (we hope), and certainly our colleagues at Penn are paying attention since they have two small cyclotrons. Enough for Philadelphia? they asked.

Let’s get off topic for a bit. Penn houses the oldest medical school in the United States (1765) and today has over 1,700 full-time faculty. That is a huge program. The medical school complex is like a small city itself: throw in the Children’s Hospital of Philadelphia (ranked as one of the best in the U.S. by U.S. News and World Report and number one in pediatrics in 2008), the VA Hospital, the Children’s Seashore House Hospital…you get the picture. A new addition just down the street is the Abramson Cancer Center. I was peripherally involved in the embryo of what is now the new Roberts Proton Therapy Center, the largest proton therapy center associated with an academic medical center in the world and one of only six such centers in the United States. (Canada does not have proton therapy except for the small ocular melanoma program at TRIUMF.) The Roberts Proton Therapy Center features five treatment rooms: four gantries with 90-ton rotational machines designed to deliver the therapeutic beam at the precise angle prescribed by the physician, and one fixed-beam room. It is a very impressive facility. I have several friends who work at or are associated with the center. Oh, did I mention it uses a cyclotron (manufactured by IBA, a Belgian company) to make the beams of protons? I am getting off topic—none of this was in my talk. Back to radiology.

Penn is also well known, to say the least, in radiology; they even maintain a short online history. Quoting from this article:

“Radiology at Penn began even before the beginning. In late 1895, German physicist Wilhelm Conrad Roentgen announced his seminal discovery of x-rays. Almost immediately, Penn Physics Professor Arthur Willis Goodspeed realized he had produced x-rays almost six years before, and had the physical plates to prove it. But rather than looking backward, Goodspeed looked forward instead. He quickly teamed with Penn surgeons J. William White and Charles Lester Leonard to produce, on February 4, 1896, one of the first recorded patient exposures using x-rays. That spring, Leonard was named the University Hospital’s first “skiagrapher,” and arguably the first academic department of radiology in the United States and, perhaps, the world was born.”

Roentgen's first x-ray

The original May 18th, 1903 NY Times article can be found online.

“The first x-rays (called “skiagraphs”) were taken of extremities. By June of 1896, the chief of surgery used a skiagraph to locate a toy jack that a child had swallowed. Within months, several hospital departments were using roentgen rays for diagnosis, surgical planning, and follow-up. In 1898, Charles Lester Leonard used X-rays as a method to identify urinary stones. He also wrote the first paper on the hazards of X-rays.”

That subject is still topical as we all know.

“In 1905 Henry Pancoast described the utility of bismuth and then barium for contrast in radiology studies.” Among many other contributions, he is known for his description of Pancoast’s tumor, a large cancerous tumour in the lung most likely caused by smoking. “He also later described the relationship of prolonged irradiation and the development of leukemia and the use of X-rays in the treatment of Hodgkin’s disease and leukemias. Henry Pancoast was appointed as the first Professor of Radiology (roentgenology) in the United States.”

“More recently, in 1964 David E. Kuhl developed the technique of Single Photon Emission Computed Tomography (SPECT) and the principles of Positron Emission Tomography (PET). In 1976 the world’s first FDG (fluorodeoxyglucose) PET image was obtained at Penn, starting an ongoing new era in functional imaging.” How big is that? The department currently performs more than 1,045,000 procedures annually. You get the picture…a busy place clinically and in research.

In the last several months Penn has been in the press for developing an F-18 labeled molecule that attaches to amyloid plaque, a possible cause (or result) of Alzheimer’s disease. This is not the first such tracer to attach to the plaque. The best known is Pittsburgh Compound B, or PIB, which is labeled with C-11 and has only a 20-minute half-life (C-11 is an isotope of carbon with one fewer neutron). PIB works well, and it has been used at TRIUMF and UBC. However, only medical centres with cyclotrons can make PIB, and thus its success has set off a rush to make a similar compound labeled with F-18 (not to be confused with the CF-18, a Canadian jet fighter), which may be more practical because it can be shipped within a 2-hour radius. Eli Lilly just bought the small start-up company that developed the successful radiotracer. TRIUMF is not yet at the level of developing breakthrough radiotracers like this one. Once they are developed elsewhere, we make them and use them for research or supply them to the local research communities for clinical use.
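The case for an F-18 label over C-11 comes straight from the exponential decay law, N(t) = N0 · 2^(−t/t½). A quick sketch of what survives a 2-hour shipment (the half-life values below are standard reference numbers, not from this post):

```python
# Radioactive decay: N(t) = N0 * 2**(-t / t_half).
# Half-lives are standard reference values (not from the post).
def fraction_remaining(t_min, t_half_min):
    """Fraction of a radioisotope left after t_min minutes."""
    return 2 ** (-t_min / t_half_min)

shipping_time = 120.0  # a 2-hour shipment, as described above

f18 = fraction_remaining(shipping_time, 109.8)  # F-18 half-life ~109.8 min
c11 = fraction_remaining(shipping_time, 20.3)   # C-11 half-life ~20.3 min

print(f"F-18 left after 2 h: {f18:.0%}")  # ~47%
print(f"C-11 left after 2 h: {c11:.1%}")  # ~1.7%
```

So roughly half of an F-18 dose survives the trip, while a C-11 dose is essentially gone, which is why PIB stays confined to sites with their own cyclotron.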

Penn’s Professor Cam Kuch developed a molecule, EF5, which attaches to hypoxic tumours (tumours with reduced oxygen content). These tumours are radiation resistant and often require some treatment other than radiation, for example chemotherapy. TRIUMF’s Mike Adam developed the chemistry to attach F-18 to this molecule, and it is now used in PET imaging of some patients at the BC Cancer Agency to learn more about their cancerous tumours. Thanks Cam. Thanks Mike.

I finished my talk and acknowledged my TRIUMF, UBC, and BC Cancer colleagues who teach me about all the great research they are doing here in BC. From the high level of interest at Penn in what we were doing, I realized that I had successfully brought coals to Newcastle…or medical isotopes to Penn.


Lots of interesting particle physics news recently on the Cosmic Frontier front.

Science News reports that the National Research Council’s March 7 report for science in the coming decade recommends completion of the Large Synoptic Survey Telescope.

…which will not only probe the nature of dark matter and dark energy but aid in tracking near-Earth asteroids.

LSST is a huge public and private partnership, which includes many of the national labs, among them Fermilab, which hopes to build on its computing experience with the Sloan Digital Sky Survey to help manage the unprecedented flow of data expected from LSST. The February issue of symmetry magazine outlines the partnerships the experiment will require.

…the LSST camera will produce 3.2-billion-pixel images and generate, on an average viewing night, about 15 terabytes of raw data, or 25,000 CDs worth. To display one of the LSST full-sky images on a television would require not just a high-definition screen, but 1500 of them.
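Those figures hang together under simple arithmetic. A back-of-envelope check (the CD capacity and HD screen resolution are my assumptions, not from the article):

```python
# Back-of-envelope check of the LSST data-rate figures quoted above.
nightly_raw_bytes = 15e12      # ~15 terabytes of raw data per night
cd_capacity_bytes = 700e6      # a standard CD-ROM holds ~700 MB (assumption)
print(f"CDs per night: {nightly_raw_bytes / cd_capacity_bytes:,.0f}")
# ~21,000 -- the same order as the quoted 25,000

pixels_per_image = 3.2e9       # 3.2-billion-pixel camera
hd_pixels = 1920 * 1080        # one full-HD television screen (assumption)
print(f"HD screens per image: {pixels_per_image / hd_pixels:,.0f}")
# ~1,500, matching the article
```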

While LSST is not expected to take data for quite some time, its predecessor, the Dark Energy Survey, should start its first sky survey in October. The blog dark matter, dark energy, dark gravity explains how DES will be the first experiment to use four different methods at once to search for dark energy. Medill news service uses a great video to show physicists at Fermilab wrapping up tests on camera components before shipping the final parts to Chile for assembly on the 4-meter Blanco telescope. Sadly, the New York Times reports that the driving force behind making the telescope a bastion of U.S. science in Chile, Victor Blanco, passed away.

Unlike DES and LSST, the holometer experiment aims not to record the sky as we see it but as Fermilab theorist Craig Hogan thinks it really is: a giant hologram.  The Little India newspaper explains Hogan’s theory and how it relates to black hole science.

Scientists have known for a long time that information plays a key role in the creation of a system. Our computers and robots are just metal and wires if no information is exchanged in the form of bits. Our brain is inanimate if no information is carried by the neurons. Our genes are futile if no information is available from DNA that instructs how to function. In everything we know, information is the key.

Similarly, the entire information about our universe must be encoded elsewhere. Like a hologram on our credit cards, which contains the information in a thin film and can generate 3D objects when viewed in the proper light, the reality we are tempted to believe in is actually just one way of viewing information printed on a distant cosmic film. What we see and experience as reality are the shadows of the truth.

–Tona Kunz



Entrance to Soudan Mine. Credit: Fermilab

It looks like good news for the MINOS and CDMS experiments. The Minneapolis Star Tribune reported today that the fire has been contained in the Soudan Mine, which houses the two experiments.

The newspaper reported: "Workers also determined there probably has been no water damage to a $100 million University of Minnesota research lab at the bottom of the mine, 2,341 feet below the Earth’s surface, said Minnesota DNR spokesman Mark Wurdeman."

A three-man crew that entered the mine Sunday did not report seeing any active fire, but officials are holding off calling the fire extinguished until further investigation. The cause of the fire remains undetermined but it appears to have been fed by wooden support timbers inside the former iron ore mine.

The fire was noticed Thursday night when smoke alarms went off. No one was in the mine at the time. It appears from a Minnesota Department of Natural Resources press release Friday that the fire began in the tourist area of the mine, two to four floors above the experiment halls. The Minnesota Department of Natural Resources operates the mine while the University of Minnesota oversees the high-energy physics laboratory.

The University is working with firefighters to determine the amount of water that can be sprayed in the mine without causing seepage into the experiment halls. Firefighters also are using flame-suppression foam. The mine has a ventilation system that should keep smoke out of the delicate detectors.

The MINOS experiment studies how neutrinos change from one type to another over long distances. The detector, on the 27th level of the mine, records neutrinos sent via a particle beam from Fermilab 450 miles away. CDMS is a dark matter search that uses cryogenic germanium and silicon detectors to record dark matter particles passing through the Earth.
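For a sense of scale, the beam's flight time over that baseline is a simple distance-over-speed calculation, since the neutrinos travel at essentially the speed of light:

```python
# Flight time of the MINOS neutrino beam from Fermilab to Soudan,
# using the 450-mile figure quoted above.
c = 299_792_458.0        # speed of light, m/s
miles_to_m = 1609.344    # meters per statute mile

distance_m = 450 * miles_to_m
flight_time_ms = distance_m / c * 1e3
print(f"Fermilab to Soudan: {flight_time_ms:.2f} ms")  # about 2.4 ms
```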

CDMS inside Soudan Mine. Credit: Fermilab

MINOS detector inside Soudan Mine. Credit: Fermilab

Related information:

Take a virtual tour of the Soudan mine.

 — Tona Kunz


The Kids are Alright

Friday, March 18th, 2011

– By David Morrissey, TRIUMF Theorist

What is it that comes to mind when you think of a national physics laboratory? When I ask people this question about TRIUMF, the answer I usually get involves an image of serious senior scientists in lab coats slogging away over complicated piles of wires and tubing and computer displays. This is partially correct, but it is far from the whole story.

TRIUMF is full of students.  Most of them are graduate students working on Masters or Doctoral degrees, but we also have a large group of undergraduates doing co-op programs.  These students make crucial contributions to the research going on at the lab and they are one of our most valuable outputs.

A typical doctoral (Ph.D.) degree in the basic sciences takes from 4 to 6+ years to complete.  It is a major undertaking, involving a lot of hard work for not much pay, all for the opportunity to work on something really fascinating.  The key part of a doctoral degree is doing original, fundamental research.  Put another way, to get a Ph.D. in the sciences you have to add to our understanding of Nature.

Graduate research at TRIUMF is supervised mainly by lab research scientists, sometimes in conjunction with faculty at one of our member universities. A beginning graduate student will usually start off doing very specific directed tasks; she will gradually progress to working more and more independently, and by the time she finishes her degree she will often be close to running the entire experiment she has been working on. It is really an apprenticeship of sorts, and students are responsible for a great deal of the hands-on work in scientific research.

After graduating, many students continue on in scientific research, but a significant number move on to other fields.  I know former physics graduate students who have gone on to careers in medicine, law, journalism, financial analysis, teaching, professional bike racing, and all over the high-tech industry.  Even though the specific research a student will do for his degree might not have any obvious applications outside of fundamental science, the training he will get along the way is enormously useful in many different areas and is almost impossible to reproduce.  I don’t know how to quantify it, but I would not be at all surprised if the economic benefit of turning out all these highly-trained people far exceeds the total investment made in basic science research.

Given the broad importance of students, it probably won’t come as much of a surprise that one of the main roles of a professional scientist is advising and supervising them.  This is also something that I feel very strongly about, partly for a very selfish reason. On my own, I can only do so much research.  But by training students, who will in turn go on to do their own research and to train their own students, I can contribute exponentially more to the progress of Science. And this is what I really want – to learn as many of the answers as I can.


I am overdue for a blog post because I have been way too busy lately.  I got an email from an elementary schooler, Jacob, asking about the QGP so I thought instead of replying privately I’d reply here since it may be of general interest.  The questions are from Jacob.

What is QGP going to be used for in the future when it is better controlled?

Right now we don’t think the QGP has any practical applications.  We’re studying it because we want to understand the universe in general and nuclear matter in particular.  Shortly after the Big Bang, we think that the universe went through a Quark Gluon Plasma phase.  By understanding the QGP better, we may understand how the universe expanded better.  When we do basic research, we don’t usually know what impact it will have.  What we know by looking at history is that basic research eventually leads to benefits to humanity – but we’re very bad at predicting what those benefits will be.  When Mendel studied genetics of plants, he never imagined that genetic studies would lead to all of the improvements in medical care we have now.  Einstein developed his theory of gravity not so that we could send satellites into space or so that we could all have GPS in our cars or get better TV reception – he was motivated by simple curiosity and a desire to understand our universe better.  We are still reaping new benefits from quantum mechanics, developed in the early 20th century – we now have light emitting diodes (LEDs) in traffic lights and flashlights and while LEDs existed when I was your age, they weren’t nearly as widespread, as cheap, or available in so many colors.  So it takes a long time to see the benefits of basic research.

So we don’t know what applications this research will have in the future.  That said, there are a lot of spin off benefits to this research.  In high energy physics, we are always building the fastest and most precise detectors possible.  To do this we often have to develop and test new detector technologies.  Once we’ve developed the technology, these detectors can be used elsewhere too.  Particle detectors are used in hospitals in x-ray and MRI machines.  They are used in chemical and biomedical research to study the images of proteins and the structures of solids.  They are used in national security for detecting radioactive materials.

Basic research moves the boundary of what is possible.  Once we have done that, there are a lot of benefits.  But since we’re working on doing things that have never been done and studying things never studied before, we can’t predict exactly how it will be useful.  Put another way, if we knew what would happen, we wouldn’t call it an experiment.

What attributes does it have that other matter does not have?

This is a difficult question to answer as worded – it depends on what you mean by “attributes”. When I think of the properties of a particular form of matter, I think about its density, its opacity to different probes (like if you shine light through it, does the light come out the other side?), and so on. All forms of matter have a density. So I’m going to answer a slightly different question – what makes a QGP unique? What makes the QGP unique (among the forms of matter we’ve studied in the laboratory) is that the quarks and gluons interact through the strong force. There are four fundamental forces in nature:

1. Gravitation
2. Electromagnetism
3. Weak interaction
4. Strong interaction

The first two are the most familiar. Gravity is the reason why you stay on the ground instead of floating through the air. It’s also the reason the Earth orbits the Sun. The electromagnetic force is ultimately responsible for basically every other force you feel or see. When you sit in a chair, the reason you don’t fall through the chair is ultimately due to interactions between your atoms and the atoms of the chair. It’s also behind light and electricity. It’s how your microwave and your TV work. The most familiar thing we can attribute to the weak interaction is beta decay – a particular kind of decay of a nucleus. The strong force is what holds nuclei together. If we only had the electromagnetic force, the protons in the nucleus would not be bound.

So a QGP is a liquid of quarks and gluons bound together by the strong force.  Water molecules, for instance, primarily interact through the electromagnetic force.  The properties of water are determined by the way water molecules interact through the electromagnetic force.  To understand the QGP, we have to understand how quarks and gluons interact through the strong force.  This turns out to be a very difficult computational problem.  But by studying the QGP, we can try to calculate what we would expect and then compare what we expect from our theories to what we see in the laboratory.

In addition to that, it is the hottest, densest form of matter ever created in the laboratory.  And it appears to have the lowest viscosity of any form of matter ever created in the laboratory.  Viscosity is a way of measuring how much a fluid resists flowing.  Honey, for instance, is much more viscous than water.

How will QGP affect modern or future physics?

I don’t know exactly.  It depends on what we learn.  Already we’ve learned a lot about relativistic fluids – where the individual particles in the fluid are traveling close to the speed of light.  As I said in the first answer, we don’t know exactly what we’ll learn – because if we did, we wouldn’t call it an experiment. One thing I hope – and maybe you can help me out here – is that we’ll inspire the next generation to go into science, math and engineering.

Also, what state of matter is it?  I know that it is called plasma but I’ve also read that it is very similar to both liquid and gas.

A QGP is a new state of matter.  We believe it is a liquid – indeed, a liquid that probably has the lowest viscosity of anything we’ve ever measured.  We thought it’d be a gas, but it turned out to be a liquid.  Here I have a post describing what we know about the QGP and its phase diagram.

I also could not verify what temperature it occurs at because there is so much different information on the internet.

The reason what you find on the internet is somewhat unclear is that the answer is somewhat unclear.  First, it doesn’t exist at just one temperature.  Think about water.  Water can be cold, warm, hot, etc.  It depends.  There’s a temperature where ice melts and becomes water and below that you can’t have water.  That temperature is called the melting point.  But then once you have water, you can heat it up and you have to heat it up a lot before it boils and becomes a gas.  That also occurs at a special temperature – the boiling point.  The problem is, these temperatures depend on pressure and volume.  Water boils at a lower temperature at high altitude.  Analogously, we have a melting point and a boiling point for the QGP.  We think the melting point at the baryochemical potential at RHIC is about 170 MeV – but there’s a fairly large uncertainty in that number.  We think we’re well above that at RHIC and we’ll be even further above it at the LHC (but we haven’t yet had enough time to analyze the data at the LHC to say how hot it is). This gets to a crucial issue – we don’t have a thermometer to measure a QGP.  If you put a thermometer like the one you have in your house into a vat of QGP (if we could ever create that much of it) it’d melt.  So we have to come up with other ways of measuring the temperature.  We can look at the energies of particles created in the collision, for instance.  But it takes more work than just using a thermometer.
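To put that 170 MeV figure in everyday units: temperature in particle physics is quoted as an energy via E = k_B·T, so converting back to kelvin is one division (the constants below are standard CODATA/SI values):

```python
# Convert the ~170 MeV QGP transition temperature to kelvin via T = E / k_B.
eV_to_joule = 1.602176634e-19   # exact, by definition of the electron volt
k_B = 1.380649e-23              # Boltzmann constant, J/K (exact, SI)

E_joule = 170e6 * eV_to_joule   # 170 MeV expressed in joules
T_kelvin = E_joule / k_B
print(f"170 MeV ~ {T_kelvin:.2e} K")  # about 2e12 K
```

That is roughly two trillion kelvin, over a hundred thousand times hotter than the core of the Sun.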

Many thanks to Jacob for the great questions!


The top (CDF) and bottom (DZero) images show the expected and observed 95 percent confidence level upper limits on the production rate of a Higgs boson as a multiple of the Standard Model prediction, assuming standard model decay branching ratios. The solid, horizontal line shows the prediction for the Higgs boson according to the Standard Model. We determine our measurement by how our data relates to this solid line. The figures have two squiggly lines: one dotted and one solid. The dotted line shows what we expected to measure and is surrounded by a bright green band. The band shows how certain we were in our prediction. The best way to interpret this is that the bright green region shows the area where we predicted our measurement should be.

This article ran in Fermilab Today March 17.

The Standard Model of particle physics needs the Higgs mechanism to explain why all the particles in our universe have mass, but no experiment has yet observed the elusive Higgs boson. Answering the question of whether the Higgs mechanism is correct or whether something else is responsible for the masses of particles is central to our understanding of nature. Many physicists around the world have spent decades searching for the Higgs boson. This week, a crucial step forward in this quest has been made by the CDF and DZero experiments.

All recent Higgs boson mass exclusions have come from combinations of results from more than one experiment. Despite the importance of such combined statements, it is an important milestone when a single experiment reaches the level of sensitivity necessary to rule out or see the Higgs boson. Late last week, the CDF and DZero experiments crossed this threshold individually. The CDF and DZero collaborations recently updated their Higgs boson searches in the high mass range (130 to 185 GeV). In this range, the Higgs boson mass is high enough to allow it to decay to a pair of W bosons.

Together, the Tevatron experiments put to good use an additional 1.5 inverse femtobarns of data collected since their joint result from last summer, and added several new improvements to their analysis techniques. The new data and improvements have allowed both Tevatron experiments to exclude a portion of the Higgs boson mass range: 158 to 168 GeV for CDF and 163 to 168 GeV for DZero. The CDF and DZero experiments have also combined their results; the region thus excluded is 158 to 173 GeV. A Higgs boson of mass 165 GeV is now excluded at the unprecedented level of more than a 99.5 percent confidence level.

Fermilab currently expects the Tevatron to keep recording data until September 2011. CDF and DZero are also ideally suited to look for the Higgs boson in the low mass range, where the Higgs would decay mainly into bottom quarks. CDF and DZero expect to present new results in this search region later this year. This large data sample, along with expected analysis improvements, will allow the experiments either to exclude the Higgs boson over the entire mass range of interest, if it does not exist, or to see hints of it – representing a major breakthrough in our understanding of nature.

— Edited by Andy Beretvas


Following the Fukushima Story

Wednesday, March 16th, 2011

— By T. “Isaac” Meyer, Head of Strategic Planning and Communications

The series of events at the Fukushima nuclear reactors in Japan following the massive earthquake and tsunamis will be something many of us will remember forever. If we ever doubted that we truly live in the “atomic age,” as it was so fondly dubbed in the 1960s, we must surrender that notion now. From medical isotopes that diagnose disease and save lives to nuclear power plants that reduce greenhouse gas emissions and sometimes break down and create massive drama, we humans do live in a world that is controlled and affected by “physics” beyond the human eye.

As a science communicator for Canada’s national laboratory for particle and nuclear physics, I’ve been working almost non-stop to help track, interpret, and translate the unfolding drama of the heroic efforts to cool down and shut down the Fukushima Daiichi nuclear power plants. With the team here at TRIUMF, we have provided 15 radio interviews, five TV interviews, and numerous print comments in addition to online exchanges. It’s not that we have special communications channels, it’s not that we operate a nuclear power plant, and it’s not that we have a crystal ball.

No, it’s that we know the difference between a dose and a dose rate; we have people who can translate the stream of high-quality information coming straight from Japanese Twitter feeds (TRIUMF’s first Ph.D. student is now a University of Tokyo professor who is leading much of the scientific and technical communication efforts in the crisis); and we’ve been around radiation before. We are a particle and nuclear physics laboratory, and we have radiation health and safety people who rival the best in the world. We don’t deal in quantities of radiation or material nearly as large as a nuclear power plant, of course, but we can shed some light on the issues and the context of what constitutes significant and what does not. In a way, providing this interpretation and even guidance is part of our responsibility as publicly funded researchers.

But it is a challenging and frustrating situation. Getting hard facts about what is going on at Fukushima Daiichi is difficult. This is because of language barriers, distance and transmission delays, cultural attitudes (parts of Japanese culture are more reserved than North America and its media), and the tremendous concentration required to actually focus on resolving the situation. You’ll notice that when the fire department is extinguishing a house fire, it’s only afterward that the fire chief starts talking with the media. All of her attention is on managing the crisis. As I said, it’s a tough challenge to balance getting the job done and sharing news with the public—particularly when it might impact them.

I’ve thought about trying to blog about the situation, but the reality is that I’d be behind, and since we don’t have all the facts, some of it would be speculation. I can say that the west coast of the U.S. and Canada, despite the deteriorating fuel material at Daiichi, is still quite safe from “blowover” of radioactive dust. The latest summary of where things are at is from the Washington Post, with this nice graphic. There are even online Geiger counters in Tokyo where you can check the “background radiation” weather.

My thoughts and prayers are with the people of Japan. What is making headlines this week will change their country forever…as it will the entire world.


I am re-posting here a YouTube video of the earthquake swarm before and after Friday’s big earthquake. While not all of the earthquakes pictured could be felt in Tokyo, many of them could. The aftershocks are still continuing today; just an hour ago, we had another magnitude-6 event. No one is too comfortable here these days.

This screen image from the Tevatron main control room shows how the March 11 earthquake in Japan affected superconducting quadrupole magnets in the accelerator tunnel.

When the 8.9-magnitude earthquake struck Japan last week, Fermilab felt the jolt emotionally and physically.

Accelerator operators in the main control room of the Tevatron saw the heart-rate-monitor-style tracking system for the more than 1,000 superconducting magnets go into cardiac arrest. This signaled the forward and backward pitch and side-to-side roll of the 4-ton, 20-foot-long magnets buried underground.

And that meant somewhere, something very bad had happened.

The monitor readings came from sensors called tiltmeters on underground magnets that steer particles around the four-mile Tevatron ring. They record vibrations too tiny for people at the laboratory to feel, including seismic waves from earthquakes thousands of miles away. The last time the magnets rocked like that was in 2010, when a magnitude-7 quake struck Haiti. The Tevatron also recorded a 2007 quake in Mexico, a 2006 quake in New Zealand, and earthquakes that triggered deadly tsunamis in Sumatra in 2005 and Indonesia in 2004. In all, the Tevatron has felt disaster more than 20 times.

A December 2010 symmetry magazine article explains how physicists first noticed the Tevatron’s super sensitivity, and how they work to make sure it doesn’t interrupt the laboratory’s multi-million-dollar research efforts.

For accelerator operators, learning that the computer squiggles signaled a quake in Japan was an emotional blow. Fermilab has a long and fruitful history of working with Japanese physicists and institutions. Japanese scientists have been involved with Fermilab almost from the beginning of the experimental program in the early 1970s and became key members of the Tevatron’s CDF collaboration in the early 1980s. Many Fermilab scientists, engineers and technicians have friends in Japan, are from Japan, or have worked at its high-energy physics laboratory, KEK, or at J-PARC, the high-energy accelerator complex.

In 2010, the most recent year for which data are available, Fermilab had 80 visiting researchers from Japanese institutions spread throughout the country, including the areas hardest hit by the earthquake and tsunami. Those scientists are valuable members of several experiments, particularly the CDF collaboration and the accelerator research program. In all likelihood, the Japanese contribute even more to Fermilab’s research program, because they also work at the laboratory as users from non-Japanese institutions, but a count of those users is unavailable.

— Tona Kunz


One of the great things about physics is its universality: theory developed to describe a certain phenomenon can often be widely applied in a multitude of situations. A century ago, the electron was a recent discovery, and the “plum-pudding” model of the atom had just been felled. Protons (and indeed antiprotons), ion traps, and the rest of our modern toolkit remained unknown. Yet the methods used at ATRAP to cool antiprotons in an ion trap by some six orders of magnitude can be traced back to ideas formulated around the turn of the century.

Why cool antiprotons at all? Well, when we make antihydrogen, its temperature is dominated by the temperature of the incoming antiproton, since the mass of the positron is so comparatively small. The fraction of trappable antihydrogen atoms decreases dramatically as the temperature goes up, so it’s important to start with the coldest possible antiprotons.

ATRAP Magnetic Field

Proof of our large magnetic field (yes, small Euro coins are slightly magnetic)

We start by noting that in our ion trap, we have a large, uniform background magnetic field. A charged particle in a magnetic field is confined to move in circles, constantly changing direction and therefore accelerating. If we reach back to 1897, we come across Larmor’s derivation that accelerating charges radiate away energy. Exactly how quickly depends on the magnetic field and the mass of the particle; in the ATRAP experiment, the antiproton radiates its energy away with a time constant of 36 years.

But, there’s good news. For the same magnetic field, the much lighter electron radiates much more quickly – with a time constant of about 0.2 seconds. Even better, the electron and antiproton have the same sign of charge (negative). They can be trapped simultaneously in the same voltage well and allowed to interact (there are no annihilations, since the antiproton and electron do not form a matter–antimatter pair). So, we exploit the quick cooling of electrons by letting them collide with antiprotons in our trap. After only a minute or so, the electrons and antiprotons have come into thermal equilibrium with each other, and with their 4 Kelvin surroundings. The final antiproton temperature is actually closer to 20 Kelvin, though, because unwanted electrical noise makes its way down into our trap and acts as a heat source. Nonetheless, electron cooling successfully reduces the antiproton energy by a factor of 100,000.
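As a sanity check on those two time constants (an illustrative back-of-the-envelope calculation, not taken from any ATRAP publication): for cyclotron motion at fixed magnetic field, the Larmor damping time scales as the cube of the particle’s mass, so the antiproton should cool (m_p/m_e)³ ≈ 6 × 10⁹ times more slowly than the electron.

```python
# Cyclotron radiative cooling time scales as tau ∝ m^3 (same charge, same B),
# so tau_antiproton / tau_electron = (m_p / m_e)^3.
mass_ratio = 1836.15           # proton-to-electron mass ratio
tau_e = 0.2                    # electron cooling time in seconds, as quoted above

tau_pbar = tau_e * mass_ratio**3          # antiproton cooling time, in seconds
years = tau_pbar / (365.25 * 24 * 3600)   # convert seconds to years

print(f"antiproton cooling time ≈ {years:.0f} years")
```

The result is about 39 years; the 0.2-second figure is rounded, which accounts for the small difference from the 36 years quoted above.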

(Side note: the same sort of physics explains why the LHC has to be so Large. The tighter the loop, the larger the energy loss due to radiation; at some point, the energy losses make the whole process wildly inefficient).

We’ve recently published a paper describing how we can further cool antiprotons to 3.5 Kelvin. We use the technique of adiabatic cooling, which is a fancy way to say that an expanding gas gets colder (provided nothing external puts in or takes away energy). Examples can be found in surprising places – it’s the reason why compressed air sprayed out of a can feels cold, why water vapor condenses into clouds as it rises and expands, and why a refrigerator can keep food cold. And, in keeping with the theme of this post, it’s all well described by thermodynamics worked out in the late 19th century. (Incidentally, the related process of adiabatic heating – compressing a gas makes it hotter – forms the heart of a diesel engine).


Larmor, Carnot, and Boltzmann - 3 guys who never heard of an antiproton

At ATRAP, our “gas” is a cloud of antiprotons, which we can let expand in a controlled way by reducing the trapping electric field. We demonstrate that the measured temperature decreases as the volume increases – the hallmark of adiabatic cooling. It’s worth mentioning that we measure the final temperature of our antiprotons by observing the number that escape the trap as a function of trap depth; this traces out the tail of a Boltzmann distribution, from which we can determine the temperature – another invention more than 100 years old.
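The extraction works because the number of antiprotons energetic enough to escape over a barrier of depth W falls off as exp(−W/k_B·T), so a straight-line fit to log(counts) versus W gives the temperature. A minimal sketch of that fit on synthetic data (this is not the actual ATRAP analysis code):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

# Synthetic escape data: counts over a barrier of depth W follow exp(-W / kT).
T_true = 3.5                                         # kelvin
depths = [i * 1e-23 for i in range(1, 11)]           # barrier depths, joules
counts = [1e6 * math.exp(-W / (K_B * T_true)) for W in depths]

# Least-squares fit of log(counts) vs W; the slope is -1 / (k_B * T).
n = len(depths)
xs, ys = depths, [math.log(c) for c in counts]
xbar, ybar = sum(xs) / n, sum(ys) / n
slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
T_fit = -1.0 / (K_B * slope)

print(f"fitted temperature: {T_fit:.2f} K")  # recovers 3.50 K
```

With noiseless synthetic data the fit recovers the input temperature exactly; the real measurement, of course, has counting statistics on top of this.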

It amazes me that here we are doing cutting edge research, and concepts from the 19th century are still being put to good use. As experimentalists, we should consider ourselves lucky that the well from which we draw our ideas runs so deep.