
Archive for February, 2011

–by Nigel S. Lockyer, Director

Did you ever hear or think of a word and then have it seem to pop up everywhere? Well, for me, the word is water. Pick your word and count how many different ways it comes up in conversation, on television, in the news, or however else you encounter information on a daily basis.

I’m interested in the origin of water on our planet. It’s not as obvious as you might think. Over the past few years, I have followed the subject in the popular-science press and have read a couple of books to try to understand the answer; in short, there is no firm conclusion. But more on this later. First, let’s follow water…water always flows downhill, right?

In Canada, water is ubiquitous. In science, water is ubiquitous. In technology, water is ubiquitous. But let’s start with the obscure. TRIUMF is ordering a new water-jet cutter to replace a plasma-arc cutter in our machine shop. The water jet is supposed to be better because it doesn’t heat the metal during cutting. Hard to believe that water cuts metal…but it does (abrasives in the water help) because the pressures can reach 50,000 psi. The modules that contain the infrastructure associated with our isotope-production program are cooled with water; recently, we diagnosed some recurring leaks caused by poor quality control of the brazing. (We need to fix that!) Oh, and going back in history, it occurs to me that my thesis experiment used a water target…but I wasn’t so interested in water back then.

The Meson Building, the original science research building of TRIUMF, is adjacent to the Cyclotron Building (which holds the world’s largest cyclotron) and contains our eye-cancer therapy centre and many of our material-science experiments with muons. Well, the roof leaks: you guessed it, water. This doesn’t faze me too much because every lab in the world has been designed, I am sure, with leaky roofs. My favorite is the Meson Building at Fermilab. (Maybe there’s something about meson buildings!) Bring your “wellies” (short for Wellingtons, or, hard to believe, fashionable rubber rain boots in Vancouver).

OK, more water. The Saturday morning physics lecture this week for local high-school students was by Paul Percival from Simon Fraser University. He talked about supercritical water and how he is able to study chemical reactions in a pressure vessel at several hundred degrees and hundreds of bars of pressure using spin-polarized muons. I was impressed with one potential application he described: that supercritical water can be used to burn “toxic waste” such as old nerve gas from weapons stockpiles. He showed a very cool picture of what looked like a flame burning in water…figure that out. I was waiting for him to say he had made “fake water” by replacing the hydrogen with muonium. Wonder what that would taste like? Drink fast!

Flame burning under water

The second talk was about pulsars, by Ingrid Stairs from the University of British Columbia. (Nobody ever says the full name of the university; just “UBC” is good enough in Canada. The accent is on the “U”: draw it out to last about half a second, and then say “BC” quickly. Now you have a Vancouver accent.) Having mentioned Vancouver, how could we not talk about salmon? Two years ago, only one million salmon returned to the Fraser River for spawning; in a usual year there are 10 million. A Royal Commission was set up by the government to understand why 9 million salmon were missing…the best theory was that new salmon farms, located along Vancouver Island where the young salmon pass on their way out to sea, were polluting the water. While the commission was investigating, this year’s run came in at 35 million salmon. The most in memory! But back to pulsars…

Pulsars are way out there in terms of extreme environments. I was amazed when one of the young people asked, “Are there any practical applications?” I started in science because I thought it was fun…thank goodness there have been a FEW practical applications since then! By the way, the electricity for my laptop comes from hydroelectric power in northern BC…more water at work. Some time ago I was talking with the president of the University of Victoria, David Turpin. (The University of Victoria is one of the 11 universities that own and operate TRIUMF.) He was telling me about their new underwater projects off the coast of Vancouver Island, called NEPTUNE and VENUS. Very cool stuff. The ocean floor is being instrumented to monitor all kinds of ecological and geophysical indicators, establishing some of the first time-series trends across large regions of the ocean.

Then I remembered another way water is connected: the U.S. particle physics and nuclear physics science budgets are part of the Energy and Water appropriations package passed by Congress. How weird is that? Then I watched a Discovery Channel TV show on the crash of the Air France jet over the Atlantic on its way from Brazil to Paris. They think the problem was super-cooled water, formed in a thunderstorm at 35,000 feet, that clogged the air-speed detectors, rendering them useless. This may have effectively caused much of the onboard software to fail.

The colour of water is blue, as you probably know; it isn’t clear. In fact, water is thought to be the only material whose colour comes from vibrational transitions of the molecule rather than from photons interacting with the electrons in the material.

Oh, all this talk about water makes me thirsty for a nice glass of cold (not too cold, maybe 25 degrees), pure (OK, not too pure, I do want a few minerals), bubbly of course, CO2-filled (does that count as carbon sequestration?) blue liquid water…derived from outgassing of the Earth’s mantle after it solidified from molten rock in the early Earth, producing an atmosphere with water vapour…or perhaps derived from a comet…or maybe it was an asteroid, or maybe interstellar dust grains with fractal surfaces, that crashed to Earth 4.5 billion years ago!

Now, what was I doing before this?  And why am I all wet?


–by T. “Isaac” Meyer, Head of Strategic Planning and Communications

We worked last week to finalize and submit a position paper to the Government of Canada as part of their “Expert Review Panel on Federal Support to Research & Development.” Our thesis was that national laboratories, especially those that span the spectrum from basic research to applied technology, are a natural environment for academic scientists to mingle with hard-nosed business people…the result: better understanding, more-aligned expectations, and ultimately easier partnerships for identifying the good ideas and taking them to market.

So, for fun, here are some excerpts from our contentions.

One of the compelling drivers for public investment in research and development is the hoped-for outcome of economic growth through innovation, knowledge transfer, and commercialization of new products or technologies. The natural timescales for these benefits are often much longer than individual businesses can afford. In the 21st century, nations are increasingly concerned with optimizing the economic benefits of R&D for their own citizens and with competing successfully with other countries around the world.

A national laboratory with good networks and open-access policies provides a fertile environment for business innovation to get started.  That is, when businesses frequently and informally intersect with academic research, the likelihood of a firm choosing innovation as a business strategy greatly improves.  Most businesses get started with one or two ideas—tunnel vision is then required to get them from the garage to full-market penetration.  By interacting with a laboratory, businesses are exposed to the broader spectrum of technologies and skills ancillary to their original product.

For instance, with TRIUMF’s long history of medical-isotope production with Nordion and its strong academic connections to the UBC Department of Chemistry, it was natural for Nordion to return to TRIUMF and its research partners to develop new radiochemical products. The work uses a cost-shared approach that takes advantage of a Government of Canada program matching each private-sector dollar with a public dollar to support joint research. A preliminary patent on a new product has just been filed.

Businesses need to perform some of their own R&D. We are no longer in the golden age of the last century, when monolithic corporations could afford elite research labs that drove breakthrough after breakthrough from the lab bench to the marketplace. More and more, the model for big-business innovation and product development is to partner with the best teams around the world. For instance, General Electric’s medical-cyclotron division, based in northern Europe, came to TRIUMF in Canada in 2009 to discuss options for partnering on the development of a third-generation cyclotron that would be unit-sized, sit on a table-top, and operate at the push of a button.

Today’s world separates “pre-competitive R&D” and “competitive R&D” where the “R” in the latter is much less than the “D.”  Pre-competitive R&D takes place before high-value intellectual property is developed and is typically performed in a collaborative partnership.  Because pre-competitive R&D has shared benefits, it typically uses shared resources and shared talents with regular participation of public funds.  Businesses regularly seek competitive leveraging of their funds with public monies on their topics of interest.

The next two steps after pre-competitive R&D are tricky: (1) determining when the research is moving into competitive technology development, and (2) performing the competitive R&D. The first person to say that a technology is ready for field testing and commercialization is likely the academic; the last person to say that a technology is ready for market analysis and commercialization is likely the business partner. In between these extremes is the so-called “valley of death.” Pitched in these terms, however, the challenge is not just technological; it is one of communication and understanding. The second tricky part, performing the R&D in an IP-protecting fashion that respects the proprietary nature of the work, is more feasible and usually requires a high degree of focus. Experience is the best teacher here, and thus businesses engaged in R&D need to mix with each other as well as with academia.

Businesses need to be involved in performing their own proprietary R&D and in partnering with selected teams on it.  This capability allows them (a) to stay abreast of the market and even develop their own forecasting abilities, and (b) to more quickly deploy new technologies and products.  Today’s globalized world doesn’t allow much time for “catch up.”  If the competition releases a new product or feature, depending on the industry, you have six months or even just six days to respond.

In Canada, the national laboratories and several public-sector programs (e.g., CECRs) are becoming more effective at lowering this initial barrier to relatedness and understanding.  Laboratories are in regular communication and contact with businesses as vendors, customers, and sponsors.  Businesses work with engineers and technical staff at laboratories to build and deliver one-of-a-kind equipment and often have informal consultations with key laboratory staff about new product ideas or performance constraints.  Academic researchers relate to laboratories as meeting grounds and expert resources for technical projects.  Driven by budgets and promised milestones, laboratories deliver progress and performance on a schedule.  Taken together, these attributes can make national laboratories a natural nexus for businesses and universities to get to know each other and to work alongside one another.

What do YOU think?  Do national laboratories play a unique or critical role in the national “ecosystem” for imagination, invention, or innovation?


Home again

Sunday, February 20th, 2011

I’m now back home after spending about a month at CERN. I was trying to think of how to describe a trip to CERN. Such trips are usually right at the borderline between manageable and overwhelming. There’s always someone you run into whom you should speak to about something. Sometimes it’s someone from another institute who’d benefit from your expertise on a given topic. Sometimes it’s someone who has expertise on a topic you’d benefit from. Sometimes it’s someone you’re working with on an analysis, and the group needs to figure out the best way to proceed. There are always friends, and there is no sharp line between friends and coworkers. Many of my collaborators are also friends. People who aren’t collaborators now may be at some point in the future, and anyone can provide useful insight into physics. Sometimes you hang out with friends with completely different physics interests and end up talking about some physics topic that is really useful to both of you. If you’re not based at CERN, you always need to get as much done as possible, since you won’t be back for a while. There is always something you can’t get done before you leave. There is always work to do, and you have to prioritize.

There is also the physical environment at CERN. CERN sits in the valley between the Jura mountains and the Alps. It’s beautiful. There is a vineyard right next to the lab. The wine from the region is exceptional. And the lab has old buildings identified by numbers alone, numbers that have no relation whatsoever to either their location or their function. At best the buildings are boring. When I stay at the dorms, I walk about 100 feet to work and about 50 feet to the cafeteria. It is really convenient to just eat, sleep, and work, which is basically what I did the last month.

Trips to CERN are always highly productive and incredibly exhausting. During this trip I:

  • Completed the training required to work inside the magnet
  • Attended a three-day meeting on the status of the electromagnetic calorimeter
  • Tested and repaired dozens of front-end electronics cards
  • Worked on getting the newly installed electromagnetic calorimeter supermodules commissioned and ready for data
  • Met with some visiting journalism students to discuss what we do in ALICE
  • Worked with collaborators on our data analysis
  • Attended ALICE meetings
  • Attended phone meetings for a pending paper
  • Attended group meetings over the phone

The last two sets of meetings were based in the US, so I would call in to them at 9 PM. Most Mondays, Tuesdays, and Wednesdays I had phone meetings at 8 or 9 PM, after a full day of work.

So I worked on many things on this trip and worked long hours every day. This is normal for trips to CERN – they are exhausting. And productive. Every day is different. No days are easy. There is nothing about this job that is routine. I spent time climbing around inside the ALICE magnet fixing electronics, worked on outreach to the public, fixed electronics, discussed our analysis method, worked on writing a paper and an analysis note… I learned, taught, listened to others give talks, gave talks… All trips to CERN are really busy, but this was a busier trip than usual. And really productive.

I am developing a love-hate relationship with the CERN cafeteria. It is a much better cafeteria than most cafeterias in the US – definitely better than either the Oak Ridge National Lab or the Brookhaven National Lab cafeterias, both of which I know far too well. But it is still cafeteria food. It’s really convenient so when I need to get a lot of work done it is really easy to just eat at the cafeteria rather than going out or trying to use the kitchens in the dorms to cook. Right now they’re remodeling the main kitchen in the dorm so it’s even tougher than usual to cook at CERN. The first meal I had at home was fajitas with lots of guacamole and hot salsa that is actually spicy.  Next up:  BBQ.

I want to get back to my morning runs, which aren’t so easy when I’m at CERN. The weather in Tennessee should be really nice for hiking soon and I missed the Smokies. I won’t miss 9 PM phone meetings. (I might have to call into some 8 AM phone meetings instead…) So it’s good to be home.


Publish now?

Friday, February 18th, 2011

It’s a busy time. First, the LHC was closed up today for the first time this year, allowing the start of machine checkout and then, eventually, circulating beams. The beginning of the 2011 run is in sight, although we won’t have collisions for physics for a while yet. Also, we’re getting close to winter conference season. The Rencontres de Moriond meetings are traditionally a venue for the presentation of new experimental physics results, and you can be sure that all of the LHC experiments are readying some interesting stuff for that. I have previously discussed the internal review processes of experiments, which can take a while, so even though the conferences are a few weeks away, a lot of analyses are becoming finalized and starting to be reviewed right now. Whether you are a reviewer or reviewee (or both), it can take a lot of time. (Oh, and then there is the recent discussion of federal budget cuts in Washington, which has us all reading the newspapers pretty closely.) So we don’t lack for things to do.

But, meanwhile, here is something to consider. The ATLAS and CMS experiments are ultimately very similar: they have similar goals (which are different from those of ALICE and LHCb, hence their absence from this discussion), similar enough capabilities (although with differing strengths and weaknesses), and they both record pretty much the same amount of data. So why don’t they publish the same measurements at the same time? Just as an example, the two experiments submitted publications on measurements of the rates of W and Z bosons three months apart, with the later one analyzing ten times as much data (and having much more precise results) than the first. Please note, in an attempt to be neutral, I am not naming names!! Let’s instead take this as an introduction to a broader question: given that the LHC will continue to pile up data over time, when do you stop and say, “OK, let’s publish with what we’ve got?” How much data is enough?

I’m not going to claim to have all the answers to this question, and for any given measurement there will be a unique set of circumstances. But here are a few possible considerations:

  • Is there a break in the action at the LHC? This is a totally pedestrian consideration, but if the LHC isn’t going to run for, say, three months, as is happening right now, for many measurements it might not be worth the wait for more data, so you should just publish with what you’ve got. There are going to be a lot of publications based on the data recorded in 2010. It’s true that in 2011, if the collision rates are as expected, the 2010 data will quickly be superseded, but why wait those few months, especially if you are doing a measurement in which additional statistics might not make a meaningful difference?
  • When can I make a scientific statement that has sufficient impact on the world? If you only have enough data to make a measurement that’s, for instance, ten times less accurate than the most accurate measurement of the same quantity currently available, there’s no point in publishing. But if you are at least comparable to the best measurement (even if not yet the best), it might make sense to publish, because it’s accurate enough to make a difference in the world’s understanding. If you average two measurements of equal precision, the uncertainty of the average is smaller than either individual uncertainty by a factor of 1/sqrt(2); that is, the average is about 1.4 times more accurate (see the short derivation after this list). Seems worth it, right?
  • Am I worried that someone else is going to beat me to something? Let’s face it, there is some glory to being first, especially if there is something new to report. If you are worried that competitors might get to it first, perhaps you will decide that you have to release your result, even if you know you might do a better job yet, either by recording more data or just having more time to work on it.
  • Then again, it’s better to be second than to be wrong. A wrong result would be embarrassing, for sure, so it’s better to do the work necessary to have greater confidence in the result.
  • If you really can do a much better job with not much more time or effort, why not just do that? If you do, then your measurement is going to be the one in the history books, even if you weren’t first.
  • Do I finally have enough data to report a statistically significant result? Well, this is what we’re all waiting for: at some point some new phenomenon is going to emerge from the data. At first, the statistical strength will be marginal, but as more data are analyzed, the signal will stand out more strongly. You can be sure that once any anomaly is observed, even at a low level, it will be tracked very carefully as additional data are recorded, and as soon as an effect reaches some level of statistical significance, it’s going to be published just as quickly as possible.
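To make the averaging argument concrete, here is the standard error propagation for combining two uncorrelated measurements x_1 and x_2 that share the same uncertainty sigma (a textbook result, not anything specific to the LHC analyses above):

\bar{x} = \frac{x_1 + x_2}{2}, \qquad
\sigma_{\bar{x}} = \frac{1}{2}\sqrt{\sigma^2 + \sigma^2} = \frac{\sigma}{\sqrt{2}} \approx 0.71\,\sigma

The combined uncertainty is about 71% of either input, which is the factor-of-1.4 improvement in accuracy mentioned above.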

These are just a few of my own musings, dashed off quickly — I invite our readers to offer ideas of their own. (OK, and now I click on the “Publish” button on the right….)


Last time I posted, we looked at the “Eightfold way” classification of mesons. We argued that this is based only on symmetry and allowed physicists in the 60s to make meaningful predictions about mesons even though mesons are ultimately complicated “non-perturbative” objects where quarks and anti-quarks perform an intricate subatomic ‘dance’ (more on this below!).

Historical models of mesons

In fact, physicists even developed theories of mesons as fundamental particles—rather than bound states of quarks—which accurately described the observed light meson masses and interactions. These theories were known as “phenomenological” models, chiral perturbation theory, or nonlinear sigma models. These are all fancy names for the same idea.
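For readers who want the equations (a sketch in one common convention; the discussion below keeps things qualitative), the leading-order nonlinear sigma model packages the pion fields π^a(x) into a matrix U and gives them the Lagrangian

\mathcal{L} = \frac{f_\pi^2}{4}\,\mathrm{Tr}\!\left[\partial_\mu U \,\partial^\mu U^\dagger\right],
\qquad U(x) = \exp\!\left(\frac{i\,\pi^a(x)\,\sigma^a}{f_\pi}\right),

where f_π ≈ 93 MeV is the pion decay constant and σ^a are the Pauli matrices. Expanding U in powers of the pion fields reproduces their kinetic terms plus an infinite series of interactions, with no quarks anywhere in sight.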

The non-linear sigma model is a useful tool even in modern particle physics, as evidenced by the so-called little Higgs models. In these models the Higgs boson is relatively light due to a mechanism called collective symmetry breaking, in which multiple symmetries must be broken to generate a Higgs mass. (For technical introductions for physicists, see here and here.) This idea that light particles come from broken symmetries has its origin in “phenomenological” models of mesons via the Goldstone mechanism.

From a formal point of view these models suffered from a theoretical sickness: while they agreed well with experiment at low energies, they didn’t seem to make much sense if you used them to calculate predictions for high energies. It’s not that the predictions didn’t match experiments; it’s that the theory seemed to make no predictions at all! (Alternately, its predictions were nonsense.) The technical name for this illness is non-renormalizability, and it was the American Nobel Laureate Ken Wilson who really clarified the correct way to understand these theories.
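To make “no predictions at high energies” concrete, here is the standard back-of-the-envelope estimate (my illustration, not a calculation from the text above): in a theory of pions, the scattering amplitude grows with energy roughly as

\mathcal{A}(\pi\pi \to \pi\pi) \sim \frac{E^2}{f_\pi^2},

where f_π ≈ 93 MeV is the pion decay constant. Around E ~ 4π f_π ≈ 1 GeV the implied probabilities exceed one, which is nonsense; that energy marks the edge of the theory’s domain of validity.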

Ken Wilson (b. 1936) may not have the public fame of Richard Feynman or Robert Oppenheimer, but he is without a doubt one of the great American theoretical physicists of the past century. His research focused on the theoretical framework of quantum field theory and its applications to both particle physics and condensed matter physics. He was one of the great thinkers of our field who really understood the “big idea,” and I think he is nothing short of a hero of modern physics.

Rather than going into the precise sense in which a non-renormalizable theory is a ‘sick’ theory, let’s emphasize Wilson’s key insight: these sick theories are fine as long as we are careful to ask the right questions. Wilson made this statement in a much more mathematically rigorous and elegant way—but in this post we’ll focus on getting the intuition correct.

Effective theories

The point is that these “non-renormalizable” theories are just approximations to the behavior of a more fundamental theory; we call the approximation an effective theory (here’s a very old post on the big idea). These approximations get the “rough behavior” correct but don’t sweat the details. If you then ask the approximate theory about the details it neglects, it gives you a gibberish response. Wilson taught us to understand the gibberish as the theory saying, “I’m not sophisticated enough to answer that!”
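In equations, Wilson’s picture can be sketched in standard effective-field-theory notation (my shorthand, not written out in the original post): physics below some energy cutoff Λ is described by the renormalizable terms plus an infinite tower of corrections, with dimensionless coefficients c_i multiplying operators O_i of mass dimension d_i > 4:

\mathcal{L}_{\text{eff}} = \mathcal{L}_{\text{renormalizable}} + \sum_i \frac{c_i}{\Lambda^{\,d_i-4}}\,\mathcal{O}_i

At energies E much less than Λ each extra term is suppressed by powers of E/Λ, which is why the approximation works; as E approaches Λ, all the terms matter equally and the expansion breaks down.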

Here’s a concrete example. One of my previous posts presented a pixelated image of the Mona Lisa to demonstrate “lattice QCD.” (This is actually exactly the effective theory that Ken Wilson was working on.)
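If you want to play with this blurring yourself, here is a minimal sketch of the block-averaging behind a “pixelated” image (the function name and toy data are my own inventions for illustration; only numpy is assumed):

import numpy as np

def coarse_grain(image, block):
    """Average each block x block patch into a single 'pixel'."""
    h, w = image.shape
    h -= h % block  # trim edges so the shape divides evenly
    w -= w % block
    patches = image[:h, :w].reshape(h // block, block, w // block, block)
    return patches.mean(axis=(1, 3))  # one number per patch

# A toy "fundamental" image: a smooth gradient plus fine-grained noise.
rng = np.random.default_rng(0)
fundamental = np.linspace(0.0, 1.0, 64)[:, None] * np.ones((64, 64))
fundamental += 0.3 * rng.standard_normal((64, 64))

# The "effective" image: details are gone, coarse features survive.
effective = coarse_grain(fundamental, block=8)
print(fundamental.mean(), effective.mean())  # nearly identical averages

Block-averaging is exactly the “fuzzy glasses” operation: coarse questions (the overall gradient, the average brightness) survive it, while the fine-grained details do not.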

The pixelated Mona Lisa is an “effective” image with details blurred out compared to the “fundamental” image. Even with these details removed, from far away the images look the same. In fact, the effective image is sufficient to answer questions like

  • What is the overall color of the image or of different patches of the image? (Beige/brown)
  • How many figures are in the image? (One… but keep this in mind for later)

On the other hand, the effective Mona Lisa is completely unequipped to answer more subtle questions like

  • Where is the Mona Lisa looking?
  • Is the Mona Lisa happy or sad?

Okay, arguably even art historians can’t come up with answers to those questions. But the point is that the pixelated image can’t even begin to try to answer them—the questions ask about details that were left out of the “effective” image. Such questions are outside of the domain of validity of the effective image.

Now here’s a very important lesson in particle physics:

Models of particle physics also have a domain of validity, beyond which they are ill equipped to make sensible predictions.

For some models, like the effective theories of mesons, asking questions outside of the model’s domain of validity leads to nonsense answers. On the other hand, within the domain of validity the models are perfectly predictive. In fact, different “effective models” have to agree when their domains of validity overlap. Here’s an example from an old post where classical electromagnetism is an effective theory for quantum electrodynamics, as manifested by the formula for the electric field.
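As one standard example of such matching (a sketch, keeping only the leading correction, and not necessarily the formula that old post used): the Coulomb potential of classical electromagnetism picks up a small vacuum-polarization correction in QED at distances much shorter than the electron Compton wavelength,

V(r) \approx -\frac{\alpha}{r}\left[1 + \frac{2\alpha}{3\pi}\,\ln\!\frac{1}{m_e r} + \cdots\right],
\qquad r \ll \frac{1}{m_e},

and the correction switches off at long distances, so the two theories agree precisely where their domains of validity overlap.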

Dancing with the quarks

Now let’s get back to mesons, albeit through an analogy. We know that a pion is really a quark and an anti-quark caught up in a subatomic dance. They spin about one another, exchange gluons, and can even interact with other particles as a joint entity. Here’s a rough picture:

But here’s the thing: that’s the picture that we see only if we can really look very closely and observe the quarks directly. This requires having front row seats at “Dancing with the Quarks” (or at least an HDTV). For someone who can only watch the broadcast at low resolution, the dance looks very different: everything is blurred out:

In fact, this is now just like the case of the pixelated Mona Lisa. Note that because the quarks are so meticulously coordinated, the blurry picture looks like there’s only one object dancing! We call that object a pion and we can make careful measurements of how it spins and interacts… all without knowing that if we only had better resolution we would actually see two quarks dancing in unison rather than one pion.

This brings us back to the state of particle physics in the 1960s. We can create an entire effective theory to describe the pion, but we have to accept that we’ve put on our fuzzy glasses and can’t make out any details. We can’t ask our effective theory something like “how many hands are in the picture above?” Well, it looks like two…but it’s hard to be sure. I could ask an even more difficult question: what is the gender of the dancers in the picture above? Now the effective theory completely falls apart. Any answer it can give must be manifestly wrong, because it doesn’t even know that there are two dancers, much less the gender of either. In the same way, the effective theories of mesons seemed to fall apart when you asked questions about energies higher than their regime of validity.

Modern Effective Theories

Let me end by remarking that even though the underlying goal of high energy physics is to probe nature at a fundamental level, effective theories are still incredibly useful tools.

  1. Matching theories to low-energy experiments. It is often the case that theories of exotic new particles at high energies are constrained by experiments that are conducted at much lower energies. For example, many models of new physics are limited by how they would affect the physics of ordinary W and Z bosons. By writing an effective theory of W and Z bosons that parameterizes the effect of new physics, we can provide robust bounds on the properties of whatever new particles appear at high energies. (For experts: these are the electroweak precision constraints; see hep-ph/0405040, hep-ph/0412166, hep-ph/0604111.) The analogy to the dancing quarks is to use the blurry picture to tell us: “I don’t know how many hands there are, but if there are more than two, then they have to be pretty close to one another.” (For experts: this approach has recently been applied to direct detection of dark matter.)
  2. “Phenomenological models.” In the previous case we simplify a calculation of a fundamental theory by working with an effective theory; this is a top-down approach. We can also consider the bottom-up approach, where we write down a model that describes known low-energy physics and figure out at what energy it breaks down. We can then predict that there should be some new physics, not encapsulated in our model, appearing at those energies. This is where we are with particle physics: we have observed a bunch of neat particles and measured their properties, but the entire framework breaks down somewhere around the TeV scale unless we have something like the Higgs boson appearing.
  3. Strong coupling and duality. This brings us back to mesons. Recall that our effective meson theory was a way for 1960s physicists to describe the particles coming out of early colliders without ever having to worry about the horrible non-perturbative QCD substructure that we now know is actually there. In some cases there is a much stronger relation between the fundamental and effective theories, and the two theories are said to be dual to one another. The 1990s were revolutionary for the development of formal dualities between seemingly unrelated theories: Witten’s web of dualities in M-theory, Seiberg duality in supersymmetric gauge theories, and gauge/gravity dualities like the AdS/CFT correspondence proposed by Maldacena. (For theoretical physics fans: those are some really big names in the field; each one of them is a MacArthur “Genius” fellow!)

Anyway, there’s a surprising amount of “deep” physics that one can glean from thinking about mesons… even if they are somewhat “boring” particles that aren’t even fundamental. The notion of effective field theory is one of the central pillars of particle physics (as well as statistical physics), and in fact perhaps provides the most solid intuition about the entire field of high energy physics.


Budget Problems Facing the U.S.

Wednesday, February 16th, 2011

If it seems that the news coming out of the U.S. has been grim lately…it is because it is. In a talk yesterday, Fermilab Director Pier Oddone laid out some very bleak circumstances that Fermilab and many other scientific ventures face in the coming year.

See Pier’s talk here

The short of it is that the budget put forward by the Republican-led House would slash domestic spending, in view of the fact that the U.S. would see a $1.5 trillion deficit in Fiscal Year 2011. That being said, these proposals actually affect less than 20% of the total budget and definitely don’t fix the whole problem.

However, the impact would be very real! Since the U.S. government has been operating on a continuing resolution (basically keeping 2010 spending and appropriations because Congress didn’t pass a new budget), a 20% cut imposed now actually amounts to a 40% cut for the rest of the year (the arithmetic is spelled out below)! As Pier said in his talk, this has a DIRE impact on Fermilab.
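To see where the 40% figure comes from, here is the arithmetic with round numbers (an illustration, not Fermilab’s actual figures). Call the annual budget B, and suppose six of the twelve months have already passed at the old spending rate, so B/2 has been spent. A 20% cut sets the new annual total to 0.8B, leaving

0.8B - 0.5B = 0.3B

for the remaining six months instead of the planned 0.5B, a reduction of 0.2B / 0.5B = 40%.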

In Pier’s words, it would be impossible to accommodate such cuts without major disruptions:

  1. Stop operation of all accelerators immediately
  2. Slow down projects to barely keep-alive levels
  3. Prepare layoffs of 20% of the staff or 400 employees
  4. Furlough staff for roughly two of the remaining six months

Coming on the heels of the announcement that there will be no extension of Tevatron running past 2011, this means that things would slow down very quickly in U.S. high-energy physics, with no clear sign of when they would pick up again.

Frankly, for a government that purports not to want to miss “its Sputnik moment,” the idea of drastically cutting funding for fundamental research seems just plain stupid! Innovation does not happen in a climate where people worry whether the lab they work at will still be there tomorrow…nor will “job creation” and “a balanced budget” come from cutting the small fraction of the overall budget that actually generates jobs and opportunities in the United States. The best quote I’ve heard describing this sort of approach to budgetary problem solving came from an article in the Washington Post:

“Making the government lean by cutting the most defensible (and productive) federal spending is akin to making an overweight aircraft fly by removing the engine!”

So what do we do? Write your congressman or congresswoman and tell them that gutting science is no way to build the future! There are instructions on the webpage below for how to get this letter written. I encourage all readers to write and help save great scientific programs such as Fermilab!
Write Congress


The House will vote on Thursday on an FY2011 budget appropriations bill that slashes the Department of Energy’s Office of Science, and thus high-energy physics funding, by 20 percent, even as the President’s FY2012 budget request shows support for science with relatively flat funding.

If the FY2011 cuts are enacted, they could force 400 layoffs and two months of furloughs at Fermilab, Director Pier Oddone said in a meeting with laboratory staff, which you can view here.

Several area newspapers and blogs have written about the House proposal and its potential impacts. See stories in Crain’s Chicago Business, Science Insider, Patch.com, Cosmic Variance, and Physics and Physicists.

Oddone addressed the dire situation of the FY2011 budget in his column in Fermilab Today Tuesday.

The proposed cuts for the Office of Science are a stunning 20 percent. Because we will be six months into the fiscal year by the time the final FY11 budget is passed, this would amount to a 40 percent cut for the remainder of the year and would be catastrophic not only for our laboratory but for all Office of Science labs. It would stop the operation of user facilities and lead to major layoffs and furloughs. We are working with our representatives to explain the consequences of such cuts on us, on the standing of our nation in science and innovation, and on how we will be viewed by our international partners.

Congressional actions so far seem to reflect a misunderstanding of the role of the Office of Science within a generally supportive atmosphere for science and innovation, as demonstrated by the bipartisan support of the America COMPETES Act. The Office of Science is the main agency for physical-science research in our nation and indispensable in the overall framework of scientific research. It provides the main user facilities, such as ours at Fermilab, the light sources, neutron sources, electron microscopes, nanoscience centers and large computational facilities that support scientific research and innovation carried out by thousands of people in universities and industries. Without the Office of Science, the scientific enterprise in our country would be crippled.


Scientists hoping to unravel the mystery of proton spin at the Relativistic Heavy Ion Collider (RHIC) have a new tool at their disposal — the first to directly explore how quarks of different types, or “flavors,” contribute to the overall spin of the proton. The technique, described in papers just published by RHIC’s STAR and PHENIX collaborations in Physical Review Letters, relies on the measurement of particles called W bosons, the mediators of the weak force responsible for the decay of radioactive nuclei.

Illustration of a new measurement using W boson production in polarized proton collisions at RHIC. Collisions of polarized protons (beam entering from left) and unpolarized protons (right) result in the production of W bosons (in this case, W-). RHIC's detectors identify the particles emitted as the W bosons decay (in this case, electrons, e-) and the angles at which they emerge. The colored arrows represent different possible directions, which probe how different quark flavors (e.g., “anti-up,” ū; and “down,” d) contribute to the proton spin.

Spin is a quantum property that describes a particle’s intrinsic angular momentum. Like charge and mass, it’s part of a particle’s identity, whose magnitude is the same for all particles of a given type. But unlike charge and mass, spin has a direction that can be oriented differently for individual particles of a given species.

Spin is used by a wide range of people, from astronomers studying the contents of the universe to doctors using an MRI (magnetic resonance imaging) machine to see inside the human body. But where spin comes from is still a mystery.

Physicists have long thought that the spin of a proton was simply the sum of the spins of its three component quarks. But experiments have shown that the quarks account for only about 25 percent of the proton’s spin. What accounts for the missing 75 percent? RHIC is the world’s only machine capable of colliding high-energy beams of polarized protons — a useful approach for investigating this question.
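For the record, the standard bookkeeping for this question (the Jaffe-Manohar spin decomposition, written schematically here; the articles above keep things qualitative) splits the proton’s spin of 1/2, in units of ħ, among quark spin, gluon spin, and orbital angular momentum:

\frac{1}{2} = \frac{1}{2}\,\Delta\Sigma + \Delta G + L_q + L_g

Here ΔΣ is the quark-spin contribution (the piece measured to be only about 25 percent, i.e. ΔΣ ≈ 0.25), ΔG is the gluon-spin contribution, and L_q and L_g are the quark and gluon orbital terms.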

After beginning polarized proton collisions at RHIC late in 2001, scientists first looked for the missing spin in the gluons, the particles that hold a proton’s quarks together via the strong force. But so far, gluons have been found to contribute much less to the proton’s spin than originally speculated.

Now, RHIC scientists have a new tool to guide their search. Thanks to new detection techniques and the ability to run polarized proton collisions at very high energies — 500 GeV, or 500 billion electron volts — scientists at both PHENIX and STAR are able to directly probe the polarization contributions from different flavored quarks (known by the names “up” and “down”) inside protons for the first time.

Read more about this new technique here.

-Karen McNulty Walsh, BNL Media & Communications


RCNP.

Tuesday, February 15th, 2011

An old-fashioned lecture hall. Peeking in nervously, I found it quiet and empty. One look brought back vivid memories of the last time I gave a talk there. That time I had been invited to speak at a workshop, and although I enjoyed the talk, there were clearly plenty of hadron physicists with no interest in superstring theory: I could see a few foreign visitors slip out just before my talk began. On the other hand, there were also people who came into the room at that very moment specifically to hear me. That scene floated back to mind. So, what kind of atmosphere would today bring?

The lecture hall I was quietly peeking into belongs to RCNP, the Research Center for Nuclear Physics at Osaka University. It feels to me like a place worthy of being called a castle of nuclear physics; I remember writing the same thing in an earlier diary entry. Stepping into RCNP, hearing my own footsteps echo through the quiet entrance hall, and pressing the elevator button at the back of the hall, I really do feel as though I have entered a castle.

A few minutes after peeking into the lecture hall, I was already in the middle of a heated discussion. The people of RCNP had kindly invited me this time with the delightful proposal that we discuss things thoroughly, and that promise was fulfilled in an instant. Toki-san and Hosaka-san are remarkable; I was thrilled. We argued, the two-hour seminar flew by (thank you to everyone who sat through it), and afterwards we kept up the enjoyable discussion without pausing for breath. It was tremendous fun.

For someone like me, who applies superstring theory to atomic nuclei, the views of researchers who feel nuclei in their bones every day, and who know the history of nuclear physics first-hand, are extremely valuable. Nuclear physics and particle physics are neighboring fields, yet there is little exchange between them. At least when I was a graduate student, there was none; it was exactly zero. Having grown up in that environment, all I can do now is learn the sense of nuclear physics as I go. Fortunately, I have had the chance to get to know some superb nuclear physicists, and to debate with them to my heart’s content, as I did here.

Giving my own seminar is important, of course, and the comments on it were very valuable, but on this visit to RCNP I had set myself another mission: to put one question to my hosts. That question is a key point in deciding the direction of my future research, and its answer would clearly also gauge whether that research program can realistically be carried out. The people who could answer it were, at least for me, very few, so this was a precious opportunity.

In the end I received a very concrete answer. Part of it I could have imagined myself, but having it stated explicitly gave it substance and turned it into a clear goal. Another part was entirely different from what I had imagined. Things that are obvious to people who have spent their careers on nuclei are not at all obvious to me; that is both an advantage and a disadvantage. Unless I keep a firm grip on which aspects are which, the significance of my research could easily tumble away. Being able to discuss at length, receive some severe comments, and look back objectively at my own results was a very big harvest from this visit.

On the flight home I fell asleep instantly, and once home it was my turn for the nighttime childcare shift.

The move is only four days away now. Life is fun.


by Nigel S. Lockyer, Director

The short answer is YES. And not because it’s an easier way to carry two cups of coffee without spilling them, or because elevators remind me of Albert Einstein’s famous musings about acceleration and frames of reference. No, it’s because the TRIUMF main office building presently has no elevator.

The auditorium and the theory meeting room are on the second floor. During the rainy season (OK, let me be more precise, during the winter months), when Vancouverites are not enjoying beach volleyball by the thousands, pounding the Grouse Grind, sailing on English Bay, or picking up after their dogs, science-curious people (more than 100 of them) show up at TRIUMF for two lectures on numerous Saturdays during the school year. The lecturers come from Simon Fraser University, the University of British Columbia, or TRIUMF.

Attendance is strictly limited by our auditorium size. People are turned away. Some gung-ho high-school students travel an hour by bus, drag their mom or dad along, and listen to lectures about cosmology, radiation therapy for ocular melanoma at TRIUMF using a proton beam, the Higgs boson, exploding stars…and so on. But what happens when grandma or grandpa shows up? Well, sometimes she or he gets a helping hand up the stairs, and in some cases it just doesn’t work…and they don’t come back. OK, enough is enough. Somebody build us an elevator!

It’s not just for the public. When we have a Board of Management meeting at TRIUMF (a group of 40 people, vice-presidents of research from our owner universities), the on-site catering staff must carry 40 lunches and desserts up and down the stairs. Let’s see now: 40 people who each drink two coffees and one water, that’s 120 pounds, then add food…and it is a lot of trips. OK, enough is enough. Somebody build us an elevator!

Our former Director, one of the founders of TRIUMF, is Erich Vogt. A tall man, a heroic man. A bionic man…well, at least a bionic knee. He fights his 80-plus-year-old body up those stairs every day to his second-floor office. OK, enough is enough. Somebody build us an elevator!

A Vancouver hero and Olympic icon is Rick Hansen, of Man in Motion tour fame. He was a student at UBC, and yet he has never been to TRIUMF. One of my goals is to get Rick to TRIUMF; I am a big fan of his, as are many at TRIUMF and across Canada. However, we can’t get him to the auditorium, since he uses a wheelchair. OK, enough is enough. Somebody build us an elevator!

Finally, we got it. Within about two months, TRIUMF will have a new elevator.

Oh, I forgot to tell you, it goes up only one floor.
