## Hot Topics

### Taking Aim at Brain Cancer

Particle physics technology has found many applications in medicine, including diagnosing disease and providing powerful imaging tools. Efforts underway at laboratories around the world are investigating how the same technologies used to explore the frontiers of physics could treat some of the most aggressive forms of cancer. For example, a carefully aimed beam of protons or heavy ions can target a tumor with great precision, sparing surrounding tissues. This approach can be especially beneficial when treating brain tumors.

### Particle Beam Cancer Therapy: The Promise and Challenges

By Brookhaven | March 4, 2014
Advances in accelerators built for fundamental physics research have inspired improved cancer treatment facilities. But will one of the most promising—a carbon ion treatment facility—be built in the U.S.? Participants at a symposium organized by Brookhaven Lab for the 2014 AAAS meeting explored the science and surrounding issues.

### Particle accelerators join fight against brain cancer

By Fermilab | January 13, 2014
One of the most common and aggressive types of malignant tumor originating in the human brain is called a glioblastoma multiforme. Patients diagnosed with this kind of tumor are told they have, on average, a little more than a year to live.

### How particle physics can save your life

By Fermilab | November 11, 2013
The same particle-physics technology used to understand the universe is also used to improve health and medicine. Accelerators and detectors play an important role in diagnosing disease, shrinking tumors and sterilizing medical equipment. Large-scale computing makes it possible to determine which potential new drugs are most likely to work before starting large-scale human trials. And particle-physics-trained scientists serve as medical physicists, making sure it all works as planned.

## Latest Posts

### Nobody understands quantum mechanics? Nonsense!

Saturday, March 8th, 2014

Despite the old canard about nobody understanding quantum mechanics, physicists do understand it. Given all of the interpretations ever conceived for quantum mechanics[1], this claim may seem a bit of a stretch, but like the proverbial ostrich with its head in the sand, many physicists prefer to claim they do not understand quantum mechanics, rather than just admit that it is what it is and move on.

What is it about quantum mechanics that generates so much controversy and even had Albert Einstein (1879 – 1955) refusing to accept it? Three points: quantum mechanics is probabilistic, it eschews realism, and it is local. Let us look at each in more detail.

1. Quantum mechanics is probabilistic, not deterministic. Consider a radioactive atom. It is impossible, within the confines of quantum mechanics, to predict when an individual atom will decay. There is no measurement or series of measurements that can be made on a given atom to allow me to predict when it will decay. I can calculate the probability that it will have decayed by a given time, or the time it takes half of a sample to decay, but not the exact time a given atom will decay (see the sketch after this list). This inability to predict exact outcomes, only probabilities, permeates all of quantum mechanics. No possible set of measurements on the initial state of a system allows one to predict precisely the result of all possible experiments on that state.
2. Quantum mechanics eschews realism[2]. This is a corollary of the first point. A quantum mechanical system does not have well-defined values for properties that have not been directly measured. This has been compared to the moon only existing when someone is looking at it. For deterministic systems one can always safely infer back from a measurement what the system was like before the measurement. Hence if I measure a particle’s position and motion I can infer not only where it will go but where it has come from. The probabilistic nature of quantum mechanics prevents this backward-looking inference. If I measure the spin of an atom, there is no certainty that it had only that value before the measurement. It is this aspect of quantum mechanics that most disturbs people, but quantum mechanics is what it is.
3. Quantum mechanics is local. To be precise, no action at point A will have an observable effect at point B that is instantaneous, or non-causal.  Note the word observable. Locality is often denied in an attempt to circumvent Point 2, but when restricted to what is observable, locality holds. Despite the Pentagon’s best efforts, no messages have been sent using quantum non-locality.
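
To make point 1 concrete, here is a minimal numerical sketch (with a toy half-life, nothing experiment-specific): quantum mechanics hands us only a survival probability for a single atom, and sampling possible decay times gives a different answer on every run.

```python
import math
import random

T_HALF = 1.0                                   # toy half-life, arbitrary units

def survival_probability(t):
    """Probability that a single atom has NOT decayed after time t."""
    return math.exp(-t * math.log(2) / T_HALF)

def sample_decay_time():
    """Draw one possible decay time; each call gives a different answer."""
    return random.expovariate(math.log(2) / T_HALF)

print(f"P(atom survives one half-life) = {survival_probability(1.0):.3f}")  # 0.500
print("three 'identical' atoms decay at t =",
      [round(sample_decay_time(), 2) for _ in range(3)])
```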

Realism, at least, is a common aspect of the macroscopic world. A baby quickly learns that the ball is behind the box even when he cannot see it. But much about the macroscopic world is not obviously deterministic, the weather in Vancouver for example (it is snowing as I write this). Nevertheless, we cling to determinism and realism like a child to his security blanket. It seems to me that determinism or realism, if they exist, would be at least as hard to understand as their absence. There is no theorem that states the universe should be deterministic and not probabilistic or vice versa. Perhaps god, contrary to Einstein’s assertion, does indeed like a good game of craps[3].

So quantum mechanics, at least at the surface level, has features many do not like. What has the response been? They have followed the example set by Philip Gosse (1810 – 1888) with the Omphalos hypothesis[4]. Gosse, a biblical literalist, had trouble with the geological evidence that the world was more than 6,000 years old, so he came up with an interpretation of history in which the world was created only 6,000 years ago but in such a manner that it appeared much older. This can be called an interpretation of history because it leaves all predictions for observations intact but changes the internal aspects of the model so that they match his preconceived ideas. To some extent, Tycho Brahe (1546 – 1601) used the same technique to keep the earth at the center of the universe. He had the earth fixed, the sun circling the earth, and the other planets circling the sun. With the information available at the time, this was consistent with all observations.

The general technique is to adjust those aspects of the model that are not constrained by observation to make it conform to one’s ideas of how the universe should behave. In quantum mechanics these efforts are called interpretations. Hugh Everett (1930 – 1982) proposed many worlds in an attempt to make quantum mechanics deterministic and realistic. But it was only in the unobservable parts of the interpretation that this was achieved, and the results of experiments in this world are still unpredictable. Louis de Broglie (1892 – 1987) and later David Bohm (1917 – 1992) introduced pilot waves in an effort to restore realism and determinism. In doing so, they gave up locality. Like Gosse’s work, theirs was a nice proof in principle that, with sufficient ingenuity, the universe can be made to conform to almost any preconceived ideas, or at least appear to do so. Reassuring, I guess, but as with Gosse it was done by introducing non-observable aspects to the model: not just unobserved but in principle unobservable. The observable aspects of the universe, at least as far as quantum mechanics is correct, are as stated in the three points above: probabilistic, nonrealistic and local.

Me, I am not convinced that there is anything to understand about quantum mechanics beyond the rules for its use given in standard quantum mechanics textbooks. However, interpretations of quantum mechanics might, possibly might, suggest different ways to tackle unsolved problems like quantum gravity, and they do give one something to discuss after one has had a few beers (or is that a few too many beers?).

[1] See my February 2014 post “Reality and the Interpretations of Quantum Mechanics.”

[2] Realism as defined in the paper by Einstein, Podolsky and Rosen, Physical Review 47 (10): 777–780 (1935).

[3] Or dice.

### My Week as a Real Scientist

Thursday, March 6th, 2014

For a week at the end of January, I was a real scientist. Actually, I’m always a real scientist, but only for that week was I tweeting from the @realscientists Twitter account, which has a new scientist each week typing about his or her life and work. I tweeted a lot. I tweeted about the conference I was at. I tweeted about the philosophy of science and religion. I tweeted about how my wife, @CuratorPolly, wasn’t a big fan of me being called the “curator” of the account for the week. I tweeted about airplanes and very possibly bagels. But most of all I tweeted the answers to questions about particle physics and the LHC.

Real Scientists wrote posts for the start and end of my week, and all my tweets for the week are at this Storify page. My regular twitter account, by the way, is @sethzenz.

I was surprised by how many questions people had when they were told that a real physicist at a relatively high-profile Twitter account was open for questions. A lot of the questions had answers that can already be found, often right here on Quantum Diaries! It got me thinking a bit about different ways to communicate to the public about physics. People really seem to value personal interaction, rather than just looking things up, and they interact a lot with an account that they know is tweeting in “real time.” (I almost never do a tweet per minute with my regular account, because I assume it will annoy people, but it’s what people expect stylistically from the @realscientists account.) So maybe we should do special tweet sessions from one of the CERN-related accounts, like @CMSexperiment, where we get four physicists around one computer for an hour and answer questions. (A lot of museums did a similar thing with #AskACurator day last September.) We’ve also discussed the possibility of doing an AMA on Reddit. And the Hangout with CERN series will be starting again soon!

But while you’re waiting for all that, let me tell you a secret: there are lots of physicists on Twitter. (Lists here and here and here, four-part Symmetry Magazine series here and here and here and here.) And I can’t speak for everyone, but an awful lot of us would answer questions if you had any. Anytime. No special events. Just because we like talking about our work. So leave us comments. Tweet at us. Your odds of getting an answer are pretty good.

In other news, Real Scientists is a finalist for the Shorty Award for social media’s best science. We’ll have to wait and see how they — we? — do in a head-to-head matchup with giants like NASA and Neil deGrasse Tyson. But I think it’s clear that people value hearing directly from researchers, and social media seems to give us more and more ways to communicate every year.

### Particle Beam Cancer Therapy: The Promise and Challenges

Tuesday, March 4th, 2014

Advances in accelerators built for fundamental physics research have inspired improved cancer treatment facilities. But will one of the most promising—a carbon ion treatment facility—be built in the U.S.? Participants at a symposium organized by Brookhaven Lab for the 2014 AAAS meeting explored the science and surrounding issues.

by Karen McNulty Walsh

Accelerator physicists are natural-born problem solvers, finding ever more powerful ways to generate and steer particle beams for research into the mysteries of physics, materials, and matter. And from the very beginning, this field born at the dawn of the atomic age has actively sought ways to apply advanced technologies to tackle more practical problems. At the top of the list—even in those early days—was taking aim at cancer, the second leading cause of death in the U.S. today, affecting one in two men and one in three women.

Using beams of accelerated protons or heavier ions such as carbon, oncologists can deliver cell-killing energy to precisely targeted tumors—and do so without causing extensive damage to surrounding healthy tissue, eliminating the major drawback of conventional radiation therapy using x-rays.

“This is cancer care aimed at curing cancer, not just treating it,” said Ken Peach, a physicist and professor at the Particle Therapy Cancer Research Institute at Oxford University.

Peach was one of six participants in a symposium exploring the latest advances and challenges in this field—and a related press briefing attended by more than 30 science journalists—at the 2014 meeting of the American Association for the Advancement of Science in Chicago on February 16. The session, “Targeting Tumors: Ion Beam Accelerators Take Aim at Cancer,” was organized by the U.S. Department of Energy’s (DOE’s) Brookhaven National Laboratory, an active partner in an effort to build a prototype carbon-ion accelerator for medical research and therapy. Brookhaven Lab is also currently the only place in the U.S. where scientists can conduct fundamental radiobiological studies of how beams of ions heavier than protons, such as carbon ions, affect cells and DNA.

Participants in a symposium and press briefing exploring the latest advances and challenges in particle therapy for cancer at the 2014 AAAS meeting: Eric Colby (U.S. Department of Energy), Jim Deye (National Cancer Institute), Hak Choy (University of Texas Southwestern Medical Center), Kathryn Held (Harvard Medical School and Massachusetts General Hospital), Stephen Peggs (Brookhaven National Laboratory and Stony Brook University), and Ken Peach (Oxford University). (Credit: AAAS)

“We could cure a very high percentage of tumors if we could give sufficiently high doses of radiation, but we can’t because of the damage to healthy tissue,” said radiation biologist Kathryn Held of Harvard Medical School and Massachusetts General Hospital during her presentation. “That’s the advantage of particles. We can tailor the dose to the tumor and limit the amount of damage in the critical surrounding normal tissues.”

Yet despite the promise of this approach and the emergence of encouraging clinical results from carbon treatment facilities in Asia and Europe, there are currently no carbon therapy centers operating in the U.S.

Participants in the Brookhaven-organized session agreed: That situation has to change—especially since the very idea of particle therapy was born in the U.S.

Physicists as pioneers

“When Harvard physicist Robert Wilson, who later became the first director of Fermilab, was asked to explore the potential dangers of proton particle radiation [just after World War II], he flipped the problem on its head and described how proton beams might be extremely useful—as effective killers of cancer cells,” said Stephen Peggs, an accelerator physicist at Brookhaven Lab and adjunct professor at Stony Brook University.

As Peggs explained, the reason is simple: Unlike conventional x-rays, which deposit energy—and cause damage—all along their path as they travel through healthy tissue en route to a tumor (and beyond it), protons and other ions deposit most of their energy where the beam stops. Using magnets, accelerators can steer these charged particles left, right, up, and down and vary the energy of the beam to precisely place the cell-killing energy right where it’s needed: in the tumor.
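
A toy calculation illustrates that depth-dose behavior, known as the Bragg peak. The sketch below is illustrative only — the constants are made up, and real stopping power follows the Bethe-Bloch formula — but it captures the key scaling: in the nonrelativistic limit $$dE/dx \approx k/E$$, so the energy deposited per centimeter grows as the proton slows, peaking sharply just before it stops.

```python
# Toy Bragg-peak sketch (arbitrary constants, not real stopping-power data):
# step a proton through tissue with dE/dx ~ k/E and record the dose profile.

def toy_depth_dose(e0_mev=150.0, k=750.0, dx=0.01):
    """k (MeV^2/cm) is a toy constant chosen to give a ~15 cm range."""
    profile = []                       # list of (depth_cm, dose_mev_per_cm)
    e, x = e0_mev, 0.0
    while e > 1.0:                     # stop tracking near the end of range
        de = min(e, k / e * dx)        # energy lost in this step (toy scaling)
        profile.append((x, de / dx))
        e -= de
        x += dx
    return profile

profile = toy_depth_dose()
peak_depth = max(profile, key=lambda p: p[1])[0]
print(f"entrance dose ~ {profile[0][1]:.1f} MeV/cm, "
      f"peak at {peak_depth:.1f} cm of a ~{profile[-1][0]:.1f} cm range")
```

The dose is modest along the entry path and spikes at the end of the range, which is exactly the property that lets clinicians park the peak inside the tumor.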

The first implementation of particle therapy used helium and other ions generated by the Bevatron at Berkeley Lab. Those spin-off studies “established a foundation for all subsequent ion therapy,” Peggs said. And as accelerators for physics research grew in size, pioneering experiments in particle therapy continued, operating “parasitically” until the very first accelerator built for hospital-based proton therapy was completed with the help of DOE scientists at Fermilab in 1990.

But even before that machine left Illinois for Loma Linda University Medical Center in California, physicists were thinking about how it could be made better. The mantra of making machines smaller, faster, cheaper—and capable of accelerating more kinds of ions—has driven the field since then.

Advances in magnet technology, including compact superconducting magnets and beam-delivery systems developed at Brookhaven Lab, hold great promise for new machines. Peggs is working to incorporate these technologies in a prototype ‘ion Rapid Cycling Medical Synchrotron’ (iRCMS) capable of delivering protons and/or carbon ions for radiobiology research and for treating patients.

Brookhaven Lab accelerator physicist Stephen Peggs with magnet technology that could reduce the size of particle accelerators needed to steer heavy ion beams and deliver cell-killing energy to precisely targeted tumors while sparing surrounding healthy tissue.

Small machine, big particle impact

The benefits of using charged particles heavier than protons (e.g., carbon ions) stem not only from their physical properties—they stop and deposit their energy over an even smaller and better targeted tumor volume than protons—but also from a range of biological advantages they have over x-rays.

As Kathryn Held elaborated in her talk, compared with x-ray photons, “carbon ions are much more effective at killing tumor cells. They put a huge hole through DNA compared to the small pinprick caused by x-rays, which causes clustered or complex DNA damage that is less accurately repaired between treatments—less repaired, period—and thus more lethal [to the tumor].” Carbon ions also appear to be more effective than x-rays at killing oxygen-deprived tumor cells, and might be most effective in fewer higher doses, “but we need more basic biological studies to really understand these effects,” Held said.

Different types of radiation treatment cause different kinds of damage to the DNA in a tumor cell. X-ray photons (top arrow) cause fairly simple damage (purple area) that cancer cells can sometimes repair between treatments. Charged particles—particularly ions heavier than protons (bottom arrow)—cause more and more complex forms of damage, resulting in less repair and a more lethal effect on the tumor. (Credit: NASA)

Held conducts research at the NASA Space Radiation Laboratory (NSRL) at Brookhaven Lab, an accelerator-based facility designed to understand the risks radiation poses to future astronauts and to design protections for them. Much of that research is relevant to understanding the mechanisms and basic radiobiological responses that can apply to the treatment of cancer. But additional facilities and funding are needed for research specifically aimed at understanding the radiobiological effects of heavier ions for potential cancer therapies, Held emphasized.

Hak Choy, a radiation oncologist and chair in the Department of Radiation Oncology at the University of Texas Southwestern Medical Center, presented compelling clinical data on the benefits of proton particle therapy, including improved outcomes and reduced side effects when compared with conventional radiation, particularly for treating tumors in sensitive areas such as the brain and spine and in children. “When you can target the tumor and spare critical tissue you get fewer side effects,” he said.

Data from Japan and Europe suggest that carbon ions could be three or four times more biologically potent than protons, Choy said, backing that claim with impressive survival statistics for certain types of cancers where carbon therapy surpassed protons, and was even better than surgery for one type of salivary gland cancer. “And carbon therapy is noninvasive,” he emphasized.

To learn more about this promising technology and the challenges of building a carbon ion treatment/research facility in the U.S., including perspectives from the National Cancer Institute, DOE and a discussion about economics, read the full summary of the AAAS symposium here: http://www.bnl.gov/newsroom/news.php?a=24672.

Karen McNulty Walsh is a science writer in the Media & Communications Office at Brookhaven National Laboratory.

### CDMS result covers new ground in search for dark matter

Monday, March 3rd, 2014

The Cryogenic Dark Matter Search has set more stringent limits on light dark matter.

Scientists looking for dark matter face a serious challenge: No one knows what dark matter particles look like. So their search covers a wide range of possible traits—different masses, different probabilities of interacting with regular matter.

Today, scientists on the Cryogenic Dark Matter Search experiment, or CDMS, announced they have shifted the border of this search down to a dark-matter particle mass and rate of interaction that has never been probed.

“We’re pushing CDMS to as low mass as we can,” says Fermilab physicist Dan Bauer, the project manager for CDMS. “We’re proving the particle detector technology here.”

Their result, which does not claim any hints of dark matter particles, contradicts a result announced in January by another dark matter experiment, CoGeNT, which uses particle detectors made of germanium, the same material used by CDMS.

To search for dark matter, CDMS scientists cool their detectors to very low temperatures in order to detect the tiny energies deposited by the collisions of dark matter particles with the germanium. They operate their detectors half a mile underground in a former iron ore mine in northern Minnesota. The mine provides shielding from cosmic rays that could clutter the detector as it waits for passing dark matter particles.

Today’s result carves out interesting new dark matter territory for masses below 6 billion electronvolts. The dark matter experiment Large Underground Xenon, or LUX, recently ruled out a wide range of masses and interaction rates above that with the announcement of its first result in October 2013.

Scientists have expressed increasing interest of late in the search for low-mass dark matter particles, with CDMS and three other experiments—DAMA, CoGeNT and CRESST—all finding their data compatible with the existence of dark matter particles between 5 billion and 20 billion electronvolts. But such light dark-matter particles are hard to pin down. The lower the mass of the dark-matter particles, the less energy they leave in detectors, and the more likely it is that background noise will drown out any signals.
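
A back-of-the-envelope kinematics sketch shows why. In an elastic collision, the maximum recoil energy a dark matter particle of mass $$m_\chi$$ can give a nucleus of mass $$m_N$$ is $$E_{max} = 2\mu^2 v^2/m_N$$, where $$\mu$$ is the reduced mass. The numbers below — a germanium nucleus and a typical galactic halo speed of about 230 km/s — are illustrative assumptions, not CDMS's actual analysis:

```python
# Minimal elastic-scattering kinematics: why light dark matter deposits so
# little energy. Illustrative halo-style numbers, not an experiment's fit.

def max_recoil_kev(m_chi_gev, m_n_gev=67.6, v_over_c=7.7e-4):
    """Maximum recoil energy in keV.
    m_n_gev ~ 67.6 GeV for germanium; v ~ 230 km/s is a typical halo speed."""
    mu = m_chi_gev * m_n_gev / (m_chi_gev + m_n_gev)    # reduced mass, GeV
    return 2.0 * mu**2 * v_over_c**2 / m_n_gev * 1e6    # GeV -> keV

for m_chi in (6, 20, 100):
    print(f"m_chi = {m_chi:3d} GeV -> E_max ~ {max_recoil_kev(m_chi):5.1f} keV")
```

A 6 GeV particle can deposit at most about half a keV in germanium, roughly fifty times less than a 100 GeV one, which is why the low-mass search lives so close to the detector noise floor.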

Even more confounding is the fact that scientists don’t know whether dark matter particles interact in the same way in detectors built with different materials. In addition to germanium, scientists use argon, xenon, silicon and other materials to search for dark matter in more than a dozen experiments around the world.

“It’s important to look in as many materials as possible to try to understand whether dark matter interacts in this more complicated way,” says Adam Anderson, a graduate student at MIT who worked on the latest CDMS analysis as part of his thesis. “Some materials might have very weak interactions. If you only picked one, you might miss it.”

Scientists around the world seem to be taking that advice, building different types of detectors and constantly improving their methods.

“Progress is extremely fast,” Anderson says. “The sensitivity of these experiments is increasing by an order of magnitude every few years.”

Kathryn Jepsen

### You should be hiring particle physicists

Monday, March 3rd, 2014

In case you haven’t figured it out already from reading the US LHC blog or any of the others at Quantum Diaries, people who do research in particle physics feel passionate about their work. There is so much to be passionate about! There are challenging intellectual issues, tricky technical problems, and cutting-edge instrumentation to work with — all in pursuit of understanding the nature of the universe at its most fundamental level. Your work can lead to global attention and even contribute to Nobel Prizes. It’s a lot of effort put in over long days and nights, but there is also a lot of satisfaction to be gained from our accomplishments.

That being said, a fundamental truth about our field is that not everyone doing particle-physics research will be doing that for their entire career. There are fewer permanent jobs in the field than there are people qualified to hold them. It is certainly easy to do the math for university jobs in particular — each professor may supervise a large number of PhD students over his or her career, but only one of them could possibly inherit that position in the end. Most of our researchers will end up working in other fields, quite likely in the for-profit sector, and as a field we do need to make sure that they are well-prepared for jobs in that part of the world.

I’ve always believed that we do a good job of this, but my belief was reinforced by a recent column by Tom Friedman in The New York Times. It was based around an interview with the Google staff member who oversees hiring for the company. The essay describes the attributes that Google looks for in new employees, and I couldn’t help but think that people who work in the large experimental particle physics projects such as those at the LHC have all of those attributes. Google is not just looking for technical skills — it goes without saying that they want those, and that particle physicists have such skills and great experience with digesting large amounts of computerized data. Google is also looking for social and personality traits that are important for success in particle physics.

(Side note: I don’t support all of what Friedman writes in his essay; he is somewhat dismissive of the utility of a college education, and as a university professor I think that we are doing better than he suggests. But I will focus on some of his other points here. I also recognize that it is perhaps too easy for me to write about careers outside the field when I personally hold a permanent job in particle physics, but believe me that it just as easily could have wound up differently for me.)

For example, just reading from the Friedman column, one thing Google looks for is what is referred to as “emergent leadership”. This is not leadership in the form of holding a position with a particular title, but knowing when a group needs you to step forward and lead on something, and when to step back and let someone else lead. While the big particle-physics collaborations appear to be massive organizations, much of the day-to-day work, such as the development of a physics measurement, is done in smaller groups that function very organically. When they function well, people do step up to take on the most critical tasks, especially when they see that they are particularly positioned to do them. Everyone figures out how to interact in such a way that the job gets done. Another facet of this is ownership: everyone who is working together on a project feels personally responsible for it and will do what is right for the group, if not the entire experiment — even if it means putting aside your own ideas and efforts when someone else clearly has a better one.

And related to that in turn is what is referred to in the column as “intellectual humility.” We are all very aggressive in making our arguments based on the facts that we have in hand. We look at the data and we draw conclusions, and we develop and promote research techniques that appear to be effective. But when presented with new information that demonstrates that the previous arguments are invalid, we happily drop what we had been pursuing and move on to the next thing. That’s how all of science works, really; all of your theories are only as good as the evidence that supports them, and are worthless in the face of contradictory evidence. Google wants people who take this kind of approach to their work.

I don’t think you have to be Google to be looking for the same qualities in your co-workers. If you are an employer who wants to have staff members who are smart, technically skilled, passionate about what they do, able to incorporate disparate pieces of information and generate new ideas, ready to take charge when they need to, feel responsible for the entire enterprise, and able to say they are wrong when they are wrong — you should be hiring particle physicists.

### B Decays Get More Interesting

Friday, February 28th, 2014

While flavor physics often offers a multitude of witty jokes (read: bad puns), I think I’ll skip one just this time and let the analysis speak for itself. Just recently, at the Lake Louise Winter Institute, a new result was released for the analysis looking for $$b\to s\gamma$$ transitions. Now, this is a flavor-changing neutral current, which cannot occur at tree level in the standard model. Therefore, the lowest-order diagram by which this decay can proceed is the one-loop penguin shown below to the right.

One loop penguin diagram representing the transition $$b \to s \gamma$$.

From quantum mechanics, photons can have either left-handed or right-handed circular polarization. In the standard model, the photon in the decay $$b\to s\gamma$$ is primarily left-handed, due to spin and angular momentum conservation. However, models beyond the standard model, including some minimal supersymmetric models (MSSM), predict a larger right-handed component to the photon polarization than the standard model does. So even though the decay rates observed for $$b\to s\gamma$$ agree with those predicted by the standard model, the photon polarization itself is sensitive to new physics scenarios.

As it turns out, the decays $$B^\pm \to K^\pm \pi^\mp \pi^\pm \gamma$$ are well suited to exploring the photon polarization, after playing a few tricks. The easiest way to understand why is to consider a picture.

Picture defining the angle $$\theta$$ in the analysis of $$B^\pm\to K^\pm \pi^\mp \pi^\pm \gamma$$. From the Lake Louise Conference Talk

In the picture to the left, we consider the rest frame of a possible resonance which decays into $$K^\pm \pi^\mp \pi^\pm$$. It is then possible to form the triple product $$p_\gamma\cdot(p_{\pi,slow}\times p_{\pi,fast})$$, which effectively defines the angle $$\theta$$ shown in the picture.

Now for the trick: Photon polarization is odd under parity transformation, and so is the triple product defined above. Defining the decay rate as a function of this angle, we find:

$$\frac{d\Gamma}{d\cos\theta}\propto \sum_{i=0,2,4}a_i \cos^i\theta + \lambda_\gamma\sum_{j=1,3} a_j \cos^j\theta$$

This is an expansion in Legendre polynomials up to the 4th order. The odd moments are those which would contribute to photon polarization effects, and $$\lambda_\gamma$$ is the photon polarization. Therefore, by looking at the decay rate as a function of this angle, we can directly access the photon polarization. Another way to access the same information is to take the asymmetry between the decay rate for events where the photon is above the $$\pi\pi$$ plane ($$\cos\theta > 0$$) and those where it is below ($$\cos\theta < 0$$). This is also proportional to the photon polarization and allows for a direct statistical calculation. We will call this the up-down asymmetry, or $$A_{ud}$$ (a rough sketch follows below). For more information, a useful theory paper is found here.
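
As a rough sketch of how $$A_{ud}$$ could be computed from reconstructed momenta (toy isotropic inputs below, not LHCb's actual analysis code): the sign of the triple product classifies each event as "up" or "down", and the asymmetry is just a counting experiment.

```python
# Hedged sketch of an up-down asymmetry from 3-momenta in the K pi pi rest
# frame. The toy inputs are isotropic, so A_ud should come out near zero.
import numpy as np

def up_down_asymmetry(p_gamma, p_pi_slow, p_pi_fast):
    """Each argument is an (N, 3) array of 3-momenta, one row per event."""
    triple = np.einsum("ij,ij->i", p_gamma, np.cross(p_pi_slow, p_pi_fast))
    n_up, n_down = np.sum(triple > 0), np.sum(triple < 0)
    return (n_up - n_down) / (n_up + n_down)

rng = np.random.default_rng(0)
fake = [rng.normal(size=(14000, 3)) for _ in range(3)]  # ~14k toy "signal" events
print(f"toy A_ud = {up_down_asymmetry(*fake):+.3f}")
```

For unpolarized toy events the asymmetry is consistent with zero within roughly $$1/\sqrt{N}$$; a significantly non-zero $$A_{ud}$$ in data is the signature of photon polarization.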

Enter LHCb. With the 3 fb$$^{-1}$$ collected over 2011 and 2012, containing ~14,000 signal events, the up-down asymmetry was measured.

Up-down asymmetry for the analysis of $$b\to s\gamma$$. From the Lake Louise Conference Talk

In bins of invariant mass of the $$K \pi \pi$$ system, we see the asymmetry is clearly non-zero, and varies across the mass range given. As seen in the note posted to the arXiv, the shapes of the fitted Legendre moments differ between mass bins, too. This corresponds to a 5.2$$\sigma$$ observation of photon polarization in this channel. What this means for new physics models, however, is not yet interpreted, though I’m sure the arXiv will be full of explanations within about a week.

### Scientists complete the top quark puzzle

Monday, February 24th, 2014

This Fermilab press release was published on February 24.

Matteo Cremonesi, left, of the University of Oxford and the CDF collaboration and Reinhard Schweinhorst of Michigan State University and the DZero collaboration present their joint discovery at a forum at Fermilab on Friday, Feb. 21. The two collaborations have observed the production of single top quarks in the s-channel, as seen in data collected from the Tevatron. Photo: Cindy Arnold

Scientists on the CDF and DZero experiments at the U.S. Department of Energy’s Fermi National Accelerator Laboratory have announced that they have found the final predicted way of creating a top quark, completing a picture of this particle nearly 20 years in the making.

The two collaborations jointly announced on Friday, Feb. 21, that they had observed one of the rarest methods of producing the elementary particle – creating a single top quark through the weak nuclear force, in what is called the s-channel. For this analysis, scientists from the CDF and DZero collaborations sifted through data from more than 500 trillion proton-antiproton collisions produced by the Tevatron from 2001 to 2011. They identified about 40 particle collisions in which the weak nuclear force produced single top quarks in conjunction with single bottom quarks.

Top quarks are the heaviest and among the most puzzling elementary particles. They weigh even more than the Higgs boson – as much as an atom of gold – and only two machines have ever produced them: Fermilab’s Tevatron and the Large Hadron Collider at CERN. There are several ways to produce them, as predicted by the theoretical framework known as the Standard Model, and the most common one was the first one discovered: a collision in which the strong nuclear force creates a pair consisting of a top quark and its antimatter cousin, the anti-top quark.

Collisions that produce a single top quark through the weak nuclear force are rarer, and the process scientists on the Tevatron experiments have just announced is the most challenging of these to detect. This method of producing single top quarks is among the rarest interactions allowed by the laws of physics. The detection of this process was one of the ultimate goals of the Tevatron, which for 25 years was the most powerful particle collider in the world.

“This is an important discovery that provides a valuable addition to the picture of the Standard Model universe,” said James Siegrist, DOE associate director of science for high energy physics. “It completes a portrait of one of the fundamental particles of our universe by showing us one of the rarest ways to create them.”

Searching for single top quarks is like looking for a needle in billions of haystacks. Only one in every 50 billion Tevatron collisions produced a single s-channel top quark, and the CDF and DZero collaborations only selected a small fraction of those to separate them from background, which is why the number of observed occurrences of this particular channel is so small. However, the statistical significance of the CDF and DZero data exceeds that required to claim a discovery.
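
For reference, "required to claim a discovery" conventionally means a one-sided five-sigma excess. A small stdlib sketch (the general particle-physics convention, not the experiments' own likelihood calculation) converts sigmas to a p-value:

```python
# The "five sigma" discovery convention: a one-sided 5-sigma Gaussian tail
# corresponds to roughly a 1-in-3.5-million chance that background alone
# produces an excess at least that large.
import math

def sigma_to_pvalue(n_sigma):
    """One-sided Gaussian tail probability for an n_sigma excess."""
    return 0.5 * math.erfc(n_sigma / math.sqrt(2.0))

print(f"5 sigma -> p ~ {sigma_to_pvalue(5.0):.2e}")   # ~2.9e-07
```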

“Kudos to the CDF and DZero collaborations for their work in discovering this process,” said Saul Gonzalez, program director for the National Science Foundation. “Researchers from around the world, including dozens of universities in the United States, contributed to this important find.”

The CDF and DZero experiments first observed particle collisions that created single top quarks through a different process of the weak nuclear force in 2009. This observation was later confirmed by scientists using the Large Hadron Collider.

Scientists from 27 countries collaborated on the Tevatron CDF and DZero experiments and continue to study the reams of data produced during the collider’s run, using ever more sophisticated techniques and computing methods.

“I’m pleased that the CDF and DZero collaborations have brought their study of the top quark full circle,” said Fermilab Director Nigel Lockyer. “The legacy of the Tevatron is indelible, and this discovery makes the breadth of that research even more remarkable.”

Fermilab is America’s national laboratory for particle physics research. A U.S. Department of Energy Office of Science laboratory, Fermilab is located near Chicago, Illinois, and operated under contract by the Fermi Research Alliance, LLC. Visit Fermilab’s website at www.fnal.gov and follow us on Twitter at @FermilabToday.

The DOE Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

### A second chance at sight

Monday, February 17th, 2014

Silicon microstrip detectors, a staple in particle physics experiments, provide information that may be critical to restoring vision to some who lost it.

In 1995, physicist Alan Litke co-wrote a particularly prescient article for Scientific American about potential uses for an emerging technology called the silicon microstrip detector. With its unprecedented precision, this technology was already helping scientists search for the top quark and, Litke wrote, it could help discover the elusive Higgs boson. He further speculated that it could perhaps also begin to uncover some of the many mysteries of the brain.

As the article went to press, physicists at Fermilab announced the discovery of the top quark, using those very same silicon detectors. In 2012, the world celebrated the discovery of the Higgs boson, aided by silicon microstrip detectors at CERN. Now Litke’s third premonition is also coming true: His work with silicon microstrip detectors and slices of retinal tissue is leading to developments in neurobiology that are starting to help people with certain kinds of damage to their vision to see.

“The starting point and the motivation was fundamental physics,” says Litke, who splits his time between University of California, Santa Cruz, and CERN. “But once you have this wonderful technology, you can think about applying it to many other fields.”

Silicon microstrip detectors use a thin slab of silicon, implanted with an array of diode strips, to detect charged particles. As a particle passes through the silicon, a localized current is generated. This current can be detected on the nearby strips and measured with high spatial resolution and accuracy.

Litke and collaborators with expertise in, and inspiration from, the development of silicon microstrip detectors, fabricated two-dimensional arrays of microscopic electrodes to study the complex circuitry of the retina. In the experiments, a slice of retinal tissue is placed on top of one of the arrays. Then a movie—a variety of visual stimuli including flashing checkerboards and moving bars—is focused on the input neurons of the retina, and the electrical signals generated by hundreds of the retina’s output neurons are simultaneously recorded. This electrical activity is what would normally be sent as signals to the brain and translated into visual perceptions.

This process allowed Litke and his collaborators to help decipher the retina’s coded messages to the brain and to create a functional connectivity map of the retina, showing the strengths of connections between the input and output neurons. That in itself was important to neurobiology, but Litke wanted to take this research further, to not just record neural activity but also to stimulate it. Litke and his team designed a system in which they stimulate retinal and brain tissue with precise electrical signals and study the kinds of signals the tissue produces in response.

Such observations have led to an outpouring of new neurobiology and biomedical applications, including studies for the design of a retinal prosthesis, a device that can restore sight. In a disease like retinitis pigmentosa or age-related macular degeneration, the eye’s output system to the brain is fine, but the input system has degraded.

In one version of a retinal prosthesis, a patient could wear a small video camera—something similar to Google Glass. A small computer would process the collected images and generate a pattern of electrical signals that would, in turn, stimulate the retina’s output neurons. In this way, the pattern of electrical signals that a naturally functioning eye would create could be replicated. The studies with the stimulation/recording system are being carried out in collaboration with neurobiologist E. J. Chichilnisky (Salk Institute and Stanford University) and physicist Pawel Hottowy (AGH University of Science and Technology, Krakow). The interdisciplinary and international character of the research highlights its origins in high energy physics.

In another approach, the degraded input neurons—the neurons that convert light into electrical signals—are functionally replaced by a two-dimensional array of silicon photodiodes. Daniel Palanker, an associate professor at Stanford University, has been using Litke’s arrays, in collaboration with Alexander Sher, an assistant professor at UCSC, who completed his postdoctoral work with Litke, to study how a prosthesis of this type would interact with a retina. Palanker and Sher are also researching retinal plasticity and have discovered that, in patients whose eyes have been treated with lasers, which can cause scar tissue, healthy cells sometimes migrate into an area where cells have died.

“I’m not sure we would be able to get this kind of information without these arrays,” Palanker says. “We use them all the time. It’s absolutely brilliant technology.”

Litke’s physics-inspired technology is continuing to play a role in the development of neurobiology. In 2013, President Obama announced the BRAIN—Brain Research through Advancing Innovative Neurotechnologies—Initiative, with the aim of mapping the entire neural circuitry of the human brain. A Nature Methods paper laying out the initiative’s scientific priorities noted that “advances in the last decade have made it possible to measure neural activities in large ensembles of neurons,” citing Litke’s arrays.

“The technology has enabled initial studies that now have contributed to this BRAIN Initiative,” Litke says. “That comes from the Higgs boson. That’s an amazing chain.”

### NOvA experiment sees first long-distance neutrinos

Friday, February 14th, 2014

Fermilab released this press release on Feb. 11, 2014.

Workers at the NOvA hall in northern Minnesota assemble the final block of the far detector in early February 2014, with the nearly completed detector in the background. Each block of the detector measures about 50 feet by 50 feet by 6 feet and is made up of 384 plastic PVC modules, assembled flat on a massive pivoting machine. Photo courtesy of NOvA collaboration

Scientists on the world’s longest-distance neutrino experiment announced today that they have seen their first neutrinos.

The NOvA experiment consists of two huge particle detectors placed 500 miles apart, and its job is to explore the properties of an intense beam of ghostly particles called neutrinos. Neutrinos are abundant in nature, but they very rarely interact with other matter. Studying them could yield crucial information about the early moments of the universe.

“NOvA represents a new generation of neutrino experiments,” said Fermilab Director Nigel Lockyer. “We are proud to reach this important milestone on our way to learning more about these fundamental particles.”

Scientists generate a beam of the particles for the NOvA experiment using one of the world’s largest accelerators, located at the Department of Energy’s Fermi National Accelerator Laboratory near Chicago. They aim this beam in the direction of the two particle detectors, one near the source at Fermilab and the other in Ash River, Minn., near the Canadian border. The detector in Ash River is operated by the University of Minnesota under a cooperative agreement with the Department of Energy’s Office of Science.

Billions of those particles are sent through the earth every two seconds, aimed at the massive detectors. Once the experiment is fully operational, scientists will catch a precious few each day.

Neutrinos are curious particles. They come in three types, called flavors, and change between them as they travel. The two detectors of the NOvA experiment are placed so far apart to give the neutrinos time to oscillate from one flavor to another while traveling at nearly the speed of light. Although only a fraction of the experiment’s larger detector, called the far detector, is fully built, filled with scintillator and wired with electronics at this point, the experiment has already used it to record signals from its first neutrinos.
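
To a first approximation, that flavor change follows the textbook two-flavor oscillation formula $$P \approx \sin^2(2\theta)\,\sin^2(1.27\,\Delta m^2 L/E)$$. The sketch below uses illustrative parameter values (NOvA's real measurement is a full three-flavor fit):

```python
# Two-flavour oscillation sketch with textbook-style parameter values; the
# constants below are illustrative assumptions, not NOvA's fitted numbers.
import math

def p_nu_mu_to_nu_e(l_km, e_gev, dm2_ev2=2.4e-3, sin2_2theta=0.09):
    """Appearance probability; dm2 in eV^2, baseline L in km, energy E in GeV."""
    return sin2_2theta * math.sin(1.27 * dm2_ev2 * l_km / e_gev) ** 2

# The far detector sits roughly 810 km (about 500 miles) from Fermilab,
# and the beam energy peaks near 2 GeV.
print(f"P(nu_mu -> nu_e) ~ {p_nu_mu_to_nu_e(810.0, 2.0):.3f}")
```

The baseline and energy are chosen so the oscillation has time to develop, which is exactly why the two detectors are placed so far apart.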

“That the first neutrinos have been detected even before the NOvA far detector installation is complete is a real tribute to everyone involved. That includes the staff at Fermilab, Ash River Lab and the University of Minnesota module facility, the NOvA scientists, and all of the professionals and students building this detector,” said University of Minnesota physicist Marvin Marshak, Ash River Laboratory director. “This early result suggests that the NOvA collaboration will make important contributions to our knowledge of these particles in the not so distant future.”

Once completed, NOvA’s near and far detectors will weigh 300 and 14,000 tons, respectively. Crews will put into place the last module of the far detector early this spring and will finish outfitting both detectors with electronics in the summer.

“The first neutrinos mean we’re on our way,” said Harvard physicist Gary Feldman, who has been a co-leader of the experiment from the beginning. “We started meeting more than 10 years ago to discuss how to design this experiment, so we are eager to get under way.”

The NOvA collaboration is made up of 208 scientists from 38 institutions in the United States, Brazil, the Czech Republic, Greece, India, Russia and the United Kingdom. The experiment receives funding from the U.S. Department of Energy, the National Science Foundation and other funding agencies.

The NOvA experiment is scheduled to run for six years. Because neutrinos interact with matter so rarely, scientists expect to catch just about 5,000 neutrinos or antineutrinos during that time. Scientists can study the timing, direction and energy of the particles that interact in their detectors to determine whether they came from Fermilab or elsewhere.

Fermilab creates a beam of neutrinos by smashing protons into a graphite target, which releases a variety of particles. Scientists use magnets to steer the charged particles that emerge from the energy of the collision into a beam. Some of those particles decay into neutrinos, and the scientists filter the non-neutrinos from the beam.

Fermilab started sending a beam of neutrinos through the detectors in September, after 16 months of work by about 300 people to upgrade the lab’s accelerator complex.

“It is great to see the first neutrinos from the upgraded complex,” said Fermilab physicist Paul Derwent, who led the accelerator upgrade project. “It is the culmination of a lot of hard work to get the program up and running again.”

Different types of neutrinos have different masses, but scientists do not know how these masses compare to one another. A goal of the NOvA experiment is to determine the order of the neutrino masses, known as the mass hierarchy, which will help scientists narrow their list of possible theories about how neutrinos work.

“Seeing neutrinos in the first modules of the detector in Minnesota is a major milestone,” said Fermilab physicist Rick Tesarek, deputy project leader for NOvA. “Now we can start doing physics.”

Note: NOvA stands for NuMI Off-Axis Electron Neutrino Appearance. NuMI is itself an acronym, standing for Neutrinos from the Main Injector, Fermilab’s flagship accelerator.

### Oh brave new world, which has such physicists in it!

Monday, February 10th, 2014

In August I moved away from CERN, and I’ve been back and forth between CERN and Brussels quite a lot since then. In fact, right now I’m sitting in Building 40, where people go to drink coffee and have meetings, and I can see the ATLAS Higgs convener sitting at the next table. All this leaves me feeling a little detached from what is really happening at CERN, as if it’s not “my” lab anymore, and that actually sums up how many people think about particle physics at the moment. With LHC Run I we found the Higgs boson. It was what most people expected to see, and by a large margin it was the most probable thing we would have discovered. Things will be different for Run II. Nobody has a good idea about what to expect in terms of new particles (and if they say they do have a good idea, they’re lying). In that sense it’s not “our” dataset, it’s whatever nature decides it should be. All we can do is say what is possible, not what is probable. (Although we can probably say one scenario is more probable than another.)

The problem we now face is that there is no longer an obvious piece that’s missing, but there are still many unanswered questions, which means we have to move from an era of a well-constrained search to an era of phenomenology, or looking for new effects in the data. That’s not a transition I’m entirely comfortable with, for several reasons. It’s often said that nature is not spiteful, but it is subtle and indifferent to our expectations. There’s no reason to think that there “should” be new physics for us to discover as we increase the energy of the LHC, and we could be unlucky enough to not find anything new in the Run II dataset. A phenomenological search also means that we’d be overly sensitive to statistical bumps and dips in the data. Every time there’s a new peak that we don’t expect we have to exercise caution and skepticism, almost to the point where it stops being fun. Suppose we find an excess in a dijet spectrum. We may conclude that this is due to a new particle, but if we’re going to be phenomenologists about it we must remain open-minded, so we can’t necessarily expect to see the same particle in a dimuon final state. It would then be prudent to ask if such a peak comes from a poorly understood effect, such as jet energy scales, and those kinds of effects can be hard to untangle if we don’t have a good control sample in data. At least with the discovery of the Higgs boson, the top quark, and the W and Z bosons we knew what final states to expect and what ratios they should exhibit. There’s also something a little unsettling about not having a roadmap of what to expect. When asked to pick between several alternative scenarios that are neither favoured by evidence nor disfavoured by lack of evidence it’s hard to decide what to prioritise.
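
To see why that caution is warranted, here is a toy bump hunt (assumed bin counts and a naive per-bin significance, nothing detector-specific). Even in background-only spectra, scanning many mass bins makes a local "three sigma" excess fairly likely somewhere — the look-elsewhere effect:

```python
# Toy look-elsewhere demonstration: generate pure-background spectra and ask
# how often at least one bin fluctuates above a naive 3-sigma threshold.
import numpy as np

rng = np.random.default_rng(42)
n_bins, mu_bkg, n_spectra = 100, 100.0, 1000       # toy flat-background spectra

counts = rng.poisson(mu_bkg, size=(n_spectra, n_bins))
local_sig = (counts - mu_bkg) / np.sqrt(mu_bkg)    # naive per-bin significance
frac = np.mean((local_sig > 3.0).any(axis=1))      # any bin fluctuating high?

print(f"background-only spectra with a local >3 sigma 'bump': {frac:.0%}")
```

With 100 bins, something like one spectrum in eight shows a local three-sigma bump from statistics alone, which is why no one should get excited about an unexpected peak until it survives far stricter scrutiny.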

Take your pick of new physics! Each scenario will have new phase space to explore in LHC Run II [CMS]

On the other hand there is reason to be excited. Since we don’t know what to expect in LHC Run II, anything we do discover will change our views considerably, and will lead to a paradigm shift. If we do discover a new particle, or even better, a new sector of particles, it could help frame the Standard Model as a subset of something more elegant and unified. If that’s the case then we can look forward to decades of intense and exciting research that would make the Higgs discovery look like small potatoes. So the next few years at the LHC could be either the most boring or the most exciting time in the history of particle physics, and we won’t know until we look at the data. Will nature tantalise us with hints of something novel, will it give us irrefutable evidence of a new resonance, or will it leave us with nothing new at all? For my part I’m taking on the dilepton final states. These are quick, clean, simple, and versatile signatures of something new that are not tied down to a specific model. That’s the best search I can perform in an environment of such uncertainty and with a lack of coherent direction. Let’s hope it pays off, and paves the way for even more discoveries.

What’s happening at 325 GeV at CDF? Only more data can tell us. Based on what the LHC has seen, this is probably a statistical fluctuation. (CDF)