Brookhaven | Long Island, NY | USA


Joining forces in the search for the Higgs

Monday, November 21st, 2011

This post, originally published on 11/18/11 here, was written by Kétévi Adiklè Assamagan, a staff physicist at Brookhaven National Laboratory and the ATLAS contact person for the ATLAS-CMS combined Higgs analysis.

Today we witnessed a landmark LHC first: at the HCP conference in Paris, the friendly rivals ATLAS and CMS came together to present a joint result! This ATLAS-CMS combined Higgs search was motivated by a simple fact: pooling the two datasets increases our chances of excluding or finding the Higgs boson beyond what either experiment can manage alone. This is the first example of this kind of scientific collaboration at the LHC, and the success of the whole endeavor hinged on a whole host of thorny issues being tackled…

Discussions about combining our Higgs search results with CMS’s first started over a year ago, but before we could proceed with any kind of combined analysis, we had first to jointly outline how on earth we were going to go about doing it. This was no small undertaking; although we’re looking for the same physics, the ATLAS and CMS detectors are very different beasts materially, and use completely independent software to define and identify particles. How can we be certain that what passes for an electron in ATLAS would also be picked out as such in CMS? (more…)


Pioneering Supercomputer QCDOC Retires, Regenerates in ‘Next-Generation’ QCDCQ

Monday, October 31st, 2011

On May 26, 2005, a new supercomputer, a pioneering giant of its time, was unveiled at Brookhaven National Laboratory at a dedication ceremony attended by physicists from around the world. That supercomputer was called QCDOC, for quantum chromodynamics (QCD) on a chip, and it was built to handle the complex calculations of QCD, the theory that describes the nature and interactions of the basic building blocks of the universe. Now, after a career of state-of-the-art physics calculations, QCDOC has been retired — and will soon be replaced by a new “next generation” machine. (more…)


Physics Phoenix: Plotting the Journey of Muon g – 2

Tuesday, October 4th, 2011

“There it is — the world’s most beautiful physics experiment,” says physicist Chris Polly from a metal footbridge that crosses over the 14-meter blue steel ring of Brookhaven National Laboratory’s muon g – 2 experiment, now being disassembled. A haze of dust hangs in the air above Polly and a handful of other physicists and engineers who’ve gathered together to help resurrect the $20-million machine by transporting it hundreds of miles to Fermi National Accelerator Laboratory in Illinois. (more…)


Promoting Diversity ― on the Atomic Level

Wednesday, August 24th, 2011

This story first appeared on Brookhaven’s website.

They come from the midst of exploding stars beyond our solar system — and possibly, from the nuclei of far distant galaxies. Their name, “galactic cosmic rays,” sounds like something from a science fiction movie. They’re not really rays.

Galactic cosmic rays (GCRs) is the term used to describe a wide variety of charged particles traveling through space at high energies, close to the speed of light, from subatomic particles like electrons and positrons to the nuclei of every element on the periodic table. Since they’re created with enough energy to propel them on long journeys through space, GCRs are a form of ionizing radiation: streaming particles and light waves with enough oomph to knock electrons out of their orbits, creating newly charged, unstable atoms in most of the matter they traverse. (more…)


The Daya Bay Reactor Neutrino Experiment Begins Taking Data

Monday, August 15th, 2011

This story first appeared as a press release on Interactions.org, issued by Brookhaven National Laboratory, the Institute of High Energy Physics, and Lawrence Berkeley National Laboratory. For the full version and contact information, go here.

The Daya Bay Reactor Neutrino Experiment has begun its quest to answer some of the most puzzling questions about the elusive elementary particles known as neutrinos. The experiment’s first completed set of twin detectors is now recording interactions of antineutrinos (antipartners of neutrinos) as they travel away from the powerful reactors of the China Guangdong Nuclear Power Group in southern China.

Neutrinos are uncharged particles produced in nuclear reactions, such as in the sun, by cosmic rays, and in nuclear power plants. They come in three types or “flavors” — electron, muon, and tau neutrinos — that morph, or oscillate, from one form to another, interacting hardly at all as they travel through space and matter, including people, buildings, and planets like Earth.

The start-up of the Daya Bay experiment marks the first step in the international effort of the Daya Bay Collaboration to measure a crucial quantity related to the third type of oscillation, in which the electron-flavored neutrinos morph into the other two flavored neutrinos. (more…)
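
For readers who like equations, the measurement sketched above is often written in a simplified two-flavor approximation (a standard textbook form, not the collaboration’s full three-flavor analysis): the probability that a reactor antineutrino of energy E still looks like an electron antineutrino after traveling a distance L is

```latex
P(\bar{\nu}_e \to \bar{\nu}_e) \;\approx\; 1 - \sin^2 2\theta_{13}\,
\sin^2\!\left( \frac{1.267\,\Delta m^2_{31}\,[\mathrm{eV}^2]\; L\,[\mathrm{m}]}{E\,[\mathrm{MeV}]} \right)
```

Comparing event rates in the near and far detectors cancels much of the uncertainty in the reactor flux, so a deficit at the far site directly probes the mixing angle θ13.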


Quest for Understanding the Perfect Liquid Continues

Thursday, July 28th, 2011

This story first appeared on Brookhaven Lab’s homepage.

Over the past few years, scientists have seen an exciting convergence of three distinct lines of research on different kinds of extreme quantum matter. Two of these involve quantum fluids that can be studied in the laboratory: ultracold quantum gases and the quark-gluon plasma produced at Brookhaven’s Relativistic Heavy Ion Collider (RHIC). Even though these two quantum fluids exist at vastly different energy scales — from near absolute zero to four trillion degrees — their physical properties are remarkably similar. The third line of research is based on the discovery of a new theoretical tool, derived from string theory, for investigating the properties of extreme quantum matter — namely holographic dualities, a mathematical relationship between quantum mechanical systems in our world and black holes that theoretically exist in a higher dimensional space. (more…)


Narrowing in on the Higgs Boson at EPS

Monday, July 25th, 2011

The following guest post is from Kostas Nikolopoulos, a postdoctoral researcher at Brookhaven National Laboratory. Nikolopoulos, who is analyzing data from the Large Hadron Collider at CERN, received his Ph.D. in experimental high-energy physics from the University of Athens in 2010.

Last Wednesday, I travelled three hours by train from Geneva, Switzerland to Grenoble, France to spend a week at the International Europhysics Conference on High Energy Physics. Here, I’m presenting some of the latest findings in the search for the Higgs boson at the Large Hadron Collider’s ATLAS detector, and joining the overarching conversation about the elusive particle. (more…)

Hearts collide — in a good way — at RHIC

Tuesday, July 19th, 2011

Dave Mosher (left) and Kendra Snyder inside the STAR detector at Brookhaven's Relativistic Heavy Ion Collider, minutes after getting engaged. (photo courtesy of Dave Mosher/davemosher.com)

When Kendra Snyder, a science writer in Brookhaven Lab’s Media & Communications Office, entered the STAR detector at the Relativistic Heavy Ion Collider (RHIC) last Friday afternoon to view some unusual crystalline deposits — supposedly formed in the beam pipe — she got an even bigger surprise: a diamond ring and a marriage proposal from fellow science writer Dave Mosher.

The unusual proposal, dubbed “The Nerdiest Marriage Proposal . . . Ever” on Mosher’s blog, triggered the interest of a reporter at The Daily, News Corporation’s new iPad-only daily newspaper. Snyder and Mosher were interviewed last night, and the story — which includes references to RHIC’s near-light-speed gold ion collisions — appears in today’s edition.

Even if you don’t have an iPad, you can view the video, “Geek Love,” here.

— Karen McNulty Walsh, Brookhaven Media & Communications


Trapping Antimatter with Magnets

Monday, June 6th, 2011

Researchers at the ALPHA experiment at CERN made major news today with the announcement that they’ve trapped antimatter atoms for 1,000 seconds. That’s more than 16 minutes and 5,000 times longer than their last published record of two tenths of a second.

The ALPHA magnet being wound at Brookhaven

The new feat will allow scientists to study the properties of antimatter in detail, which could help them understand why the universe is made only of matter even though the Big Bang should have created equal amounts of matter and antimatter.

These studies have been made possible, in part, by a bottle-like, antimatter-catching device called a minimum magnetic field trap. At the heart of the trap is an octupole (eight-magnetic-pole) magnet that was fabricated at Brookhaven Lab in 2006.

Several special features of the coil design and a unique machine used to wind it contributed to the success of this magnet. For example, the magnet generates a very pure octupole field, which keeps the antimatter away from the walls of the trap, preventing it from annihilating.

Antiprotons and positrons are brought into the ALPHA trap from opposite ends and held there by electric and magnetic fields. Brought together, they form anti-atoms, neutral in charge but with a magnetic moment. If their energy is low enough, they can be held by the octupole and mirror fields of the minimum magnetic field trap.
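
To put “low enough” in perspective, here is a back-of-the-envelope estimate (mine, not from the post): an anti-hydrogen atom’s magnetic moment is roughly one Bohr magneton μ_B, so the depth of a magnetic well of size ΔB corresponds to a temperature of

```latex
\frac{\Delta U}{k_B} \;=\; \frac{\mu_B\,\Delta B}{k_B} \;\approx\; 0.67\ \mathrm{K\ per\ tesla.}
```

For field wells on the order of a tesla, only anti-atoms with sub-kelvin energies stay confined, which is why trapping them at all is such a feat.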

To figure out how many antiprotons were in the trap, the scientists “quench,” or abruptly switch off, the superconducting magnet, releasing the antimatter. The anti-atom’s subsequent annihilation into particles called pions is recorded by a three-layer silicon vertex detector similar to those used in high-energy experiments like Fermilab’s Tevatron and the Large Hadron Collider.

But the pions must travel through the magnets of the trap before reaching the silicon. To prevent the particles from scattering multiple times during their journey to the detector, Brookhaven physicists and engineers had to figure out how to limit the amount of material used in the magnet. A specially developed 3D winding machine allowed the researchers to build the magnet directly onto the outside of the ALPHA vacuum chamber. The result is a magnet that looks far different from the bulky, steel-surrounded instrumentation in most particle colliders. In fact, only the superconducting cables are metal.

–Kendra Snyder, BNL Media & Communications


Searching the Skies for Dark Energy

Thursday, May 5th, 2011

The last time I wrote a post was in February, ages ago in this fast-paced world. Since then, a lot of things have happened. I just flew back from the April Meeting of the American Physical Society in California, where I presented our most recent result from the Sloan Digital Sky Survey-III: the creation of the largest-ever 3D map of the distant universe. Thanks to the power sockets kindly provided by United on this particular flight, I can actually spend my time doing something useful while in this metal tube — attempt to write a marginally interesting post.

Anyway, my recent results were about the Lyman-alpha forest, a completely new technique that my colleagues and I got to work for the first time. It means a lot to me, mostly because there was a great deal of skepticism in the community about whether this technique would ever work, and given that I spent the past two years in the trenches trying to make it happen, I’m very happy. You can read more about it here. Fun fact: the media attention surrounding this announcement caused my name to appear on Fox News, a somewhat bizarre occurrence for a European-style liberal like me.

What I want to tell you more about today is dark energy, a problem relevant both to my Lyman-alpha work and to the Large Synoptic Survey Telescope (LSST) science that I discussed in this post. To put it mildly: dark energy is one big embarrassment for modern physics. So, what is it?

A rendering of the Large Synoptic Survey Telescope, a proposed 8.4-meter ground-based telescope that will survey the entire visible sky deeply in multiple colors every week from a mountaintop in Chile. (Image credit: LSST Corporation/NOAO)

First, one important clarification: dark energy is not dark matter. Dark matter is a substance that is omnipresent in our universe and essentially behaves as a cold, invisible dust that collapses under its own gravity. The observational evidence for dark matter is overwhelming, and there are also many good theoretical ideas about what dark matter might be. Physicists have embarked on a long program to establish a “theory of everything,” or at least a “theory of many things.” We have unified electrodynamics and the weak interactions of particles — and the strong force can be self-consistently added — but how to combine these three forces with gravity is still an unsolved problem. There are many proposals for how to do it: supersymmetry, string theories, loop quantum gravity, etc. The beauty of all these proposals is that, in addition to having observational signatures at the Large Hadron Collider (LHC), they naturally explain dark matter. Most of these theories have at least one stable, weakly interacting particle that could act as dark matter. The picture hasn’t quite clicked together yet, but this will indeed happen in the next couple of years, as more results come from the LHC.

Dark energy, on the other hand, doesn’t have such beautiful connections to fundamental physics. Nobody has the slightest idea of what it could be or how it could fit into the bigger picture. So what do we know?

In the late 1990s, observations of the dimming of distant supernovae showed that the universe is undergoing a phase of accelerated expansion. In other words, the universe is expanding faster and faster. This is very counterintuitive: if you throw a ball upwards, it keeps slowing down until it reaches its maximum height and then it falls down. The universe does something similar: After the initial kick, which we call the Big Bang, the expansion of the universe went slower and slower. But, some 7 billion years ago, the universe started to accelerate. It’s like throwing a ball in the air and watching it do what it’s supposed to do for a while before it suddenly changes its mind and zooms off to the skies!
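
The ball analogy can be made quantitative with a short sketch (my own illustration, not from the post; the density values Ωm = 0.3 and ΩΛ = 0.7 are assumed fiducial numbers, and the function names are mine). In a flat universe with matter and a cosmological constant, the deceleration parameter q(z) is positive (expansion slowing) when matter dominates at high redshift and flips negative (expansion accelerating) once the constant term wins:

```python
# Flat Lambda-CDM toy model:
#   E(z)^2 = Omega_m (1+z)^3 + Omega_L
#   q(z)   = [Omega_m (1+z)^3 / 2 - Omega_L] / E(z)^2
# q > 0 means the expansion is slowing; q < 0 means it is speeding up.

OMEGA_M = 0.3  # assumed fiducial matter density
OMEGA_L = 0.7  # assumed fiducial dark-energy density

def deceleration(z: float) -> float:
    """Deceleration parameter q at redshift z."""
    matter = OMEGA_M * (1.0 + z) ** 3
    e2 = matter + OMEGA_L
    return (0.5 * matter - OMEGA_L) / e2

# Long ago (high redshift) the "ball" was slowing down...
assert deceleration(2.0) > 0
# ...but today it is speeding up.
assert deceleration(0.0) < 0

# Redshift where q = 0, i.e. when the acceleration kicked in:
z_transition = (2.0 * OMEGA_L / OMEGA_M) ** (1.0 / 3.0) - 1.0
print(f"acceleration began around z ~ {z_transition:.2f}")
```

With these fiducial densities the sign flip lands at a redshift of roughly 0.7, corresponding to the "some 7 billion years ago" quoted above.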

A simulated 15-second LSST exposure from one of the charge-coupled devices in the focal plane. (Image Credit: LSST simulations team)

After the initial discovery of dark energy through distant supernovae, the evidence grew stronger and stronger, and we now see it in many different measurements of the universe. Upcoming data, from the Lyman-alpha forest measurements that I mentioned earlier as well as from the Dark Energy Survey and LSST, will constrain the behavior of dark energy even further. The amazing thing is that we can describe this accelerated expansion by putting an extra term in the equations that describe the evolution of the universe — the so-called cosmological constant. At the moment, all observations are consistent with adding this one simple number to our equations. But this number has nothing to do with the physics that we know; it is of the wrong order of magnitude and shouldn’t be there to start with. So we all measure like wackos and hope that we will detect some small deviation away from this simple solution. That would indicate that dark energy is more complicated, somewhat dynamical, and thus give us a handle on understanding it. But it might turn out that it is just that — a cosmological constant with no connection to anything else. In the latter case, we are stuck for the foreseeable future, hoping that someone will eventually be lucky enough to make an observation or theoretical insight that brings everything together.
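
For the curious, that “one simple number” enters through the Friedmann equation (standard cosmology notation; my addition, not the author’s). For a universe with matter density ρ, spatial curvature k, and cosmological constant Λ,

```latex
\left( \frac{\dot{a}}{a} \right)^2 = \frac{8\pi G}{3}\,\rho \;-\; \frac{k c^2}{a^2} \;+\; \frac{\Lambda c^2}{3}.
```

Since ρ dilutes as the scale factor a grows (as a⁻³ for matter) while the Λ term stays fixed, the constant eventually dominates and drives accelerated expansion. A measured equation of state w ≠ −1 for dark energy would be exactly the kind of “small deviation” hoped for above.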