Posts Tagged ‘Brookhaven National Laboratory’

This story first appeared as a press release on Interactions.org, issued by Brookhaven National Laboratory, the Institute of High Energy Physics, and Lawrence Berkeley National Laboratory. For the full version and contact information, go here.

The Daya Bay Reactor Neutrino Experiment has begun its quest to answer some of the most puzzling questions about the elusive elementary particles known as neutrinos. The experiment’s first completed set of twin detectors is now recording interactions of antineutrinos (antipartners of neutrinos) as they travel away from the powerful reactors of the China Guangdong Nuclear Power Group in southern China.

Neutrinos are uncharged particles produced in nuclear reactions, such as in the sun, by cosmic rays, and in nuclear power plants. They come in three types or “flavors” — electron, muon, and tau neutrinos — that morph, or oscillate, from one form to another, interacting hardly at all as they travel through space and matter, including people, buildings, and planets like Earth.
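The numbers behind this morphing are easy to play with. In the standard two-flavor approximation, the probability that a neutrino of energy E changes flavor after travelling a distance L is P = sin²(2θ) · sin²(1.27 Δm²[eV²] L[km] / E[GeV]). A minimal Python sketch, where the mixing angle and mass splitting below are illustrative placeholders rather than measured Daya Bay values:

```python
import math

def oscillation_probability(theta, delta_m2_ev2, distance_km, energy_gev):
    """Two-flavor appearance probability:
    P = sin^2(2*theta) * sin^2(1.27 * dm^2 [eV^2] * L [km] / E [GeV])."""
    return (math.sin(2 * theta) ** 2
            * math.sin(1.27 * delta_m2_ev2 * distance_km / energy_gev) ** 2)

# Illustrative numbers only: a ~4 MeV reactor antineutrino observed
# about 2 km from the reactor core.
p = oscillation_probability(theta=0.15, delta_m2_ev2=2.5e-3,
                            distance_km=2.0, energy_gev=0.004)
print(f"appearance probability ≈ {p:.3f}")
```

For a reactor antineutrino of a few MeV, a baseline of roughly two kilometers puts the second sine near its maximum, which is broadly the geometry a far detector hall exploits; the placeholder parameters here are chosen only to make that point.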

The start-up of the Daya Bay experiment marks the first step in the international effort of the Daya Bay Collaboration to measure a crucial quantity related to the third type of oscillation, in which the electron-flavored neutrinos morph into the other two flavored neutrinos.

This story appeared in Fermilab Today July 29.
PHENIX, one of two major experiments located at the Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Laboratory, is upgrading again with help from Fermilab’s Silicon Detector Facility (SiDet). Fermilab technicians finished assembling hundreds of forward silicon vertex tracker (FVTX) detector components in early July.

One of the hundreds of forward silicon vertex tracker (FVTX) components assembled at Fermilab's Silicon Detector Facility. Photo: Vassili Papavassiliou, New Mexico State University

The wedge-shaped components will be installed in PHENIX to help scientists study the properties of quark gluon plasma (QGP), which theorists believe made up the universe moments after the Big Bang.

Eric Mannel, a physicist from Columbia University and one of about 450 PHENIX contributors, worked as an electronics project engineer overseeing the final stages of assembly at Fermilab.

“We want to understand how the universe evolved the way it did from the very beginning,” Mannel said. “The FVTX detector will provide a higher resolution for tracking of particles which will allow us to study the properties of QGP.”

QGP is a near-perfect liquid composed of disassociated quarks and gluons suspended in plasma. It is said to be nearly perfect because it contains almost no internal friction—if you were to stir the plasma, it would continue to swirl forever. Physicists create QGP by smashing heavy ions and protons together. SiDet personnel provided technical capabilities unique to Fermilab to construct detectors that will allow physicists to study those collisions in more detail than ever before.

“We anticipate that we’ll be able to reconstruct secondary vertices from the decay of charm and beauty quarks with a resolution of 70 microns. The typical decay lengths for those particles are several hundred microns in heavy-ion collisions at RHIC,” Mannel said. The average human hair is about 100 microns thick.

The SiDet team completed the microassembly of FVTX components in mid-July. From left to right: Tammy Hawke, Michelle Jonas, Nina Ronzhina, Bert Gonzalez and Mike Herron. Hogan Nguyen, also part of the group, is not pictured, nor are the PHENIX collaborators in the FVTX group: Eric Mannel, Vassili Papavassiliou, Elaine Tennant, AAron Veicht and Dave Winter. Photo: Reidar Hahn.

AAron Veicht, a Ph.D. student at Columbia University, spent nearly 10 months working with the technicians at SiDet and will be part of the team installing the detector in PHENIX this fall.

“I’ll get to see the project from the very early stages all the way through to analyzing the data, so it’s very exciting,” Veicht said. “I gained a lot of experience while working with the technicians at Fermilab. It was a vital part of my education.”

Bert Gonzalez was the Fermilab technical supervisor on the design project. “The process went quite well, as this was the first endeavor where we worked with program collaborators,” Gonzalez said. Gonzalez and his Fermilab team spoke with PHENIX collaborators via conference calls for most of the design and development of the components.

“It was a good run,” Gonzalez said. “The project will be missed at SiDet, because it was a concrete job; you could dig your hands into it.”

Veicht felt that the people at SiDet were helpful and knowledgeable.

“It was my first time at Fermilab, and it was absolutely fantastic,” Veicht said.

PHENIX detector. Photo: Brookhaven National Laboratory

PHENIX collaborators plan to commission the detector in October and begin data collection this January.

– Ashley WennersHerron

Related information:

*PHENIX website

*RHIClets: A collection of Java applet games about the RHIC collider and RHIC physics.

*PHENIX cartoons


The following guest post is from Kostas Nikolopoulos, a postdoctoral researcher at Brookhaven National Laboratory. Nikolopoulos, who is analyzing data from the Large Hadron Collider at CERN, received his Ph.D. in experimental high-energy physics from the University of Athens in 2010.

Last Wednesday, I travelled three hours by train from Geneva, Switzerland to Grenoble, France to spend a week at the International Europhysics Conference on High Energy Physics. Here, I’m presenting some of the latest findings in the search for the Higgs boson at the Large Hadron Collider’s ATLAS detector, and joining the overarching conversation about the elusive particle.

Dave Mosher (left) and Kendra Snyder inside the STAR detector at Brookhaven's Relativistic Heavy Ion Collider, minutes after getting engaged. (photo courtesy of Dave Mosher/davemosher.com)

When Kendra Snyder, a science writer in Brookhaven Lab’s Media & Communications Office, entered the STAR detector at the Relativistic Heavy Ion Collider (RHIC) last Friday afternoon to view some unusual crystalline deposits — supposedly formed in the beam pipe — she got an even bigger surprise: a diamond ring and a marriage proposal from fellow science writer Dave Mosher.

The unusual proposal, dubbed “The Nerdiest Marriage Proposal . . . Ever” on Mosher’s blog, triggered the interest of a reporter at The Daily, News Corporation’s new iPad-only daily newspaper. Snyder and Mosher were interviewed last night, and the story — which includes references to RHIC’s near-light-speed gold ion collisions — appears in today’s edition.

Even if you don’t have an iPad, you can view the video, “Geek Love,” here.

— Karen McNulty Walsh, Brookhaven Media & Communications


Researchers at the ALPHA experiment at CERN made major news today with the announcement that they’ve trapped antimatter atoms for 1,000 seconds. That’s more than 16 minutes and 5,000 times longer than their last published record of two tenths of a second.

The ALPHA magnet being wound at Brookhaven

The new feat will allow scientists to study the properties of antimatter in detail, which could help them understand why the universe is made only of matter even though the Big Bang should have created equal amounts of matter and antimatter.

These studies have been made possible, in part, by a bottle-like, antimatter-catching device called a minimum magnetic field trap. At the heart of the trap is an octupole (eight-magnetic-pole) magnet that was fabricated at Brookhaven Lab in 2006.

Several special features of the coil design and a unique machine used to wind it contributed to the success of this magnet. For example, the magnet generates a very pure octupole field, which keeps the antimatter away from the walls of the trap, preventing the antiatoms from annihilating.

Antiprotons and positrons are brought into the ALPHA trap from opposite ends and held there by electric and magnetic fields. Brought together, they form antiatoms neutral in charge but with a magnetic moment. If their energy is low enough they can be held by the octupole and mirror fields of the Minimum Magnetic Field Trap.

To figure out how many antiprotons were in the trap, the scientists “quench,” or abruptly switch off, the superconducting magnet, releasing the antimatter. The antiatoms’ subsequent annihilation into particles called pions is recorded by a three-layer silicon vertex detector similar to those used in high-energy experiments like Fermilab’s Tevatron and the Large Hadron Collider.

But the pions must travel through the magnets of the trap before reaching the silicon. To prevent the particles from scattering multiple times during their journey to the detector, Brookhaven physicists and engineers had to figure out how to limit the amount of material used in the magnet. A specially developed 3D winding machine allowed the researchers to build the magnet directly onto the outside of the ALPHA vacuum chamber. The result is a magnet that looks far different from the bulky, steel-surrounded instrumentation in most particle colliders. In fact, only the superconducting cables are metal.

–Kendra Snyder, BNL Media & Communications


The last time I wrote a post was in February, ages ago in this fast-paced world. Since then, a lot of things have happened. I just flew back from the April Meeting of the American Physical Society in California, where I presented our most recent result from the Sloan Digital Sky Survey-III: the creation of the largest-ever 3D map of the distant universe. Thanks to the power sockets kindly provided by United on this particular flight, I can actually spend my time doing something useful while in this metal tube — attempt to write a marginally interesting post.

Anyway, my recent results were about the Lyman-alpha forest, a completely new technique that my colleagues and I got to work for the first time. It means a lot to me, mostly because there was a great deal of skepticism in the community on whether this technique would ever work, and given that I spent the past two years in the trenches trying to make it happen, I’m very happy. You can read more about it here. Fun fact: the media attention surrounding this announcement caused my name to appear on Fox News, a somewhat bizarre occurrence for a European-style liberal like me.

What I want to tell you more about today is dark energy, which is a problem that is relevant both for my Lyman-alpha work as well as the Large Synoptic Survey Telescope (LSST) science that I discussed in this post. To put it mildly: dark energy is one big embarrassment for modern physics. So, what is it?

A rendering of the Large Synoptic Survey Telescope, a proposed 8.4-meter ground-based telescope that will survey the entire visible sky deeply in multiple colors every week from a mountaintop in Chile. (Image credit: LSST Corporation/NOAO)

First, one important clarification: dark energy is not dark matter. Dark matter is a substance that is omnipresent in our universe and essentially behaves as a cold, invisible dust that collapses under its own gravity. The observational evidence for dark matter is overwhelming, and there are also many good theoretical ideas about what dark matter might be. Physicists have embarked on a long program to establish a “theory of everything,” or at least a “theory of many things.” We have unified electrodynamics and the weak interactions of particles — and the strong force can be self-consistently added — but how we combine these three forces with gravity is still an unsolved problem. There are many proposals on how to do it: supersymmetry, string theories, loop quantum gravity, etc. The beauty of all these proposals is that, in addition to having observational signatures at the Large Hadron Collider (LHC), they naturally explain dark matter. Most of these theories have at least one stable, weakly-interacting particle that could act as dark matter. The picture hasn’t quite clicked together yet, but this will indeed happen in the next couple of years, as more results come from the LHC.

Dark energy, on the other hand, doesn’t have such beautiful connections to fundamental physics. Nobody has the slightest idea of what it could be and how it could fit into the bigger picture. So what do we know?

In the late 1990s, observations of the dimming of distant supernovae showed that the universe is undergoing a phase of accelerated expansion. In other words, the universe is expanding faster and faster. This is very counterintuitive: if you throw a ball upwards, it keeps slowing down until it reaches its maximum height and then it falls down. The universe does something similar: After the initial kick, which we call the Big Bang, the expansion of the universe went slower and slower. But, some 7 billion years ago, the universe started to accelerate. It’s like throwing a ball in the air and watching it do what it’s supposed to do for a while before it suddenly changes its mind and zooms off to the skies!
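The ball analogy can be made a bit more quantitative. In a flat universe containing only matter and a cosmological constant, the acceleration equation reduces to ä/a ∝ ΩΛ − Ωm/(2a³): expansion decelerates while matter dominates and starts accelerating once the scale factor a exceeds (Ωm/2ΩΛ)^(1/3). A minimal sketch in Python, assuming the commonly quoted Ωm ≈ 0.3 and ΩΛ ≈ 0.7 (round illustrative values, not results from the surveys discussed here):

```python
# Toy flat-universe model with matter density OMEGA_M and cosmological
# constant OMEGA_L. The acceleration equation gives
#     a_dd / a  ∝  OMEGA_L - OMEGA_M / (2 * a**3),
# so the sign of this expression tells us whether the expansion is
# speeding up or slowing down at scale factor a (a = 1 today).
OMEGA_M, OMEGA_L = 0.3, 0.7

def acceleration_sign(a):
    """Sign of the cosmic acceleration at scale factor a."""
    return OMEGA_L - OMEGA_M / (2 * a ** 3)

# Deceleration ends where the expression crosses zero:
a_flip = (OMEGA_M / (2 * OMEGA_L)) ** (1 / 3)
print(f"expansion starts accelerating near a ≈ {a_flip:.2f} (a = 1 today)")
print("decelerating at a = 0.3:", acceleration_sign(0.3) < 0)
print("accelerating  at a = 1.0:", acceleration_sign(1.0) > 0)
```

With these round numbers the turnover happens near a ≈ 0.6, when the universe was roughly 60 percent of its present linear size, which is broadly consistent with the switch from deceleration to acceleration several billion years ago described above.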

A simulated 15-second LSST exposure from one of the charge-coupled devices in the focal plane. (Image Credit: LSST simulations team)

After the initial discovery of dark energy through distant supernovae, the evidence grew stronger and stronger, and now we see it in many different measurements of the universe; the Lyman-alpha forest measurements that I mentioned earlier, as well as future measurements from the Dark Energy Survey and LSST, will all constrain the behavior of dark energy further. The amazing thing is that we can describe this accelerated expansion of the universe by putting an extra term in the equations that describe the evolution of the universe — the so-called cosmological constant. At the moment, all observations are consistent with adding this one simple number to our equations. But this number has nothing to do with the physics that we know; it is of the wrong order of magnitude and shouldn’t be there to start with. So we all measure like wackos and hope that we will detect some small deviation away from this simple solution. This would indicate that dark energy is more complicated, somewhat dynamical, and would thus give us a handle on understanding it. But it might turn out that it is just that — a cosmological constant with no connection to anything else. In the latter case we are stuck for the foreseeable future hoping that someone will eventually be lucky enough to make an observation or theoretical insight that will bring everything together.

-Anže


The following news release from the Sloan Digital Sky Survey-III (SDSS-III) collaboration was first posted on Brookhaven’s website.

Scientists from the Sloan Digital Sky Survey III (SDSS-III) have created the largest ever three-dimensional map of the distant universe by using the light of the brightest objects in the cosmos to illuminate ghostly clouds of intergalactic hydrogen. The map provides an unprecedented view of what the universe looked like 11 billion years ago.

A slice through the three-dimensional map of the universe. SDSS-III scientists are looking out from the Milky Way, at the bottom tip of the wedge. Distances are labeled on the right in billions of light-years. The black dots going out to about 7 billion light years are nearby galaxies. The red cross-hatched region could not be observed with the SDSS telescope, but the future BigBOSS survey, the proposed successor to BOSS, could observe it. The colored region shows the map of intergalactic hydrogen gas in the distant universe. Red areas have more gas; blue areas have less gas.

The new findings were presented on May 1 at a meeting of the American Physical Society by Anže Slosar, a physicist at the U.S. Department of Energy’s Brookhaven National Laboratory, and described in an article posted online on the arXiv astrophysics preprint server.

The new technique used by Slosar and his colleagues turns the standard approach of astronomy on its head. “Usually we make our maps of the universe by looking at galaxies, which emit light,” Slosar explained. “But here, we are looking at intergalactic hydrogen gas, which blocks light. It’s like looking at the moon through clouds — you can see the shapes of the clouds by the moonlight that they block.”

Instead of the moon, the SDSS team observed quasars, brilliantly luminous beacons powered by giant black holes. Quasars are bright enough to be seen billions of light years from Earth, but at these distances they look like tiny, faint points of light. As light from a quasar travels on its long journey to Earth, it passes through clouds of intergalactic hydrogen gas that absorb light at specific wavelengths, which depend on the distances to the clouds. This patchy absorption imprints an irregular pattern on the quasar light known as the “Lyman-alpha forest.”
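The wavelength bookkeeping behind the forest is simple: each cloud absorbs at the rest-frame Lyman-alpha wavelength of 121.6 nm, and that absorption reaches us redshifted by the cloud’s factor of (1 + z), so every dip in the quasar spectrum tags a distance along the line of sight. A toy illustration in Python (the redshifts are arbitrary choices, not BOSS data):

```python
# Rest-frame Lyman-alpha wavelength of hydrogen, in nanometers.
LYMAN_ALPHA_NM = 121.6

def observed_wavelength(z_cloud):
    """Observed wavelength of Lyman-alpha absorption by a hydrogen
    cloud at redshift z_cloud."""
    return LYMAN_ALPHA_NM * (1 + z_cloud)

# Clouds along the line of sight to a z = 3 quasar, each leaving its
# absorption dip at a different observed wavelength:
for z in (2.0, 2.5, 3.0):
    print(f"cloud at z = {z}: absorbs at {observed_wavelength(z):.1f} nm")
```

Because a more distant cloud has a larger redshift, its absorption line lands at a longer observed wavelength; reading the dips from blue to red is reading the gas from near to far.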

An observation of a single quasar gives a map of the hydrogen in the direction of the quasar, Slosar explained. The key to making a full, three-dimensional map is numbers. “When we use moonlight to look at clouds in the atmosphere, we only have one moon. But if we had 14,000 moons all over the sky, we could look at the light blocked by clouds in front of all of them, much like what we can see during the day. You don’t just get many small pictures — you get the big picture.”

The big picture shown in Slosar’s map contains important clues to the history of the universe. The map shows a time 11 billion years ago, when the first galaxies were just starting to come together under the force of gravity to form the first large clusters. As the galaxies moved, the intergalactic hydrogen moved with them. Andreu Font-Ribera, a graduate student at the Institute of Space Sciences in Barcelona, created computer models of how the gas likely moved as those clusters formed. The results of his computer models matched well with the map. “That tells us that we really do understand what we’re measuring,” Font-Ribera said. “With that information, we can compare the universe then to the universe now, and learn how things have changed.”

A zoomed-in view of the map slice shown in the previous image. Red areas have more gas; blue areas have less gas. The black scalebar in the bottom right measures one billion light years. Image credit: A. Slosar and the SDSS-III collaboration.

The quasar observations come from the Baryon Oscillation Spectroscopic Survey (BOSS), the largest of the four surveys that make up SDSS-III. Eric Aubourg, from the University of Paris, led a team of French astronomers who visually inspected every one of the 14,000 quasars individually. “The final analysis is done by computers,” Aubourg said, “but when it comes to spotting problems and finding surprises, there are still things a human can do that a computer can’t.”

“BOSS is the first time anyone has used the Lyman-alpha forest to measure the three-dimensional structure of the universe,” said David Schlegel, a physicist at Lawrence Berkeley National Laboratory in California and the principal investigator of BOSS. “With any new technique, people are nervous about whether you can really pull it off, but now we’ve shown that we can.” In addition to BOSS, Schlegel noted, the new mapping technique can be applied to future, still more ambitious surveys, like its proposed successor BigBOSS.

When BOSS observations are completed in 2014, astronomers can make a map ten times larger than the one being released today, according to Patrick McDonald of Lawrence Berkeley National Laboratory and Brookhaven National Laboratory, who pioneered techniques for measuring the universe with the Lyman-alpha forest and helped design the BOSS quasar survey. BOSS’s ultimate goal is to use subtle features in maps like Slosar’s to study how the expansion of the universe has changed during its history. “By the time BOSS ends, we will be able to measure how fast the universe was expanding 11 billion years ago with an accuracy of a couple of percent. Considering that no one has ever measured the cosmic expansion rate so far back in time, that’s a pretty astonishing prospect.”

Quasar expert Patrick Petitjean of the Institut d’Astrophysique de Paris, a key member of Aubourg’s quasar-inspecting team, is looking forward to the continuing flood of BOSS data. “Fourteen thousand quasars down, one hundred and forty thousand to go,” he said. “If BOSS finds them, we’ll be happy to look at them all, one by one. With that much data, we’re bound to find things that we never expected.”


Pumping Up Proton Polarization

Thursday, April 7th, 2011

Brookhaven Lab’s oldest and most-trophied workhorse, the Alternating Gradient Synchrotron (AGS), has broken its own world record for producing intense beams of polarized protons – particles that “spin” in the same direction.

Spin, a quantum property that describes a particle’s intrinsic angular momentum, is used in a wide range of fields, from astronomy to medical imaging. But where spin comes from is still unknown.

In this picture of a proton-proton collision, the spin of the particles is shown as arrows circling the spherical particles. The red and green particles represent reaction products from the collision that are "seen" and analyzed by RHIC detectors.

To explore the mystery of spin, Brookhaven’s Relativistic Heavy Ion Collider (RHIC) smashes beams of polarized protons at close to the speed of light. RHIC is the only machine in the world with this capability. But before reaching RHIC’s high-speed collision course, the protons travel about one million miles through a series of linear and circular accelerators, including the AGS, a 41-year-old circular accelerator more than a half mile around. Home to three of BNL’s seven Nobel Prize-winning discoveries, the AGS is Brookhaven’s longest-running accelerator.

Now, with a new upgrade, the AGS can keep up to 75 percent of those particles in the beam polarized while they accelerate – a 5 to 8 percent increase over the previous record. This feat was accomplished with custom-built power supplies created from old inventory and two revamped 1960s quadrupole magnets pulled from storage.

The two refurbished quadrupole magnets before being installed at the AGS

As the particles race through the AGS, two of the customized power supplies quickly pulse, hold, and pull back surges of power for each of the quadrupoles in a matter of milliseconds. Forty-two times within half a second, these pulsed currents produce magnetic kicks that keep the particles spinning in the correct direction.

For more details, see this story.

-Kendra Snyder, BNL Media & Communications


Going in Circles

Wednesday, February 23rd, 2011

RHIC's main control room

If you were expecting this blog entry to be about the great song by the Friends of Distinction, I’m afraid it isn’t. Instead, I’m taking you on the first trip this year of RHIC’s Yellow beam, one of two oppositely circulating particle beams that collide in the center of RHIC’s detectors. Why? Because the main goal of my first shift of Run11 was to get our particles to circulate in a closed orbit in the Yellow beamline. This will also give me the opportunity to get back to some of the technical terms I used in my previous post. So allow me to go over a little bit of accelerator physics theory for a minute.

RHIC, which stands for Relativistic Heavy Ion Collider, is the name given to the large circular collider at Brookhaven National Laboratory; in order for particles to actually reach the collider from their source, they need to travel through a series of other accelerators:

  1. the LINAC for protons, the Tandem-to-Booster beamline for heavy ions
  2. the Booster synchrotron
  3. the Booster-to-AGS (BtA) line
  4. the Alternating Gradient Synchrotron (AGS)
  5. the AGS-to-RHIC (AtR) line


Hello again,

One of the commenters on our very first post wanted to hear more about the Large Synoptic Survey Telescope (LSST), one of the three cosmological projects that involve Brookhaven Lab. Set high on a mountaintop in Chile, LSST will be a very big and expensive ground-based telescope. Planning for the project started near the end of the 20th century and the experiment probably won’t start taking data in a scientific manner until 2020.

Artist rendering of LSST on Cerro Pachon, Chile. (Image Credit: Michael Mullen Design, LSST Corporation)

The story goes that, at a decadal survey 10 years ago, the person who first proposed using the word “synoptic” in the project’s name misunderstood what synoptic really means. Either way, the name has stuck. Synoptic, by the way, comes from the Greek word “synopsis” and refers to looking at something from all possible aspects, which is precisely what LSST will do.

Astronomical survey instruments fall broadly under two categories: imaging instruments that take photos of the sky, and spectroscopic instruments that take spectra (that is, distribution of light across wavelengths) of a selected few objects in the sky. LSST falls into the first category — it will take many, many images of the sky in five bands, which are a bit like colors, ranging from ultraviolet to infrared light.
