Posts Tagged ‘Brookhaven National Laboratory’

This Fermilab press release came out on Oct. 20, 2014.

ESnet to build high-speed extension for faster data exchange between United States and Europe. Image: ESnet

Scientists across the United States will soon have access to new, ultra-high-speed network links spanning the Atlantic Ocean thanks to a project currently under way to extend ESnet (the U.S. Department of Energy’s Energy Sciences Network) to Amsterdam, Geneva and London. Although the project is designed to benefit data-intensive science throughout the U.S. national laboratory complex, heaviest users of the new links will be particle physicists conducting research at the Large Hadron Collider (LHC), the world’s largest and most powerful particle collider. The high capacity of this new connection will provide U.S. scientists with enhanced access to data at the LHC and other European-based experiments by accelerating the exchange of data sets between institutions in the United States and computing facilities in Europe.

DOE’s Brookhaven National Laboratory and Fermi National Accelerator Laboratory—the primary computing centers for U.S. collaborators on the LHC’s ATLAS and CMS experiments, respectively—will make immediate use of the new network infrastructure once it is rigorously tested and commissioned. Because ESnet, based at DOE’s Lawrence Berkeley National Laboratory, interconnects all national laboratories and a number of university-based projects in the United States, tens of thousands of researchers from all disciplines will benefit as well.

The ESnet extension will be in place before the LHC at CERN in Switzerland—currently shut down for maintenance and upgrades—is up and running again in the spring of 2015. Because the accelerator will be colliding protons at much higher energy, the data output from the detectors will expand considerably—to approximately 40 petabytes of raw data per year compared with 20 petabytes for all of the previous lower-energy collisions produced over the three years of the LHC first run between 2010 and 2012.
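To get a feel for what 40 petabytes per year means for a network, here is a rough back-of-the-envelope conversion. Only the 40-petabyte figure comes from the release; everything else is plain unit arithmetic, and real transfers are bursty rather than a steady stream.

```python
# Back-of-the-envelope: the sustained average rate implied by 40 PB/year.
PB = 1e15  # bytes, using decimal petabytes
raw_per_year_bytes = 40 * PB
seconds_per_year = 365.25 * 24 * 3600

avg_bytes_per_s = raw_per_year_bytes / seconds_per_year
avg_gbit_per_s = avg_bytes_per_s * 8 / 1e9  # convert bytes/s to gigabits/s

print(f"average rate: {avg_bytes_per_s / 1e9:.2f} GB/s "
      f"(~{avg_gbit_per_s:.0f} Gbit/s sustained)")
# → average rate: 1.27 GB/s (~10 Gbit/s sustained)
```

In other words, just the raw detector output averages out to roughly a fully loaded 10-gigabit link, before counting derived data sets, replication between centers, and every other program sharing the network.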

The cross-Atlantic connectivity during the first successful run for the LHC experiments, which culminated in the discovery of the Higgs boson, was provided by the US LHCNet network, managed by the California Institute of Technology. In recent years, major research and education networks around the world—including ESnet, Internet2, California’s CENIC, and European networks such as DANTE, SURFnet and NORDUnet—have increased their backbone capacity by a factor of 10, using sophisticated new optical networking and digital signal processing technologies. Until recently, however, higher-speed links were not deployed for production purposes across the Atlantic Ocean—creating a network “impedance mismatch” that can harm large, intercontinental data flows.

An evolving data model
This upgrade coincides with a shift in the data model for LHC science. Previously, data moved in a more predictable and hierarchical pattern strongly influenced by geographical proximity, but network upgrades around the world have now made it possible for data to be fetched and exchanged more flexibly and dynamically. This change enables faster science outcomes and more efficient use of storage and computational power, but it requires networks around the world to perform flawlessly together.

“Having the new infrastructure in place will meet the increased need for dealing with LHC data and provide more agile access to that data in a much more dynamic fashion than LHC collaborators have had in the past,” said physicist Michael Ernst of DOE’s Brookhaven National Laboratory, a key member of the team laying out the new and more flexible framework for exchanging data between the Worldwide LHC Computing Grid centers.

Ernst directs a computing facility at Brookhaven Lab that was originally set up as a central hub for U.S. collaborators on the LHC’s ATLAS experiment. A similar facility at Fermi National Accelerator Laboratory has played this role for the LHC’s U.S. collaborators on the CMS experiment. These computing resources, dubbed Tier 1 centers, have direct links to the LHC at the European laboratory CERN (Tier 0).  The experts who run them will continue to serve scientists under the new structure. But instead of serving as hubs for data storage and distribution only among U.S.-based collaborators at Tier 2 and 3 research centers, the dedicated facilities at Brookhaven and Fermilab will be able to serve data needs of the entire ATLAS and CMS collaborations throughout the world. And likewise, U.S. Tier 2 and Tier 3 research centers will have higher-speed access to Tier 1 and Tier 2 centers in Europe.

“This new infrastructure will offer LHC researchers at laboratories and universities around the world faster access to important data,” said Fermilab’s Lothar Bauerdick, head of software and computing for the U.S. CMS group. “As the LHC experiments continue to produce exciting results, this important upgrade will let collaborators see and analyze those results better than ever before.”

Ernst added, “As centralized hubs for handling LHC data, our reliability, performance and expertise have been in demand by the whole collaboration, and now we will be better able to serve the scientists’ needs.”

An investment in science
ESnet is funded by DOE’s Office of Science to meet networking needs of DOE labs and science projects. The transatlantic extension represents a financial collaboration, with partial support coming from DOE’s Office of High Energy Physics (HEP) for the next three years. Although LHC scientists will get a dedicated portion of the new network once it is in place, all science programs that make use of ESnet will now have access to faster network links for their data transfers.

“We are eagerly awaiting the start of commissioning for the new infrastructure,” said Oliver Gutsche, Fermilab scientist and member of the CMS Offline and Computing Management Board. “After the Higgs discovery, the next big LHC milestones will come in 2015, and this network will be indispensable for the success of the LHC Run 2 physics program.”

This work was supported by the DOE Office of Science.
Fermilab is America’s premier national laboratory for particle physics and accelerator research. A U.S. Department of Energy Office of Science laboratory, Fermilab is located near Chicago, Illinois, and operated under contract by the Fermi Research Alliance, LLC. Visit Fermilab’s website at www.fnal.gov and follow us on Twitter at @FermilabToday.

Brookhaven National Laboratory is supported by the Office of Science of the U.S. Department of Energy.  The Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.  For more information, please visit science.energy.gov.

One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by the Research Foundation for the State University of New York on behalf of Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit applied science and technology organization.

Visit Brookhaven Lab’s electronic newsroom for links, news archives, graphics, and more at http://www.bnl.gov/newsroom, follow Brookhaven Lab on Twitter, http://twitter.com/BrookhavenLab, or find us on Facebook, http://www.facebook.com/BrookhavenLab/.

Media contacts:

  • Karen McNulty-Walsh, Brookhaven Media and Communications Office, kmcnulty@bnl.gov, 631-344-8350
  • Kurt Riesselmann, Fermilab Office of Communication, media@fnal.gov, 630-840-3351
  • Jon Bashor, Computing Sciences Communications Manager, Lawrence Berkeley National Laboratory, jbashor@lbnl.gov, 510-486-5849

Computing contacts:

  • Lothar Bauerdick, Fermilab, US CMS software computing, bauerdick@fnal.gov, 630-840-6804
  • Oliver Gutsche, Fermilab, CMS Offline and Computing Management Board, gutsche@fnal.gov, 630-840-8909

This article appeared in symmetry on May 1, 2014.

Scientists stay inspired in their sometimes tedious task of inspecting photographs taken in the Dark Energy Survey’s ambitious cataloging of one-eighth of the sky. Image courtesy of Dark Energy Survey

Physicists working on the Dark Energy Survey can expect to pull many an all-nighter. The international collaboration of more than 120 scientists aims to take about 100,000 photographs peering deep into the night sky. Scientists must personally review many of these photos to make sure the experiment is working well, and they’ve come up with ways to stay motivated while doing so.

DES scientists collected almost 14,000 photographs from August 2013 to February 2014, in the first of five seasons they plan to operate their sophisticated Dark Energy Camera. Even for those of us who aren’t trying to take the most detailed survey of the universe, it might not come as a surprise that complications can occur during operation. For example, the telescope may not always sync up with the natural movement of the night sky, and passing airplanes can create trails in the images. Software bugs can also cause issues.

Two of the DES researchers, Erin Sheldon of Brookhaven National Laboratory and Peter Melchior of The Ohio State University, created the DES Exposure Checker, an online gallery of images from the telescope. Team members use the photo repository as a way to spot imperfections and other issues with the images so they can fix problems as quickly as possible.

“These problems are easier for an actual person to see rather than some automated program,” Sheldon says. “And then we can create an inventory to help diagnose troubles that may occur with future images.”

When reviewing photos, DES scientists flag the ones that show symptoms of different problems, such as long streaks from satellites; unwanted reflections, called ghosts; or marks left by cosmic rays. But the process can get overwhelming with thousands of photos to look over. So the DES researchers decided to add a positive classification to the mix—an “Awesome!” category. When someone sees an incredible photo, they can mark it as such in the database.

Sheldon points out one of his favorite images, one that captured a passing comet. “It was just so serendipitous. We couldn’t find that if we pointed the telescope in the same place at any other time,” he says.

Steve Kent, Fermilab scientist and head of the experimental astrophysics group, says one of his favorite images from the survey shows a dying star. In the color photo, a bright blue oxygen haze surrounds the hot remnant of what was formerly a giant red star.

A second way to motivate team members to classify images is the leaderboard posted on the DES Exposure Checker website, which honors the individuals who have categorized the most photos. Researchers compete to see their names at the top.
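The workflow the article describes — reviewers voting exposures into problem categories (plus the "Awesome!" bin) while a leaderboard tallies who has classified the most — can be sketched in a few lines. The category names, field names, and IDs below are invented for illustration; they are not the Exposure Checker's actual schema.

```python
from collections import Counter

# Invented category names loosely following the article; not the real schema.
CATEGORIES = {"satellite_streak", "ghost", "cosmic_ray", "awesome", "ok"}

flags = {}               # exposure_id -> Counter of category votes
leaderboard = Counter()  # reviewer -> number of photos categorized

def flag(exposure_id, category, reviewer):
    """Record one reviewer's classification of one exposure."""
    if category not in CATEGORIES:
        raise ValueError(f"unknown category: {category!r}")
    flags.setdefault(exposure_id, Counter())[category] += 1
    leaderboard[reviewer] += 1

# Hypothetical reviews of two made-up exposures:
flag(101, "ghost", "erin")
flag(101, "ghost", "peter")
flag(102, "awesome", "erin")

print(flags[101].most_common(1))  # most-voted category for exposure 101
print(leaderboard.most_common())  # reviewers ranked by photos categorized
```

Keeping per-exposure vote counts rather than a single label is the point of the inventory: agreement between reviewers makes a flag more trustworthy when diagnosing problems in future images.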

But more than friendly competition drives the DES team to categorize images. They’re also seeking answers to questions about the past and future of our universe such as: Has the density of dark energy changed over time? Why is the expansion of the universe speeding up?

“For me, it’s a mystery,” Sheldon says. “I have this question, and I have to find out the answer.”

Amanda Solliday

Heat: Adventures in the World’s Fiery Places (Little, Brown, 2013). If you haven’t already fallen in love with the groundbreaking science taking place at RHIC, this book about all things hot is sure to ignite your passion.

Bill Streever, a biologist and best-selling author of Cold: Adventures in the World’s Frozen Places, has just published his second scientific survey, which takes place at the opposite end of the temperature spectrum. Heat: Adventures in the World’s Fiery Places features flames, firewalking, and notably, a journey into the heart of the Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Laboratory.

I accompanied Streever for a full-day visit in July 2011 with physicist Barbara Jacak of Stony Brook University, then spokesperson of the PHENIX Collaboration at RHIC. The intrepid reporter (who’d already tagged along with woodland firefighters and walked across newly formed, still-hot volcanic lava—among other adventures described in the book) met with RHIC physicists at STAR and PHENIX, descended into the accelerator tunnel, and toured the refrigeration system that keeps RHIC’s magnets supercold. He also interviewed staff at the RHIC/ATLAS Computing Facility—who face the challenge of dissipating unwanted heat while accumulating and processing reams of RHIC data—as well as theorists and even climate scientists, all in a quest for understanding the ultrawarm.

The result is an enormously engaging, entertaining, and informative portrayal of heat in a wide range of settings, including the 7-trillion-degree “perfect” liquid quark-gluon plasma created at RHIC, and physicists’ pursuit of new knowledge about the fundamental forces and interactions of matter. But Streever’s book does more: It presents the compelling story of creating and measuring the world’s hottest temperature within the broader context of the Lab’s history, including its role as an induction center during both World Wars, and the breadth and depth of our current research—from atoms to energy and climate research, and even the Long Island Solar Farm.

“Brookhaven has become an IQ magnet, where smart people congregate to work on things that excite geniuses,” he writes.

Streever’s own passion for science comes across clearly throughout the book. But being at “the top of the thermometer” (the title of his final chapter, dedicated in part to describing RHIC) has its privileges. RHIC’s innermost beam pipes—at the hearts of its detectors, inside which head-on ion collisions create the highest temperature ever measured in a laboratory—have clearly left an impression:

“… I am forever enthralled by Brookhaven’s pipes. At the top of the thermometer, beyond any temperature that I could possibly imagine, those pipes explore conditions near the beginning of the universe … In my day-to-day life, bundled in a thick coat or standing before my woodstove or moving along a snow-covered trail, I find myself thinking of those pipes. And when I think of them, I remember that at the top of the thermometer lies matter with the audacity to behave as though it were absolutely cold, flowing like a perfect liquid…”

There’s more, a wonderful bit more that conveys the pure essence of science. But I don’t want to spoil it. Please read and share this book. The final word is awe.

The book is available for purchase through major online retailers and in stores.

-Karen McNulty Walsh, BNL Media & Communications Office

Theoretical physicist Raju Venugopalan

We sat down with Brookhaven theoretical physicist Raju Venugopalan for a conversation about “color glass condensate” and the structure of visible matter in the universe.

Q. We’ve heard a lot recently about a “new form of matter” possibly seen at the Large Hadron Collider (LHC) in Europe — a state of saturated gluons called “color glass condensate.” Brookhaven Lab, and you in particular, have a long history with this idea. Can you tell me a bit about that history?

A. The idea for the color glass condensate arose to help us understand heavy ion collisions at our own collider here at Brookhaven, the Relativistic Heavy Ion Collider (RHIC)—even before RHIC turned on in 2000, and long before the LHC was built. These machines are designed to look at the most fundamental constituents of matter and the forces through which they interact—the same kinds of studies that a century ago led to huge advances in our understanding of electrons and magnetism. Only now instead of studying the behavior of the electrons that surround atomic nuclei, we are probing the subatomic particles that make up the nuclei themselves, and studying how they interact via nature’s strongest force to “give shape” to the universe today.

We do that by colliding nuclei at very high energies to recreate the conditions of the early universe so we can study these particles and their interactions under the most extreme conditions. But when you collide two nuclei and produce matter at RHIC, and also at the LHC, you have to think about the matter that makes up the nuclei you are colliding. What is the structure of nuclei before they collide?

We all know the nuclei are made of protons and neutrons, and those are each made of quarks and gluons. There were hints in data from the HERA collider in Germany and other experiments that the number of gluons increases dramatically as you accelerate particles to high energy. Nuclear physics theorists predicted that the ions accelerated to near the speed of light at RHIC (and later at LHC) would reach an upper limit of gluon concentration—a state of gluon saturation we call color glass condensate. The collision of these super-dense gluon force fields is what produces the matter at RHIC, so learning more about this state would help us understand how the matter is created in the collisions. The theory we developed to describe the color glass condensate also allowed us to make calculations and predictions we could test with experiments.

The art of data mining is about searching for the extraordinary within a vast ocean of regularity. This can be a painful process in any field, but especially in particle physics, where the amount of data can be enormous, and ‘extraordinary’ means a new understanding about the fundamental underpinnings of our universe. Now, a tool first conceived in 2005 to manage data from the world’s largest particle accelerator may soon push the boundaries of other disciplines. When repurposed, it could bring the immense power of data mining to a variety of fields, effectively cracking open the possibility for more discoveries to be pulled up from ever-increasing mountains of scientific data.

Advanced data management tools offer scientists a way to cut through the noise by analyzing information across a vast network. The result is a searchable pool that software can sift through and use for a specific purpose. One such hunt was for the Higgs boson, the last remaining elementary particle of the Standard Model that, in theory, endows other particles with mass.

With the help of a system called PanDA, or Production and Distributed Analysis, researchers at CERN’s Large Hadron Collider (LHC) in Geneva, Switzerland discovered such a particle by slamming protons together at relativistic speeds hundreds of millions of times per second. The data produced from those trillions of collisions—roughly 13 million gigabytes’ worth of raw information—was processed by the PanDA system across a worldwide network and made available to thousands of scientists around the globe. From there, they were able to pinpoint a previously unknown boson with a mass between 125 and 127 GeV, consistent with the long-sought Higgs.
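PanDA's real brokerage is far more sophisticated, but the core idea — a central queue of jobs dispatched to whichever distributed site has spare capacity — can be caricatured in a few lines. The site names and slot counts below are made up, and this is a cartoon of the scheduling idea, not PanDA's actual architecture or API.

```python
from collections import deque

sites = {"BNL": 3, "CERN": 2, "FNAL": 2}   # free processing slots (made up)
queue = deque(f"job-{i}" for i in range(10))
assignments = {site: [] for site in sites}

while queue:
    # Greedy brokering: send the next job to the site with the most free slots.
    site = max(sites, key=sites.get)
    if sites[site] == 0:
        break  # every site is saturated; remaining jobs wait in the queue
    sites[site] -= 1
    assignments[site].append(queue.popleft())

print(assignments)          # which jobs ran where
print(len(queue), "queued")  # jobs still waiting for a free slot
```

With 7 total slots, 7 of the 10 jobs are dispatched and 3 wait — the essential behavior of a production system that keeps thousands of worldwide compute slots busy from one shared queue.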

An ATLAS event with two muons and two electrons - a candidate for a Higgs-like decay. The two muons are picked out as long blue tracks, the two electrons as short blue tracks matching green clusters of energy in the calorimeters. ATLAS Experiment © 2012 CERN.

The sheer amount of data arises from the fact that each particle collision carries unique signatures that compete for attention with the millions of other collisions happening nanoseconds later. These must be recorded, processed, and analyzed as distinct events in a steady stream of information.

Today’s public seminar at CERN, where the ATLAS and CMS collaborations presented the preliminary results of their searches for the Standard Model (SM) Higgs boson with the full dataset collected during 2011, is a landmark for high-energy physics!

The Higgs boson is a still-hypothetical particle postulated in the mid-1960s to complete what is considered the SM of particle interactions. Its role within the SM is to provide other particles with mass. Specifically, the mass of elementary particles is the result of their interaction with the Higgs field. The Higgs boson’s properties are defined in the SM, apart from its mass, which is a free parameter of the theory.

This post, originally published on 11/18/11, was written by Kétévi Adiklè Assamagan, a staff physicist at Brookhaven National Laboratory and the ATLAS contact person for the ATLAS-CMS combined Higgs analysis.

Today we witnessed a landmark LHC first: At the HCP conference in Paris, friendly rivals, the ATLAS and CMS collaborations, came together to present a joint result! This ATLAS-CMS combined Higgs search was motivated by the fact that pooling the dataset increases our chances of excluding or finding the Higgs boson over those of a single experiment. This is the first example of this kind of scientific collaboration at the LHC, and the success of the whole endeavor hinged on a whole host of thorny issues being tackled…

Discussions about combining our Higgs search results with CMS’s first started over a year ago, but before we could proceed with any kind of combined analysis, we had first to jointly outline how on earth we were going to go about doing it. This was no small undertaking; although we’re looking for the same physics, the ATLAS and CMS detectors are very different beasts materially, and use completely independent software to define and identify particles. How can we be certain that what passes for an electron in ATLAS would also be picked out as such in CMS?

On May 26, 2005, a new supercomputer, a pioneering giant of its time, was unveiled at Brookhaven National Laboratory at a dedication ceremony attended by physicists from around the world. That supercomputer was called QCDOC, for quantum chromodynamics (QCD) on a chip, capable of handling the complex calculations of QCD, the theory that describes the nature and interactions of the basic building blocks of the universe. Now, after a career of state-of-the-art physics calculations, QCDOC has been retired — and will soon be replaced by a new “next generation” machine.

“There it is — the world’s most beautiful physics experiment,” says physicist Chris Polly from a metal footbridge that crosses over the 14-meter blue steel ring of Brookhaven National Laboratory’s muon g-2 experiment, now being disassembled. A haze of dust hangs in the air above Polly and a handful of other physicists and engineers who’ve gathered together to help resurrect the $20-million machine by transporting it hundreds of miles to Fermi National Accelerator Laboratory in Illinois.

This story first appeared on Brookhaven’s website.

They come from the midst of exploding stars beyond our solar system — and possibly, from the nuclei of far distant galaxies. Their name, “galactic cosmic rays,” sounds like something from a science fiction movie. They’re not really rays.

Galactic cosmic rays (GCR) is the term used to describe a wide variety of charged particles traveling through space at high energies and almost the speed of light, from subatomic particles like electrons and positrons to the nuclei of every element on the periodic table. Since they’re created at energies sufficient to propel them on long journeys through space, GCRs are a form of ionizing radiation, or streaming particles and light waves with enough oomph to knock electrons out of their orbits, creating newly charged, unstable atoms in most of the matter they traverse.
