Posts Tagged ‘heavy ion physics’

This post was written by Brookhaven Lab scientists Shigeki Misawa and Ofer Rind.

Run 13 at the Relativistic Heavy Ion Collider (RHIC) began one month ago today, and the first particles collided in the STAR and PHENIX detectors nearly two weeks ago. As of late this past Saturday evening, preparations are complete and polarized protons are colliding, with the machine and detectors operating in “physics mode,” which means gigabytes of data are pouring into the RHIC & ATLAS Computing Facility (RACF) every few seconds.

Today, we store data and provide the computing power for about 2,500 RHIC scientists here at Brookhaven Lab and institutions around the world. Approximately 30 people work at the RACF, which is located about one mile south of RHIC and connected to both the Physics and Information Technology Division buildings on site. There are four main parts to the RACF: computers that crunch the data, online storage containing data ready for further analysis, tape storage containing archived data from collisions past, and the network glue that holds it all together. Computing resources at the RACF are split about equally between the RHIC collaborations and the ATLAS experiment running at the Large Hadron Collider in Europe.

Shigeki Misawa (left) and Ofer Rind at the RHIC & ATLAS Computing Facility (RACF) at Brookhaven Lab

Where Does the Data Come From?

For RHIC, the data comes from heavy ions or polarized protons that smash into each other inside PHENIX and STAR. These detectors catch the subatomic particles that emerge from the collisions to capture information—particle species, trajectories, momenta, etc.—in the form of electrical signals. Most signals aren’t relevant to what physicists are looking for, so only the signals that trip predetermined triggers are recorded. For example, with the main focus for Run 13 being the proton’s “missing” spin, physicists are particularly interested in finding decay electrons from particles called W bosons, because these can be used as probes to quantify spin contributions from a proton’s antiquarks and different “flavors” of quarks.
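
The trigger logic described above can be sketched in a few lines. Everything here is invented for illustration — the event structure, the signal labels, and the 30 GeV threshold are not the actual PHENIX/STAR trigger menu:

```python
# Sketch of triggering: record only events whose signals pass
# predetermined criteria. All names and thresholds are hypothetical.
def passes_w_trigger(event, min_electron_energy=30.0):
    """Keep an event if it contains a high-energy electron candidate,
    as one might when hunting for W-boson decay electrons."""
    return any(sig["type"] == "electron" and sig["energy"] >= min_electron_energy
               for sig in event["signals"])

events = [
    {"id": 1, "signals": [{"type": "electron", "energy": 42.0}]},  # kept
    {"id": 2, "signals": [{"type": "pion", "energy": 12.0}]},      # dropped
    {"id": 3, "signals": [{"type": "electron", "energy": 5.0}]},   # dropped
]
recorded = [e for e in events if passes_w_trigger(e)]
print([e["id"] for e in recorded])  # [1]
```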

Computers in the “counting houses” at STAR and PHENIX package the raw data collected from selected electrical signals and send it all to the RACF via dedicated fiber-optic cables. The RACF then archives the data and makes it available to experimenters running analysis jobs on any of our 20,000 computing cores.

Recent Upgrades at the RACF

Polarized protons are far smaller than heavy ions, so they produce considerably less data when they collide, but even so, when we talk about data at the RACF, we’re talking about a lot of data. During Run 12 last year, we began using a new tape library to increase storage capacity by 25 percent for a total of 40 petabytes—the equivalent of 655,360 of the largest iPhones available today. We also more than doubled our ability to archive data for STAR last year (in order to meet the needs of a data acquisition upgrade) so we can now sustain 700 megabytes of incoming data every second for both PHENIX and STAR. Part of this is due to new fiber-optic cables connecting the counting houses to the RACF, which provide both increased data rates and redundancy.
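
A quick back-of-the-envelope check of those numbers (assuming a 64 GB iPhone, the largest model available at the time, and binary units throughout):

```python
# Verify the "655,360 iPhones" equivalence for a 40 PB tape library.
PB = 1024**5
GB = 1024**3
library_capacity = 40 * PB
iphones = library_capacity // (64 * GB)
print(iphones)  # 655360

# At a sustained 700 MB/s, roughly how much arrives per day?
MB = 1024**2
per_day_TB = 700 * MB * 86400 / 1024**4
print(round(per_day_TB, 1))  # about 57.7 TB per day
```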

With all this in place, along with those 20,000 processing cores (most computers today have two or four cores), certain operations that used to require six months of computer time can now often be completed in less than one week.
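
The arithmetic behind that speedup is simple (idealized, ignoring scheduling, I/O, and other overhead):

```python
# How much parallelism turns six months of single-core work into a week?
months_of_cpu = 6
days = months_of_cpu * 30            # roughly 180 days of serial compute
target_days = 7
min_cores = -(-days // target_days)  # ceiling division
print(min_cores)  # 26
```

With 20,000 cores available, a mere ~26-way share of the farm already meets that target, which is why many such jobs finish well inside a week.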

Looking Ahead

If pending budgets allow for the full 15-week run planned, we expect to collect approximately four petabytes of data from this run alone. During the run, we meet formally with liaisons from the PHENIX and STAR collaborations each week to discuss the amount of data expected in the coming weeks and to assess their operational needs. Beyond these meetings, we are in continual communication with our users, as we monitor and improve system functionality, troubleshoot, and provide first-line user support.

We’ll also continue to work with experimenters to evaluate computing trends, plan for future upgrades, and test the latest equipment—all in an effort to minimize bottlenecks that slow the data from getting to users and to get the most bang for the buck.

— Shigeki Misawa – Group Leader, RACF Mass Storage and General Services

— Ofer Rind – Technology Architect, RACF Storage Management


Theoretical physicist Raju Venugopalan

We sat down with Brookhaven theoretical physicist Raju Venugopalan for a conversation about “color glass condensate” and the structure of visible matter in the universe.

Q. We’ve heard a lot recently about a “new form of matter” possibly seen at the Large Hadron Collider (LHC) in Europe — a state of saturated gluons called “color glass condensate.” Brookhaven Lab, and you in particular, have a long history with this idea. Can you tell me a bit about that history?

A. The idea for the color glass condensate arose to help us understand heavy ion collisions at our own collider here at Brookhaven, the Relativistic Heavy Ion Collider (RHIC)—even before RHIC turned on in 2000, and long before the LHC was built. These machines are designed to look at the most fundamental constituents of matter and the forces through which they interact—the same kinds of studies that a century ago led to huge advances in our understanding of electrons and magnetism. Only now instead of studying the behavior of the electrons that surround atomic nuclei, we are probing the subatomic particles that make up the nuclei themselves, and studying how they interact via nature’s strongest force to “give shape” to the universe today.

We do that by colliding nuclei at very high energies to recreate the conditions of the early universe so we can study these particles and their interactions under the most extreme conditions. But when you collide two nuclei and produce matter at RHIC, and also at the LHC, you have to think about the matter that makes up the nuclei you are colliding. What is the structure of nuclei before they collide?

We all know the nuclei are made of protons and neutrons, and those are each made of quarks and gluons. There were hints in data from the HERA collider in Germany and other experiments that the number of gluons increases dramatically as you accelerate particles to high energy. Nuclear physics theorists predicted that the ions accelerated to near the speed of light at RHIC (and later at LHC) would reach an upper limit of gluon concentration—a state of gluon saturation we call color glass condensate. The collision of these super-dense gluon force fields is what produces the matter at RHIC, so learning more about this state would help us understand how the matter is created in the collisions. The theory we developed to describe the color glass condensate also allowed us to make calculations and predictions we could test with experiments.


The Glue that Binds Us All

Wednesday, June 13th, 2012

RHIC, the Relativistic Heavy Ion Collider at Brookhaven Lab, found it first: a “perfect” liquid of strongly interacting quarks and gluons – a quark-gluon plasma (QGP) – produced by slamming heavy ions together at close to the speed of light. The fact that the QGP produced in these particle smashups was a liquid and not the expected gas, and that it flowed like a nearly frictionless fluid, took the physics world by surprise. These findings, now confirmed by heavy-ion experiments at the Large Hadron Collider (LHC) in Europe, have raised compelling new questions about the nature of matter and the strong force that holds the visible universe together.

Similarly, searches for the source of “missing” proton spin at RHIC have opened a deeper mystery: So far, it’s nowhere to be found.

To probe these and other puzzles, nuclear physicists would like to build a new machine: an electron-ion collider (EIC) designed to shine a very bright “light” on both protons and heavy ions to reveal their inner secrets.


Happy Hallowe’en!

Monday, October 31st, 2011

[A five-panel Hallowe’en comic]

Thanks to Steve for the inspiration and Rozmin for her help with editing! And, of course, to the LHC team.


On May 26, 2005, a new supercomputer, a pioneering giant of its time, was unveiled at Brookhaven National Laboratory at a dedication ceremony attended by physicists from around the world. That supercomputer was called QCDOC, for quantum chromodynamics (QCD) on a chip, capable of handling the complex calculations of QCD, the theory that describes the nature and interactions of the basic building blocks of the universe. Now, after a career of state-of-the-art physics calculations, QCDOC has been retired — and will soon be replaced by a new “next generation” machine.


Why run at lower energy?

Wednesday, March 23rd, 2011

Right now the LHC is about to start a short run with proton-proton collisions at a center of mass energy 2.76 TeV.  This is lower than what we ran last year and is a special request from the heavy ion physicists.  So you’ve heard a lot about why the particle physicists want to go to higher energy.  But why do we heavy ion physicists want to go to lower energy?

We want a reference for our lead-lead collisions.  If nucleus-nucleus collisions were nothing but a bunch of proton-proton collisions, what we measure in lead-lead collisions should be just some constant times what we measure in proton-proton collisions.  This is a bit simplistic, but it’s a pretty good start.  A lot of our measurements use proton-proton collisions as a reference and look for differences between proton-proton collisions and lead-lead collisions.  For instance, in the paper I discussed here we looked at the distribution of particles as a function of their momenta in lead-lead collisions and compared that to what we observed in proton-proton collisions.  For this paper we used the data from proton-proton collisions at 900 GeV and at 7 TeV to extrapolate to what we’d expect at 2.76 TeV, the same energy per nucleon as our lead-lead collisions.  As discussed here, our models for proton-proton collisions are pretty good but they get some of the details wrong – and miss some features like this.  Since we depend on models to extrapolate to 2.76 TeV, we have greater uncertainty in our measurements than we would have if we had data at 2.76 TeV.  The LHC can go down to 2.76 TeV, and the 2.76 TeV proton-proton data we need doesn’t require as many statistics (as many total proton-proton collisions) as what the particle physicists need to look for things like the Higgs.  So we’re having a short run with proton-proton collisions at a lower energy because it will significantly help the heavy ion physics program.  (We’ll also get some core physics measurements out of the 2.76 TeV proton-proton data, but like the paper I discussed here, these will refine, rather than dramatically change, our understanding of proton-proton collisions.)  I hope you’re as excited as I am!
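
A toy version of the extrapolation described above: assume the pp observable follows a power law in collision energy and interpolate between the two measured energies on a log-log scale. The real analysis relies on tuned event generators, and the yield numbers here are invented:

```python
import math

# Log-log (power-law) interpolation between two measured energies.
# Energies in TeV; the yields 3.0 and 6.0 are purely illustrative.
def interpolate_powerlaw(e1, y1, e2, y2, e_target):
    slope = (math.log(y2) - math.log(y1)) / (math.log(e2) - math.log(e1))
    return y1 * (e_target / e1) ** slope

# Estimate the observable at 2.76 TeV from 0.9 and 7 TeV measurements.
y_276 = interpolate_powerlaw(0.9, 3.0, 7.0, 6.0, 2.76)
print(round(y_276, 2))
```

The model-dependence of the real extrapolation is exactly why a direct 2.76 TeV pp run reduces the uncertainty.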


I am overdue for a blog post because I have been way too busy lately.  I got an email from an elementary schooler, Jacob, asking about the QGP so I thought instead of replying privately I’d reply here since it may be of general interest.  The questions are from Jacob.

What is QGP going to be used for in the future when it is better controlled?

Right now we don’t think the QGP has any practical applications.  We’re studying it because we want to understand the universe in general and nuclear matter in particular.  Shortly after the Big Bang, we think that the universe went through a Quark Gluon Plasma phase.  By understanding the QGP better, we may understand how the universe expanded better.  When we do basic research, we don’t usually know what impact it will have.  What we know by looking at history is that basic research eventually leads to benefits to humanity – but we’re very bad at predicting what those benefits will be.  When Mendel studied genetics of plants, he never imagined that genetic studies would lead to all of the improvements in medical care we have now.  Einstein developed his theory of gravity not so that we could send satellites into space or so that we could all have GPS in our cars or get better TV reception – he was motivated by simple curiosity and a desire to understand our universe better.  We are still reaping new benefits from quantum mechanics, developed in the early 20th century – we now have light emitting diodes (LEDs) in traffic lights and flashlights and while LEDs existed when I was your age, they weren’t nearly as widespread, as cheap, or available in so many colors.  So it takes a long time to see the benefits of basic research.

So we don’t know what applications this research will have in the future.  That said, there are a lot of spin off benefits to this research.  In high energy physics, we are always building the fastest and most precise detectors possible.  To do this we often have to develop and test new detector technologies.  Once we’ve developed the technology, these detectors can be used elsewhere too.  Particle detectors are used in hospitals in x-ray and MRI machines.  They are used in chemical and biomedical research to study the images of proteins and the structures of solids.  They are used in national security for detecting radioactive materials.

Basic research moves the boundary of what is possible.  Once we have done that, there are a lot of benefits.  But since we’re working on doing things that have never been done and studying things never studied before, we can’t predict exactly how it will be useful.  Put another way, if we knew what would happen, we wouldn’t call it an experiment.

What attributes does it have that other matter does not have?

This is a difficult question to answer as worded – it depends on what you mean by “attributes”.  When I think of the properties of a particular form of matter, I think about its density, its opacity to different probes (like if you shine light through it does the light come out the other side?)…  All forms of matter have a density.  So I’m going to answer a slightly different question – what makes a QGP unique?  What makes the QGP unique (among the forms of matter we’ve studied in the laboratory) is that the quarks and gluons interact through the strong force.  There are four fundamental forces in nature:

1. Gravitation
2. Electromagnetism
3. Weak interaction
4. Strong interaction

The first two are the most familiar.  Gravity is the reason why you stay on the ground instead of floating through the air.  It’s also the reason the Earth orbits the Sun.  The electromagnetic force is ultimately responsible for basically every other force you feel or see.  When you sit in a chair, the reason you don’t fall through the chair is ultimately due to interactions between your atoms and the atoms of the chair.  It’s also behind light and electricity.  It’s how your microwave and your TV work.  The most familiar thing we can attribute to the weak interaction is beta decay – a particular kind of decay of a nucleus.  The strong force is what holds nuclei together.  If we only had the electromagnetic force, the protons in the nucleus would not be bound.

So a QGP is a liquid of quarks and gluons bound together by the strong force.  Water molecules, for instance, primarily interact through the electromagnetic force.  The properties of water are determined by the way water molecules interact through the electromagnetic force.  To understand the QGP, we have to understand how quarks and gluons interact through the strong force.  This turns out to be a very difficult computational problem.  But by studying the QGP, we can try to calculate what we would expect and then compare what we expect from our theories to what we see in the laboratory.

In addition to that, it is the hottest, densest form of matter ever created in the laboratory.  And it appears to have the lowest viscosity of any form of matter ever created in the laboratory.  Viscosity is a way of measuring how much a fluid resists flowing.  Honey, for instance, is much more viscous than water.

How will QGP affect modern or future physics?

I don’t know exactly.  It depends on what we learn.  Already we’ve learned a lot about relativistic fluids – where the individual particles in the fluid are traveling close to the speed of light.  As I said in the first answer, we don’t know exactly what we’ll learn – because if we did, we wouldn’t call it an experiment. One thing I hope – and maybe you can help me out here – is that we’ll inspire the next generation to go into science, math and engineering.

Also, what state of matter is it?  I know that it is called plasma but I’ve also read that it is very similar to both liquid and gas.

A QGP is a new state of matter.  We believe it is a liquid – indeed, a liquid that probably has the lowest viscosity of anything we’ve ever measured.  We thought it’d be a gas, but it turned out to be a liquid.  Here I have a post describing what we know about the QGP and its phase diagram.

I also could not verify what temperature it occurs at because there is so much different information on the internet.

The reason what you find on the internet is somewhat unclear is that the answer is somewhat unclear.  First, it doesn’t exist at just one temperature.  Think about water.  Water can be cold, warm, hot, etc.  It depends.  There’s a temperature where ice melts and becomes water and below that you can’t have water.  That temperature is called the melting point.  But then once you have water, you can heat it up and you have to heat it up a lot before it boils and becomes a gas.  That also occurs at a special temperature – the boiling point.  The problem is, these temperatures depend on pressure and volume.  Water boils at a lower temperature at high altitude.  Analogously, we have a melting point and a boiling point for the QGP.  We think the melting point at the baryochemical potential at RHIC is about 170 MeV – but there’s a fairly large uncertainty in that number.  We think we’re well above that at RHIC and we’ll be even further above it at the LHC (but we haven’t yet had enough time to analyze the data at the LHC to say how hot it is). This gets to a crucial issue – we don’t have a thermometer to measure a QGP.  If you put a thermometer like the one you have in your house into a vat of QGP (if we could ever create that much of it) it’d melt.  So we have to come up with other ways of measuring the temperature.  We can look at the energies of particles created in the collision, for instance.  But it takes more work than just using a thermometer.
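
One of those indirect thermometers can be illustrated with a toy fit: for a thermal-like source, particle yields fall off roughly as exp(-E/T), so the inverse slope of the measured spectrum acts as a temperature. This sketch generates idealized data with T = 170 MeV and recovers it (real spectra have noise, feed-down, and flow effects that this ignores):

```python
import math

# Synthetic "spectrum": yields falling as exp(-E/T) with T = 0.170 GeV.
T_true = 0.170
energies = [0.2, 0.4, 0.6, 0.8, 1.0, 1.2]           # GeV
yields = [math.exp(-e / T_true) for e in energies]   # idealized, noiseless

# Least-squares fit of ln(yield) = a - E/T; -1/slope recovers T.
n = len(energies)
mx = sum(energies) / n
my = sum(math.log(y) for y in yields) / n
slope = sum((e - mx) * (math.log(y) - my) for e, y in zip(energies, yields)) \
        / sum((e - mx) ** 2 for e in energies)
T_fit = -1.0 / slope
print(round(T_fit, 3))  # 0.17
```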

Many thanks to Jacob for the great questions!


On ALICE

Thursday, February 3rd, 2011

The electromagnetic calorimeter is now fully installed but there’s still work to do before we start running.  We now have to make sure we’re able to read all of the data.  I’ve spent most of the last week in, on, and next to ALICE troubleshooting (along with several of my colleagues working on the calorimeter).  Here I am sitting inside the magnet on top of the support structure next to the front end electronics (the boards that read out the data) for the calorimeter.  I’m on the phone with someone upstairs who’s trying to take a pedestal run to see if we’ve fixed a problem reading out data from one of the new supermodules.  (A pedestal run is a run you take without proton-proton or lead-lead collisions to see what the background in your detector is.  It’s useful for troubleshooting because the detector has to send data.)
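
The bookkeeping a pedestal run enables can be sketched like this (the channel names and ADC values below are invented):

```python
import statistics

# With no collisions, each readout channel reports only its electronic
# baseline, so we record many samples per channel and store the mean
# (the pedestal) and its spread.
pedestal_samples = {
    "tower_001": [51, 49, 50, 52, 48],
    "tower_002": [103, 101, 102, 100, 104],
}

pedestals = {
    ch: (sum(vals) / len(vals), statistics.pstdev(vals))
    for ch, vals in pedestal_samples.items()
}
for ch, (mean, sigma) in sorted(pedestals.items()):
    print(f"{ch}: pedestal={mean:.1f} ADC, sigma={sigma:.2f}")

# In collision data, the baseline is subtracted from each raw reading:
def calibrated(ch, raw_adc):
    return raw_adc - pedestals[ch][0]

print(calibrated("tower_001", 75))  # 25.0
```

A channel that fails to send data, or whose pedestal drifts, shows up immediately in a run like this — which is exactly why it is useful for troubleshooting.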

Now that we’re getting close to the start of the run, they’re putting the concrete shielding in.  In total 30 or 40 tons of concrete blocks sit above ALICE.  Here you can see one of the last blocks going in:

And just to go along with the preposition theme, here’s a picture under ALICE (in the magnet but under the TPC, TRD, and TOF):


I introduced you to the ALICE electromagnetic calorimeter (EMCal) a while ago, and told you about some additional training I had so that I could work on the detector after the EMCal is physically installed.  Over the winter shut down – right now – the EMCal is being installed inside ALICE.

There are several steps in this process.  First the EMCal was assembled, partially calibrated, and tested – this was done in November.  There were several stages of testing.   We tested that each individual cell works.  We tested that each individual electronic card for reading out the data works.  We assembled everything exactly how it would be installed inside ALICE and tested it again, making sure that all of the parts (including the wires) worked together.  We partially calibrated the detector by taking data on cosmic rays.  We’ve had all of the six supermodules we’re adding waiting at CERN until we could get access to ALICE to install them.

Now they’re physically installing the supermodules and our amateur EMCal documentarian, Federico, has taken some videos of the process.  (It might help to go back to this post, where I introduce each of the detectors and explain what they do, and this post, where I show you some pictures of each of the detectors.  Then maybe you can identify the different parts of ALICE in the video.)  Note the action in the videos is very slow because it’s very important not to damage anything while installing the detector.

The first step is to put the supermodule in the EMCal insertion tool.  This is a specialized device for installing EMCal supermodules.  Here you can see a video of that step:

[youtube 75Olhr4YoUw&NR]

And then once the supermodule is in the insertion tool, it gets installed in ALICE:

[youtube 0es9Qcdj-H8]

And now some gratuitous cool pictures of the process:

ALICE ready for the installation of EMCal supermodules

Looking up from the cavern.  We sit at the top when we take data and the detector is far below us.  The EMCal supermodules have to be lowered down.

Checking everything twice to make sure there are no mistakes.

Getting the EMCal insertion tool ready for a supermodule

Getting ready to strap the support onto the supermodule

Sliding the support onto the supermodule

Now it’s in and they’re strapping it onto the crane

Up goes the supermodule…

…and into the insertion tool.

And it’s rotated to the correct angle…

Waiting on the support structure in ALICE to make sure it goes in properly…

Get it in the right position…

Now loosen it from the EMCal tool and in it goes!

And now we do the next one.

Many thanks to Federico for the great pics!


Jet quenching

Monday, December 13th, 2010

There have been a lot of exciting results lately and I haven’t gotten a chance to write about them because I’ve been too busy.  Today I’ll tackle jet quenching, which Seth touched on in one of his posts.

You may have done absorption spectroscopy in a chemistry lab.  In absorption spectroscopy, light from a calibrated source passes through a sample and changes in the light after passing through the sample are used to determine the properties of the sample.  For example, you may have a liquid that absorbs blue light but lets orange light through.  This tells you something about the properties of the liquid.  We want something like that for studying the Quark Gluon Plasma (QGP).  Perhaps we could try shining light on the QGP to see what it does to the light, how much is absorbed?  The problem with that is that the QGP formed in a nucleus-nucleus collision doesn’t live very long – about 10⁻²⁴ seconds.  Trying to aim light at the QGP would be like trying to hit a fighter plane at top speed with a Nerf gun – by the time you aimed, the plane would be long gone.

Fortunately, photons (light) are created in the lead-lead collisions.  Since they are produced in the collision, we know they went through the QGP so we can use them and study how they’re affected by the QGP to determine its properties.  This is analogous to determining what a store sells by looking at what people have in their shopping bags when they leave the store rather than by going in the store yourself.  This is one of the measurements we’ll see at some point.  But photons only interact through the electromagnetic force and many of the features of the QGP we’re trying to study come from the interaction of quarks and gluons through the strong force.  To study these properties, we need something like a photon, but that interacts through the strong force.  We can use quarks and gluons.

There are quarks and gluons in the incoming lead nuclei, and a quark or gluon in one nucleus can scatter off of a quark or gluon in the other nucleus.  We’re particularly interested in hard scatterings, where they hit each other and bounce off like billiard balls.  This process happens early in the collision, and then the partons travel through the medium, as shown below:


But there’s a complication.  We can’t see individual quarks and gluons – they’re always bound in hadrons, states made of a quark and an antiquark (mesons) or three quarks (baryons), a property called confinement.  After the parton gets knocked out of the nucleon, it hadronizes – it breaks up into several mesons and baryons.  These are actually what we observe in our detector.  For each parton, we have a cone of hadrons called a jet.  This is an event display from the STAR experiment showing two jets in a proton-proton collision:

In a proton-proton collision, it’s easy to see jets, but in a heavy ion collision they’re in events like these:

So it’s not as easy to find jets in heavy ion collisions.  One thing we can do is look at very fast moving hadrons.  These are more likely to have come from jets.  This is the subject of the most recent ALICE paper.  This is the main result from that paper:

The x-axis is the momentum of the hadron perpendicular to the beam, called the transverse momentum.  The y-axis is something called RAA, which is the ratio of the number of hadrons we measure in lead-lead collisions to the number we would expect if a lead-lead collision were just a bunch of nucleon-nucleon collisions.  We take what we measure in proton-proton collisions and scale it by the number of proton-proton, proton-neutron, and neutron-neutron collisions we would expect.  (Yes, I’m skipping lots of technical details about how that scaling is done.)  Another way of putting it is that it’s what we get divided by what we expect.  If RAA were exactly 1.0, it’d mean there’s no physics in lead-lead collisions that isn’t in proton-proton collisions.  An RAA less than one means we see way fewer particles than we expect.  In the figure, the open points are what we measure for peripheral collisions, where the nuclei just barely graze each other.  The solid points show what we measure for central – head-on – collisions.  The big, obvious feature is the bump which peaks for particles with a transverse momentum of about 2 GeV/c.  There’s a lot of physics in there and it’s really interesting but it’s not what I’m talking about today.  Look at what it does at higher momenta – above about 5 GeV/c.  This is where we trust our theoretical calculations the most.  (At lower momenta, there’s much more theoretical uncertainty in what to expect.)  We see only about 15% of the number of particles we expect to see.  This was already observed at the Relativistic Heavy Ion Collider, but the effect is larger at the LHC.
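
The definition of RAA can be written out directly.  All the numbers below are invented, chosen only to reproduce the roughly 15% quoted above:

```python
# R_AA: lead-lead yield divided by the binary-collision-scaled
# proton-proton expectation (the "what we get / what we expect" ratio).
def r_aa(yield_pbpb, yield_pp, n_coll):
    """R_AA = (yield in Pb-Pb) / (N_coll x yield in pp)."""
    return yield_pbpb / (n_coll * yield_pp)

n_coll_central = 1500    # hypothetical average number of binary collisions
yield_pp = 2.0e-6        # hypothetical pp hadron yield above some pT
yield_pbpb = 4.5e-4      # hypothetical measured Pb-Pb yield at the same pT

print(round(r_aa(yield_pbpb, yield_pp, n_coll_central), 2))  # 0.15
```

An RAA of 1.0 would mean lead-lead collisions behave like an incoherent pile of nucleon-nucleon collisions; 0.15 means five out of six expected high-momentum hadrons are missing.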

This happens because the QGP is really, really dense.  It’s harder for a parton to go through the QGP than it would be to walk through a Target store on the day after Christmas.  The parton loses its energy in the QGP.  Imagine shooting a bullet into a block of lead – it’d just get stuck.

ATLAS’s recent paper exhibits this more directly.  Here’s a lead-lead event where the lead nuclei barely hit each other.  Here you can see two jets, like what you’d expect if neither parton got stuck in the QGP:

The φ axis is the angle around the beam pipe in radians, the η axis is a measure of the angle between the particle and the beam pipe, and the z axis is the amount of energy observed in the calorimeter.  Imagine rolling this plot up into a tube, connecting φ=π to φ=-π and that would show you roughly where the energy is deposited.  The peaks are from jets, like in the event display from STAR above.  The amount of energy in each peak is about the same – if you added up each block in the peak for both peaks, they’d be about equal.  And here’s a lead-lead event where one of the partons got stuck in the medium:

In this plot one of the peaks is missing.  One of the jets is quenched – it got absorbed by the QGP.  This is the first direct observation of jet quenching in a single event.  It’s causing quite a buzz in the field.
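
The peak-summing comparison described above can be sketched on a toy tower grid.  The grid, cone size, and energies are all invented, and for simplicity this ignores the φ wrap-around at ±π:

```python
# Toy calorimeter towers: (eta, phi) -> energy in GeV, with two
# back-to-back peaks of very different total energy.
towers = {
    (0.5, 1.0): 60.0, (0.5, 1.1): 25.0, (0.6, 1.0): 15.0,   # jet 1
    (-0.4, -2.1): 20.0, (-0.5, -2.2): 8.0,                  # jet 2
}

def cone_energy(center_eta, center_phi, radius=0.4):
    """Sum tower energy within a cone around a peak (no phi wrapping)."""
    return sum(e for (eta, phi), e in towers.items()
               if (eta - center_eta) ** 2 + (phi - center_phi) ** 2 <= radius ** 2)

e1 = cone_energy(0.5, 1.0)
e2 = cone_energy(-0.4, -2.1)
print(e1, e2)          # 100.0 28.0
print(e2 / e1 < 0.5)   # a large imbalance is the quenching signature
```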
