## Posts Tagged ‘astrophysics’

### Building an instrument to map the universe in 3-D

Wednesday, May 27th, 2015

The future Dark Energy Spectroscopic Instrument will be mounted on the Mayall 4-meter telescope. It will be used to create a 3-D map of the universe for studies of dark energy. Photo courtesy of NOAO

Dark energy makes up about 70 percent of the universe and is causing its accelerating expansion. But what it is or how it works remains a mystery.

The Dark Energy Spectroscopic Instrument (DESI) will study the origins and effects of dark energy by creating the largest 3-D map of the universe to date. It will produce a map of the northern sky that will span 11 billion light-years and measure around 25 million galaxies and quasars, extending back to when the universe was a mere 3 billion years old.

Once construction is complete, DESI will sit atop the Mayall 4-meter Telescope in Arizona and take data for five years.

DESI will work by collecting light using optical fibers that look through the instrument’s lenses and can be robotically repositioned to point precisely at individual galaxies. With 5,000 fibers, it can collect light from 5,000 galaxies at a time. The fibers will pass the galaxy light to a spectrograph, and researchers will use the resulting spectra to precisely determine each galaxy’s three-dimensional position in the universe.
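The link from spectrum to 3-D position runs through redshift: an expanding universe stretches the light of distant galaxies toward longer wavelengths, and the amount of stretch tells you the distance. The sketch below is a deliberately simplified illustration of that step, using the low-redshift Hubble-law approximation rather than DESI’s full cosmological pipeline; the line wavelengths and Hubble-constant value are illustrative assumptions, not project numbers.

```python
# Sketch: turning a measured spectral shift into a distance estimate.
# Low-redshift approximation (Hubble's law) only; the real analysis
# uses a full cosmological model. All numbers here are illustrative.

C_KM_S = 299_792.458   # speed of light, km/s
H0 = 70.0              # Hubble constant, km/s per megaparsec (assumed value)

def redshift(observed_nm: float, rest_nm: float) -> float:
    """z = (lambda_observed - lambda_rest) / lambda_rest."""
    return (observed_nm - rest_nm) / rest_nm

def distance_mpc(z: float) -> float:
    """Low-z approximation: d ~ c*z / H0, in megaparsecs."""
    return C_KM_S * z / H0

# Example: the H-alpha line (rest wavelength 656.28 nm) observed at 721.9 nm
z = redshift(721.9, 656.28)
print(f"z = {z:.3f}, d = {distance_mpc(z):.0f} Mpc")
```

Repeating this for millions of galaxies, each with a sky position from imaging and a distance from its spectrum, is what builds up the 3-D map.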

Lawrence Berkeley National Laboratory is managing the DESI experiment, and Fermilab is making four main contributions: building the instrument’s barrel, packaging and testing charge-coupled devices, or CCDs, developing an online database and building the software that will tell the fibers exactly where to point.

The barrel is a structure that will hold DESI’s six lenses. Once complete, it will be around 2.5 meters tall and a meter wide, about the size of a telephone booth. Fermilab is assembling both the barrel and the structures that will hold it on the telescope.

“It’s a big object that needs to be built very precisely,” said Gaston Gutierrez, a Fermilab scientist managing the barrel construction. “It’s very important to position the lenses very accurately, otherwise the image will be blurred.”

DESI’s spectrograph will use CCDs, sensors that work by converting light collected from distant galaxies into electrons, then to digital values for analysis. Fermilab is responsible for packaging and testing these CCDs before they can be assembled into the spectrograph.

Fermilab is also creating a database that will store information required to operate DESI’s online systems, which direct the position of the telescope, control and read the CCDs, and ensure proper functioning of the spectrograph.

Lastly, Fermilab is developing the software that will convert the known positions of interesting galaxies and quasars to coordinates for the fiber positioning system.
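A core step in that conversion is projecting a target’s celestial coordinates (right ascension and declination) onto the flat focal plane where the positioners live. A standard way to do this is the gnomonic (tangent-plane) projection, sketched below. This is an illustrative simplification: the actual DESI software must also model optical distortion, atmospheric refraction and the positioners’ mechanics, and the example coordinates are invented.

```python
# Sketch: gnomonic (tangent-plane) projection of sky coordinates onto a
# focal plane -- a standard first step from catalog positions toward
# fiber-positioner coordinates. Real pointing software does much more.
import math

def gnomonic(ra_deg, dec_deg, ra0_deg, dec0_deg):
    """Project (ra, dec) onto the tangent plane centered at (ra0, dec0).

    Returns (xi, eta) in radians; multiplying by the telescope's focal
    length gives physical offsets on the focal plane."""
    ra, dec = math.radians(ra_deg), math.radians(dec_deg)
    ra0, dec0 = math.radians(ra0_deg), math.radians(dec0_deg)
    cos_c = (math.sin(dec0) * math.sin(dec)
             + math.cos(dec0) * math.cos(dec) * math.cos(ra - ra0))
    xi = math.cos(dec) * math.sin(ra - ra0) / cos_c
    eta = (math.cos(dec0) * math.sin(dec)
           - math.sin(dec0) * math.cos(dec) * math.cos(ra - ra0)) / cos_c
    return xi, eta

# A target 0.5 degrees east in RA of the pointing center, at declination 30:
# xi comes out near 0.00756 rad (about 0.43 degrees, since RA separations
# shrink by cos(dec)), and eta is very nearly zero.
xi, eta = gnomonic(180.5, 30.0, 180.0, 30.0)
```

Each of the 5,000 fibers then needs such a projected position, recomputed as the telescope tracks the sky.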

Fermilab completed these same tasks when it built the Dark Energy Camera (DECam), an instrument that currently sits on the Victor Blanco Telescope in Chile, imaging the universe. Many of the scientists and engineers who built DECam are bringing that expertise to DESI.

“DESI is the next step. DECam is going to precisely measure the sky in 2-D, and getting to the third dimension is a natural progression,” said Fermilab’s Brenna Flaugher, project manager for DECam and one of the leading scientists on DESI.

These four contributions are set to be completed by 2018, and DESI is expected to see first light in 2019.

“This is a great opportunity for students to learn the technology and participate in a nice instrumentation project,” said Juan Estrada, a Fermilab scientist leading the DESI CCD effort.

DESI is funded largely by the Department of Energy with significant contributions from non-U.S. and private funding sources. It is currently undergoing the DOE CD-2 review and approval process.

“We’re really appreciative of the strong technical and scientific support from Fermilab,” said Berkeley Lab’s Michael Levi, DESI project director.

Diana Kwon

### Absence of gravitational-wave signal extends limit on knowable universe

Thursday, April 9th, 2015

The Holometer is sensitive to high-frequency gravitational waves, allowing it to look for sources such as cosmic strings. Photo: Reidar Hahn

Imagine an instrument that can measure motions a billion times smaller than an atom, lasting only a millionth of a second. Fermilab’s Holometer is currently the only machine able to take such precise measurements of space and time, and recently collected data have improved the limits on theories about exotic objects from the early universe.

Our universe is as mysterious as it is vast. According to Albert Einstein’s theory of general relativity, anything that accelerates creates gravitational waves, which are disturbances in the fabric of space and time that travel at the speed of light and continue infinitely into space. Scientists are trying to detect waves from possible sources reaching all the way back to the beginning of the universe.

The Holometer experiment, based at the Department of Energy’s Fermilab, is sensitive to gravitational waves at frequencies in the range of a million cycles per second. Thus it addresses a spectrum not covered by experiments such as the Laser Interferometer Gravitational-Wave Observatory, which searches for lower-frequency waves to detect massive cosmic events such as colliding black holes and merging neutron stars.

“It’s a huge advance in sensitivity compared to what anyone had done before,” said Craig Hogan, director of the Center for Particle Astrophysics at Fermilab.

This unique sensitivity allows the Holometer to look for exotic sources that could not otherwise be found. These include tiny black holes and cosmic strings, both possible phenomena from the early universe that scientists expect to produce high-frequency gravitational waves. Tiny black holes could be less than a meter across and orbit each other a million times per second; cosmic strings are loops in space-time that vibrate at the speed of light.

The Holometer is composed of two Michelson interferometers that each split a laser beam down two 40-meter arms. The beams reflect off the mirrors at the ends of the arms and travel back to reunite. Passing gravitational waves alter the lengths of the beams’ paths, causing fluctuations in the laser light’s brightness, which physicists can detect.
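For an idealized Michelson interferometer, the recombined beam’s brightness depends on the path difference between the two arms as I = I0 · cos²(2π·ΔL/λ), which is why a sub-atomic length change becomes a measurable light signal. The sketch below illustrates that relation with invented numbers; it ignores the Holometer’s actual readout and noise-subtraction scheme, and the 1064 nm wavelength is just a typical infrared laser value assumed for illustration.

```python
# Sketch: why a tiny arm-length change shows up as a brightness change
# in an idealized Michelson interferometer.
#     I = I0 * cos^2(2*pi*dL / wavelength)
# where dL is the arm-length difference (round trip folded in).
# All numbers are illustrative, not Holometer specifications.
import math

def output_intensity(dL_m: float, wavelength_m: float, I0: float = 1.0) -> float:
    return I0 * math.cos(2 * math.pi * dL_m / wavelength_m) ** 2

WAVELENGTH = 1064e-9  # assumed infrared laser wavelength, meters

# Bias the instrument at the point of steepest response (half fringe),
# then displace one mirror by 1e-19 m -- a length a billion times
# smaller than an atom. The output intensity dips measurably below 0.5.
bias_point = WAVELENGTH / 8
base = output_intensity(bias_point, WAVELENGTH)
shifted = output_intensity(bias_point + 1e-19, WAVELENGTH)
```

The experimental art is in distinguishing that minuscule dip from laser and electronic noise, which is why the team spent years on noise reduction.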

The Holometer team spent five years building the apparatus and minimizing noise sources to prepare for experimentation. Now the Holometer is taking data continuously, and with an hour’s worth of data, physicists were able to confirm that there are no high-frequency gravitational waves at the magnitude where they were searching.

The absence of a signal provides valuable information about our universe. Although this result does not prove whether the exotic objects exist, it has ruled out the nearby region of the universe as a place where they could be present.

“It means that if there are primordial cosmic string loops or tiny black hole binaries, they have to be far away,” Hogan said. “It puts a limit on how much of that stuff can be out there.”

Detecting these high-frequency gravitational waves is a secondary goal of the Holometer. Its main purpose is to determine whether our universe acts like a 2-D hologram, where information is coded into two-dimensional bits at the Planck scale, a length around ten trillion trillion times smaller than an atom. That investigation is still in progress.

“For me, it’s gratifying to be able to contribute something new to science,” said researcher Bobby Lanza, who recently earned his Ph.D. conducting research on the Holometer. He is the lead author on an upcoming paper about the result. “It’s part of chipping away at the whole picture of the universe.”

Diana Kwon

### Expanding the cosmic search

Friday, March 20th, 2015

The South Pole Telescope scans the skies during a South Pole winter. Photo: Jason Gallicchio, University of Chicago

Down at the South Pole, where temperatures drop below negative 100 degrees Fahrenheit and darkness blankets the land for six months at a time, the South Pole Telescope (SPT) searches the skies for answers to the mysteries of our universe.

This mighty scavenger is about to get a major upgrade — a new camera that will help scientists further understand neutrinos, the ghost-like particles without electric charge that rarely interact with matter.

The 10-meter SPT is the largest telescope ever to make its way to the South Pole. It stands atop a two-mile-thick plateau of ice, mapping the cosmic microwave background (CMB), the light left over from the big bang. Astrophysicists use these observations to understand the composition and evolution of the universe, all the way back to the first fraction of a second after the big bang, when scientists believe the universe quickly expanded during a period called inflation.

One of the goals of the SPT is to determine the masses of the neutrinos, which were produced in great abundance soon after the big bang. Though nearly massless, neutrinos exist in such huge numbers that they contribute to the total mass of the universe and affect its expansion. By mapping out the mass density of the universe through measurements of CMB lensing, the bending of light caused by immense objects such as large galaxies, astrophysicists are trying to determine the masses of these elusive particles.

A wafer of detectors for the SPT-3G camera undergoes inspection at Fermilab. Photo: Bradford Benson, University of Chicago

To conduct these extremely precise measurements, scientists are installing a bigger, more sensitive camera on the telescope. This new camera, SPT-3G, will be four times heavier and have about 10 times as many detectors as the current camera. Its higher sensitivity will allow researchers to make extremely precise measurements of the CMB that will hopefully make it possible to detect neutrino mass cosmologically.

This photo shows an up-close look at a single SPT-3G detector. Photo: Volodymyr Yefremenko, Argonne National Laboratory

“In the next several years, we should be able to get to the sensitivity level where we can measure the number of neutrinos and derive their mass, which will tell us how they contribute to the overall density of the universe,” explained Bradford Benson, the head of the CMB Group at Fermilab. “This measurement will also enable even more sensitive constraints on inflation and has the potential to measure the energy scale of the associated physics that caused it.”

SPT-3G is being completed by a collaboration of scientists spanning the DOE national laboratories, including Fermilab and Argonne, and universities including the University of Chicago and University of California, Berkeley. The national laboratories provide the resources needed for the bigger camera and larger detector array while the universities bring years of expertise in CMB research.

“The national labs are getting involved because we need to scale up our infrastructure to support the big experiments the field needs for the next generation of science goals,” Benson said. Fermilab’s main role is the initial construction and assembly of the camera, as well as its integration with the detectors. This upgrade is being supported mainly by the Department of Energy and the National Science Foundation, which also supports the operations of the experiment at the South Pole.

Once the camera is complete, scientists will bring it to the South Pole, where conditions are optimal for these experiments. The extreme cold prevents the air from holding much water vapor, which can absorb microwave signals, and the sun, another source of microwaves, does not rise between March and September.

The South Pole is accessible only for about three months during the year, starting in November. This fall, about 20 to 30 scientists will head down to the South Pole to assemble the camera on the telescope and make sure everything works before leaving in mid-February. Once the camera is installed, scientists will use it to observe the sky over four years.

“For every project I’ve worked on, it’s that beginning — when everyone is so excited not knowing what we’re going to find, then seeing things you’ve been dreaming about start to show up on the computer screen in front of you — that I find really exciting,” said University of Chicago’s John Carlstrom, the principal investigator for the SPT-3G project.

Diana Kwon

### Scientists find rare dwarf satellite galaxy candidates in Dark Energy Survey data

Tuesday, March 10th, 2015

This Fermilab press release came out on March 10, 2015.

Scientists on two continents have independently discovered a set of celestial objects that seem to belong to the rare category of dwarf satellite galaxies orbiting our home galaxy, the Milky Way.

Dwarf galaxies are the smallest known galaxies, and they could hold the key to understanding dark matter and the process by which larger galaxies form.

A team of researchers with the Dark Energy Survey, headquartered at the U.S. Department of Energy’s Fermi National Accelerator Laboratory, and an independent group from the University of Cambridge jointly announced their findings today. Both teams used data taken during the first year of the Dark Energy Survey, all of which is publicly available, to carry out their analysis.

“The large dark matter content of Milky Way satellite galaxies makes this a significant result for both astronomy and physics,” said Alex Drlica-Wagner of Fermilab, one of the leaders of the Dark Energy Survey analysis.

Satellite galaxies are small celestial objects that orbit larger galaxies, such as our own Milky Way. Dwarf galaxies can contain fewer than 100 stars and are remarkably faint and difficult to spot. (By contrast, the Milky Way, an average-sized galaxy, contains billions of stars.)

These newly discovered objects are a billion times dimmer than the Milky Way and a million times less massive. The closest of them is about 100,000 light-years away.

“The discovery of so many satellites in such a small area of the sky was completely unexpected,” said Cambridge’s Institute of Astronomy’s Sergey Koposov, the Cambridge study’s lead author. “I could not believe my eyes.”

Scientists have previously found more than two dozen of these satellite galaxies around our Milky Way. About half of them were discovered in 2005 and 2006 by the Sloan Digital Sky Survey, the precursor to the Dark Energy Survey. After that initial explosion of discoveries, the rate fell to a trickle and dropped off entirely over the past five years.

The Dark Energy Survey is looking at a new portion of the southern hemisphere, covering a different area of sky than the Sloan Digital Sky Survey. The galaxies announced today were discovered in a search of only the first of the planned five years of Dark Energy Survey data, covering roughly one-third of the portion of sky that DES will study. Scientists expect that the full Dark Energy Survey will find up to 30 of these satellite galaxies within its area of study.

This illustration maps out the previously discovered dwarf satellite galaxies (in blue) and the newly discovered candidates (in red) as they sit outside the Milky Way. Image: Yao-Yuan Mao, Ralf Kaehler, Risa Wechsler (KIPAC/SLAC).

Atlas image obtained as part of the Two Micron All Sky Survey (2MASS), a joint project of the University of Massachusetts and the Infrared Processing and Analysis Center/California Institute of Technology, funded by the National Aeronautics and Space Administration and the National Science Foundation.

While more analysis is required to confirm any of the observed celestial objects as satellite galaxies, researchers note their size, low surface brightness and significant distance from the center of the Milky Way as evidence that they are excellent candidates. Further tests are ongoing, and data collected during the second year of the Dark Energy Survey could yield more of these potential dwarf galaxies to study.

Newly discovered galaxies would also present scientists with more opportunities to search for signatures of dark matter. Dwarf satellite galaxies are dark matter-dominated, meaning they have much more mass in unseen matter than in stars. The nature of this dark matter remains unknown but might consist of particles that annihilate each other and release gamma rays. Because dwarf galaxies do not host other gamma ray sources, they make ideal laboratories to search for signs of dark matter annihilation. Scientists are confident that further study of these objects will lead to even more sensitive searches for dark matter.

In a separate result also announced today, the Large Area Telescope Collaboration for NASA’s Fermi Gamma-ray Space Telescope mission reported that it did not see any significant excess of gamma-ray emission associated with the new Dark Energy Survey objects. This result demonstrates that new discoveries from optical telescopes can be quickly translated into tests of fundamental physics.

“We did not detect significant emission with the LAT, but the dwarf galaxies that DES has and will discover are extremely important targets for the dark matter search,” said Peter Michelson, spokesperson for the LAT collaboration. “If not leading to an identification of particle dark matter, they will certainly be useful to constrain its properties.”

The Dark Energy Survey is a five-year effort to photograph a large portion of the southern sky in unprecedented detail. Its primary instrument is the Dark Energy Camera, which – at 570 megapixels – is the most powerful digital camera in the world, able to see galaxies up to 8 billion light-years from Earth. Built and tested at Fermilab, the camera is now mounted on the 4-meter Victor M. Blanco telescope at the Cerro Tololo Inter-American Observatory in the Andes Mountains in Chile.

The survey’s five-year mission is to discover clues about the nature of dark energy, the mysterious force that makes up about 70 percent of all matter and energy in the universe. Scientists believe that dark energy may be the key to understanding why the expansion of the universe is accelerating.

“The Dark Energy Camera is a perfect instrument for discovering small satellite galaxies,” said Keith Bechtol of the Kavli Institute for Cosmological Physics at the University of Chicago, who helped lead the Dark Energy Survey analysis. “It has a very large field of view to quickly map the sky and great sensitivity, enabling us to look at very faint stars. These results show just how powerful the camera is and how significant the data it collects will be for many years to come.”

The Dark Energy Survey analysis is available here. The University of Cambridge analysis is available here.

The Dark Energy Survey is a collaboration of more than 300 scientists from 25 institutions in six countries. For more information about the survey, please visit the experiment’s website.

Funding for the DES Projects has been provided by the U.S. Department of Energy, the U.S. National Science Foundation, the Ministry of Science and Education of Spain, the Science and Technology Facilities Council of the United Kingdom, the Higher Education Funding Council for England, the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign, the Kavli Institute of Cosmological Physics at the University of Chicago, Financiadora de Estudos e Projetos, Fundação Carlos Chagas Filho de Amparo à Pesquisa do Estado do Rio de Janeiro, Conselho Nacional de Desenvolvimento Científico e Tecnológico and the Ministério da Ciência e Tecnologia, the Deutsche Forschungsgemeinschaft and the collaborating institutions in the Dark Energy Survey. The DES participants from Spanish institutions are partially supported by MINECO under grants AYA2012-39559, ESP2013-48274, FPA2013-47986 and Centro de Excelencia Severo Ochoa SEV-2012-0234, some of which include ERDF funds from the European Union.

Fermilab is America’s premier national laboratory for particle physics and accelerator research. A U.S. Department of Energy Office of Science laboratory, Fermilab is located near Chicago, Illinois, and operated under contract by the Fermi Research Alliance, LLC. Visit Fermilab’s website at www.fnal.gov and follow us on Twitter at @Fermilab.

The DOE Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

The mission of the University of Cambridge is to contribute to society through the pursuit of education, learning and research at the highest international levels of excellence. To date, 90 affiliates of the university have won the Nobel Prize. Founded in 1209, the university comprises 31 autonomous colleges, which admit undergraduates and provide small-group tuition, and 150 departments, faculties and institutions. Cambridge is a global university. Its student body of 19,000 includes 3,700 international students from 120 countries. Cambridge researchers collaborate with colleagues worldwide, and the university has established larger-scale partnerships in Asia, Africa and America. The university sits at the heart of one of the world’s largest technology clusters. The ‘Cambridge Phenomenon’ has created 1,500 hi-tech companies, 14 of them valued at over US$1 billion and two at over US$10 billion. Cambridge promotes the interface between academia and business and has a global reputation for innovation. www.cam.ac.uk.

### The Dark Energy Survey begins

Wednesday, September 4th, 2013

Over the next five years, scientists will capture some of the grandest images of the cosmos ever seen and use them to probe the mystery of dark energy. Image courtesy of the Dark Energy Survey Collaboration

Space: the final frontier. These are the voyages of the Dark Energy Survey. Its five-year mission: to map a portion of the southern sky in unprecedented detail. To use the world’s most powerful digital camera to probe the mystery of dark energy. To boldly photograph where no astrophysicist has photographed before.

The Dark Energy Survey officially began on Saturday, Aug. 31. Using the Dark Energy Camera, a 570-megapixel imaging device built at Fermilab, scientists plan to take clear, dazzling pictures of the largest number of galaxies ever studied in such a survey. The camera is mounted on a telescope at the Cerro Tololo Inter-American Observatory in Chile, which offers a mountaintop vista perfect for obtaining crystal-clear, high-resolution images.

“With the start of the survey, the work of more than 200 collaborators is coming to fruition,” says Fermilab physicist Josh Frieman, director of the Dark Energy Survey, in a press release. “It’s an exciting time in cosmology, when we can use observations of the distant universe to tell us about the fundamental nature of matter, energy, space and time.”

Over five years, scientists will capture full-color photographs of 300 million galaxies, 100,000 galaxy clusters and 4,000 new supernovae. The camera is powerful enough to see light from more than 8 billion light-years away. The Dark Energy Camera’s 62 charge-coupled devices will provide a previously unheard-of level of sensitivity to red light. This will help determine the distances to galaxies—those that appear red are generally farther away, while those that appear blue are nearer by.

But the survey is not just about collecting pretty pictures. Scientists are searching for the answer to a fascinating mystery: Why is the expansion of the universe accelerating? The Dark Energy Survey will use four methods to probe dark energy, the phenomenon believed to be pushing the universe apart:

1. Counting galaxy clusters. While gravity pulls mass together to form galaxies and clusters of galaxies, dark energy pushes it apart. The Dark Energy Camera will see light from galaxy clusters billions of light years away, and counting those clusters at different points in time will offer insight into the cosmic competition between gravity and dark energy.

2. Measuring supernovae. A supernova is a star that explodes, becoming as bright as an entire galaxy of billions of stars. By measuring how bright it appears on Earth, scientists can tell how far away a supernova is and then use that information to determine how fast the universe has been expanding since the star’s explosion.

3. Studying the bending of light. When light from distant galaxies encounters dark matter in space, it bends around it, causing those galaxies to appear distorted in telescope images. The survey will measure the shapes of 200 million galaxies, exploring how gravity and dark energy mold the lumps of dark matter throughout space.

4. Using sound waves to map the universe’s expansion. Sound waves created hundreds of thousands of years after the big bang left an imprint on the way galaxies are distributed across the universe. The survey will measure the positions in space of 300 million galaxies to find this imprint and use it to infer the history of cosmic expansion.
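The supernova method in item 2 rests on the standard-candle idea: Type Ia supernovae have a roughly known intrinsic brightness, so comparing that to the observed brightness yields a distance via the distance modulus, m − M = 5·log10(d/10 pc). The sketch below illustrates the arithmetic; the magnitudes used are illustrative assumptions, not survey measurements.

```python
# Sketch of the standard-candle logic in item 2: an object's known
# absolute magnitude M and measured apparent magnitude m give its
# distance through the distance modulus m - M = 5*log10(d_parsec / 10).
# M = -19.3 is a commonly quoted value for Type Ia supernovae; the
# apparent magnitude below is invented for illustration.

def distance_parsec(m_apparent: float, M_absolute: float = -19.3) -> float:
    return 10 ** ((m_apparent - M_absolute) / 5 + 1)

# A hypothetical supernova observed at apparent magnitude 22.5:
d_pc = distance_parsec(22.5)
d_gly = d_pc * 3.2616 / 1e9   # parsecs -> billions of light-years
# roughly 7.5 billion light-years, comfortably within the camera's reach
```

Comparing such distances with the supernovae’s redshifts, across many explosions at different epochs, traces how fast the universe has been expanding over cosmic history.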

The Dark Energy Camera achieved first light in September of last year. During the subsequent commissioning phase, the camera was put through its paces. Along the way, it captured dozens of sharp, clear images. You can see some of them at the survey’s official photo blog, Dark Energy Detectives, and even more in an interactive mosaic of 62 DECam images.

With the survey now officially underway, scientists will collect data between September and February each year through 2018. This “season” was chosen because the portion of the sky scientists wish to observe will be overhead during those months, and the weather in Chile will be the most cooperative.

Andre Salles

### I’m Going to Tell You…

Friday, October 19th, 2012

–by T.I. Meyer, Head of Strategic Planning and Communication

Public science lectures, events, cafés: They are everywhere!  This past weekend, the ATLAS group at TRIUMF went to Science World in downtown Vancouver and gave a science talk about the Higgs, hosted a virtual tour of the ATLAS control room, and answered thousands of questions. Nearly 10,000 people passed through the doors that day.  This past Tuesday night, Perimeter Institute director Neil Turok presented his third CBC Massey lecture, this one in Vancouver at UBC’s Chan Centre.  The sell-out crowd was nearly 1,000 people.  Last night near the waterfront station, TRIUMF science director Reiner Kruecken gave a talk about nuclear astrophysics at the public session of the APS Northwest Sectional meeting.  And on November 1, the director of the NIH Human Genome Research Institute Eric Green will be giving a public talk about genomics and its future influence on clinical practice at GenomeBC.

Why is all of this happening?  Can’t people just get enough of science and technology from YouTube, university classes, and specialized television programs?  Heck, why did *I* go to some of these events?  Is it the same reason I choose to attend certain music concerts or watch a play in person in the theatre?

Humans are social creatures.  Maybe I am showing my age, but I still prefer being in a group and learning about something rather than sitting at home in a darkened room and just clicking and scrolling on my computer.  I actually have different brain chemistry when in a group and listening to someone.  At the Massey lecture, there was even something fun about my seatmate whispering questions to me during the talk (for instance, If the universe is expanding at an accelerating rate, does that mean the Solar System is actually getting bigger right now?).  It would have been weird to have Neil Turok come over to my house and record his lecture in my living room with just me as the audience, right?

There’s something curious and fascinating about seeing leading scientists and thinkers in person. I saw the Premier of British Columbia in a coffee shop this morning; she was just getting a cup of coffee like I was, and yet it was still “cool.”  Listening to Neil Turok was special because he peppered his discussion of “What banged (in the Big Bang)?” with personal anecdotes, with humor, and with observations about history.  I can get that same feel when I listen to the broadcasts on CBC Radio, of course, but in the hall I got to hear it “first” and “in the raw.”

There’s something neat about hearing something live, in the moment.  And I got to hear what was happening “right now” rather than waiting for the lecture to be broadcast or waiting for someone to write a Wikipedia article about it.

What do you think?  Why do people still throng to gather ‘round and listen to and talk about science and particle physics?  What can we do to provide even more of what is needed and wanted?

### Turtles all the way down?

Tuesday, March 20th, 2012

I recently got an interesting e-mail about the Big Bang. The writer said she didn’t see how you could make something out of nothing. She collects creation myths and thought that, no matter how you sliced it, it’s always “turtles all the way down.” This is a reference to creation myths where the world is poised on top of a turtle, which is itself poised on top of something else, but raises the issue: Is there any firm ground?

This is worth addressing because it illustrates the gulf between how people picture the Big Bang and how physicists actually deal with it. To be clear – we have a wealth of observations that support the Big Bang, but you have to be careful. We can only look back into the universe to a moment 300,000 years after the ‘start,’ as best we can discern it. At this early moment, the universe went from being opaque to transparent. Before this moment, ionized gas kept light from traveling any distance, but once protons and electrons cooled enough to form neutral hydrogen, light (photons) could travel long distances. The remnant photons from this time are seen as the so-called cosmic microwave background radiation. This radiation was first observed by Arno Penzias and Robert Wilson in the 1960s and continues to be a rich source of information about the early universe.

What do we see? We see galaxies moving away from each other. The further away we look, the faster they appear to recede. Einstein’s theory of gravity has a number of solutions for possible universe structures. One of these solutions describes the expanding universe very well and, if taken at face value, would extrapolate back in time to an initial state in which all matter in the universe existed as a single point of infinite density. But does a point of infinite density make sense? The author of the e-mail question thinks not, that it’s like pulling a rabbit out of a hat. You can’t make something from nothing, and this apparent absurdity invalidates the Big Bang model.

The main issue is that, although our observations are very consistent with this model of a Big Bang universe, we cannot actually see the initial moment. It’s hidden from view. We strongly suspect that the laws of physics might change dramatically when distance scales and energy densities approach the conditions very close to the initial moment. We know that when the classical laws of physics are combined with quantum mechanics, new phenomena emerge. This was the case with our theory of electromagnetism: when we incorporated quantum mechanics into electromagnetism, the phenomenon of antimatter became apparent. We have yet to find a satisfactory theory of gravity that incorporates quantum mechanics. The manifestations of quantum mechanics in gravity will only emerge at extremely high energy densities, such as those in the very early universe, near the time of this infinite density, and will likely modify our current models. For all we know, space-time might resemble some Escher print, eluding the concept of an infinite-density starting point through a twisted configuration that folds in on itself.

Rather than dealing with a concept that seems almost theological in nature, physicists try to reconcile models against data. We fully realize that our models will extrapolate to conditions that raise difficult issues, like infinite densities. More often than not, we avoid talking about these difficult conditions, because we largely cannot test or measure them. If it is inaccessible, it is inaccessible. The work can perhaps be likened to that of explorers. Our job is to map new territories, and, if anything, we can only report on territories we’ve explored. What lies beyond the horizon is a matter of speculation.
Responses? Questions? Contact me on Twitter @JohnHuth1

### Can We “Point” the LHC, Too?

Wednesday, January 28th, 2009

The Bad Astronomy blog is publicizing a chance to choose what the Hubble Space Telescope looks at.  The basic idea is that there’s going to be an internet vote among six objects that Hubble has never looked at; Hubble will be pointed at the winner and will send out pictures of it by April.  It seems like a fun way to get the public to learn more about, and feel more involved in, the Hubble project.

I’ll let you read more details at one of the links above, but I have another question to consider: can we do something similar with the LHC? That is, could we put up some kind of page where people could vote on what kind of physics we would study over the course of some particular week?  Maybe a choice between searching for Supersymmetry, or a high-mass Higgs boson, or a low-mass Higgs boson?  At first glance, the answer would seem to be “no.”  We obviously have no control over what kind of physics happens when the protons of the LHC collide — we just look at what comes out.  And it seems unlikely that any physicist would volunteer to put their work hours into a particular analysis because of a public vote, and anyway we’ll have people working on all the high-profile analyses and many low-profile ones besides.

But there actually is a sense in which ATLAS or CMS could do something similar.  Remember that our detectors can only record a few hundred events every second, out of the almost forty million times the beams cross during that second.  There are lots of collisions we have to throw out because we can’t store enough data, and it’s the trigger system that decides which few we keep.  In practice, there are a number of different signals that we program the trigger system to be interested in: we take a certain number of random low-energy events to help us calibrate what we see in our other events, and we have separate “trigger paths” for hadronic jets, for muons, for electrons, and so on.  We try to record all the events that might represent interesting new physics, but as the collision rate at the LHC increases, we’ll have to throw away even some of those.  When the committee meets to decide how to balance the different possible triggers, what is at issue is precisely which kinds of events the detector will “point at,” i.e. recognize as important and save.  People with different physics interests might make different choices about how to achieve that balance, and every study would always love more trigger bandwidth if it were available; that’s why we have committees to argue about it in the first place.
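The bandwidth-allocation idea can be caricatured with a toy trigger menu: each path gets a prescale factor N, meaning we keep only one out of every N events that fire it. All the names and numbers below are made up for illustration; they are not real ATLAS or CMS trigger settings:

```python
# Toy trigger menu: (path name -> prescale N). Keep 1 out of every N events
# that fire this path. All values are illustrative, not real settings.
TRIGGER_MENU = {
    "muon": 1,         # keep every event with a muon candidate
    "electron": 1,
    "jets": 100,       # hadronic jets are copious, so keep only 1 in 100
    "random": 10000,   # occasional random events for calibration
}

counters = {path: 0 for path in TRIGGER_MENU}

def trigger_decision(fired_paths):
    """Return True if an event should be recorded, given which paths fired."""
    keep = False
    for path in fired_paths:
        counters[path] += 1
        if counters[path] % TRIGGER_MENU[path] == 0:
            keep = True  # at least one path accepted the event
    return keep
```

In this caricature, the committee’s job amounts to tuning the prescales so the total output rate fits into the few hundred events per second the detector can store, and a public vote would simply nudge one prescale down at the expense of the others.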

So why not reserve 5% of the ATLAS or CMS trigger bandwidth for a public vote on what physics to look for, to give a little extra oomph to one study or another?  Actually, I can think of several good and practical reasons why not — but it’s fun to think about!

### Big Explosions

Sunday, May 25th, 2008

Now that I’ve gotten your attention with the entry title, I of course have to admit that there are no big explosions at CERN. That’s a good thing, too, because I’m talking about really big explosions.

CERN, like any big laboratory or university, has a fair number of lectures and colloquia on various topics in physics. One of the great things about being a physicist, and a physics student in particular, is that going to these lectures counts as work, at least if it doesn’t get in the way of things that have to be done. Since my work this week was mostly meetings about getting a new project and passing the old one off to another person, along with writing an ATLAS ~~Infernal~~ Internal Note on the old project, I had the opportunity and need for any educational breaks I could find.

As it happened, there were three very interesting talks by Princeton Professor Adam Burrows. Their nominal subject was “Black Holes and Neutron Stars,” but what he really wanted to show was stars exploding. The first talk, which was definitely my favorite, had a lot of movies and simulations of exactly that. A particularly pretty example is this movie of a Type Ia Supernova:

The neat thing about that video is that, not only does it look good, it’s also a real simulation. One of the main things I learned from the talks is that a substantial obstacle to understanding the details of supernovae is a lack of computing power: there are a lot of ideas about how they work exactly, but none of them come out quite right in simplified simulations. For example, Type II Supernovae probably need to lose their spherical symmetry so that the explosion can spread along one axis while new material collapses into the core from other directions, but it’s not clear exactly how this happens, and it can’t be simulated properly in only two dimensions.

Jokes about avoiding real work aside, it’s quite valuable for physicists to keep up with work in fields that are somewhat removed from our own; you never know what interesting connections might come up. The details of supernovae involve a lot of particle physics; for example, a tremendous number of neutrinos are produced. In fact, neutrino detectors were the first instruments to “see” Supernova 1987a, because the weakly interacting neutrinos escaped from the star a few hours ahead of the light from the explosion.

[Image credit: NASA, ESA, J. Hester and A. Loll (Arizona State University)]

### The LHC Astro-Lab

Monday, May 5th, 2008

A few weeks ago the physics community was shaken by an announcement from the DAMA project (no, not the bad guys on ‘Lost’; those are the DHARMA Initiative), an underground experiment in the Gran Sasso tunnel, which claimed to have found experimental evidence for dark matter. The claim is based on the fact that the motion of the earth around the sun should produce an annual modulation in the dark matter count rate, because the earth’s orbital velocity adds to (or subtracts from) the solar system’s velocity through the dark matter halo. DAMA has indeed found an eight-sigma modulation signal in their candidate count rate. The question remains whether any background source could cause this signal, and it will take scientists some time to exclude all the reasons why this measurement might not be significant. Nevertheless, the possibility of experimental evidence for dark matter is exciting. But what does this have to do with the LHC, and in particular ALICE?
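The expected annual modulation is usually written as a small cosine riding on a constant rate, peaking around early June, when the earth’s orbital velocity adds most to the sun’s motion through the halo. A minimal sketch of that expectation (the rate and amplitude values are generic placeholders, not DAMA’s fitted numbers):

```python
import math

def expected_rate(day_of_year, r0=1.0, rm=0.02, peak_day=152):
    """Dark matter count rate vs. time: a constant term r0 plus a small
    annual modulation of amplitude rm, peaking near June 2 (day ~152)."""
    return r0 + rm * math.cos(2 * math.pi * (day_of_year - peak_day) / 365.25)

# Maximum near June 2, minimum about half a year later (early December):
print(expected_rate(152))   # ~1.02
print(expected_rate(335))   # ~0.98
```

An experiment looks for exactly this few-percent seasonal wiggle in its candidate count rate, which is why ruling out mundane seasonal backgrounds (temperature, muon flux, radon) matters so much.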

Well, throughout the past few years relativistic heavy ion and high energy physics have stressed their relevance to understanding QCD and electroweak symmetry breaking, but the original quest of the heavy ion programs at RHIC and the LHC was to find a state of matter that would teach us a lot about the evolution of the universe shortly after the Big Bang, at a time when matter as we know it (luminous and dark) should have formed. This original link was disfavored for some time because scientists felt that the ‘Little Bang’ cannot easily be applied to the ‘Big Bang’: the system is too small, the evolution too fast. But several speculative explanations of experimental measurements at RHIC gave new life to the ‘astro-connection’ of relativistic heavy ion physics (see, for example, Peter Steinberg’s blog entries on anti-de Sitter space and Hawking-Unruh radiation). In his very instructive article ‘The first second of the universe’, D. J. Schwarz of CERN showed the anticipated evolution of matter formation and pointed out the relevance of the so-called QCD phase transition, from quarks and gluons to hadrons, for the evolution of the universe.

It is interesting to note that the LHC offers a two-pronged approach to accelerator-based astrophysics. Not only can the high energy proton-proton collisions likely probe the Higgs field, extra dimensions, supersymmetry and dark matter candidates, but the relativistic heavy ion collisions can probe physics in the strong-force sector that has traditionally been assumed to occur at higher energies, such as CP violation, which is necessary for baryogenesis in the universe, anti-baryonic dark matter candidates and the infamous 5-d quantum black holes.

So this is an exciting time, and the diversity of the LHC programme, which brings high energy and heavy ion physicists together by offering proton-proton and Pb-Pb collisions, will lead not only to breakthroughs in the understanding of QCD and potentially new physics beyond the Standard Model; it will also make the LHC the premier astro-lab in the world. I am glad that all three big experiments (ATLAS, CMS and ALICE) now feature a pp and a PbPb program. Although ALICE is the most versatile heavy ion detector, both ATLAS and CMS have strong heavy ion programs, and only together, with the necessary verification of each other’s results, will we be able to crack some of the cosmic mysteries that I am most interested in. I am looking forward to that, and to your attempts to take aim at some of my claims in this and future blogs.

Cheers