Archive for December, 2011

A new year, a new outlook

Saturday, December 31st, 2011

2011 has been a year of change and excitement. We’ve had plenty of good news and bad news to deal with. The new year doesn’t mean just another calendar on the wall; it means a new way of looking at physics. There’s no better way to bring in the new year than watching the fireworks in central London, surrounded by friends. There’s usually a fantastic display, because London is not only one of the most important cities in the world, but it’s also the home of universal time. With the Greenwich Meridian running through the capital, we’re reminded of the role that timekeeping has played in the development of our history and our science. But this year was even more special, since London is inviting the world to its streets for the Olympics. As I got caught up in the excitement of it all, my thoughts turned to what we’ve seen in the world of physics, and where we’re going next.

New year fireworks in London (New York Times)

2011 opened with ATLAS announcing a startling asymmetry in the jet momenta in heavy ion collisions. However, the joy was tainted by a leaked abstract from an internal document. That document never made it through internal review and should never have been made public. We were faced with several issues of confidentiality, ethics and bias, and with how having several thousand people, all armed with the internet and with friends on competing experiments, makes the work tough for all of us. In the end we followed the right course, subjected all the analyses to the rigors of internal and external review, and presented some wonderful papers.

There was more gossip over the CDF dijet anomaly presented at Blois. CDF saw a bump, and D0 didn’t. Before jumping to any conclusions it’s important to remember why we have two experiments at the Tevatron in the first place! These kinds of double checks are exactly what we need, and they represent the high standard of scientific research that we expect and demand. The big news for the Tevatron was, of course, the end of running. We’re all sad that the shutdown had to happen and grateful for such a long, productive run, but let’s look to the future at the intensity frontier.

Meanwhile both ATLAS and CMS closed in on the Higgs boson, excluding the vast majority of the allowed regions. The combinations and results just got better and better, until eventually on December 13th we saw the results from 5 fb⁻¹ of data from each experiment. The world watched as the presentations were made, and quite a few people were left feeling a little deflated. But that’s not the message we should take away. If the Higgs boson is there (and it probably is) then we’ll see it by the end of the year. There’s no more saying “Probably within a year, if we’re lucky”, or “Let’s not get ahead of ourselves”. This time we can be confident that a year from now we’ll have uncovered every reasonable stone. The strategies will change as we narrow the search. We may have new energies to explore, and we’ll tweak our analyses to get more discriminating power from the data. Now is the time to get excited! The game has changed and the end is definitely in sight.

Raise a glass as we say farewell to a great year of physics, and welcome another

It’s been a good year for heavy flavor physics as well. LHCb has gone from strength to strength, probing deeper and deeper into the data. We’ve seen the first new particle at the LHC, a state of bottomonium. Precision measurements of heavy flavor physics give some of the most sensitive tests of new physics models, and it’s easy to forget the vital role they play in discovery.

ALICE has been busy exploring different questions about our origins, and they’ve studied the quark gluon plasma in great detail. The findings have told us that the plasma acts like a fluid, while showing unexpected suppression of excited bottomonium states. With even more data from 2011 being crunched we can expect even more from ALICE in 2012.

The result that came completely out of left field was the faster-than-light neutrinos from OPERA. After seeing neutrinos break the cosmic speed limit, OPERA repeated the measurements with finer proton bursts and got the same result. Something interesting is definitely happening with that result. Either it’s a subtle mistake that has eluded all the OPERA physicists and their colleagues across the world, or our worldview is about to be overturned. I don’t think we’ll get the answers in the immediate future, so let’s keep an eye out for results from MINOS and OPERA.

Finally it’s been an incredible year for public involvement. It’s been a pleasure to have such a responsive audience and to see how many people all across the world have been watching CERN and the LHC. A couple of years ago I would not have thought that the LHC and Higgs boson would get so much attention, and it’s been of huge benefit to everyone. The discoveries we share with the world are not only captivating us all, they’re also inspiring the next generation of physicists. We need a constant supply of fresh ideas and new students to keep the cutting edge research going. If we can reach out to teenagers in schools and inspire some of them to choose careers in science then we’ll continue to answer the most fascinating, far reaching and beautiful questions about our origins.

So when you raise a glass to the new year, don’t forget that we’ve had an incredible 2011 for physics, and that 2012 is going to deliver even more. We don’t even know what’s out there, but it’s going to be amazing. To physics!


Greetings from the South Pole

Saturday, December 31st, 2011

As a new Quantum Diarist, I wanted to introduce myself and my research work. And say hello from the bottom of the world. I am at the South Pole Station working on a new neutrino experiment called ARA, the Askaryan Radio Array.

This project is a second-generation effort to look at the highest-energy (GZK) neutrinos using radio detection of the coherent (Askaryan) emission from showers in dense materials. We’re building at the South Pole to take advantage of the largest block of dense, radio-transparent media on Earth, the 3-km-thick ice sheet that covers the continent. This is my second summer season at Pole working on ARA. Last year we installed an engineering detector that has operated quite successfully throughout the year, and now we’re installing the first production detector station. Ultimately we’re aiming for a detector array covering about 100 square kilometers, with antennas capable of detecting signals down to the bedrock below. A truly large detector. It’s much larger than the IceCube detector completed last summer season at the Pole, but optimized for higher-energy events.

ARA builds on the experience of the ANITA balloon-borne radio neutrino detector and the ice-drilling and radio spinoff efforts (RICE, AURA, & SATRA) of the IceCube experiment. It’s a small collaboration, but one with most of the world’s experience in radio neutrino detection and a significant share of the experience in hot-water drilling down into the ice. The main goal is the so-called GZK neutrinos, neutrinos produced by the interaction of the highest-energy (charged-particle) cosmic rays with the 3 K microwave background radiation. More on all these physics topics in future postings… right now I mostly want to say hello.

It’s New Year’s Eve at the South Pole. There are about 240 people here at the US Amundsen-Scott South Pole Station, living and working in the new elevated station or in the many smaller buildings around the area. Some folks here are fairly traditional astronomers, working on the 10 m South Pole Telescope; others have magnetometers, aurora cameras, seismometers, air sampling gear, and other scientific pursuits. The majority of people at Pole are here in support roles, driving heavy equipment, cooking dinner, washing the dishes, managing the cargo flow, and staffing the communications facility. Tonight there will be four bands performing, followed by a DJ set. We get a satellite network connection a few hours per day, so I’ll post this in the morning, in the new year for us.

Physics has gotten me to a lot of interesting places, and this is certainly way up there on that list, though it’s not the first project that has gotten me to Antarctica. I worked on the CREAM and ANITA balloon experiments, both of which have flown from McMurdo Base on Ross Island, just off the coast of Antarctica. (It’s probably close enough to count as Antarctica.) My background is in cosmic rays: I worked on spacecraft isotopic measurements of cosmic rays while I was a graduate student at the University of Chicago. As a postdoc at Penn State, I worked on the HEAT balloon experiment measuring cosmic-ray antimatter and on the Pierre Auger Observatory in the Argentine grasslands. Later, as a professor at the University of Minnesota, I added CREAM (a cosmic-ray elemental abundance balloon experiment) and ANITA (the balloon neutrino experiment parent of the current ARA work). Now I am at the University of Wisconsin-Madison, working on ARA, IceCube, and the HAWC TeV gamma-ray observatory currently under construction in Mexico. Detectors and their associated hardware are now as important to me intellectually as the physics.

So, greetings from the South Pole, a good place to do physics. And I’m looking forward to sharing my corner of the world of astroparticle physics with you, my dear readers.


For every problem, there is a simple solution: neat, plausible and wrong.

Philosophers such as Rudolf Carnap (1891 – 1970) and the Vienna Circle considered logical positivism the received view of the scientific method. In the early-to-mid twentieth century it dominated discussions in the philosophy of science, but it is now widely viewed as seriously flawed—or, as A. J. Ayer (1910 – 1989), a former advocate, put it: “I suppose the most important [defect]…was that nearly all of it was false.” Pity. But it was good while it lasted. So, what is logical positivism? It is sometimes defined by the statement: Only verifiable statements have meaning—note verifiable, not falsifiable. The doctrine included opposition to all metaphysics, especially ontology and synthetic a priori propositions. Metaphysics is rejected not as wrong but as having no meaning.

Logical positivism is a very nice idea: we work only with observations and what can be deduced directly from them. No need for theories, models or metaphysics. I can hear the cheering now, especially from my experimental colleagues. It arose partially in response to the revolutions in physics in the early twentieth century. Quantum mechanics and relativity completely upended the metaphysics and philosophy built around classical mechanics, so the logical positivists wanted to eliminate the metaphysics to prevent this from happening again; a very laudable goal.

So what went wrong? As Ayer noted, almost everything. First, metaphysics tends to be like accents—something only the other person has. The very claim that metaphysics is not needed is itself a metaphysical claim. Second, observations are not simple. As demonstrated by optical illusions, what we see is not necessarily what is there. The perceptual apparatus does a lot of processing before the results are presented to the conscious mind. The model of the universe presented to the conscious mind probably has more uncontrolled assumptions than any accepted scientific model. But that is what the logical positivists took as the gospel truth. In addition there is Thomas Kuhn’s (1922 – 1996) claim that observations are model dependent. While that claim is disputable, it is clear that the interpretation of observations depends on the model, the paradigm, or, if you prefer, the metaphysics: something beyond the observations themselves.

Third, as Sir Karl Popper (1902 – 1994) argued, scientific models in general cannot be verified, only falsified (and one can argue that even that is impossible; see the first post in this series). Thus, Only verifiable statements have meaning would exclude all of science from having meaning. Indeed, it would exclude even the statement itself, since the statement Only verifiable statements have meaning cannot be verified.

Logical positivism: neat, plausible and wrong. Well, can anything be salvaged? Perhaps a little. Consider the statement: In science, only models that can be empirically tested are worth discussing. First, so as not to be overly broad, I restrict the statement to science; the criteria in mathematics are rather different, and I do not wish to make a general statement about knowledge, at least not here. Second, I have replaced statement with model since, by the Duhem-Quine thesis, individual statements cannot be tested: one can make almost any statement true by varying the supporting assumptions. In the end it is global models that are tested. Science is observationally based, hence the adjective empirical. I use tested to avoid complaints about the validity of verification or falsification; tested is neutral in that regard. Finally, meaningful has been replaced by worth discussing. To see why, consider the composition of the sun. In the late nineteenth century, it was regarded as something that would never be known. At that point the statement “The sun is composed mainly of hydrogen” would have been considered meaningless by the logical positivists, and certainly, at that time, discussion of the issue would have been futile. But with the discovery of spectroscopic lines, models for the composition of the sun became very testable, and the composition of the sun is now considered well understood. It went from not worth discussing to well understood, but the composition of the sun itself did not change. I would consider the statement “The sun is composed mainly of hydrogen” to be meaningful even before it could be tested; meaningful, but not worth discussing.

My restatement above does, however, eliminate a lot of nonsense from discussion: the omphalos hypothesis, the flying spaghetti monster, and a lot of metaphysics. But its implications are more wide-ranging. During my chequered career as a scientist, I have seen many pointless discussions of things that could not be tested: the d-state of the deuteron, off-shell properties, nuclear spectroscopic factors, and various other technical quantities that appear in the equations used by physicists. There was much heat but little light. It is important to keep track of which aspects of the models we produce are constrained by observation and which are not. Follow the logical positivists, not the yellow brick road, and keep careful track of what can actually be determined by measurements. What is behind the curtain is only interesting if the curtain can be pulled aside.

To conclude: Don’t waste your time discussing what can’t be empirically tested. That is all that’s left of logical positivism once the chaff has been blown away. And good advice it is—except for mathematicians. Either that or I have been lured to the rocks by the siren call of logical positivism and have another statement that is neat, plausible and wrong!

Additional posts in this series will appear most Friday afternoons at 3:30 pm Vancouver time. To receive a reminder follow me on Twitter: @musquod


This article first appeared in ISGTW Dec. 21, 2011.

A night-time view of the Tevatron. Photo by Reidar Hahn.

This is the first part of a two-part series on the contribution Tevatron-related computing has made to the world of computing. This part begins in 1981, when the Tevatron was under construction, and brings us up to recent times. The second part will focus on the most recent years, and look ahead to future analysis.

Few laypeople think of computing innovation in connection with the Tevatron particle accelerator, which shut down earlier this year. Mention of the Tevatron inspires images of majestic machinery, or thoughts of immense energies and groundbreaking physics research, not circuit boards, hardware, networks, and software.

Yet over the course of more than three decades of planning and operation, a tremendous amount of computing innovation was necessary to keep the data flowing and physics results coming. In fact, computing continues to do its work. Although the proton and antiproton beams no longer brighten the Tevatron’s tunnel, physicists expect to be using computing to continue analyzing a vast quantity of collected data for several years to come.

When all that data is analyzed, when all the physics results are published, the Tevatron will leave behind an enduring legacy. Not just a physics legacy, but also a computing legacy.

In the beginning: The fixed-target experiments

This image of an ACP system was taken in 1988. Photo by Reidar Hahn.

1981. The first Indiana Jones movie is released. Ronald Reagan is the U.S. President. Prince Charles makes Diana a Princess. And the first personal computers are introduced by IBM, setting the stage for a burst of computing innovation.

Meanwhile, at the Fermi National Accelerator Laboratory in Batavia, Illinois, the Tevatron has been under development for two years. And in 1982, the Advanced Computer Program formed to confront key particle physics computing problems. ACP tried something new in high performance computing: building custom systems using commercial components, which were rapidly dropping in price thanks to the introduction of personal computers. For a fraction of the cost, the resulting 100-node system doubled the processing power of Fermilab’s contemporary mainframe-style supercomputers.

“The use of farms of parallel computers based upon commercially available processors is largely an invention of the ACP,” said Mark Fischler, a Fermilab researcher who was part of the ACP. “This is an innovation which laid the philosophical foundation for the rise of high throughput computing, which is an industry standard in our field.”

The Tevatron fixed-target program, in which protons were accelerated to record-setting speeds before striking a stationary target, launched in 1983 with five separate experiments. When ACP’s system went online in 1986, the experiments were able to rapidly work through an accumulated three years of data in a fraction of that time.

Entering the collider era: Protons and antiprotons and run one

1985. NSFNET (National Science Foundation Network), one of the precursors to the modern Internet, is launched. And the Tevatron’s CDF detector sees its first proton-antiproton collisions, although the Tevatron’s official collider run one won’t begin until 1992.

The experiment’s central computing architecture filtered incoming data by running Fortran-77 algorithms on ACP’s 32-bit processors. But for run one, they needed more powerful computing systems.

By that time, commercial workstation prices had dropped so low that networking them together was simply more cost-effective than a new ACP system. ACP had one more major contribution to make, however: the Cooperative Processes Software.

CPS divided a computational task into a set of processes and distributed them across a processor farm – a collection of networked workstations. Although the term “high throughput computing” was not coined until 1996, CPS fits the HTC mold. As with modern HTC, farms using CPS are not supercomputer replacements. They are designed to be cost-effective platforms for solving specific compute-intensive problems in which each byte of data read requires 500-2000 machine instructions.
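
CPS itself is long retired, but the divide-and-distribute pattern it pioneered is now an everyday idiom. Purely as an illustration of that pattern (the function and variable names below are hypothetical, not CPS’s actual interface), here is a minimal sketch in Python: a coordinator splits a list of events into chunks and farms them out to a pool of independent worker processes.

    # Minimal sketch of farm-style high throughput computing; a modern
    # analogue of the CPS pattern, not CPS itself. All names hypothetical.
    from multiprocessing import Pool

    def reconstruct_event(raw_event):
        # Stand-in for a compute-heavy, independent per-event task.
        return sum(hit * hit for hit in raw_event)

    if __name__ == "__main__":
        events = [[1.0, 2.0, 3.0]] * 10_000   # stand-in for detector data
        with Pool(processes=8) as farm:       # eight workers: a tiny "farm"
            results = farm.map(reconstruct_event, events, chunksize=100)
        print(len(results), "events processed")

Because each event is processed independently, such a farm scales by adding workers rather than by buying a faster single machine, which is the essence of high throughput computing.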

CPS went into production-level use at Fermilab in 1989; by 1992 it was being used by nine Fermilab experiments as well as a number of other groups worldwide.

1992 was also the year that the Tevatron’s second detector experiment, DZero, saw its first collisions. DZero launched with 50 traditional compute nodes running in parallel, connected to the detector electronics; the nodes executed filtering software written in Fortran, E-Pascal, and C.

Gearing up for run two

"The Great Wall" of 8mm tape drives at the Tagged Photon Laboratory, circa 1990 - from the days before tape robots. Photo by Reidar Hahn.

1990. CERN’s Tim Berners-Lee launches the first publicly accessible World Wide Web server using his URL and HTML standards. One year later, Linus Torvalds releases Linux to several Usenet newsgroups. And both DZero and CDF begin planning for the Tevatron’s collider run two.

Between the end of collider run one in 1996 and the beginning of run two in 2001, the accelerator and detectors were scheduled for substantial upgrades. Physicists anticipated more particle collisions at higher energies, and multiple interactions that were difficult to analyze and untangle. That translated into managing and storing 20 times the data from run one, and a growing need for computing resources for data analysis.

Enter the Run Two Computing Project (R2CP), in which representatives from both experiments collaborated with Fermilab’s Computing Division to find common solutions in areas ranging from visualization and physics analysis software to data access and storage management.

R2CP officially launched in 1996. It was the early days of the dot com era. eBay had existed for a year, and Google was still under development. IBM’s Deep Blue defeated chess master Garry Kasparov. And Linux was well-established as a reliable open-source operating system. The stage was set for experiments to get wired and start transferring their irreplaceable data to storage via Ethernet.

The high-tech tape robot used today. Photo by Reidar Hahn.

“It was a big leap of faith that it could be done over the network rather than putting tapes in a car and driving them from one location to another on the site,” said Stephen Wolbers, head of the scientific computing facilities in Fermilab’s computing sector. He added ruefully, “It seems obvious now.”

The R2CP’s philosophy was to use commercial technologies wherever possible. In the realm of data storage and management, however, none of the existing commercial software met their needs. To fill the gap, teams within the R2CP created Enstore and the Sequential Access Model (SAM, which later stood for Sequential Access through Meta-data). Enstore interfaces with the data tapes stored in automated tape robots, while SAM provides distributed data access and flexible dataset history and management.

By the time the Tevatron’s run two began in 2001, DZero was using both Enstore and SAM, and by 2003, CDF was also up and running on both systems.

Linux comes into play

The R2CP’s PC Farm Project targeted the issue of computing power for data analysis. Between 1997 and 1998, the project team successfully ported CPS and CDF’s analysis software to Linux. To take the next step and deploy the system more widely for CDF, however, they needed their own version of Red Hat Enterprise Linux. Fermi Linux was born, offering improved security and a customized installer; CDF migrated to the PC Farm model in 1998.

The early computer farms at Fermilab, when they ran a version of Red Hat Linux (circa 1999). Photo by Reidar Hahn.

Fermi Linux enjoyed limited adoption outside of Fermilab, until 2003, when Red Hat Enterprise Linux ceased to be free. The Fermi Linux team rebuilt Red Hat Enterprise Linux into the prototype of Scientific Linux, and formed partnerships with colleagues at CERN in Geneva, Switzerland, as well as a number of other institutions; Scientific Linux was designed for site customizations, so that in supporting it they also supported Scientific Linux Fermi and Scientific Linux CERN.

Today, Scientific Linux is ranked 16th among open source operating systems; the latest version was downloaded over 3.5 million times in the first month following its release. It is used at government laboratories, universities, and even corporations all over the world.

“When we started Scientific Linux, we didn’t anticipate such widespread success,” said Connie Sieh, a Fermilab researcher and one of the leads on the Scientific Linux project. “We’re proud, though, that our work allows researchers across so many fields of study to keep on doing their science.”

Grid computing takes over

As both CDF and DZero datasets grew, so did the need for computing power. Dedicated computing farms reconstructed data, and users analyzed it using separate computing systems.

“As we moved into run two, people realized that we just couldn’t scale the system up to larger sizes,” Wolbers said. “We realized that there was really an opportunity here to use the same computer farms that we were using for reconstructing data, for user analysis.”

A wide-angle view of the modern Grid Computing Center at Fermilab. Today, the GCC provides computing to the Tevatron experiments as well as the Open Science Grid and the Worldwide Large Hadron Collider Computing Grid. Photo by Reidar Hahn.

Today, the concept of opportunistic computing is closely linked to grid computing. But in 1996 the term “grid computing” had yet to be coined. The Condor Project had been developing tools for opportunistic computing since 1988. In 1998, the first Globus Toolkit was released. Experimental grid infrastructures were popping up everywhere, and in 2003, Fermilab researchers, led by DZero, partnered with the US Particle Physics Data Grid, the UK’s GridPP, CDF, the Condor team, the Globus team, and others to create the Job and Information Management system, JIM. Combining JIM with SAM resulted in a grid-enabled version of SAM: SAMGrid.

“A pioneering idea of SAMGrid was to use the Condor Match-Making service as a decision making broker for routing of jobs, a concept that was later adopted by other grids,” said Fermilab-based DZero scientist Adam Lyon. “This is an example of the DZero experiment contributing to the development of the core Grid technologies.”

By April 2003, the SAMGrid prototype was running on six clusters across two continents, setting the stage for the transition to the Open Science Grid in 2006.

From the Tevatron to the LHC – and beyond

Throughout run two, researchers continued to improve the computing infrastructure for both experiments. A number of computing innovations emerged before the run ended in September 2011. Among these was CDF’s GlideCAF, a system that used the Condor glide-in system and Generic Connection Brokering to provide an avenue through which CDF could submit jobs to the Open Science Grid. GlideCAF served as the starting point for the subsequent development of a more generic glidein Work Management System. Today glideinWMS is used by a wide variety of research projects across diverse research disciplines.

Another notable contribution was the Frontier system, which was originally designed by CDF to distribute data from central databases to numerous clients around the world. Frontier is optimized for applications where there are large numbers of widely distributed clients that read the same data at about the same time. Today, Frontier is used by CMS and ATLAS at the LHC.

“By the time the Tevatron shut down, DZero was processing collision events in near real-time and CDF was not far behind,” said Patricia McBride, the head of scientific programs in Fermilab’s computing sector. “We’ve come a long way; a few decades ago the fixed-target experiments would wait months before they could conduct the most basic data analysis.”

One of the key outcomes of computing at the Tevatron was the expertise developed at Fermilab over the years. Today, the Fermilab computing sector has become a worldwide leader in scientific computing for particle physics, astrophysics, and other related fields. Some of the field’s top experts worked on computing for the Tevatron. Some of those experts have moved on to work elsewhere, while others remain at Fermilab where work continues on Tevatron data analysis, a variety of Fermilab experiments, and of course the LHC.

The accomplishments of the many contributors to Tevatron-related computing are noteworthy. But there is a larger picture here.

“Whether in the form of concepts, or software, over the years the Tevatron has exerted an undeniable influence on the field of scientific computing,” said Ruth Pordes, Fermilab’s head of grids and outreach. “We’re very proud of the computing legacy we’ve left behind for the broader world of science.”

— Miriam Boon


Christmas time brings not only presents and pretty cookies but an outpouring of media lists proffering the best science stories of the year and predicting those that will top the list in 2012.

While the lists varied wildly, everyone seemed excited by a few of the same things: upsetting Einstein’s theory of special relativity, a hint of the ‘god particle’ and finding planets like our own.

Several of the stories that made nearly every media outlet’s list, though in various rankings, have a connection, directly or indirectly, to Fermilab. Here’s a sampling with the rankings from the publications.

Discover magazine had the largest list, picking the top 100 science stories.

1: A claim by researchers at the OPERA experiment at CERN that they had measured neutrinos traveling faster than the speed of light, something disallowed by Einstein’s Theory of Special Relativity. Now the scientific community is looking for another experiment to cross-check OPERA’s claim.

That brought renewed interest to a 2007 measurement by the MINOS experiment based at Fermilab that found neutrinos skirting the cosmic speed limit, but only slightly. The MINOS collaboration had always planned to study this further when it upgraded its detector in early 2012, but the OPERA result added new urgency.

Look in 2012 for MINOS to update the neutrino time-of-flight debate in three stages. First, MINOS is analyzing the data collected since its 2007 result to look for this phenomenon. Results should be ready in early 2012. This likely will improve the MINOS precision in this area by a factor of three from its 2007 result. Second, MINOS is in the process of upgrading its timing system within the next few months, using a system of atomic clocks to detect when the neutrinos arrive at the detector. The atomic clock system will progressively improve resolution, which is needed to make the MINOS analysis comparable to the OPERA result and improve precision from the 2007 MINOS result by as much as a factor of 10. That will tell us if OPERA was on the right track or not, but may not be the definitive answer. That answer will come with the upgrades to the MINOS experiment and a more powerful neutrino beam, producing a larger quantity of neutrino events to study. The upgraded MINOS experiment will be in many ways a more precise system than OPERA’s and could produce a result comparable with OPERA’s precision, likely by January 2014.

4: Kepler’s search for Earth-like planets that could sustain life produces a bounty of cosmic surprises, fueled, in part, by the computing skills of a Fermilab astrophysicist.
32: The on-again, off-again rumor of finding the Higgs boson particle. Physicists working with experiments at Fermilab’s Tevatron and CERN’s Large Hadron Collider expect to answer the question of whether a Standard Model version of the Higgs exists in 2012.
65: The shutdown of the Tevatron at Fermilab after 28 years and numerous scientific and technological achievements.
82: Fermilab physicist Jason Steffen’s frustration with slow airplane boarding drives him to figure out a formula to speed up the aisle crawl.

Nature’s year in review didn’t rank stories, but it started off by mentioning the Tevatron’s shutdown after 28 years and followed up shortly with the puzzling particle news of potentially FTL neutrinos and a Higgs sighting.

For science — as for politics and economics — 2011 was a year of upheaval, the effects of which will reverberate for decades. The United States lost three venerable symbols of its scientific might: the space-shuttle programme, the Tevatron particle collider and blockbuster profits from the world’s best-selling drug all came to an end.

Cosmos magazine rankings:

The MINOS far detector in the Soudan Mine in Minnesota. Credit: Fermilab

1: Kepler’s exoplanet findings
2: FTL neutrinos
3: Higgs

Scientific American‘s choices:

3: FTL neutrinos
5: Higgs

ABC News asked science radio and TV host physicist Michio Kaku for his top 10 picks. They include:

3: Hint of Higgs
5: Kepler’s exoplanet findings
10: Nobel Prize for the discovery that the expansion of the universe is accelerating, which laid the groundwork for today’s search for dark energy. Fermilab has several connections to this work. The latest tool in dark energy survey experiments, the Dark Energy Camera, was constructed at Fermilab in 2011. One of the three prize winners, Saul Perlmutter, is a member of the group that will use the camera, the Dark Energy Survey collaboration. Adam Riess, another of the winners, is a member of the SDSS-II experiment, a predecessor to DES that Fermilab was key in building and whose computing system Fermilab later operated.

Live Science

5: FTL neutrinos
4: Kepler’s exoplanet findings
2: Higgs

If the Higgs boson’s mass is high, it is expected to decay predominantly into two W bosons. Plushie images from the Particle Zoo.

To make the Ars Technica list, stories had to be awe-inspiring in 2011 AND have a chance of making the 2012 list as well.

1: FTL neutrinos
2: Kepler’s exoplanet findings
6: Higgs hunt

Science magazine chose the best scientific breakthrough of the year. Kepler’s exoplanet hunt made it into the runner-up list.

Tell us who you agree with or, better yet, give us your own top 10 science stories of the year.

— Tona Kunz


Lincoln, Nebraska, where I live, is on the western end of the Central time zone, and as a result, the sun goes down pretty late on the clock here. Even at this time of year, sundown isn’t until 5 PM, and it’s not really dark until at least six. We usually get home with the kids around five, and then we do dinner and playtime inside until bed. That means that the children, who are five and three, are rarely outside when it is really dark out, and they don’t get to see the stars, beyond the bright planets, very often.

The past weekend was an exception; it was Chanukah and there were many evening celebrations, as you are supposed to light the candles at sundown, so we were out past bedtime. On Friday night, as we went out to our car to drive home, my daughter, the older kid, looked up at the cloudless sky and marveled at the number of stars that she could see. I looked up too, and took the opportunity to point out one of the few constellations that I can identify, Orion. (Whenever I think about Orion, I think about John Guare’s “The House of Blue Leaves” — sorry.) “See, it looks like a person, with a top part and a bottom part, and those three stars are a belt,” I explained. My daughter looked at this a little more, and then asked, “Does the world want it to be like that?”

Interesting question — what she meant was whether the stars were intentionally arranged in the shape of a person, or whether it was just something that people made up when they looked at the stars. The answer is the latter, of course, although perhaps the ancients thought differently. Our conversation for the evening went on to other topics in astronomy (“Planets are round,” she said, “so it’s very hard to stand on them.”), but I kept thinking about what she had asked me.

As scientists, we collect data from the world around us, and try to make patterns out of it that we can understand. These patterns are theories, really, and as more data come in, we re-evaluate the theories to see if they are still consistent with the data. Do all the stars make shapes that look like familiar things? Are all of the measurements from the LHC consistent with a Higgs boson at 125 GeV? Are we humans just imposing an anthropic view onto the world? Measurements throughout particle physics, not just at the LHC, seem to support the idea of the Higgs mechanism. Is that consistency just a pattern that we have invented? Or does the world actually want it to be like that?

A year from now, we hope to have an answer to this question. As we head into 2012, a potentially decisive year for particle physics, I hope that all of our Quantum Diaries readers have the opportunity to ask, and answer, their own questions about what the world wants it to be like.


The Role of Faith in Science

Friday, December 23rd, 2011

Back in ancient history, when I was a graduate student, we did not have a computer on every desk. We prepared decks of computer cards, trotted them down to the computer center, and waited for the printed output. No getting upset if you did not have two-second response on your monitor. Anyway, I was calculating the same quantity in two different ways. One way involved a complicated calculation solving the Schrodinger equation for many different states and doing an obscure averaging. The other was a much simpler calculation using what are known as semi-classical approximations (to find out more, read my thesis). Relating the two involved a lot of math – calculus, differential equations, Laplace transforms, and various other techniques named after august dead people. As I sat there looking at the numbers coming out the same, I thought: Egad, you know, math really does work. But is it correct to say I had faith in the validity of mathematics?

Similarly, is my belief in the utility of Newton’s laws of motion a matter of faith? But, you say, in this case my belief or faith is misplaced because the laws are not absolutely correct. So? Where they work, they work extraordinarily well: planetary motion, cars, books, baseballs (except when I am trying to catch them), chalk thrown by an irate teacher (it missed), etc. I do not worry about books starting to move by themselves. Is my belief in the continuing validity of other well-established models and techniques of science a matter of faith or something else? What about evolution, global warming, renormalization techniques, quantum mechanics, and the second law of thermodynamics? Are they all, or any of them, matters of faith?

The answer to the above question depends on what one means by faith. But if the above are examples of faith, it is a rather trite use of the word. Indeed, it is a stretch to claim that any of these are examples of faith at all—certainly not in the same sense that faith is used in religious circles: Now faith is the assurance of things hoped for, the conviction of things not seen (Hebrews 11:1). To a large extent, faith in this latter sense is absent from science. The rules of engagement are well laid out and there is little need for a conviction of things not seen.

But faith does come into science in two ways: one fundamental to science and the other optional and probably spurious. The spurious one is when science is taken beyond its legitimate bounds and claims are made about the ultimate nature of reality: claims about materialism, naturalism and realism. Here, indeed, we have a conviction of things not seen. These are not really a part of science but rather metaphysics, and a matter of faith as discussed in a previous post, The Limits of Science. But what about things like atoms, electrons and quarks: things that are not seen in the normal sense of the word but are inferred? They are internal parts of the models science builds, and taking them to be definite parts of reality is an act of faith. Like the ether, they may go poof at some point in the future. Following Poincaré, I rather take their existence as a matter of convention and convenience. Are they really there? Who knows. If the math works out the same, does it really matter?

At one point faith does play a key role in science. It could be called the fundamental axiom of science or science’s Nicene Creed: Patterns observed in the past enable us to predict what will happen in the future. The seriousness of the problem was originally pointed out by David Hume (1711 – 1776) in his critique of scientific induction. His claim was that scientific induction does not exist. There is no logical reason for tomorrow to be the same as today. Hume had no answer to the problem other than to ignore it. Immanuel Kant (1724 – 1804), responding to Hume, tried to solve the problem and failed.

The fundamental axiom of science lies behind all of science and provides the foundation on which the scientific method rests. We build models based on past observations to predict future observations. This only works if the fundamental axiom is true. The scientific method is just the practical application of this idea. In terms of Aristotle’s four types of causes, the scientific method is the formal cause, the scientists are the effective cause, and the final cause (the why) is to build models that will correctly predict the future based on the past. The ability to achieve the final cause rests entirely on the fundamental axiom. The formal cause, the scientific method, follows from trying to put the final cause into action. Since the constructs of science are abstract, there is no material cause.

The fundamental axiom is a sophisticated version of the “Mount Saint Helens fallacy.” This was named after a person who refused to leave Mount Saint Helens because he did not believe it would blow up. It had not blown up in living memory, so why would it blow up now? I am not sure his body was ever found. Today does not have to be like yesterday. But in science we assume the rules will be the same, or at least change in predictable ways. Not as naively as the poor guy on Mount Saint Helens, but the assumption is the same: the sun will rise tomorrow (in Vancouver in the winter that is, indeed, a thing not seen). Do the laws of physics tomorrow have to be the same as today? Will mathematics be different tomorrow? Maybe, just maybe, when I look at the two answers on my computer screen[1] tomorrow they will be different. It is possible, but I have faith.


Additional posts in this series will appear most Friday afternoons at 3:30 pm Vancouver time. To receive a reminder follow me on Twitter: @musquod


[1] Note computer screen, not computer printout; things have changed.


Higgs for the Holidays

Friday, December 23rd, 2011

— By Theorist David Morrissey & Particle Physicist Anadi Canepa

Last week we hosted two particle physics workshops at TRIUMF – an ATLAS Canada collaboration meeting and a joint meeting for theorists and experimentalists to study new LHC results. Everything went smoothly, no participants were lost to the wilds of Vancouver, and we had some really great discussions and seminars. During one of these presentations, it occurred to me that these kinds of scientific meetings are not so different from a typical holiday gathering. In both situations, you frequently run into people you know but haven’t seen in a long time. You catch up, you gossip, and you eat too much food at the coffee breaks. There’s usually a large group dinner where you often meet new people and strike up conversations about future work. And every so often one of the participants has too much holiday cheer.

Despite these similarities, most scientific meetings don’t involve gifts.  But this time around we were really lucky, and our workshops had a gift exchange of sorts as well.  In this case, the gifts were the presentations by the ATLAS and CMS collaborations of exciting new results from their searches for the Higgs boson particle.  On top of the live streaming presentations from CERN in the early hours of the morning, we were treated to a longer seminar in the afternoon at TRIUMF by Rob McPherson.  His talk was standing-room only, and we had a great time bombarding him with questions about the ATLAS analysis.

The reason for all this excitement over a single particle is that the Higgs boson, first proposed nearly fifty years ago, is central to our current understanding of all known elementary particles, called the Standard Model.  (See here, here, and here for more details.)   In this theory, the Higgs is responsible for creating the masses of nearly all elementary particles and for making the weak force much weaker than electromagnetism.  Even though we have not yet seen the Higgs directly, we have indirect evidence for it from precision measurements of the weak and electromagnetic forces.  Discovering the Higgs boson would confirm the Standard Model, while not finding it would force us to drastically rethink our description of elementary particles and fundamental forces, which would perhaps be an even greater discovery.


Excitement about finding the Higgs has been building since the summer, when it became clear that the LHC would be able to collect enough data by the end of the year to possibly find it. In the past few weeks the level has gone through the roof, as rumours started to appear that the LHC experiments would soon release a significant result. What we learned this week is that these latest searches did not discover the Higgs boson, but they do suggest that it might be there with a mass close to 133 times that of a proton (125 GeV). Finding a Higgs is hard work, and its delicate characteristic signal must be extracted from a huge amount of background noise. What we have at the moment is an interesting bump, as you can see in the figure above taken from the ATLAS search, where we see more signal events than would typically be expected from the background alone for a candidate Higgs mass of about 125 GeV. We just don’t have enough data right now to confirm that this bump is from a Higgs boson and not just an especially unlucky spike in the background noise. Fortunately, the ATLAS and CMS collaborations will be taking much more data in the new year.

So, for this year all we get is a gift-wrapped box that we’re allowed to shake and prod.  But if we’re good, we’ll get to open the box and find what’s inside at some point in 2012.  Dear Santa…



Numerical Family Connections

Wednesday, December 21st, 2011

Just a brief random thought at the start of the first winter break in my life where I’m not visiting or living with my parents… Whenever I need the number π — that is, the ratio between a circle’s circumference and its diameter — in computer analysis code I’m writing, I always write it out like this:

3.141592654

That’s not exactly π, but it’s quite close. What I really should do is look up where it’s already defined in the math library I’m using, but this is more than accurate enough for any reasonable purpose. It’s too many digits, in fact, although I know a few more. So why do I always write out exactly that many places? Well, after thinking about it for a minute a little while ago, I remembered the answer: it’s the number of digits of π my dad taught me when I was a kid.
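
For what it’s worth, the look-it-up approach is a one-liner in most languages; a quick sketch in Python shows how close the hand-typed value is:

    import math

    print(math.pi)       # 3.141592653589793, full double precision
    print(3.141592654)   # the hand-typed version: pi rounded to 9 decimals
    print(abs(math.pi - 3.141592654))   # about 4.1e-10, utterly negligible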


Editor’s note: I like this so much that I think it is worth reprinting. This is from the now-defunct FermiNews publication in 1998. If you celebrate Christmas, this is the perfect proof to keep Christmas magical for children at that age where they are teetering between believing and not. If you celebrate the season in another way, then this is just a fun physics lesson. Whatever your traditions are, have a great weekend with friends and family.

Santa at Nearly the Speed of Light
by Arnold Pompos, Purdue University, and Sharon Butler, Office of Public Affairs. Illustrations by Tracy Jurinek

About this time of year, inquisitive children of a certain age begin to question whether Santa is real. After all, Santa has a major delivery problem. There are some 2 billion children in the world expecting Christmas presents. Assuming an average of 2.5 children per household, Santa has to visit about 800 million homes scattered about the globe.

The distance Santa has to travel can be estimated from the following. First, while the surface area of Earth is about 10¹⁴ square meters, only about 30 percent of that is land mass, or about 0.3 × 10¹⁴ square meters. Second, we’ll assume, for simplicity’s sake, that the 800 million homes are equally distributed on this land mass. Dividing 0.3 × 10¹⁴ by 800 million gives 4 × 10⁴ square meters occupied by every household (about six football fields); the square root of that is the distance between households, about 200 meters. Multiply this by the 800 million households to get the distance Santa must travel on Christmas Eve to deliver all the children’s gifts: 160 million kilometers, farther than the distance from here to the sun.
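
The whole chain of arithmetic is easy to check; here is a quick back-of-the-envelope sketch in Python using the figures from the text:

    import math

    land_area_m2 = 0.3e14    # 30 percent of Earth's ~1e14 m^2 surface
    households = 8e8         # 800 million homes with children
    area_per_home = land_area_m2 / households   # ~4e4 m^2 (six football fields)
    spacing_m = math.sqrt(area_per_home)        # ~200 m between households
    route_km = spacing_m * households / 1000    # total route length
    print(f"{spacing_m:.0f} m apart, {route_km:.2e} km total")
    # -> about 194 m apart and 1.5e+08 km in all: the ~160 million km above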

Thanks to the rotation of the earth, Santa has more time than children might initially think. Standing on the International Date Line and moving from east to west across different time zones, Santa has not just 10 hours to deliver his presents (from 8 p.m., when children go to bed, until 6 a.m., when they wake up), but an extra 24 hours: 34 hours in all.

Even so, Santa’s task is daunting.

Now, some have guessed that Santa accomplishes his task by traveling at a speed close to that of light—let’s say, 99.999999 percent of the speed of light. By traveling that fast, in fact, Santa can deliver all his presents in just 500 seconds or so, with plenty of time left over (the remainder of the 34 hours) to polish off the cookies the children have left him on their kitchen tables.
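
That 500 seconds or so is straightforward to verify from the route length estimated above:

    c_km_s = 299_792.458    # speed of light in km/s
    beta = 0.99999999       # Santa's speed as a fraction of c
    route_km = 1.6e8        # the 160 million km estimated earlier
    travel_s = route_km / (beta * c_km_s)
    spare_h = 34 - travel_s / 3600
    print(f"{travel_s:.0f} s en route, {spare_h:.1f} h left for cookies")
    # -> about 534 s of travel, leaving essentially all 34 hours free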

There are certain consequences, however, of Santa’s traveling at this frantic pace. For example:

First, children may not be able to see Santa racing across the dark night sky, but they may be able to see a trail of light caused by Cerenkov radiation, a phenomenon created when charged objects travel faster than the speed of light (which they can do in transparent media, but not in a vacuum). Since the basic component of our atmosphere is nitrogen, light is slowed to 99.97 percent of its usual speed of 300,000 kilometers per second. Santa travels faster than this and undoubtedly is charged; as a consequence, then, he will emit visible photons. (Unfortunately, that light will be obscured by the light caused by the friction created when Santa rushes through the atmosphere. Also, Santa might roast in all this heat, but we’ll presume that Santa’s sleigh, like space capsules, has special protective shielding.)

Second, children will notice that as Rudolph, Santa’s lead reindeer, is rushing toward their homes, his nose is no longer red. The color depends on just how fast Rudolph is moving, turning yellow, then green, then blue, then violet, and finally turning invisible in the ultraviolet range as he accelerates to higher and higher speeds. This change in color is a well-known phenomenon, called the Doppler shift, which astronomers take advantage of to figure out the speeds with which the stars and galaxies in our expanding universe are moving with respect to us; from that information, the distances to these celestial objects can be deduced. Using the accompanying table, children can determine how fast Rudolph is traveling by noting the color of his nose.

One worry Santa has is whether, with his irremediable girth, he’ll be able to squeeze into all those chimneys. Traveling at nearly the speed of light makes the problem worse, because Santa gains mass (his kinetic energy adds to his mass, as Einstein’s famous E = mc² attests). Children believe that Santa will easily fit in the chimney, because from their frame of reference, even though Santa is heavier, he has contracted. From Santa’s frame of reference, though, the chimney is narrower than Santa is.
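
The size of both effects follows from the Lorentz factor γ = 1/√(1 − β²), where β = v/c. A small sketch at Santa’s quoted speed:

    import math

    beta = 0.99999999                     # v/c from the text
    gamma = 1 / math.sqrt(1 - beta**2)    # Lorentz factor
    print(f"gamma = {gamma:.0f}")         # about 7,000 at this speed
    # In the children's frame a 1 m girth contracts by a factor gamma:
    print(f"contracted width: {1000 / gamma:.2f} mm")
    # while Santa's relativistic mass grows by that same factor of ~7,000.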

But children need not fear. The theory of relativity assures us that Santa will fit (see figure 4), and their packages will be delivered on time.

Children might also wonder why Santa never seems to age. From year to year, he retains his cherub face and merry laugh, his long white beard and his round belly that jiggles like a bowlful of jelly. The fact is that for objects traveling at close to the speed of light, time slows down. So, the more packages Santa delivers, the more he’ll travel, and the more he’ll remain the same, carrying on the Christmas tradition for generations of children to come.

Color of Rudolph’s nose:                    Red    Yellow    Green    Blue    Violet
Corresponding wavelength (in nanometers):   650    580       550      480     400
Santa’s speed as a percentage of
the speed of light (v/c)*:                  0      11        17       29      45
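
The table is just the relativistic Doppler formula for an approaching source, λ_observed = λ_rest × √((1 − β)/(1 + β)), with Rudolph’s rest wavelength taken as red light at 650 nm. A quick check in Python reproduces the speeds:

    REST_NM = 650.0    # Rudolph's nose at rest: red light, 650 nm
    for observed_nm in (650, 580, 550, 480, 400):
        ratio = (observed_nm / REST_NM) ** 2   # equals (1 - beta)/(1 + beta)
        beta = (1 - ratio) / (1 + ratio)       # solve for v/c
        print(f"{observed_nm} nm -> v/c = {beta:.0%}")
    # -> 0%, 11%, 17%, 29%, 45%, matching the table row by row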

Can Santa fit in the chimney if he’s traveling at nearly the speed of light?
To answer that question, we need to talk about two frames of reference: Santa’s and ours. We also need to place two periodically blinking lights, A and B, on the sides of the chimney. These lights will help us and Santa find the edges of the chimney in the darkness and therefore will determine when Santa is right above the chimney, ready to slide in. For Santa to fit into the chimney, his right and left sides need to be between lights A and B when they blink.

Figure 1: If Santa is traveling at normal earthbound speeds, say, 100 km per hour, he sees lights A and B blink at the same time. Just as his left arm touches A, his right arm also touches B; therefore Santa fits in (since Santa is not bigger than the chimney).

Figure 2: If Santa is moving at close to the speed of light, the situation changes. From our frame of reference, according to Einstein’s theory of relativity, Santa’s width contracts and he is narrower than the chimney. Therefore Santa has plenty of space to slide in.

Figure 3: From Santa’s frame of reference, however, the chimney is moving backward and is, in fact, narrower than he is. If Santa were to see A and B blinking at the same time, the chimney would be too narrow for him.

Figure 4: Not to worry. From Santa’s frame of reference, the two lights are not blinking at the same time. As light A blinks, Santa’s left side slips into the chimney. The chimney keeps moving backward as Santa’s body squeezes in, until finally, when light B blinks, Santa’s right side is perfectly aligned with the side of the chimney. Now all of Santa is in.
