This Fermilab press release came out on Aug. 18, 2014.

This image of the NGC 1398 galaxy was taken with the Dark Energy Camera. This galaxy lives in the Fornax cluster, roughly 65 million light-years from Earth. It is 135,000 light-years in diameter, just slightly larger than our own Milky Way galaxy, and contains more than 100 billion stars. Credit: Dark Energy Survey

On Aug. 15, with its successful first season behind it, the Dark Energy Survey (DES) collaboration began its second year of mapping the southern sky in unprecedented detail. Using the Dark Energy Camera, a 570-megapixel imaging device built by the collaboration and mounted on the Victor M. Blanco Telescope in Chile, the survey’s five-year mission is to unravel the fundamental mystery of dark energy and its impact on our universe.

Along the way, the survey will take some of the most breathtaking pictures of the cosmos ever captured. The survey team has announced two ways the public can see the images from the first year.

Today, the Dark Energy Survey relaunched Dark Energy Detectives, its successful photo blog. Once every two weeks during the survey’s second season, a new image or video will be posted to www.darkenergydetectives.org, with an explanation provided by a scientist. During its first year, Dark Energy Detectives drew thousands of readers and followers, including more than 46,000 followers on its Tumblr site.

Starting on Sept. 1, the one-year anniversary of the start of the survey, the data collected by DES in its first season will become freely available to researchers worldwide. The data will be hosted by the National Optical Astronomy Observatory. The Blanco Telescope is hosted at the National Science Foundation’s Cerro Tololo Inter-American Observatory, the southern branch of NOAO.

In addition, the hundreds of thousands of individual images of the sky taken during the first season are being analyzed by thousands of computers at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign, Fermi National Accelerator Laboratory (Fermilab), and Lawrence Berkeley National Laboratory. The processed data will also be released in coming months.

Scientists on the survey will use these images to unravel the secrets of dark energy, the mysterious substance that makes up 70 percent of the mass and energy of the universe. Scientists have theorized that dark energy works in opposition to gravity and is responsible for the accelerating expansion of the universe.

“The first season was a resounding success, and we’ve already captured reams of data that will improve our understanding of the cosmos,” said DES Director Josh Frieman of the U.S. Department of Energy’s Fermi National Accelerator Laboratory and the University of Chicago. “We’re very excited to get the second season under way and continue to probe the mystery of dark energy.”

While results on the survey’s probe of dark energy are still more than a year away, a number of scientific results have already been published based on data collected with the Dark Energy Camera.

The first scientific paper based on Dark Energy Survey data was published in May by a team led by Ohio State University’s Peter Melchior. Using data that the survey team acquired while putting the Dark Energy Camera through its paces, they used a technique called gravitational lensing to determine the masses of clusters of galaxies.

In June, Dark Energy Survey researchers from the University of Portsmouth and their colleagues discovered a rare superluminous supernova in a galaxy 7.8 billion light-years away. A group of students from the University of Michigan discovered five new objects in the Kuiper Belt, a region in the outer reaches of our solar system, including one that takes over a thousand years to orbit the Sun.

In February, Dark Energy Survey scientists used the camera to track a potentially hazardous asteroid that approached Earth. The data was used to show that the newly discovered Apollo-class asteroid 2014 BE63 would pose no risk.

Several more results are expected in the coming months, said Gary Bernstein of the University of Pennsylvania, project scientist for the Dark Energy Survey.

The Dark Energy Camera was built and tested at Fermilab. The camera can see light from more than 100,000 galaxies up to 8 billion light-years away in each crystal-clear digital snapshot.

“The Dark Energy Camera has proven to be a tremendous tool, not only for the Dark Energy Survey, but also for other important observations conducted year-round,” said Tom Diehl of Fermilab, operations scientist for the Dark Energy Survey. “The data collected during the survey’s first year — and its next four — will greatly improve our understanding of the way our universe works.”

The Dark Energy Survey Collaboration comprises more than 300 researchers from 25 institutions in six countries. For more information, visit http://www.darkenergysurvey.org.

Fermilab is America’s premier national laboratory for particle physics and accelerator research. A U.S. Department of Energy Office of Science laboratory, Fermilab is located near Chicago, Illinois, and operated under contract by the Fermi Research Alliance, LLC. Visit Fermilab’s website at www.fnal.gov and follow us on Twitter at @FermilabToday.

The DOE Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

The National Optical Astronomy Observatory (NOAO) is operated by the Association of Universities for Research in Astronomy (AURA), Inc., under cooperative agreement with the National Science Foundation.

The Particle Clicker team working late into the night.

This article was also published here on CERN’s website.

This weekend CERN hosted its third Summer Student Webfest, a three-day caffeine-fuelled coding event at which participants worked in small teams to build innovative projects using open-source web technologies.

There were a host of projects to inspire the public to learn about CERN and particle physics, and others to encourage people to explore web-based solutions to humanitarian disasters with CERN’s partner UNOSAT.

The event opened with a session of three-minute pitches: participants with project ideas tried to recruit team members with particular skills, from software development and design expertise to acumen in physics. Projects crystallised, merged or floundered as 14 pitches resulted in the formation of eight teams. Coffee was brewed and the hacking commenced…

Run Broton Run

Members of the Run Broton Run team help each other out at the CERN Summer Student Webfest 2014 (Image: James Doherty)

The weekend was interspersed with mentor-led workshops introducing participants to web technologies. CERN’s James Devine detailed how Arduino products can be used to build cosmic-ray detectors or monitor LHC operation, while developers from PyBossa provided an introduction to building crowdsourced citizen science projects on crowdcrafting.org. (See a full list of workshops).

After three days of hard work and two largely sleepless nights, the eight teams were faced with the daunting task of presenting their projects to a panel of experts, with a trip to the Mozilla Festival in London up for grabs for one member of the overall winning team. The teams presented a remarkable range of applications built from scratch in under 48 hours.

Students had the opportunity to collaborate with Ben Segal (middle), an inductee of the Internet Hall of Fame.

Prizes were awarded as follows:

Best Innovative Project: Terrain Elevation

A mobile phone application that accurately measures elevation. Designed as an economical method of choosing sites with a low risk of flooding for refugee camps.

Find out more.

Best Technology Project: Blindstore

A private query database with real potential for improving online privacy.

Find out more here.

Best Design Project: GeotagX and PyBossa

An easy-to-use crowdsourcing platform for NGOs to use in responding to humanitarian disasters.

Find out more here and here.

Best Educational Project: Run Broton Run

An educational 3D game that uses Kinect technology.

Find out more here.

Overall Winning Project: Particle Clicker

Particle Clicker is an elegantly designed detector-simulation game for the web.

Play here.

“It’s been an amazing weekend where we’ve seen many impressive projects from different branches of technology,” says Kevin Dungs, captain of this year’s winning team. “I’m really looking forward to next year’s Webfest.”

Participants of the CERN Summer Student Webfest 2014 in the CERN Auditorium after three busy days’ coding.

The CERN Summer Student Webfest was organised by François Grey, Ben Segal and SP Mohanty, and sponsored by the Citizen Cyberlab, Citizen Cyberscience Centre, Mozilla Foundation and The Port. Event mentors were from CERN, PyBossa and UNITAR/UNOSAT. The judges were Antonella del Rosso (CERN Communications), Bilge Demirkoz (CERN Researcher) and Fons Rademakers (CTO of CERN Openlab).


The World’s Largest Detector?

Wednesday, August 13th, 2014

This morning, the @CERN_JOBS twitter feed tells us that the ATLAS experiment is the world’s largest detector:

CERN_JOBS Tweet Largest Detector

Weighing over 7,000 tons, 46 meters long, and 25 meters high, ATLAS is without a doubt the particle detector with the greatest volume ever built at a collider. I should point out, though, that my experiment, the Compact Muon Solenoid, is almost twice as heavy at over 12,000 tons:

CMS

CMS is smaller but heavier — which may be why we call it “compact.” What’s the difference? Well, it’s tough to tell from the pictures, in which CMS is open for tours and ATLAS is under construction, but the big difference is in the muon systems. CMS has short gaps between muon-detecting chambers, while ATLAS has a lot of space in order to allow muons to travel further and get a better measurement. That means that a lot of the volume of ATLAS is actually empty air! ATLAS folks often say that if you could somehow make it watertight, it would float; as a CMS member, I heartily recommend attempting to do this and seeing if it works. ;)
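That floating claim is fun to sanity-check. Here is a back-of-envelope sketch in Python, using the dimensions quoted above and approximating ATLAS as a solid, watertight cylinder (the cylinder shape and the water density are my assumptions, not anything official):

```python
import math

# Figures quoted above; treating ATLAS as a cylinder is a rough assumption
mass_kg = 7_000 * 1_000      # "over 7,000 tons" -> roughly 7,000 tonnes
length_m = 46.0              # 46 meters long
radius_m = 25.0 / 2          # 25 meters high, taken as the diameter

volume_m3 = math.pi * radius_m**2 * length_m   # about 22,600 m^3
density = mass_kg / volume_m3                  # average density in kg/m^3

water = 1_000.0  # density of water, kg/m^3
print(f"average density ~ {density:.0f} kg/m^3")
print("floats" if density < water else "sinks")  # ~310 kg/m^3, well under 1,000
```

At roughly a third the density of water, a watertight ATLAS would indeed float, precisely because so much of its volume is empty air.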

But the truth is that all this cross-LHC rivalry is small potatoes compared to another sort of detector: the ones that search for neutrinos require absolutely enormous volumes of material to get those ghostlike particles to interact even occasionally! For example, here’s IceCube:

"Icecube-architecture-diagram2009" by Nasa-verve - IceCube Science Team - Francis Halzen, Department of Physics, University of Wisconsin. Licensed under Creative Commons Attribution 3.0 via Wikimedia Commons - https://commons.wikimedia.org/wiki/File:Icecube-architecture-diagram2009.PNG#mediaviewer/File:Icecube-architecture-diagram2009.PNG

Most of its detecting volume is actually Antarctic ice! Does that count? If it does, there may be a far bigger detector still. To follow that story, check out this 2012 post by Michael DuVernois: The Largest Neutrino Detector.


Watch Fermilab Deputy Director Joe Lykken in the latest entry in Huffington Post’s “Talk Nerdy To Me” video series.


What’s the smallest thing in the universe? Check out the latest entry in Huffington Post’s Talk Nerdy to Me video series. Host Jacqueline Howard takes the viewer inside Fermilab and explains how scientists look for the smallest components that make up our world. Fermilab Deputy Director Joe Lykken talks about the new discoveries we hope to make in exploring the subatomic realm.

View the 3-minute video at Huffington Post.


This article appeared in Fermilab Today on Aug. 6, 2014.

Yale University astrophysicist Meg Urry spoke about gender bias in science at the July 30 Fermilab Colloquium. Photo: Lauren Biron

Both men and women need to improve how they evaluate women in the sciences to help eliminate bias, says Meg Urry, who spoke at last week’s Fermilab Colloquium. People of either gender fall victim to unconscious prejudices that affect who succeeds, particularly in physics.

“Less than 20 percent of the Ph.D.s in physics go to women,” Urry noted, a figure that has barely crept up even while fields such as medicine have approached parity.

Urry, a professor at Yale University and president of the American Astronomical Society, unleashed a torrent of studies demonstrating bias during her talk, “Women in Physics: Why So Few? And How to Move Toward Normal.”

In one example, letters of recommendation for men were more likely to include powerful adjectives and contain specifics, while those for women were often shorter, included hints of doubt or made explicit mention of gender.

Another study found that in jobs perceived as masculine, both men and women tended to award the position to the man, even when the woman was the better-qualified candidate.

Other data showed that women are less likely to be perceived as the leader in mixed-gender scenarios, Urry said. When small numbers of women are present, they can become an “other” that stands in for the whole gender, magnifying perceived mistakes and potentially confirming a bias that women are less proficient in physics.

“You need a large enough group that people stop thinking of them as the woman and start thinking of them as the scientist,” Urry said.

Urry advised the many young women in the audience to own their ambition, prep their elevator speeches, get male allies who will stand up if female voices are ignored, practice confidence and network. Above all, she said, work hard, do interesting work, and don’t be discouraged if things get rough.

Meanwhile, Urry said, leaders need to learn about bias, actively look for diverse candidates rather than waiting for applications, and mentor and prevalidate women, for example when introducing a speaker.

Urry worked hard to debunk the myth that hiring more women means lowering the bar for diversity’s sake.

“When you hire a diverse group of scientists, you are improving your quality, not lowering your standards,” Urry said, echoing sentiments from her lunchtime talk with 40 women. “We should be aspiring to diversity of thought to enrich science.”

Lauren Biron


J/ψ

Wednesday, August 6th, 2014

The particle with two names: The J/ψ Vector Meson. Again, under 500 words.


Trident decay of J/Psi Credit: SLAC/NOVA

Hi All,

The J/ψ (or J/psi) is a very special particle. Its discovery was announced in 1974 independently by two groups: one led by Samuel Ting at Brookhaven National Laboratory (BNL) in New York and the other led by Burton Richter at the Stanford Linear Accelerator Center (SLAC) in California. The J/ψ is special because it established the quark model as a credible description of nature. Quarks had been invented by Gell-Mann and Zweig as a bookkeeping tool, and it was not until Glashow, Iliopoulos and Maiani (GIM) that the concept of quarks as real particles was taken seriously. GIM predicted that if quarks were real, then they should come in pairs, like the up and down quarks. Candidates for the up, down, and strange had been identified, but there was no partner for the strange quark. The J/ψ was the key.


Samuel Ting and his BNL team. Credit: BNL

Like the proton or an atom, the J/ψ is a composite particle. This means that J/ψ is made of smaller, more elementary particles. Specifically, it is a bound state of one charm quark and one anticharm quark. Since it is made of quarks, it is a “hadron”. But since it is made of exactly one quark and one antiquark, it is specifically a “meson.” Experimentally, we have learned that the J/ψ has an intrinsic angular momentum (spin) of 1ħ (same as the photon), and call it a “vector meson.” We infer that the charm and anticharm, which are both spin ½ħ, are aligned in the same direction (½ħ + ½ħ = 1ħ). The J/ψ must also be electrically neutral because charm and anticharm quarks have equal but opposite electric charges.


Burton Richter following the announcement of co-winning the 1976 Nobel Prize. Credit: SLAC

At 3.1 GeV/c², the J/ψ is about three times heavier than the proton and about three-quarters the mass of the bottom quark. However, because so few hadrons are lighter than it, the J/ψ possesses a remarkable feature: it decays 10% of the time to charged leptons, like an electron-positron pair. By conservation of energy, it is forbidden to decay to heavier hadrons. Because there are so few J/ψ decay modes, it appears as a very narrow peak in experiments. In fact, the particle’s mass and width are so well known that experiments like ATLAS and CMS use them as calibration markers.
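Those mass comparisons are easy to verify with rough standard values (the proton and bottom-quark masses below are widely quoted figures I have supplied myself, not numbers from this post):

```python
m_jpsi = 3.097    # GeV/c^2, J/psi mass
m_proton = 0.938  # GeV/c^2, proton mass
m_bottom = 4.18   # GeV/c^2, a commonly quoted bottom-quark mass

# "about three times heavier than the proton"
print(f"J/psi / proton ~ {m_jpsi / m_proton:.1f}")  # ~3.3

# "about three-quarters the mass of the bottom quark"
print(f"J/psi / bottom ~ {m_jpsi / m_bottom:.2f}")  # ~0.74
```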


Drell-Yan spectrum data at 7 TeV LHC Credit: CMS

The J/ψ meson is one of the coolest things in the particle zoo. It is a hadronic bound state that decays into charged leptons. It shares the same quantum numbers as the photon and Z boson, so it appears in Drell-Yan processes. It established the quark model, and it is critical to new discoveries because of its use as a calibration tool. In my opinion, not too shabby.

Happy colliding.

Richard (@BraveLittleMuon)


Super Fracking and Physics

Tuesday, August 5th, 2014

The cover story of the latest issue of Physics Today is part explanation, part discussion of the use of fracking techniques in the oil and natural gas industries in America. As this topic gained traction in the news and online, I was admittedly ignorant of the actual science and details of these methods. I vaguely knew that fracking could be seen as beneficial in that many US power plants now burn cleaner natural gas instead of coal, but it also seemed obvious that pumping high-pressure liquid (which isn’t pure water) into the ground was bound to cause other environmental problems. Still, I consider myself neither strongly for nor against these practices, but I did greatly appreciate the explanations and discussions provided in this article. Below I’ll highlight the parts I found interesting, but I do recommend that the interested reader take a look at the article.

Fractures in siltstone and black shale in the Utica shale, near Fort Plain, New York. (Photograph by Michael C. Rygel.)

The article begins with an explanation of black shale itself. At left is a picture of some black shale, part of the Utica shale in upstate New York. So what is black shale? Well, to quote the article: “Just as sandstones are a rock equivalent of sand, shales are a rock equivalent of mud.” Organic material, oil and/or gas, trapped in the shale gives it its darker color and the name black shale. The oil and gas will only remain in the shale under anoxic conditions. No need to open that extra browser tab; I had to look up what anoxic meant too. Anoxic water is water depleted of much of its dissolved oxygen, which usually happens when water is left stagnant. The dissolved oxygen in normal water would tend to oxidize the carbon in the sediment, destroying the organic material. Under the right conditions, roughly 2-4 km beneath the Earth’s surface, heat and pressure convert the organic material into oil. Go a bit further down, roughly 3-6 km, and the higher temperature and pressure break the oil down into gas.

As most people are now aware, the general idea of fracking is to pump liquid into the black shale, causing fractures in the rock that allow the oil and gas to escape its confines and be collected. Three categories of fracking can be distinguished. The first is natural fracking: the fracturing of shale due to the internal pressure of oil and gas; the fractures in the picture above are natural. Sometimes natural fractures allow oil and gas to escape the shale; the largest such natural seepage area is found off the coast of Santa Barbara, California. The other two methods are described in the figure below. The main differences pointed out in the article are the volume and viscosity of the water used to carry out the hydraulic fracking. In traditional fracking, the water is made viscous by adding guar gum or hydroxyethyl cellulose, and typically about 75-1,000 cubic meters of it are used to create a single fracture through which the oil and/or gas may be extracted. High-volume (or super) fracking, on the other hand, uses a low-viscosity water-based liquid pumped at a high rate to create many smaller fracture networks along a horizontal well that is periodically plugged to create a number of fracking sites. The water usage is typically 100 times greater in high-volume fracking than in traditional fracking. The benefit, of course, is that high-volume fracking can extract oil and gas from tight shale formations where either few natural fractures exist for the oil and gas to migrate through, or the natural fractures have been sealed over time by the deposition of silica and/or carbonates. For a detailed layout of the environmental concerns surrounding high-volume fracking, see the insert within the main article.

Traditional and high-volume fracking. (a) In traditional fracking treatments, a high-viscosity fluid creates a single hydraulic fracture through which oil or gas (or both) migrates to the production well. (b) In high-volume fracking, or super fracking, large volumes of a low-viscosity liquid create a wide distribution of hydraulic fractures. Fossil fuels can then migrate through the fracture network to the production well. The sketch here shows the result of a sequence of four high-volume fracking injections. Such sequential injections would not be possible without directional drilling, which creates a horizontal production well in the target stratum.

The authors of this article found their way to studying fracking through the small earthquakes associated with high-volume fracking. Some production wells now monitor the seismic activity of the fracking with a series of seismometers distributed along the length of a monitoring well. Better earthquake prediction models would allow better emergency preparedness by governments and more robust risk analysis by insurers, and could possibly even save the lives of those living in earthquake-prone areas. So, from a research perspective, the earthquakes induced by fracking can provide a useful testbed for earthquake modeling. Below is a map of microseismicity associated with the Barnett shale in Texas. The monitoring well is situated at the origin, and each “+” mark represents a unique seismic event.

Small earthquakes associated with four high-volume frackings of the Barnett shale in Texas. Each tiny “+” symbol on this microseismicity map shows the epicenter of a microearthquake. Collectively, the symbols reveal the distribution of fractures induced by the injected water. The monitoring well is at the origin of the coordinate system shown. The injection well is off to the right; the thin line shows its horizontal extent. (Adapted from: S.Maxwell, Leading Edge 30, 340 (2011))

These small earthquakes are typically very weak and cannot be felt at the surface. The frequency of natural earthquakes of a certain magnitude or greater follows a well-defined function: the logarithm of the number of earthquakes with magnitude m or greater decreases linearly with m. This is just to say that small earthquakes are common and big earthquakes are rare. Studies of both natural and fracking-induced earthquakes show that the distribution of earthquake magnitudes from high-volume fracking falls off more steeply than that of natural earthquakes, meaning that a large earthquake would be extremely rare, though not ruled out. The authors quote the probability of a magnitude-4 earthquake (minimally damaging) from high-volume fracking at less than one in a billion. An effort has been made by an old acquaintance of mine, J. Quinn Norris at UC Davis, to model fracking earthquakes using “a type of graph-theory analysis called invasion percolation from a point source.” See his paper here.
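The magnitude-frequency relation described here is the Gutenberg-Richter law, log10 N(>=m) = a - b*m, and a steeper fall-off simply means a larger b. A minimal sketch, with illustrative a and b values of my own choosing (not numbers from the article):

```python
def count_at_least(m, a, b):
    """Expected number of events with magnitude >= m under Gutenberg-Richter scaling."""
    return 10 ** (a - b * m)

# Hypothetical parameters: natural seismicity is often quoted with b ~ 1;
# a steeper fall-off for fracking-induced events corresponds to a larger b.
a_nat, b_nat = 4.0, 1.0
a_ind, b_ind = 4.0, 2.0

for m in (1.0, 2.0, 3.0, 4.0):
    nat = count_at_least(m, a_nat, b_nat)
    ind = count_at_least(m, a_ind, b_ind)
    print(f"m >= {m}: natural ~ {nat:g}, induced ~ {ind:g}")
# With these parameters, at m >= 4 the induced rate is 10,000 times below the
# natural one, illustrating why damaging induced quakes should be extremely rare.
```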

The last part of the article that I found particularly interesting was the 2011 Department of Energy estimates of recoverable oil in the 48 contiguous states. The total estimated volume of recoverable oil was 24 billion barrels. Of this, 3.6 billion barrels are attributed to the Bakken shale, mostly in North Dakota, and 15.4 billion barrels are expected from the Monterey shale along the coast of California. As a California native I found this surprising, probably because efforts to use high-volume fracking on this shale have so far proved unfruitful due to the natural fractures that already exist. Think of it like trying to fracture a sponge by pushing water through it: the water happily fills every nook and cranny instead of building up any pressure. Still, this source is likely to play some part in future energy discussions as other sources are depleted. Of course, just because this material exists does not mean we must burn it to satisfy our energy needs. Most of this oil and gas has been locked away for hundreds of millions of years and would gladly remain so if we allowed it to. I for one am optimistic that fossil fuel consumption will significantly decrease within my lifetime and we can get on with solar-powered hovercraft and the like.

 


René Descartes (1596 – 1650) was an outstanding physicist, mathematician and philosopher. In physics, he laid the groundwork for Isaac Newton’s (1642 – 1727) laws of motion with pioneering work on the concept of inertia. In mathematics, he developed the foundations of analytic geometry, as illustrated by the term Cartesian[1] coordinates. However, it is in his role as a philosopher that he is best remembered. Rather ironic, as his breakthrough method was a failure.

Descartes’s goal in philosophy was to develop a sound basis for all knowledge, built on ideas so obvious they could not be doubted. His touchstone was that anything he perceived clearly and distinctly as being true was true. The archetypal example was the famous “I think, therefore I am.” Unfortunately, little else is as obvious as that famous quote, and even it can be, and has been, doubted.

Euclidean geometry provides the illusory ideal to which Descartes and other philosophers have aspired: you start with a few self-evident truths and derive a superstructure built on them. Unfortunately, even Euclidean geometry fails that test. The infamous parallel postulate has been viewed with suspicion since ancient times, and other Euclidean postulates have been questioned as well; extending a straight line depends on the space being continuous, unbounded and infinite.

So how are we to take Euclid’s postulates and axioms? Perhaps we should follow the idea of Sir Karl Popper (1902 – 1994) and consider them bold hypotheses. This casts a different light on Euclid and his work; perhaps he was the first outstanding scientist. If we take his basic assumptions as empirical[2] rather than as sure and certain knowledge, all we lose is the illusion of certainty. Euclidean geometry then becomes an empirically testable model for the geometry of spacetime. The theorems derived from the basic assumptions are predictions that can be checked against observations, satisfying Popper’s demarcation criterion for science. Do the angles in a triangle add up to two right angles or not? If not, then one of the assumptions is false, probably the parallel postulate.

Back to Descartes: he criticized Galileo Galilei (1564 – 1642) for having built “without having considered the first causes of nature”; Galileo “has merely sought reasons for particular effects; and thus he has built without a foundation.” In the end, that lack of a foundation turned out to be less of a hindrance than Descartes’s faulty one. To a large extent, science’s lack of a foundation of the kind Descartes wished to provide has not proved a significant obstacle to its advance.

Like Euclid, Sir Isaac Newton had his basic assumptions—the three laws of motion and the law of universal gravity—but he did not believe that they were self-evident; he believed that he had inferred them by the process of scientific induction. Unfortunately, scientific induction was as flawed a foundation as the self-evident nature of the Euclidean postulates. Connecting the dots between a falling apple and the motion of the Moon was an act of creative genius, a bold hypothesis, not some algorithmic derivation from observation.

It is worth noting that, at the time, Newton’s explanation had a strong competitor in Descartes’s theory that planetary motion was due to vortices, large circulating bands of particles that keep the planets in place. Descartes’s theory had the advantage of lacking the occult action at a distance that is fundamental to Newton’s law of universal gravitation. In spite of that, today Descartes’s vortices are as forgotten as his claim that the pineal gland is the seat of the soul; so much for what he perceived clearly and distinctly as being true.

Galileo’s approach of solving problems one at a time, rather than trying to solve all problems at once, has paid big dividends. It has allowed science to advance one step at a time, while Descartes’s approach has faded away as failed attempt followed failed attempt. We still do not have a grand theory of everything built on an unshakable foundation, and probably never will. Rather, we have models of widespread utility. Even if they are built on a shaky foundation, surely that is enough.

Peter Higgs (b. 1929) follows in the tradition of Galileo. He has not, despite his Nobel Prize, succeeded where Descartes failed in producing a foundation for all knowledge; but through creativity he has proposed a bold hypothesis whose implications have been empirically confirmed. Descartes would probably claim that he has merely sought reasons for a particular effect: mass. The answer to the ultimate question of life, the universe and everything still remains unanswered, much to Descartes’s chagrin; but as scientists we are satisfied to solve one problem at a time and then move on to the next one.

To receive a notice of future posts follow me on Twitter: @musquod.


[1] Cartesian, from Descartes’s Latinized name, Cartesius.

[2] As in the final analysis they are.


This article appeared in Fermilab Today on July 30, 2014.

Fermilab physicist Arden Warner revolutionizes oil spill cleanup with magnetizable oil invention. Photo: Hanae Armitage

Four years ago, Fermilab accelerator physicist Arden Warner watched national news of the BP oil spill and found himself frustrated with the cleanup response.

“My wife asked ‘Can you separate oil from water?’ and I said ‘Maybe I could magnetize it!’” Warner recalled. “But that was just something I said. Later that night while I was falling asleep, I thought, you know what, that’s not a bad idea.”

Sleep forgone, Warner began experimenting in his garage. With shavings from his shovel, a splash of engine oil and a refrigerator magnet, Warner witnessed the preliminary success of a concept that could revolutionize the process of oil spill damage control.

Warner has received patent approval on the cleanup method.

The concept is simple: take iron particles or magnetite dust and add them to oil. It turns out that these particles mix well with oil, forming a loose colloidal suspension that floats on water and is susceptible to magnetic forces. At a barely discernible 2 to 6 microns in size, the particles tend to clump together, and it only takes a sparse dusting for them to bond with the oil. When a magnetic field is applied to the oil and filings, they congeal into a viscous liquid known as a magnetorheological fluid. The fluid’s viscosity allows a magnetic field to pool both filings and oil at a single location, making them easy to remove. (View a 30-second video of the reaction.)

“It doesn’t take long — you add the filings, you pull them out. The entire process is even more efficient with hydrophobic filings. As soon as they hit the oil, they sink in,” said Warner, who works in the Accelerator Division. Hydrophobic filings are those that don’t like to interact with water — think of hydrophobic as water-fearing. “You could essentially have a device that disperses filings and a magnetic conveyor system behind it that picks them up. You don’t need a lot of material.”

Warner tested more than 100 oils, including sweet crude and heavy crude. As it turns out, the crude oils’ natural viscosity makes them fairly easy to magnetize and clear away. Currently, booms, the floating devices that corral oil spills, are at best capable of containing a spill; oil removal is an entirely different process. But the iron filings can work in conjunction with an electromagnetic boom to allow tighter constriction and removal of the oil. Using solenoids, metal coils that carry an electrical current, the electromagnetic booms can steer the oil-filing mixture into collector tanks.

The magnetized-oil technique is far more environmentally sound than other oil cleanup methods. No harmful chemicals are introduced into the ocean — magnetite is a naturally occurring mineral. The filings are added and, shortly after, extracted. While some straggling iron particles remain, the vast majority are removed in one fell, magnetized swoop — the filings can even be dried and reused.

“This technique is more environmentally benign because it’s natural; we’re not adding soaps and chemicals to the ocean,” said Cherri Schmidt, head of Fermilab’s Office of Partnerships and Technology Transfer. “Other ‘cleanup’ techniques disperse the oil and make the droplets smaller or make the oil sink to the bottom. This doesn’t do that.”

Warner’s ideas for potential applications also include wildlife cleanup and the use of chemical sensors. Small devices that “smell” high and low concentrations of oil could be fastened to a motorized electromagnetic boom to direct it to the most oil-contaminated areas.

“I get crazy ideas all the time, but every so often one sticks,” Warner said. “This is one that I think could stick for the benefit of the environment and Fermilab.”

Hanae Armitage


Inspired by the event at the UNESCO headquarters in Paris that celebrated the anniversary of the signature of the CERN convention, Sophie Redford wrote about her impressions on joining CERN as a young researcher. A CERN fellow designing detectors for the future CLIC accelerator, she did her PhD at the University of Oxford, observing rare B decays with the LHCb experiment.

The “60 years of CERN” celebrations give us all the chance to reflect on the history of our organization. As a young scientist, the early years of CERN might seem remote. However, the continuity of CERN and its values connects this distant past to the present day. At CERN, the past isn’t so far away.

Of course, no matter when you arrive at CERN for the first time, it doesn’t take long to realize that you are in a place with a special history. On the surface, CERN can appear scruffy. Haphazard buildings produce a maze of long corridors, labelled with seemingly random numbers to test the navigation of newcomers. Auditoriums retain original artefacts: ashtrays and blackboards unchanged since the beginning, alongside the modern-day gadgetry of projectors and video-conferencing systems.

The theme of re-use continues underground, where older machines form the injection chain for new. It is here, in the tunnels and caverns buried below the French and Swiss countryside, where CERN spends its money. Accelerators and detectors, their immense size juxtaposed with their minute detail, constitute an unparalleled scientific experiment gone global. As a young scientist this is the stuff of dreams, and you can’t help but feel lucky to be a part of it.

If the physical situation of CERN is unique, so is the sociological one. The row of flags flying outside the main entrance is a colourful red herring, for aside from our diverse allegiances during international sporting events, nationality is meaningless inside CERN. Despite its location straddling international borders, despite our wallets containing two currencies and our heads many languages, scientific excellence is the only thing that matters here. This is a community driven by curiosity, where coffee and cooperation result in particle beams. At CERN we question the laws of our universe. Many answers are as yet unknown, but our shared goal of discovery bonds us irrespective of age or nationality.

As a young scientist at CERN I feel welcome and valued; this is an environment where reason and logic rule. I feel privileged to profit from the past endeavour of others, and great pride to contribute to the future of that which others have started. I have learnt that together we can achieve extraordinary things, and that seemingly insurmountable problems can be overcome.

In many ways, the second 60 years of CERN will be nothing like the first. But by continuing to build on our past we can carry the founding values of CERN into the future, allowing the next generation of young scientists to pursue knowledge without borders.

By Sophie Redford
