John Felde | University of Maryland | USA


Super Fracking and Physics

Tuesday, August 5th, 2014

The cover story of the latest issue of Physics Today is part explanation, part discussion of the use of fracking techniques in the American oil and natural gas industries. As this topic gained traction in the news and online, I was, admittedly, ignorant of the actual science and details of these methods. I vaguely knew that fracking could be seen as beneficial in that many US power plants now burn cleaner natural gas instead of coal, but it also seemed obvious that pumping high-pressure liquid (which isn’t pure water) into the ground was bound to cause other environmental problems. Still, I consider myself neither strongly for nor strongly against these practices, but I did greatly appreciate the explanations and discussions provided in this article. Below I’ll highlight the parts I found interesting, but I do recommend that the interested reader take a look at the article.

Fractures in siltstone and black shale in the Utica shale, near Fort Plain, New York. (Photograph by Michael C. Rygel.)

The article begins with an explanation of black shale itself. At left is a picture of some black shale, part of the Utica shale in upstate New York. So what is black shale? Well, to quote the article: “Just as sandstones are a rock equivalent of sand, shales are a rock equivalent of mud.” Organic material, oil and/or gas, trapped in the shale gives it the darker color and the name black shale. The oil and gas will only remain in the shale under anoxic conditions. No need to open that extra browser tab, I had to look up what anoxic meant too. Anoxic water is water depleted of much of its dissolved oxygen, which usually happens when water is left stagnant. The dissolved oxygen in normal water would tend to oxidize the carbon in the sediment, destroying the organic material. Under the right conditions, roughly 2-4 km beneath the Earth’s surface, the heat and pressure will convert the organic material into oil. Go a bit further down, roughly 3-6 km, and the temperature and pressure rise, breaking the oil down into gas.

As most people are now aware, the general idea of fracking is to pump liquid into the black shale, causing fractures in the rock which allow the oil and gas to escape their confines and be collected. Three categories of fracking can be distinguished. The first is natural fracking, which is to say, the normal fracturing of shale due to the internal pressure of oil and gas; the fractures in the picture above are due to natural fracturing. Sometimes natural fractures allow oil and gas to escape the shale; the largest such natural seepage area can be found off the coast of Santa Barbara, California. The other methods of fracking are described in the figure below. The main differences pointed out in the article are the volume and viscosity of the water used to carry out the hydraulic fracking. In traditional fracking, water is made viscous by adding guar gum or hydroxyethyl cellulose. Typically about 75-1000 cubic meters of water are used to create a single fracture through which the oil and/or gas may be extracted. High-volume (or super) fracking, on the other hand, uses a low-viscosity water-based liquid pumped at a high rate to create many smaller fracture networks along a horizontal well that is periodically plugged to create a number of fracking sites. The water usage is typically 100 times greater in high-volume fracking than in traditional fracking. The benefit, of course, is that high-volume fracking is capable of extracting oil and gas from tight shale formations where either few natural fractures exist for the oil and gas to migrate through, or the natural fractures have been sealed over time by the deposition of silica and/or carbonates. For a detailed layout of the environmental concerns surrounding high-volume fracking, see the insert within the main article.

Traditional and high-volume fracking. (a) In traditional fracking treatments, a high-viscosity fluid creates a single hydraulic fracture through which oil or gas (or both) migrates to the production well. (b) In high-volume fracking, or super fracking, large volumes of a low-viscosity liquid create a wide distribution of hydraulic fractures. Fossil fuels can then migrate through the fracture network to the production well. The sketch here shows the result of a sequence of four high-volume fracking injections. Such sequential injections would not be possible without directional drilling, which creates a horizontal production well in the target stratum.

The authors of this article found their way to studying fracking because of the small earthquakes associated with high-volume fracking. Some production wells now monitor the seismic activity of the fracking with a series of seismometers distributed along the length of a monitoring well. Better earthquake prediction models would allow for better emergency preparedness by governments and more robust risk analysis by insurers, and could possibly even save the lives of those living in earthquake-prone areas. So, from a research perspective, the earthquakes induced by fracking can provide a useful testbed for earthquake modeling. Below is a map of microseismicity associated with the Barnett shale in Texas. The monitoring well is situated at the origin, and each + mark represents a unique seismic event.

Small earthquakes associated with four high-volume frackings of the Barnett shale in Texas. Each tiny “+” symbol on this microseismicity map shows the epicenter of a microearthquake. Collectively, the symbols reveal the distribution of fractures induced by the injected water. The monitoring well is at the origin of the coordinate system shown. The injection well is off to the right; the thin line shows its horizontal extent. (Adapted from S. Maxwell, Leading Edge 30, 340 (2011).)

These small earthquakes are typically very weak and cannot be felt on the surface. The frequency of natural earthquakes follows a well-defined relation (the Gutenberg-Richter law): the logarithm of the number of earthquakes with magnitude m or greater decreases linearly with m. This is just to say that small earthquakes are common and big earthquakes are rare. Studies of both natural and fracking-induced earthquakes show that the distribution of earthquake magnitudes from high-volume fracking has a steeper fall-off than that of natural earthquakes, meaning that a large earthquake would be extremely rare, but not ruled out. The authors quote that the probability of seeing a magnitude 4 earthquake (minimally damaging) from high-volume fracking is less than one in a billion. An effort has been made by an old acquaintance of mine, J. Quinn Norris at UC Davis, to model the fracking earthquakes using “a type of graph-theory analysis called invasion percolation from a point source.” See his paper here.
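To make the magnitude-frequency argument concrete, here is a minimal sketch of the Gutenberg-Richter relation, log10 N(>=m) = a - b*m. The parameter values below are purely illustrative (not taken from the article); the point is only that a larger b-value, as observed for fracking-induced microseismicity, suppresses large events much faster.

```python
def expected_count(m, a, b):
    """Gutenberg-Richter relation: number of events with magnitude >= m,
    from log10 N(>=m) = a - b*m."""
    return 10 ** (a - b * m)

# Purely illustrative parameters: 'a' sets the overall activity level,
# 'b' the steepness of the fall-off. Natural seismicity typically has
# b near 1; the steeper fall-off for induced events corresponds to a
# larger effective b.
natural = {"a": 4.0, "b": 1.0}
induced = {"a": 4.0, "b": 2.0}

for m in (1, 2, 3, 4):
    print(f"m >= {m}: natural {expected_count(m, **natural):9.3f}, "
          f"induced {expected_count(m, **induced):9.5f}")
```

With these toy numbers, a magnitude-4-or-greater event is expected once for the natural b-value but only one time in ten thousand for the steeper induced distribution.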

The last part of the article that I found particularly interesting was the 2011 estimates from the Department of Energy on the availability of recoverable oil in the 48 contiguous states. The total estimated volume of recoverable oil was 24 billion barrels. Of this, 3.6 billion barrels are attributed to the Bakken shale, mostly in North Dakota, and 15.4 billion barrels are expected from the Monterey shale along the coast of California. As a California native I found this surprising, probably because efforts to use high-volume fracking on this shale have so far proved unfruitful due to the natural fractures which already exist. Maybe think of it like trying to fracture a sponge by pushing water through it: the water will happily fill every nook and cranny instead of building up any pressure. Still, this source is likely to play some part in future energy discussions as other sources are depleted. Of course, just because this material exists does not mean we must burn it to satisfy our energy needs. Most of this oil and gas has been locked away for hundreds of millions of years, and it would gladly remain so if we allowed it to. I for one am optimistic that fossil fuel consumption will significantly decrease within my lifetime and we can get on with solar-powered hovercrafts and the like.



Excitement from IceCube

Friday, December 13th, 2013

After a rather long hiatus (I was writing my PhD dissertation), I am getting back into the habit of posting about interesting things happening in particle physics. Since finishing my degree at UC Davis, I made an arduous cross country drive to start a new adventure as a postdoc at the University of Maryland working on the IceCube neutrino experiment at the South Pole. I have joined this collaboration at a particularly exciting time since the full detector was completed in May of 2011.

COVER Hit distribution (red, early; green, late) of a neutrino interaction with the Antarctic IceCube neutrino detector on 14 July 2011. Light from this transfer of 250 teraelectron volts of energy fills a sphere 600 meters across. This event, among the highest-energy neutrino interactions ever observed, forms part of the first evidence for a high-energy neutrino flux of astrophysical origin.

Back in June of this year, two neutrino events were reported with energies slightly above 1 PeV (peta-electronvolt). To put this number in context, the protons circulating in the Large Hadron Collider (LHC) at CERN have energies of about 4 TeV (tera-electronvolt) each. A PeV is 1,000 times greater than a TeV. Although we would love to be able to produce these higher energies at colliders like the LHC, it simply isn’t feasible at this time. As a result, we must rely on nature to produce these high energy particles for us, and hope that she flings a few our way so we can detect them. This is the job of the IceCube detector, a huge, 1-cubic-kilometer neutrino detector instrumented deep within the Antarctic ice. The enormous size is necessary since few of these particles are produced at such high energies, and even then the neutrino interaction probability is minuscule. Unfortunately, physicists have no control over nature, and so our only recourse is to build big! For those interested in more details about the detector, see the website at the University of Wisconsin – Madison here.
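As a rough back-of-the-envelope illustration of why a cubic-kilometer target is needed, here is a thin-target estimate of the interaction probability. The cross section is an assumed round number for PeV-scale neutrinos, not an official IceCube figure, and the other inputs are approximate.

```python
# Back-of-the-envelope: probability that a high-energy neutrino interacts
# while crossing 1 km of ice. All inputs are rough assumed values.
N_A = 6.022e23        # nucleons per gram of matter, approximately
rho_ice = 0.92        # g/cm^3, density of ice
sigma = 1e-33         # cm^2, assumed nu-nucleon cross section near 1 PeV
path = 1e5            # cm, one kilometer

n_nucleons = N_A * rho_ice        # nucleons per cm^3 of ice
prob = n_nucleons * sigma * path  # thin-target approximation
print(f"P(interaction over 1 km of ice) ~ {prob:.0e}")
```

Even with a kilometer of ice in the way, only a few neutrinos in a hundred thousand interact under these assumptions, which is why both a huge target and a large incoming flux are essential.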

Today the collaboration reports findings from a new neutrino search published in Science. The new search includes neutrino events at lower energies as well, down to about 30 TeV. The results of this search indicate that it is highly unlikely that these neutrinos were produced by any mechanism at Earth. Many high energy neutrinos are produced in Earth’s atmosphere, but not this many, and not at these energies.

Of particular interest to the community is a very fundamental question: “Where do these particles come from anyway?” Since the neutrino interactions preserve some information about the neutrino’s direction, the hope is that these neutrino events will all be coming from a particular place in the universe. The results of such a search are tantalizing. Since not all of the events provide exact position information, our best guess of a particle’s direction can be a little fuzzy. So far, however, the most significant clustering of events can be seen in the full skymap below (bottom left side). This location roughly corresponds to the center of our galaxy, but the fuzziness of the event locations does not permit us to say exactly where these neutrinos are coming from.

Luckily, the detector continues to collect additional neutrino events, even possibly as you read this. Our fingers are crossed that more events will be detected in this regime, filling in our understanding of extraterrestrial neutrinos and the cosmos in general.

In celebration of these results, the online magazine Physics World has named the IceCube findings the 2013 Breakthrough of the Year! A discussion will be held via Google hangout and shown on the Physics World YouTube channel to explain the results and take questions from the audience today at 4pm UTC (11:00 EST).


Skymap of the IceCube neutrino events. The purple regions indicate more likely locations of neutrino sources (darker is more likely). The plane of the galaxy is shown as a grey line, and the center of the galaxy is denoted by a filled grey square (near the event marked #14). The best guess for each event’s location are indicated with either a + (shower like events) or a X (muon tracks).


Fast Photosensors for Neutrino Physics

Monday, March 5th, 2012

Last week we hosted Matthew Wetstein from Argonne National Lab for a High Energy Physics seminar.  Matthew has been working on a project of great interest to me: the development of fast, large-area photosensors.  For a little background, many neutrino (and indeed other) detectors rely on detecting photons (light) that are produced by energetic particles. Currently, the most economical devices are photomultiplier tubes (PMTs). There are some practical limitations to these devices, including size, performance in a magnetic field, and timing.

The “Large-Area Picosecond Timing Project,” a group primarily from The University of Chicago, Argonne, Fermilab, and Berkeley, is working to develop large (roughly 8 inches on a side), flat, fast, and cheap particle detectors for use in physics, medical, and industrial applications.  The photosensors are designed as microchannel plates (MCPs), but fabricated in a different, cheaper way.  The timing resolution (how well you can tell when your signal arrived) is currently around 100 picoseconds, or about 10 times better than typical large-area PMTs.

Detectors such as these could have real potential in neutrino physics. Many neutrino detectors function by detecting direct Cherenkov light, scintillation light produced in the detection medium, or both. Our goal is always to maximize the detection of this light by having as many sensors as possible, hence we like things cheap.  Knowing exactly when the light reaches the sensitive part of our detector is how we determine where in the detector the interaction took place, hence we like things fast. Knowing exactly where on the wall of our detector the light hit helps constrain the geometry and type of event, hence we like lots of pixels.
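To see why timing matters for position reconstruction, here is a minimal sketch converting a timing resolution into the distance light travels in that time inside the detector medium; it assumes water with refractive index n of about 1.33, and the numbers are illustrative only.

```python
# Convert a photosensor timing resolution into the distance light travels
# in that time inside the detector medium. Assumes water with n ~ 1.33.
c_vac = 29.98        # speed of light in vacuum, cm per nanosecond
n_water = 1.33       # refractive index of water (assumed)
dt = 0.1             # ns, i.e. the ~100 ps resolution quoted above

v = c_vac / n_water  # photon speed in the medium, cm/ns
dx = v * dt          # distance scale matched to the timing resolution
print(f"distance scale ~ {dx:.1f} cm")
```

A 100 ps timing resolution thus corresponds to a few centimeters of photon travel in water, roughly the scale at which the interaction vertex can be localized, versus tens of centimeters for a nanosecond-class PMT.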

As an example, one area where these sensors might be useful is in large water Cherenkov detectors.  A primary background for these detectors comes from neutral pion decays mimicking electrons.  An electron appears as a single fuzzy ring on the wall of the detector, while a neutral pion will decay and appear as two separate electron-like rings.  A major background comes about because sometimes your detector simply cannot distinguish two rings from one.  With highly pixelated and fast photosensors, one could better distinguish these two types of events, either by being more sensitive to seeing two rings or by separating the rings in time.

A great deal of work has been done by this group to understand the potential impact of these devices on the field, and we never get tired of thinking how we could get the most out of our detectors.

EDIT: For more info see here.




Recent Events at UC Davis

Tuesday, November 22nd, 2011

By now, I would imagine anyone tech savvy enough to be following this blog site has been exposed to the chilling stories, images, and videos of the events which took place last week at my university.

Not surprisingly, the actions of the university police, called for by the administration, have greatly enraged the students, alumni, staff, faculty, and friends of UC Davis.  A rally was held yesterday on the main quad to demonstrate, in overwhelming numbers, the disgust and shame felt by our community in light of these events.

In response to these events the undersigned faculty of the UC Davis Physics department have prepared a statement (original here):

Chancellor Linda Katehi                                     November 22, 2011

UC Davis

Dear Chancellor Katehi:

With a heavy heart and substantial deliberation, we the undersigned faculty of the UC Davis physics department send you this letter expressing our lack of confidence in your leadership and calling for your prompt resignation in the wake of the outrageous, unnecessary, and brutal pepper spraying episode on campus Friday, Nov. 18.

The reasons for this are as follows.

•   The demonstrations were nonviolent, and the student encampments posed no threat to the university community. The outcomes of sending in police in Oakland, Berkeley, New York City, Portland, and Seattle should have led you to exhaust all other options before resorting to police action.

•   Authorizing force after a single day of encampments constitutes a gross violation of the UC Davis principles of community, especially the commitment to civility: “We affirm the right of freedom of expression within our community and affirm our commitment to the highest standards of civility and decency towards all.”

•   Your response in the aftermath of these incidents has failed to restore trust in your leadership in the university community.

We have appreciated your leadership during these difficult times on working to maintain and enhance excellence at UC Davis. However, this incident and the inadequacy of your response to it has already irreparably damaged the image of UC Davis and caused the faculty, students, parents, and alumni of UC Davis to lose confidence in your leadership. At this point we feel that the best thing that you can do for this university is to take full responsibility and resign immediately. Our campus community deserves a fresh start.


Andreas Albrecht (chair), Marusa Bradac, Steve Carlip, Hsin-Chia Cheng, Maxwell Chertok, John Conway, Daniel Cox, James P. Crutchfield, Glen Erickson, Chris Fassnacht, Daniel Ferenc, Ching Fong, Giulia Galli, Nemanja Kaloper, Joe Kiskis, Lloyd Knox, Dick Lander, Lori Lubin, Markus Luty, Michael Mulhearn, David Pellett, Wendell Potter, Sergey Savrasov, Richard Scalettar, Robert Svoboda, John Terning, Mani Tripathi, David Webb, David Wittman, Dong Yu, Gergely Zimanyi


I, along with many graduate and undergraduate students, stand with our faculty and appreciate their strong voice on our behalf.


First Double Chooz Neutrino Oscillation Result

Wednesday, November 9th, 2011

Today the Double Chooz collaboration presented our first results, with about 100 days of single-detector data, at the LowNu11 conference at Seoul National University, South Korea.  The presentation was given by our spokesperson, Herve de Kerret, and it was an exciting moment for our entire collaboration. A press release was submitted to interactions.org.

Spokesperson Herve de Kerret presenting "First results from the Double Chooz experiment."



In case a refresher is necessary, our experiment searches for the last unmeasured mixing angle, θ13, in the three-neutrino mixing matrix, via the disappearance of electron antineutrinos (ν̄e) produced by the dual 4.27 GWth Chooz B reactors.  This simply means that if the neutrinos are in fact oscillating, we should measure fewer neutrinos at our detector than what we would have expected the reactors to produce. The formula for this is given by:

P(ν̄e → ν̄e) ≈ 1 − sin²2θ13 · sin²(Δm²31 L / 4E)

where we see that sin²2θ13 sets the amplitude of the oscillation.
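As a quick numerical illustration (not the collaboration's analysis), the survival probability can be evaluated with some assumed round numbers: a ~1050 m far-detector baseline, a typical ~4 MeV reactor antineutrino energy, and Δm²31 ≈ 2.4 × 10⁻³ eV².

```python
import math

def survival_prob(sin2_2theta13, L_m, E_MeV, dm2_eV2=2.4e-3):
    """Two-flavor electron-antineutrino survival probability.
    The factor 1.267 converts eV^2 * m / MeV into radians."""
    phase = 1.267 * dm2_eV2 * L_m / E_MeV
    return 1.0 - sin2_2theta13 * math.sin(phase) ** 2

# Assumed illustrative inputs: ~1050 m baseline, ~4 MeV antineutrino,
# and the best-fit amplitude quoted in this post.
p = survival_prob(0.085, 1050.0, 4.0)
print(f"survival probability ~ {p:.3f}")
```

With these inputs, the expected deficit at the far detector is only a few percent, which is why counting statistics and systematic control matter so much.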

As I stated before, our initial analysis was performed on about 100 days of far-detector-only data.  This graph shows both the data-taking efficiency broken up into subcategories and the total data-taking time.  During the August to September period, colleagues and I were on site to conduct the first radioactive source calibrations of the detector.  The special data acquired during this time helped us understand the detector response and were used to determine certain errors on our measurement.

By combing through the data we can identify neutrino interactions by their unique signature.  The detection reaction is called inverse beta decay, in which the neutrino creates a positron (the antiparticle of the electron) and a neutron.  The detector can measure the positron’s energy and the energy released when the neutron captures on a gadolinium nucleus inside the liquid scintillator.  The double signal is beneficial for reducing random backgrounds since the mean time between the two events is only about 30 microseconds.
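The delayed-coincidence idea can be sketched in a few lines of code. The energy windows and time window below are hypothetical illustrative values, not the actual Double Chooz selection cuts.

```python
def find_ibd_candidates(events, max_dt=150.0):
    """Pair a prompt (positron-like) event with a delayed (neutron-capture-
    like) event occurring within max_dt microseconds. The energy windows
    used here are illustrative only, not the experiment's real cuts."""
    candidates = []
    for i, prompt in enumerate(events):
        if not (0.7 < prompt["E"] < 12.0):   # prompt energy window, MeV
            continue
        for delayed in events[i + 1:]:
            dt = delayed["t"] - prompt["t"]
            if dt > max_dt:
                break                        # events are time-ordered
            if 6.0 < delayed["E"] < 12.0:    # Gd-capture gammas near 8 MeV
                candidates.append((prompt, delayed))
                break
    return candidates

# Toy event stream: times in microseconds, energies in MeV.
events = [
    {"t": 0.0,   "E": 3.1},   # prompt positron-like signal
    {"t": 28.0,  "E": 8.0},   # delayed capture-like signal -> forms a pair
    {"t": 500.0, "E": 1.2},   # lone low-energy event, no partner
]
pairs = find_ibd_candidates(events)
print(len(pairs))  # prints 1
```

Requiring two signals with the right energies and a short time separation is what suppresses random single-event backgrounds so effectively.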

Counting up all of the candidate neutrino events, one can compare the number of detected neutrinos to the number expected based on the total power output of the nuclear power plant, information which is provided to us by the power company.  The following plot shows our detected neutrino rate per day along with the expected rate (blue dashed line); the average rate is 42.6 ± 0.7 neutrinos per day.  We see great evidence that our extracted neutrino candidates are in fact directly correlated to the reactor power, as they should be.


Neutrino Candidate Rates (no background subtraction)

In the detection interaction, the positron will carry away most of the parent neutrino energy.  Since the neutrino energy is relevant for the oscillation probability, studying the “prompt” (positron) energy spectrum shape yields information on the value of θ13. Below is the prompt energy spectrum obtained along with the best fit:

Prompt energy spectrum from 100 days of far detector only.


The data, black dots, are shown as the number of neutrino events for each prompt energy bin.  In blue is the expected distribution if there were no oscillations, and in red is our best fit distribution based on the data.  In green, pink, and blue are the major background distributions.  These backgrounds must be accounted for, and contribute to the systematic error on our best fit value.  The rate + shape analysis gives a best fit value of:

sin²2θ13 = 0.085 ± 0.029 (stat) ± 0.042 (syst)

By itself, our result is not earth-shattering; after all, it is consistent with zero. But this is the first θ13-sensitive reactor neutrino measurement since the original Chooz result over ten years ago!  Both the statistical and systematic uncertainties will improve as we include more data in the analysis, and with the construction of the near detector, our systematic errors will be greatly reduced.  This plot shows the projected sensitivity of Double Chooz into the future:
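For readers curious how "consistent with zero" squares with the quoted numbers, here is a simple sketch combining the statistical and systematic uncertainties in quadrature, a common rough approximation that assumes the two are independent.

```python
import math

best_fit = 0.085           # quoted best-fit value of sin^2(2*theta13)
stat, syst = 0.029, 0.042  # quoted statistical and systematic uncertainties

total = math.hypot(stat, syst)  # quadrature sum (assumes independence)
z = best_fit / total            # rough distance from zero, in sigma
print(f"total uncertainty ~ {total:.3f}, about {z:.1f} sigma from zero")
```

The best fit sits under two standard deviations from zero by this rough accounting, so the result hints at a nonzero θ13 without claiming it.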

Projected sensitivity with single (solid) and dual (dashed) detectors


As my first experience with preparing a physics result, the whole process has been enlightening.  The fervor of our daily (and, for California, very early) meetings over the past few weeks has certainly demonstrated our collaborators’ deep passion for this experiment and this physics. We look forward to more data and a better understanding of neutrino oscillations.


Physics GRE Bootcamp

Thursday, September 22nd, 2011

I recently participated in a weekend program offering free advice to undergraduate students preparing to take the Physics GRE exam.  In case you are not a physicist yourself, the Physics GRE (Graduate Record Examination) is one of those dreadful comprehensive exams that students must take before applying to most US Ph.D. programs.  Personally, I have never been much of a fan of these tests.  Probably because I was never very good at them…  Nevertheless, they are a reality which all students must face.

The event was sponsored by the California Professoriate for Access to Physics Careers (CPAPC), and funded by the UC Davis Office of Graduate Studies.  Attendance was surprisingly large: about 80 undergraduate students from many universities in Northern and Southern California made the trip to Davis.

Me having lunch with some students from Cal State University, Chico.

The workshop began with a practice GRE exam taken under realistic test conditions.  The students then broke out into small groups to discuss the problems.  Rooms were prepared by subject: Classical Mechanics, Quantum Mechanics, Optics/Waves/Thermodynamics, and Special Relativity/Atomic Physics/Laboratory techniques.  In each room a current graduate student served as the topic expert to assist the discussions.  I was assigned to the room covering Special Relativity/Atomic Physics/Laboratory techniques.  In some cases students have limited exposure to these fields and so I experienced a constant barrage of questions, which I suppose is better than having no questions at all.

Student responses after the weekend were very positive.  In most cases the students appreciated the opportunity to begin studying for the exam, and to receive valuable advice from the graduate TAs and faculty about the test and graduate school in general.  Since the weekend, plans have already been put forth for the next bootcamp, and even for possibly expanding the operation to include other universities.


arXiv.org Anniversary

Tuesday, August 16th, 2011

An interesting article about the history of arXiv.org was brought to my attention today.  I always find it interesting to read about historical events in physics, because the stories are often fascinating and yet rarely make it into the textbooks.  Here is the link to the paper, on the arXiv itself, written by its creator.



Working On Site

Thursday, July 21st, 2011

Hello again.  It has been far too long since my last post, sorry, but I have been living in northern France for the past couple of months working to install some equipment onto the Double Chooz neutrino detector.  While the work here has been rigorous and often stressful, the living conditions have been quite pleasant.  Our collaboration is provided housing in a chateau which, as I understand, was refurbished for us by the local government as a sort of incentive to build the experiment here.

Collaboration housing.

As I mentioned in my previous post, the experiment is in the very northern part of central France, very close to the Belgian border.  As such, there is a heavy Belgian influence here, and the local supermarkets have equally impressive assortments of French wine and Belgian beer.  The surrounding landscape is filled with heavily forested hillsides, and the areas along the river Meuse are dotted with small towns and expansive pastures.  The agricultural presence often reminds me of home in central California.

The town of Chooz with the power plant in the background.

Getting to the physics, our collaboration is working very hard to accumulate and analyze data in light of encouraging results from our friends at the T2K experiment in Japan, who recently published results which seem to indicate that the neutrino mixing parameter θ13 could be large.  A large value of θ13 is preferable to many in our field because it has implications for the design of future neutrino experiments.  In a few months we will be able to weigh in on the possibility of a large value for θ13.  As with any experiment in physics, it takes time (meaning data!) to fully understand nature, so here is wishing all experiments good data!

Au revoir


Where in the world?

Thursday, April 14th, 2011

In a little more than a week I will be traveling to France to perform some work on the Double Chooz far detector.   When I tell my friends, family, and even colleagues, more often than not they assume I am going to CERN.  It’s not surprising, seeing as CERN is arguably the central hub for particle physics in Europe, and is at least partly in France.  Double Chooz, however, is not located at CERN.  It is located at a nuclear power plant near the town of Chooz (pronounced like “show,” not “choose”) near the French-Belgian border.

I will be spending 3 months on site for this first trip, and will likely be back later in the year for another stay.  This will be my first extended stay in Europe, and my first time in Belgium and France.  Our collaboration is lucky to have housing near the reactor in the form of a chateau.  It is really nice to not have to deal with finding housing!

This is an exciting time for me, but there is still a lot of work that needs to get done before I leave.  I look forward to sharing my stories of this new experience.


News from Double Chooz

Friday, February 25th, 2011

There was a nice article about the Double Chooz experiment in Symmetry Breaking recently.  It was also featured in today’s Fermilab Today newsletter.  Since I don’t think I have posted an explanation of the experiment, I thought I would just share this link to the article.

Last week the Double Chooz collaboration met in Heidelberg, Germany at the Max Planck Institute for Nuclear Physics.  This was a very exciting meeting because the first detector has been up and running for a few months now, and people have had a chance to look at the data.

This was my first meeting in Europe, and my first time ever in Germany.  The flight from California was arduous, 11 hours, but we did fly non-stop, which at least allowed for the possibility of sleep.  I dozed in and out a little, and then had to give a talk to the collaboration exactly 24 hours after I woke up that day!  It was a long day, but to my surprise I had plenty of energy to give a good talk.  The coffee helped too!

Unfortunately, our trip did not allow for any sightseeing.  There is a really neat castle in Heidelberg that we saw while walking to dinner, I would have liked to take a look inside.  We did have a nice dinner at an old German beer house.  Long wooden tables, family style meal, and large beer steins, what more could you ask for?

Although our detector is pumping out its first data, installation is not entirely complete.  UC Davis is responsible for the fabrication and installation of a glove box (yep, a big box that you stick your hands into) which will allow us to deploy radioactive sources into our detector for calibration.  Basically, we need to introduce a known signal and compare it with what our detector sees.  This allows us to better understand how well we can reconstruct the position and energy of neutrino interactions inside our detector.  Late next month I will go to Chooz to install the glove box and stay for a few months to deploy the sources.  Perhaps my next post will be from France.  Au revoir.