Archive for March, 2014

Here is the first part of a three-part series on supersymmetry, the theory that could take us beyond the Standard Model. In this post I explain what the Standard Model is and show its limits. In the second part, I will introduce supersymmetry and explain how it could fix several major flaws of the Standard Model. Finally, I will review how physicists are trying to discover "superparticles" at CERN's Large Hadron Collider (LHC).

The Standard Model describes the fundamental constituents of matter and the forces that bind them together. The model rests on two very simple ideas: all matter is made of particles, and these particles interact by exchanging other particles associated with the fundamental forces.

The basic grains of matter are fermions, and the force carriers are bosons. The names of these two classes refer to their spin, a measure of their angular momentum. Fermions have spin one-half, while bosons carry integer values, as shown in the diagram below.

[Figure: the particles of the Standard Model]

The grains of matter, the fermions, divide into two families. The lepton family has six members, the electron being the best known. The quark family contains six quarks. Protons and neutrons are built from up and down quarks. These twelve fermions are the only constituents of matter, and each has spin ½.

These particles interact with one another through the fundamental forces. Each force comes with one or more force carriers. The nuclear force comes with the gluon and binds the quarks inside protons and neutrons. The photon is associated with the electromagnetic force. The weak interaction is responsible for radioactivity; it comes with the Z and W bosons. All of these carriers have spin 1.

The point to remember is this: there are grains of matter, the fermions, with spin ½, and force carriers, the bosons, with integer values of spin.

The Standard Model is both remarkably simple and very powerful. It comes, of course, with complex equations that express all of this mathematically. These equations allow theorists to make ultra-precise predictions. Nearly every quantity measured in particle physics laboratories over the past five decades falls spot on the predicted value, within experimental uncertainties.

So what is wrong with the Standard Model? Essentially, one could say that the whole model lacks robustness at higher energies. As long as we observe phenomena at low energy, as we have done so far, everything behaves properly. But as accelerators become ever more powerful, we are about to reach an energy level that has only existed briefly, shortly after the Big Bang. At that energy, the equations of the Standard Model begin to falter.

It is a bit like the laws of mechanics. The motion of a particle travelling at a speed close to that of light cannot be described by the simple laws of Newtonian mechanics; one must turn to the equations of relativity.

Another major problem with the Standard Model: it does not include gravity, one of the four fundamental forces. The model also fails to explain why gravity is so much weaker than the electromagnetic or nuclear forces. For example, a simple little magnet is enough to counteract the gravitational pull of the entire Earth and hold a small object on your fridge.

This enormous gap between the fundamental forces is only one aspect of the "hierarchy problem". The term also refers to the vast range of mass values among the elementary particles. In the table above, the masses are expressed in units of electronvolts (eV), millions of eV (MeV) and even billions of eV (GeV). The electron is thus 3500 times lighter than the tau. The same goes for the quarks: the top quark is 75,000 times heavier than the up and down quarks. Why is there such a wide variety of masses among the constituents of matter? Imagine a Lego set containing bricks of such wildly different sizes!
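A quick back-of-the-envelope check of those ratios (a sketch using rounded Particle Data Group masses; the specific values are my addition, not part of the original post):

```python
# Rough check of the mass hierarchy, using rounded PDG masses in MeV.
# Light-quark masses are scheme-dependent, so the quark ratio is only indicative.
masses_mev = {
    "electron": 0.511,
    "tau": 1777.0,
    "up": 2.2,
    "down": 4.7,
    "top": 173_000.0,
}

print(round(masses_mev["tau"] / masses_mev["electron"]))  # ~3500
print(round(masses_mev["top"] / masses_mev["up"]))        # tens of thousands
```

The exact quark ratio depends on how the light-quark masses are defined, but the spread, from about half an MeV to well over 100 GeV, is what the hierarchy problem refers to.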

The hierarchy problem is also tied to the mass of the Higgs boson. The equations of the Standard Model establish relations between the fundamental particles. For example, in the equations the Higgs boson has a bare mass to which theorists must add a correction for every particle that interacts with the Higgs boson. The heavier the particle, the larger the correction. The top quark, being the heaviest, brings such a large correction to the theoretical Higgs boson mass that it is hard to understand how the measured Higgs mass can be so small.
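Schematically (this is the standard textbook form of the top-loop correction, not an equation from the original post):

\[
\delta m_H^2 \;\approx\; -\frac{3\,y_t^2}{8\pi^2}\,\Lambda^2,
\]

where \(y_t \approx 1\) is the top Yukawa coupling and \(\Lambda\) is the energy scale up to which the Standard Model is assumed to hold. If \(\Lambda\) is pushed toward the Planck scale, this correction exceeds the measured \(m_H^2 \approx (125\ \mathrm{GeV})^2\) by some thirty orders of magnitude, which is why the measured Higgs mass looks so unnaturally small.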

All of this points to the existence of new particles. For example, the corrections to the Higgs mass coming from the top quark could be cancelled by other, hypothetical particles, which would explain why the Higgs boson mass is so small. Supersymmetry predicts precisely such particles, hence its appeal.

Finally, the Standard Model describes only ordinary matter, that is, all the matter we see on Earth and in the galaxies. Yet evidence abounds that the Universe contains five times more "dark matter", a type of matter completely different from the matter we know. Dark matter emits no light but reveals itself through its gravitational effects. Among all the particles of the Standard Model, none has the properties of dark matter. Clearly, the Standard Model gives only an incomplete picture of the content of the Universe. Supersymmetry could solve this problem too.

Pauline Gagnon

To be notified when new blog posts appear, follow me on Twitter: @GagnonPauline, or by e-mail by adding your name to this mailing list.


Despite the old canard about nobody understanding quantum mechanics, physicists do understand it.  With all of the interpretations ever conceived for quantum mechanics[1], this claim may seem a bit of a stretch, but like the proverbial ostrich with its head in the sand, many physicists prefer to claim they do not understand quantum mechanics, rather than just admit that it is what it is and move on.

What is it about quantum mechanics that generates so much controversy and even had Albert Einstein (1879 – 1955) refusing to accept it? There are three points about quantum mechanics that generate controversy. It is probabilistic, eschews realism, and is local. Let us look at these three points in more detail.

  1. Quantum mechanics is probabilistic, not deterministic. Consider a radioactive atom. It is impossible, within the confines of quantum mechanics, to predict when an individual atom will decay. There is no measurement or series of measurements that can be made on a given atom to allow me to predict when it will decay. I can calculate the probability that it will decay by a given time, or the time it takes half of a sample to decay (see the worked relation just after this list), but not the exact time a given atom will decay. This inability to predict exact outcomes, only probabilities, permeates all of quantum mechanics. No possible set of measurements on the initial state of a system allows one to predict precisely the result of all possible experiments on that state.
  2. Quantum mechanics eschews realism[2]. This is a corollary of the first point. A quantum mechanical system does not have well-defined values for properties that have not been directly measured. This has been compared to the moon only existing when someone is looking at it. For deterministic systems one can always safely infer back from a measurement what the system was like before the measurement. Hence if I measure a particle’s position and motion, I can infer not only where it will go but where it has come from. The probabilistic nature of quantum mechanics prevents this backward-looking inference. If I measure the spin of an atom, there is no certainty that it had only that value before the measurement. It is this aspect of quantum mechanics that most disturbs people, but quantum mechanics is what it is.
  3. Quantum mechanics is local. To be precise, no action at point A will have an observable effect at point B that is instantaneous, or non-causal.  Note the word observable. Locality is often denied in an attempt to circumvent Point 2, but when restricted to what is observable, locality holds. Despite the Pentagon’s best efforts, no messages have been sent using quantum non-locality.
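A worked illustration of point 1 (a standard textbook relation, not something from the original post): for a radioactive nucleus with decay constant \(\lambda\), quantum mechanics predicts only probabilities,

\[
P(\text{atom undecayed at time } t) = e^{-\lambda t},
\qquad
t_{1/2} = \frac{\ln 2}{\lambda},
\]

so half of a large sample is gone after one half-life \(t_{1/2}\), yet the instant at which any particular atom decays remains irreducibly random.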


Realism, at least, is a common aspect of the macroscopic world. Even a baby quickly learns that the ball is behind the box even when he cannot see it. But much about the macroscopic world is not obviously deterministic, the weather in Vancouver for example (it is snowing as I write this). Nevertheless, we cling to determinism and realism like a child to his security blanket. It seems to me that determinism or realism, if they exist, would be at least as hard to understand as their lack. There is no theorem that states the universe should be deterministic and not probabilistic or vice versa. Perhaps god, contrary to Einstein’s assertion, does indeed like a good game of craps[3].

So quantum mechanics, at least at the surface level, has features many do not like. What has the response been? They have followed the example set by Philip Gosse (1810 – 1888) with the Omphalos hypothesis[4]. Gosse, being a literal Christian, had trouble with the geological evidence that the world was older than 6,000 years, so he came up with an interpretation of history in which the world was created only 6,000 years ago but in such a manner that it appeared much older. This can be called an interpretation of history because it leaves all predictions for observations intact but changes the internal aspects of the model so that they match his preconceived ideas. To some extent, Tycho Brahe (1546 – 1601) used the same technique to keep the earth at the center of the universe. He had the earth fixed, the sun circling the earth, and the other planets circling the sun. With the information available at the time, this was consistent with all observations.

The general technique is to adjust those aspects of the model that are not constrained by observation to make it conform to one’s ideas of how the universe should behave. In quantum mechanics these efforts are called interpretations. Hugh Everett (1930 – 1982) proposed many worlds in an attempt to make quantum mechanics deterministic and realistic. But it was only in the unobservable parts of the interpretation that this was achieved, and the results of experiments in this world are still unpredictable. Louis de Broglie (1892 – 1987) and later David Bohm (1917 – 1992) introduced pilot waves in an effort to restore realism and determinism. In doing so, they gave up locality. Like Gosse’s work, theirs was a nice proof in principle that, with sufficient ingenuity, the universe can be made to conform to almost any preconceived ideas, or at least appear to do so. Reassuring, I guess, but like Gosse they did it by introducing non-observable aspects to the model: not just unobserved but in principle unobservable. The observable aspects of the universe, at least as far as quantum mechanics is correct, are as stated in the three points above: probabilistic, nonrealistic and local.

Me, I am not convinced that there is anything to understand about quantum mechanics beyond the rules for its use given in standard quantum mechanics text books. However, interpretations of quantum mechanics might, possibly might, suggest different ways to tackle unsolved problems like quantum gravity and they do give one something to discuss after one has had a few beers (or is that a few too many beers).

To receive a notice of future posts follow me on Twitter: @musquod.


[1] See my February 2014 post “Reality and the Interpretations of Quantum Mechanics.”

[2] Realism as defined in the paper by Einstein, Podolsky and Rosen, Physical Review 47 (10): 777–780 (1935).

[3] Or dice.


My Week as a Real Scientist

Thursday, March 6th, 2014

For a week at the end of January, I was a real scientist. Actually, I’m always a real scientist, but only for that week was I tweeting from the @realscientists Twitter account, which has a new scientist each week typing about his or her life and work. I tweeted a lot. I tweeted about the conference I was at. I tweeted about the philosophy of science and religion. I tweeted about how my wife, @CuratorPolly, wasn’t a big fan of me being called the “curator” of the account for the week. I tweeted about airplanes and very possibly bagels. But most of all I tweeted the answers to questions about particle physics and the LHC.

Real Scientists wrote posts for the start and end of my week, and all my tweets for the week are at this Storify page. My regular twitter account, by the way, is @sethzenz.

I was surprised by how many questions people had when they were told that a real physicist at a relatively high-profile Twitter account was open for questions. A lot of the questions had answers that can already be found, often right here on Quantum Diaries! It got me thinking a bit about different ways to communicate to the public about physics. People really seem to value personal interaction, rather than just looking things up, and they interact a lot with an account that they know is tweeting in “real time.” (I almost never do a tweet per minute with my regular account, because I assume it will annoy people, but it’s what people expect stylistically from the @realscientists account.) So maybe we should do special tweet sessions from one of the CERN-related accounts, like @CMSexperiment, where we get four physicists around one computer for an hour and answer questions. (A lot of museums did a similar thing with #AskACurator day last September.) We’ve also discussed the possibility of doing an AMA on Reddit. And the Hangout with CERN series will be starting again soon!

But while you’re waiting for all that, let me tell you a secret: there are lots of physicists on Twitter. (Lists here and here and here, four-part Symmetry Magazine series here and here and here and here.) And I can’t speak for everyone, but an awful lot of us would answer questions if you had any. Anytime. No special events. Just because we like talking about our work. So leave us comments. Tweet at us. Your odds of getting an answer are pretty good.

In other news, Real Scientists is a finalist for the Shorty Award for social media’s best science. We’ll have to wait and see how they — we? — do in a head-to-head matchup with giants like NASA and Neil deGrasse Tyson. But I think it’s clear that people value hearing directly from researchers, and social media seems to give us more and more ways to communicate every year.


Advances in accelerators built for fundamental physics research have inspired improved cancer treatment facilities. But will one of the most promising—a carbon ion treatment facility—be built in the U.S.? Participants at a symposium organized by Brookhaven Lab for the 2014 AAAS meeting explored the science and surrounding issues.

by Karen McNulty Walsh

Accelerator physicists are natural-born problem solvers, finding ever more powerful ways to generate and steer particle beams for research into the mysteries of physics, materials, and matter. And from the very beginning, this field born at the dawn of the atomic age has actively sought ways to apply advanced technologies to tackle more practical problems. At the top of the list—even in those early days—was taking aim at cancer, the second leading cause of death in the U.S. today, affecting one in two men and one in three women.

Using beams of accelerated protons or heavier ions such as carbon, oncologists can deliver cell-killing energy to precisely targeted tumors—and do so without causing extensive damage to surrounding healthy tissue, eliminating the major drawback of conventional radiation therapy using x-rays.

“This is cancer care aimed at curing cancer, not just treating it,” said Ken Peach, a physicist and professor at the Particle Therapy Cancer Research Institute at Oxford University.

Peach was one of six participants in a symposium exploring the latest advances and challenges in this field—and a related press briefing attended by more than 30 science journalists—at the 2014 meeting of the American Association for the Advancement of Science in Chicago on February 16. The session, “Targeting Tumors: Ion Beam Accelerators Take Aim at Cancer,” was organized by the U.S. Department of Energy’s (DOE’s) Brookhaven National Laboratory, an active partner in an effort to build a prototype carbon-ion accelerator for medical research and therapy. Brookhaven Lab is also currently the only place in the U.S. where scientists can conduct fundamental radiobiological studies of how beams of ions heavier than protons, such as carbon ions, affect cells and DNA.

Participants in a symposium and press briefing exploring the latest advances and challenges in particle therapy for cancer at the 2014 AAAS meeting: Eric Colby (U.S. Department of Energy), Jim Deye (National Cancer Institute), Hak Choy (University of Texas Southwestern Medical Center), Kathryn Held (Harvard Medical School and Massachusetts General Hospital), Stephen Peggs (Brookhaven National Laboratory and Stony Brook University), and Ken Peach (Oxford University). (Credit: AAAS)


“We could cure a very high percentage of tumors if we could give sufficiently high doses of radiation, but we can’t because of the damage to healthy tissue,” said radiation biologist Kathryn Held of Harvard Medical School and Massachusetts General Hospital during her presentation. “That’s the advantage of particles. We can tailor the dose to the tumor and limit the amount of damage in the critical surrounding normal tissues.”

Yet despite the promise of this approach and the emergence of encouraging clinical results from carbon treatment facilities in Asia and Europe, there are currently no carbon therapy centers operating in the U.S.

Participants in the Brookhaven-organized session agreed: That situation has to change—especially since the very idea of particle therapy was born in the U.S.

Physicists as pioneers

“When Harvard physicist Robert Wilson, who later became the first director of Fermilab, was asked to explore the potential dangers of proton particle radiation [just after World War II], he flipped the problem on its head and described how proton beams might be extremely useful—as effective killers of cancer cells,” said Stephen Peggs, an accelerator physicist at Brookhaven Lab and adjunct professor at Stony Brook University.

As Peggs explained, the reason is simple: Unlike conventional x-rays, which deposit energy—and cause damage—all along their path as they travel through healthy tissue en route to a tumor (and beyond it), protons and other ions deposit most of their energy where the beam stops. Using magnets, accelerators can steer these charged particles left, right, up, and down and vary the energy of the beam to precisely place the cell-killing energy right where it’s needed: in the tumor.
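A toy numerical sketch of that contrast (purely illustrative; the 1/E stopping-power model and all the numbers here are my assumptions, not from the article): if a slowing ion loses energy at a rate inversely proportional to its remaining energy, the energy deposited per step stays small along most of the path and spikes where the particle stops, the Bragg peak that makes ion therapy so precise.

```python
# Toy Bragg-peak model: stopping power dE/dx ~ k/E, so the
# energy deposited per step peaks sharply where the ion stops.
E0, k, dx = 100.0, 50.0, 0.01   # initial energy, constant, step size (arbitrary units)

E, depth = E0, 0.0
dose = []                        # (depth, energy deposited in this step)
while E > 0:
    dE = min(E, k / E * dx)      # toy stopping power, capped at remaining energy
    dose.append((depth, dE))
    E -= dE
    depth += dx

peak_depth, peak_dE = max(dose, key=lambda d: d[1])
print(f"range ~ {depth:.1f}, peak dose {peak_dE:.2f} at depth {peak_depth:.1f}")
```

In this toy model the analytic range is E0²/(2k) = 100 units, and nearly all of the dose lands in the last few steps, exactly the behavior that lets oncologists park the beam’s energy inside a tumor.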

The first implementation of particle therapy used helium and other ions generated by the Bevatron at Berkeley Lab. Those spin-off studies “established a foundation for all subsequent ion therapy,” Peggs said. And as accelerators for physics research grew in size, pioneering experiments in particle therapy continued, operating “parasitically” until the very first accelerator built for hospital-based proton therapy was completed with the help of DOE scientists at Fermilab in 1990.

But even before that machine left Illinois for Loma Linda University Medical Center in California, physicists were thinking about how it could be made better. The mantra of making machines smaller, faster, cheaper—and capable of accelerating more kinds of ions—has driven the field since then.

Advances in magnet technology, including compact superconducting magnets and beam-delivery systems developed at Brookhaven Lab, hold great promise for new machines. Peggs is working to incorporate these technologies in a prototype ‘ion Rapid Cycling Medical Synchrotron’ (iRCMS) capable of delivering protons and/or carbon ions for radiobiology research and for treating patients.

Brookhaven Lab accelerator physicist Stephen Peggs with magnet technology that could reduce the size of particle accelerators needed to steer heavy ion beams and deliver cell-killing energy to precisely targeted tumors while sparing surrounding healthy tissue.


Small machine, big particle impact

The benefits of using charged particles heavier than protons (e.g., carbon ions) stem not only from their physical properties—they stop and deposit their energy over an even smaller and better targeted tumor volume than protons—but also from a range of biological advantages they have over x-rays.

As Kathryn Held elaborated in her talk, compared with x-ray photons, “carbon ions are much more effective at killing tumor cells. They put a huge hole through DNA compared to the small pinprick caused by x-rays, which causes clustered or complex DNA damage that is less accurately repaired between treatments—less repaired, period—and thus more lethal [to the tumor].” Carbon ions also appear to be more effective than x-rays at killing oxygen-deprived tumor cells, and might be most effective in fewer higher doses, “but we need more basic biological studies to really understand these effects,” Held said.

Different types of radiation treatment cause different kinds of damage to the DNA in a tumor cell. X-ray photons (top arrow) cause fairly simple damage (purple area) that cancer cells can sometimes repair between treatments. Charged particles—particularly ions heavier than protons (bottom arrow)—cause more and more complex forms of damage, resulting in less repair and a more lethal effect on the tumor. (Credit: NASA)


Held conducts research at the NASA Space Radiation Laboratory (NSRL) at Brookhaven Lab, an accelerator-based facility designed to fully understand risks and design protections for future astronauts exposed to radiation. Much of that research is relevant to understanding the mechanisms and basic radiobiological responses that apply to the treatment of cancer. But additional facilities and funding are needed for research specifically aimed at understanding the radiobiological effects of heavier ions for potential cancer therapies, Held emphasized.

Hak Choy, a radiation oncologist and chair in the Department of Radiation Oncology at the University of Texas Southwestern Medical Center, presented compelling clinical data on the benefits of proton particle therapy, including improved outcomes and reduced side effects when compared with conventional radiation, particularly for treating tumors in sensitive areas such as the brain and spine and in children. “When you can target the tumor and spare critical tissue you get fewer side effects,” he said.

Data from Japan and Europe suggest that carbon ions could be three or four times more biologically potent than protons, Choy said, backing that claim with impressive survival statistics for certain types of cancers where carbon therapy surpassed protons, and was even better than surgery for one type of salivary gland cancer. “And carbon therapy is noninvasive,” he emphasized.

To learn more about this promising technology and the challenges of building a carbon ion treatment/research facility in the U.S., including perspectives from the National Cancer Institute, DOE and a discussion about economics, read the full summary of the AAAS symposium here: http://www.bnl.gov/newsroom/news.php?a=24672.

Karen McNulty Walsh is a science writer in the Media & Communications Office at Brookhaven National Laboratory.



This article appeared in symmetry on February 28, 2014.

The Cryogenic Dark Matter Search has set more stringent limits on light dark matter.


Scientists looking for dark matter face a serious challenge: No one knows what dark matter particles look like. So their search covers a wide range of possible traits—different masses, different probabilities of interacting with regular matter.

Today, scientists on the Cryogenic Dark Matter Search experiment, or CDMS, announced they have shifted the border of this search down to a dark-matter particle mass and rate of interaction that has never been probed.

“We’re pushing CDMS to as low mass as we can,” says Fermilab physicist Dan Bauer, the project manager for CDMS. “We’re proving the particle detector technology here.”

Their result, which does not claim any hints of dark matter particles, contradicts a result announced in January by another dark matter experiment, CoGeNT, which uses particle detectors made of germanium, the same material used by CDMS.

To search for dark matter, CDMS scientists cool their detectors to very low temperatures in order to detect the very small energies deposited by collisions of dark matter particles with the germanium. They operate their detectors half a mile underground in a former iron ore mine in northern Minnesota. The mine provides shielding from cosmic rays that could otherwise clutter the detector as it waits for passing dark matter particles.

Today’s result carves out interesting new dark matter territory for masses below 6 billion electronvolts. The dark matter experiment Large Underground Xenon, or LUX, recently ruled out a wide range of masses and interaction rates above that with the announcement of its first result in October 2013.

Scientists have expressed increasing interest of late in the search for low-mass dark matter particles, with CDMS and three other experiments—DAMA, CoGeNT and CRESST—all finding their data compatible with the existence of dark matter particles between 5 billion and 20 billion electronvolts. But such light dark-matter particles are hard to pin down. The lower the mass of the dark-matter particles, the less energy they leave in detectors, and the more likely it is that background noise will drown out any signals.
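The kinematics behind that last point (a standard direct-detection relation, not one quoted in the article): a dark matter particle of mass \(m_\chi\) scattering elastically off a nucleus of mass \(m_N\) at velocity \(v\) can deposit at most

\[
E_R^{\max} = \frac{2\,\mu^2 v^2}{m_N},
\qquad
\mu = \frac{m_\chi m_N}{m_\chi + m_N},
\]

so for \(m_\chi \ll m_N\) the recoil energy scales roughly as \(m_\chi^2\) and quickly sinks below detector thresholds, which is why light dark matter is so much harder to see.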

Even more confounding is the fact that scientists don’t know whether dark matter particles interact in the same way in detectors built with different materials. In addition to germanium, scientists use argon, xenon, silicon and other materials to search for dark matter in more than a dozen experiments around the world.

“It’s important to look in as many materials as possible to try to understand whether dark matter interacts in this more complicated way,” says Adam Anderson, a graduate student at MIT who worked on the latest CDMS analysis as part of his thesis. “Some materials might have very weak interactions. If you only picked one, you might miss it.”

Scientists around the world seem to be taking that advice, building different types of detectors and constantly improving their methods.

“Progress is extremely fast,” Anderson says. “The sensitivity of these experiments is increasing by an order of magnitude every few years.”

Kathryn Jepsen


Dear Google: Hire us!

Monday, March 3rd, 2014

In case you haven’t figured it out already from reading the US LHC blog or any of the others at Quantum Diaries, people who do research in particle physics feel passionate about their work. There is so much to be passionate about! There are challenging intellectual issues, tricky technical problems, and cutting-edge instrumentation to work with — all in pursuit of understanding the nature of the universe at its most fundamental level. Your work can lead to global attention and even Nobel Prizes. It’s a lot of effort put in over long days and nights, but there is also a lot of satisfaction to be gained from our accomplishments.

That being said, a fundamental truth about our field is that not everyone doing particle-physics research will be doing that for their entire career. There are fewer permanent jobs in the field than there are people who are qualified to hold them. It is certainly easy to do the math about university jobs in particular — each professor may supervise a large number of PhD students over his or her career, but only one of them could possibly inherit that position in the end. Most of our researchers will end up working in other fields, quite likely in the for-profit sector, and as a field we need to make sure that they are well-prepared for jobs in that part of the world.

I’ve always believed that we do a good job of this, but my belief was reinforced by a recent column by Tom Friedman in The New York Times. It was based around an interview with the Google staff member who oversees hiring for the company. The essay describes the attributes that Google looks for in new employees, and I couldn’t help but think that people who work on large experimental particle physics projects such as those at the LHC have all of those attributes. Google is not just looking for technical skills — it goes without saying that they are, and that particle physicists have those skills and great experience with digesting large amounts of computerized data. Google also looks for social and personality traits that are just as important for success in particle physics.

(Side note: I don’t support all of what Friedman writes in his essay; he is somewhat dismissive of the utility of a college education, and as a university professor I think that we are doing better than he suggests. But I will focus on some of his other points here. I also recognize that it is perhaps too easy for me to write about careers outside the field when I personally hold a permanent job in particle physics, but believe me that it just as easily could have wound up differently for me.)

For example, just reading from the Friedman column, one thing Google looks for is what is referred to as “emergent leadership”. This is not leadership in the form of holding a position with a particular title, but seeing when a group needs you to step forward to lead on something, and also when to step back and let someone else lead. While the big particle-physics collaborations appear to be massive organizations, much of the day-to-day work, such as the development of a physics measurement, is done in smaller groups that function very organically. When they function well, people do step up to take on the most critical tasks, especially when they see that they are particularly positioned to do them. Everyone figures out how to interact in such a way that the job gets done. Another facet of this is ownership: everyone who is working together on a project feels personally responsible for it and will do what is right for the group, if not the entire experiment — even if it means putting aside your own ideas and efforts when someone else clearly has the better idea.

And related to that in turn is what is referred to in the column as “intellectual humility.” We are all very aggressive in making our arguments based on the facts that we have in hand. We look at the data and we draw conclusions, and we develop and promote research techniques that appear to be effective. But when presented with new information that demonstrates that the previous arguments are invalid, we happily drop what we had been pursuing and move on to the next thing. That’s how all of science works, really; all of your theories are only as good as the evidence that supports them, and are worthless in the face of contradictory evidence. Google wants people who take this kind of approach to their work.

I don’t think you have to be Google to be looking for the same qualities in your co-workers. If you are an employer who wants staff members who are smart, technically skilled, passionate about what they do, able to incorporate disparate pieces of information and generate new ideas, ready to take charge when they need to, invested in the entire enterprise, and able to say they are wrong when they are wrong — you should be hiring particle physicists.
