Paperback (First Edition)
Overview
The phenomenon that Einstein thought too spooky and strange to be true
What is entanglement? It's a connection between quantum particles, the building blocks of the universe. Once two particles are entangled, a change to one of them is reflected--instantly--in the other, be they in the same lab or light-years apart. So counterintuitive are this phenomenon and its implications that Einstein himself called it "spooky" and thought it would lead to the downfall of quantum theory. Yet scientists have since shown that quantum entanglement, the "God Effect," is real--making its dismissal one of Einstein's few, and perhaps greatest, mistakes.
What does it mean? The possibilities offered by a fuller understanding of the nature of entanglement read like something out of science fiction: communications devices that could span the stars, codes that cannot be broken, computers that dwarf today's machines in speed and power, teleportation, and more.
In The God Effect, veteran science writer Brian Clegg has written an exceptionally readable, fascinating (and equation-free) account of entanglement, its history, and its applications. Fans of Brian Greene and Amir Aczel, and anyone interested in the marvelous possibilities coming down the quantum physics road, will find much here to marvel at and delight in.
Product Details
ISBN-13: 9780312555306
Publisher: St. Martin's Press
Publication date: 07/21/2009
Edition description: First Edition
Pages: 288
Sales rank: 268,942
Product dimensions: 5.50(w) x 8.32(h) x 0.76(d)
About the Author
Brian Clegg is the author of A Brief History of Infinity, The First Scientist: A Life of Roger Bacon, and Light Years: The Extraordinary Story of Mankind's Fascination with Light. He holds a physics degree from Cambridge and has written regular columns, features, and reviews for numerous magazines. His books have been translated into ten languages. He lives in Wiltshire, England, with his wife and two children.
Read an Excerpt
The God Effect
CHAPTER ONE
ENTANGLEMENT BEGINS
Laws are generally found to be nets of such a texture, as the little creep through, the great break through, and the middle-sized are alone entangled in.
--WILLIAM SHENSTONE, Essays on Men, Manners, and Things
Entanglement. It's a word that is ripe with implications. It brings to mind a kitten tied up in an unraveled ball of wool, or the complex personal relationship between two human beings. In physics, though, it refers to a very specific and strange concept, an idea so bizarre, so fundamental, and so far reaching that I have called it the God Effect. Once two particles become entangled, it doesn't matter where those particles are; they retain an immediate and powerful connection that can be harnessed to perform seemingly impossible tasks.
The word "quantum" needs a little demystifying to be used safely. It does nothing more than establish that we are dealing with "quanta," the tiny packets of energy and matter that are the building blocks of reality. A quantum is usually a very small speck of something, a uniform building block normally found in vast numbers, whether it's a photon of light, an atom of matter, or a subatomic particle like an electron.
Dealing in quanta implies that we are working with something that comes in measured packages, fixed amounts, rather than delivered as a continuously variable quantity. In effect, the difference between something that is quantized and something continuous is similar to the difference between digital information, based on quanta of 0s and 1s, and analog information that can take any value. In the physical world, a quantum is usually a very small unit, just as a quantum leap is a very small change--quite different from its implications in everyday speech.
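To make the digital analogy concrete, here is a minimal sketch (an editor's illustration under simple assumptions, not anything from the book) of the difference between a continuous value and a quantized one:

```python
# Analog vs. quantized, in miniature (illustrative only):
# a continuous value can be anything; a quantized value must sit
# on whole multiples of a fixed packet size -- the "quantum."

def quantize(value, quantum):
    """Snap a continuous value to the nearest whole multiple of a quantum."""
    return round(value / quantum) * quantum

print(quantize(3.7341, 0.5))   # -> 3.5  (only multiples of 0.5 allowed)
print(quantize(3.7341, 0.25))  # -> 3.75 (a finer quantum, but still a grid)
```

The analog world can hand you 3.7341; a quantized world can only hand you the nearest rung on the ladder.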
The phenomenon at the heart of this book is a linkage between the incomprehensibly small particles that make up the world around us. At this quantum level, it is possible to link particles together so completely that the linked objects (photons, electrons, and atoms, for instance) become, to all intents and purposes, part of the same thing. Even if these entangled particles are then separated to opposite sides of the universe, they retain this strange connection. Make a change to one particle, and that change is instantly reflected in the other(s)--however far apart they may be. The God Effect has an unsettling omnipresence.
This unbounded linkage permits the remarkable applications of quantum entanglement that are being developed. It enables the distribution of a secret key for data encryption that is impossible to intercept. It plays a fundamental role in the operation of a quantum computer--a computer where each bit is an individual subatomic particle, capable of calculations that are beyond any conventional computer, even if the program ran for the whole lifetime of the universe. And entanglement makes it possible to transfer a particle, and potentially an object, from one place to another without passing through the space in between.
This counterintuitive ability of entanglement to provide an intimate link between two particles at a distance seems just as odd to physicists as it does to the rest of us. Albert Einstein, who was directly responsible for the origins of quantum theory that made entanglement inevitable, was never comfortable with the way entanglement acts at a distance, without anything connecting the entangled particles. He referred to the ability of quantum theory to ignore spatial separation as "spukhafte Fernwirkungen," literally spooky or ghostly distant actions, in a letter written to fellow scientist Max Born:
I cannot make a case for my attitude in physics which you would consider reasonable ... I cannot seriously believe in [quantum theory] because the theory cannot be reconciled with the idea that physics should represent a reality in time and space, free from spooky actions at a distance.
Entanglement, as a word, seems to have entered the language of physics at the hand of the scientist Erwin Schrödinger, in an article in the Proceedings of the Cambridge Philosophical Society. Interestingly, although a native German speaker, Schrödinger was working and writing in English at the time--and this may have inspired his use of "entanglement"--for the German word for the phenomenon, Verschränkung, has a rather different meaning than his word choice in English.
The English term has subtly negative connotations. It gives a sense of being out of control and messed up. But the German word is more structured and neutral--it is about enfolding, crossing over in an orderly manner. A piece of string that is knotted and messed up is entangled, whereas a carefully woven tapestry has Verschränkung. In practice, neither word seems ideal. Quantum entanglement may lack the disorder implied by "entanglement," but it is much stronger and more fundamental than the pallid Verschränkung seems to suggest.
For Einstein, the prediction that entanglement should exist was a clear indicator of the lack of sense in quantum theory. The idea of entanglement was anathema to Einstein, a challenge to his view of what "reality" truly consisted of. And this was all because entanglement seemed to defy the concept of locality.
Locality. It's the kind of principle that is so obvious we usually assume it without even being aware of it. If we want to act upon something that isn't directly connected to us--to give it a push, to pass a piece of information to it, or whatever--we need to get something from us to the object we wish to act upon. Often this "something" involves direct contact--I reach over and pick up my coffee cup to get it moving toward my mouth. But if we want to act on something at a distance without crossing the gap that separates us from that something, we need to send an intermediary from one place to the other.
Imagine that you are throwing stones at a can that's perched on a fence. If you want to knock the can off, you can't just look at it and make it jump into the air by some sort of mystical influence; you have to throw a stone at it. Your hand pushes the stone, the stone travels through the air and hits the can; as long as your aim is good (and the can isn't wedged in place), the can falls off and you smile smugly.
Similarly, if I want to speak to someone across the other side of a room, my vocal cords vibrate, pushing against the nearest air molecules. These send a train of sound waves through the air, rippling molecules across the gap, until finally those vibrations get to the other person's ear, start her eardrum vibrating, and result in my voice being heard. In the first case, the stone was the intermediary; in the second, the sound wave, but in both cases something physically traveled from A to B. This need for travel--travel that takes time--is what locality is all about. It says that you can't act on a remote object without some such intermediary.
All the evidence is that we are programmed from birth to find the ability to influence objects at a distance unnatural. Research on babies has shown that they don't accept action at a distance, believing that there needs to be contact between two objects to allow one to act on the other.
This seems an extravagant assertion. After all, babies are hardly capable of telling us that this is what they think, and no one can remember how they saw the world in their first few months of life. The research technique that gets around this problem is delightfully cunning: babies are first bored by constant repetition of a particular scene; then, after many repeats, some small aspect of the scene is changed. The babies are watched to see how they react. If the new movement involves action with visible contact, the babies get less worked up than if it appears to involve action at a distance. If a hand pushes a toy and it moves, the baby doesn't react; if a toy moves on its own, the baby does a double take. The inference that babies don't like the ability to act remotely is indirect, but the monitoring does appear to display babies' concern about action at a distance--the whole business feels unnatural.
Next time you are watching a magician at work, doing a trick where he manipulates an object at a distance, try to monitor your own reaction. As the magician's hand moves, so does the ball (or whatever the object he is controlling happens to be). Your mind rebels against the sight. You know that there has to be a trick. There has to be something linking the action of the hand and the movement of the object, whether directly--say, with a very thin wire--or indirectly, perhaps by a hidden person moving the object while watching the magician's hand. Your brain is entirely convinced that action at a distance is not real.
However, though action at a distance looks unreal, this doesn't rule out the possibility of its truly happening. We are used to having to overcome appearances, to take a step away from what looks natural, given extra knowledge. From an early age (unlike dogs and cats) we know that there aren't really little men behind the TV screen. Similarly, a modern child will have been taught about gravity, which itself gives the appearance of action at a distance. We know gravity works from a great range, yet there is no obvious linkage between the two bodies that are attracted to each other. Gravitation seems to offer a prime challenge to the concept of locality.
This idea of gravitational attraction emerged with the Newtonian view of the world, but even as far back as the ancient Greeks, before any idea of gravity existed, there was awareness of other apparent actions at a distance. Amber rubbed with a cloth attracts lightweight objects, such as fragments of paper, toward it. Lodestones, natural magnets, attract metal and spin around, when set on a cork to float on water, until they are pointing in a particular direction. In each case, the action has no obvious linkage to make it work. The attracted object moves toward the magnet--the floating lodestone spins and the static-charged amber summons its retinue of paper scraps as if by magic.
The Greeks had competing schools of thought on what might be happening. One group, the atomists, believed that everything was either atom or void--and, as nothing could act across a void, there had to be a continuous chain of atoms that linked cause to effect. Other Greek philosophers put action at a distance down to a sympathetic process--that some materials were inherently attracted to each other as one person attracts another. This was little more than a variant on the third possibility open to the Greek mind--supernatural intervention. In effect, this theory said there was something out there that provided an occult nudge to make things happen. This idea was widely respected in ancient times as the mechanism of the long-lasting if scientifically unsupportable concept of astrology, in which supernatural influence by the planets was thought to shape our lives.
Even though, nearly two thousand years later, Newton was able to exhibit pure genius in his description of what happened as a result of one apparent action at a distance--gravity--he was no better than the Greeks in explaining how one mass influenced another without anything connecting them. In his masterpiece, the Principia Mathematica, published in 1687, he said:
Hitherto, we have explained the phenomena of the heavens and of our sea by the power of gravity, but have not yet assigned the cause of this power. This is certain, that it must proceed from a cause that penetrates to the very centres of the sun and planets, without suffering the least diminution of its force; that operates not according to the quantity of the surfaces of the particles upon which it acts (as mechanical causes used to do) but according to the quantity of the solid matter which they contain, and propagates its virtue on all sides to immense distances ...
I have not been able to discover the cause of those properties of gravity from the phenomena, and I frame no hypothesis; for whatever is not deduced from the phenomena is to be called an hypothesis; and hypotheses, whether metaphysical or physical, whether of occult qualities or mechanical, have no place in experimental philosophy. In this philosophy particular propositions are inferred from the phenomena, and afterward rendered general by deduction ... And to us it is enough that gravity does really exist, and acts according to the laws which we have explained, and abundantly serves to account for all the motions of the celestial bodies, and of our sea.
This quote contains one of Newton's best-known lines, "I frame no hypothesis" ("hypotheses non fingo" in his original Latin). The modern translation of Principia, by Cohen and Whitman, points out that fingo was a derogatory term, implying making something up rather than the apparently neutral "frame." Newton was saying that gravity exists, but he wasn't going to provide a nonempirical guess at how it works. Some would continue to believe that gravity had some occult mechanism, on a par with astrology, but mostly the workings of gravity were swept under the carpet until Einstein came along.
One fundamental that came out of Einstein's work was that nothing could travel faster than light. We will revisit the reasoning behind this (and the implications of breaking Einstein's limit) in chapter 5. For the moment, though, relativity sounded the death knell for action at a distance. It had been known since 1676, when the Danish astronomer Ole Roemer made the first effective determination of its velocity (now set at around 186,000 miles per second), that light traveled at a finite speed. Einstein showed that action could not escape this constraint. Nothing, not even gravity, could travel faster than the speed of light. It was the ultimate limit.
We still don't know exactly how gravity works, but Einstein's limit was finally proved experimentally at the beginning of the twenty-first century--gravity does travel at the speed of light. If the sun suddenly vanished, just as we wouldn't see that it had disappeared for about eight minutes, we also wouldn't feel the catastrophic impact of the loss of its gravitational pull until then. Locality reigns.
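For the record, the eight-minute figure follows directly from the speed just quoted and the average Earth-Sun distance of about 93 million miles:

$$ t = \frac{d}{c} \approx \frac{93{,}000{,}000 \text{ miles}}{186{,}000 \text{ miles/second}} = 500 \text{ seconds} \approx 8.3 \text{ minutes} $$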
Or at least that seemed to be the case, until experiments based on the work of an obscure physicist from Northern Ireland, John Bell, proved the existence of entanglement. Entanglement is genuine action at a distance, something that even now troubles many scientists. Of course, today we have a more sophisticated view of the universe--and have to face up to the fact that the concept of "distance" itself is perhaps not as clear and obvious as it once was. Theorist Berndt Müller, of Duke University, has suggested that the quantum world has an extra unseen dimension through which apparently spatially separated objects can communicate as if they were side by side. Others imagine spatial separation to be invisible--in effect, nonexistent--to entangled particles. Even so, there is a powerful reluctance to allow that anything, however insubstantial and unable to carry information, could travel faster than light.
Although Einstein's objections to quantum theory based on its dependence on probability are frequently repeated (usually in one of several quotes about God not throwing dice), it was the breach of locality that really seemed to wound Einstein's sense of what was right. This is never more obvious than in a series of sharp handwritten remarks Einstein appended to the text draft of an article his friend Max Born had sent to him for comment:
The whole thing is rather sloppily thought out, and for this I must respectfully clip your ear ... whatever we regard as existing (real) should somehow be localized in time and space ... [otherwise] one has to assume that the physically real in [position] B suffers a sudden change as a result of a measurement in [position] A. My instinct for physics bristles at this. However, if one abandons the assumption that what exists in different parts of space has its own, independent, real existence then I simply cannot see what it is that physics is meant to describe.
The phenomenon that challenges locality, that makes action at a distance a possibility once more, the phenomenon of entanglement, emerges from quantum theory, the modern science of the very small. To reach the conception of entanglement, we need to trace quantum theory's development from a useful fudge to fix a puzzling phenomenon, to a wide-ranging structure that would undermine all of classical physics.
Max Planck, a scientist with roots firmly in the nineteenth century, started it all in an attempt to find a practical solution to an otherwise intractable problem. Planck, born in Kiel, Germany, in 1858, was almost put off physics by his professor at the University of Munich, Philipp von Jolly. Von Jolly held the downbeat view that physics was a dead-end career for a young man. According to von Jolly, pretty well everything that happened in the world, with a couple of minor exceptions, was perfectly explained by the physical theories of the day, and there was nothing left to do but polish up the results and add a few decimal places. Planck could have been tempted to build on his musical capabilities and become a concert pianist, but instead he stuck with physics.
It's a good thing he did. Von Jolly could not have been more wrong, and it was one of those "minor exceptions," the dramatically named ultraviolet catastrophe, that began the process of undermining almost all of von Jolly's "near perfect" physics, and that would elevate Planck to the pantheon of the greats. As far as the best calculations of the day could determine, a blackbody (a typical physicist's simplification: an object that is a perfect absorber and emitter of radiation) should emit radiation at every frequency, with more and more output in the higher frequencies of light, producing in total an infinite blast of energy. This clearly wasn't true. Objects at room temperature only gave off a bit of infrared, rather than glowing with an explosion of blue, ultraviolet, and higher-power radiation.
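For readers who want to see where the infinite blast comes from, the classical prediction (the Rayleigh-Jeans law, quoted here in modern notation as an illustration; the book itself stays equation-free) has the energy density growing with the square of the frequency ν, so the total over all frequencies diverges. Planck's quanta, described next, suppress the high frequencies and keep the total finite:

$$ u_{\text{classical}}(\nu, T) = \frac{8\pi \nu^2}{c^3} k_B T, \qquad \int_0^\infty u_{\text{classical}} \, d\nu = \infty $$

$$ u_{\text{Planck}}(\nu, T) = \frac{8\pi h \nu^3}{c^3} \, \frac{1}{e^{h\nu / k_B T} - 1}, \qquad \int_0^\infty u_{\text{Planck}} \, d\nu \ \text{finite} $$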
In 1900, Planck got around this seemingly impossible situation by dividing the possible emissions or absorptions of electromagnetic energy by physical matter into fixed units (quanta, as Einstein would soon call them). This was something that Max Planck would never truly be comfortable with. He wrote:
The whole procedure was an act of despair because a theoretical interpretation had to be found at any price, no matter how high that may be.
For Planck, these "quanta" were not real. They were a vehicle to help him achieve a workable solution, a working method that had no direct connection to true physical entities. His quanta were, as far as he was concerned, an imaginary conceit. He considered them to be like numbers, when compared with physical objects. The number three (as opposed to "3," which is the symbol for the number three) isn't real. I can't show you three. I can't draw three, or weigh three. But I can show you three oranges--and the number proves very valuable when I want to make calculations concerning oranges. Similarly, Planck believed that quanta did not exist but made a valuable contribution to calculations on the energy of light and other forms of electromagnetic radiation.
There is an interesting parallel between Planck's attitude to quanta and the anonymous preface that was added to De Revolutionibus, the great work in which Nicholas Copernicus challenged the idea that the Sun traveled around the Earth. This tacked-on text, probably written by Andreas Osiander, the clergyman who supervised publication for the ailing Copernicus, is an introduction that dismisses the sun-centered theory of the book as a convenience for undertaking calculations that need bear no resemblance to reality. This was very similar to Planck's view of quanta.
Einstein, born twenty-one years later than Planck, was less fussy about detaching quanta from the real world. In a remarkable paper written in 1905 (the paper for which he later won his Nobel Prize), he suggested that light was actually made up of these quanta. Instead of its being continuous waves, he imagined it to be divided into minute packets of energy. Just how revolutionary Einstein's vision would be isn't clear from the title of the paper, Über einen die Erzeugung und Verwandlung des Lichtes betreffenden heuristischen Gesichtspunkt (On a Heuristic Viewpoint of the Creation and Transformation of Light). But revolutionary it was. Because, if there was one thing everyone was certain about--or at least everyone was certain about until Einstein changed everything--it was that light was a wave.
To be fair, Isaac Newton had always thought that light was made of particles, and his idea had kept going longer than it might otherwise because of the sheer momentum of the Newton name, but by the start of the twentieth century there was no contest between particles and waves. Not only did light exhibit behavior that made it a natural candidate for being a wave--bending around obstructions as the sea does around a breakwater, for instance--but Thomas Young had shown in a beautifully simple experiment in 1801 that light could produce interference patterns when passed through a pair of narrow slits. The mingled beams threw shadings of light and dark onto a screen, corresponding to the addition and subtraction of the ripples in the wave, just as waves did on the surface of water. No other explanation seemed capable of explaining light's behavior.
There was no way, for example, that the scientists of the time could imagine these interference patterns being developed by a series of particles. A particle had to follow a single path from source to screen. Passing a stream of particles through a pair of slits should result in two bright areas (one behind each slit) and large swathes of darkness, not the repeating dark and light patterns that everyone from Young onward could clearly see when they carried out the experiment.
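To see why the wave picture won, here is a rough numerical sketch (ideal point slits and the standard textbook formula; the specific numbers are illustrative assumptions, not the book's). The wave prediction oscillates between bright and dark across the screen; a stream of classical particles would instead pile up in just two bands, one behind each slit.

```python
import math

# Ideal two-slit interference on a distant screen.
# Wave picture: the two amplitudes add, so the relative intensity is
# I(theta) = cos^2(pi * d * sin(theta) / wavelength).
wavelength = 500e-9  # metres (green light, an assumed value)
d = 10e-6            # metres (assumed slit separation)

for i in range(-10, 11):
    theta = i * 0.01  # viewing angle in radians, near the centre line
    intensity = math.cos(math.pi * d * math.sin(theta) / wavelength) ** 2
    print(f"{theta:+.2f} rad  " + "#" * int(20 * intensity))
```

Running it prints alternating full and empty bars--Young's light and dark fringes in ASCII.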
In his paper on light, Einstein not only worked with Planck's quanta but showed that the radiation in a blackbody cavity behaved just like a gas of particles--he could apply the same statistical techniques that he had already successfully applied to gases. What's more, if light truly were made up of individual quanta rather than continuous waves, Einstein predicted it should be possible to generate a small electrical current when light was shone on certain metals, something that was suspected but had yet to be fully proved. This photoelectric effect really clinched the paper's significance.
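The quantitative heart of that prediction can be written in one line (standard modern notation, added for illustration: h is Planck's constant, ν the light's frequency, and φ the minimum energy needed to free an electron from the particular metal):

$$ E_{\text{kin}}^{\text{max}} = h\nu - \phi $$

A single quantum either carries enough energy to kick an electron loose or it doesn't; making too-low-frequency light brighter achieves nothing, which is just what a pure wave picture struggles to explain.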
Planck was no enthusiast for this promotion of his imaginary concept toward reality, and went so far as to criticize Einstein in a deeply condescending fashion. When Planck recommended the younger man for the Prussian Academy of Sciences in 1913, he asked that they wouldn't hold it against Einstein that he sometimes "missed the target in his speculations, as for example, in his theory of light quanta ..."
The same year that Planck made this remark, Einstein's idea would be absorbed and amplified by the man who later became Einstein's chief sparring partner over quantum theory, and particularly over quantum entanglement, Niels Bohr.
Bohr, born in Copenhagen, Denmark, in 1885 (and so six years younger than Einstein), came from a family with deeply academic roots. His father was a professor of physiology and his younger brother became a math professor. After earning a doctorate in Copenhagen, Bohr had traveled to England, first to Cambridge and then to work in the northern industrial city of Manchester with the New Zealand-born physicist Ernest Rutherford, who had come to fame by discovering the atomic nucleus.
In 1913, Bohr devised a model of the atom's structure that relied on Einstein's quanta to explain its workings. His idea was to consider the atom, with its tiny but heavy central nucleus surrounded by much smaller electrons, as if it were a sun with its attendant planets in orbit. Although this model of the atom would be discarded relatively quickly in the scientific world, it became very popular with the general public, particularly in the 1950s, when symbols of a nucleus with electrons whirling around it proliferated endlessly. Even now, children are often taught this as their initial view of the atom--probably a mistake, as the image is very difficult to shake off later, and we now know that atoms just aren't built that way.
Of course, technically this inaccuracy is true of any description we make of physical phenomena, particularly those that take place on scales that are too large or too small to easily comprehend. Our explanations of the workings of the world, from the atom to the big bang, are all "models," the scientific equivalent of a metaphor. Metaphors (and models) can be misleading if taken too literally, as a conversation in the animated movie Shrek demonstrates.
The hero, Shrek, makes the comment that ogres are like onions because they have layers. His friend, Donkey, takes the metaphor too literally and assumes that ogres are like onions because they stink, or make you cry, or grow little hairs when left out in the sun. Metaphors and models can be dangerous if you take them too literally, or extend them too far--and sometimes they can be more misleading than any value gained from such an illustration.
Science writer John Gribbin once firmly criticized physicist Nick Herbert for saying he felt dishonest "whenever I draw for schoolchildren the popular planetary picture of the atom; it was known to be a lie even in their grandparents' day." Gribbin responded sternly, "Is it a lie? No! No more so, at least, than any other model of atomic reality." But this is unfair on Herbert--the fact is that some models are better than others.
The planetary model that is being criticized was better than the older "plum pudding" model that imagined negatively charged electrons to be scattered through a homogenous mass of positive charge, like fruit in a plum pudding. But equally, the planetary model is more misleading than newer alternatives that don't pretend that electrons behave in the neatly ordered manner of planets in stately motion around a sun.
In fact, as soon as Bohr came up with his planetary model, there was a problem (and this is where Einstein's quanta came into the picture). A satellite in orbit--the Earth around the Sun; the Moon or an artificial satellite traveling around the Earth--is constantly accelerating. This doesn't mean it gets quicker and quicker, because this is a different kind of acceleration. Ever since Newton, we've known that a moving body will travel in a straight line at a constant speed unless you apply a force to it. The satellite wants to fly off in a straight line, out of orbit. It is only the constantly applied force of gravity that pulls it out of the straight line and around the curve. And a body that has a force applied to it is said to accelerate.
In this case, the result is not straight-line, linear acceleration like a drag racer accelerating down the track, but centripetal acceleration. With this type of acceleration, the speed remains the same but the direction changes. Velocity, the true measure of rate of movement, comprises both speed and direction. The velocity is changing because, though the speed remains the same, the direction is constantly being modified. That works fine with a satellite and, in a stable orbit, if there isn't any resistance, it could keep going around forever. But if this really were the case with an electron, there would be a different problem that would doom it to spiral inward and crash into the nucleus.
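In symbols (the standard result, added for illustration): a body moving at constant speed v around a circle of radius r undergoes an acceleration directed toward the center,

$$ a = \frac{v^2}{r} $$

with the speed untouched and only the direction of travel continually bent.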
Bohr knew that an electron normally pumps out light when it is accelerated. That inevitably means losing energy--all electromagnetic radiation, like light, carries a certain amount of energy. An electron that is orbiting, that is accelerating, should emit a stream of light, rapidly losing energy, before crashing destructively into the nucleus. This doesn't happen. (Thankfully, or every atom of matter in existence would have self-destructed within a tiny fraction of a second of being created.) So Bohr had the clever idea of putting his electrons on imaginary tracks.
Instead of being able to swing around the nucleus in any old orbit, Bohr imagined that electrons were constrained to travel on fixed circuits, still confusingly called orbits. Once on a track--in what Bohr called a stationary state--the normal rules did not apply: it was as if the track prevented the electron from radiating, stopping energy from leaking out. The electrons could jump from one orbit to another--giving out or absorbing a quantum of light--but could not live anywhere in between. It wasn't possible for electrons to gradually drift down and crash into the nucleus; they could only make instantaneous leaps between fixed orbits. These jumps between different tracks, gaining or losing a quantum of energy with each jump, were called quantum leaps. Bohr had taken the atom digital.
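For hydrogen, Bohr's tracks amount to a ladder of fixed energies, and each jump emits or absorbs a single quantum whose frequency is set by the gap between rungs (standard textbook values, quoted here for illustration, with n the track number counted outward from the nucleus):

$$ E_n = -\frac{13.6 \text{ eV}}{n^2}, \qquad h\nu = E_m - E_n \quad (m > n) $$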
Niels Bohr will come back into the story, but first his ideas had to be transformed and upgraded by the young Turks of quantum theory--Prince Louis de Broglie, Werner Heisenberg, Erwin Schrödinger, and Paul Dirac. De Broglie inverted Einstein's idea that light--usually seen as a wave--could be thought of as particles, by showing that elementary particles like electrons could behave as if they were waves. Heisenberg abandoned Bohr's visually appealing orbits to produce matrix mechanics, a totally abstract mathematical description of the processes involved. Schrödinger came up with an alternative view, a description of the way de Broglie's waves changed with time, known as wave mechanics--and Dirac showed that Heisenberg's and Schrödinger's approaches were not just consistent but totally equivalent, pulling the two together as quantum mechanics.
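De Broglie's inversion fits in a single relation (h again being Planck's constant; stated here for reference): a particle of momentum p--mass times velocity--behaves as a wave of wavelength

$$ \lambda = \frac{h}{p} = \frac{h}{mv} $$

which is why an electron, with its tiny mass, has a wavelength long enough for its wave character to show.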
All was not rosy in the quantum mechanical garden, however. If Schrödinger's wave equations were taken as literal descriptions of the behavior of quantum particles (something he hoped for, as he hated the abstract nature of Heisenberg's matrices, with no accompanying picture of what was happening), there was a problem. If a particle like an electron were literally a wave, following the behavior specified by Schrödinger's equations, it would have to spread out in all directions, rapidly becoming ridiculously huge. And there were other complications in the way Schrödinger's equations used imaginary numbers and needed more than three dimensions when more than one particle was involved. The solution to making these wave equations usable came from another of the new generation of physicists, Einstein's friend Max Born.
Born may have been as close as anyone ever was to Einstein socially, but he brought into quantum theory the apparently simple concept that would cause Einstein and others so much trouble--probability. To make Schrödinger's wave equations sensibly map onto the observed world, he suggested that they did not describe how an electron (for instance) moves, or the nature of an electron as an entity, but rather provided a description of the probability that an electron would be in a particular place. The equations weren't a distinct picture of an electron but a fuzzy map of its likely locations. It was as if he had moved our image of the world from an accurate modern atlas to a medieval muddle with areas labeled "here be electrons."
It is from this introduction of probability into the quantum world that Heisenberg's uncertainty principle would emerge. Werner Heisenberg showed that quantum particles had pairs of properties that were impossible to measure simultaneously in absolute detail (properties are just aspects of an object that can be measured, like its mass, position, velocity, and so on). The more accurately you knew one of the properties, the less accurately you could measure the other. For example, the more closely a particle's momentum was known, the less accurately its position could be determined. (Momentum is the mass of the particle multiplied by its velocity [directional speed], something physicist John Polkinghorne describes in a matter-of-fact way as "what it is doing.") At the extreme, if you knew exactly what momentum a quantum particle had, it literally could be positioned anywhere in the universe.
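In its modern form (the standard statement, added here for reference; ħ is Planck's constant divided by 2π, and Δ denotes the spread in each measured value), the trade-off for position x and momentum p reads:

$$ \Delta x \, \Delta p \geq \frac{\hbar}{2} $$

So as the spread in momentum is squeezed toward zero, the spread in position must grow without limit--the "anywhere in the universe" of the previous paragraph.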
A good way of picturing the uncertainty principle is provided by Doctor Peet Morris, of Oxford University's Computing Laboratory. Imagine you take a photograph of an object that is flying past at high speed. If you take the picture with a very quick shutter speed, it freezes the object in space. You get a good, clear image of what the object looks like. But you can't tell anything from the picture about its movement. It could be stationary; it could be hurtling past. If, on the other hand, you take a photograph with a slow shutter speed, the object will show up on the camera as an elongated blur. This won't tell you a lot about what the object looks like--it's too smudged--but will give a clear indication of its movement. The trade-off between momentum and position is a little like this.
This uncertainty principle seems to become obvious when you think about trying to make this measurement for real. Say you used a light beam to accurately measure the position of an electron. One of the definitive properties of light is its wavelength--the distance over which a light wave goes through one complete ripple and returns to the equivalent point in its cycle.
The wavelength of the light determines how accurately it can be used to measure position. But the shorter the wavelength, the more energy the light carries--and the more effect being hit by that light will have on the momentum of the electron. The mere act of looking at a quantum particle changes it--something that would become a central tenet of quantum theory.
This description of the uncertainty principle as a side effect of measurement comes from Heisenberg's paper on the topic that used the example of a microscope in which the light that was being used to observe a particle disrupted it. However, this interpretation isn't as straightforward as it seems. In Heisenberg's original example, the act of measurement causes the uncertainty--so by implication, if no measurement were made, the momentum and position could have absolute values.
This seems to have been Heisenberg's original understanding, when he presented his microscope example to the great Niels Bohr. Heisenberg is said to have ended up in tears as Bohr pointed out that, though the uncertainty principle is correct, the microscope example is hopelessly misleading. It assumes an underlying reality--it pictures the electron as traveling along a clear, specific path until the light disrupts it. But as far as quantum theory is concerned, it isn't like that. Born had shown that Schrödinger's wave equation described not what the particle itself did, but the probability of its taking a particular route. An electron does not follow a specific path--all that can be said is that, when a measurement is taken, certain values are obtained. These give a guide (subject to Heisenberg's principle) of where the electron might be at a particular moment, but do not imply that it followed a specific, known path to get there.
Although this verges on the sort of philosophical conundrum that inclined seventeenth-century philosophers to ask whether a tree falling in a forest made any sound if there was no one there to hear it, it is important to the whole debate on what is really happening at the quantum level, and would be part of the reason behind the split that divided Einstein from many of his colleagues.
In fact, Einstein had been uncomfortable about the role of randomness in the physics of the very small from an early point in his career. Austrian quantum physicist Anton Zeilinger has pointed out that it was as early as 1909, in a presentation at a Salzburg gathering of scientists and medics (the magnificently named Gesellschaft Deutscher Naturforscher und Ärzte), that Einstein commented that he was discomforted (Unbehagen) by the role that random events played in the new physics.
This discomfort was very obvious in a series of letters between Einstein and Max Born. On April 29, 1924, Einstein wrote:
I find the idea quite intolerable that an electron exposed to radiation should choose of its own free will, not only its moment to jump off, but also its direction. In that case, I would rather be a cobbler, or even an employee in a gaming house, than a physicist.
Einstein could not accept this randomness: he felt there should be a strict, causal process underlying what was observed. As far as he was concerned, the electron jumped out of the metal it was in at a time and in a direction that could have been predicted, had all the facts been available. Quantum theory disagreed, saying it wasn't ever possible to know when the electron would pop out, or in what direction. Similarly, quantum theory assumed that a particle didn't have a position until the measurement was made--it was the act of measurement that transformed its position from a probability to an actual value. And the same went for the other properties of the particle.
By December 4, 1926, Einstein was sufficiently irritated by the topic to write his famous words:
Quantum mechanics is certainly imposing. But an inner voice tells me that it is not yet the real thing. The theory says a lot, but does not really bring us any closer to the secret of the "old one." I, at any rate, am convinced that He is not playing at dice.
The problem was this matter of probability and statistics. There was nothing new about apparently random effects in physics, and Einstein himself had made significant use of statistics and probability in his work; for example, in describing the effect called Brownian motion, whereby pollen and other small particles jump about in a fluid, cannoned into by fast-moving molecules. But his assumption had always been that there were real values underlying those probabilities.
For instance, probability theory shows that the chance of throwing a coin and getting a head is 50:50--there is a 1 in 2 probability. But any particular coin throw will have a real, specific outcome. It will produce a head or a tail, and when the probability is worked out over many coin throws, each of those throws would have a specific outcome. Now Born and Bohr were saying that, in quantum physics at least, the reality had to be thrown away. All that existed was the probability. Einstein could not accept this--he made a huge distinction between what was actually happening and the tools that could be used to predict the outcome. To see what his problem was, it's worth indulging in a quick detour into the nature of probability.
"Statistics" and "probability" are terms that are often used with more enthusiasm than accuracy. The Victorian British prime minister Benjamin Disraeli said, "There are three kinds of lies: lies, damned lies, and statistics."
This contempt for statistics dates back to the original use of the word, when it was a political statement of facts about a country or community (the "stat" part is as in "state"). The origin of the complaint is the way statistics can be used to support almost any political argument--but political distaste should not be allowed to conceal the value that the statistical method has for science. Statistics gives us an overview of a large body of items that we couldn't possibly hope to monitor individually--almost any measurement on a gas in the real world (pressure, for instance) is statistical, because it combines the effects of all the many billions of gas molecules present.
Probability, on the other hand, is about chance. It usually describes something that may or may not happen. In any particular case, there is usually a single actual outcome, but we can give that outcome a probability. So when the weather forecast tells us there's a 50 percent probability of rain, in practice either it will rain or it won't. We just know that the chances of it happening are pretty evenly balanced. Let's look at probability and statistics applying to the specific example we've already seen--tossing a coin.
Imagine you have a coin, and toss it a hundred times in a row. Each time you toss the coin, you note down the outcome. Probability tells us that each time you throw the coin you've a 50:50 chance of getting a head or a tail. So probability predicts that, on average, after a hundred throws, you will get fifty heads and fifty tails. If we actually count heads and tails, you might end up with forty-eight heads and fifty-two tails. These statistics tell us what happened. Probability is about the likelihood of what will happen in the future--statistics describe the actual outcome. Probability tells us which combinations are more or less likely, while statistics record what actually occurred. If we took more and more tosses, the statistical outcome would lead us toward deducing the 50:50 probability. The two are connected, but they would only be guaranteed to coincide with an infinitely large statistical sample--something that doesn't exist in the real world.
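A quick simulation makes the distinction concrete (an illustrative sketch, not from the book): the probability is fixed at 50:50 in advance, while the statistics are whatever a finite run of tosses actually delivers, drifting toward 50:50 only as the sample grows.

```python
import random

# Toss a fair coin n times and report the observed fraction of heads
# (the statistics), to compare against the fixed probability of 0.5.
def head_fraction(n):
    return sum(random.random() < 0.5 for _ in range(n)) / n

for n in (100, 10_000, 1_000_000):
    print(f"{n:>9} tosses: fraction of heads = {head_fraction(n):.4f}")
# A typical run: roughly 0.48 or 0.52 at 100 tosses,
# and ever closer to 0.5000 as n grows.
```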
Einstein thought that the microscopic world should be like this, too. It's rather as if we had a clever machine that tossed the coins and automatically sent all the heads to one closed hopper and all the tails to another. We could then view the results by weight. The closest we would get to describing what has happened is a probability deduced from the statistics--we could see from the weights that there were forty-eight heads and fifty-two tails, and after many experiments could deduce the 50:50 probability. We never get to see a coin, or a toss, but we would know that the coins did exist in the machine. Similarly, Einstein was convinced that somewhere underneath the probabilities lay reality. He believed that, behind the screen of probability, there was a set of actual values that it was impossible to see. Whether these values, sometimes called hidden variables, existed would trigger the whole quantum entanglement debate.
Although Einstein expressed his concern to Born, the originator of the probabilistic interpretation of Schrödinger's wave equations, it was with Niels Bohr that he engaged in the great battle over the acceptability of quantum theory. It didn't matter to Bohr that the more recent versions of the theory had pushed his atomic orbits out of the way; he was a great supporter of quantum mechanics--in fact, as a result of the support it got from Bohr's Danish center of operations, the most widely accepted understanding of quantum phenomena is still often referred to as the "Copenhagen interpretation."
Einstein developed the habit of teasing Bohr by turning up at conferences with thought experiments that challenged the validity of quantum theory. He was particularly successful at winding up his Danish counterpart at the Solvay Congresses, large-scale scientific gatherings that brought together many of the big names of physics at that time.
Einstein's first shot across Bohr's bow came at the fifth Solvay Congress in Brussels, in 1927. After hearing the big names talk about the latest developments in quantum theory, Einstein described a simple experiment that he thought highlighted a fundamental problem with the theory. He imagined firing a beam of electrons at a narrow slit. On the other side of the slit, the electrons (behaving like waves) were diffracted, spreading out from the slit rather than simply passing through in a straight line. Einstein then imagined a semicircular piece of film that was marked by the impact of the electrons.
According to quantum theory, it isn't possible to say where the individual electrons would be until they hit the film. Schrödinger's equations described the probability of an electron's being in any particular spot, but it is only at the moment of impact that one spot on the film would blacken and a measurement take place. Einstein wasn't happy with this. He tried to imagine the moment an electron hit the film. The way he envisaged the experiment, if quantum theory was true, it was as if every point on the film had a random chance of turning black, defined by the probability distribution. Anywhere was in danger of suddenly turning black. But the instant one bit of film did actually register an impact, somehow all the other bits of film had to immediately know not to turn black. It seemed to him as if an instantaneous communication had to connect all parts of the semicircle, telling each bit of the film whether or not to respond.
This time around, Bohr was not particularly bothered by Einstein's remarks--in fact, he found the idea totally confusing. He commented, "I feel myself in a very difficult position because I don't understand what precisely is the point which Einstein wants to [make]. No doubt it is my fault."
Einstein didn't give up but continued with two rather more complex thought experiments, producing them as he sat at breakfast in the hotel before the conference began. The details of these thought experiments don't really matter--what Einstein believed he had done was to show that, with some clever manipulation (for example, using a shutter to only allow a short pulse of electrons or photons through a slit, and by combining what was known about the shutter and the slit with the information gained by measuring the spread of the particles), he could defy the uncertainty principle, knowing more about the location and momentum of the particles than he was allowed to. If the uncertainty principle could be shown to be faulty, then quantum theory was in serious trouble.
This time, Bohr took the threat seriously--but Einstein's new ideas didn't take much opposing and Bohr was ready with a solution by dinner on each occasion. The problem was, Einstein hadn't taken into account the uncertainties that were present in measuring just what the shutter and slit were doing. He had assumed that these could be known absolutely. If, however, they, too, were subjected to the uncertainty principle, this extra information was lost. The thought experiment was entirely consistent with the uncertainty principle and couldn't be used to disprove it.
It was another three years before the two great minds had another chance to come head-to-head on the subject, at the next Solvay Congress, in 1930, held as usual in Brussels. The topic of the conference was magnetism, but this didn't stop Einstein from presenting one of his breakfast-table challenges to quantum mechanics. And this time he felt that he was onto a winner. Einstein had come up with a very clever thought experiment that appeared to challenge the reality of the uncertainty principle, and hence quantum theory.
Einstein's experiment (it should be emphasized that this wasn't a real experiment anyone intended to carry out, just a mental vehicle for testing the theory) consisted of a box with a source of radiation inside. On the wall of the box, a shutter covered a hole. The shutter was opened for a very brief time, during which a single photon shot out of the hole.
It was known by then that, though a photon of light didn't have any mass (if you could imagine stopping the photon and weighing it [impossible in practice, as Einstein had already established that light always has to travel at light speed]), its energy when moving produced an effective mass that could be derived from E = mc². So Einstein envisaged weighing the box before and after the photon was produced. This would give him a value for the energy of the photon that he could make as accurate as he liked. But he could also measure the time when the shutter was opened, to great accuracy. And this combination would allow him to know time and energy to better accuracy than the uncertainty principle permits--a principle that applies to time and energy in just the same way as it does to position and momentum.
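Two standard relations carry the weight of Einstein's design (stated here in modern notation for clarity, not drawn from the book's own wording): the effective mass of a photon of frequency ν, and the energy-time form of the uncertainty principle he was trying to beat:

$$ m_{\text{eff}} = \frac{E}{c^2} = \frac{h\nu}{c^2}, \qquad \Delta E \, \Delta t \geq \frac{\hbar}{2} $$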
Bohr could not see the flaw in this devilish contraption. An observer at the time described Einstein as walking quietly away from the meeting with a "somewhat ironical smile" while Bohr trotted excitedly beside him.
Next morning, however, Bohr was ready to come back to Einstein with a counterargument, ironically using Einstein's own general relativity against him. He imagined a particular example of Einstein's apparatus (though the same argument can be applied to different arrangements with a bit of work). The box with the shutter is hung from a spring. This forms the weighing machine that will detect the change in mass. A clock inside the box opens and closes the shutter.
Now as the photon shoots out of the box, the whole apparatus will move upward, indicating the change in weight and hence the change in energy. There is uncertainty in this value and uncertainty in the time measured, as clocks in motion in gravity run slowly, according to Einstein's general theory of relativity. The two uncertainties combine to produce exactly the trade-off required by the uncertainty principle. Once again, Einstein had to accept defeat. However he tried to attack the outcomes of quantum theory, the ability of the measurements to interfere with each other managed to destroy his challenge. Yet still he believed that it should be possible to dig deeper and uncover the hidden reality.
Bohr was never quite sure whether these attacks from Einstein were purely technical or driven by an urge to irritate him (though there is no evidence that Einstein was motivated by anything other than a disquiet with quantum theory, he was a very accomplished teaser). Even years later, in 1948, Bohr was clearly unnerved by Einstein's challenges. Physicist Abraham Pais recounts how he was attempting to help Bohr put together some information on his disputes with Einstein. At the time, Bohr was visiting the Institute for Advanced Study at Princeton and was using the office adjacent to Einstein's. (Technically, the office he was in was the one allocated to Einstein, but Einstein preferred the more cramped confines of the room that should have belonged to his assistant.)
Bohr was supposed to be dictating his text to Pais but, as often happened, he was having trouble stringing together a sentence. The concepts were there in his head, but Bohr found it difficult to formulate the wording. The eminent scientist was pacing around the table in the middle of the room at high speed, almost running, repeating, "Einstein ... Einstein ..." to himself. After a little while, he walked to the window and gazed out, repeating now and then, "Einstein ... Einstein ..." At that moment, the door opened very softly and in tiptoed Einstein himself. He motioned to Pais to keep quiet with what Pais later described as "his urchin smile" on his face.
It appears that Einstein had been ordered by his doctor not to buy any tobacco. Treating this injunction as literally as he could, Einstein decided that he couldn't go to the tobacconist but that it would be okay to raid Bohr's tobacco, which was in a pot on Bohr's table. After all, by stealing the tobacco, he was sticking to the injunction not to buy any. As Einstein crept into the room, Bohr was still facing the window, still occasionally muttering, "Einstein ... Einstein ..."
On tiptoe, Einstein made his way toward the desk. At that point Bohr uttered a final, firm "Einstein!" and spun round to find himself face-to-face with his longtime opponent, as if the incantation had magically summoned him. As Pais commented:
It is an understatement to say that for a moment Bohr was speechless. I myself, who had seen it coming, had distinctly felt uncanny for a moment, so I could well understand Bohr's own reaction.
But back in 1930, after his successful dismissal of Einstein's challenge, Bohr was feeling rather more comfortable in the security of quantum theory. It would be five years before Einstein would strike back again in a way that would throw Bohr into total confusion, though ironically it would make it possible to prove that the strange quantum world really did exist.
Over those five years, Einstein put together the elements of a thought experiment that would enable him to separate the two measurements linked by the uncertainty principle so firmly that he believed one would not be able to influence the other. That way, he became increasingly convinced, he could cheat uncertainty and shake quantum theory to its foundations. He first tried out the concept verbally on Leon Rosenfeld at the 1933 Solvay Congress, after hearing Bohr's latest thoughts on the quantum theory, but this time Einstein did not intend to launch a lightweight challenge over breakfast. This was to be a formal, scientific paper.
The move away from teasing Bohr by casually dropping nightmare problems in his lap reflected the increasingly dark situation in Europe. Hitler's Germany drove Einstein reluctantly to the United States, where he set up residence in what would be his home for the rest of his life, Princeton, New Jersey. The Institute for Advanced Study, recently founded in 1930, seemed the ideal location. Set up by Louis and Caroline Bamberger, it brought together experts in the theoretical sciences (there were, and still are, no laboratories), math, and history in a relaxed environment where there was an opportunity to work without the distraction of students and lectures. It provided all the good parts of a university (from the academic's viewpoint) without the time-wasting chores.
It was from here, with two collaborators, Boris Podolsky and Nathan Rosen, that Einstein would produce the paper that brought the implications of entanglement out into the open. Published in Physical Review on May 15, 1935, and called "Can Quantum-Mechanical Description of Physical Reality Be Considered Complete?," the paper became universally known by the initials of its authors, EPR. Its intention was to smash quantum theory by showing just how incredible the resultant entanglement would be. Einstein was out to destroy the opposition.
THE GOD EFFECT. Copyright © 2006 by Brian Clegg. All rights reserved. For information, address St. Martin's Press, 175 Fifth Avenue, New York, N.Y. 10010.
Table of Contents
Preface ix
A Note About Alice and Bob xi
Chapter 1 Entanglement Begins 1
Chapter 2 Quantum Armageddon 32
Chapter 3 Twins of Light 52
Chapter 4 A Tangle of Secrets 90
Chapter 5 The Blish Effect 116
Chapter 6 The Unreal Machine 147
Chapter 7 Mirror, Mirror 205
Chapter 8 Curiouser and Curiouser 220
Notes 247
Select Bibliography 257
Acknowledgments 259
Index 261