The Sun's Heartbeat: And Other Stories from the Life of the Star That Powers Our Planet
Overview
What, exactly, are the ghostly streaks of light astronauts see—but can't photograph—when they're in space? And why is it impossible for two people to see the exact same rainbow? Why are scientists beginning to think that the sun is safer than sunscreen? And how does the fluctuation of sunspots—and its heartbeat—affect everything from satellite communications to wheat production across the globe?
Peppered with mind-blowing facts and memorable anecdotes about spectral curiosities—the recently discovered "second sun" that lurks beneath the solar surface, the eerie majesty of a total solar eclipse—THE SUN'S HEARTBEAT offers a robust and entertaining narrative of how the Sun has shaped humanity and our understanding of the universe around us.
Product Details
ISBN-13: 9780316175395
Publisher: Little, Brown and Company
Publication date: 07/13/2011
Sold by: Hachette Digital, Inc.
Format: eBook
Sales rank: 338,424
File size: 3 MB
About the Author
Bob Berman is one of America's top astronomy writers. For many years, he wrote the popular "Night Watchman" column for Discover magazine. He is currently a columnist for Astronomy magazine and a host on NPR's Northeast Public Radio, and he is the science editor of the Old Farmer's Almanac.
Read an Excerpt
The Sun's Heartbeat
And Other Stories from the Life of the Star That Powers Our Planet
By Bob Berman
Little, Brown and Company
Copyright © 2011 Bob Berman. All rights reserved.
ISBN: 9780316091015
CHAPTER 1
Yon Flaming Orb
That lucky old Sun has nothing to do
But roll around heaven all day.
—Haven Gillespie, “That Lucky Old Sun,” 1949
EVERY DAY, a ball of fire crosses the sky.
This bizarre reality was so flabbergasting in ancient times that people could relate to it only with worship. The Aztecs and Egyptians were far from alone in regarding the Sun as a god. The Persians, Incas, and Tamils (of southern India) also elevated the Sun to the center of their spiritual lives. It was giver of life, holder of all power, an enigma beyond all earthly mysteries. You either bowed to it or just shut up and gawked.
Today it’s just “the Sun.” Familiarity is the enemy of awe, and for the most part people walk the busy streets with no upward glance. In fact, one of the common bits of advice about the Sun is that we shouldn’t look at it.
At its most elemental, the Sun is the sole source of our life and energy. We know this. And when the Sun occasionally pops into the mainstream news, like an old dog briefly roused from a nap, we give it a few moments’ attention. Recently, medical headlines announced that we’ve been hiding from the Sun excessively in an understandable desire to avoid skin cancer. The long-standing advice is now changing: it’s better to get too much sunlight than too little. That’s because sunlight-mediated vitamin D prevents far more cancers than the sunlight itself produces; going too far the other way and staying pale costs more lives than it saves. Sunlight is not just good for us; it can save our lives.
This is news worth sharing. But, in truth, everything about the Sun is either amazing or useful. Beyond the peculiar history of its discoveries by a motley collection of geniuses, it directly or indirectly affects our lives, our health, our emotions, and even our dollars.
BECAUSE WE MUST START SOMEWHERE, we might spend a few minutes with the simplest solar truth, the one noticed even by hamsters and trilobites: the Sun is bright. Meaning, it emits lots of energy. We know it gives off ultraviolet. This is what burns us at the beach, and should we get too much of it, we might contract a fatal melanoma. Every 1 percent increase in a person’s lifetime UV exposure brings a 1 percent boost in her chances of being victimized by that deadliest of skin cancers. Indeed, Australia’s high melanoma rate has its simple origin in that region’s characteristically sunny skies.
We also need no expert to tell us that heat, those long infrared energy waves, gushes from the Sun. We feel it on our faces even when we’re behind glass and thus receiving no ultraviolet at all. But what does the Sun emit most strongly? Heat? Ultraviolet? Gamma rays? What?
Few would guess the answer. It is green light. That is the Sun’s peak emission.
The green dominance shows up in rainbows and in the spectrum cast on a wall by a prism, where green always seems to be the brightest color. Yet the Sun itself does not appear green. This is because our eyes were designed by nature so that when the Sun’s primary colors—green, red, and blue—strike the retina together, the mixture of light is always perceived as white. White means we’re getting it all.
When looking at grass, we might reasonably conclude that the botanical world—and the chlorophyll that fuels it—loves sunlight’s abundant emerald component. But leaves, plants, and grass actually do not like green at all. They feed mostly on the Sun’s blue light and also absorb its red wavelengths. Grass reflects away the green part of sunlight, which is why we see that color in lawns. Counterintuitively, the only thing we ever perceive is an object’s unwanted solar wavelength.
Our eyes perceive objects solely because particles of light, or photons, are bouncing off them. We’ve been designed to see the energy the Sun emits most strongly. It would be pointless to be sensitive to gamma rays, for example, because the Sun sends us scarcely any; they’re not bouncing off the objects around us. Whether we’re searching for a cheeseburger or underwear, we’d see nothing at all if our eyes were scanning for gamma rays. Instead, we see the most abundant solar wavelengths.
This gives us a Sun bias. When we look at the night sky, we’re comfortable with planets, which stand in the light of the Sun. And we feel good under a starry sky, since all stars give off the same type of light, composed of precisely the same colors, that the Sun does. We essentially scan the universe through the Sun’s eyes. Our retinas and brains were built to “Sun specs.” We’re not quite grapefruit, which seem to mimic the Sun’s very appearance, but the Sun is sort of a mother-father figure as far as our construction goes.
Since the Sun’s energy peak is green light, that’s also the color we perceive most easily. In deepening twilight, when light turns murky and colors fade, we still see grass as green, even though red sweatshirts and violet flowers have all turned to gray—the first step toward the color blindness we experience at night.
When we’re camping far from artificial lights under a full moon, the world looks green-blue. It’s such a familiar experience, we don’t stop and say, “What’s going on here? Why should a white moon make everything turquoise?”
Photographers, artists, and cinematographers are very aware of this strange effect and happily exploit it whenever they wish to portray an outdoor night scene. Since the moon’s surface is really just a dull mirror made of finely powdered sand, moonlight is sunlight whose brightness is cranked down 450,000 times. So cinematographers often shoot a sunlit scene, then use a filter to block out the warm colors and also cut down the light a few f-stops to make it dimmer. Voilà: the illusion of moonlight.
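Since each f-stop halves the light, it's easy to work out how far those few stops fall short of physically faithful moonlight. A quick back-of-the-envelope sketch in Python (the 450,000 ratio is the chapter's figure; the logarithm is just ordinary photographic arithmetic):

```python
import math

# Moonlight is sunlight dimmed roughly 450,000-fold (the chapter's figure).
# Each f-stop halves the light, so the equivalent dimming in stops is log2.
dimming_ratio = 450_000
stops = math.log2(dimming_ratio)
print(f"{stops:.1f} stops")  # -> 18.8 stops
```

Fully faithful "day for night" would thus need almost nineteen stops of dimming; the few stops cinematographers actually remove, together with the cool filter, merely suggest moonlight rather than reproduce it.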
But why exactly is the moonlit world cyan? In daylight, our retinas perceive all colors with a peak sensitivity to yellow-green. But in low light, our perception shifts blueward. Now our greatest sensitivity is to the color green-blue, while hues at the edges of the spectrum, such as red and violet, vanish and become gray. This perception change at low light levels is called the Purkinje shift, named for the Czech scientist who first noticed it. (Jan Purkinje was also the first to suggest that fingerprints be used in criminal investigations.) “Purkinje shift” sounds so cool that I try to say it as much as possible, even when it’s not appropriate. Thanks to the varying sensitivities of the retina’s three types of cone-shaped color receptors, when light is fairly dim, we perceive nature’s greens but not its oranges and reds. This explains why most municipalities now paint fire trucks green. Gone is the traditional red, which cannot be seen well at night. It’s also why green was chosen for all US interstate highway signs—all forty-six thousand miles of them.
The oddities of the Sun’s light and color are limitless. Why, for instance, does the Sun produce violet but no purple, and why do we humans not see the real color of the daytime sky? But looming over all these Sun-related questions is the big one, the one asked by children since earliest recorded history: what makes the Sun shine?
This was surely one of the top ten most baffling questions for twenty-five thousand generations. Even ancient scriptures claiming wisdom or omniscience, such as the Vedas and the Bible, didn’t dare touch it, because nobody had a clue. We have known the answer for just one human lifetime. The underlying physics is now well understood and has yielded unexpected side oddities, such as the strange particles called neutrinos, which are created in the Sun but then magically change form as they fly through space. By-products of the Sun’s turning its own body into light, twenty trillion neutrinos whiz through each of our heads every second.
That the Sun is such a tireless power source is not surprising if we recall that nature uses material objects to store huge amounts of energy. When this energy is converted, the process unleashes an almost inconceivable fountain of light and heat. This “mass changes to energy” idea went unsuspected even by great thinkers such as Plato and Isaac Newton. Until a century ago, all hot, self-luminous objects were simply regarded as forms of fire, the act of burning. Everyone assumed that the Sun was burning, too.
Nineteenth-century scientists already knew that Earth completes one orbit per year, and when they figured out our planet’s distance from the Sun, the simplest physics let them calculate the mass of the star holding us in its grasp. It turns out the Sun weighs the same as 333,000 Earths. If its body consisted entirely of coal, the resulting fire could keep going for just two thousand years, and never mind where it would get the oxygen needed to sustain combustion. That means that if the Sun first ignited when the pyramids were built, the fire would have been out by the time Christ was born. Obviously, something was wrong with this picture. No matter the fuel, the Sun just couldn’t be burning. This enigma, staring us in the face daily, plagued the greatest minds.
It took Einstein’s “energy and matter are the same thing” idea a century ago to reveal the true process. Grasping the significance of E = mc² a few years later, in 1920, the British astronomer Arthur Eddington correctly proposed that vast energy is released when hydrogen’s proton—the heavy particle in its nucleus—meets and sticks to a proton from another hydrogen atom.
Eddington was criticized because the required temperature for proton fusing was not thought to be available in stars such as the Sun. He replied, “I am aware that many critics consider the stars are not hot enough. They lay themselves open to an obvious retort; we tell them to go and find a hotter place.”
Touché, nice retort, but actually he was right. The fusing together of hydrogen nuclei is what makes the Sun shine.
By exchanging the burning-coal idea for the notion of nuclear fusion, science was really trading an amazing wrong idea for an amazing right one. Given the total power emitted by the Sun, which delivers nearly a kilowatt of power to each square yard of Earth’s sunlit surface, and the formula E = mc², it’s easy to calculate how much of the Sun’s body gets continuously consumed and turned into light. The truth is a little disconcerting: the Sun loses four million tons of itself each second.
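The calculation fits in a few lines. A sketch in Python, taking the standard value of about 3.8 × 10²⁶ watts for the Sun's total output (a textbook number assumed here, not stated in the chapter):

```python
# Solar mass loss from E = m c^2, i.e. m = E / c^2 each second.
L_SUN = 3.828e26   # total solar output in watts (standard value; an assumption here)
C = 2.998e8        # speed of light, m/s

mass_loss_kg_per_s = L_SUN / C**2
print(f"{mass_loss_kg_per_s:.2e} kg/s")                  # ~4.3e9 kg/s
print(f"{mass_loss_kg_per_s / 1000:.1e} metric tons/s")  # ~4.3e6 -- the chapter's "four million tons"
```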
This is not some mere abstraction. If we had a giant scale, we’d find that the Sun actually weighs four million tons less every second. So its energy isn’t “free.” It will pay a price for such a profligate output in the form of a series of dramatic events that will change the Sun into something unrecognizable from the star it is today. The main milestones, which will happen 1.1 billion years, 6 billion years, and 9 billion years from now, can be observed with fascination elsewhere in our galaxy when we observe other suns living out the middle and end of their lives.
“LET THERE BE LIGHT.” For centuries, this was easy to say but impossible to understand. Light, color, the Sun’s genesis, and the puzzles of its energy production have been pondered since the earliest humans acquired brains large enough to be tormented. But even these mysterious fundamentals are perhaps one-upped by a single intriguing solar property utterly unsuspected by the ancients: the Sun has a heartbeat. A pulse.
As first observed in the seventeenth century, the Sun alters its appearance and energy output in eleven-year cycles, flipping its magnetic polarity with each cycle so that the full magnetic rhythm repeats every twenty-two years. Although the Sun’s heartbeat is longer than the human variety, it’s just as dependable and as crucial. Depending on what part of the Sun’s period we’re in, weather gets cooler or warmer, Earth’s atmosphere thickens up, and the yield of wheat crops appears to respond, along with the average price of bread around the world. Ocean currents shift markedly.
Such effects on Earth make them worth our focus, especially since recent solar cycles have displayed extremely strange behavior—such as 801 days without a single sunspot—that has not been seen since the 1600s and that has resulted in powerful earthly influences. It even made some researchers wonder whether Earth was on the verge of returning to something like the frightening seventy-year period during early colonial times when the Sun’s heartbeat simply stopped.
One of the consequences then was a significant cooling of our planet. You might guess the natural next question: might the Sun be stepping in now to unwittingly help counteract global warming? Can this explain why the runaway carbon dioxide in our atmosphere has so far produced only a wimpy 1.5°F rise in Earth’s surface temperature over the past century?
But recent solar behavior has been both stranger and stronger even than this. The Sun’s modern somnambulance has unfolded while our planet’s own magnetic field has been steadily weakening. This double whammy translates into an inability to stoutly block incoming cosmic rays from the depths of the galaxy. The Sun’s rhythm since the year 2000, and especially its super-low ultraviolet output from 2006 to 2009, unseen until now by any living researcher, has created the darkest night skies since the Roaring Twenties. It has boosted our air’s carbon-14 content. It’s flattened the global warming curve. It’s chilled our upper atmosphere, which is 74°F colder than in any normal sunspot minimum, making it so thin and compressed that it has affected the lives and deaths of satellites. And the Sun’s own magnetic field is only half as strong as it was a mere twenty years ago. The bottom-line question is, what exactly is going on, and how is it affecting us?
We should really start at the beginning, rewinding back through human history—which has a kind of heartbeat of its own—to explore the star that, while doing so much else, hovers at the center of our lives.
CHAPTER 2
Genesis
The sun itself sees not till heaven clears.
—William Shakespeare, Sonnet 148
THE SUN’S BIRTH was accompanied only by silence. There were no onlookers. The nearest planets were light-years away, and that was a good thing. Like the modern art its looping, gassy tendrils resembled, the apparition would have been best appreciated from a great distance.
The process by which a boring cloud of plain-vanilla hydrogen gas becomes a blinding ball of white fire is epic in purpose and scale. The result, a stable star such as the Sun with a fourteen-billion-year life span, destined to create puppies and pomegranates, certainly deserves its own holiday. Yet no nation celebrates the Sun’s birth. We do, theoretically, honor its existence each Sunday. In practice, most use that time to sleep as late as possible and thus minimize any awareness of it.
What if we had been present at the moment of the Sun’s creation, 4.6 billion years ago? Anthropomorphically enough, the Sun came from the union of two parental sources. It had a true mother and father, each boasting a distinct personality.
The maternal side was a vast womb of thin hydrogen gas created 379,000 years after the big bang. This nebula was probably ten million times bigger than the Sun it produced, which is like a football stadium birthing a single apple seed on the fifty-yard line.
Into this cold, placid, Siberian near emptiness came roiling violence approaching from a distance. A massive blue star had blown up in our galaxy’s Perseus spiral arm. The supernova’s detritus, the paternal side of the solar DNA, included new substances utterly unknown in the nostalgic days of the big bang, nine billion years earlier. The meeting and mixture of these two disparate materials produced the star that was destined to be ours.
Although stars are always born in the same fashion, they end up very different from one another. Some become colossal orange spheres that would dwarf our planet the way a spherical tank twenty-five stories high would compare to the period at the end of this sentence. Such globes expand and contract like an anesthesiologist’s oxygen bag—surprisingly lively motion for something so enormous. Others reach their AARP years as tiny white balls crushed solid, harder than a diamond and smaller than Los Angeles.
Within a large cloud of hydrogen and helium—a nebula—any lump or extra bit of density exerts its own gravity and pulls in surrounding gas. During this contraction, an initial “barely there” random motion evolves into a leisurely spin, and as the blob shrinks, like a ballerina pulling in her arms, the rotation speed increases.
Gravitational collapse always produces heat, so the center gets ever hotter. Meanwhile, the nebula shapes itself into a ball. Stars and planets are spheres because, for a given volume, a sphere has the smallest surface area of any shape. No part of it is farther from the center than any other.
When you were little and played with clay, you could patty-cake it into a thin sheet, like pizza dough. If you wanted to color the large surface when it hardened, you needed lots of paint, especially considering both sides. Or, instead, you could roll the clay between your palms into a small ball that could be painted with a single brushful. You learned then that balls have small surface areas. Nature learned that, too, early on. Any object with enough material—it takes only one-hundredth the mass of our moon—has enough self-gravity to pull itself inward until it reaches the smallest 3-D shape, a sphere. Across the cosmos, only very low-mass objects such as asteroids avoid having round bodies.
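The clay lesson is easy to put in numbers. A toy comparison in Python (the particular volume and pancake thickness are arbitrary choices for illustration):

```python
import math

VOLUME = 1000.0  # cm^3 of clay, say

# Sphere holding that volume
r = (3 * VOLUME / (4 * math.pi)) ** (1 / 3)
sphere_area = 4 * math.pi * r**2

# A 1-cm-thick pancake of the same volume
t = 1.0
R = math.sqrt(VOLUME / (math.pi * t))
pancake_area = 2 * math.pi * R**2 + 2 * math.pi * R * t  # both faces plus the rim

print(f"sphere:  {sphere_area:6.0f} cm^2")   # ~484 cm^2
print(f"pancake: {pancake_area:6.0f} cm^2")  # ~2112 cm^2 -- over four times the paint
```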
So we have a spinning gas ball with a hot center. Now a critical event—nuclear fusion—either happens or fails to happen.
With enough gas and thus enough self-gravity present, the spinning ball’s core compresses to be white-hot. Heat simply means “moving atoms,” so each hydrogen atom is now hurled at its neighbor with such force that their protons stick together: fusion. It’s a form of alchemy, except that instead of gold, the fusing of hydrogen’s protons creates useless helium, good only for party balloons or making our voices sound like a Munchkin’s.
But proton fusing also releases heat and light. This is why, when you ask a physicist what makes stars shine, he says either “fusion” or, if he thinks you can handle it, “the proton-proton chain.”
THE MOMENT FUSION BEGINS, a star is officially born. The process releases such enormous energy that it’s self-sustaining. And this energy is immense. In each second, the Sun produces and emits the same energy as six trillion Hiroshima atomic bombs. This fusion will continue on its own for as long as hydrogen fuel remains—and in the case of the Sun, which weighs two octillion (2 followed by 27 zeros) tons, the fuel supply is no problem.
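That six-trillion figure checks out with a one-line division, assuming the commonly cited yield of about fifteen kilotons of TNT for the Hiroshima bomb (the yield and the solar-output value are standard numbers, not taken from the chapter):

```python
L_SUN = 3.828e26                      # solar output in joules per second (standard value; an assumption)
JOULES_PER_KILOTON = 4.184e12         # energy of one kiloton of TNT
hiroshima = 15 * JOULES_PER_KILOTON   # ~15 kt yield, the commonly cited figure

print(f"{L_SUN / hiroshima:.1e} bombs per second")  # ~6.1e12 -> "six trillion"
```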
If the original hydrogen cloud is too skimpy, its self-gravity produces a ball with insufficient heat to get fusion started. The result is a brown dwarf. It never “shines.” It never creates its own light. But it’s still too hot to touch. In a way, the planet Jupiter is like that. Its core is the hottest place in the solar neighborhood, outside of the Sun itself, but it would have needed eighty times more mass than it has to ignite and become a star.
But what if it had ignited? Then ours would be a double-star system. Such binaries are so common that they make up half the stars in the universe. They run the gamut from two stars nearly touching each other, pulled tidally into football shapes and whirling crazily around their common center of gravity every few hours, to twins so far apart that millions of years must elapse before they complete a single mutual orbit.
The weight of the collapsing nebula determines everything. Too little mass and fusion never happens. Just enough and the low core pressure makes fusion proceed slowly: the star shines dimly as a cool red dwarf. With as little as one-thousandth the Sun’s brightness, such lightweight stars emit a feeble orange glow. These are the street dogs of the galaxy; they’re everywhere. Forty-three of the fifty stars nearest to Earth are of this dim, ruddy variety, which astronomers, without any apparent logic, call type M. These “Methuselah suns” use fuel so sparingly that they outlive everything else.
At the other extreme are massive stars. When these implode and ignite, gravity is so powerful that the core fusion is virtually a runaway, proceeding crazily, almost like a bomb: superheat creates a blue dazzle. Such blinding lighthouses are rare but appear disproportionately in the night sky because we see them from thousands of light-years away. But their fuel is spent quickly, with no thought to the future. These stars don’t last long.
It was just such a profligate, live-for-the-moment blue star that fathered our own Sun.
Astronomers classify these distinct star families with simple capital letters. There are only seven major star types—as if to retain the ancient classic obsession with that number, as in “seven seas” or “Seven Wonders of the World.” In many cultures, both six and eight were deemed unlucky and thus were avoided, which is why you rarely hear tales of the “six dwarfs” or the “Eight Sisters.”
Originally, the seven star categories were labeled A, B, and so on, according to the strength of the hydrogen lines in their light. But the sequence soon got reshuffled according to the stars’ temperature. For more than a century now, astronomers everywhere have known these star types—O, B, A, F, G, K, M—as intimately as chefs know their fundamental seasonings or geologists the basic kinds of rock. The letters are a continuum from the bluest, hottest, most massive stars (types O and B) to the reddest, coolest ones (type M). In between are creamy white, moderate stars of type G (like the Sun) and yellow or orange type K. There are no green stars in the universe.
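The letters track surface temperature. A minimal lookup in Python, using the approximate kelvin ranges conventionally attached to each class (the boundary values are textbook figures, not from the chapter):

```python
# Approximate surface-temperature ranges in kelvin for the seven major
# spectral classes, hottest to coolest. Textbook boundaries; treat as rough.
SPECTRAL_CLASSES = [
    ("O", 30_000, 60_000),  # blue
    ("B", 10_000, 30_000),  # blue-white, like Rigel
    ("A",  7_500, 10_000),  # white
    ("F",  6_000,  7_500),  # yellow-white
    ("G",  5_200,  6_000),  # creamy white -- the Sun, at ~5,800 K
    ("K",  3_700,  5_200),  # orange
    ("M",  2_400,  3_700),  # red -- the ubiquitous dwarfs
]

def spectral_type(temp_k: float) -> str:
    """Return the spectral letter for a surface temperature in kelvin."""
    for letter, low, high in SPECTRAL_CLASSES:
        if low <= temp_k < high:
            return letter
    return "O" if temp_k >= 60_000 else "M"

print(spectral_type(5_800))   # -> 'G', the Sun
print(spectral_type(12_000))  # -> 'B'
```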
These wildly disparate stellar life stories play out in real time in the twinkling night. Rigel, every stargazer’s familiar friend in Orion’s foot, is just like our Sun’s father. It is a type B star, 17 times heavier than the Sun, which makes it shine 65,000 times brighter. It lives happily for only 7 million years by changing hydrogen to helium, the most normal and efficient fusion behavior. As Rigel’s hydrogen gets depleted, it will increasingly “burn” helium, which creates carbon and oxygen, a process that will last only 700,000 years.
And things keep going downhill. The next step, changing some of its newborn oxygen to silicon, will buy Rigel only one solitary year. The final metamorphosis of silicon to iron will keep the star alive for a single additional day. Iron is the end of the line. Making atoms heavier than iron costs energy rather than producing it.
Then, out of fuel and down on its luck, with no outward-pushing power streaming from its core, the star will finally give in to gravity. The sheer weight of all its layers will make the star collapse on itself, rapidly upping the temperature until the whole thing blows as a supernova—the most intense brilliance nature ever creates.
The core will remain as a tiny, crushed ball twelve miles wide, spinning crazily. The rest of the star will be hurled outward two thousand times faster than a rifle bullet. Two million planet Earths’ worth of newly minted star stuff—all that fresh-baked oxygen, silicon, iron, and the rest—will fly across space in a flash brighter than a million Suns.
But more than that, a supernova’s incredible heat fuses new ultraheavy atoms. Indeed, the only possible way for nature to create iodine, lead, uranium, and every other element heavier than iron is in a supernova’s cauldron. So what we now have flying through space is absolutely all of the ninety-two natural elements, a big change from the mere hydrogen and helium and a dash of calming lithium the star started with. The massive star has been a factory, creating every element, including the oxygen we are breathing this very moment.
A supernova’s brilliance fades in a year or two, but the material keeps going. It would be a shame for it to go to waste.
Encountering a primordial nebula, the supernova detritus pushes that gas into dense filaments, creating knots of collapsing gas balls and a litter of new stars. These so-called second-generation stars are partially made up of heavier elements from the now deceased blue star. They are what astronomers call metal-rich.
Some of these second-generation stars are high-mass, blue O and B suns that go through their own life cycles in sped-up double time. When these blow up into supernovae, an even richer explosion spreads across the galaxy. When this enhanced material contacts yet another virgin gas cloud (these nebulae are almost everywhere), it forms third-generation stars, which are the latest models to date. Such stars, and any leftover matter that condenses into planets around them, are rich in oxygen and carbon. They become playgrounds for nature’s creative experimentation.
Still with me? If so, you’ve earned the punch line: our Sun is a third-generation star.
The planets orbiting it reflect this. That we have iodine in our thyroid glands proves that our bodies were fashioned from supernova material. The iron in our blood came from the cores of two previous star generations. The Sun gives off a bit of peculiar yellow light from fluorescing sodium vapor, an element inherited from its father, the type O or B blue star. It got copious hydrogen, which will supply it with life-sustaining fuel for billions of years to come, from its mother, the pristine nebula gas. Their marriage vows were exchanged 4.6 billion years ago. Since then, our galaxy has spun around twenty times, and we cannot pinpoint what, if anything, is left of the nebula that was our nursery. Nor do we see any trace of the supernova that was the other component of our genesis.
The residual, gaseous afterbirth of the Sun’s (and our own planet’s) origins has been diluted and lost over time. But the mechanism of the birth stares us in the face. Seen through even a one-dollar spectroscope, sunlight contains the fingerprints of its complex composition. Its high metal content is conclusive evidence that, unlike so many other stars in the universe, the Sun was not forged from primordial material alone. The universe harbors many ongoing mysteries, from the big bang to the nature of consciousness, but how the Sun was born is not among them.
Our own parents were the first generation of humans for whom the Sun’s origin no longer required the desperate guesses of philosophers or the invoked mysteries of theologians. The glow of gaseous metals contributes to the sunlight reflecting off the white feathers of gulls flying overhead and the faces of children playing in the yard. The Sun’s genesis illuminates the clouds above and the rippling waves of the sea below. It is all around us. In fact, in many ways, it is us.
CHAPTER 3
A Strange History of Seeing Spots
I hold this opinion about the universe, that the Sun remains fixed in the centre, without changing its place; and the Earth, turning upon itself, moves round the Sun.
—Galileo, Letter to Cristina di Lorena, Grand Duchess of Tuscany, 1615
I, Galileo, son of the late Vincenzo Galilei, Florentine, aged seventy years, arraigned personally before this tribunal, and kneeling before you,… swear that I… [will] abandon the false opinion that the Sun is the center and immovable, and that the Earth… moves.
—Galileo, at his inquisition, June 22, 1633
AS THE FIRST humans acquired tools and an appreciation of minimalist cave art, they turned their attention to improving their lives and understanding the cosmos. Homo erectus erected the first blazing fire 500,000 years ago. After burgers went from raw to medium-well, a truly long time elapsed before the next human milestone: the bun. The first planting of grains and other crops, which occurred just 12,000 years ago, ended our million-year low-carb diet and freed us from being hunters. No longer plagued by the frustration of trying to sneak up on animals with bigger ears and faster legs, humans started staying put. Our nomadic days were ending.
After the beginning of agriculture, the next milestone was written language. This was cuneiform, invented by the Sumerians around 3400 BC. That’s not so long ago. We thus have less than six thousand years of records to let us know what bygone people thought about the Sun or anything else.
The Egyptian Museum in Cairo—among the world’s top must-see destinations—is a wonderland of hieroglyphs that, at first glance, look like a kindergartner’s idea of animal portraiture. When they were finally deciphered in the mid-nineteenth century (“Aha, I see! It’s snake before stork except after fish!”), the inscriptions revealed how central the Sun was to daily life. Here was a god no one treated lightly.
The early Egyptians, of course, did not limit themselves to fine motor skills. When it was finished, the Great Pyramid at Giza, forty stories tall and weighing 6.5 million tons, was the most precisely aligned structure in the world, with each side aimed toward one of the four cardinal directions to within one-tenth of a degree. Such massive monuments were thus more than mausoleums; they doubled as aids in solar and seasonal reckoning.
Numerous early civilizations noticed that the Sun’s rising and setting positions shift regularly, and so they placed markers to keep track of the changes. You can do this, too. Starting the first day of winter, the solstice (December 21), watch where the Sun sets from your least obstructed window. As the days and weeks pass, the Sun will keep setting farther to the right. On some particular evening, it might go down behind some “monument,” such as a neighbor’s chimney or a distant telephone pole. On June 21, the summer solstice, it will set as far to the right as possible. Then for the next half year, it will move to the left, retracing its steps, again setting at all those horizon points it hit during the first six months. Thus, each spot on the horizon marks a sunset on two occasions each year. The solstices alone, at the left and right terminals, host the Sun only once.
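If you'd rather compute the swing than watch it for a year, a standard bit of spherical astronomy gives the sunset point's offset from due west: sin(offset) = sin(solar declination) / cos(latitude). A rough sketch in Python, with a common cosine approximation for the declination; it ignores refraction and horizon height, so expect degree-level accuracy (neither formula appears in the chapter):

```python
import math

def sunset_offset_deg(latitude_deg: float, day_of_year: int) -> float:
    """Degrees north (+) or south (-) of due west where the Sun sets.

    Uses the rising/setting amplitude formula sin(A) = sin(dec)/cos(lat)
    plus a simple cosine fit for the Sun's declination. Refraction and
    horizon elevation are ignored, so this is a degree-or-so estimate.
    """
    dec = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    sin_a = math.sin(math.radians(dec)) / math.cos(math.radians(latitude_deg))
    return math.degrees(math.asin(sin_a))

# At latitude 40 degrees north (roughly New York):
print(f"{sunset_offset_deg(40.0, 172):+.1f} deg")  # June solstice:     ~+31 (far right, facing west)
print(f"{sunset_offset_deg(40.0, 355):+.1f} deg")  # December solstice: ~-31 (far left)
```

The roughly 62-degree total swing at that latitude is why a pair of horizon markers, one for each solstice, was all an ancient observer needed.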
In Manhattan, the setting Sun stands like an orange ball at the end of every numbered street on May 31 and again on July 11—“Manhattanhenge”—though rush-hour crowds seem disinclined to gather and stare. In Salt Lake City, whose roadways are far more perfectly aligned with the cardinal directions, the Sun dramatically and blindingly sets at the end of every street on March 20 and September 23—the equinoxes. If you were an obsessive-compulsive member of an ancient sky-watching culture and had a favorite community field, you’d certainly feel compelled to erect a set of stones to mark at least the extreme sunset positions, the two solstices.
The Mayans, the Egyptians, and those who built Stonehenge are usually said to have used the Sun to tell time. But “time” is abstract. It has no independent existence outside of being a human tool of perception. You cannot pick it up and analyze it, like cottage cheese. Indeed, Thomas O’Brian, the head of the National Institute of Standards and Technology’s Time and Frequency Division—which builds and runs our sophisticated atomic clocks—told me over lunch that he has “no idea what time is.” Neither did the ancients. Early cultures cared not about “time,” but about synchronicity and practical need: they knew that certain sunrise and sunset positions told them to plant crops or prepare for a likely stretch of scorching heat and low rainfall, which had previously happened when the Sun set at around the same place on the horizon.
The Egyptians represented the rising Sun as the god Horus, always depicted as a falcon. His job was to drive away darkness, and he never failed in that era when the aches and injuries from performing repetitive tasks went unrecognized. But even he played second fiddle to Ra, the chief Sun god, who also had the head of a falcon, but with a disk above it. It’s possible that the “wings” protruding from this disk may have been an esoteric depiction of the coronal streamers seen during a total eclipse.
Like those who came to power later, such as the Inca rulers and Japanese emperors who used the same ploy, the pharaohs claimed to be direct descendants of the Sun, thus assigning to themselves a pedigree it would be patently unwise to challenge. Imagine today, when we routinely belittle our leaders, if the president of the United States convinced us that he was descended from the Sun. Could he not get bills passed more easily—especially those involving solar energy?
The rulers at the height of the ancient Greek and Roman Empires lacked this degree of chutzpah. But Sun gods remained. To the Greeks, it was Helios, who each day rode across the sky on a chariot pulled by four horses we can only assume were clad in a fireproof material like asbestos.
Sun gods of various cultures often spilled over into other cultures, sometimes with name changes. For instance, Baal, worshipped by the Phoenicians, was also venerated by many Israelites. This was sufficient competition for the One God of the Bible that Baal put-downs appear periodically in its pages. “The priests of Baal were in great numbers,” disdainfully recounts 1 Kings 18:19. “Will you steal, murder, commit adultery, swear falsely, [and] make offerings to Baal?” asks Jeremiah 7:9. In practice, although the Sun god Baal was also the chief object of worship among the Canaanites, the name Baal started to be used for all manner of deities, which drove the Old Testament–writing rabbis bonkers.
Equally bonkers were the Sun myths recounted in numerous texts without editorial comment, no matter how ludicrous they might seem to us now. A famous Native American legend explains that a spider carries the Sun from a cave into his web each day. Surely every rebellious Iroquois teenager thought, “Yeah, Grandpa, right. A spider brings the Sun into his web. Sure.”
Chinese mythology says that long ago, the Sun gave birth to ten suns, which all lived in a mulberry tree until they flew into the sky. This, of course, made Earth way too hot, so an archer named Yi shot nine of them down. Apropos of nothing, the remaining Sun has a crow permanently living inside, which sometimes nibbles away pieces of it. Even to this day, the traditional Chinese symbol for the Sun is a red disk with a crow inside it.
Did anyone actually believe these stories? Some tales, no matter how dubious, were repeated for generations, but wiser heads naturally asked higher-level questions, such as whether Earth goes around the Sun or vice versa, and whether the Sun is a flat disk of fire, like burning kerosene on a plate, or, instead, a fiery sphere with no earthly analog whatsoever.
Great thinkers from less than a dozen countries came up with good answers and changed people from stargazers who gawked dumbly at the sky to those who still gawk, but on an advanced level. We cannot be too cocky these days, since our present grasp of the structure and origins of the cosmos remains so clearly incomplete. But the process of learning about sky objects started five thousand years ago with the Babylonians, Sumerians, Egyptians, Chinese, and later the Mayans, who accurately chronicled solar, lunar, and planetary cycles. Oddly enough, many other notable contemporary cultures—the Hebrews, Celts, Romans, and Japanese come to mind—brought us no astronomical advancement whatsoever.
Through it all, in the six centuries before Christ, the Greeks alone went far beyond merely observing and chronicling celestial patterns and rhythms. They came up with correct explanations. It’s a pity these individuals’ names are not universally known. They truly advanced our knowledge about the Sun and its relation to Earth, and even more or less figured out these two bodies’ correct distances, sizes, and motions. Yet outside of a few chance school assignments, they are now as forgotten as the Broadway headliners of the Gay Nineties.
Thales of Miletus (ca. 624–546 BC) was one founder of modern physical science. He thought that everything was made of water (true of living things, anyway) and that Earth was a disk floating in a huge ocean. Okay, but he was also the first to accurately plot the path of the Sun across the sky. He predicted the eclipse of May 28, 585 BC, which supposedly halted the battle between the Lydians and the Medes. Because of this eclipse, historians have been able to accurately pinpoint the date of this event, the earliest dated battle in history.
Pythagoras (ca. 580–500 BC) created the famous theorem we reluctantly studied in high school, but he didn’t do as well as an astronomer. He believed Earth is a sphere, but he thought it sat, unmoving, at the center of the universe, with the Sun going around it. At least he got the “Earth is a sphere” part right, even if a large majority continued to think otherwise for another two thousand years.
Anaxagoras (ca. 500–428 BC) correctly believed that objects on Earth and in the heavens are made of the same substances. He rightly said that the moon reflects light from the Sun, rather than glowing on its own.
Heracleides Ponticus (ca. 388–315 BC) was light-years ahead of his time. He was the first person known to propose that since Mercury and Venus stay so close to the Sun, they might orbit it, and that Earth might rotate on an axis. These ideas are absolutely correct, even if ignored at the time. How ironic that this most perspicacious of the early Greeks is also among the least known today.
Aristotle (384–322 BC) is still arguably the most famous of the ancient Greeks, but he held back science for the next two thousand years with his geocentric model of the universe, which went unquestioned until the time of Galileo and which bewilderingly became church doctrine. Later on, questioning Aristotle meant getting burned at the stake. A few of his other writings were correct. For example, he believed Earth is spherical, not flat.
Hipparchus of Nicaea (ca. 190–120 BC) discovered the twenty-six-thousand-year wobble of Earth’s axis called precession, and he created the first accurate star catalog, dividing stars into six magnitudes of brightness, a system that is still used today. He also determined the length of a year to within six minutes, even though he imagined he was timing the Sun’s annual orbit around Earth instead of the other way around.
Despite all these accomplishments, two other Greeks, Aristarchus of Samos (ca. 310–230 BC) and Eratosthenes of Cyrene (ca. 276–194 BC), probably deserve to sit atop Mount Olympus as the greatest of the great. Their deductions were not only dead-on correct but also eighteen centuries ahead of everyone else’s.
ARISTARCHUS WAS A mathematician and astronomer—the first person known to write and preach that the Sun is the center of the solar system and that Earth orbits around it once a year. Of course, it never pays to be significantly ahead of your time. Even for the freedom-loving Greeks, Aristarchus went too far. Cleanthes, the leader of the Stoics, tried to have him indicted on the charge of impiety for claiming that Earth doesn’t sit as the stately, immobile king of the cosmos, but rather does a lively circus two-step by orbiting the Sun while also spinning on an axis. To his contemporaries, that must have seemed as comical as suggesting that Earth pats its head while rubbing its belly. But when we justly praise Galileo and Copernicus for advocating the heliocentric theory, let’s remember who said it first, seventy-two generations earlier.
Aristarchus even made a stab at figuring the relative distance from Earth to the Sun and moon. Mathematically and logically, the idea is simple. When the moon precisely reaches its half-moon phase, the time when it should theoretically be located at right angles from the Sun, it does not actually stand 90 degrees from the Sun in the sky, but less. This geometry can easily be used for calculating where the Sun must be located, just as a friend at a known distance, whose face is exactly half-illuminated, tells you where the room’s lightbulb must be found.
Without the benefit of a telescope, however, figuring where exactly the moon sits the moment it is half-lit has a typical error of a few degrees—after all, the moon moves its own width each hour and seems just as perfectly “half” at 7:00 PM as at 8:00 PM. It’s not an easy judgment. Indeed, Aristarchus thought that when the moon was half, it stood 87 degrees from the Sun in the sky. Using that figure, he determined that the Sun must be eighteen times farther from Earth than the moon. And since they both appear to be the same size in the sky, it must mean that the Sun is physically eighteen times larger. This would make it several times bigger than Earth, and this general truth—that the Sun is at least bigger than Earth—is perhaps what convinced him that the Sun lies at the center of motion. A smaller body should logically orbit a larger one, not the other way around.
We know today that the half-moon actually hovers 89¾ degrees from the Sun, which places the Sun 400 times farther away than the moon, and therefore makes it 400 times bigger, which amounts to 109 times larger than Earth. Aristarchus had no way to precisely measure the correct angle, but he absolutely had the right idea. And, more than for anything else, he wins the cigar for his heliocentric model.
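Aristarchus's geometry boils down to one line of trigonometry: with a right angle at the Moon, the Sun-to-Moon distance ratio is simply 1/cos of the observed Sun-Moon angle. A sketch in Python (the 89.85-degree modern elongation used below is a standard value, a shade finer than the chapter's rounded 89¾):

```python
import math

def sun_moon_distance_ratio(elongation_deg: float) -> float:
    """Aristarchus's half-moon geometry: with a right angle at the Moon,
    (Sun distance) / (Moon distance) = 1 / cos(Sun-Moon angle)."""
    return 1.0 / math.cos(math.radians(elongation_deg))

print(f"{sun_moon_distance_ratio(87.0):.1f}")   # ~19.1 -- Aristarchus's figure; he bracketed it as 18-20
print(f"{sun_moon_distance_ratio(89.85):.0f}")  # ~382  -- near the true mean ratio of about 389
```

The lesson is how savagely the answer depends on the angle: the last three degrees of elongation multiply the ratio twentyfold, which is exactly why naked-eye timing of the half-moon couldn't do better.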
Aristarchus lived to be eighty, enjoying to his last days the slow pace of his native island of Samos. When he was thirty-five, a fellow genius, destined for an identically long life, was born in the city of Cyrene, now in Libya. Eratosthenes was brilliant, especially in his specialties of math and mapmaking, and it was he who coined the word “geography.” Eratosthenes’s great mind was recognized by all who knew him, and few were surprised when, at age forty, he was appointed by Ptolemy III to be in charge of the great library at Alexandria. Today that world famous repository of learning has been rebuilt, and after it reopened in 2002, I was thrilled to go there, though disheartened to discover that a foreigner must pay sixty dollars to get a library card. Still, a library card from the library of Alexandria! Almost irresistible. One wonders what it cost back in the library’s classical heyday, and for that matter, what they charged for an overdue papyrus.
Eratosthenes made a noble but futile attempt to calculate the distance to the Sun, too, but he remains most famous for being the first to accurately determine the size of Earth. And he did it without ever setting foot outside Egypt.
He was able to accomplish this seemingly impossible task because he recalled that on the day of the summer solstice, the Sun stands perfectly straight up for those who live in the city the Greeks called Syene, now Aswan. (Confusing things further, that large town on the Nile was known as Swenet by the Egyptians.) The straight-up Sun was not lost on Syene’s residents, who on that day could see sunlight shining to the very bottom of their wells. A precisely overhead Sun was the kind of cool thing that nowadays would become a tourist draw, complete with postcards. Eratosthenes was also very aware that the Sun is never even close to straight overhead as seen from his library in Alexandria. On the solstice, it misses the zenith by more than 7 degrees, or 14 Sun widths, or about one-fiftieth of a full circle.
Today any high school junior could set up a simple proportion to compare the 7-degree difference in the Sun’s angle between Syene and Alexandria with the 360 degrees in a full circle (meaning a ratio of 1:50), to obtain the unknown size of Earth. Here were three “knowns” and one “unknown.” Since Eratosthenes was both a mapmaker and a mathematician, the job was a piece of cake for him, too, even if no one had ever previously thought of it. The only tricky part was knowing the distance between Alexandria and Syene. Once he knew that, he’d just have to multiply it by fifty, and bingo: the circumference of our world.
Eratosthenes used the speed of camels and riverboats along the Nile and the time it took people to make the trip between Syene and Alexandria. You wouldn’t think that would be accurate enough to work, especially since any error was going to be multiplied fifty times over, but it did. He concluded that the cities were 5,000 stadia apart. The stade was equal in length to the average stadium of the time, very nearly one-tenth of a mile. Multiplying by fifty gave 250,000 stadia, a figure he nudged up to the more conveniently divisible 252,000: Earth’s circumference, which happens to be the correct value within 1 percent.
Unlike other geniuses who were belittled or burned at the stake, Eratosthenes was quoted and supported by notable writers for centuries. (So much for the silly but oft-repeated notion that Columbus was ahead of his time in believing that Earth is round—though he was indeed ahead of his unschooled peers.) The only thing later in doubt was the exact length of the stade Eratosthenes used. Two thousand years in the future, scholarly second-guessers pointed out that the common Attic stadium was about 185 meters, and that would make Earth’s circumference come out to 29,000 miles, with a diameter of 9,200 miles, which is 16 percent too big. But if we assume that Eratosthenes used the 157.5-meter Egyptian stadium (and why not—he was, after all, the librarian in Alexandria), his 252,000 stadia means that Earth is 24,466 miles around and 7,788 miles wide. This figure comes wonderfully close to the known diameter of 7,926 miles today. For that, Eratosthenes definitely wins the giant stuffed bunny.
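The whole computation, with both candidate stade lengths, takes a few lines of Python (the metric lengths of the two stadia are the scholarly estimates quoted above):

```python
import math

STADIA = 252_000            # Eratosthenes's circumference in stadia
METERS_PER_MILE = 1609.344

for name, stade_m in [("Attic stade, 185 m", 185.0), ("Egyptian stade, 157.5 m", 157.5)]:
    around_mi = STADIA * stade_m / METERS_PER_MILE
    across_mi = around_mi / math.pi
    print(f"{name}: {around_mi:,.0f} mi around, {across_mi:,.0f} mi across")

# Attic:    ~28,968 mi around, ~9,221 mi across  (the "29,000 / 9,200" above)
# Egyptian: ~24,662 mi around, ~7,850 mi across  (vs. the true ~24,900 / 7,926)
```

The Egyptian-stade result lands within a couple hundred miles of the figures quoted above; the small residual differences come down to rounding in the stade-to-mile conversion.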
NEARLY THREE CENTURIES AFTER the balding, lifelong bachelor Eratosthenes died around his eightieth birthday, another soon-to-be-famous Egyptian was born in that same city of Alexandria. Claudius Ptolemaeus, soon and forever after known as Ptolemy, thought deep thoughts but ended up being wrong about nearly everything. In his Almagest, Ptolemy turned his back on the brilliant works of Eratosthenes and Aristarchus and instead supported the idea of a geocentric universe, which became gospel for the next seventeen hundred years.
Ptolemy embraced Aristotle’s simplistic idea that everything is composed of the elements earth, water, air, and fire and that each element has its natural place and motion. Fire’s natural place is “up,” so it makes sense that flames and smoke always rise. Water and earth prefer being “down,” so items such as hamsters and pottery, composed of those elements, fall easily. It was hard to argue with.
Moreover, he said, the rules change beyond Earth. From the moon and beyond, all things are faultless and immutable, and their natural motion is neither up nor down but circular, since the circle is God’s perfect shape (because it’s beginningless and endless). To him, this explained why the Sun and moon move around us in circles (actually, they don’t), and it nicely jibed with the religious notion that God is “up there” in the perfect realm.
Tidy. But, of course, as wrong as serving Chicken McNuggets with white wine. So some Greeks nailed the truth, while others offered circular reasoning. Unfortunately, these latter views were the ones passed on and revered for the next two thousand years, thus keeping this garbage in schools and churches until the Renaissance, when Copernicus, Kepler, and Galileo finally prevailed. Even then, getting it right was a Sisyphean struggle, since anyone who contradicted the church’s Ptolemaic doctrine of perfection above was asking for serious trouble down here below.
In the late sixteenth and early seventeenth centuries, great peril lurked for anyone who preached that the Sun is mutable. The Italian scientist Giordano Bruno not only embraced the Sun-centered model of Aristarchus (and the more contemporary Copernicus), but he went so far as to suggest that space goes on and on, and that the Sun and planets are just one of many such systems in the cosmos. What would he say next? feared the church. That humans were not the only intelligent beings God created in the universe? For his prescience, Bruno was burned at the stake in 1600.
Spots on the Sun, which we now know to be the most obvious sign of solar changes that affect our lives and our world, were also the first bearers of information about the Sun itself. Of course, they came and went, and they seemed like blemishes, and thus were incompatible with the established notions of celestial perfection and changelessness. Then as now, however, a lot depended on your connections. With enough higher-ups in your Rolodex, you might just get away with some low-level blaspheming.
Such was the situation during the first few years after the invention of the telescope in 1608 by a Dutch spectacle maker. In 1610 and 1611, no fewer than five observers could make seemingly legitimate arguments that they were the very first to telescopically discover sunspots. Then as now, controversy gets noticed, and the world quickly became eager to confer great retroactive celebrity on whoever proved to be the first.
Johannes Fabricius, who lived in either Holland or Germany, depending on the map or time in history you choose, found sunspots on March 9, 1611, and excitedly watched them with his father for days on end through their telescope, until their eyes were damaged. Alas, good judgment comes only from experience, and experience comes only from bad judgment. These earliest solar observers had no one to warn them that, especially through a telescope, the Sun can quickly cook retinal cells.
Fabricius hurriedly published his discovery that very autumn—inarguably the first European book to discuss sunspots. Or maybe not inarguably. The Italian painter Raffael Gualterotti had already published his own detailed description of his naked-eye observation of a sunspot in 1604, which certainly trumped all the telescope users’ claims of priority.
Meanwhile, English telescope maker Thomas Harriot, fresh from a voyage of discovery to Roanoke, Virginia, with Sir Walter Raleigh, started recording sunspots in December 1610. Since Harriot mailed his descriptions and drawings to colleagues, thereby documenting them, the British to this day regard him as the first discoverer of sunspots. Harriot had actually been the navigator and go-to science person on two voyages to the New World. Spending more than a year there, he was one of the very few who bothered to learn the Native Americans’ language and much about their customs. Back in England, he wrote a bestseller titled Briefe and True Report of the New Found Land of Virginia, which made him a celebrity and earned him a wealthy patron. That’s how he could afford to buy a telescope even before Galileo had one. One bit of irony is that in his book, Harriot praised the natives’ bodies by saying they were “notably preserved in health, and not know many grievious diseases, wherewithal we in England are often times afflicted.” He attributed this partly to their extensive smoking of tobacco, which he lauded because “it purgeth superfluous fleame and grosse humors and openeth all the pores and passages of the body.” Harriot himself took to smoking this miraculous substance and ultimately died of cancer that originated in his nasal membranes.
The true fireworks, however, erupted between Galileo, who likely observed sunspots in 1610 with a twenty-power telescope but failed to document them (even though he later insisted that he had told many friends), and the German Jesuit mathematics professor Christoph Scheiner, who first observed sunspots with a student on a date the student later insisted, to Galileo’s pleasure, was March 6, 1611. Actual priority of sunspot discovery has continued to be hotly debated by scholars for the past four hundred years, but at the time this issue created an unbelievably nasty international fight between Galileo and Scheiner that lasted until Galileo died decades later, in 1642.
Scheiner’s thorough and intense observations through the spring of 1611 resulted in letters later that year to an influential magistrate, who talked him into publishing his findings under the pseudonym Apelles. Because he was a Jesuit, hiding his true identity was important. After all, he was reporting that the Sun had blemishes, although for years afterward he believed the spots were objects that, like little planets, merely surrounded the Sun and thus did not corrupt it. Three years later, after his book got rave reviews and the public clamored to know who the author really was, a fellow Jesuit proudly announced that it was Scheiner who had first discovered the spots.
Galileo read Scheiner’s first sunspot book in January 1612, and for the next two years, the conflict between them raged. It ultimately engulfed the educated classes of both countries and reached such a furor that the genuinely earlier observations of Harriot, who had made 450 meticulous drawings of the spots, and of Gualterotti and Fabricius were essentially ignored.
Galileo and Scheiner published digs at each other that resembled the dialogue of two neurotic Woody Allen characters. Galileo first thought the spots were clouds or a vapor, while Scheiner wrongly clung to the notion that they were not at all associated with the Sun’s body. He pooh-poohed Galileo in print by asking sarcastically, “Who would ever place clouds around the Sun?” Galileo publicly replied, “Anyone who sees the spots and wants to say something probable about their nature.”
Galileo correctly charted the spots’ daily progress across the Sun and also rightly realized that some spots vanished off one side and then reappeared on the other two weeks later. He knew they were on, rather than near, the Sun because they assumed a foreshortened appearance when approaching the Sun’s edge.
But Galileo went too far in his criticisms. He wrongly ridiculed Scheiner’s later claims that the Sun was not uniformly bright. He also erred in dismissing the German’s discovery that the spots moved faster near the solar equator—which made Scheiner the first to realize that, unlike the moon and Earth, the gaseous Sun has a differential rotation: its equator spins completely in twenty-five days, but its polar regions require more than an extra week. Scheiner was also the first to see that the Sun’s axis is tilted 7 degrees, causing the spots usually to follow curved paths across its body.
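Scheiner's discovery is now measured to high precision. A standard empirical fit expresses the rotation rate as a function of latitude; the coefficients below are published values from modern solar measurements, not numbers from this chapter, and they give sidereal periods (seen from the orbiting Earth, each period appears a day or two longer):

```python
import math

def rotation_period_days(latitude_deg: float) -> float:
    """Sidereal rotation period of the solar surface at a given latitude,
    from the standard empirical fit
        omega = 14.71 - 2.39 sin^2(lat) - 1.78 sin^4(lat)   [degrees/day]
    (published coefficients; an assumption here, not from the chapter)."""
    s2 = math.sin(math.radians(latitude_deg)) ** 2
    omega_deg_per_day = 14.71 - 2.39 * s2 - 1.78 * s2 * s2
    return 360.0 / omega_deg_per_day

print(f"equator:      {rotation_period_days(0):.1f} days")   # ~24.5
print(f"latitude 60:  {rotation_period_days(60):.1f} days")  # ~30.2
print(f"latitude 75:  {rotation_period_days(75):.1f} days")  # ~32.9 -- the extra week and more
```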
Galileo adored this particular Scheiner discovery. It could serve him well in his crusade to demonstrate one of the Sun-centered theory’s subplots—that celestial bodies spin with cockeyed tilts. But there was no way that Galileo was going to credit Scheiner with anything that big, so he merely started claiming that he had discovered it first. “Plagiarism!” shouted Scheiner to all who would listen. And yet Scheiner also tried to maintain the expected public composure of a Jesuit. We can imagine him seething, forcing a sickly smile over clenched teeth. Yes, those were fun times.
For the remainder of his life, Galileo vociferously insisted that he alone had found the first sunspots, and he seldom wasted an opportunity to criticize Scheiner. The Jesuit, bitterly offended by Galileo’s attacks, struck back, possibly even using his connections to land Galileo in hot water with the church, which ultimately led to his permanent house arrest and forced recantation—also through clenched teeth.
Anyone whose mistakes have ever been embarrassingly pointed out would have to sympathize with how the brilliant if opinionated Galileo felt when Scheiner prominently published a list of twenty-four of Galileo’s errors. The Italian went through the roof, writing, “This pig, this malicious ass, he catalogues my mistakes which are but the result of a single slip-up.”
Despite the fire and smoke, both men mightily advanced our understanding of the Sun. The sunspot drawings Galileo published in 1613 were far superior to any others, and he was the first to believe that the spots were part of the Sun itself. But Scheiner correctly observed the Sun’s rotation to average twenty-seven days over its entire surface—curiously, the same period as the moon’s spin. He was also the first to note that no spots appear near the poles, nor are they precisely on the Sun’s equator, but rather in a zone 30 degrees north or south of it—roughly analogous to Earth’s subtropical belts. And Scheiner alone discovered that the Sun’s dark areas are intimately accompanied by bright ones. He named these brilliant patches faculae, Latin for “little torches.” These bright spots help explain why periods of maximum sunspots are, paradoxically, when the Sun emits the most energy.
In the end, however, none of these European observers was even remotely the first to discover sunspots. The ulcer-inducing war between Galileo and Scheiner would be like an argument between two modern bike-riding Kansas teenagers who, coming upon the Mississippi, both claimed to be the first human to discover the river. Three thousand years earlier, on a tablet in the world’s largest city, Babylon, an unknown writer described a black spot observed on the Sun on New Year’s Day. The Chinese also repeatedly noticed spots on the Sun long before the time of Christ, with the first mention of them coming as early as 800 BC. They kept detailed records of these spots starting in 28 BC. At least one of the ancient Greeks, Theophrastus, as well as the Roman poet Virgil, also kept records of them.
Since the prevailing Western religious belief was of a perfect and changeless Sun, it’s no surprise that few Europeans noted sunspots prior to the Renaissance. The laudatory term “risk taking,” so popular in childhood education today, carried a different connotation back then that strongly implied it was not at all a good thing. Even so, some cast caution to the wind and wrote what they saw anyway. On December 8, 1128, John of Worcester made a drawing on parchment showing two unequally large, black spots on the solar disk, along with this notation: “There appeared from the morning right up to the evening two black spheres against the Sun.” This is a particularly credible report because Korean records give a vivid account of an unusual, bright red aurora five days later. Indeed, the amazing and almost mythical northern lights are among the most dramatic visual echoes of the dynamic connection between our world and the changing Sun.
It’s clear that people saw spots on their sacred Sun throughout history. But neither Galileo, Scheiner, nor any other of the newly hatched cadre of dedicated solar observers suspected that just as their own lives were drawing to a close, these same spots would deliver to Earth a lifetime of suffering.
Continues...
Excerpted from The Sun's Heartbeat by Bob Berman. Copyright © 2011 by Bob Berman. Excerpted by permission.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.
Table of Contents
Introduction
Chapter 1 Yon Flaming Orb
Chapter 2 Genesis
Chapter 3 A Strange History of Seeing Spots
Chapter 4 The Heartbeat Stops—and Other Peculiar Events
Chapter 5 The Unit
Chapter 6 Magnetic Attraction
Chapter 7 The Wild Science of the Bearded Men
Chapter 8 Cautionary Tales
Chapter 9 Why Jack Loved Carbon
Chapter 10 Tales of the Invisible
Chapter 11 The Sun Brings Death
Chapter 12 The Sun Will Save Your Life
Chapter 13 I'm an Aquarius; Trust Me
Chapter 14 Rhythms of Color
Chapter 15 Particle Man
Chapter 16 Totality: The Impossible Coincidence
Chapter 17 That's Entertainment
Chapter 18 Cold Winds
Chapter 19 The Weather Outside Is Frightful
Chapter 20 Tomorrow's Sun
Acknowledgments
Appendix: The Sun's Basics
Notes
Bibliography
Index
Reading Group Guide