Sunday, November 28, 2010

The day after Christmas, 2004

In my memory, the quickest, most abrupt end to a season of holiday cheer came on Dec. 26, 2004, when a monster earthquake killed more than 225,000 people (estimates vary widely). For most of the victims, it was death by tsunami.

The quake, in the subduction zone off the western coast of the island of Sumatra, began 35 kilometers beneath the surface. Its rupture spread to the northwest at 2-3 kilometers a second for a distance of 1,600 kilometers - a run that took roughly ten minutes - devastating much of the island and massively displacing Indian Ocean water. The resulting tsunami killed some 200,000 along the northern and western Indian Ocean shoreline - 50,000 died in India and Sri Lanka - and went on to kill others in Africa seven hours after the initial shock.

At a magnitude of 9.3, the quake was the third most powerful ever recorded, equivalent to almost 2 billion Hiroshima bombs. In addition to thousands of aftershocks, it triggered an 8.7 magnitude quake not far to the southeast - itself the 7th-largest on record - set off two volcanoes, and even stirred some volcanic activity in Alaska. Unfortunately, many of the tsunami deaths were unnecessary. Seismologists worldwide knew of the danger within minutes, but (unlike around the Pacific) Indian Ocean countries had no warning system.

If it's any help, the Earth itself took time to mourn the deaths. According to theory, vertical seafloor motions from the quake changed the planet's moment of inertia enough to shorten the length of each day by 2.68 microseconds.
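If you wonder where a number like that comes from, it's just angular-momentum bookkeeping. Here's a minimal sketch, plugging the reported 2.68 microseconds into the standard relation (my arithmetic, not the researchers'):

```latex
% Angular momentum L = I\Omega is conserved, so a change in the
% moment of inertia I forces a change in the spin rate \Omega:
\frac{\Delta\Omega}{\Omega} = -\frac{\Delta I}{I}
% The day length is T = 2\pi/\Omega, hence \Delta T / T = \Delta I / I.
% A shortening of 2.68 microseconds is a fractional change of
\frac{\Delta T}{T} = \frac{-2.68\times 10^{-6}\ \mathrm{s}}{86{,}400\ \mathrm{s}} \approx -3.1\times 10^{-11}
% i.e., the quake left the Earth very slightly more compact, and so
% spinning very slightly faster.
```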

Saturday, November 27, 2010

Oceans on the 8s

It's time for "Oceans on the 8s!" But unlike the Weather Channel, we don't need to update the report every 10 minutes. Once every 8,000 years ought to cover it.

Let's start with the continental rifts, which split the land apart and allow a baby ocean to form and grow. The ocean that will be created by the African rift hasn't been born yet, but it's looking healthy. Before long, it will look something like the Red Sea, a rift ocean which continues to grow nicely. (Incidentally, most rifts never create oceans. For instance, in the U.S. two rifts - one near where the New Madrid earthquakes happened in 1811-1812 and the other stretching from Oklahoma to Michigan - failed to split North America.)

The Atlantic Ocean also continues to grow as the mid-Atlantic ridge keeps spewing out more seafloor. There are no subduction zones off its shores, so it continues to push North America to the west. However, eventually the Atlantic will begin to shrink, and in the end will disappear. The Indian Ocean is hanging at the same size, its mid-Indian ridge and subduction beneath Indonesia balancing each other.

The Mediterranean is shrinking fast as Africa keeps moving north, and the Pacific Ocean also is getting smaller. The subduction zones all around the ocean are eating seafloor faster than it can be produced. (Between the pull of the Pacific and the push of the Atlantic, North America is moving west at about an inch a year.)
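An inch a year sounds glacial, and it is. A quick back-of-envelope sketch (my own arithmetic, using the rough rate quoted above) shows why one report every 8,000 years is plenty:

```python
# Rough check: how far does a plate moving ~1 inch/year drift
# between our 8,000-year reports? The rate is the ballpark figure
# quoted above, not a measured value.
INCHES_PER_METER = 39.37

def drift_meters(inches_per_year: float, years: float) -> float:
    """Total drift in meters at a constant plate speed."""
    return inches_per_year * years / INCHES_PER_METER

print(drift_meters(1.0, 8_000))  # ~203 meters - a couple of city blocks
```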

That's it for the 2010 report. Look for us again in 10010. (Because 8,000 years is something like 320 generations, you'll probably want to take pains to remind your kids and grandkids to keep passing on the word.)

Wednesday, November 24, 2010

Excitement in the Northwest

It was mid-morning on May 18, 1980, when a very puzzled and hesitant Helena radio DJ announced: "Er, you might want to put your car in the garage. They say Mount St. Helens has exploded, and ash is coming our way." I was every bit as puzzled as the DJ - St. Helens was a good 500 miles to the west - but I dutifully drove my car into my seldom-used garage. Within hours, a light gray powder began to accumulate atop the freshly budded leaves of every shrub in my yard.

It is possible this is not the most exciting volcano-eruption story you've ever heard, but it was mine. I even brushed ash from my lilac leaves into vials - tangible proof for posterity!

The closer you were to the volcano, of course, the more exciting things got. In Spokane, some 200 miles from the blast, ash turned day into night. Ash in the air circled the globe in 15 days.

On or near the mountain, things were rather more serious. A final, 5.1 magnitude earthquake under the mountain - the last of 10,000 smaller quakes during the preceding few months - set off the blast, which caused the north side of the peak to give way. It was the largest landslide in the Earth's recorded history. Mudflows created by steam and ash combined with snow and water rushed down the Toutle River, taking out 27 bridges. Energy released by the eruption was equal to 27,000 Hiroshima nuclear bombs.

You want an exciting story? The last words radioed by volcanologist David Johnston from near the summit were: "Vancouver! Vancouver! This is it!"

Unfortunately, the story of volcanoes in the Pacific Northwest gets more exciting still. Mount Rainier, the tallest peak in the Cascades, looms above Seattle and Tacoma, still active, poised to erupt at any time. (Many Seattle homes were built on former mudflows.) When earthquake swarms inevitably begin rumbling beneath Rainier, millions of people had better do more than drive their cars into the garage.

Tuesday, November 23, 2010

The importance of looking

Not too many years ago a paleontologist named Mary Higby Schweitzer was all over the news. Schweitzer, after more than a decade of dedicated research, had done what most of her colleagues thought was impossible - she showed that well-preserved fossilized bones of dinosaurs can contain blood cells and the remains of soft tissues that can tell us far more than we know today about these extinct animals.

The press went nuts. It didn't hurt that she was a good-looking woman, of course, but the fact was that scientists over the past 300 years had determined that dinosaur bones were all you could get. Any soft tissue that might remain would be so degraded after at least 65 million years as to be scientifically useless. (For most of those 300 years, of course, scientists didn't have the lab equipment to study it anyway.) But in 1992, Schweitzer noticed what looked like blood cells and other organic matter exactly where they should be found in fossil bones. Ignoring the received wisdom of other paleontologists, she patiently did the tests necessary to rule out other possibilities and published her tentative observations in 1993.

Because she was still a graduate student offering data that went against the common view, her paper got little attention. But she kept at it, finding more soft tissue in more dinosaurs, honing her observations, and finally publishing her team's work in 2007 and 2008. Despite controversy, her findings not only made media waves but gained general acceptance in the field. (She's written an article about all this in the current Scientific American.) The bottom line: It turns out that while lab extrapolations say dinosaur tissue can't survive intact enough to study, there obviously are situations out in the real world in which the tissue can indeed outlast the huge time span.

Schweitzer got a lot of recognition in Montana because she had done her initial work and obtained her doctorate from Montana State University and had worked for MSU's Jack Horner, the celebrated dinosaur expert.

But it remains to be seen if she will be a footnote, a sidebar, or a major figure in future books about paleontology. Her studies continue; her results may tell the tale. Still, I think she should be remembered in any case for paying attention to something important in the face of generations of scientists who simply hadn't bothered to look.

Monday, November 22, 2010

Rocks of ages

The writer John McPhee once came up with a striking way to think about the age of the Earth. He said to imagine that the length of your arm represents the planet's history. Your shoulder is the farthest back in time - 4.567 billion years ago, scientists say - your elbow would be a little more than 2 billion years ago, and so on. And here's his point: If you gently brushed the tip of your fingernail with a nail file, you would erase all of human history.

Filing your nails should never be the same again.
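For the curious, here's the scale arithmetic behind McPhee's image as a quick sketch - the arm length is my own assumption, and the human time spans are the usual round numbers:

```python
# Scale: one arm length stands for all 4.567 billion years.
ARM_MM = 700.0          # assumed arm length, shoulder to fingertip (mm)
EARTH_YEARS = 4.567e9

def scaled_mm(years: float) -> float:
    """Length on the 'arm timeline' representing a span of years."""
    return ARM_MM * years / EARTH_YEARS

print(scaled_mm(200_000))  # all of Homo sapiens: ~0.03 mm
print(scaled_mm(10_000))   # all of recorded civilization: ~0.0015 mm
# Both are less than a single pass of a nail file.
```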

But what's with that 4.567 billion year birthday? The age of rocks can be measured with great precision by studying the results of radioactive decay, but the oldest rocks ever found are dated only a little over 4 billion years. During its infancy, the Earth was constantly being bombarded with debris as the solar system pulled itself together, and no rock could form while that blasting kept the planet in a molten state.

So how do we get precisely 4.567 billion years? The answer is both simple and elegant: by dating rocks that are on the Earth, but are not of the Earth.

As the planets were being formed, countless other bits and pieces of matter never got a chance to join up. Those small, lonely asteroids just kept zooming around, pulled slightly this way and that whenever they neared a planet, but generally were doomed to isolation - until they found themselves aimed squarely at a crash landing on a planet such as the Earth. Over the eons, a great many of them have done just that.

Studies of the ages of these rocks show ... 4.567 billion years.
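The measurement itself boils down to counting atoms. Here's a minimal sketch of the idea, assuming the textbook uranium-238-to-lead-206 clock, a closed system, and no lead to start with (real meteorite work uses several decay chains and isochron plots to check itself):

```python
import math

# Uranium-238 decays to lead-206 with a half-life of ~4.468 billion years.
T_HALF_U238 = 4.468e9                 # years
LAM = math.log(2) / T_HALF_U238       # decay constant, per year

def age_years(daughter_per_parent: float) -> float:
    """Age from the measured daughter/parent atom ratio, assuming a
    closed system that started with no daughter atoms:
    D/P = e^(lambda*t) - 1, so t = ln(1 + D/P) / lambda."""
    return math.log(1.0 + daughter_per_parent) / LAM

# A rock in which ~1.03 lead-206 atoms have piled up per uranium-238 atom:
print(f"{age_years(1.03):.3e}")  # ~4.56 billion years
```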

If you held such a rock in your hand, your fingers would be curling around not only an object exactly as old as the Earth, but a third as old as the universe itself.

Sunday, November 21, 2010

Doom from above

Sometime in the Earth's young childhood, well over 4 billion years ago when it still was mostly a molten mass of rock, the wildly chaotic gravitational pulls of the early solar system almost certainly sent a body the size of Mars or larger careening into our fledgling home, blasting material off into space. That material - or much of it - remained within the Earth's gravitational pull, and before long in cosmic years coalesced into the moon we watch circle our planet today.

Scientists think this is true because rocks brought back by astronauts almost exactly match the makeup of the Earth's mantle.

I thought of this while reading this week's Newsweek, which uses its final page to try to graphically illustrate some issue. The current "Back Page," about dangerous asteroids, is headlined: "Is the end nigh?" Newsweek suggests not.

It does, however, point out that while an asteroid the size of a basketball crashes into the Earth's atmosphere daily, a basketball-court-sized object blows into town an average of once every 200 years. And a football-field-sized object comes by once every 10,000 years.

The piece offers a table showing the most dangerous asteroids - that we know about - that could strike us in the next 100 years or so. The sizes range from 98 feet in diameter to 3,609 feet, more than half a mile. But the odds for all of them are low - from 1 in 770 for a cute little 121-footer to 1 in 53 million for that big guy. (And for comparison, the asteroid that put the dinosaurs and 75 percent of all other species to bed 65 million years ago had to be seven to eight miles wide.)
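Those "once every N years" averages translate into personal odds simply enough. A quick sketch - my own illustration, not Newsweek's math - treating impacts as random events arriving at a constant average rate (the standard Poisson assumption):

```python
import math

def chance_within(years: float, avg_interval_years: float) -> float:
    """Probability of at least one event in a window, for events that
    occur randomly at an average rate of one per avg_interval_years."""
    return 1.0 - math.exp(-years / avg_interval_years)

# Odds of a basketball-court-sized arrival somewhere on Earth
# during an 80-year lifetime, given the once-per-200-years average:
print(f"{chance_within(80, 200):.0%}")     # ~33%
# ... and for the once-per-10,000-years football-field class:
print(f"{chance_within(80, 10_000):.1%}")  # ~0.8%
```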

So a really big, humankind-ending event isn't very likely anytime soon. But, as they say, you never know. It was only in 1908 that a 120-foot visitor broke up over Siberia, flattening 800 square miles of forest. I was struck by the comments of astronomer Alex Filippenko, who insisted that over the next few hundred million years the Earth is almost certain to be blasted by what would be the end of the world for us. He urged that we quickly begin readying the means to deflect such a monster. After all, he said, we are far from knowing about all that is out there - and an estimation of probabilities wouldn't have been much help to those dead dinosaurs.

Bodies the size of Mars no longer zoom erratically throughout our solar system. We would have seen them. But who knows about those 10-milers?

Thursday, November 18, 2010

Moral befuddlement

I've just started reading a new book by Sam Harris called "The Moral Landscape: How Science Can Determine Human Values." Here Harris, a neuroscientist best known for his first book, "The End of Faith," isn't so much talking about religion but rather the need for a morality based on human well-being - a morality firmly based on science and rationality.

Of course he rejects religion as necessary for morality, but some of his deepest scorn goes to liberals who seem to think that cultural relativism - "tolerance" - is the greatest good. In a speech at a scientific conference Harris said that "the moment we admit we know anything about human well-being scientifically, morally speaking we must admit that certain individuals or cultures can be absolutely wrong about it." He mentioned the Taliban as an example.

After his speech, he was approached by a female scientist who serves on the "President's Commission for the Study of Bioethical Issues." Here is part of their conversation, "more or less verbatim":

She: "What makes you think that science will ever be able to say that forcing women to wear burqas is wrong?"
Harris: "Because I think that right and wrong are a matter of increasing or decreasing well-being - and it is obvious that forcing half the population to live in cloth bags, and beating or killing them if they refuse, is not a good strategy for maximizing human well-being."
She: "But that's only your opinion."
Harris: "OK ... Let's make it even simpler. What if we found a culture that ritually blinded every third child by plucking out his or her eyes at birth, would you then agree that we had found a culture that was needlessly diminishing human well-being?"
She: "It would depend on why they were doing it."
Harris: "Let's say they were doing it on the basis of religious superstition. In their scripture, God says, 'Every third must walk in darkness.' "
She: "Then you could never say that they were wrong."

"Such opinions," Harris commented, "are not uncommon in the Ivory Tower."

I consider myself a liberal, and believe me, it is thinking like this - right down there with the brilliant logic of conservatives - that makes me ponder the grim future of our (morally) stupid species.

Wednesday, November 17, 2010

On thinking straight

This afternoon my thoughts turned to what is called the Monty Hall Problem (named for the host of the "Let's Make a Deal" TV game show). It is a devilish little puzzle that reveals how inept most of us are at reasoning. Suppose you are a contestant and are presented with three closed doors. One hides a shiny new car, the other two conceal goats. Your hope is to pick the door with the car.

Let's say you picked Door #1. Monty, that sly dog, then opens Door #2, revealing a goat. He asks if you want to switch your choice to Door #3. (Hint: you really, really should switch.)

This perplexes people to no end. After all, our powerful intuition is that because the car is behind one of just two doors, the chances are 50-50. Switching your choice can make no difference: the odds are 1 in 2 in any case.

Wrong! If you stick with Door #1, you have a one-third chance of getting the car. If you switch to Door #3, you have a two-thirds chance - twice as good!

Here's why: At the start of the game, each door has a one-third chance of hiding the car. When you pick a door, the other two doors add up to a two-thirds chance.

It doesn't matter if Door #2 is then opened to reveal its goat. Your choice - Door #1 - still has the same one-third chance. And the other two doors still have a collective two-thirds chance. So Door #3 - the only other unopened door - has a two-thirds chance of concealing the car. And you're a dummy if you don't switch.
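And if your intuition still balks, brute force settles it. Here's a quick simulation sketch - my own toy, built on the standard assumption that the host always opens a goat door you didn't pick:

```python
import random

def play(switch: bool) -> bool:
    """One round of the game; True means the contestant wins the car."""
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # The host opens a door that is neither your pick nor the car.
    opened = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

N = 100_000
stay = sum(play(switch=False) for _ in range(N)) / N
swap = sum(play(switch=True) for _ in range(N)) / N
print(f"stay: {stay:.3f}, switch: {swap:.3f}")  # ~0.333 vs ~0.667
```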

What interested me about all this today was that when I started considering writing about it, I realized that I'd have to explain this reasoning. But my mind was blank! I knew about the puzzle. I knew the correct choice was to switch. So why couldn't I remember how come?

Well, I soon managed to figure it out again. (As they say, this isn't exactly rocket science.) But I think my initial blankness was caused by my compelling intuition, unabated by the facts, that the odds had to be 50-50. (Dammit, that is obvious!) I think somewhere up in my brain, my intuition muscled in, kicked my better knowledge into a mental closet, and took over like some schoolyard bully.

The lesson? It's hard to think straight!

Tuesday, November 16, 2010

A decorous Tea Party

Some time ago, discussing the right-wing Tea Party enthusiasts, I made the off-hand comment that in contrast to the modern movement, the real Tea Party was well organized. (This, of course, was well before any of us knew the results of the midterm election. Oh, well.)

Anyway, I've since learned a few more details about how organized the original Tea Party really was.

The local Boston Committee of Correspondence and Safety - one of many such groups throughout the colonies that organized resistance to Great Britain and actually were in charge as war broke out - decided that tea delivered under the Tea Act must not be unloaded. When a ship carrying such tea arrived in the harbor, the committee asked the ship's American owner - in no uncertain terms - to have it sail back to England with the tea. The owner reluctantly agreed to ask British authorities if he could do that. He was turned down.

A week or two later, on Dec. 16, 1773, people dressed as Indians (to symbolically represent the whole of "America" to Europeans) took action while an estimated 8,000 others watched from the shore. The "Indians" told the ship's captain that no one would be harmed, and that only the tea would be destroyed. They meant it. The tea was to be tossed into the harbor, not stolen. No other property - other cargo, the captain's or crew's possessions - was to be disturbed. Private property was to be honored. In fact, when the "Indians" broke a padlock in the process of getting to the tea, they quickly replaced it with a new one.

After the event - 342 chests over the side - there were no riots or raucous celebrations. The thousands of participants and watchers simply went home. This was a carefully controlled protest.

In the spring of 1774, Parliament responded by passing the so-called "Intolerable Acts" that shut down the harbor vital to Boston's economy, suspended local colony government, renewed the quartering of British troops in colonial homes, and more. So much for a peaceful resolution. Next, a year later, came Lexington and Concord.

Monday, November 15, 2010

A lesson of the Boston Massacre

Recently, while learning about the origins of the War of 1812, I was puzzled: Why did the ruling party at the time declare war on the British Empire when America had only a tiny navy and a paltry little army of a few thousand men? I knew that the Republicans of this era were notoriously cheap, hating high taxes. And I knew that there was a general antagonism toward a "standing army." But, hey?

As it turns out, you only need to look back to the "Boston Massacre" of March, 1770. The British had sent 4,000 troops to Boston (a city of 15,000) to maintain order in the face of riots against Parliament's recent tax measures. The troops and Bostonians didn't get along. On March 5, some of the soldiers fired into a crowd of people protesting their presence, killing five of them. Nobody really knew if or how the troops had been provoked, or if the whole thing was premeditated. But, as a propaganda tool and a lasting rallying cry, it was a major step toward the war to come.

A big part of the propaganda had to do with a "standing army." As opposed to a legitimate army mustered in time of war, a standing army was seen as a weapon wielded by a tyrant to attack his own citizens whenever he suffered opposition. That's how most colonists saw the troops in Boston, and the incident on March 5 only confirmed their view. Countless speeches memorializing the killings over the next few years leading up to the revolution pounded the point home. No wonder that even on the eve of the War of 1812, a standing army remained a dirty word - a grim reminder that a despot always could turn that army against his own people.

Incidentally, one of the most effective pieces of propaganda came from none other than Paul Revere. He engraved a picture of the attack showing redcoats, standing in a line like a firing squad, shooting into a crowd of unarmed people, bloody bodies lying on the street. The engraving was reproduced widely in newspapers and broadsheets for years. It helped keep the anger alive. I guess he should be remembered for more than his midnight ride.

Sunday, November 14, 2010

Before the Stamp Act

One thing I never really learned in school is that prior to the 1760s, the British colonists in North America had been, in a sense, the most free people in the world - for something like 150 years. London had regulated trade across the Atlantic, but the colonies got to elect their own assemblies and govern their own affairs. They were still nominally under the control of the king and the Parliament - and certainly thought of themselves as Englishmen - but they ran their own show. The empire kept hands off local decisions.

Then, at least as we were taught, Britain passed the Stamp Act in 1765 - the first direct taxation of all colonists - which sent people into the streets, caused them to boycott British goods, and to create "Sons of Liberty" organizations throughout the colonies. (The Stamp Act was followed, of course, by the Tea Act. Whoops! People dressed up as Indians and did their tea thing in Boston's harbor.)

But wait a minute. Isn't this a bit of a quick reaction? From loyal subjects to rioters overnight?

Yes, it is. It turns out that for several years, Parliament had been changing the rules. After enjoying many generations of self-government, the colonies realized the British had begun taking over:

- In 1763, the Crown issued a proclamation severely limiting colonists from settling west of the mountains toward the Mississippi River - land recently ceded to England from France. Colonists were saying: "What the hey? Isn't this our say?"

- In 1764, Parliament passed the Sugar Act (also called the Revenue Act), which imposed customs duties on molasses and many other commodities, set severe penalties including the confiscation of ships, and forced alleged violators to appear in an admiralty court in Halifax, with no jury of their peers.

- Also in 1764, it passed the Currency Act, forbidding colonists from using local paper money to pay taxes - a law colonists said would impoverish them.

To the British, it all made sense: prevent chaos in the West, raise revenue to meet England's expenses in North America, and prevent paper-money fraud. To the colonists, it would not only destroy the economy but also be a huge violation of the property rights of free and loyal British subjects who had no representation - rights dating back to the Magna Carta in 1215.

My point is to bitch about not learning about any of this stuff in school. My impression was of people going ape at the drop of a hat. It just wasn't so. Instead, it was a steady erosion of rights over several years that led to what was to come. These early Americans, still considering themselves English dudes, took some years to get really upset.

Friday, November 12, 2010

Another Founding Father?

As late as 1760, 16 years before the Declaration of Independence, the British colonists in North America may have had some disputes with London, but the idea that they might one day sever their ties as loyal subjects of the crown was simply unthinkable. This was a time in which a person born a subject of the King of England lived a subject, and died a subject. Any alternative never crossed one's mind.

But, starting in 1761, a new kind of political movement arose: Maybe the colonies didn't have to bow to every whim of the crown after all! The idea of resistance - if not yet actual revolution - began to spread. But what kicked it off?

Historians think that beginning can be traced to one James Otis, a lawyer hired by Boston distillers who objected to the British renewal of the "Writs of Assistance": general search warrants that let customs agents board ships when they suspected contraband. The merchants, accustomed to bribing officials to get away with a little fast-and-loose business dealing, wanted the writs struck down.

Otis argued that the Writs of Assistance were unconstitutional under the British Constitution because they violated a person's right to his property. His argument was shot down in court, but during the next few years he wrote pamphlets - "The Rights of the British Colonies Asserted and Proved" was one - that were widely read. The concept that colonists were empowered to resist those acts of Parliament they didn't like struck a chord. Refusal to accept "taxation without representation" wouldn't be far behind. John Adams, who watched the courtroom argument as a young lawyer, later believed that it was Otis's arguments in the Writs case that helped spark what became the American Revolution.

Otis's story ends badly. Late in the 1760s, after being subjected to constant attacks by political opponents, he began suffering from mental illness. He died in 1783 after being struck by a bolt of lightning.

It is true that resistance was in the colonial air anyway. King George III would keep making sure of that. Still, looking back 250 years, maybe we should be remembering James Otis as the initial Founding Father.

Wednesday, November 10, 2010

Students and BS detecting

For years I've admired Sharon Begley, a science writer lured from the Wall Street Journal to Newsweek several years ago - perhaps when she saw the writing on the wall about where the Journal's ownership was heading. Newsweek gives her a weekly column in which she can raise some necessary questions.

One of her interests is science education. After all, studies keep showing that American kids keep falling farther and farther short of other nations' kids in their science and math skills. In a recent column, Begley says that the main problem is that K-12 schools aren't doing well teaching students to recognize BS, or "bad science."

Rather than concentrating only on all the facts of the various sciences - memorizing the structural formulas for alkanes, for instance - kids should learn the first, most important principle of science. And obviously, judging from grownups, they're not learning it. The principle, she says, is this: "the most useful skill we could teach is the habit of asking oneself and others, how do you know? If knowledge comes from intuition or anecdote, it is likely wrong."

This is obvious, except to the human brain. For instance, it can't tell randomness from real patterns in data (climate warming, say), and it wants to assign causality to weak data in order to confirm its own beliefs. The brain is a really cool thing, but it can really be dumb, too.

Science classes obviously have to teach the science - trig, the Krebs cycle, Ohm's law, energy equals mass times the speed of light squared - but foremost, Begley says, they need to teach kids that "science is not a collection of facts but a way of interrogating the world."

Not to mention, and she didn't, interrogating politicians, whose BS stands not just for bad science.

Tuesday, November 9, 2010

Secrets in the sky

When physicists and astronomers concentrate on the normal atoms and energy that make up the well-understood stuff of our universe, our stars and planets and us, they're only dealing with 4 percent of the total. (And when we point telescopes into the sky, the part of this normal matter that glows in optical wavelengths amounts to a scant 0.5 percent.) A full 96 percent of all the stuff that is out there goes largely undetected and is far from understood.

I don't know whether to be amused or alarmed.

It turns out dark matter amounts to 21 percent of the total mass-energy of the universe, and dark energy adds up to a full 75 percent. (The use of the adjective "dark" in both names is misleading. They have nothing to do with each other. Dark matter pulls in like normal matter does, helping interstellar gases and dust clump together to form stars, galaxies and galactic clusters; dark energy pushes out, accelerating the expansion of space itself.)

Scientists think dark matter is mostly what they call WIMPs - weakly interacting massive particles - and strange particles they are indeed. Look in vain for such normal components of matter as protons and neutrons. It's a bit of an embarrassment that years of elaborate experiments have yet to detect their presence. Still, the dark matter exists. We wouldn't be here without it.

Dark energy, making up three-fourths of the mass-energy of the universe, really isn't understood at all. But it is there, and four or five billion years ago its outward-pushing power overcame that of gravity and stepped on the gas. There are theories about it - having to do with quantum fluctuations and something called quintessence - but I'll pass on the explanations. This is heavy-duty physics that wouldn't fit in a blog, even if a former small-town journalist could do it justice. Let's just say it is energy with negative pressure that speeds up the expansion of the universe. And the foot may stay on the accelerator forever.

I've had a layman's fascination with astronomy all my life. I've read many, many books of popular science, and I've just been watching a long series of up-to-date lectures on the subject. Physics and astronomy have an amazing record of accomplishment. On the other hand, apparently it's 4 percent down, 96 percent to go.

Sunday, November 7, 2010

What? The sky gets dark at night?

I've been listening my way through astronomy lectures about black holes, but also looking forward to upcoming ones on the next disc, which included "The Paradox of the Dark Night Sky." That's a cool topic, and I thought I'd write about it. But the DVD turned out to be flawed - it came with a big crack - so I was out of luck.

But the hell with it, I'll do a blog anyway. It's about this question: "Why is the sky dark at night?"

People who hear that question for the first time usually think it's pretty silly. (Gee, do you suppose it has something to do with the sun going down?) But the issue is a lot deeper than that. Go out tonight and gaze at those umpteen stars out there. They are suns, and their combined brightness (including all those too dim for our little eyes to see) should far exceed the brightness of our Sun. Not only should the night be bright, but the day should be brighter than it is - even considering that a star's apparent brightness dims with the square of its distance from us.
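The geometry is what gives the paradox its bite. Here's a sketch of the classic argument, assuming - as the old astronomers did - an infinite, unchanging universe sprinkled uniformly with stars:

```latex
% Stars have number density n and average luminosity L.
% A thin spherical shell of radius r and thickness dr centered on us
% holds n \cdot 4\pi r^2 \, dr stars, each delivering flux L/(4\pi r^2):
dF = \left( n \, 4\pi r^2 \, dr \right) \frac{L}{4\pi r^2} = n L \, dr
% Every shell contributes the same light, no matter how far away;
% summing over infinitely many shells gives an infinitely bright sky:
F = \int_0^{\infty} n L \, dr \;\to\; \infty
```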

The paradox was pondered by the likes of Kepler and Newton, but it wasn't until the early 19th Century that Heinrich Wilhelm Olbers brought his fellow astronomers to attention by pointing out that the darkness meant that at least one of scientists' many basic assumptions about the universe (he had no idea which one) must be wrong.

Several possible answers - light blocking by interstellar dust or dark matter - are easily ruled out. The best answer, especially now that we have a rather good handle on the age of the universe, appears to be that the night is dark because the universe is relatively young. There simply hasn't been enough time for light from the countless farthest stars to reach us yet. In another 10 billion years or so, that could change. Earthlings probably won't be around to see it after our Sun goes haywire, but by then the night sky could indeed be bright.

But I'm looking even farther down the road. If the expansion of space continues to accelerate due to dark energy, eventually all the stars in the night sky (except the stars in our "local group" of galaxies that are gravitationally locked together) will be so far away as to disappear from sight. The sky will be dark indeed. The Milky Way will really stand out!

Or, at least on a somewhat shorter time scale, if the universe is vastly bigger than the part that is currently visible to us, starlight in our part of the universe may eventually get really bright.

Or, if the universe actually is a "multiverse" - with other separate universes forever beyond our sight or reach - the fate of the night sky will depend on just how big our own particular universe actually is.

All this from thinking about why it gets dark at night. Gosh, maybe it's a seriously silly question!

Friday, November 5, 2010

My little guy

In Helena, Montana, we're in that weird early-November period in which temperatures swing like an out-of-control pendulum. The temperature might be somewhere around the mid-twenties at night, but rise to the 60s by around 4 p.m. - only to start falling toward below freezing a few hours later.

Something about these temperature swings makes the solitary red squirrel in my neighborhood act like a truly frantic rodent. I watched him today - the little hyper guy with the bushy tail whose territory apparently consists of my end of the block - as he jerked around like Brownian motion. He'll tightrope-walk the telephone lines above the alley, then sniff out what might be in the gravel down below. Up one tree. Down another. Along the top of a fence. Nosing among fallen leaves for the remains of oxidized apple crumbs left by deer. Nosing below bushes, looking for remaining seeds. Nosing around for anything he can find.

Soon winter will come. Snow will cover the ground, and I can't imagine how my squirrel will eat. But he will. He's squirreled away food. He'll have the energy for entertaining visiting females, the energy to climb the highest trees to search for any remaining nutrients, and the energy to emerge in spring as bushy-tailed as ever.

I like that little guy.

Thursday, November 4, 2010

The star-struck blues

Years ago I learned something ultimately terrible but still strangely comforting. The bad news was that the Sun eventually was going to quit on us. It would run out of fuel, expand into a red giant in a frantic attempt to compensate, but finally collapse into a white dwarf, dooming the Earth forever.

The good news was that stars of the mass of the Sun keep shining for about 10 billion years. As the Sun happens to be 4.6 billion years old, even a kid could do the math: We had a good five billion years to go! (I knew none of this had any bearing on me, personally, but still ...)

Anyway, it was an idea that kept the monsters securely trapped under my bed at night, and it's been a calming thought in the back of my mind ever since. Oh, foolish me!

It turns out that stars like the Sun make their energy by fusing four hydrogen protons into one helium nucleus. That's a big net loss of particles. This has consequences, because the pressure needed to make the Sun's engine work is proportional to the product of the particle density and the temperature. As the number of particles goes down, the temperature has to go up.
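In equation form, this is just the ideal gas law applied to the Sun's core, which has to hold up the weight of everything above it. A minimal sketch:

```latex
% Core gas pressure, with n the number density of particles,
% k_B Boltzmann's constant, and T the temperature:
P = n \, k_B \, T
% Hydrostatic equilibrium fixes the pressure P needed to support the
% overlying layers. Fusion steadily turns four protons into one helium
% nucleus, so n falls - and to keep P the same, T must rise.
% A hotter core fuses faster, so the Sun slowly brightens.
```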

Rats. Within a few hundred million years, the Sun will have heated up so much that people on Earth will really, really notice it. Within, say, half a billion years all the water in all the oceans and lakes and rivers and toilet bowls will be gone. The Earth will be far more unlivable than the worst desert you could ever imagine.

We have been robbed. Instead of another 5 billion years of earthly bliss, we get maybe half a billion. Somebody call the police!

Of course we may have options in the future. Maybe we can build a big space tug and tow our planet away from the heat. Maybe by then we can just download ourselves onto computer chips, load them into spaceships, and go live somewhere else. Hell, maybe we'll even be able to fix the damn Sun.

But wait a minute. Why am I saying "we?"

Wednesday, November 3, 2010

A prescient voice

Reading a New York Review of Books essay on "The Irony of Manifest Destiny: The Tragedy of America's Foreign Policy" by William Pfaff, I seem to be learning about a man who, over the past 50 years, has always gone contrary to the conventional wisdom, and has always been right.

For instance, those of us who are old enough will remember cold warriors strutting around and beating their chests about "Finlandization." (This was about that country's craven buckling to the Soviet Union after World War Two.) In fact, as Pfaff wrote at the time, Finland was attacked by Stalin in 1939 and heroically defended itself until forced to cede some territory. After the war, as a former German ally and facing absorption by the Soviets, Finland was in a terrible situation. It maintained a careful neutrality and had to make some uncomfortable compromises, but it managed to keep its independence and democracy. Now, of course, Finland is an open country with a successful high-tech capitalist economy, state-of-the-art public health care, and a much-admired public education system. And it sure outlived the Soviet Union - thanks to restraint and patience. That's something that's been sorely lacking in the United States, which hasn't unequivocally won a war (excepting the dashing Reagan attack on the island of Grenada) since 1945.

Over the years, Pfaff has continued to be correct. Nearly 50 years ago, he argued that Soviet communism was inherently weak and eventually would collapse on its own. He advocated careful containment until then. But no, we had to put more than 58,000 names on the Vietnam War Memorial. He was right 20 years ago, when he argued that with the end of the Cold War America's military should be reduced and adapted to new circumstances. But no, it was hugely increased - handy for our failed attempt to catch Bin Laden and impose democracy in western Asia. Now, in his 80s, Pfaff doubts the "enormity of the Islamic radical threat." Don't bet he's wrong.

Not being one to sit around reading scholarly articles on foreign policy, I hadn't come across Pfaff. But I wish he could live another 80 years. We need people who can see behind what the reviewer called the "received ideas, attitudes, and platitudes of the age."

Tuesday, November 2, 2010

Weirdness

The last couple of days I've been learning about two really strange and totally separate things: the weirdness of the moons of the outer planets of our solar system, and the weirdness of American politics in the first decade of the 19th Century. Guess which are weirder? Here's a hint - when have American politics not been weirder than anything else you can think of?

Sure, there is Io, a moon of Jupiter that is the most geologically active body in the solar system. It spurts sulfur compounds like a colicky baby. There is Titan, a moon of Saturn, with its methane lakes that might nurture microbes. But then there is early American politics, which make no darn sense at all.

Of course, the politics probably do make some sense to historians who have spent years studying the subject. But to the rest of us ... weird.

Modern politics sometimes are echoed in the early 1800s. "Republicans" of the time, a very distant relative of the modern GOP, can sound very modern and tea party-ish: In 1812, Congressman John A. Harper of New Hampshire celebrated the United States as a loose confederation of sovereign states "without foreign or domestic wars, without taxation, without any more of the pressure of government than was absolutely necessary to keep the bands of society together." In a grand paradox, Thomas Jefferson said the U.S. had the strongest government on earth, because it demanded so little of its people that they would rise to its defense. This was the party that hated the idea of a "standing army," decimated the navy and army, economically weakened the U.S. with a trade embargo that only helped Canada and Great Britain, yet soon declared war against Britain, the greatest military power in the world, at a time when the U.S. was basically a cripple among nations.

This was long before the Democratic Party was formed - hell, it was before the Whigs. The Federalists, scared big time by Republican policies, argued that the British were right, the Republicans were wrong, and the country was going to hell. Anarchy was on the way. They cared mostly about international trade. Liberals, they were not. Nobody gave a damn about liberty for blacks, only white men.

Politics have changed a lot in 200 years. Republicans fought for racial equality (in law, anyway) in the mid 1800s, progressivism under Teddy Roosevelt at the turn of the century, and went ape shit about the New Deal. Democrats backed southern slavers, then got all disorganized (Mark Twain said he belonged to no organized party - he was a Democrat) and then in the 1930s belatedly realized they needed black votes.

At least the distant moons, Io and Titan, have stayed constant in their weirdness. I don't even want to go into modern U.S. politics.

Monday, November 1, 2010

Our neighbors and climate change

I recently read an article in Scientific American magazine that had a subhead which caught my attention: "Why can't we have a civil conversation about climate?" The article centered on climate scientist Judith Curry, who has enraged many of her colleagues by saying that some global warming critics - certainly not most of them - have legitimate concerns about the science that is being conducted. She says these real worries are too often ignored (or responded to as though they were merely political claptrap) by the mainstream science community.

I read it with interest as one who tends (like most of us) to consider sources over substance. But the article jumped back into my mind this afternoon as my lecture series moved to "terrestrial" planets like Venus and Mars. Venus, covered by clouds of sulfuric acid and with an atmosphere made up of 96 percent carbon dioxide and 4 percent nitrogen, exhibits the ultimate "runaway greenhouse effect." Its atmospheric pressure is 90 times that of Earth. The CO2 captures most heat radiated by the planet, leaving Venus baking in a temperature of 480 degrees Celsius night and day, everywhere. It's hot enough to melt lead.

Mars, on the other hand, suffered an "inverse greenhouse effect." The planet used to have surface water - the evidence is everywhere - but today its atmosphere, also mostly carbon dioxide, has thinned to only one percent of the pressure of Earth's, making it incapable of retaining surface water anymore. In addition, its temperature drops as low as -130 C. Not a nice place to live, either. What happened to the atmosphere? Maybe, because Mars is so small and so far from the Sun, a cooling trend caused carbon dioxide to begin freezing out, decreasing the greenhouse warming. That would lower the temperature further, causing more air to freeze out, and so on.
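That "and so on" is a positive feedback loop, which is what makes it so devastating. A toy illustration - every number here is invented purely to show the runaway shape of such a loop, not actual Martian physics:

```python
# Toy positive-feedback loop: cooling freezes out CO2, and losing CO2
# cools the planet further. All values are made up for illustration.
temperature = 240.0   # kelvin, invented starting point
co2 = 1.0             # CO2 remaining in the air, arbitrary units

for step in range(6):
    # The colder it gets, the larger the fraction of CO2 that freezes out.
    freeze_fraction = min(1.0, 0.1 + (240.0 - temperature) / 100.0)
    frozen = freeze_fraction * co2
    co2 -= frozen
    temperature -= 20.0 * frozen   # less CO2, less greenhouse warming
    print(f"step {step}: T = {temperature:6.2f} K, CO2 left = {co2:.2f}")
# Each pass cools the planet and speeds up the freeze-out; the loop
# only winds down when there's almost nothing left to freeze.
```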

At any rate, on both Venus and Mars the changes were permanent and devastating. Life that either planet may have had never stood a chance.

Back on Earth, we fight about climate change, with each side unhappy about uncertainties in the data and in the computer models. But the uncomfortable fact is that as long as the uncertainties exist, things could turn out to be much rosier than projections indicate (Whoopee!), but they also could turn out to be much worse. The deep history of the Earth shows that our planet has endured periods of both warming and cooling and has come out of them - although today either sort of change would have staggering social consequences. But when it comes to worst-case scenarios, we've got a couple of close planetary neighbors to give us pause ... in no uncertain terms.