Thursday, December 30, 2010

A kind of immortality

It is impossible to think about one of the most important tools in medicine - the culture and study of human cells - without knowing about Henrietta Lacks, a poor, illiterate black woman who died of cervical cancer at the age of 31, nearly 60 years ago.

I've known about Rebecca Skloot's book, "The Immortal Life of Henrietta Lacks," for many months, but it was only this week that I got around to buying a copy. I'm an idiot for waiting so long. It's one of the very best books of 2010, with stunningly moving writing that offers something for everyone - the science of cell cultures, the stumbling growth of medical ethics, race relations over the past 60 years, and the personal tale of the lives of Henrietta's children and grandchildren. Let's add the author's long battle to win their trust and respect.

Shortly before Henrietta died, doctors at Johns Hopkins operated to insert packets of radium. In the process, without her knowledge or consent, they took samples of her tumor, hoping to get cells they could keep alive.

Her cancer cells not only thrived, but multiplied like lightning. If you've ever had a polio vaccine, thank Henrietta's cells - code-named HeLa - for showing the way. In fact, they've revolutionized the field. They've participated in atomic bomb tests to examine the effect of radiation on cells. They've been to the moon to study low gravity's effect. They've helped lead to big advances in gene mapping, in vitro fertilization, and cloning.

They've also become a multi-million-dollar industry - not one dollar of which ever found its way to Henrietta's family, which knew nothing about HeLa for more than 20 years.

Skloot, an award-winning science writer, knew that this story required far more than science. It's a wonderful book. I'll never think of those vaccine-laden sugar cubes that we kids lined up to swallow in the early 1960s in quite the same way.

Tuesday, December 28, 2010

The start of something big

On some level, every generation of young people thinks it invented sex. (This is SO cool - it MUST be new!) In reality, of course, they know the business of internal fertilization - as opposed to spreading one's eggs on the seafloor for subsequent fertilization by males - has been around a long time. But just how long?

Until very recently, scientists thought our modern form of sexual intercourse was invented by a type of fish that includes early sharks roughly 350 million years ago. Now, however, a new fossil study has revealed that copulation was brought into the world by primitive fish called placoderms ("plated skin"), armored creatures with backbones and jaws. They may have been ugly, but they invented the deed at least 25 million years earlier.

This leap back in time apparently has great significance for the study of evolution, but we're interested because placoderms are directly on the long line leading to creatures with four limbs, including, eventually, humans.

And if humans are good at anything, it's you know what.

That proficiency took a long time to translate into significant population growth. Warfare, famine and especially disease kept our numbers down. It was one thing to make babies, but quite another to keep them alive to adulthood. Then, a few hundred years ago, all that quickly changed.

And, despite a general slowdown in women's birth rates, late in 2011 the Earth will reach a new milestone: a population of 7 billion souls. According to the UN Population Division, that number is projected to reach 9 billion by 2045.

Those ugly little placoderms had no idea what they'd done.

Monday, December 27, 2010

The glow of cigarettes

Since I quit smoking seven or eight years ago, I've tried not to turn into one of those post-smoker fanatics, pounding on tables and proclaiming: "Tobacco is the Devil and cigarette-company executives are his minions!"

It's been tough, however, given the revelations that keep popping up about what those executives knew, when they knew it, and how little they gave a damn.

For example, I just learned (from an article in the current Scientific American) that each of the almost six trillion cigarettes smoked each year contains a small amount of the radioactive isotope polonium-210, a decay product of uranium (it gets into tobacco mostly from the phosphate rock from which fertilizer is made). It adds up to the equivalent radiation dosage of 300 chest x-rays a year for a person who smokes a pack and a half a day.

Polonium isn't the worst carcinogen in tobacco smoke, but it no doubt kills thousands of smokers a year. The tobacco industry has known about it for decades, and has come up with a variety of methods that could virtually eliminate the danger. But it was decided that "Removal of these materials would have no commercial advantage."

Grr. I'm looking around for a table to pound. But here is some good news. In June, 2009, President Obama signed into law an act that for the first time brings tobacco under the jurisdiction of the Food and Drug Administration. Requiring the industry to remove radiation from its tobacco products sounds like an obvious place to start.

Friday, December 24, 2010

Animals and math

Stories keep coming along about animals being better at mathematical computation than humans are. Many such stories at least are of scientific interest (with exceptions, such as the story about octopuses' knowledge of soccer), but they generally can be explained in other ways.

For instance, bees that seem to find the shortest path between many flowers in a meadow are said to have powers of calculation that far surpass our own. This is nothing more than a guess. To determine whether a bee's path is optimal, you would have to measure all possible paths. Nobody's done that, or likely ever will. To suggest that bees have invented a general algorithm for picking the shortest path is a bit ridiculous - such tasks are so complex that they fall into a class of computationally intractable problems called NP-hard. Bees have evolved to be good at what they do, but they surely aren't doing math.
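For the curious, here's a minimal sketch (in Python, with made-up flower positions) of what "measure all possible paths" actually entails. The point isn't the answer; it's that the number of routes to check grows factorially with the number of flowers.

```python
from itertools import permutations
from math import dist, factorial

# Made-up flower positions; the point is only how fast the route count grows.
flowers = [(0, 0), (3, 1), (1, 4), (5, 2), (2, 6), (6, 5)]

def route_length(order):
    # Total distance when visiting the flowers in the given order.
    return sum(dist(order[i], order[i + 1]) for i in range(len(order) - 1))

best = min(permutations(flowers), key=route_length)
print("routes checked:", factorial(len(flowers)))        # 720 for 6 flowers
print("shortest route length:", round(route_length(best), 2))
print("routes for 20 flowers:", factorial(20))           # about 2.4 quintillion
```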

Another story asserts that pigeons are better than humans at "getting" our old friend the Monty Hall Problem. (Recall that a prize hides behind one of three doors. The contestant picks one door, and thus has a one-third chance of winning the prize. The game host then opens one of the remaining two doors, revealing no prize. Should the contestant switch to the remaining closed door? Yes. While his first choice still has that one-third chance, the remaining closed door now has a two-thirds chance of hiding the prize.)

People are bad at this. In a recent study, even after playing multiple times (ample time to see that switching doubles one's chance of winning), most people switched only two-thirds of the time. It took only a few tries for pigeons to learn to switch every time.

Does this mean that pigeons are making some kind of statistical analysis? Are they thinking about the odds? Nope. They're just good observers who follow the evidence.

People, by contrast, think too much. They overanalyze, and get themselves all confused.

Thursday, December 23, 2010

Big findings, subtle flaws

Most of us put great weight on scientific consensus. When the results of peer-reviewed experiments gain general acceptance, it seems silly to doubt them.

Unfortunately, it is becoming harder to keep the faith when results get harder to replicate as time goes by. The reason? Blame the human factor.

Let's take a fanciful example. You're a research biologist with a "brilliant" idea: red ants are "red" because back in their evolutionary history their main predators were color-blind! Red and green looked the same to them!

You obtain grants and embark on experiments. Your grad students return with reams of data. Are you going to pick and choose between the data to confirm your hypothesis? Of course not, you're no fraud! But there still are subjective, delicate decisions to make regarding exactly which data to report. And, after all, you do hope for positive results - results more likely to be published in leading journals.

In recent years, attempts to replicate initial findings have tended to fail. For instance, the therapeutic value of certain new antipsychotic drugs seems to be waning. A study showing a strong correlation between bodily symmetry in animals and their reproductive success seems to be falling apart. A finding - already in the textbooks - that describing a face doesn't help us remember it may be true, but it is getting harder and harder to prove.

Chance plays a role in all this, of course, but subtle selectivity seems to be a big part of the story. This is unconscious - not fraud. Australian scientist Leigh Simmons (quoted by Jonah Lehrer in a recent New Yorker) put it this way: "The act of measurement is going to be vulnerable to all sorts of perception biases. That's not a cynical statement. That's just the way humans work."

Monday, December 20, 2010

An energy dilemma

Is increasing energy efficiency a bad thing? Well, maybe so.

In his article in the current New Yorker, author David Owen says the question was first raised 150 years ago, when the 29-year-old British economist William Jevons concluded that more economical use of fuel results not in diminished consumption but in an overall increase.

At the time, Great Britain was the world's leading industrial power, but its coal reserves were running out. Jevons said efforts to increase coal-burning efficiency would only backfire. As an example, he focused on the British iron industry. If a new process could produce iron with less coal, profits would rise, stimulating construction of new blast furnaces. Coal use at the increased number of furnaces would more than make up for the diminished consumption of each of them.
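To see the shape of Jevons' argument, here's a toy calculation with entirely hypothetical numbers - per-furnace savings swamped by growth in the number of furnaces:

```python
# Entirely hypothetical numbers, just to illustrate the rebound Jevons described.
coal_per_furnace_old = 100.0   # units of coal per furnace per year
furnaces_old = 50

coal_per_furnace_new = 75.0    # a 25% more efficient smelting process
furnaces_new = 80              # cheaper iron spurs construction of new furnaces

print("coal used before:", coal_per_furnace_old * furnaces_old)  # 5000.0
print("coal used after: ", coal_per_furnace_new * furnaces_new)  # 6000.0
```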

We no longer live in the Industrial Revolution, but similar examples still abound. The efficiency of refrigeration and air conditioning has improved greatly over the past half century, but their spread throughout society has meant that their overall energy use has grown far more. Cars have become much better at using fuel economically. But rather than fall, fuel consumption keeps rising. Whenever we save energy, we find more and more ways to use what we saved. Our standard of living goes up, but so does our energy use.

Those who downplay the Jevons effect believe that, in the end, it has little applicability in the modern world. And they raise a rather absurd corollary: if our energy use were to become less efficient, would consumption then decrease?

So, is efficiency a bad thing? Well, maybe.

Sunday, December 19, 2010

Tragedy of the commons

Most of us have heard the story of the "tragedy of the commons." But have we heard the ecological version, or the ideological one?

The basic yarn posits a pasture used in common by herdsmen for their mutual benefit. But eventually, as people keep introducing more cattle, the pasture is degraded by overgrazing. People can't help trying to maximize their own wealth, even when it becomes clear that their individual actions are destroying the very pasture they all depend on.

Conservatives like to give the story an anti-socialism slant, saying that it proves the futility of collective ownership. Only individuals who own their land will take care of it properly.

This version appears not to be true. If it were, how could so many medieval commons have survived for so many hundreds of years? There is indirect evidence in the historical record that success was the norm: Records of civil lawsuits contain almost no evidence that people were sued for damaging the commons by overgrazing. It seems likely that local ordinances - and a strong dose of peer pressure - kept greedy herdsmen in line.

Of course, the metaphor of the commons remains of vital ecological importance. I don't think I need to belabor all the ways that the Earth is under threat. Think of the commons not as a small pasture, but as the whole planet itself. And remember what Professor Eric G. Strauss of Boston College likes to point out: Around 1990, humans exceeded the capacity of Earth to support the demands we place on it.

Friday, December 17, 2010

Princess Leia

When the first Star Wars movie came out, I already was a bit too long in the tooth to fully embrace the Force. But for a time my son, 6 or 7 back then, fell deeply into a land of long ago and far away. He'd spend hours with his action figures - Luke Skywalker, Han Solo, Darth Vader, Chewbacca and the rest - oblivious to the grown-up world.

But I wasn't completely immune. After all, there was Princess Leia. In her white dress. (Carrie Fisher recalls that when she first modeled the outfit for director George Lucas, he told her to lose the bra. "They don't wear underwear in space," Lucas said.)

I took an interest in her career, and when HBO recently aired a feature-length movie of her one-woman production - "Wishful Drinking" - I made it a point to tune in. It's an excellent show, filled with powerful and gleeful wit along with some serious dish on her skirt-chasing father, Eddie Fisher, and her off-kilter mother, Debbie Reynolds.

It would be easy to joke about the poor little rich girl, but Fisher has had to cope with some big problems - not only her parents and her addictions to drugs and alcohol, but the bipolar disorder first diagnosed when she was a teenager. (At one point, she says, she was "invited" to a mental hospital. "You don't want to be rude. Right? So you go.")

That kind of humor pervades the show. For instance, she describes the end of her parents' marriage after the death of their friend Mike Todd, Elizabeth Taylor's husband: "Naturally my father flew to Elizabeth's side, gradually making his way, slowly, to her front." On her mother's fame: "She is literally an icon - a gay icon, but you take your iconic stature where you can find it."

Now in her mid-fifties, she says she recently Googled herself and found this posting: "WTF happened to Carrie Fisher? She used to be so hot. Now she looks like Elton John."

"Wishful Drinking" is a fun refusal by Fisher to take herself or her family too seriously, and it is welcome evidence that damaged people - even Princess Leia - can pull things together.

Tuesday, December 14, 2010

Taking scalps

During the War of 1812, American soldiers' greatest fear was being massacred, hacked to pieces and scalped by Indians fighting with the British. The sight and sound of war-painted, whooping warriors often was enough to send them into flight.

On May 29, 1813, British naval officer James Richardson watched as first one and then another boatload of American soldiers rowed toward his fleet on Lake Ontario, white flag flying. These 115 well-armed Americans were terrified of the Indians on shore, and chose to surrender rather than fight them - all 36 warriors.

Richardson said the surrender could be explained by the tales of Indian atrocities in the mouths of all mothers and nurses.

Americans blamed the British for spurring the Indians on. In Congress, Henry Clay refuted Federalist claims that Canadians were innocent. "Canada innocent? Canada unoffending? Is it not in Canada that the tomahawk of the savage had been molded into its death-like form?"

The British, in turn, valued the Indians precisely because they scared the poop out of Americans. But they argued that Americans were hardly innocent. When an American general rebuked a British officer for the Indians' conduct, he reported that the officer cited as justification "that our government would send the Kentuckians into Canada."

A British sergeant reported that "These Kentucky men are wretches ... served out with blankets like the Indians, with a long knife and other barbarous articles ... After engagements they scallop the killed and wounded that could not get out of the way."

For what it's worth, it was an American who took the first scalp in the War of 1812. On July 29, 1812, Capt. William McCulloch killed and scalped a Menominee warrior, outraging the Menominee, who had promised the British they would refrain from taking scalps. Maybe it was a harsh sort of justice when, 10 days later, McCulloch fell into an ambush and lost his own scalp.

Sunday, December 12, 2010

Night thoughts

I had been having night thoughts anyway (albeit on a cold, cloudy Sunday afternoon) when I picked up the current New Yorker off the floor beside my chair and read an absolutely stunning essay by Joyce Carol Oates about the death of her husband of nearly 50 years.

As you would expect, Oates' piece is personal writing at its best. She describes how her robust if elderly husband, Raymond Smith, was hospitalized for an apparent case of pneumonia before succumbing a week later to a raging bacterial infection in his lungs. Interspersed with wise reflections about the nature of marriage, her memoir is a record of terror, wild hope, and ultimate despair. It is well worth the price of the Dec. 13 New Yorker.

As it happened, those night thoughts of mine had involved a similar topic. Oates spent most of her waking hours during her husband's last week at his bedside. When my Dad died - while my mother was home packing to stay with him at the hospice - I was 1,200 miles away. When, years later, my mother died of heart failure at 89, I was 1,800 miles away.

According to long tradition, a family gathers at a death bed to see a loved one off. The recent dispersal of family members across the country often makes that impossible. That last outpouring of feeling for the dying must be done from afar, and after the fact.

It can be suggested that, given what little we know of a dying person's last hours, the presence of family at the end may not really matter. It also can be said that, given the hopelessness and inevitability of the outcome, such a gathering is somehow futile. It even can be said that I was lucky to be 1,800 miles away.

Maybe. So why don't I feel lucky?

Friday, December 10, 2010

Early politics

After taking a break from history, I'm back reading Alan Taylor's "The Civil War of 1812." It didn't take long to find an analog to the deeply divided politics of today.

It was clear that by declaring and then winning the war, Republicans hoped to finally do in the Federalists. But for many, the passions of the day couldn't wait that long. Thomas Jefferson, for instance, suggested that mobs armed with tar and feathers would intimidate Federalists in the South. In the North, where Federalists were more numerous, he implied the leaders should be hanged and their property confiscated.

Passions peaked in Baltimore, where shortly after the declaration of war a mob of hundreds attacked the office of a Federalist newspaper. The Federalists inside surrendered to city officials, who put them in jail for their safety. No luck. The next day a mob shouting "kill the tories" broke into the jail and attacked and tortured those who had been defending the newspaper. One man died of a stab wound to the chest, and 11 others suffered crippling injuries while the authorities refused to intervene. Rioters sang, "We'll feather and tar ev're d(amne)d British tory/And that is the way for American glory."

This sort of thing was the nightmare of the founders, who had especially feared political parties, or "factions," as a basic threat to the new republic. In their own way, Republicans agreed. Said Jefferson, "I will not say our party, the term is false and degrading, but our nation will be undone. For the Republicans are the nation."

Thursday, December 9, 2010

Through a scanner, weakly

The recent flap over possible danger from airport body scanners - like the ongoing concern about cell phone radiation - demonstrates that for most Americans, radiation is scary, and that's all you need to know.

The appropriate measure of X-rays' impact on people is the radiation dose they deliver. The unit used is the millirem, or mrem. According to international standards, 5,000 mrem per year is the maximum dose permitted for those who work with or around radioactive material. For the rest of us, the average yearly exposure is about 620 mrem - roughly half from natural background sources (radon, cosmic rays from the sun and deep space, and the rocks and soil beneath us) and most of the rest from medical procedures.

Radiation exposure from airport full-body scanners is incredibly weak: about 0.01 mrem. After all, unlike medical X-rays, the scanners don't need to look into your body - just through your clothes. By contrast, at 0.5 mrem, a dental X-ray is 50 times more powerful.

Here are some other figures, from data in the latest Newsweek magazine: 40 mrem for a mammogram, 50 mrem a year if you live in Denver, and 200 mrem a year from radon in the average home. It takes 100,000 mrem to get radiation sickness.
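If you want to run the comparison yourself, this little calculation simply restates the figures above in units of "airport body scans":

```python
# The figures quoted above, in millirem, compared with one airport body scan.
doses_mrem = {
    "airport body scan": 0.01,
    "dental X-ray": 0.5,
    "mammogram": 40,
    "a year living in Denver": 50,
    "a year of radon in the average home": 200,
    "average yearly exposure (total)": 620,
    "radiation sickness threshold": 100_000,
}

scan = doses_mrem["airport body scan"]
for name, mrem in doses_mrem.items():
    print(f"{name}: {mrem} mrem = {mrem / scan:,.0f} body scans")
```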

Worry about displaying your imperfections in an airport scanner if you must, but don't sweat the X-rays.

Tuesday, December 7, 2010

As with all surgery, side effects can occur

Hey, guys ... frustrated by the grim fact that women outlive men by an average of five to six years? That by age 85 there are roughly six women for every four men? That by age 100 the ratio is more than two to one?

What's the reason for this outrage? It appears that evolution is to blame. After all, evolution is a lot more interested in our offspring than it is in us. And healthy offspring need healthy moms. If the female body is weakened, reproduction is threatened. So gals get more maintenance - and apparently that early better health pays off in a longer lifespan.

Guys, on the other hand, while important to the wellbeing of their kids, don't do the womb thing. They don't do the suckle thing. No special maintenance for them.

In addition, it's been shown that, from an evolutionary point of view, the factors in males that lead to mating success aren't drivers of longevity. In fact, high levels of testosterone are quite bad for long-term survival.

The historical record doesn't show us whether eunuchs outlived normal healthy men, but there are recent studies that suggest they did. Not too long ago, castration of men in institutions for the mentally ill was surprisingly common. In a study of several hundred men at an unnamed institution in Kansas, castrated men were shown to have lived 14 years longer than those who were intact.

Ask your doctor if castration is right for you.

Sunday, December 5, 2010

Grow-up time?

Most scientists have long taken for granted that there had to be a multitude of alien civilizations out in our vast galaxy. Among so many stars, how could there not be? But of late, many scientists are starting to wonder: Is the Earth it?

Consider: The Earth sits in the right 5 percent or so of the galaxy. Too far out, there are too few heavy elements, like the radioactive material that keeps continental plates moving and alive with energy. Too far in, it's a madhouse of dangerously close stars, a constant bombardment of comets and rocks, and fierce radiation. Also, our sun is just the right size. Stars that are too big don't last long enough to nurture life; stars that are too small (which are most of them) emit less energy and create a habitable zone so close that its planets become tidally locked, with one side always facing the star. In addition, the Earth is lucky in its neighbors. For instance, Jupiter is just the right size and in just the right place to steer most meteors and comets away. Meanwhile, our Moon is just the right size and in the right position to act as a gyroscope, minimizing changes in the tilt of the Earth's axis.

Add all the wondrously complex details that keep the planet alive - the rock cycle, the water cycle, the carbon cycle, and so on. The Earth, created with precisely the elements it needed, might exist despite overwhelming odds against it. Our planet might be a very special place indeed.

Do we treat it that way? Afraid not. Each year we lose about 70 gigatons of precious topsoil to the sea. We've filled in important wetlands. In the U.S. we've paved an area greater than the state of Ohio.

This is not to mention the huge and increasing changes we've made to the atmosphere, and all the extinctions we've caused in the biosphere. Geological eras are based on mass extinctions. All by ourselves, we've erased enough species to put paid to the current Cenozoic era - a 65-million-year span of time is crashing to an end.

Professor Michael E. Wysession of Washington University in St. Louis reminds us that humans are new to all this, and we're bound to make mistakes. He likens us to children.

Children grow up. Will we?

Friday, December 3, 2010

Global warming can be cool

These days, many people who hear the term "global warming" tend to wrinkle their noses and launch a lecture on responsibility. Of course, rapid human-caused climate change would be bad news for billions of people. But generally, global warming saves our butt every day. And, between 800 and 600 million years ago, runaway warming cut short a particularly nasty sort of climate change: Snowball Earth.

Greenhouse gases in the atmosphere - water vapor, carbon dioxide, ozone and methane - let sunlight pass on down to heat up the surface, but when that heat is radiated back at infrared frequencies, the gases gobble it up and send much of it right back down to Earth. Be glad of this. Without that heat, the Earth would be too cold for liquid water to exist.

There is good geological evidence of several episodes of Snowball Earth - a time of cold and runaway glaciation that covered not only the continents with ice, but may well have frozen the entire surface of the oceans. Such episodes haven't happened since about 600 million years ago, perhaps because life forms like worms in marine sediments have churned up the seafloor, preventing the sequestering of carbon and keeping it free to do its greenhouse thing. (It is no surprise that fossils of multicellular creatures don't appear until after this period.)

But how come the planet didn't just stay frozen? One leading theory is that as ice piled up, ocean levels dropped significantly. Seafloor, much of which lies atop huge deposits of methane, was exposed to erosion. Before long, methane (21 times as efficient, greenhousewise, as carbon dioxide) was streaming into the atmosphere. Runaway warming counteracted the runaway freezing.

This is not to say that humans ought to be messing with a planetary feedback system that's worked for billions of years. But think of global warming every time you enjoy a cool drink of water.

Sunday, November 28, 2010

The day after Christmas, 2004

In my memory, the quickest, most abrupt end to a season of holiday cheer came on Dec. 26, 2004, when a monster earthquake killed more than 225,000 people (estimates vary widely). For most of the victims, it was death by tsunami.

The quake, in the subduction zone off the western coast of the island of Sumatra, began 35 kilometers beneath the surface. Its rupture spread to the northwest at 2-3 kilometers a second for a distance of some 1,600 kilometers over roughly ten minutes, devastating much of the island and massively displacing Indian Ocean water. The resulting tsunami killed some 200,000 along the northern and western Indian Ocean shoreline - 50,000 died in India and Sri Lanka - and went on to kill others in Africa seven hours after the initial shock.

At a magnitude of 9.3, the quake was the third most powerful ever recorded, equivalent to almost 2 billion Hiroshima bombs. In addition to thousands of aftershocks, it triggered an 8.7 magnitude quake not far to the southeast - itself the 7th-largest on record - set off two volcanoes, and even stirred some volcanic activity in Alaska. Unfortunately, many of the tsunami deaths were unnecessary. Seismologists worldwide knew of the danger within minutes, but (unlike around the Pacific) Indian Ocean countries had no warning system.

If it's any help, the Earth itself took time to mourn the deaths. According to theory, vertical seafloor motions from the quake changed the planet's moment of inertia enough to shorten the length of each day by 2.68 microseconds.
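For perspective, that 2.68-microsecond figure is an astonishingly small fraction of a day. Assuming angular momentum is conserved, the day shortens by the same fraction that the planet's moment of inertia drops:

```python
# If angular momentum is conserved, the fractional change in the length of the
# day equals the fractional change in Earth's moment of inertia: dT/T = dI/I.
day_in_seconds = 86_400
shortening = 2.68e-6                      # seconds, the figure quoted above

print(f"fractional change: {shortening / day_in_seconds:.1e}")
# about 3.1e-11, i.e. roughly three parts in a hundred billion
```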

Saturday, November 27, 2010

Oceans on the 8s

It's time for "Oceans on the 8s!" But unlike the Weather Channel, we don't need to update the report every 10 minutes. Once every 8,000 years ought to cover it.

Let's start with the continental rifts, which split the land apart and allow a baby ocean to form and grow. The ocean that will be created by the African rift hasn't been born yet, but it's looking healthy. Before long, it will look something like the Red Sea, a rift ocean which continues to grow nicely. (Incidentally, most rifts never create oceans. For instance, in the U.S. two rifts - one near where the New Madrid earthquakes happened in 1811-1812 and the other stretching from Oklahoma to Michigan - failed to split North America.)

The Atlantic Ocean also continues to grow as the mid-Atlantic ridge keeps spewing out more seafloor. There are no subduction zones off its shores, so it continues to push North America to the west. However, eventually the Atlantic will begin to shrink, and in the end will disappear. The Indian Ocean is hanging at the same size, its mid-Indian ridge and subduction beneath Indonesia balancing each other.

The Mediterranean is shrinking fast as Africa keeps moving north, and the Pacific Ocean also is getting smaller. The subduction zones all around the ocean are eating seafloor faster than it can be produced. (Between the pull of the Pacific and the push of the Atlantic, North America is moving west at about an inch a year.)

That's it for the 2010 report. Look for us again in the year 10010. (Because 8,000 years is something like 320 generations, you'll probably want to take pains to remind your kids and grandkids to keep passing on the word.)

Wednesday, November 24, 2010

Excitement in the Northwest

It was mid-morning on May 18, 1980, when a very puzzled and hesitant Helena radio DJ announced: "Er, you might want to put your car in the garage. They say Mount St. Helens has exploded, and ash is coming our way." I was every bit as puzzled as the DJ - St. Helens was a good 500 miles to the west - but I dutifully drove my car into my seldom-used garage. Within hours, a light gray powder began to accumulate atop the freshly budded leaves of every shrub in my yard.

It is possible this is not the most exciting volcano-eruption story you've ever heard, but it was mine. I even brushed ash from my lilac leaves into vials - tangible proof for posterity!

The closer you were to the volcano, of course, the more exciting things got. In Spokane, some 200 miles from the blast, ash turned day into night. Ash in the air circled the globe in 15 days.

On or near the mountain, things were rather more serious. A final, 5.1 magnitude earthquake under the mountain - the last of 10,000 smaller quakes during the preceding few months - set off the blast, which caused the north side of the peak to give way. It was the largest landslide in the Earth's recorded history. Mudflows created by steam and ash combined with snow and water rushed down the Toutle River, taking out 27 bridges. Energy released by the eruption was equal to 27,000 Hiroshima nuclear bombs.

You want an exciting story? The last words radioed by volcanologist David Johnston from near the summit were: "Vancouver! Vancouver! This is it!"

Unfortunately, the story of volcanoes in the Pacific Northwest gets more exciting still. Mount Rainier, the tallest peak in the Cascades, looms above Seattle and Tacoma, still active, poised to erupt at any time. (Many Seattle homes were built on former mudflows.) When earthquake swarms inevitably begin rumbling beneath Rainier, millions of people had better do more than drive their car into the garage.

Tuesday, November 23, 2010

The importance of looking

Not too many years ago a paleontologist named Mary Higby Schweitzer was all over the news. Schweitzer, after more than a decade of dedicated research, had done what most of her colleagues thought was impossible - she showed that well-preserved fossilized bones of dinosaurs can contain blood cells and the remains of soft tissues that can tell us far more than we know today about these extinct animals.

The press went nuts. It didn't hurt that she was a good-looking woman, of course, but the fact was that scientists over the past 300 years had determined that dinosaur bones were all you could get. Any soft tissue that might remain would be so degraded after at least 65 million years as to be scientifically useless. (For most of those 300 years, of course, scientists didn't have the lab equipment to study it anyway.) But in 1992, Schweitzer noticed what looked like blood cells and other organic matter exactly where they should be found in fossil bones. Ignoring the received wisdom of other paleontologists, she patiently did the tests necessary to rule out other possibilities and published her tentative observations in 1993.

Because she was still a graduate student offering data that went against the common view, her paper got little attention. But she kept at it, finding more soft tissue in more dinosaurs, honing her observations, and finally publishing her team's work in 2007 and 2008. Despite controversy, her findings not only made media waves but gained general acceptance in the field. (She's written an article about all this in the current Scientific American.) The bottom line: It turns out that while lab extrapolations say dinosaur tissue can't survive intact enough to study, there obviously are situations out in the real world in which the tissue can indeed outlast the huge time span.

Schweitzer got a lot of recognition in Montana because she had done her initial work and obtained her doctorate from Montana State University and had worked for MSU's Jack Horner, the celebrated dinosaur expert.

But it remains to be seen if she will be a footnote, a sidebar, or a major figure in future books about paleontology. Her studies continue; her results may tell the tale. Still, I think she should be remembered in any case for paying attention to something important in the face of generations of scientists who simply hadn't bothered to look.

Monday, November 22, 2010

Rocks of ages

The writer John McPhee once came up with a striking way to think about the age of the Earth. He said to imagine that the length of your arm represents the planet's history. Your shoulder is the farthest back in time - 4.567 billion years ago, scientists say - your elbow would be a little more than 2 billion years ago, and so on. And here's his point: If you gently brushed the tip of your fingernail with a nail file, you would erase all of human history.

Filing your nails should never be the same again.

But what's with that 4.567 billion year birthday? The age of rocks can be measured with great precision by studying the results of radioactive decay, but the oldest rocks ever found are dated only a little over 4 billion years. During its infancy, the Earth was constantly being bombarded with debris as the solar system pulled itself together, and no rock could form while that blasting kept the planet in a molten state.
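The decay-clock arithmetic itself is surprisingly simple. Here's a generic sketch - the daughter-to-parent ratio below is made up for illustration, not a real measurement:

```python
from math import log

# If a mineral starts with essentially no daughter atoms, its age follows from
# the measured daughter-to-parent ratio and the parent's decay constant.
half_life = 4.468e9                 # years, uranium-238 (which ends up as lead-206)
decay_constant = log(2) / half_life

daughter_to_parent = 1.03           # illustrative ratio, not a real measurement
age = log(1 + daughter_to_parent) / decay_constant
print(f"age: {age:.3e} years")      # about 4.56e9 years for this made-up ratio
```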

So how do we get precisely 4.567 billion years? The answer is both simple and elegant: by dating rocks that are on the Earth, but are not of the Earth.

As the planets were being formed, countless other bits and pieces of matter never got a chance to join up. Those small, lonely asteroids just kept zooming around, pulled slightly this way and that whenever they neared a planet, but generally were doomed to isolation - until they found themselves aimed squarely at a crash landing on a planet such as the Earth. Over the eons, a great many of them have done just that.

Studies of the ages of these rocks show ... 4.567 billion years.

If you held such a rock in your hand, your fingers would be curling around not only an object exactly as old as the Earth, but a third as old as the universe itself.

Sunday, November 21, 2010

Doom from above

Sometime in the Earth's young childhood, well over 4 billion years ago when it still was mostly a molten mass of rock, the wildly chaotic gravitational pulls of the early solar system almost certainly sent a body the size of Mars or larger careening into our fledgling home, slamming material off into space. That material - or much of it - remained within the Earth's gravitational pull, and before long in cosmic years coalesced into the moon we watch circle our planet today.

Scientists think this is true because rocks brought back by astronauts match the makeup of the Earth's mantle almost exactly.

I thought of this while reading this week's Newsweek, which uses its final page to try to graphically illustrate some issue. The current "Back Page," about dangerous asteroids, is headlined: "Is the end nigh?" Newsweek suggests not.

It does, however, point out that while an asteroid the size of a basketball crashes into the Earth's atmosphere daily, a basketball-court-sized object blows into town an average of every 200 years. And a football-field-sized object comes by once every 10,000 years.

The piece offers a table showing the most dangerous asteroids - that we know about - that could strike us in the next 100 years or so. The sizes range from 98 feet in diameter to 3,609 feet, more than half a mile. But the odds for all of them are low - from 1 in 770 for a cute little 121-footer to 1 in 53 million for that big guy. (And for comparison, the asteroid that put the dinosaurs and 75 percent of all other species to bed 65 million years ago had to be seven to eight miles wide.)

So a really big, humankind-ending event isn't very likely anytime soon. But, as they say, you never know. It was only in 1908 that a 120-foot visitor broke up over Siberia, flattening 800 square miles of forest. I was struck by the comments of astronomer Alex Filippenko, who insisted that over the next few hundred million years the Earth is almost certain to be blasted by what would be, for us, the end of the world. He urged that we quickly begin readying the means to deflect such a monster. After all, he said, we are far from knowing about all that is out there - and an estimation of probabilities wouldn't have been much help to those dead dinosaurs.

Bodies the size of Mars no longer zoom erratically throughout our solar system. We would have seen them. But who knows about those 10-milers?

Thursday, November 18, 2010

Moral befuddlement

I've just started reading a new book by Sam Harris called "The Moral Landscape: How Science Can Determine Human Values." Here Harris, a neuroscientist best known for his first book, "The End of Faith," isn't so much talking about religion but rather the need for a morality based on human well-being - a morality firmly based on science and rationality.

Of course he rejects religion as necessary for morality, but some of his deepest scorn goes to liberals who seem to think that cultural relativism - "tolerance" - is the greatest good. In a speech at a scientific conference Harris said that "the moment we admit we know anything about human well-being scientifically, morally speaking we must admit that certain individuals or cultures can be absolutely wrong about it." He mentioned the Taliban as an example.

After his speech, he was approached by a female scientist who serves on the "President's Commission for the Study of Bioethical Issues." Here is part of their conversation, "more or less verbatim:"

She: "What makes you think that science will ever be able to say that forcing women to wear burqas is wrong?"
Harris: "Because I think that right and wrong are a matter of increasing or decreasing well-being - and it is obvious that forcing half the population to live in cloth bags, and beating or killing them if they refuse, is not a good strategy for maximizing human well-being."
She: "But that's only your opinion."
Harris: "OK ... Let's make it even simpler. What if we found a culture that ritually blinded every third child by plucking out his or her eyes at birth, would you then agree that we had found a culture that was needlessly diminishing human well-being?"
She: "It would depend on why they were doing it."
Harris: "Let's say they were doing it on the basis of religious superstition. In their scripture, God says, 'Every third must walk in darkness.' "
She: "Then you could never say that they were wrong."

"Such opinions," Harris commented, "are not uncommon in the Ivory Tower."

I consider myself a liberal and believe me, it is thinking like this - right down there with the brilliant logic of conservatives - that makes me ponder the grim future of our (morally) stupid species.

Wednesday, November 17, 2010

On thinking straight

This afternoon my thoughts turned to what is called the Monty Hall Problem (based on the "Let's Make a Deal" TV game show). It is a devilish little puzzle that reveals how inept most of us are at reasoning. Suppose you are a contestant and are presented with three closed doors. One hides a shiny new car, the other two conceal goats. Your hope is to pick the door with the car.

Let's say you picked Door #1. Monty, that sly dog, then opens Door #2, revealing a goat. He asks if you want to switch your choice to Door #3. (Hint: you really, really should switch.)

This perplexes people to no end. After all, our powerful intuition is that because the car is behind one of just two doors, the chances are 50-50. Switching your choice can make no difference: the odds are 1 in 2 in any case.

Wrong! If you stick with Door #1, you have a one-third chance of getting the car. If you switch to Door #3, you have a two-thirds chance - twice as good!

Here's why: At the start of the game, each door has a one-third chance of hiding the car. When you pick a door, the other two doors add up to a two-thirds chance.

It doesn't matter if Door #2 is then opened to reveal its goat. Your choice - Door #1 - still has the same one-third chance. And the other two doors still have a collective two-thirds chance. So Door #3 - the only other unopened door - has a two thirds chance of concealing the car. And you're a dummy if you don't switch.
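If intuition still balks, a quick simulation settles it. This little script - a sketch, nothing official - plays the game many times and counts how often each strategy wins:

```python
import random

def play(switch, trials=100_000):
    # Simulate the game: the host always opens a goat door the contestant
    # didn't pick; return the fraction of games won.
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)
        pick = random.randrange(3)
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == car)
    return wins / trials

print("stay:  ", play(switch=False))   # about 0.333
print("switch:", play(switch=True))    # about 0.667
```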

What interested me about all this today was that when I started considering writing about it, I realized that I'd have to explain this reasoning. But my mind was blank! I knew about the puzzle. I knew the correct choice was to switch. So why couldn't I remember how come?

Well, I soon managed to figure it out again. (As they say, this isn't exactly rocket science.) But I think my initial blankness was caused by my compelling intuition, unabated by the facts, that the odds had to be 50-50. (Dammit, that is obvious!) I think somewhere up in my brain, my intuition muscled in, kicked my better knowledge into a mental closet, and took over like some schoolyard bully.

The lesson? It's hard to think straight!

Tuesday, November 16, 2010

A decorous Tea Party

Some time ago, discussing the right-wing Tea Party enthusiasts, I made the off-hand comment that in contrast to the modern movement, the real Tea Party was well organized. (This, of course, was well before any of us knew the results of the midterm election. Oh, well.)

Anyway, I've since learned a few more details about how organized the original Tea Party really was.

The local Boston Committee of Correspondence and Safety - one of many such groups throughout the colonies that organized resistance to Great Britain and actually were in charge as war broke out - decided that tea delivered under the Tea Act must not be unloaded. When a ship carrying such tea arrived in the harbor, the committee asked the ship's American owner - in no uncertain terms - to have it sail back to England with the tea. The owner reluctantly agreed to ask British authorities if he could do that. He was turned down.

A week or two later, on Dec. 16, 1773, people dressed as Indians (to symbolically represent the whole of "America" to Europeans) took action while an estimated 8,000 others watched from the shore. The "Indians" told the ship's captain that no one would be harmed, and that only the tea would be destroyed. They meant it. The tea was to be tossed into the harbor, not stolen. No other property - other cargo, the captain's or crew's possessions - was to be disturbed. Private property was to be honored. In fact, when the "Indians" broke a padlock in the process of getting to the tea, they quickly replaced it with a new one.

After the event - 342 chests over the side - there were no riots or raucous celebrations. The thousands of participants and watchers simply went home. This was a carefully controlled protest.

In the spring of 1774, Parliament responded by passing the so-called "Intolerable Acts" that shut down the harbor vital to Boston's economy, suspended local colony government, renewed the quartering of British troops in colonial homes, and more. So much for a peaceful resolution. Next, a year later, came Lexington and Concord.

Monday, November 15, 2010

A lesson of the Boston Massacre

Recently, while learning about the origins of the War of 1812, I was puzzled: Why did the ruling party at the time declare war on the British Empire when America had only a tiny navy and a paltry little army of a few thousand men? I knew that the Republicans of this era were notoriously cheap, hating high taxes. And I knew that there was a general antagonism toward a "standing army." But, hey?

As it turns out, you only need to look back to the "Boston Massacre" of March, 1770. The British had sent 4,000 troops to Boston (a city of 15,000) to maintain order in the face of riots against Parliament's recent tax measures. The troops and Bostonians didn't get along. On March 5, some of the soldiers fired into a crowd of people protesting their presence, killing five of them. Nobody really knew if or how the troops had been provoked, or if the whole thing was premeditated. But, as a propaganda tool and a lasting rallying cry, it was a major step toward the war to come.

A big part of the propaganda had to do with a "standing army." As opposed to a legitimate army mustered in time of war, a standing army was seen as a weapon wielded by a tyrant to attack his own citizens whenever he suffered opposition. That's how most colonists saw the troops in Boston, and the incident on March 5 only confirmed their view. Countless speeches memorializing the killings over the next few years leading up to the revolution pounded the point home. No wonder that even on the eve of the War of 1812, a standing army remained a dirty word - a grim reminder that a despot always could turn that army against his own people.

Incidentally, one of the most effective pieces of propaganda came from none other than Paul Revere. He etched a picture of the attack showing redcoats, standing in a line like a firing squad, shooting into a crowd of unarmed people, bloody bodies lying on the street. The drawing was reproduced widely in newspapers and broadsheets for years. It helped keep the anger alive. I guess he should be remembered for more than his midnight ride.

Sunday, November 14, 2010

Before the Stamp Act

One thing I never really learned in school is that prior to the 1760s, the British colonists in North America had been, in a sense, the freest people in the world - for something like 150 years. London had regulated trade across the Atlantic, but the colonies got to elect their own assemblies and govern their own affairs. They were still nominally under the control of the king and Parliament - and certainly thought of themselves as Englishmen - but they ran their own show. The empire kept hands off local decisions.

Then, at least as we were taught, Britain passed the Stamp Act in 1765 - the first direct taxation of all colonists - which sent people into the streets, caused them to boycott British goods, and to create "Sons of Liberty" organizations throughout the colonies. (The Stamp Act was followed, of course, by the Tea Act. Whoops! People dressed up as Indians and did their tea thing in Boston's harbor.)

But wait a minute. Isn't this a bit of a quick reaction? From loyal subjects to rioters overnight?

Yes, it is. It turns out that for several years, Parliament had been changing the rules. After enjoying many generations of self-government, the colonies realized the British had begun taking over:

- In 1763, the Crown issued a proclamation severely limiting colonists from settling west of the mountains toward the Mississippi River - land recently ceded to England by France. Colonists were saying: "What the hey? Isn't this our say?"

- In 1764, Parliament passed the Sugar Act (also called the Revenue Act), imposing customs duties on molasses and many other commodities, with severe penalties including the confiscation of ships, and forcing alleged violators to appear in an admiralty court in Halifax, where there was no jury of peers.

- Also in 1764, it passed the Currency Act, forbidding colonists from using local paper money to pay taxes - a law colonists said would impoverish them.

To the British, it all made sense: prevent chaos in the West, raise revenue to meet England's expenses in North America, and prevent paper-money fraud. To the colonists, it would not only destroy the economy but be a huge violation of the property rights of free and loyal British subjects who had no representation - a clear violation of rights dating back to the Magna Carta in 1215.

My point is to bitch about not learning about any of this stuff in school. My impression was of people going ape at the drop of a hat. It just wasn't so. Instead, it was a steady erosion of rights over several years that led to what was to come. These early Americans, still considering themselves English dudes, took some years to get really upset.

Friday, November 12, 2010

Another Founding Father?

As late as 1760, 16 years before the Declaration of Independence, the British colonists in North America may have had some disputes with London, but the idea that they might one day sever their ties as loyal subjects of the crown was simply unthinkable. This was a time in which a person born a subject of the King of England lived a subject, and died a subject. Any alternative never crossed one's mind.

But, starting in 1761, a new kind of political movement arose: Maybe the colonies didn't have to bow to every whim of the crown after all! The idea of resistance - if not yet actual revolution - began to spread. But what kicked it off?

Historians think that beginning can be traced to one James Otis, a lawyer hired by distillers in Boston who objected to British renewal of the "Writs of Assistance": laws that let customs agents board ships when they suspected contraband. The merchants, accustomed to bribing officials to get away with a little fast-and-loose business dealing, wanted the law struck down.

Otis argued that the Writs of Assistance were unconstitutional under the British Constitution because they violated a person's right to his property. His argument was shot down in court, but during the next few years he wrote pamphlets - "The Rights of the British Colonies Asserted and Proved" was one - that were widely read. The concept that colonists were empowered to resist those acts of Parliament they didn't like struck a chord. Refusal to accept "taxation without representation" wouldn't be far behind. President John Adams believed that it was Otis's arguments in the Writs case that helped spark what became the American Revolution.

Otis's story ends badly. Late in the 1760s, after being subjected to constant attacks by political opponents, he began suffering from mental illness. He died in 1783 after being struck by a bolt of lightning.

It is true that resistance was in the colonial air anyway. King George III would keep making sure of that. Still, looking back 250 years, maybe we should be remembering James Otis as the initial Founding Father.

Wednesday, November 10, 2010

Students and BS detecting

For years I've admired Sharon Begley, a science writer lured from the Wall Street Journal to Newsweek several years ago - perhaps when she saw the writing on the wall about where the Journal's ownership was heading. Newsweek gives her a weekly column in which she can raise some necessary questions.

One of her interests is science education. After all, studies keep showing that American kids keep falling farther and farther behind other nations' kids in their science and math skills. In a recent column, Begley says that the main problem is that K-12 schools aren't doing well at teaching students to recognize BS, or "bad science."

Rather than concentrating only on all the facts of the various sciences - memorizing the structural formulas for alkanes, for instance - kids should learn the first, most important principle of science. And obviously, judging from grownups, they're not learning it. The principle, she says, is this: "the most useful skill we could teach is the habit of asking oneself and others, how do you know? If knowledge comes from intuition or anecdote, it is likely wrong."

This is obvious, except to the human brain. For instance, it can't tell randomness from real patterns in data (climate warming, say), and it wants to assign causality to weak data in order to confirm its own beliefs. The brain is a really cool thing, but it can be really dumb, too.

Science classes obviously have to teach the science - trig, the Krebs cycle, Ohm's law, energy equals mass times the speed of light squared - but foremost, Begley says, they need to teach kids that "science is not a collection of facts but a way of interrogating the world."

Not to mention, and she didn't, interrogating politicians, whose BS stands not just for bad science.

Tuesday, November 9, 2010

Secrets in the sky

When physicists and astronomers concentrate on the normal atoms and energy that make up the well-understood stuff of our universe, our stars and planets and us, they're only dealing with 4 percent of the total. (And when we point telescopes into the sky, the part of this normal matter that glows in optical wavelengths amounts to a scant 0.5 percent.) A full 96 percent of all the stuff that is out there goes largely undetected and is far from understood.

I don't know whether to be amused or alarmed.

It turns out dark matter amounts to 21 percent of the total mass-energy of the universe, and dark energy adds up to a full 75 percent. (The use of the adjective "dark" in both names is misleading. They have nothing to do with each other. Dark matter pulls in like normal matter does, helping interstellar gases and dust clump together to form stars, galaxies and galactic clusters; dark energy pushes out, accelerating the expansion of space itself.)

Scientists think dark matter is mostly made of what they call WIMPs - weakly interacting massive particles - which are strange particles indeed. Look in vain for such normal components of matter as protons and neutrons. It's a bit of an embarrassment that years of elaborate experiments have yet to detect their presence. Still, the dark matter exists. We wouldn't be here without it.

Dark energy, making up three-fourths of the mass-energy of the universe, really isn't understood at all. But it is there, and four or five billion years ago its outward-pushing power overcame that of gravity and stepped on the gas. There are theories about it - having to do with quantum fluctuations and something called quintessence - but I'll pass on the explanations. This is heavy-duty physics that wouldn't fit in a blog, even if a former small-town journalist could do it justice. Let's just say it is energy with negative pressure that speeds up the expansion of the universe. And the foot may stay on the accelerator forever.

I've had a layman's fascination with astronomy all my life. I've read many, many books of popular science, and I've just been watching a long series of up-to-date lectures on the subject. Physics and astronomy have an amazing record of accomplishment. On the other hand, apparently it's 4 percent down, 96 percent to go.

Sunday, November 7, 2010

What? The sky gets dark at night?

I've been listening my way through astronomy lectures about black holes, but also looking forward to upcoming ones on the next disc, which includes "The Paradox of the Dark Night Sky." That's a cool topic, and I thought I'd write about it. But the DVD turned out to be flawed - it came with a big crack - so I was out of luck.

But the hell with it, I'll do a blog anyway. It's about this question: "Why is the sky dark at night?"

People who hear that question for the first time usually think it's pretty silly. ("Gee, do you suppose it has something to do with the sun going down?") But the issue is a lot deeper than that. Go out tonight and gaze at those umpteen stars out there. They are suns, and their combined brightness (including all those too dim for our little eyes to see) should far exceed the brightness of our Sun. Not only should the night be bright, but the day should be brighter than it is - even considering that the stars appear to dim according to the square of their distance from us.
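Here's the back-of-the-envelope version of that claim, in arbitrary units: in an infinite, unchanging universe, every spherical shell of stars contributes the same amount of light, so the total never stops growing as you count more distant shells.

```python
from math import pi

# The number of stars in a shell grows as r^2 while each star's brightness
# falls as 1/r^2, so every shell contributes the same flux.  Density and
# luminosity here are arbitrary units, just to show the shape of the problem.
def sky_brightness(max_radius, density=1.0, luminosity=1.0, dr=1.0):
    total, r = 0.0, dr
    while r <= max_radius:
        stars_in_shell = density * 4 * pi * r**2 * dr
        flux_per_star = luminosity / (4 * pi * r**2)
        total += stars_in_shell * flux_per_star   # each shell adds density*luminosity*dr
        r += dr
    return total

for radius in (10, 100, 1000):
    print(radius, sky_brightness(radius))   # grows without limit as radius grows
```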

The paradox was pondered by the likes of Kepler and Newton, but it wasn't until the early 19th Century that Wilhelm Olbers brought other astronomers to attention by pointing out that the darkness meant at least one of scientists' basic assumptions about the universe (he had no idea which one) must be wrong.

Several possible answers - light blocking by interstellar dust or dark matter - are easily ruled out. The best answer, especially now that we have a rather good handle on the age of the universe, appears to be that the night is dark because the universe is relatively young. There simply hasn't been enough time for light from the countless farthest stars to reach us yet. In another 10 billion years or so, that could change. Earthlings probably won't be around to see it after our Sun goes haywire, but by then the night sky could indeed be bright.

But I'm looking even farther down the road. If the expansion of space continues to accelerate due to dark energy, eventually all the stars in the night sky (except the stars in our "local group" of galaxies that are gravitationally locked together) will be so far away as to disappear from sight. The sky will be dark indeed. The Milky Way will really stand out!

Or, at least on a somewhat shorter time scale, if the universe is vastly bigger than the part that is currently visible to us, starlight in our part of the universe may eventually get really bright.

Or, if the universe actually is a "multiverse" - with other separate universes forever beyond our sight or reach - the fate of the night sky will depend on just how big our own particular universe actually is.

All this from thinking about why it gets dark at night. Gosh, maybe it's a seriously silly question!

Friday, November 5, 2010

My little guy

In Helena, Montana, we're in that weird early-November period in which temperatures swing like an out-of-control pendulum. The temperature might be somewhere around the mid-twenties at night, but rise to the 60s by around 4 p.m. - only to start falling toward below freezing a few hours later.

Something about these temperature swings makes the solitary red squirrel in my neighborhood act like a truly frantic rodent. I watched him today - the little hyper guy with the bushy tail whose territory apparently consists of my end of the block - as he jerked around like a particle in Brownian motion. He'll tightrope-walk the telephone lines above the alley, then sniff out what might be in the gravel down below. Up one tree. Down another. Along the top of a fence. Nosing among fallen leaves for the remains of oxidized apple crumbs left by deer. Nosing below bushes, looking for remaining seeds. Nosing around for anything he can find.

Soon winter will come. Snow will cover the ground, and I can't imagine how my squirrel will eat. But he will. He's squirreled away food. He'll have the energy for entertaining visiting females, the energy to climb the highest trees to search for any remaining nutrients, and the energy to emerge in spring as bushy-tailed as ever.

I like that little guy.

Thursday, November 4, 2010

The star-struck blues

Years ago I learned something ultimately terrible but still strangely comforting. The bad news was that the Sun eventually was going to quit on us. It would run out of fuel, expand into a red giant in a frantic attempt to compensate, but finally collapse into a white dwarf, dooming the Earth forever.

The good news was that stars of the mass of the Sun keep shining for about 10 billion years. As the Sun happens to be 4.6 billion years old, even a kid could do the math: We had a good five billion years to go! (I knew none of this had any bearing on me, personally, but still ...)

Anyway, it was an idea that kept the monsters securely trapped under my bed at night, and it's been a calming thought in the back of my mind ever since. Oh, foolish me!

It turns out that stars like the Sun make their energy by fusing four protons (hydrogen nuclei) into a single helium nucleus. That's a big net loss of particles. And that has consequences, because the pressure needed to make the Sun's engine work - the pressure that holds the star up against its own gravity - is proportional to the product of the particle density and the temperature. As the number of particles goes down, the temperature has to go up.
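
To see the squeeze in numbers, here's a toy ideal-gas calculation of my own - the figures are round, hypothetical ones, not the Sun's actual core values - showing what has to happen to the temperature if the particle count drops while the pressure holds steady.

# Toy sketch (mine, with made-up round numbers, not real solar-core values):
# a gas obeys roughly P = n * k * T. If fusion slowly removes particles while
# the core must keep supplying the same pressure to hold the star up, the
# temperature has to rise in proportion.
K_B = 1.380649e-23   # Boltzmann constant, J/K

def temperature_needed(pressure, particle_density):
    """Temperature an ideal gas needs to supply a given pressure."""
    return pressure / (particle_density * K_B)

P_CORE = 2.0e16          # pascals, held roughly constant by the weight above
n_now = 6.0e31           # particles per cubic meter (illustrative)
n_later = 0.8 * n_now    # after fusion has removed 20 percent of the particles

print(temperature_needed(P_CORE, n_now))     # about 2.4e7 K
print(temperature_needed(P_CORE, n_later))   # about 3.0e7 K - hotter

Fewer particles, same pressure, higher temperature - and a hotter core means a fiercer Sun.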

Rats. Within a few hundred million years, the Sun will have heated up so much that people on Earth will really, really notice it. Within, say, half a billion years all the water in all the oceans and lakes and rivers and toilet bowls will be gone. The Earth will be far more unlivable than the worst desert you could ever imagine.

We have been robbed. Instead of another 5 billion years of earthly bliss, we get maybe half a billion. Somebody call the police!

Of course we may have options in the future. Maybe we can build a big space tug and tow our planet away from the heat. Maybe by then we can just download ourselves onto computer chips, load them into space ships, and go live somewhere else. Hell, maybe we'll even be able to fix the damn Sun.

But wait a minute. Why am I saying "we?"

Wednesday, November 3, 2010

A prescient voice

Reading a New York Review of Books essay on "The Irony of Manifest Destiny: The Tragedy of America's Foreign Policy" by William Pfaff, I seem to be learning about a man who, over the past 50 years, has always gone contrary to the conventional wisdom, and has always been right.

For instance, those of us who are old enough will remember cold warriors strutting around and beating their chests about "Finlandization." (The sneer referred to that country's supposedly craven buckling to the Soviet Union after World War Two.) In fact, as Pfaff wrote at the time, Finland was attacked by Stalin in 1939 and heroically defended itself until forced to cede some territory. After the war, as a former German ally facing absorption by the Soviets, Finland was in a terrible situation. It maintained a careful neutrality, made some uncomfortable compromises, and managed to preserve its independence and democracy. Now, of course, Finland is an open country with a successful high-tech capitalist economy, state-of-the-art public health care, and a much-admired public education system. And it sure outlived the Soviet Union - thanks to restraint and patience. That's something that's been sorely lacking in the United States, which hasn't unequivocally won a war (excepting the dashing Reagan attack on the island of Grenada) since 1945.

Over the years, Pfaff has continued to be correct. Nearly 50 years ago, he argued that Soviet communism was inherently weak and eventually would collapse on its own. He advocated careful containment until then. But no, we had to put more than 58,000 names on the Vietnam War Memorial. He was right 20 years ago, when he argued that with the end of the Cold War, America's military should be reduced and adapted to new circumstances. But no, it was hugely increased - handy for our failed attempts to catch Bin Laden and impose democracy in western Asia. Now, in his 80s, Pfaff doubts the "enormity of the Islamic radical threat." Don't bet that he's wrong.

Not being one to sit around reading scholarly articles on foreign policy, I hadn't come across Pfaff. But I wish he could live another 80 years. We need people who can see behind what the reviewer called the "received ideas, attitudes, and platitudes of the age."

Tuesday, November 2, 2010

Weirdness

The last couple of days I've been learning about two really strange and totally separate things: the weirdness of the moons of the outer planets of our solar system, and the weirdness of American politics in the first decades of the 19th Century. Guess which is weirder? Here's a hint - when have American politics not been weirder than anything else you can think of?

Sure, there is Io, a moon of Jupiter that is the most geologically active body in the solar system. It spurts sulfur compounds like a colicky baby. There is Titan, a moon of Saturn, with its methane lakes that might nurture microbes. But then there is early American politics, which make no darn sense at all.

Of course, the politics probably do make some sense to historians who have spent years studying the subject. But to the rest of us ... weird.

Modern politics sometimes find echoes in the early 1800s. "Republicans" of the time, a very distant relative of the modern GOP, can sound very modern and tea party-ish: In 1812, Congressman John A. Harper of New Hampshire celebrated the United States as a loose confederation of sovereign states "without foreign or domestic wars, without taxation, without any more of the pressure of government than was absolutely necessary to keep the bands of society together." In a great oxymoron, Thomas Jefferson said the U.S. had the strongest government on earth, because its institutional weakness demanded so little of its people that they would rush to its defense. This was the party that hated the idea of a "standing army," decimated the navy and army, economically weakened the U.S. with a trade embargo that only helped Canada and Great Britain, yet soon declared war against Britain, the greatest military power in the world, at a time when the U.S. was basically a cripple among nations.

This was long before the Democratic Party was formed - hell, it was before the Whigs. The Federalists, scared big time by Republican policies, argued that the British were right, the Republicans were wrong, and the country was going to hell. Anarchy was on the way. They cared mostly about international trade. Liberals, they were not. Nobody gave a damn about liberty for blacks, only white men.

Politics have changed a lot in 200 years. Republicans fought for racial equality (in law, anyway) in the mid 1800s, championed progressivism under Teddy Roosevelt at the turn of the century, and went ape shit about the New Deal. Democrats backed southern slavers, then got all disorganized (Will Rogers said he belonged to no organized party - he was a Democrat) and then in the 1930s belatedly realized they needed black votes.

At least the distant moons, Io and Titan, have stayed constant in their weirdness. I don't even want to go into modern U.S. politics.

Monday, November 1, 2010

Our neighbors and climate change

I recently read an article in Scientific American magazine that had a subhead which caught my attention: "Why can't we have a civil conversation about climate?" The article centered on climate scientist Judith Curry, who has enraged many of her colleagues by saying that some global warming critics - certainly not most of them - have legitimate concerns about the science that is being conducted. She says these real worries are too often ignored (or responded to as though they were merely political claptrap) by the mainstream science community.

I read it with interest as one who tends (like most of us) to consider sources over substance. But the article jumped back into my mind this afternoon as my lecture series moved to "terrestrial" planets like Venus and Mars. Venus, covered by clouds of sulfuric acid and with an atmosphere made up of 96 percent carbon dioxide and 4 percent nitrogen, exhibits the ultimate "runaway greenhouse effect." Its atmospheric pressure is 90 times that of Earth. The CO2 captures most of the heat radiated by the planet, leaving Venus baking in a temperature of 480 degrees Celsius night and day, everywhere. That's hot enough to melt lead.

Mars, on the other hand, underwent an "inverse greenhouse effect." The planet used to have surface water - the evidence is everywhere - but today its atmosphere, also mostly carbon dioxide, has dwindled to only one percent of Earth's atmospheric pressure. That makes it incapable of retaining surface water anymore. In addition, its temperature drops as low as -130 C. Not a nice place to live, either. What happened to the atmosphere? Perhaps, because Mars is so small and so far from the Sun, a cooling trend set in and carbon dioxide began freezing out of the atmosphere, decreasing the greenhouse warming. That would lower the temperature further, causing still more air to freeze out, and so on.

At any rate, on both Venus and Mars the changes were permanent and devastating. Any life that either planet might have had never stood a chance.

Back on Earth, we fight about climate change, with each side unhappy about uncertainties in the data and in the computer models. But the uncomfortable fact is that as long as the uncertainties exist, things could turn out to be much rosier than projections indicate (Whoopee!), but things also could turn out to be much worse. The deep history of the Earth shows that our planet has endured periods of both warming and cooling and has come out of them - although today either sort of change would have staggering social consequences. But when it comes to worst-case scenarios, we've got a couple of close planetary neighbors to give us pause ... in no uncertain terms.

Sunday, October 31, 2010

On emptiness

It's hard to read much popular science writing without learning that atoms - and consequently anything made of atoms, such as a chair - are pretty much made up of nothing. A hydrogen atom, say, has a nucleus consisting of a single proton, with a single electron buzzing around it as a quantum probability cloud. All the space between the two has no matter in it at all. (Sure you want to sit in that chair?)

But just how empty is an atom? The guy teaching my DVD lecture course on astronomy did some math. It turns out that our hydrogen atom is empty indeed. It is 99.999999999999 percent empty. That's 14 nines. (Still going to sit in the chair?)

But there is emptiness, and there is emptiness. The teacher (Alex Filippenko, University of California, Berkeley) did some more math and found that the ratio of the radius of that electron cloud to the radius of that proton is about 50,000. Then, rather arbitrarily, he compared that result to the Milky Way Galaxy by finding the ratio of the distance to the nearest star (4.2 light years) to our sun's radius of 700,000 kilometers. That ratio turns out to be about 60 million. Considering just the stars, anyway - which of course are a rather small part of the whole considering the dark stuff - our galaxy is 1,200 times emptier than an atom!
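
Being the trust-but-verify type, I redid the arithmetic in a few lines of Python. The inputs are my own rough values (the Bohr radius, a ballpark proton radius, and so on), so the answers land near Filippenko's numbers rather than exactly on them.

# Rough check of the lecture's ratios, using my own ballpark inputs.
LIGHT_YEAR_KM = 9.46e12

# Atom: radius of the electron cloud over the radius of the proton.
atom_ratio = 5.3e-11 / 1.0e-15   # Bohr radius / rough proton radius, ~53,000

# Galaxy: distance to the nearest star over the Sun's radius.
galaxy_ratio = (4.2 * LIGHT_YEAR_KM) / 7.0e5   # ~57 million

print(atom_ratio, galaxy_ratio, galaxy_ratio / atom_ratio)   # last figure ~1,100

# And the "14 nines": the filled fraction of the atom's volume is roughly
# (1/50,000)**3, so the empty fraction is
print(1 - (1.0 / 50000) ** 3)   # 0.999999999999992 - there are the nines

Close enough to the lecture's 50,000, 60 million and 1,200 for me.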

And then there's my brain. But let's not go there.

Saturday, October 30, 2010

Galileo's middle finger

Most of us know about all we need to know about Galileo Galilei - a great early scientist whose telescopic observations of the phases of Venus and Jupiter's moons pretty much put paid to Ptolemy's ancient system in which all heavenly bodies circled the Earth. A martyr (of sorts) who was tried by the Catholic Church for advocating Copernicus's heliocentric system and forced to recant. A guy who, according to what probably is a mere legend, muttered - kneeling before the Inquisition - "And yet it moves" (referring to the Earth).

But you want to know about the middle finger. Professor Alex Filippenko, the teacher of a lecture series I'm watching, said Galileo is one of his heroes. After his senior year in college he traveled to Europe, visited Florence, and stumbled on a relatively small museum of science called the "Istituto e Museo di Storia della Scienza" - the "Florence Institute and Museum of the History of Science" to you. Wandering through the collection of medieval scientific instruments, Filippenko came across the phalanx in question.

It stands upright within a small glass egg, held in a larger cup, looking for all the world like, well, the finger.

Most sources I checked on the Web - secondary sources that were very secondary - said that after Galileo's death students stole the middle finger of his right hand from the corpse. Another source implied the whole hand was taken, and the finger wasn't snapped off until 95 years later (when it had sufficiently aged). Then it was "passed around" for a few hundred years until the Florence museum obtained it. Not the most exacting of provenance.

Anyway, as one Web site said, "It stands displayed upward defying all those that opposed the advances of science." Or maybe it remains little more than a student prank. I'll let you decide, if decide you must.

Friday, October 29, 2010

Enjoying winter in July

Until I recently figured it out, I was becoming more and more irritated by the lecture course I've been watching called "Understanding the Universe: An Introduction to Astronomy." Geez, it seemed like grade-school stuff - especially the lecture devoted to how the seasons occur. Then I realized these first few lectures were devoted to "celestial sights that everyone can see." OK, we'll move on...eventually.

Regarding that "how the seasons occur" lecture, I sat through much of it with a certain amount of ennui. The professor, Alex Filippenko of the University of California, Berkeley, explained that surveys of graduates of big-league universities consistently show that these people - by a 25-to-2 margin - say the seasons happen because the earth moves closer to and farther from the sun. That's why Filippenko felt it necessary to devote a whole lecture to the topic. So, off we go, talking about how the earth's axis is tilted 23 and a half degrees from the perpendicular to the plane of the solar system, so the northern hemisphere gets more direct sunlight in summer, blah, blah. (I thought everybody knew this.)
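
Since the professor made us sit through it, here's the tilt argument in a few lines of Python - my own back-of-the-envelope sketch, using Helena's latitude for fun, and ignoring the longer summer days, which only widen the gap.

import math

# My own rough check that the tilt, not the distance, drives the seasons.
# At local noon the sun's elevation is about 90 - latitude + declination,
# and the energy landing on flat ground scales with the sine of that angle.
TILT = 23.5        # Earth's axial tilt, degrees
LATITUDE = 46.6    # Helena, Montana, roughly

def noon_energy_factor(declination):
    """Relative solar energy on flat ground at local noon."""
    elevation = 90.0 - LATITUDE + declination
    return math.sin(math.radians(elevation))

summer = noon_energy_factor(+TILT)   # June solstice: sun about 67 degrees up
winter = noon_energy_factor(-TILT)   # December solstice: about 20 degrees up
print(summer, winter, summer / winter)   # roughly 0.92, 0.34 - nearly 2.7x

Nearly three times the noontime energy in June as in December, all from that 23-and-a-half-degree tilt - no change in distance required.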

Anyway, at the end of the lecture, the professor raised a topic that I had "learned," but had pretty much forgotten: "precession." (It was the similar (but strangely different) precession of Mercury that so befuddled early 20th Century scientists - a befuddlement ended by Einstein.) For Earth, precession means that gravitational forces of the sun and moon, mostly, want to tip us over. But our planet's rotation resists this, and instead it just precesses - it sort of wobbles around like a top.

This wobble occurs over a 26,000-year period. That means that in half that time - 13,000 years - the earth's axis will point the other way! The northern hemisphere will have summer in what used to be the winter, and winter in what used to be the summer! And 13,000 years later still, we'll be back to where we are today!

I find this so cool! The transition will be so slow as to be imperceptible over anyone's lifetime. But just imagine (if we're lucky) kids enjoying a cold winter day in July, 15,010 - "Look at the snow! Grab your sled!"

Thursday, October 28, 2010

Deceiving children

Children of the world ... you are being deceived! Just as generations of children before you have been deceived! You think the sun is yellow!

Oh, the pity of it! Open any illustrated children's book. Find an illustration of children playing in the happy grass beneath a benevolent, glowing sun. Every time, that sun is colored yellow! Horrors! The sun, children, is white. Sure, kids, the sun looks yellow, or even red, late and early in the day as it sets or rises and has to shine through much more of our gunked-up atmosphere than it does from high in the sky. But sunshine actually is - we learn as we grow up if we pay attention in science classes - nothing but white light. You need a prism of some kind to find colors in it. Yet, brain-washed as you children so sadly are, you think the sun is yellow. You know enough not to stare at the noontime sun, but if you did, it would look, yes, white.

Would that were the only deception. But, sadly, children, even your eyes deceive you. It's enough to make one weep.

When the setting sun appears to be perched on the horizon - just kissing the earth - it actually has already set! Oh, the lies! It turns out the earth's atmosphere bends the sunlight, a refraction just about equal to the sun's apparent diameter. So when, children, it looks like the sun is just ready to begin sinking under the horizon, it actually already has sunk below it! Do grownups tell you this? No! This is not right! It is not honest! Children, you are being deceived!
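
For the record, the standard textbook figures (not mine) put the bending at the horizon at about 34 arcminutes, while the sun's apparent width is about 32 arcminutes. The quick comparison below is just my own arithmetic on those two numbers.

# Quick arithmetic on the standard textbook values: refraction at the horizon
# lifts the sun's image by roughly 34 arcminutes, while the sun's disk spans
# roughly 32 arcminutes. The lift is a bit more than one full sun-width, so a
# sun that appears to be sitting on the horizon is geometrically just below it.
REFRACTION_ARCMIN = 34.0
SUN_DIAMETER_ARCMIN = 32.0
print(REFRACTION_ARCMIN / SUN_DIAMETER_ARCMIN)   # about 1.06 sun-widths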

Children, it makes you wonder what else grownups are lying about!

Wednesday, October 27, 2010

Canada's free land

In the years immediately following the American Revolution, things were looking grim for what was still to become the United States. The states had to impose huge tax increases to meet their war debts, the Continental Congress had little power, and rancorous political divisions were well underway. There's no better example than Shays's Rebellion, which erupted in Massachusetts in 1786 and was named after Captain Daniel Shays, a disaffected former army officer. Armed farmers, unable or unwilling to repay their debts and also cover their growing tax bills, shut down the courts and resisted state troops sent to suppress them. The British, already convinced that any such ridiculous thing as a "democratic republic" was bound to fail, wanted to pick up the pieces and return the wayward colonists to the fold of the empire.

And they had a plan. How better to speed up the failure than to create, in Canada, an ideal British society? But such a society needed a much bigger population. British leaders offered Americans willing to settle in what was then called "Upper Canada" - the area above Lakes Erie and Ontario - a grant of 200 acres of land for only a tiny fee. And taxes on that land would be a small fraction of taxes to the south. Once Americans saw a well-governed, stable nation of happy settlers whose civil (if not political) rights were well protected, that silly "republic" surely would soon collapse!

We all know the plan didn't exactly work out. But what many of us don't know is that over the next 20 years, tens of thousands of Americans did indeed flock to Canada and swear allegiance to the king. According to historian Alan Taylor, one observer wrote in 1800 that "You would be astonished to see the people from all parts of the States, by land and by water, 250 wagons at a time, with their families, on the road, something like army on the move." The influx of "late loyalist" newcomers (mostly from New York, Pennsylvania and New Jersey) swelled the population of Upper Canada from 14,000 in 1791 to 75,000 in 1812.

Here's how the governor of Upper Canada, John Groves Simcoe, greeted a group of such settlers in 1795: "You are tired of the federal government; you like not any longer to have so many kings; you wish again for your old father ... You are perfectly right; come along, we love such good royalists as you are, we will give you land."

But other Brits had doubts. They feared these American settlers might bring with them the seeds of rebellion. One warned against "the silver-tongued and arsenick-hearted Americans." Another called the newcomers "a base and disloyal population, the dregs and outcasts of other countries." Yet another said they would "form a nest of vipers in the bosom that now so incautiously fosters them."

Of course, both sides of this debate were mostly wrong. As would be the case today, most of the settlers didn't care one way or the other about politics or ideology. But free land? Low taxes? That was cool!

In 1808 settler Michael Smith said he emigrated from Pennsylvania "in order to obtain land upon easy terms, as did most of the inhabitants now there, and for no other reason."

Monday, October 25, 2010

Logic and the religious

In the current "Scientific American" magazine there is a column by the skeptic Michael Shermer praising Christopher Hitchens's "rapier logic." Shermer said he would "pit Hitchens against any of the purveyors of pseudoscience claptrap because of his unique and enviable skill at peeling back the layers of an argument and cutting to its core."

As an example, Shermer zeroes in on Hitchens's observation about a television nature show that featured cave-dwelling salamanders that had lost their vision - a trait that no longer had any advantage in the darkness of the cave. Yet small indentations in their faces showed where their eyes once had been. Hitchens wrote that Creationists have long used the eye as a major argument against evolution, yet what to say about eyes that disappeared - is this an imperfect God correcting himself?

I find this use of rhetoric rather beside the point when it comes to discussing religion. To be sure, it skewers those religious folk who feel the need to argue such things, but it ignores those who couldn't care less. In cases like this, I often think of someone close to home - my mother, whose faith was completely integral to her happiness. To lose faith would have been to lose the meaning of her life. Arguments against faith had no influence on her. She didn't - wouldn't - hear them. Talk about irrelevant!

I think it one thing to argue about the place of religion in politics, but quite another to argue against the religion of the voters themselves. "Rapier logic" isn't going to change votes. It is more likely to drive voters away.

Sunday, October 24, 2010

Brain shrinkage

Let's consider an interesting fact: Over the past 10,000 years, the human brain - in both males and females - has shrunk, on average, by about 100 cubic centimeters, roughly 7 percent. (And, for what it's worth, our skulls have gotten thinner.) Should we be worried about this, given that expanding brain size is one of the major measures by which scientists have traced the rise of modern humans from their hominid ancestors and still earlier forebears, all of whom have gone extinct?

(Of course I'm tempted to talk politics now, but let's not go there.)

The question is why our brains are shrinking. The fact is that nobody knows why, or what it might mean for the future. One suggested reason for the decrease has to do with famines over that period: populations grew large and precarious, and crops - which arrived with the invention of farming some 10,000 years ago - sometimes failed. It makes evolutionary sense that if food shortages persisted, energy-hungry organs such as the brain would get smaller. Another idea is that as humans moved into cities, they no longer had to learn all the details of their widespread hunting grounds. But one would think that such changes would be offset by the need for increased social interaction.

I would suggest that this brain-size thing is just one of the twists and turns of evolution that we simply don't understand, and not to worry. Maybe we're just getting more efficient, brain-wise. I will, however, to do my part, keep working those Sudoku puzzles.

Saturday, October 23, 2010

Mary Anning and ancient bones

It keeps happening! I keep bumping into important historical figures I've never heard of! (This retirement business is hard on the ego.)

I refer to Mary Anning. Heard of her? No? Well, then you clearly aren't a paleontologist.

Anning (1799-1847) was a poor, working-class woman who lived in the resort town of Lyme Regis in Dorset, along England's southern coast. Like her carpenter father before her, who had supplemented his income by selling fossils to wealthy tourists, she spent her life as a fossil hunter. It helped that her village was near cliffs of limestone and shale formed some 200 million years ago - cliffs that kept crumbling down from winter rains and high tides, constantly exposing new fossils. It also helped that she was a natural-born scientist, despite her lack of social standing and education, and despite being a woman at a time when women weren't exactly accepted by the gentleman scientists of the day. Yet she knew more about fossils - and what they meant - than most of her scientific contemporaries.

Anning was one of 10 children, of whom only Mary and her brother Joseph survived to adulthood. She learned to read and write in Sunday school - and that was it, except for her lifelong study of the literature on long-dead animals. (As difficult as that was for a poor, lower-class woman to manage.) When she was 12, a year after her father died and sent the family deeper into poverty, she and her brother discovered the four-foot head of an ichthyosaur - one of the big, nasty creatures that inhabited the seas 200 million years ago. Over the next few months, Mary recovered the rest of the bones - all 17 feet of them. The find, sold to one of the gentleman scientists, sent shock waves through a society that largely believed extinction was impossible, because it would have called into question the perfection of God's creation.

As the years passed, she discovered fossils of plesiosaurs, fossil fish and pterosaurs that basically revolutionized the early 19th Century study of fossils, a discipline that later became paleontology. Unfortunately, she seldom got the credit, which she resented. However, by the end of her life, many of the major scientists of the day visited her village and often accompanied her on fossil hunting expeditions. (It was Anning who first suggested that oddly shaped fossils containing the bones of small fish and other creatures were fossilized ichthyosaur poop - later named "coprolites," which have amused undergraduate students ever since.)

Anning died of breast cancer at 47, still largely underappreciated, and still poor. It wasn't until the 20th Century that she received the recognition she deserved. I'm not pleased - oh, coprolite! - that my own appreciation had to wait until the fall of 2010.

Friday, October 22, 2010

Big Lies about the constitution

It clearly is fair to use Tea Party terminology and ask this: Why do those in the Tea Party movement hate the U.S. Constitution?

After all, they lie about it all the time.

For instance, it quickly became obvious to the patriots of the 1780s that the 13 former colonies needed a real government. So, after much debate, they came up with our system of government. Yet Nevada Republican Senate nominee Sharron Angle, who says we need to "phase out" Social Security and Medicare, can say that "government isn't what our founding fathers put into the Constitution." Another Tea Party lie: The writers of the Constitution - very much on purpose - left out any mention of God or Jesus from the document, and the First Amendment starts out by saying that Congress "shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof." Yet Sarah Palin can say that the Constitution "acknowledg(es) that our unalienable rights ... come from God."

The baloney continues. Says Glenn Beck: opponents of Tea Party politics aim "to separate us from our Constitution and God."

You have to wonder why these people - obvious haters of the U.S. Constitution as it really is - think they can pull this enormous bluff - this Big Lie - on the American people. The bluff seems to be working for a simple reason: the economy is in the shitter. The jobless rate always tells the electoral tale - "It's the economy, stupid." (Never mind that anti-regulation Republicans raised the toilet seat.) Tea Party folk don't seem to realize that it is the rich, elite people who are paying their way. Hell, many of their favorite candidates are wealthy beyond their dreams. But are Americans really so dumb? Maybe so, at least in 2010, when jobless rates remain terrible. Still, copies of the U.S. Constitution are easily available. A quick look would reveal the lies. Unfortunately, a quick look at Fox News apparently is easier.

Thursday, October 21, 2010

Out of the drink

You may be glad to know that while arthropods (little guys with tough exoskeletons) almost certainly were the first animals to make the transition from water to land, they had nothing to do with getting important creatures (like us!) out of the drink. That took a special kind of fish.

(Actually, of course, that kind of fish started the whole shebang of tetrapods - all four-legged vertebrates like mammals, amphibians and reptiles. And all it takes to count as a tetrapod is that your ancestors started out on four legs. We're included, because early primates had four feet. Never mind that we went to two legs, apparently to play basketball above the rim. Even creatures that have lost their legs - whales, seals, even snakes - still are classed as tetrapods.)

Anyway, that special kind of fish is known as a lobe-finned fish - very rare today. Unlike most fish, who had (and have) goofy little fins with goofy little bones made for swimming, not walking, lobe-finned fish (known as sarcopterygian fish, as if you care) had fin bones that were thick and stubby and not too far away from the kind of leg and foot bones tetrapods were going to need to lift their big, fat bodies off dry ground and actually walk around.

Lobe-finned fish that made it to land evolved into amphibians, which at first spent little time out of water. Water was where the sex was. But they had an advantage - if their pool was shrinking, for instance, they could go cross-country to find a better pond. They also developed lungs, letting them stay on dry land for longer and longer periods. And they developed a neck (fish don't have necks) so they could move their heads around to snap up prey, like those delicious little arthropods. Good old natural selection!

There long has been plenty of fossil evidence for this transition to four-legged creatures, but in 2004 scientists just about had a cow when a "perfect" missing-link fossil was found on one of Canada's northernmost islands. It clearly had features of both a lobe-finned fish and a tetrapod.

This big transition took place some 400 million years ago (or so), but we probably ought to take the time to remember the lobe-finned fish, ugly as it was. By evolving as it did, it set us on our way.

Wednesday, October 20, 2010

Bugs that fly

Lately I've been watching a lecture series by Professor Anthony Martin called "Major Transitions in Evolution." And frankly, it is a bit of a problem. Too many Latin-derived names of extinct creatures, too many distinctions between 490, 450 or 420 million years ago, too much cladistics, too fast.

But sometimes the ideas pop out, as Martin wants them to. Such is the case with flying insects.

The study of when and how insects grew wings and began to fly - the first powered flight ever - is fascinating. It relates to the amazing ability of insects to evolve. They reproduce like crazy and turn out offspring like there's no tomorrow, so they can evolve to meet almost any environmental change. Flying is that kind of evolutionary change - one that, across the animal kingdom, has happened time and again. (The most famous flying bug, an early dragonfly, had a wingspan two and a half feet across. It could eat you.)

But what is interesting about flying insects is that they made the world as we see it. First, they co-evolved with plants to pollinate them. From that came fruits, which were essential for the later evolution of primates like us. Second, they became major vectors for disease, which also greatly affected primate evolution.

I'm not sure I need to know Latin-derived nomenclature not to cringe next time I see a dragonfly.

Canada's Paul Revere

I recently mentioned War of 1812 hero Laura Secord, known in Canada as their equivalent of Paul Revere according to U.S. historian Alan Taylor. I'd never heard of the woman, and I'll bet few other Americans have either. I bought Taylor's recent book on the war but - rats - there were few details. So ... off to the Web.

Laura Secord (1775-1868) was born in Great Barrington, Mass., but left for Canada as a child with her loyalist family. In June 1813 she was living in the Niagara region with her husband, James, who still was recovering from war wounds sustained the preceding October. She learned - just how is unclear - that American forces occupying the area were planning a surprise attack on a much smaller group of British troops and Mohawks led by Lieutenant James FitzGibbon. The next morning, leaving her husband behind, the 38-year-old set off alone on foot through difficult, swampy terrain to warn FitzGibbon. As a result, FitzGibbon's men and their Mohawk allies were able to set up an ambush and capture most of the American troops - some 500 of them - at the Battle of Beaver Dams. It limited American control of the Niagara Peninsula in one of the war's most strategic victories.

In later years, she told differing stories about how she gained the intelligence. Some think she was protecting an American source who would have faced charges of treason. In any event, it wasn't until much later that the Prince of Wales - Queen Victoria's eldest son and the future King Edward VII - heard the story on a visit to Canada and granted her an interview. She was 85. He later sent her a 100-pound award - her only official recognition during her lifetime for her actions. But today a full-size statue of Laura Secord stands at the Valiants Memorial in Ottawa.

Today, it is likely that even many Canadians know the name only from the Laura Secord Chocolate Company, founded in 1913. The chocolates are supposed to be really good.

Tuesday, October 19, 2010

A different sort of cartoon

Many cartoons about evolution show fish - somehow now with legs - climbing up a beach. But the sea creatures that probably climbed that beach first were really arthropods.

Arthropods today are a varied bunch of creatures with chitinous exoskeletons - crustaceans, insects, arachnids and so on. Once on land, some 490 million years ago, creatures like them needed to avoid dehydration, withstand gravity outside the water, deal with temperature swings, and so on. Those exoskeletons helped. It was only later that certain fish made their way to land. Our salty innards recall those early days.

The cartoon pictures of fish climbing a beach, like the line of ape-like to modern humans in other cartoons, are rather misleading in implying progress. After all, big-brained types like us are so smart we really know how to screw up big time. You have to wonder whether those creatures who were first out of the salty oceans might be the last creatures standing. Enrico Fermi had a famous response to the question of whether E.T.s are out there: if they are, he wondered, "Where are they?" Well, do you suppose the E.T. equivalents of arthropods - insects, spiders and so on - after taking over once the big-brained folk wiped themselves off the face of their planets, carried on with a way of life in which exploration of space isn't exactly a goal? A life in which religion, politics, weapons of mass destruction and all that simply go poof, never to exist again?

In the long term, maybe the last living creatures throughout the universe just keep living a good old arthropod life until their various suns go nova. This rather grim alternative cartoon - and of course that's all it is - suggests that life may end, not with fire or ice, but with a scuttle.

Sunday, October 17, 2010

Studying Robert Welch

When I was a boy growing up in northwestern Wisconsin, I already knew about Sen. Joseph McCarthy. Hell, I had an aunt who ate that kind of conspiracy baloney up for breakfast. So, in the early 1960s when I was in junior high (that's what middle school was called) and I started hearing and reading about the John Birch Society, nobody had to tell me that these folks were nut cases. And I saw that Robert Welch, the group's founder and chief nincompoop, appeared to be seriously impaired: He'd assert as fact that the U.S. government already was in the hands of the Communists, and that President Dwight D. Eisenhower was "a dedicated, conscious agent of the Communist conspiracy" and had been "all of his adult life."

I took this walk down memory lane - sorry, Ike - after reading an article by Sean Wilentz in the latest "New Yorker." Wilentz, a professor of history at Princeton whose books include "The Age of Reagan: A History, 1974-2008," wrote about how people like Glenn Beck and many other (but certainly not all) Tea Party types are channeling John Birch Society "facts" and making them their own. For instance, Beck echoes the John Birch line by saying that the conspiracy started with the Progressive era when, particularly, President Woodrow Wilson's administration committed the sins of originating the Federal Reserve system and the graduated income tax. "Wilson just despised what America was," Beck told his radio audience.

Wilentz's main point is to ask why a Republican Party that has mostly managed to stifle such nut-caseness within its ranks for half a century has suddenly quit trying to do so, embracing Tea Party thinking instead. It's a development that bodes little but ill over the long run for the party, despite the outcome of the coming general election. It's a development that wiser heads, such as the late Bill Buckley, fought long and hard to prevent.

When I grew up and moved to Montana, we had our share of strange ideas. Think the Freemen. Think people terrorized by thoughts of black helicopters. But these days, what worries me is that too many Republicans I'll find on the ballot this November might have been studying up on tracts of the John Birch Society.

Saturday, October 16, 2010

The war Canada won

In a recent post I raised the question of whether most Americans are as fuzzy about the origins of the War of 1812 as I was. I suspect that's so - most of us think we fought and won a defensive war (they burned Washington, after all), while actually we started the war and were lucky to come out of it intact.

Now there's a new book by respected historian Alan Taylor called "The Civil War of 1812: American Citizens, British Subjects, Irish Rebels, and Indian Allies." I ordered a copy from Amazon after reading a review in the latest New York Review of Books titled "The War We Lost - and Won."

By "Lost," the title refers to the fact that the U.S. achieved little beyond a few late naval victories that let it find an honorable peace. By "Won," it refers to the fact that the war, for the first time, started to make the citizens of the former colonies feel like a real nation.

Despite the murkiness of the start of the war - some historians think a combination of westerners with an eye on a Canada land grab and southerners eyeing Spanish Florida pushed through the declaration of war - Taylor suggests a deeper cause. He contends that both U.S. Republicans (most Federalists were against the war) and Loyalists in Canada felt the continent was not big enough for their two forms of government to coexist. America's rambunctious republic and Britain's aristocratic empire were too different - one had to prevail. Taylor likens the War of 1812 to a continuation of the Revolutionary War, a "civil war between competing visions of America: one still loyal to the empire and the other still defined by its republican revolution against that empire."

That theory can be argued. But when it comes to who actually "won" the war, there's little question. Canada did. Canada, with a tiny fraction of the population of the U.S., held off the invaders. The British and their allies took the American fort at the junction of Lake Huron and Lake Michigan without a fight. In August 1812, an entire American army surrendered at Detroit without firing a shot. Did the British burn Washington? A year before, Americans had burned and looted the capital of Upper Canada - now called Toronto. The war created, Taylor said, Canada's "own patriotic icons, particularly the martyr Isaac Brock and the plucky Laura Secord, their equivalent of Paul Revere." (No details about them in the review. Wait until I read the book.)

The war buoyed Canada into a feeling of nationhood, too. When the bicentennial of the war comes around in 2012, don't expect Americans to take much notice. But don't be surprised if, in Canada, there are parades and celebrations of victory against overwhelming odds.