Sunday, October 31, 2010

On emptiness

It's hard to read much popular science writing without learning that atoms - and consequently anything made of atoms, such as a chair - are pretty much made up of nothing. A hydrogen atom, say, has a nucleus consisting of a single proton, with a single electron buzzing around it as a quantum probability cloud. All the space between the two has no matter in it at all. (Sure you want to sit in that chair?)

But just how empty is an atom? The guy teaching my DVD lecture course on astronomy did some math. It turns out that our hydrogen atom is empty indeed. It is 99.999999999999 percent empty. That's 14 nines. (Still going to sit in the chair?)

But there is emptiness, and there is emptiness. The teacher (Alex Filippenko, University of California, Berkeley) did some more math and found that the ratio of the radius of that electron cloud to the radius of that proton is about 50,000. Then, rather arbitrarily, he compared that result to the Milky Way Galaxy by finding the ratio of the distance to the nearest star (4.2 light years) to our sun's radius of 700,000 kilometers. That ratio turns out to be about 60 million. Considering just the stars, anyway - which of course are a rather small part of the whole, given all the dark stuff - our galaxy is 1,200 times emptier than an atom!
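
For anyone who wants to check the math, here is a rough back-of-the-envelope version in Python. The numbers are rounded textbook values of my own choosing (and the "radius" of an electron cloud or a proton is fuzzy to begin with), so the results only loosely track the lecture's figures:

    # Rough check of the lecture's emptiness numbers, using rounded values.
    bohr_radius_m = 5.3e-11      # approximate radius of hydrogen's electron cloud
    proton_radius_m = 1.0e-15    # approximate radius of a proton

    size_ratio = bohr_radius_m / proton_radius_m        # the lecture's ~50,000
    filled = (proton_radius_m / bohr_radius_m) ** 3     # volume scales as radius cubed
    print(f"cloud/proton radius ratio: {size_ratio:,.0f}")
    print(f"percent empty: {100 * (1 - filled):.12f}")  # prints 99.999999999999

    light_year_km = 9.46e12
    star_distance_km = 4.2 * light_year_km              # distance to the nearest star
    sun_radius_km = 7.0e5
    galaxy_ratio = star_distance_km / sun_radius_km     # the lecture's ~60 million
    print(f"star distance/sun radius: {galaxy_ratio:,.0f}")
    print(f"galaxy vs. atom: {galaxy_ratio / size_ratio:,.0f} times emptier")  # ballpark of the lecture's 1,200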

And then there's my brain. But let's not go there.

Saturday, October 30, 2010

Galileo's middle finger

Most of us know about all we need to know about Galileo Galilei - a great early scientist whose telescopic observations of the phases of Venus and Jupiter's moons pretty much put paid to Ptolemy's ancient system in which all heavenly bodies circled the Earth. A martyr (of sorts) who was tried by the Catholic Church for advocating Copernicus's heliocentric system and forced to recant. A guy who, according to what probably is a mere legend, muttered - kneeling before the Inquisition - "And yet it moves" (referring to the Earth).

But you want to know about the middle finger. Professor Alex Filippenko, the teacher of a lecture series I'm watching, said Galileo is one of his heroes. After his senior year in college, Filippenko traveled to Europe, visited Florence, and stumbled on a relatively small museum of science called the "Istituto e Museo di Storia della Scienza" - the "Florence Institute and Museum of the History of Science" to you. Wandering through the collection of medieval scientific instruments, he came across the phalange in question.

It stands upright within a small glass egg, held in a larger cup, looking for all the world like, well, the finger.

Most sources I checked on the Web - secondary sources that were very secondary - said that after Galileo's death students stole the middle finger of his right hand from the corpse. Another source implied the whole hand was taken, and the finger wasn't snapped off until 95 years later (when it had sufficiently aged). Then it was "passed around" for a few hundred years until the Florence museum obtained it. Not the most exacting of provenance.

Anyway, as one Web site said, "It stands displayed upward defying all those that opposed the advances of science." Or maybe it remains little more than a student prank. I'll let you decide, if decide you must.

Friday, October 29, 2010

Enjoying winter in July

Until I recently figured it out, I was becoming more and more irritated by the lecture course I've been watching called "Understanding the Universe: An Introduction to Astronomy." Geez, it seemed like grade-school stuff - especially the lecture devoted to how the seasons occur. Then I realized these first few lectures were devoted to "celestial sights that everyone can see." OK, we'll move on...eventually.

Regarding that "how the seasons occur" lecture, I sat through much of it with a certain amount of ennui. The professor, Alex Filippenko of the University of California, Berkeley, explained that studies of the graduates of big-league universities consistently show that these people - by a 25-to-2 margin - say the seasons happen because the earth moves closer to and farther from the sun. That's why Filippenko felt it necessary to devote a whole lecture to the topic. So, off we go, talking about how the earth's axis is tilted 23 and a half degrees from the perpendicular to the plane of its orbit, so the northern hemisphere gets more energy in summer, blah, blah. (I thought everybody knew this.)

Anyway, at the end of the lecture, the professor raised a topic that I had "learned" but had pretty much forgotten: "precession." (It was the similar, but strangely different, precession of Mercury's orbit that so befuddled 19th- and early 20th-Century scientists - a befuddlement ended by Einstein.) For Earth, precession means that the gravitational forces of the sun and moon, mostly, want to pull our tilted spin axis upright. But our planet's rotation resists this, and instead it just precesses - it sort of wobbles around like a top.

This wobble occurs over a 26,000-year period. This means that in half that time - 13,000 years - the earth's axis will point the other way! The northern hemisphere will have summer in what used to be the winter, and winter in what used to be the summer! And 13,000 years later still, we'll be back to where we are today!

I find this so cool! The transition will be so slow as to be imperceptible over anyone's lifetime. But just imagine (if we're lucky) kids enjoying a cold winter day in July, 15,010 - "Look at the snow! Grab your sled!"

Thursday, October 28, 2010

Deceiving children

Children of the world ... you are being deceived! Just as generations of children before you have been deceived! You think the sun is yellow!

Oh, the pity of it! Open any illustrated children's book. Find an illustration of children playing in the happy grass beneath a benevolent, glowing sun. Every time, that sun is colored yellow! Horrors! The sun, children, is white. Sure, kids, the sun looks yellow, or even red, late and early in the day as it sets or rises and has to shine through much more of our gunked-up atmosphere than it does from high in the sky. But sunshine actually is - we learn as we grow up if we pay attention in science classes - nothing but white light. You need a prism of some kind to find colors in it. Yet, brain-washed as you children so sadly are, you think the sun is yellow. You know enough not to stare at the noontime sun, but if you did, it would look, yes, white.

Would that were the only deception. But, sadly, children, even your eyes deceive you. It's enough to make one weep.

When the setting sun appears to be perched on the horizon - just kissing the earth - it actually has already set! Oh, the lies! It turns out the earth's atmosphere bends the sunlight by an amount just about equal to the sun's apparent diameter. So when, children, it looks like the sun is just ready to begin sinking under the horizon, it actually already has sunk below it! Do grownups tell you this? No! This is not right! It is not honest! Children, you are being deceived!
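
(For the grownups: the rough numbers behind this little scandal, as I understand them and not from the lecture, are that typical refraction at the horizon is about 34 arcminutes while the sun's disk spans about 32 arcminutes - roughly one full sun-width of lift.)

    # Rough comparison (both values shift with temperature, pressure and season):
    refraction_at_horizon_arcmin = 34.0  # typical bending of light at the horizon
    sun_disk_arcmin = 32.0               # apparent width of the sun's disk
    print(refraction_at_horizon_arcmin / sun_disk_arcmin)  # ~1.06: about one sun-width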

Children, it makes you wonder what else grownups are lying about!

Wednesday, October 27, 2010

Canada's free land

In the years immediately following the American Revolution, things were looking grim for the newly independent United States. The states had to impose huge tax increases to meet their war debts, the Continental Congress had little power, and rancorous factional divisions were well underway. There's no better example than Shays's Rebellion, which erupted in Massachusetts in 1786 and was named after Captain Daniel Shays, a disaffected army officer. Armed farmers, unable or unwilling to repay their debts and also cover their growing tax bills, shut down the courts and resisted state troops sent to suppress them. The British, already convinced that any such ridiculous thing as a "democratic republic" was bound to fail, wanted to pick up the pieces and return the wayward colonists to the fold of the empire.

And they had a plan. How better to speed up the failure than to create, in Canada, an ideal British society? But such a society needed a much bigger population. British leaders offered Americans willing to settle in what was then called "Upper Canada" - the area above Lakes Erie and Ontario - a grant of 200 acres of land for only a tiny fee. And taxes on that land would be a small fraction of taxes to the south. Once Americans saw a well-governed, stable nation of happy settlers whose civil (if not political) rights were well protected, that silly "republic" surely would soon collapse!

We all know the plan didn't exactly work out. But what many of us don't know is that over the next 20 years, tens of thousands of Americans did indeed flock to Canada and swear allegiance to the king. According to historian Alan Taylor, one observer wrote in 1800 that "You would be astonished to see the people from all parts of the States, by land and by water, 250 wagons at a time, with their families, on the road, something like army on the move." The influx of "late loyalist" newcomers (mostly from New York, Pennsylvania and New Jersey) swelled the population of Upper Canada from 14,000 in 1791 to 75,000 in 1812.

Here's how the lieutenant governor of Upper Canada, John Graves Simcoe, greeted a group of such settlers in 1795: "You are tired of the federal government; you like not any longer to have so many kings; you wish again for your old father ... You are perfectly right; come along, we love such good royalists as you are, we will give you land."

But other Brits had doubts. They feared these American settlers might bring with them the seeds of rebellion. One warned against "the silver-tongued and arsenick-hearted Americans." Another called the newcomers "a base and disloyal population, the dregs and outcasts of other countries." Yet another said they would "form a nest of vipers in the bosom that now so incautiously fosters them."

Of course, both sides of this debate were mostly wrong. As would be the case today, most of the settlers didn't care one way or the other about politics or ideology. But free land? Low taxes? That was cool!

In 1808 settler Michael Smith said he emigrated from Pennsylvania "in order to obtain land upon easy terms, as did most of the inhabitants now there, and for no other reason."

Monday, October 25, 2010

Logic and the religious

In the current "Scientific American" magazine there is a column by the skeptic Michael Shermer praising Christopher Hitchens's "rapier logic." Shermer said he would "pit Hitchens against any of the purveyors of pseudoscience claptrap because of his unique and enviable skill at peeling back the layers of an argument and cutting to its core."

As an example, Shermer zeroes in on Hitchens's account of a television nature show about cave-dwelling salamanders that had lost their vision, a trait that no longer had any advantage in the darkness of the cave. Yet small indentations in their faces showed where their eyes once had been. Hitchens wrote that Creationists have long used the example of the eye as a major argument against evolution, yet what to say about eyes that disappeared - is this an imperfect God correcting himself?

I find this use of rhetoric rather beside the point when it comes to discussing religion. To be sure, it attacks those religious folk who feel the need to argue such things, but it ignores those who couldn't care less. In cases like this, I often think close to home - of my mother, whose faith was completely integral to her happiness. To lose faith would have been to lose the meaning of her life. Arguments against faith had no influence on her. She didn't - wouldn't - hear them. Talk about irrelevant!

I think it one thing to argue about the place of religion in politics, but quite another to go after the religion of the voters themselves. "Rapier logic" isn't going to change votes. It is more likely to turn them away.

Sunday, October 24, 2010

Brain shrinkage

Let's consider an interesting fact: Over the past 10,000 years, the human brain, in both males and females, has shrunk on average by about 100 cubic centimeters - roughly 7 percent. (And, for what it's worth, our skulls have gotten thinner.) Should we be worried about this, given that expanding brain size is one of the major measures by which scientists have traced the rise of modern humans over their hominid and still earlier ancestors, all of whom have gone extinct?

(Of course I'm tempted to talk politics now, but let's not go there.)

The question is why our brains are shrinking. The fact is that nobody knows why, or what it might mean for the future. One suggested reason for the decrease has to do with famine: over that period populations grew large and precarious, and crops - which arrived with farming some 10,000 years ago - sometimes failed. It makes evolutionary sense that an energy-hungry organ such as the brain would get smaller if food shortages persisted. Another idea about why brains are getting smaller is that as humans moved into cities, they no longer had to learn all the details of their widespread hunting grounds. But one would think such changes would be offset by the need for increased social interaction.

I would suggest that this brain-size thing is just one of the twists and turns of evolution that we simply don't understand, and not to worry. Maybe we're just getting more efficient, brain-wise. I will, however, to do my part, keep working those Sudoku puzzles.

Saturday, October 23, 2010

Mary Anning and ancient bones

It keeps happening! I keep bumping into important historical figures I've never heard of! (This retirement business is hard on the ego.)

I refer to Mary Anning. Heard of her? No? Well, then you clearly aren't a paleontologist.

Anning (1799-1847) was a poor, working-class woman who lived in the resort town of Lyme Regis in Dorset, along England's southern coast. Like her carpenter father before her, who supplemented his income by selling fossils to wealthy tourists, she spent her life as a fossil hunter. It helped that her village was near cliffs of limestone and shale formed some 200 million years ago - cliffs that kept crumbling down from winter rains and high tides, constantly exposing new fossils. It also helped that she was a natural-born scientist, despite her lack of social standing and formal education, and despite being a woman at a time when women weren't exactly accepted by the gentleman scientists of the day. Yet she knew more about fossils - and what they meant - than most of her scientific contemporaries.

Anning was one of 10 children, of whom only Mary and her brother Joseph survived to adulthood. She learned to read and write in Sunday school - and that was it, except for her lifelong study of the literature on long-dead animals (as difficult as that was for a poor, lower-class woman to manage). When she was 12, a year after her father died, sending the family deeper into poverty, she and her brother discovered the four-foot head of an ichthyosaur - one of the big, nasty creatures that inhabited the seas 200 million years ago. Over the next few months, Mary recovered the rest of the bones - all 17 feet of them. The find, sold to one of the gentleman scientists, sent shock waves through a society that largely believed extinction impossible, since extinction would raise questions about the perfection of God's creation.

As the years passed, she discovered plesiosaurs, fossil fish and pterosaurs - finds that basically revolutionized the early 19th Century study of fossils, a discipline that later became paleontology. Unfortunately, she seldom got the credit, which she resented. By the end of her life, though, many of the major scientists of the day were visiting her village and often accompanied her on fossil-hunting expeditions. (It was Anning who first suggested that oddly shaped fossils containing the bones of small fish and other creatures were fossilized ichthyosaur poop - later named "coprolites," which have amused undergraduate students ever since.)

Anning died of breast cancer at 47, still largely underappreciated, and still poor. It wasn't until the 20th Century that she received the recognition she deserved. I'm not pleased - oh, coprolite! - that my own appreciation had to wait until the fall of 2010.

Friday, October 22, 2010

Big Lies about the Constitution

It clearly is fair to use Tea Party terminology and ask this: Why do those in the Tea Party movement hate the U.S. Constitution?

After all, they lie about it all the time.

For instance, it quickly became obvious to the patriots in the 1780s that the 13 newly independent states needed a real government. So, after much debate, they came up with our system of government. Yet Nevada Republican Senate nominee Sharron Angle, who says we need to "phase out" Social Security and Medicare, can say "government isn't what our founding fathers put into the Constitution." Another Tea Party lie: The writers of the Constitution - very much on purpose - left out any mention of God or Jesus from the document, and the First Amendment starts out by saying that Congress "shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof." Yet Sarah Palin can say that the Constitution "acknowledg(es) that our unalienable rights ... come from God."

The baloney continues. Says Glenn Beck: opponents of Tea Party politics aim "to separate us from our Constitution and God."

You have to wonder why these people - obvious haters of the U.S. Constitution as it really is - think they can put this enormous bluff - this Big Lie - over on the American people. The bluff seems to be working for a simple reason - the economy is in the shitter. The jobless rate always tells the electoral tale - "It's the economy, stupid." (Never mind that anti-regulation Republicans raised the toilet seat.) Tea Party folk don't seem to realize that it is the rich, elite people who are paying their way. Hell, many of their favorite candidates are wealthy beyond their dreams. But are Americans really so dumb? Maybe so, at least in 2010, when jobless rates remain terrible. Still, copies of the U.S. Constitution are easily available. A quick look would reveal the lies. Unfortunately, a quick look at Fox News apparently is easier.

Thursday, October 21, 2010

Out of the drink

You may be glad to know that while arthropods (little guys with tough exoskeletons) almost certainly were the first animals to make the transition from water to land, they had nothing to do with getting important creatures (like us!) out of the drink. That took a special kind of fish.

(Actually, of course, that kind of fish started the whole shebang of tetrapods - all four-legged vertebrates, like mammals, amphibians and reptiles. And all it takes to be a tetrapod is that your ancestors started out on four legs. We're included, because early primates had four feet. Never mind that we went to two legs, apparently to play basketball above the rim. Even creatures that have lost their legs - whales, seals, even snakes - still are classed as tetrapods.)

Anyway, that special kind of fish is known as a lobe-finned fish - very rare today. Unlike most fish, who had (and have) goofy little fins with goofy little bones made for swimming, not walking, lobe-finned fish (known as sarcopterygian fish, as if you care) had fin bones that were thick and stubby and not too far away from the kind of leg and foot bones tetrapods were going to need to lift their big, fat bodies off dry ground and actually walk around.

Lobe-finned fish that made it to land evolved into amphibians, who at first spent little time out of water. Water was where the sex was. But they had an advantage - if their pool was shrinking, for instance, they could go cross-country to find a better pond. They also developed lungs, letting them stay on dry land for longer and longer periods. And they developed a neck (fish don't have necks) so they could move their heads around to snap up prey, like those delicious little arthropods. Good old natural selection!

There long has been plenty of fossil evidence for this transition to four-legged creatures, but in 2004 scientists just about had a cow when a "perfect" missing-link fossil was found on one of Canada's northernmost islands. It clearly had features of both a lobe-finned fish and a tetrapod.

This big transition took place some 400 million years ago, but we probably ought to take the time to remember the lobe-finned fish, ugly as it was. By evolving as it did, it set us on our way.

Wednesday, October 20, 2010

Bugs that fly

Lately I've been watching a lecture series by Professor Anthony Martin called "Major Transitions in Evolution." And frankly, it is a bit of a problem. Too many Latin-derived names of extinct creatures, too many distinctions between 490, 450 or 420 million years ago, too much cladistics, too fast.

But sometimes the ideas pop out, as Martin wants them to. Such is the case with flying insects.

The study of when and how insects grew wings and began to fly - the first powered flight ever - is fascinating. It relates to the amazing ability of insects to evolve. They reproduce like crazy and deliver offspring like there's no tomorrow, so they can evolve to meet seemingly any and all environmental changes. Flying is that kind of evolutionary change, and it happened time and again. (The most famous flying bug, an early dragonfly, had a wingspan two and a half feet across. It could eat you.)

But what is interesting about flying insects is that they made the world as we see it. First, they co-evolved with plants to pollinate them. From that came fruits, which were essential for the later evolution of primates like us. Second, they became major vectors for disease, which also greatly affected primate evolution.

I'm not sure I need the Latin-derived nomenclature to keep from cringing the next time I see a dragonfly.

Canada's Paul Revere

I recently mentioned War of 1812 hero Laura Secord, known in Canada as their equivalent of Paul Revere according to U.S. historian Alan Taylor. I'd never heard of the woman, and I'll bet few other Americans have either. I bought Taylor's recent book on the war but - rats - there were few details. So ... off to the Web.

Laura Secord (1775-1868) was born in Great Barrington, Mass., but left for Canada as a child with her loyalist family. By 1813 she was living in the Niagara area with her husband, James, who was still recovering from war wounds sustained the preceding October. In June of that year she learned - just how is unclear - that American forces occupying the region were planning a surprise attack on a much smaller force of British troops and Mohawks led by Lieutenant James FitzGibbon. The next morning, leaving her husband behind, the 38-year-old set off alone on foot through difficult, swampy terrain to warn FitzGibbon. As a result, the British and their Mohawk allies were able to set up an ambush and capture most of the American troops - some 500 of them - at the Battle of Beaver Dams. It limited American control of the Niagara Peninsula in one of the war's most strategic victories.

In later years, she told differing stories about how she gained the intelligence. Some think she was protecting an American source who would have faced charges of treason. In any event, it wasn't until much later that the Prince of Wales - Queen Victoria's eldest son and the future King Edward VII - heard the story on a visit to Canada and granted her an interview. She was in her mid-80s. He later sent her a 100-pound award - her only official recognition during her lifetime for her actions. But now a full-size statue of Laura Secord stands at the Valiants Memorial in Ottawa.

Today, it is likely that even many Canadians know the name only from the Laura Secord Chocolate Company, founded in 1913. The chocolates are supposed to be really good.

Tuesday, October 19, 2010

A different sort of cartoon

Many cartoons about evolution show fish - somehow now with legs - climbing up a beach. But the sea creatures that probably climbed that beach first were really arthropods.

Arthropods today are a varied bunch of creatures with chitinous exoskeletons - crustaceans, insects, arachnids and the like. Once on land some 490 million years ago, creatures like them needed to be able to avoid dehydration, have the strength to withstand gravity outside water, deal with temperature changes, and so on. Those exoskeletons helped. It was only later that certain fish made their way to land. Our salty innards recall those early days.

The cartoon pictures of fish climbing a beach, like the march from ape-like ancestors to modern humans in other cartoons, are rather misleading in implying progress. After all, big-brained types like us are so smart we really know how to screw up big time. You have to wonder whether those creatures who were first out of the salty oceans might be the last creatures standing. Enrico Fermi had a famous response to the question of whether E.T.s are out there: if they are, he wondered, "Where are they?" Well, do you suppose the E.T. equivalents of arthropods - insects, spiders and so on - having taken over after the big-brained folk wiped themselves off the face of their planets, carry on a life in which exploration of space isn't exactly a goal? A life in which religion, politics, weapons of mass destruction and all that simply go poof, never to exist again?

In the long term, maybe the last living creatures throughout the universe just keep living a good old arthropod life until their various suns go nova. This rather grim alternative cartoon - and of course that's all it is - suggests that life may end, not with fire or ice, but with a scuttle.

Sunday, October 17, 2010

Studying Robert Welch

When I was a boy growing up in northwestern Wisconsin, I already knew about Sen. Joseph McCarthy. Hell, I had an aunt who ate that kind of conspiracy baloney up for breakfast. So, in the early 1960s when I was in junior high (that's what middle school was called) and I started hearing and reading about the John Birch Society, nobody had to tell me that these folks were nut cases. And I saw that Robert Welch, the group's founder and chief nincompoop, appeared to be seriously impaired: He'd assert as fact that the U.S. government already was in the hands of the Communists, and that President Dwight D. Eisenhower was "a dedicated, conscious agent of the Communist conspiracy" and had been "all of his adult life."

I took this walk down memory lane - sorry, Ike - after reading an article by Sean Wilentz in the latest "New Yorker." Wilentz, a professor of history at Princeton whose books include "The Age of Reagan: A History, 1974-2008," wrote about how people like Glenn Beck and many other (but certainly not all) Tea Party types are channeling John Birch Society "facts" and making them their own. For instance, Beck echoes the John Birch line by saying that the conspiracy started in the Progressive era, when President Woodrow Wilson's administration committed the sins of originating the Federal Reserve system and the graduated income tax. "Wilson just despised what America was," Beck told his radio audience.

Wilentz's main point is to ask why a Republican Party that mostly managed to stifle such nut-caseness within its ranks for half a century has suddenly quit trying to do so, embracing Tea Party thinking instead. It's a development that bodes little but ill for the party over the long run, whatever the outcome of the coming general election. It's a development that wiser heads, such as the late Bill Buckley, fought long and hard to prevent.

When I grew up and moved to Montana, we had our share of strange ideas. Think the Freemen. Think people terrorized by thoughts of black helicopters. But these days, what worries me is that too many Republicans I'll find on the ballot this November might have been studying up on tracts of the John Birch Society.

Saturday, October 16, 2010

The war Canada won

In a recent post I raised the question of whether most Americans are as fuzzy about the origins of the War of 1812 as I was. I suspect that's so - most of us think we fought and won a defensive war (they burned Washington, after all), while actually we started the war and were lucky to come out of it intact.

Now there's a new book by respected historian Alan Taylor called "The Civil War of 1812: American Citizens, British Subjects, Irish Rebels, and Indian Allies." I ordered a copy from Amazon after reading a review in the latest New York Review of Books titled "The War We Lost - and Won."

By "Lost," the title refers to the fact that the U.S. achieved little but a few late naval victories that let it find an honorable peace. By "Won," the title refers to the fact that the war actually, for the first time, started to turn the citizens of the former colonies into feeling like a real nation.

Despite the murkiness of the start of the war - some historians think a combination of westerners with an eye on a Canada land grab and southerners eyeing Spanish Florida pushed through the declaration of war - Taylor suggests a deeper cause. He contends that both U.S. Republicans (most Federalists were against the war) and Loyalists in Canada felt the continent was not big enough for their two forms of government to coexist. America's rambunctious republic and an aristocratic empire were too different - one had to prevail. Taylor likens the War of 1812 to a continuation of the Revolutionary War, a "civil war between competing visions of America: one still loyal to the empire and the other still defined by its republican revolution against that empire."

That theory can be argued. But when it comes to who actually "won" the war, there's little question. Canada did. Canada, with a tiny fraction of the population of the U.S., held off the invaders. British and Canadian forces took the American fort at the junction of Lake Huron and Lake Michigan without a fight. In August 1812, an entire American army surrendered at Detroit without firing a shot. Did the British burn Washington? A year before, Americans had burned and looted the capital of Upper Canada - now called Toronto. The war created, Taylor said, Canada's "own patriotic icons, particularly the martyr Isaac Brock and the plucky Laura Secord, their equivalent of Paul Revere." (No details about them in the review. Wait until I read the book.)

The war buoyed Canada into a feeling of nationhood, too. When the bicentennial of the war comes around in 2012, don't expect Americans to take much notice. But don't be surprised if, in Canada, there are parades and celebrations of victory against overwhelming odds.

Thursday, October 14, 2010

Merchants of Doubt

I recently read a review of three books about climate change in "Skeptical Inquirer," a magazine not exactly in tune with ESP, ghost hunting, evolution deniers, far-right foolishness, and silliness in general. I found it really informative.

The review, by a planetary scientist (not a climate scientist), discussed three books: "Storms of My Grandchildren" by climate scientist James Hansen, "Science as a Contact Sport" by the late climate scientist Stephen H. Schneider, and "Merchants of Doubt," by historians Naomi Oreskes and Erik M. Conway. The last book tells an especially interesting, and amply documented, tale.

It turns out there were four respected scientists who formed the core of those who signed on to try to debunk worries about global warming caused by the burning of fossil fuels. They were Fred Seitz, Fred Singer, Bill Nierenberg, and Robert Jastrow. These scientists not only tried to cast doubt on warming, but worked to undercut the science behind the dangers of cigarette smoking, industrial smoke and later acid rain, the threat of DDT, the bad news about second-hand smoke, and ozone depletion - and attacked scientists who questioned the advisability and feasibility of the Star Wars missile defense system. To a man, each was a fervent anti-communist opposed to détente with the Soviet Union, opposed to government regulation (which they saw as stealing America's freedom), and so on. They turned their backs on scientific truth to back the political goals of the ultra-right, pushed by such groups as the Heritage Foundation, the Hoover Institution, the Hudson Institute, the Competitive Enterprise Institute, and the Cato Institute. To them, environmentalists were "watermelons": green on the outside, red on the inside.

According to review author David Morrison, as of this fall all but Singer have died. One wonders how these respected scientists, so loved by the George W. Bush administration, will be replaced.

Wednesday, October 13, 2010

Knowing Emmy Noether

I can embarrass myself simply by sitting alone in a chair reading a book. Here's what happened: Physicist Benjamin Schumacher was discussing the concept of symmetry when he mentioned one "Emmy Noether" with what seemed to be a reverence usually reserved for the likes of David Hilbert or Albert Einstein. Who the hell was this Emmy Noether?

Well, it turns out that Emmy Noether (1882-1935) was only the greatest woman mathematician in the history of the world.

Oh.

No less than Hilbert and Einstein called her the most important woman in the history of mathematics, and mathematician Norbert Wiener said she was "the greatest woman mathematician who has ever lived." Her work on what is called "Noether's Theorem," involving differential invariants in the calculus of variations, has been called "one of the most important math theorems ever proved in guiding the development of modern physics" and "a cornerstone in general relativity."

Schumacher was awed by her 1915 proof that every symmetry in the laws of physics leads to a conservation law, and every conservation law arises from a symmetry of the laws of physics - a principle, he said, invaluable to this day.
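
For the curious, here is the flavor of the result in symbols - my own schematic paraphrase, not Schumacher's notation, and physicists state it far more carefully. If a system's Lagrangian $L(q, \dot{q}, t)$ is left unchanged by a continuous shift of its coordinates, then a particular quantity built from $L$ never changes in time:

\[
L \ \text{unchanged under}\ q \to q + \epsilon\,\delta q
\quad\Longrightarrow\quad
Q = \frac{\partial L}{\partial \dot{q}}\,\delta q \ \text{is conserved.}
\]

Invariance under spatial shifts gives conservation of momentum, invariance under rotations gives conservation of angular momentum, and (with a small elaboration of the same machinery) invariance under shifts in time gives conservation of energy.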

But to the math folks, she's more famous for her theories of algebraic invariants and number fields that "transformed the face of algebra." I checked Noether out online, and quickly was overwhelmed by page after page of discussion so far above my head it might as well have been hiding behind the moon.

Noether received treatment typical for a woman scientist of her time. She worked for many years without pay, and never received a full professorship in her native Germany. In 1933, because she was a Jew, the Nazi government jackbooted her out of the university, and she moved to the United States with the help of Einstein and others. She took a position at Bryn Mawr College and lectured at the Institute for Advanced Study in Princeton, continuing to produce ground-breaking work. She died at 53 after an operation for an ovarian cyst.

I probably shouldn't be beating myself up for never having heard of Noether. After all, my math training ended after first-year algebra.

But the "most important woman mathematician in history?" And I didn't know her from Josephine Blow? Arrgh!

Monday, October 11, 2010

The Black Hole and the Light Bulb

Walk into a dark room and turn on a light bulb. The light heads off in every direction - to all sides, up to the ceiling, down to the floor. Big whoop. After all, that's what light does.

But I think that very fact goes to the heart of a misunderstanding that most of us have about what happens inside a black hole.

When we're told that a black hole's gravity is so strong that nothing can escape it, not even light, we naturally think of light trying to get out, but being overcome and falling back - much like a baseball thrown into the air or a failed rocket.

Unfortunately, that just isn't the way to think about it. Sorry.

Let's imagine our shining light bulb crossing the event horizon into a black hole. Suddenly, its light doesn't spread out in all directions anymore! It goes in only one direction - right smack toward the center of the black hole. It doesn't head in the direction of an escape route (only to fall back), because there literally is no direction that goes that way. (As the lecture I was watching this afternoon put it, "all roads lead to Rome.")

The reason all roads lead to the center of the black hole is not gravity by itself, but gravity's monstrously huge distortion of space-time, which twists things in such a way that, for light and anything else, including the poor light bulb, there simply is no outward direction.

The lecture by Professor Benjamin Schumacher, an extremely accomplished physicist teaching a course for dummies like me, explains all this in terms of tilting "light cones," which is fine for those of us who are comfortable with light cones. But I just invented this light-bulb picture as a simpler way to think about it. Maybe it even helps.
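
And for readers who don't mind an equation, the standard bookkeeping behind both pictures is the Schwarzschild metric - this is the usual textbook sketch, not anything taken from the lecture:

\[
ds^2 = -\left(1 - \frac{2GM}{rc^2}\right)c^2\,dt^2 + \left(1 - \frac{2GM}{rc^2}\right)^{-1}dr^2 + r^2\,d\Omega^2 .
\]

Outside the horizon ($r > 2GM/c^2$) the factor $(1 - 2GM/rc^2)$ is positive, so $t$ behaves like time and $r$ like space. Inside the horizon the factor changes sign and the two swap roles: decreasing $r$ becomes the future direction. For the light bulb, heading toward the center is no more optional than heading toward next Tuesday.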

Sunday, October 10, 2010

Forecasting the future

The French scientist Pierre-Simon Laplace, when asked by Napoleon why he didn't mention God in his work on celestial mechanics, famously responded, "I had no need of that hypothesis."

But Laplace, one of many scientists to deal with Newton's deterministic view of reality, is perhaps best known these days for imagining a being with complete, perfect knowledge of the position and velocity of every particle in the universe. With such knowledge, the being could in principle predict the future of the entire universe, exactly and forever.

Setting aside quantum mechanics and its brand of uncertainty and sticking to classical physics, Laplace's idea stuck around for a long time. I had young friends who remained wowed by the concept. But I, no scientist, still smelled a rat. And sure enough, along came chaos theory.

Edward Lorenz stumbled onto the concept while using computer programs to study weather forecasting. It turns out that weather - and indeed nearly all of what you can measure of the workings of the natural world - is extremely sensitive to initial conditions. If you could gather weather data from around the world with 100 times greater accuracy, you probably would extend your ability to forecast the weather by only a day or two.
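
You can watch that sensitivity happen with a few lines of code. Here is a toy sketch of my own (not Lorenz's original program) that runs his famous three-variable "weather" model twice, with the second run starting off by one part in a hundred million; the two forecasts agree for a while and then go their separate ways:

    # Two runs of the Lorenz system whose starting points differ by 1e-8.
    def lorenz_step(x, y, z, dt=0.001, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        # Advance one small Euler step of the Lorenz equations.
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        return x + dx * dt, y + dy * dt, z + dz * dt

    a = (1.0, 1.0, 1.0)           # first "forecast"
    b = (1.0 + 1e-8, 1.0, 1.0)    # second "forecast", off by one part in 100 million

    for step in range(1, 40001):
        a = lorenz_step(*a)
        b = lorenz_step(*b)
        if step % 10000 == 0:
            gap = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
            print(f"t = {step * 0.001:5.1f}   separation = {gap:.6f}")

Before long the two runs look nothing alike - which is the whole point. No real measurement is ever precise enough to keep that divergence from happening.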

Newton's laws are indeed deterministic, but in practice nobody actually living in this universe could muster the precision to predict the future. Maybe Laplace's "being" could, if it knew each data point to an infinite number of decimal places, perform such a miracle. But, of course, that would make it a "God."

In this case, anyway, it looks like Laplace needed that hypothesis after all.

Friday, October 8, 2010

Still, it's all English

In about 1490, the early printer William Caxton told an interesting story. It seems that sailors whose ship had been becalmed in the Thames estuary went ashore to refresh their food supply. They happened on a farmhouse, and one of them asked the woman of the house if they could buy some "egges." The woman could hardly understand a word they were saying, and explained that she could speak no "frenshe." That upset the guy, who tried to say he could speak no French either. Then another in the party of sailors, who knew something of the dialect, broke in to say that they wanted some eyren. Sure, said the woman.

Both sides in this little drama spoke English. They just couldn't understand each other. One side talked about egges, the other about eyren. And, although the dialect gaps narrowed in England over the coming century, big differences remained. Those differences were the cause of the first regional accents in colonial America.

The first permanent English settlement, at Jamestown, dates to 1607. Just 13 years later, the "Mayflower" came ashore at Cape Cod. The two groups came from different places: the colonists in the south came from England's "west country," while the Puritans came from the east of England, near the North Sea. The western types typically pronounced an "r" that came after a vowel; those in the east didn't. Hence, New Englanders pronounced Harvard this way: "Haava'd."

Some echoes of those early accents remain along the east coast, but for the most part the United States developed its regional accents all on its own. For instance, I grew up in northwest Wisconsin, not too far east of the Twin Cities and the Mississippi River, and I have no trouble understanding that the worm in the backyard is an angleworm, a comforter is a quilt, and a boulevard is that grassy, city-owned strip between the sidewalk and the street.

On the other hand, I never heard anyone in the north talk about a chifforobe (a wardrobe), clabber (curdled milk), or goobers (peanuts). And I certainly never heard anyone utter, except in jest, y'all.

Thursday, October 7, 2010

Told you spelling was hard

Loanwords from other tongues make up a significant part of English - as they do of many other languages - and not surprisingly, words from the native peoples of North America's east coast were an early example. For instance, Algonquian terms like skunk and manitou (a deity) were recorded in English in 1588. By 1607, the language had already picked up such still-common words as moccasin, totem, moose, opossum, and tomahawk.

But apparently some adoptions were more difficult than others. According to David Crystal in "The Stories of English" (2004), in 1608 a London printer published an account of an early exploration of Virginia by Captain John Smith. In a story about his visit to the Powhatan Indians, he describes the chief as "richly hung with Manie Chaynes of great Pearles about his necke," and covered with a blanket made from skins of an animal Smith called a "rahaughcum."

During Smith's time, spelling wasn't exactly a refined art, but "rahaughcum"? Can you figure out which animal he meant?


Answer: Smith's was the first English spelling of "raccoon." (Crystal, a Brit, spelled it "racoon.")

Tuesday, October 5, 2010

Eye of the beholder

Once again I've come across a famous experiment that I'd never heard of. Back in the early 1970s, Stanford University Professor David Rosenhan and a handful of his students and colleagues faked vague symptoms and got themselves admitted to mental hospitals. (They showed up at different places and different times, and nobody at the facilities knew they were up to something.)

But once admitted, they reverted immediately to their "normal" behavior. Observing them anywhere else, no one in the psychiatric community would have thought these people might be in need of psychiatric treatment. However, it turned out that to the hospital staff, the experimenters had become "types" - types in need of help, no matter how they acted. In fact, one of them - a nationally recognized expert in the treatment of depression - discussed with a hospital staffer at length and in detail his view that another patient suffering from depression was being given the wrong drug. Here's what the staffer wrote as his evaluation: "suffers from grandiosity."

Rosenhan wrote up his study of this sort of "typecasting" in the January, 1973, issue of "Science." He called his paper "On Being Sane in Insane Places." (And no, I really don't feel like talking about your office environment.)

Sunday, October 3, 2010

Darwin's nose

Toward the end of the 1700s and the beginning of the 1800s, there was great interest in the study of personality "types": criminal types, lazy types, meticulous types, trustworthy types ... you name it.

One of the most prominent among such "scientists" was Johann Kaspar Lavater (1741-1801), who became known as the father of physiognomy. According to Lavater, facial characteristics - skull shape, the jut of the chin, the way the lips form expressions - reveal the various types of personality. Lavater's books included page after page of drawings of people's faces illustrating what these types looked like. (Armed with these drawings, people could eye their neighbors and decide which were trustworthy and which were criminals.)

Lavater's work became very popular, and not just among the gullible. One of his fans was a ship captain named Robert Fitzroy (1805-1865). Fitzroy's vessel was named the "Beagle." And when he wanted to hire a naturalist to accompany him on his upcoming voyage, one of the people he considered was Charles Darwin.

Darwin later recounted that during his interview he had been disconcerted by the fact that Fitzroy kept staring at his nose. Darwin ended up being hired, and later in the five-year journey he and Fitzroy became close enough friends to have a personal conversation. Darwin wrote that Fitzroy told him he very nearly had been rejected. It was your nose, Fitzroy told him. It just wasn't the nose of a competent, meticulous naturalist.

Darwin, the most famous and influential naturalist in the history of science, wrote that he was glad the captain had concluded that his nose had lied.

Saturday, October 2, 2010

Late night thoughts in the afternoon

Back in 1972, a sort of late mid-point in the Cold War, a theory called "punctuated equilibrium," introduced by evolutionary biologists Stephen Jay Gould and Niles Eldredge, provided a wry kind of solace. The theory was designed to explain how it was that fossils of many species seemed unchanged over vast periods at the same time that fossils of new species seemed to pop out of nowhere. The idea was that rapid changes in a species' ecology cause it to evolve quickly (in geologic time, remember) into a new species.

The solace, unscientific as it was and certainly unintended by the authors, involved the forlorn hope that a nuclear war - changing the earth into a world filled with radiation and cloaked in nuclear winter - might at least spawn evolutionary changes that could allow the surviving remnant of humankind to adapt to the new conditions.

Recent studies made possible by advances in genetics are dashing such hopes. While it is true that such studies have shown, for instance, that humans who migrated to the Tibetan plateau adapted to the thin air in a mere 3,000 years by evolving a gene variant that retunes the way their bodies make red blood cells, most such variation takes something like 50,000 years to spread through a population. Not exactly a big help in the event of a global catastrophe.

So, people worried about drastic climate change or the emergence of some incurable AIDS-like pandemic out of a rainforest can forget about humanity adapting its way out. Better to rely on technology. As "big history" shows, when conditions change too fast, species such as humans don't adapt. They just quietly go extinct.