Saturday, July 31, 2010

Cable news keeps dumbing down

That I don't watch cable news networks should go without saying. Especially MSNBC and Fox. (Hey, can YOU watch Bill O'Reilly without serious intestinal medicine?) But somehow, in my mind, CNN has been the go-to network for breaking big-time news, such as shuttle crashes, wars, and so on. (Hell, it was Gulf War coverage that put CNN on the map.)

Nancy Franklin, the savvy television critic for the New Yorker, has made me sad. She reports that CNN's best international reporter, Christiane Amanpour, has flown the coop for ABC. Meanwhile, Campbell Brown's show is leaving due to low ratings, and the venerable Larry King show is ending in the fall. So who is replacing them? Sheesh.

Brown's replacements will be Eliot Spitzer, the ex-governor of New York who resigned after the ole prostitute thing, and Kathleen Parker, a conservative columnist. I bought Parker's columns in my capacity as an editorial page editor. She's a bit of an air-head conservative, chosen to please our air-head conservative readers. And Spitzer. Well, can you spell hypocrisy? As for King's replacement, it seems to be British tabloid editor and celebrity interviewer Piers Morgan. All appear to yaw even farther from the course of mainstream hard journalism.

Good thing we'll always have newspapers to rely on.

Friday, July 30, 2010

Journalism vs. thin gruel

Contemplating the woes of newspapers (after a career spent in newspaper journalism), I started trying to think of earlier examples of technological changes that have brought down once-vital concerns.

It was surprisingly difficult. I guess you could count staged vaudeville revues losing out to radio entertainment, or radio "sitcoms," variety shows and spoken dramas giving way to TV sitcoms, the Ed Sullivan Show, and never-ending soaps. But what struck me is that, as the generations roll, the old forms just fade away, hardly missed.

So the question might be raised: in 2010, with new media (let's just lump it all together as the Internet) seeming to be shoving print journalism out of the picture, will the newspaper pass away equally unlamented? Just as I grew up with no nostalgia for a radio that offered more than a wallpaper of popular music, will today's young people quickly forget that daily stack of half-folded newsprint that used to start so many Americans' morning?

Maybe so. But in this case, we're not just talking about entertainment.

When I was young, the New York Times' brave printing of the Pentagon Papers and the Washington Post's Watergate exposés drove home the importance to democracy of enterprising, capable, thorough journalism found only in print. Recently, the Post's series on the out-of-control, counter-productive proliferation of a government-private "Top Secret America" reminds us once again of that importance. Somehow, one has trouble imagining Internet bloggers, TV blabbers, or any other institution taking a newspaper's place - nationally, or on Main Street America.

Such reporting really is necessary to a democracy. Unless we keep reading - and paying for - those stacks of newsprint, technological changes will result in a thin gruel indeed.

Thursday, July 29, 2010

Studies and illusions

Sharon Begley is one of the country's most popular science writers, in part because she seems to have an uncanny ability to zoom in on what the educated public is wondering about. (Begley, if memory serves, was lured away from the Wall Street Journal by Newsweek around the time some strange, ominous force took over ownership of the Journal.)

Anyway, this week she zoomed in on something I've often wondered about - what does the pervasive use of U.S. college undergraduates in psychological studies really tell us about humanity as a whole?

According to Begley's reporting of current research, not so much.

For instance, she says, a recent paper by Joseph Henrich at the University of British Columbia and colleagues considered the well-known optical illusion (the Müller-Lyer illusion) involving two vertical lines, each the same length. One (A) has arrows at each end pointing out, the other (B) has arrows at each end pointing in. To most U.S. college kids (and me!), (B) looks longer by 20 percent. But the illusion doesn't fool African hunter-gatherers known as the San of the Kalahari. Begley suggests American college kids are fooled because they grew up in urban areas filled with right angles - so a universal trait of the human mind it isn't.

Likewise, consider the well-known "fairness" game in which scientists give money to one player, who decides how to split it with another player. If the second player doesn't like the split, neither gets anything. Say the first player gets $10. If he offers the second player anything less than $4, the second player will say "stick it in your ear," and both walk away with nothing. That's so among American undergrads, anyway. But to people from some nonindustrial societies, an offer of about $2.50 will be accepted.
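The payoff logic of that game (researchers call it the "ultimatum game") is simple enough to sketch in a few lines. This is my own illustration, not anything from Begley's article; the function name is made up, and the thresholds come straight from the figures above:

```python
# A minimal sketch of the "ultimatum game" described above.
# Thresholds are illustrative: undergrads reportedly reject offers
# below ~$4 of a $10 pot; some nonindustrial players accept ~$2.50.

def play_ultimatum(pot, offer, min_acceptable):
    """Return the (proposer, responder) payoffs.

    If the responder's minimum isn't met, both get nothing."""
    if offer >= min_acceptable:
        return (pot - offer, offer)
    return (0, 0)

# An undergrad-style responder rejects a $3 offer: both go home empty.
print(play_ultimatum(10, 3, min_acceptable=4))        # -> (0, 0)
# A responder with a lower threshold accepts $2.50 of the same pot.
print(play_ultimatum(10, 2.50, min_acceptable=2.50))  # -> (7.5, 2.5)
```

The interesting part, of course, is that the `min_acceptable` number is cultural, not universal - which is exactly Begley's point.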

Even within the U.S., these kinds of studies can be misleading. For instance, sexual differences in spatial ability found among middle-class kids don't appear among poor kids.

Next time I attend college, and am approached by its psychology department to volunteer for an experiment, I just might have to decline.

Wednesday, July 28, 2010

Splitting nothing in two

Here's an interesting thing I've just learned: Einstein's famous equation E=mc squared (dammit, I have to learn how to make superscripts!) means that both matter and energy contribute to the total energy in the universe. It's a BIG amount. Atomic matter, dark matter, dark energy, radiation, it all counts. And it all creates gravity. But now consider this: gravity has negative energy. And, guess what? Gravity has the same amount of negative energy as mass/energy has positive energy! The total adds up to zero!

Is this some sort of cosmic joke? One thinks not. Somehow, the inflation of the Big Bang split "nothing" into two parts - mass/energy and gravity - and created a universe out of that split. We still end up, in a way, with "nothing!"
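For what it's worth, the bookkeeping can be written out in one line (and LaTeX even solves my superscript problem). This is just a rough sketch of the "zero-energy universe" idea, not a rigorous derivation:

```latex
E_{\text{total}}
  = \underbrace{E_{\text{matter}} + E_{\text{dark}} + E_{\text{radiation}}}_{\text{positive, via } E = mc^2}
  + \underbrace{E_{\text{gravity}}}_{\text{negative}}
  \approx 0
```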

But, on the other hand, is this supposed to be surprising? After all, there's the law of energy conservation. It HAS to even out.

As you might guess, I'm trying to learn about cosmology, and I'll tell you - as a person without much math (let alone much physics or astronomy, which depends on advanced math), it's a headache.

And then there's the main question: If the universe started with a Big Bang, initiating space, time, and all matter and radiation, what the hell banged?

Cosmology currently seems to be stuck with the idea of some fundamental sort of quantum fluctuation in which enough matter, independently of non-existing space or time, "popped" into being without any reason - sparking inflation which, in fractions of a nano-second, created all the stuff of the universe. I understand there are other ideas - string-theory branes bumping together, for instance - but excuse me: it all sounds like Merlin adding that last alchemical drop.

Sure, I know my lack of mathematical ability is limiting my thinking. But I still am curious. And I have to worry: By the time cosmology cracks what might be the last big nut, my own personal universe might well be moot.

Tuesday, July 27, 2010

Paleoanthropological sex

I recently remarked on a scientist's announcement that up to four percent of the DNA of people living outside Africa in Eurasia came from Neandertals. Of course that finding has to be replicated, and it is sure to be hotly debated. Biological and paleo-anthropologists have argued among themselves for years over whether Neandertals were an archaic subgroup of humans or were a separate species. (If they were a subgroup of Homo sapiens, they could interbreed and produce fertile offspring.)

It turns out that the debate will be broader than that. There long has been a sharp divide here. Some argue on the basis of fossils that humans mated with Neandertals in Eurasia and with the descendants of Homo erectus hominids who had migrated to East Asia at least 100,000 years earlier. Others, opposed to this "multiregional evolution" theory of modern human origins, say hogwash. These "out of Africa" types insist that humans arose solely in Africa and when they finally migrated elsewhere, they could only breed with their own kind. Their point is that evolution of a species doesn't work that way - it is too random and contingent to happen the same way in widely separated places.

Whatever the outcome of these mysteries, my reading has solved a tiny little mystery of my own.

For years, I have wondered why some authors would talk about humans as Homo sapiens, while others would use the label Homo sapiens sapiens. Well, it seems that if Neandertals really are a subspecies of humans, then they should be called Homo sapiens neanderthalensis. Thus humans must have the second sapiens. The authors have been taking sides all along!

Monday, July 26, 2010

Immigration and scared Americans

Recent CBS/Times polls show that U.S. citizens who consider immigration a "very serious problem" rose from 54 percent in 2006 to 65 percent in May. This despite Border Patrol figures that show that the number of illegals apprehended has declined more than 60 percent in the last decade.

The decline, attributed both to beefed-up border coverage and, more recently, the lousy U.S. economy, is significant. Also significant is the fact that, according to the latest issue of the "New Yorker," while violent crime keeps rising in Mexico, it is declining in Southwestern border counties - down by 30 percent over the past 20 years. According to FBI numbers, the four safest big cities in the U.S. - San Diego, Phoenix, El Paso, and Austin - turn out to be in border states.

So what's the problem? There are many, but increased crime obviously isn't one of them. In some businesses, poor immigrants depress wages. And when it comes to under-the-table wages, government tax revenue suffers. But perception of the problem has less to do with that sort of thing than with fear of unemployment caused by Spanish-speaking outsiders and, ultimately, the skidding decline of the country itself.

The biggest challenge has to do with the many illegals already in the country. That's what immigration reform has to be about. George W. Bush tried to push through reform, but was beaten back by the right wing of his own party. These are the same demagogues who are blocking reform today, in part by intimidating Obama, who has a hell of a lot of other things on his plate. In the meantime, millions of families supported by house maids and gardeners wait in limbo to learn their future.

Saturday, July 24, 2010

Big Earths

It hasn't received a lot of publicity, but there's a really cool thing up there in orbit. NASA's Kepler space observatory, launched last year, is busy staring at some 150,000 stars. It can measure dips in a star's brightness to one part in 10,000, and thus detect the passage of a planet across that star's surface. It will take several years - these studies are statistical - but the instruments already have detected new planets, and many more such finds are expected.
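That one-part-in-10,000 figure is easy to sanity-check: the fractional dip in brightness is just the fraction of the star's disk the planet covers, (R_planet / R_star) squared. Here's a quick back-of-the-envelope calculation; the rounded radii are values I'm supplying for illustration, not Kepler data:

```python
# Transit depth: the fraction of the star's light a crossing planet
# blocks is (R_planet / R_star)^2. Radii below are rounded estimates.

R_SUN_KM = 696_000
R_EARTH_KM = 6_371
R_JUPITER_KM = 69_911

def transit_depth(r_planet_km, r_star_km=R_SUN_KM):
    """Fractional drop in starlight while the planet is in transit."""
    return (r_planet_km / r_star_km) ** 2

print(f"Earth-like:   {transit_depth(R_EARTH_KM):.6f}")    # ~0.00008, just under 1 part in 10,000
print(f"Jupiter-like: {transit_depth(R_JUPITER_KM):.4f}")  # ~0.01, about 100 times deeper
```

So a Jupiter-size behemoth is easy pickings, while an Earth-size planet sits right at the edge of Kepler's stated sensitivity - which is presumably why the small ones take years of repeated, statistical observation to pin down.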

And think of this: while most "extrasolar" planets already discovered - some 450 of them - are Jupiter-size behemoths, more and more are not all that much bigger than the Earth. (Well, maybe 4 or 6 times. But, hey.)

What's interesting, however, is that theoretical geologic models suggest that a surprising number of them could be similar enough to Earth to support life.

In fact, the research shows, the Earth might be way down on the size-scale of life-supporting planets. We might not just be lucky enough to be on a planet the right distance from its sun, and one with the right mix of water and other elements, but one that is barely big enough.

The geological models are complicated, having to do with the more vigorous convection of bigger earths and their speedier plate tectonics and subduction, which hasten the carbon-silicate cycle. But the bottom line is that bigger earths could be even more hospitable to life than Earth-sized planets. And their larger masses would better hang on to their atmospheres and water.

Before long, scientists will be able to sample the light spectra of this sort of planet to detect the possibility of life.

This is good, because today I received an envelope in the mail that contained no outside indication of its source. But when I opened it, I learned that "The End of the World is Almost Here! HOLY GOD will bring Judgment Day on May 21, 2011."

Oh, poop.

Thursday, July 22, 2010

Hanging on to humanity

By now many of us are aware that recent research has shown that a small portion of every living human's DNA traces back to a single woman in Africa - a woman dubbed "Eve." In addition, the DNA studies show that during a big-time glacial period lasting from 195,000 to 123,000 years ago, humanity damn near went extinct. (The numbers dropped from more than 10,000 breeding pairs of modern human individuals to just hundreds - one hell of a close call.)

So how did those few early humans hang on as the arid, cold conditions of that glacial stage (known as Marine Isotope Stage 6) made practically all of Africa uninhabitable by killing off the animals and plants that hunter-gatherers needed to survive? New research starting almost 20 years ago but culminating only recently has determined that those few humans who survived did so by living near the very southern tip of Africa. There they could make do on carbohydrate-rich tubers and bulbs of plants resistant to the cold and the rich protein found in shellfish thriving in the tidal waters.

I learned all this from an article in the August issue of Scientific American by Curtis W. Marean, a distinguished scientist who, with colleagues worldwide, has published the findings in 2008 and 2009 in the top scientific journals.

But what caught my eye was their solid evidence that humans were pretty darn smart a long, long time ago. European scholars over centuries had concluded that real thinking humans didn't emerge until some 40,000 years ago. They reached this conclusion because they dug only in Europe, and because, hey, there was nothing but savages anywhere else!

Some 160,000-110,000 years ago, on the tip of Africa, people saved themselves by figuring out how to survive in one of the only environments that offered them hope. They almost had to have a calendar (based on lunar phases) to determine when the lowest tides made collecting shellfish safest. They created ochre paints for aspects of culture that were clearly symbolic. And they invented complicated ways to heat certain stones in certain ways to manufacture special stone spearheads - knowledge that took smarts to pass down through generations, probably through speech.

The researchers have had to overcome the archaeology gospel that the Solutrean people in France invented heat treatment of stone about 20,000 years ago. It looks as though they have done so. Let's hear it for the really old guys, cracking shells and digging up tubers, watching the sea and the sky and the rocks around them, hanging on to humanity so we can, too.

Wednesday, July 21, 2010

You are my sunshine

In the past two days I've watched two movies with big-time computer generated images - "Avatar" and "2012" - and I have to admit the special effects are indeed special. (Of course, story-telling always lags behind. Call "Avatar" the latest "Dances with Wolves," where the white guy saves the natives. And "2012" is your typical disaster tale featuring cute kids, parents that love them, a big cast of different ethnic types, some impassioned speeches about saving as many people as possible, and fade-out happiness for the few who survive.)

But at least "2012" reduced the Mayan calendar factor to a cartoon.

Anyway, in "2012" the gimmick was an anomalous outburst from the sun that sent out neutrinos in such numbers that they heated the Earth's interior, causing the continents to float around. "Impossible!" said a scientist character early on. Well, of course. As you read this, neutrinos by the gazillions are blowing through your body, without effect. Relax.

But that doesn't mean that the sun itself can't act up. Solar flares, for instance, can fry satellites and any unprotected astronauts. But in recent years, the sun has acted down.

The sun has a cycle of roughly 11 years - big flares, coronal mass ejections, etc., peaking at the solar maximum; activity pooping out at the solar minimum. But in the last few years, the minimum scheduled for 2008 has been really quiet, for a very long time, and the scientists studying the sun don't have a clue why. They're studying the speeds of flow of solar matter from the equator to the poles, the solar "jet stream," acoustic oscillations, and so on, but nobody has any idea about what's going on.

So, if you watch or have watched "2012," breathe easy two years from now. As for 2019, the next expected solar maximum ... no promises.

Tuesday, July 20, 2010

Beyond Darwin

Charles Darwin raises hackles to this day. Check out the bumper-sticker fish, and the fish with feet. But the whole issue remains a lot more interesting than that.

In the lectures I'm watching by Oxford University's Daniel N. Robinson, the professor wraps up his discussion on Darwin by noting that modern humans no longer find it possible to think in terms that "are non-Darwinian, let alone anti-Darwinian." But he adds that the question may be asked "whether this theory can tell us finally who we are and how we should live."

Well, of course not.

That's like asking a fish that recently "walked" out of water whether it would vote Democratic or Republican. Evolution, obvious as it is through genetics, geology, etc., may clarify many things, but it is totally silent about ethics. As Alfred Russel Wallace, the co-founder of evolutionary theory, said - abstract thought, aesthetics and ethics remain quite beyond the theory of evolution.

People interested in "intelligent design" and such silliness are mistaken in worrying about how humans have come to be. Now that we are here, however, they should be worrying about how we should act. That hasn't exactly been cleared up yet.

Monday, July 19, 2010

Deer music

It was a little after 4 p.m. today that I watched a full-grown mule deer doe relax in the shade of my garage, taking a break from heat in the high 80s. I could see her eyes open and then half-close again, although her ears remained on full alert.

Then I saw something I hadn't noticed before. The doe was gently swaying her head from side to side. Tiny sways - maybe half an inch - and her haunch was moving to the same beat.

She'd stop the sway, chew a little cud, and then start it up again.

Then, suddenly, some sound grabbed her attention. Her ears both pointed toward the source of the puzzle, bent forward, zeroed in on whatever it was. She remained that way, motionless, for several minutes.

Then she relaxed again, ears back in multidirectional mode. And before long that swaying began again. I have no idea what was going on. But it was as though she was listening to a rhythm no human being could hear. A music of the deers.

Saturday, July 17, 2010

Ever heard of this philosophy guy?

Those of us who can remember going to civics classes back around the 1960s will remember nervous teachers trying to explain something of the idea behind Communism. Remember, this was during the height of the Cold War, and those teachers - not long after the whole McCarthyism thing - had visions of hairy-knuckled fathers (or mothers) pounding on the school superintendent's desk, yelling about teaching their kids how to be commies!

Still, those teachers did their jobs, talking about something called "dialectical materialism." The deal was that those pinko freaks, inspired by some ultimate pinko freak named Marx, taught that communism was inevitable through some process of "thesis, antithesis, and synthesis" - conflict between opposing tendencies resulting in ... a Communist society!

Later, many of us learned that the idea of thesis, antithesis and so on actually came from a guy named Georg Wilhelm Friedrich Hegel (1770-1831). His philosophy, to the small extent I understand it, is way too involved to talk about here.

But what's cool is that the whole dialectic thing didn't come from him, either! Enter: Johann Gottlieb Fichte. It was Fichte, a brilliant young contemporary of Kant, who came up with the thesis, etc., etc. Hegel actually didn't mention it much.

But don't blame Fichte for fascism, communism, etc. His idea had to do with human freedom: Man is born free (thesis). But he can't know this until his freedom is constrained (antithesis). He realizes that he must pass the stage of his own freedom to the need for the freedom of all (synthesis).

You'll have to figure out the meaning of all this on your own. I'd like to give you a one-sentence wrap-up of the philosophy of Marx, Hegel and Fichte, but I don't have a strong enough drink. Still, I bet those nervous 1960s civics teachers would have got the idea.

Friday, July 16, 2010

Freedom and wonder

Like most of us, I've long been vaguely familiar with the idea of a "Faustian bargain" - in other words, a "deal with the devil." And we all know how THAT kind of deal turns out!

But I'm learning that "Faust," by Johann Wolfgang von Goethe (say something like go-tay, but, if you can, with a German accent), is a lot more interesting than just another "Eek! The devil screwed me!" story.

"Faust" comes in two parts, written over most of Goethe's long adult life. In the first part, Faust is a rich, land-owning polymath who knows all of human knowledge, but is bored out of his gourd. He wants more - something he never will tire of - and sure enough Mephistopheles offers a deal that will give him such a thing, with the down-the-road price, of course, of Faust's soul.

It is only in part two, published after Goethe's death in 1832, that Faust is redeemed. He has a vision - apparently independently of the devil - of all the peasants toiling in his fields suddenly owning the land themselves - lives of others lived in freedom, lives free and open to possibilities that no one - not even polymath Faust - ever could tire of. A selfless freedom trumps all.

Goethe was part of what has been called the "Romantic Idealists." These were folks who reacted to the Enlightenment's rejection of any authority (kings, religious dogma, tradition) in favor of science. The romantics had the feeling that science - deterministic, indifferent to the human condition - can't be enough. Humans' freedom, and their appreciation of beauty and wonder, had to be what counts.

These days, nearly two centuries after Goethe's death, after our experiences with communism, fascism, and currently terrorism, we're aware that Big Ideas about human life can be ill fated. But, after all these years, freedom and wonder still sound cool to me.

Thursday, July 15, 2010

Republican senators not from Maine

Sen. Lindsey Graham, a conservative who still seems to have replaced John McCain as a "sane Senate Republican not from Maine," has little competition for such a title. Not only has McCain caved to the yahoos, but most of his GOP contemporaries are drooling big time. (For instance, take Mitt Romney. He recently decided that Obama's nuclear disarmament treaty was the president's "worst foreign policy mistake." If that was Obama's worst, his record must be rather sterling indeed.)

But enough making fun of Republicans, shooting fish in a barrel, as it were. Instead, let's think about a recent Newsweek article that is pro-Republican to its core: The author, Andrew Romano, obviously wants a Republican resurgence. But he worries: In an article headlined "What would Reagan REALLY do?", the idea is that while Ronald Reagan was indeed a cool guy and a good president, he hardly was a model for current GOP dunderheads. Reagan, who talked a pristine conservative line, was a pragmatist. The author says that if today's candidates had to pass a Reaganite purity test, which many of them do, they'd be dealing with a test that Reagan himself wouldn't have passed.

For instance, in 1982 Reagan rolled back his earlier tax cut, restoring a third of the taxes - the largest tax hike in history. The next year, he raised gas taxes and created new taxes on payrolls to bolster Social Security. In 1984, he cut tax loopholes worth tens of billions of dollars, and in 1986 he supported tax reform that hit businesses with hundreds of billions in new fees. Reagan was a pragmatist. His rhetoric was pure conservatism, his other actions - say, appointments to the high court - were sorry indeed, his foreign policy was incredibly lucky despite being blindsided by Gorbachev, but in the end Reagan often was ruled by reality.

That's more than can be said for most Republican senators not from Maine.

Wednesday, July 14, 2010

Ever heard of this guy?

One - perhaps the only? - benefit of getting old is realizing we don't know beans. It's understanding that learning more about stuff we think we already know enough about usually is worth the effort.

For instance, take phrenology - the study of bumps on a person's head. We've all learned to smile at such a dumb idea. The shape of your skull is supposed to reveal your intellectual or even your moral capabilities? What a laugh!

Well, pervasive as phrenology was during much of the 1800s and even beyond, it was a laugh. But the basic idea - that certain bodily functions are regulated by certain parts of the brain - remains important to this day.

Phrenology was the brain-child (heh) of one Franz Joseph Gall (1758-1828), one of the leading neuroanatomists of his time. The dude did his homework. He dissected the brains of naturally aborted fetuses, heinous criminals, celebrated persons, and other types of animals, and grew a theory. Part of it involved realizing that fetal and newborn skulls are soft - and so could be pushed outward by growth of brain tissue beneath. Gall knew enough to understand that, in adults, the size of brain parts didn't exactly match cranium shape, but he thought that shape came close enough to matter.

He was, very basically, wrong. And some later scientists testing his theories in the 1800s conducted horrific experiments on unanesthetized animals. You don't want to think about it. Yet Gall was not a bad guy. His basic idea - that the parts of a brain are localized in function - not only remains accepted today, but also served 200 years ago to yank thinking about mentality out of metaphysics and into science.

Of course, it also reminds us that modern science also is likely to need, as time goes by, a whole new bunch of yanks.

Tuesday, July 13, 2010

Mowing in the wind

The middle of a July morning, and a bright sun shines from a cloudless blue sky. And the wind-chill is something like 40 degrees. Priceless!

I was outside in short sleeves mowing my grass (which I never let get higher than eight inches or so), and the 25-to-30 mph gusts were waving tree branches with abandon, not quite howling, but raising goose bumps on my naked arms despite a "real" temperature in the mid-50s. It was a welcome break from temperatures in the 80s or more soon to return.

Of course, I left a few of those little eight-inch grass stalks, tassels waving in the breeze, to keep growing as they would. Not so much out of some sort of largesse or generosity towards individual brave blades, but because I have to wonder where those seeds might fly, or even grow. Fun to think about!

Then I went back into my house, plopped with some relief into my chair, watched the news (about the oil spill) and started watching the next philosophy lecture. This one was about Kant's "categorical imperative," a moral stance having nothing to do with the ends of one's actions, but with the idea that we "act in such a way that the maxim (basic idea behind) your action would, if you were able, be instituted as a universal law of nature." That is just part of it. The idea also encompasses the idea that to use some other person as a tool means that you also may be used as a tool (morally speaking). Thus, for instance, you may never lie.

Kant - and his most important influence, Hume - had not, I'm sure, ever thought about a cool summer day, an underwater oil spill, or a single stalk of grass blowing in the wind. And although Kant would call such concerns - gaining pleasure and avoiding pain - a lesser moral matter, underneath the "pure," "categorical," level of his moral thought, I wonder what essentially matters. A lie, or a seed in the wind?

Monday, July 12, 2010

Rights and virtue

As I sit here in front of my computer, blogging away, I realized I can say pretty much whatever I want. I can write porn. I can bad-mouth whatever government is in power. I can bad-mouth whatever wanna-be government isn't in power. As long as I don't falsely yell "fire" in a crowded theater, or knowingly lie about somebody to their detriment, I can spout off any way I like.

(Sure, there's the credibility factor - see the laughable Fox News - but still.)

This freedom I have comes directly from the Bill of Rights - the first 10 amendments to the Constitution. The question is, why weren't those rights initially in the constitution? Why did they have to be added?

The answer is basic to our country: to list basic rights is to limit them - so the "failure" to list them was deliberate. The Bill of Rights was necessary in order to ratify the constitution - and nobody is complaining! - but basically the founders thought that the "people," not some constitution writers, should determine the rights of U.S. citizens. As it turned out, some specific amendments - the 14th and 15th, passed by Republicans, come to mind - were rather important. But the U.S. Constitution always has been a work in progress. (Or, recently, regress.)

Anyway, the whole idea of a republic is that it be ultimately governed by a people that are intelligent folks of virtue. I rather suspect that the founding fathers never suspected that the very term "virtue" might become a matter of contention.

Sunday, July 11, 2010

True believers

I have just read one of the most stunning short stories I have had the opportunity to read. Also, one of the grimmest. Written by Steven Popkes, a long-time author of short fiction, it is an alternative-history piece of science fiction called "The Crocodiles," published in the May-June issue of "Fantasy & Science Fiction." It curls your toes.

The story is simple. A young family man, a chemical/biological engineer working in Germany at the beginning of World War II, is tasked with using a newly discovered combination of diseases to create among its victims a vicious, zombie-like horror that just might win the war for the Nazis. He thinks of these nasty changed people - formerly Jews, gypsies, and other outcasts - as "crocodiles" for their cold-blooded killing potential once infected. The idea is to turn these monsters against the Allies - particularly after they invade the continent.

The outcome is obvious - the end of the world as we know it - but the most chilling part of the story is the young man's love for his wife and young son, together with his blind faith in the Third Reich - and his belief that German heroism will prevail, despite the obvious success of his terrible creation.

The allegory is easy to see - man's evil, stupid nature will do him in, the monsters of his making will kill us all and take over the world, and so on - but Popkes' cool prose is powerful. Buried in a bunker, like a certain German leader we can think of, our protagonist must kill his family and himself. But his faith remains.

I read this while learning about the leaders of the French Enlightenment in the 1700s - Voltaire, Rousseau, Diderot, etc. - who saw the need to translate philosophy into a means of bringing about social and political change. Then came not only the French Revolution, but a few changes in Russia and Germany. Popkes' story, ultimately, is about true believers.

Saturday, July 10, 2010

Heard of this guy?

Walking into a giant Wal-Mart at 6 p.m. on a Saturday, you can have one of several mindsets. You can be focused, intent on whatever it is you want to buy. Or, you can be oblivious, deeply into your own head, not really there until a bargain presents itself. Or, like me this evening, you can be on laser alert.

There, ahead of me in the aisle, was the Huge Family - four of them, a collective 850 pounds if an ounce. There, ahead of me in the checkout line, a four-year-old "helping" pull the shopping cart, cute as it is possible for a human being to be. And, as you leave the store, you notice a nice smile from the elderly "greeter" lady. You grin back. You're on laser alert.

You are thinking like Thomas Reid, 1710-1796, the philosopher who was a big critic of Hume and Locke. He's a dude I've just been learning about. He advocated a "common sense" philosophy - one that observes human nature, as well as animal nature, and learns from it, rather than tries to reject it.

Reid, a big deal among his Scottish colleagues - not to mention among America's founders - liked to think that a "common sense" philosophy about such things as altruism reveals, as shown by the proper Newtonian method, a God that cares about us. As I recall that little gal helping her mother by pulling the shopping cart, I'd sure like to think so.

Thursday, July 8, 2010

Spin-free lawn-mower instructions

How refreshing is this?

"Lift the side discharge cover (1).
Align the slot on the discharge chute (2) with the pins on the underside of the discharge cover.
Lower the discharge chute until the hooks on the mower deck are secured in the openings of the discharge chute."

How absolutely refreshing! During my whole professional life as a newspaper dude I've been handling spin, be it the word from the local cop shop or from city commissioners, state offices, congressional offices, or - far worse - political parties or (Eek!) think-tank propagandists.

Now, after my electric lawn mower quit, leaving the grass in my front yard so long that it is unacceptable even to me, I've bought a new mower. Sure, now I'm expected to put the damn thing together. But hey, no "talking points!" These technical writers are doing their best to tell it like it is! No bull!

Dunno. After all those years of political baloney, maybe I should buy some more lawn mowers. The blades spin. But all they do is cut up grass, not common sense.

Wednesday, July 7, 2010


Who in America knew that in Finland, the term "football" - soccer to you - is spelled jalkapallo? That's what's so cool about "New Yorker" writer Hendrik Hertzberg - a sharp stylist whose research always goes the extra mile.

For instance, according to Hertzberg in his latest magazine column, the term "soccer" is a British import (now lost in its homeland), derived - on the pattern of the slang for rugby, "rugger" - from the short form of Association ("Assoc."), the body that governed the new game. Hence, soccer. Hertzberg pointed out that Americans no doubt neglected to register this information, engaged as they were in 1863.

Anyway, the author has fun talking about conservatives' derision for soccer, despite the fact that more Americans watched the American soccer teams' defeat at the hands of Ghana than the average number of viewers of last year's World Series, the Kentucky Derby, the final round of the Masters golf tourney, or the Daytona 500.

(Conservatives tend to call soccer a "socialist" sport. Glenn Beck, of Fox News infamy, said: "I hate it so much, probably because the rest of the world loves it so much.")

What attracts me to all this is not so much soccer - beyond trying to make a rather boring topic interesting while honoring a cool writer - but that America has changed so much since I grew up all too long ago. Back then, in the 1950s and 1960s, soccer was an oddity. Somewhere mixed in with fencing, lacrosse, and sumo wrestling. Now kid-people, and even not-so-kid-like people, all over the country are soccer nuts. They know field-wide plays I can only imagine, hoping for that final successful header.

I see our kids' love of soccer as a welcome sign of broadening perspectives. Call me a guy taking a seventh-inning stretch. Just don't call me a conservative. I've somehow never learned how to hate a sport just because others love it.

Tuesday, July 6, 2010

Sex, hips, and a big toe

In light of a (relatively) recently found African fossil, some scientists think that actual bipedalism - rather than the knuckle-walking of apes - may have happened because of sex.

The fossil, 4.4 million years old, was discovered, encased in rock, in the mid-1990s. After years of freeing and reconstructing the bones, it turns out that the creature had a weird combination of monkey and hominid traits. (Hominids, like the famous Lucy, have been thought to be the true ancestors of humans because they walked fully upright.)

But this new specimen, from a time well before Lucy or other hominids, had hips that would let it walk, hands free. It also had enough small bones in its hands to let it carry things as it walked. But its feet retained the out-pointed big toe of apes and monkeys, the better to climb trees, rather than the straight-ahead pointed toe that helps power our (toes-first) walking. It was a mix! (But don't think "missing link." That's a long-since discredited concept.)

Anyway, why would a creature adapt to carrying things and walking upright, at least in the hips, despite an ape-like big toe? Walking still must have been rather difficult.

In the latest issue of National Geographic, author Jamie Shreeve reports that other apes of the time (as now) had long, fierce canine teeth, adapted to fighting off other males to win breeding rights. But this new fossil had much smaller teeth. These males didn't fight for females, the story goes; instead they carried food back to their females and children, thus ensuring (they hoped) sexual fidelity.

And that, after all, is how evolution works. Producing children - yours! - that survive. Perpetuating those genes! Ladies, your mothers were not only right, they were right going back 4.4 million years: Men are after "one thing." Even if they have to walk on feet with dumb big toes.

Monday, July 5, 2010

Revenge of the surds

As I was scanning through my movie menu, I came across something called "Revenge of the Nerds, Part IV." Eek! My mind went crazy. It came up with: "Revenge of the surds."

The surds? What the hell was that about?

I actually had to go to the dictionary. I'd been thinking about surds in a linguistic sense - voiceless sounds - but the word mostly means irrational (be it a mathematically irrational number or something simply lacking sense).

Slowly, I started to think about cognate words - "Absurd," etc. But why was I thinking about all this?

Maybe it had to do with reading a review of "Courage and Consequence: My Life as a Conservative in the Fight," by Karl Rove.

This is not to say I have read the actual book, any more than I would vote for an actual Republican. And so this isn't a review. Instead, it is a slight comment on the blabatudes of silly (and, of course, dangerous) conservatives like Rove and Glenn Beck and so forth: Don't blame the dumbheaded money-makers; blame the idiot Americans who buy into the baloney.

Back when I was a kid, many people hated long hair, rock and roll, and anyone who dared to question the government's policy in southeast Asia. The details of such political disputes change, but such folks never went away. These days, perhaps, we can just talk about the revenge of surds.

Sunday, July 4, 2010

A cool golfer

Christina Kim is one of the more interesting young ladies on the LPGA tour. Overweight, exuberant, outlandishly dressed, outspoken, and extremely talented, she is a joy to follow. Not long ago I happily watched on TV when she made an eagle from the fairway. She skipped all the way down the fairway to the green.

Kim, perhaps much to her chagrin, was one of three Kims in a four-player playoff this weekend during a tournament. The other Kims, not to mention the eventual winner, name of Choi, were Korean. Christina, a first-generation American of Korean ancestry, missed a couple of birdie putts. Ended up tied for second.

In her recently released book, "Swinging from the Heels," she notes that when she started playing in the LPGA she was one of five players named Kim. Today, it's about a dozen. They all are damn good.

Christina writes about this - how foreign dominance of the LPGA is deterring sponsors - and how Korean fathers are part of it all. (As a child, she was required to make 500 swings a day.)

But, decked out in what she called her "pimp" hats, she remains a joy. (Even if, as she writes, "it's hard to get laid on the tour.")

At 25, the young woman is cool. One tries to think of other pro golfers as cool. It's not easy.

Friday, July 2, 2010

Evolution of playgrounds

Reading an article in the current New Yorker about new, innovative playgrounds being built in the city, my mind naturally returned to my grade-school playground. Bartlett School, a rather ancient brick structure even in the 1950s, stood a tall two stories high at the base of the bluff overlooking the Chippewa River from the east, just a few blocks from Eau Claire's city center. A long concrete stairway with sturdy metal railings climbed the bluff to provide access to children living atop.

On the south side of the school was a playground, a huge space to a kid, an unsafe nightmare to parents of today. As I remember, it was entirely blacktopped, although my mind boggles at the notion. The swings - tall structures from the perspective of a twerp - had hard wooden seats. Once during my time at Bartlett School, a little girl walked in front of a swing. It was like being brained by a baseball bat. I don't remember her fate.

More recently, of course, playgrounds have become safer. Monkey bars have sand pits below them to cushion a fall. Swing seats are made of soft rubber. Slides are gentle, a blacktopped playground is nowhere to be seen.

The latest playgrounds, it seems, involve "loose parts." The idea is to have - OK, some slides, etc. - but mostly lots of sand (with running water for the sand to dam), and those loose parts. Big foam blocks and other pieces - cogs, arches, hinges - that kids can use together to build whatever their imaginations come up with. Putting together young brains is the hope.

It all seems a lot more sensible than the playground I remember (including the vicious dodgeball games). And it beats the inevitable winter occurrence I remember with a stab of childlike horror - some kid feeling the need to step onto that stair at the base of the bluff and put his or her tongue on that metal rail.

Thursday, July 1, 2010

Emily's irony

As I sometimes do, this evening I grabbed my copy of R.W. Franklin's collected poems of Emily Dickinson and opened it to a random page. Holy smoke! There was a Dickinson poem - not a "great" one worried over like an important bone by her army of scholars - but Emily at her most real.

Or so I think.

It's the one that goes:

My Faith is larger than the Hills
So when the hills decay -
My Faith must take the Purple Wheel
To show the sun the way -

(If the sun fails to rise, after the purple sunset, it must be led toward the purple dawn.)

'Tis first He steps opon the vane -
And then - opon the Hill -
And then abroad the World He go -
To do His Golden Will -

(See the sun rising - first to the barn-top, then to the hill-top, then everywhere.) (Apparently she pronounced "upon" like "opon.")

And if his Yellow Feet should miss -
The Bird would not arise -
The Flowers would slumber on their Stems -
No Bells have Paradise -

(If the warming sun fails to reach them, birds and flowers will not rise.)

How dare I, therefore, stint a faith
On which so vast depends -
Lest Firmament should fail for me -
The Rivet in the Bands

For Dickinson, a barrel was the main way in which commodities were conveyed. If a rivet holding the metal bands of a barrel failed, all would fall apart. At first, the poem sounds like a typical 19th-century affirmation of faith. "My Faith is larger than the Hills -". But listen hard. Can a faith necessary to keep the sun rising, to keep the world from falling apart, make any sense in a world in which, well, the sun is going to rise anyway, the birds are going to sing, the flowers bloom?

This was from Dickinson's most fertile period, during the Civil War, a time when death was in the air. I see it as an affirmation of life.