Saturday, May 29, 2010

A deer visit

Early this evening, as I watched two mule deer does grazing on grass and bushes across the street in a neighbor's front yard, I started feeling a little sorry for myself: Why didn't they visit MY yard?

And then they did. First one, then the second. And then a third came out from behind a spruce tree, crossed the street, and wandered up to my house - right in front of (a foot or so from) my front-door window - and browsed on an overgrown shrub next to my doorway. It was cool!

Then, from across the street, came deer Number Four. She was late because she walked on only three feet. Her left front hoof never touched the ground as she sort of hopped across the asphalt and reached my lawn. The other deer obviously accepted her as one of their own, roaming around her and ignoring her handicap. She also munched on grass grown long from recent rain, and on urban bushes flourishing as June approached. But then the other three deer started wandering off, back across the street, disappearing behind that same spruce.

The lame deer looked around, didn't see her friends, and also finally hobbled back across the street. I watched her go.

Friday, May 28, 2010

If your clone loved it ...

OK. Let's face it. I did it again. "Fantasy & Science Fiction" magazine ran another contest, and I got sucked in. The idea was to concoct a pickup line that works in a science fictional or fantastical setting to grab the interest of the man/woman/alien being you've got your eye on.

Among my entries (you could have up to six, but I ran out of steam at four), my best guess for a winner is: "You might not know this, but your clone and I used to be HOT!"

Then there were the next two: "Of course we've met! It was at the 2020 Democratic convention!" and "Wow! What an outfit! Didn't you used to be that severely retarded paraplegic?"

Lastly, no doubt fatigued, I took the I'm-with-a-bimbo approach: "Look out! Get under my arm! Let's go to my apartment! There's a neutrino coming!"

You can see why I cut it off at four.

Anyway, enjoy the silliness. It probably won't come again until the magazine has another goofy contest.

Thursday, May 27, 2010

A wonder from Sweden

There are snobs who raise their noses at any novel that their professors failed to deem literature. Of course, such people are full of snob juice. It drips out of their noses. (I like to think of a grade-school joke: "I thought I had a bloody nose. But it'snot.")

Anyway, that kid joke off my childish chest, I now get to discuss an amazing story about a modern-day series of best sellers - one that may never have happened in quite the same way before in the history of publishing. One day, something like seven years ago, a Swedish man walked into the office of a publisher with manuscripts for a series of three thrillers. It turns out he had been writing these books as a hobby. His name was Stieg Larsson.

Larsson was an editor of a magazine that kept tabs on antidemocratic extremists like neo-Nazis and skinheads, and when he died of a heart attack at 50 (not long after delivering his manuscripts), some wondered whether he had been killed, or had died from the stress of dealing with the bad guys. But his family denies this.

In any event, Larsson, out of nowhere, came up with stories - and a young heroine, the astounding Lisbeth Salander - that rocked the publishing world. I read the first of the three, published in English as "The Girl with the Dragon Tattoo," a couple of years ago. I loved it. As it happened, I only read the even-better second book, "The Girl Who Played with Fire," a month or so ago. I immediately wanted to read the third in the series, but it hadn't been published yet. Shit! I went to Amazon and preordered "The Girl Who Kicked the Hornet's Nest." It came today, and I've started reading it. It looks to be every bit as good.

The names of Swedish characters, and Swedish place names, can be slightly off-putting to Americans. But that quickly disappears into the story. Read the books in order. They may not be literature. But they show that in a world of 6 billion souls, wonders can come out of anywhere!

Wednesday, May 26, 2010

An OK snippet?

I use these blogs to write about whatever I am thinking about, and often - now that I am quite involuntarily retired and amusing myself by watching college lectures - I'm thinking about academic stuff. However, much of that, in a blog, would come off as booooring. But sometimes, maybe it isn't.

For instance, this afternoon I watched cool lectures about Francis Bacon, Rene Descartes, Isaac Newton, and Thomas Hobbes. Interesting stuff. But, in terms of a blog, who wants to sit down to read some half-baked, third-hand little snippet of philosophy? Sheesh.

Still, there was something in there that almost all of us know very well, but that almost all of us get rather fundamentally wrong. That is Descartes' famous quote in Latin: "cogito ergo sum," or "I think, therefore I am."

Understandably, that "I am," sometimes translated as "I exist," leads us to think that Descartes was trying to prove his existence. Not so.

Instead, he was talking about his fundamental thought that while ALL data from ALL our senses could be faked (by some evil-genius imp, say), the fakery still would have to be directed at some thinking entity that it is trying to fool. And that thinking entity is none other than him! A first step away from total skepticism! A first step toward truth!

Not a bad snippet, I like to think.

Tuesday, May 25, 2010

I be enjoying this stuff

Some of you might actually be old enough to remember the great "Ebonics" flap of 1996-97. The Oakland School Board was considering using Black English to teach Black kids, and predictably the country went bananas.

People yelled that Black English was nothing but slang, but of course slang is a here-today, gone-tomorrow thing. Visit your nearest middle school and listen for how many students are saying "gag me with a spoon." Some said Black English is an African language that used English words. That makes no sense for several reasons, including the fact that speakers of Black English use goofy and thoroughly English-language irregular verbs such as went and stood, and plural nouns such as men and feet. You don't find that kind of stuff in Africa.

In fact, Black English is what is called a nonstandard dialect of English, much like what you hear in the "hollers" of Appalachia or in countless parts of the British Isles.

But what was interesting to me was that Black English comes - not all of it, but in large part - from those very backwaters of the British Isles!

It turns out that Black slaves often worked alongside indentured servants from the old country, people who generally did not speak standard English. For instance, Irish English speakers use what is called the habitual "be," just as Black English does. (When a Black English speaker says "She be walking to the store," it doesn't mean she's doing it now, but that she regularly does.) Similarly, an Irish English speaker might well say that even "if I be there with friends, I be scared."

I think it is cool that many people freaked by a rapper's use of that talk don't have a clue that it comes from the merry old United Kingdom. And, as icing on the cake, that the rappers don't either.

Monday, May 24, 2010

English speakers ain't misbehaving

One of the great pleasures in life is listening to a linguist talk about how nonsensical certain rules of grammar can be. (Ooh, Dr. McWhorter, don't stop!)

By nonsensical rules I don't mean the innate rules of the English language. No parent ever has to tell little Johnny: "No - don't say 'boy the.' It's 'the boy.'" Nobody makes that kind of mistake. Nor am I talking about unnecessary if slightly useful rules like those involving perfect tense. If a man runs into the room yelling "The king has been shot!" the tense means not only that the king was shot in the past, but also that the shooting has implications for the present. Nice, but we probably would have known that anyway.

No. I'm talking about two guys (Robert Lowth and Lindley Murray) who wrote prescriptive grammars in the late 1700s that were reprinted endlessly and have affected people right up until the present day.

They honored Latin and Greek as the pinnacle of language perfection, so they wanted English to be as much like them as possible. Hence rules against split infinitives and the horror parents still feel when hearing little Johnny say: "Billy and me are going to the store." Latin is very persnickety about that stuff, so "me," as a subject, has to be "I." Oh yeah? Who says it is so for English? Lowth and Murray.

(Incidentally, McWhorter points out, it is impossible to apply this rule consistently. If somebody asks you, "Who did it?" and you point to the two perpetrators across the room, you say "them." Those two people are the subject, so you should say "they," but you don't want to sound like an idiot.)

Another dumbness involved the idea that the loss of any word is the dreaded decay of your language. Languages always lose (and gain) words, and in the late 1700s, the word "whom" was quickly disappearing from English. Lowth and Murray rescued it. Thanks a bunch, guys. Now generations of English speakers, including you and I (me?), have to be formally taught how to use "whom," because otherwise English no longer marks "what" and "who" for three cases (genitive, dative and accusative). (Say what?)

Then there is the dumbness that says English has to be perfectly logical. Guys, NO language is perfectly logical! Even Shakespeare used double negatives! Nevertheless, the prescriptivists decided that two negatives equal a positive. That's fine in math, but as McWhorter reports, "every single nonstandard dialect of English uses double negatives worldwide, as do thousands of languages!"

(None of this means I personally don't rue certain language changes. I mourn the fact that so many people seem to fail to get the distinction between uninterested and disinterested, for instance, and I cringe when people use "beg the question" to mean "raise the question." But there is a difference between useful nuances and rules based on myth.)

We're probably stuck with these myth-generated rules, blessedly silly as they are. (Try getting a job as a newspaper reporter if you don't know them!) But that doesn't mean we can't lust in our hearts for a better English world. You and me, for who dumb rules don't make no sense, deserve better.

(If you easily understood what I was saying in that last sentence, tell me what exactly was "wrong" with it, and why, without sounding like some guy in late 1700s England who is wearing a goofy wig and seems to be inordinately proud of his calves.)

Sunday, May 23, 2010

Youse got a problem wit this?

Today I'm thinking about dialects. I'm thinking about them because dialects - the successful ones - formed every national language spoken and written today. They won the language lottery. Not because they are better, or cooler, or more expressive, but because some king and his court and his army spoke that way. (It has been said that the dialect that wins is the one with an army and a navy.)

Standard English, for instance, is spoken the way it is because that's the way it was spoken beside the Thames in a place called London where the power resided. Standard French is the dialect spoken around Paris - never mind that other French dialects abounded.

Dialects, rather than being some substandard type of speaking, are just as legitimate as any other. In fact, in their own way, they are sort of a linguistic thumbing of the nose, a rumbling toward the future, although none of their speakers think that way. Dialects are halfway between the mother language and a brand new language, just waiting to be born. They are, in short, the way new languages come about ... or don't.

We all know that Latin died, giving birth to the Romance languages. But what actually happened is that Latin, which spread over the huge Roman Empire, quickly began changing - differently in the regions that became France, Spain, Italy, etc. Dialects appeared, simply because languages always change. Those dialects weren't new languages, not yet. But they kept changing. We now call them French, Spanish, Italian, etc.

Think of it this way: In Britain, Standard English might say that "He's not going to tell you anything." In northern Britain, it might be "He's noan going to tell you nowt." Farther north, the Scots might say "He wina tell you onything." And in the U.S., in Brooklyn, somebody might say, "He ain't gonna tell you nothin'." Brooklynese is dying out, but imagine if it had been spoken in Washington, D.C., way back when. We might all be saying "youse."

Saturday, May 22, 2010

Tilting at language change

I had the not entirely welcome distinction, among my high-school set, of being the only teenage boy who actually enjoyed being forced to read Jane Austen's "Pride and Prejudice." But enjoy it I did. Such a perceptive woman. But, man, could she ever mess up our English language.

What's interesting, of course, is how we've messed with hers.

John McWhorter, my favorite linguistic dude, notes a few little "mistakes" that jar the modern ear. Austen has a character say that "She was small of her age." Someone greets a visitor by saying, "So, you are come at last." And, describing a meal, "much was ate."

Just 200 years ago, all of these things were perfectly proper for Austen's upper-middle-class characters to say.

About the same time, McWhorter notes, one William Cobbett wrote an English grammar consisting of letters to his 14-year-old son in which he admonished the kid to use the correct wording, such as "I bended," "loaden," and, for the word spit, "spitten."

Twenty years earlier, John Walker's "Pronouncing Dictionary of English" recommended pronouncing the word dismiss as "diz-miss," cement as "SEE-ment," and balcony as "bal-COH-nee."

And, up to about 1870, when discussing a house under construction, one would say that the house is building. To say a house is being built was, I'm sorry, somewhat vulgar.

I know, maybe I should have gone into linguistics. But I vividly remember, as an aspiring journalist, reading book after book by "English mavens" decrying the decay of the language. These days I just label all the authors as so many Don Quixotes.

Friday, May 21, 2010

The brown shirt of authority

Sometimes, here in the 21st Century, you tend to freak out. I mean, back in the 20th Century there were all these demagogues - Huey Long, Father Charles Edward Coughlin, George ("segregation forever") Wallace, Barry ("extremism in defense of liberty is no vice") Goldwater, Newt ("my wife has cancer so I guess I'll move on") Gingrich, etc., etc. - who defied reason in order to scare people. (Notice I've left out the real nutcases.) You can imagine them all goose-stepping in a sort of scary unison.

But in the past few years, things have gone really nuts. I'm not just talking about dumb-head radio or Fox-TV commentators like Limbaugh and Beck and O'Reilly and the rest, or online idiots like Andrew Breitbart, and not just noodle-brained politicians like Alabama gubernatorial candidate Bradley Byrne who, attacked for believing in evolution, denied it with all his might. I'm also talking about NRA fools who think we need to carry guns in national parks because those octogenarians in Winnebagos might be dangerous. (I'm talking to you, Sen. Tester.)

Idiocy is rampant. I keep getting emails about - oh my God, Mexicans are coming! - until I have to shake my head and erase them like Viagra spam. When I first heard about Arizona's new law, my first thought was of Nazi Germany and SS troops demanding one's papers. It all probably seemed sensible to German fascists. But think: Cops stopping people to check their papers: This in America? My last thought on the subject is the same as my first: What's next? An arm thrust into the sky, a stomp of the foot, a brown shirt of authority?

Thursday, May 20, 2010

lol

When I was a young boy, I somehow, in my head, pronounced the word "misled" as "my-zilled." He had been my-zilled. He was tricked into a wrong impression. Don't get my-zilled!

Soon after I joined a newspaper, I heard a really smart colleague pronounce "pique" as "pe-que." He really "pe-qued" me!

Of course, I soon realized that mis-led was the pronunciation. And my friend learned that "pique" is pronounced "peek."

But such examples say something, I think, about language.

I think, for instance, that mispronunciations and the ignoring of fine language distinctions drive much of language change over the years. And we seem to get by.

For instance, I was taught early on that "disinterested" (unbiased, not a party to the dispute) should be differentiated from "uninterested" (bored by it). Should folks who don't make the distinction be slapped around and sent to their rooms to contemplate their sins? Is our language somehow decaying because people keep failing to make the distinction between laying (something) down and lying (themselves) down? Do an object and a subject really need different verbs?

I suspect not. (Just don't say "begs the question" when you mean "raises the question." I may have to step aside to let my head blow up.) I think whatever pronunciation wins, wins. And if some slurring of meanings works for us, no problem. If we need further distinctions, we'll come up with them.

So I think language change is cool. (Just don't change it with your damn thumbs. Adding a smiley face to grammar just hurts. The day lol is taught to first graders, I'll need to go to the police to put my head into a protective container.)

Wednesday, May 19, 2010

We've got a groovy thing goin' baby

Professor John McWhorter, an author of linguistic works and other books who has written for many top magazines and newspapers and has appeared on a host of TV shows ranging from the Jim Lehrer NewsHour, to Today and Good Morning America, to Politically Incorrect, likes to illustrate the drift of the meaning of words (in all languages) this way:

In the 1940s, Jack Benny Radio Show band leader Phil Harris, a randy sort, asked his wife to agree that "Nobody makes love better than me." Rather risqué for a 1940s national radio show, wouldn't you say? Of course, Harris still used the term "making love" in a sense that had to do with wooing a woman. No listener thought of it in the modern sense.

We're all used to slang expressions changing, and usually dropping out of sight. When was the last time you heard somebody in the real world talk about "hep cats?" (Think members of the groovy "beat generation," dressed in black and snapping their fingers to a poetry reading in some basement club in New York.)

But change isn't limited to slang. The meanings of all words change. All the time.

In English, the classic example is the word "silly." It means foolish, right? Back in Old English, it meant "blessed." (So did its cognate in German.) But blessed can also mean "innocent," which "silly" came to mean. Years later "silly" made another natural progression to meaning "deserving of compassion." (That's the meaning Shakespeare used in 1591 in a line from The Two Gentlemen of Verona where a character warns another not to do outrages "On silly woman or poor passengers." He didn't mean not to hit on air-head blondes; he meant to show compassion to women as well as to the poor.) Eventually, the meaning of silly changed again. After all, those deserving compassion were "weak," and that's what the word came to mean. From there, the connotation of "simple" or "ignorant" was just another natural step.

I've used the word "wicked" to show what may be a meaning change that we are living through right now: The move from "evil" to "really cool." But that raises another question. Using the word "cool" to mean "really groovy" has had a good run, but how long can it last?

Tuesday, May 18, 2010

Kids and writing well

When I was a young reporter, fresh out of college and just a year or so into being a working newspaper journalist, I knew all too well how much I had to learn about the real-world job. It wasn't easy, despite some good older mentors. But at the same time, at 25 or so, I knew one thing for sure: my writing ability wasn't the problem. This wasn't some kind of baseless hubris or inflated self-esteem. I knew cool, effective writing when I read it, and I could see I was at least somewhat better at it than many (certainly not all!) of the people currently writing in newspapers around the state.

Anyway, one day I was going through a fat file folder of essays and other writings I had saved from my high school years - stuff that in my memory had been pretty darn good. Eek! It was horrible! Yuck!

Actually, inept was the word. Pedestrian. Almost childish.

Around that time I was asked to help judge a contest for high school newspapers. Guess which adjectives came to mind with every story I read.

Which started me thinking about a question: Why do kids, who automatically learn to speak at an early age, who are exposed to good writing throughout their school years, and who pretty much have all the basics of compositional knowledge by their high school years, still usually (there are exceptions) write like, well, adolescents?

This afternoon, as I was watching a lecture about Noam Chomsky's theory of Universal Grammar and disputes about it in the field of linguistics, it occurred to me that maybe one of Chomsky's ideas might help explain the mystery.

There are many who contest Chomsky's idea that, because much of the language kids hear is disjointed and a mess, kids must have a mental blueprint to sort it all out. Opponents of Chomsky say it isn't clear that so much of what kids hear is garbled. But listen to a recording of an intelligent college kid: "Yeah. It doesn't help the tree but it protects, keeps the moisture in. Uh huh. Because then it just soaks up moisture. It works by the water molecules adhere to the carbon moleh, molecules that are the ashes. It holds it on. And the plant takes it away from there."

This is a bright college student talking about a scientific subject. And it isn't exactly good English. We don't like to think so, but we all talk like this a lot. (Once a young journalist interviewed me over the phone. He taped my comments, and printed them with all my "ers" and awkward language intact. I called him back to suggest that it is only common courtesy to clean up an interviewee's verbal missteps. I did it all the time.)

Maybe it just takes time, well beyond the onset of puberty, independently of any problems like dyslexia or a deprived childhood, for the adolescent mind to figure out how to effectively take written language to the next level. If so, don't blame the kids.

Sunday, May 16, 2010

Think you know eggs?

Listen up, cooks. You may not be a gourmet chef at some fancy restaurant, you may even have trouble deciding when to remove that steak from the charcoal, but at least you know how to hard-boil eggs: 10 minutes in boiling water. Right?

WRONG! Go to your room. Ten-minute eggs have whites like rubber and graying, dry, yucky yolks.

It turns out that time in the heat has nothing to do with making a tasty, palate-pleasing egg. It's all about the temperature!

The contents of an egg are proteins and water (yolks have fat as well), and when you cook them, those proteins uncurl into strands that, at some point, depending on the particular type of protein, bind together into a mesh that traps water droplets, keeping the eggs soft and tasty. The temperature alone determines when this happens for the egg's various proteins! You can cook the eggs overnight, for Pete's sake, as long as the temperature remains correct.

For instance, cooking eggs in the oven at precisely 149F (for whatever time) will give you an egg with whites "as delicately set and smooth as custard, and the yolk is still orange and soft." Bake them at 153F, and the yolk has begun thickening up. Do it at 158F, and the eggs have a rather firm yolk but still have tender whites. For still-firmer eggs, try cooking at 167F or 176F. But boiling at 212 degrees? Way too hot. Hence the rubbery whites and dry yolks!
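Since the whole point is that texture maps to temperature, not time, here's a minimal sketch in Python of that idea. The temperature-to-texture figures are the ones quoted above from the article; the lookup function and its name are my own illustration, not anything from the magazine:

```python
# Doneness as a function of holding temperature alone (degrees F).
# The figures are the ones quoted from the Discover article; the code is
# just an illustrative sketch, not a tested kitchen tool.
EGG_TEXTURES = {
    149: "whites delicately set and smooth as custard; yolk still orange and soft",
    153: "yolk has begun thickening up",
    158: "rather firm yolk, still-tender whites",
    167: "firmer still",
    176: "firmer yet",
    212: "rubbery whites, dry graying yolks (the classic 10-minute boil)",
}

def expected_texture(temp_f: float) -> str:
    """Return the texture for the highest listed threshold at or below temp_f.

    Note that time appears nowhere here: hold the egg at temp_f long enough
    to equilibrate, and only the proteins that set at or below that
    temperature will have set.
    """
    thresholds = [t for t in sorted(EGG_TEXTURES) if t <= temp_f]
    if not thresholds:
        return "essentially raw - no major egg protein has set yet"
    return EGG_TEXTURES[thresholds[-1]]

for t in (140, 149, 158, 212):
    print(f"{t}F: {expected_texture(t)}")
```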

This from writer Patricia Gadsby in an article called "Cooking for Eggheads" published in 2006 in Discover magazine. She interviewed a French scientist studying "molecular gastronomy," a discipline that studies the science of cooking food rather than the art of doing so. She says she sure liked the eggs.

Friday, May 14, 2010

Can't place the face (Can't even see it)

Many of us are aware of a condition known as face blindness where, because of damage by trauma or a stroke to a small section of the brain that handles facial recognition, a person cannot differentiate between faces, or cannot even see them. The most famous example was reported by Oliver Sacks in his book about the man who mistook his wife for a hat.

But I didn't know until today that people can be born with the disorder - a lot of people.

(I read about this in a story by Joshua Davis, first published in "Wired," in a collection called "The Best American Science Writing - 2007.")

The tale begins with a fellow who, in his early adulthood, began to realize that he was very different. He couldn't even recognize himself in a mirror, but grew up thinking that was normal, or at most a minor handicap. (He had gotten by all those years by memorizing voices and noting things like hair and posture.) Now he started going to doctors, who proved to be no help. They'd never heard of such a thing. So the guy went to an Internet usergroup for people with neurological problems. The message he posted was titled "Trouble Recognizing Faces." Before long, responses started coming in.

A young scientist, learning about a heretofore unknown mental ailment, was quick to begin a study, with a ready-made study group right there on the Web.

It turns out, according to two separate studies, the problem is hardly uncommon. One study found that of 1,600 people given face-recognition tests, 32 were severely impaired. In another study, by a German researcher, 17 of 680 high school and college students were diagnosed. That works out to roughly 2 percent of the population, or nearly 6 million Americans.
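The back-of-the-envelope math checks out; here's a quick sketch (the roughly 300 million U.S. population is my assumption for 2010, not a figure from the story):

```python
# Prevalence figures from the two studies quoted above.
study_one = 32 / 1600   # severely impaired in the 1,600-person study
study_two = 17 / 680    # diagnosed in the German student study

US_POPULATION = 300_000_000  # rough 2010 figure; my assumption

print(f"Study one: {study_one:.1%}")    # 2.0%
print(f"Study two: {study_two:.1%}")    # 2.5%
print(f"~2% of the U.S.: {0.02 * US_POPULATION:,.0f} people")  # ~6,000,000
```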

We've always known that eyewitness testimony in criminal cases can be unreliable, but sheesh.

Thursday, May 13, 2010

Humility and witches

Humility is one of the most elusive, important, and of course prone-to-hypocrisy attributes of humankind. The story of witch hunts and their trials might illustrate the need for it most starkly. The lesson is simple: just when you think you really know something, that's when you should step back and say ... maybe I should wait a minute. I might be wrong!

Roughly between 1400 and 1700 and beyond, in Europe and the British Isles and of course across the Atlantic in Salem and elsewhere, some of the smartest folks around used scripture, the science of the time, and moral philosophy as they knew it, to work out a really serious justification of and practice for trying witches. After all, not only must their evils be stopped; their immortal souls were at stake!

(This during the "Renaissance" - the age of Newton, Descartes, etc., etc.)

Using Roman law, the leaders of the era figured out to their satisfaction what legal procedures were necessary, what kinds of evidence were sufficient, and what punishments were just. Was the person just a nasty sort, or was Satan behind it all? This had to be determined. Sure, torture was one of the tools, but often the accused, most often elderly, eccentric women, admitted it all!

If they didn't, an adolescent accuser could throw herself to the floor, shake and tremble and scream that she was being possessed, and that would prove to be enough.

The point, of course, is that these paragons of the community, steeped in scripture and law and morality, knew they were doing the right thing as they put a witch to death.

There are sound reasons why we call certain actions witch-hunts today. Human psychology hasn't changed all that much. On countless issues of the day - religious, political, scientific - we need to exercise humility, step back, and say: wait a minute - despite how smart we are, we might be wrong.

We're pretty good these days at thinking about (heh, heh) witches. But what aren't we so good at?

Tuesday, May 11, 2010

Tea, anyone?

I've just read a fascinating article about the "tea party" movement written by Mark Lilla, professor of humanities at Columbia University. The piece, in the May 27 New York Review of Books, was titled "The Tea Party Jacobins." (The Jacobins' French Revolution made the Boston Tea Party seem like a matter of raising one's pinky as one sipped.)

Lilla, refreshingly clear of political rhetoric, argues that the tea party movement should be seen as more important than the derision of liberals might suggest, and more significant than the short-term fate of formerly politically moderate Republican office holders frantically trying to win primary elections.

Instead, he suggests, the movement represents a coming together of two libertarian trends that used to keep Americans apart. The first is the libertarian social upheaval of the 1960s; the second is the libertarian economic policy of Ronald Reagan's 1980s. Decades after the 1960s, most Americans have come to accept (excluding the abortion issue) Sixties-type ideas such as, for instance, basic fairness for women, blacks and gays, and acceptance of out-of-wedlock births. The 1980s brought the now widely accepted conservative idea that government interference in commerce and individual economic decision-making is counterproductive and wrong. Both revolutions have been won, at whatever cost, with whatever benefit. We are more free than we were. But now, rather suddenly, we have a tea party libertarianism that not only involves an irrational, pessimistic distrust of that abstract thing called government; it also includes an equally irrational, optimistic feeling that all Americans would be better off making all their own decisions.

Lilla concludes: "Now an angry group of Americans wants to be freer still - free from government agencies that protect their health, wealth and well-being, free from policies and problems too difficult to understand ... free from experts who think they know better than they do .... They want to say what they have to say without fear of contradiction, and hear someone on television tell them they're right. They don't want the rule of the people, though that's what they say. They want to be people without rules - and, who knows, they may succeed. This is America, where wishes come true. And where no one remembers the adage 'Beware what you wish for.'"

I'm still likely to smirk at goofs holding signs that say: "Keep it up, Jokers. ObamaCare does NOT cover Tar and Feathers." But, as a first-hand observer of the 1960s, I have to wonder: What have we wrought?

Monday, May 10, 2010

The kid knew cool

It was as a prepubescent boy of eight or nine in the mid-1950s that I first fell in love. The object of my affection was Lena Horne.

I'd been raised in an almost totally white northwestern-Wisconsin city, so I had none of the racist thoughts that, in the words of the musical South Pacific, "you have to be carefully taught." And although I later realized that my parents harbored some of the prejudices endemic to people born in this country in the 1910s, these deeply religious people imparted only one message to me: We're all God's children.

So when I'd see Lena Horne performing in some variety show on our black-and-white television - often singing her signature song "Stormy Weather" - I never even really saw the black. I saw beauty, talent, and the sort of intelligent eyes that blew me away.

The newscasts early this evening noted her death at age 92, and mentioned her commitment in the latter half of the 20th Century to civil rights. The broadcasts couldn't, of course, do justice to her having grown up black, coming of age, and becoming the first real "woman-of-color" movie star in Jim Crow America. Nor her disgust at those who said her looks, rather than her talent, explained her success.

All of that was well beyond the ken of a child in mid-1950s Wisconsin. But I like to think the kid knew cool.

Saturday, May 8, 2010

Questions

Here is an email I just sent to Book TV after watching an interview with an author (and conservative talk-show person). I was freaked. Not because of ideology, of which I have my share, but because I'm a journalist. Uninteresting, unproductive, and unintelligent questions make me sort of crazy. You can ask interesting questions without being biased, for Pete's sake! Anyway, here's the email:

"I know and respect C-Span's mission. As a journalist myself, I understand bending over backwards to be unbiased. And I understand that you get a lot of messages from dumb-heads from the left and right.

But, as a journalist, I go nuts at the lack of even slightly interesting questions being asked.

For instance, today I've been watching a talk-show person hitting softballs out of the park.

What is wrong, even for you, with asking some simple questions? Like: Please tell me about one time in the past dozen years when conservatives have been right. After all, the Democrats have been critical. Fight back! Educate me! When have you people been right: About deregulation of the financial industry? About putting our Social Security money into the stock market? About WMD? About ... what? About "Drill, baby, drill?" Surely there is SOMETHING conservatives have been right about!

Once again, I know your mission. (And I know your funding source.) Of course, liberals should get similar sorts of simple questions. What have they done lately? (Say, in the last 40 years? Anything? Can you reveal them?) But, sheesh! The Fox-news-like pabulum on Book TV is making me beat my head on the floor."

OK, I'm still a little freaked about the kindergarten level of public discourse these days. Perhaps I get a little carried away. But, hey, wouldn't you like to hear the answers to some, not tough, but interesting questions? I not only would like to, I'd like to still be asking them.

Friday, May 7, 2010

Animals in the dock

Although we may not all have been thrown by a horse, had a dog escape the yard and attack a hen house, or had a deer defecate right where we want to step in order to get into our garage, we're well aware that animals can be frustrating. But did you know that, hundreds of years ago in Europe, you could take the damn critters to court?

So says Michael S. Gazzaniga, a leading psychologist and neuroscientist, in a popular-science book called "Human: The Science Behind What Makes Us Unique." It's in a chapter about how humans can't help mentally conferring agency, or purpose, on animals and even inanimate objects. (My old clunker won't start; it's being stubborn this morning!)

Drawing on a 1906 book by one E.P. Evans called "The Criminal Prosecution and Capital Punishment of Animals" (New York: E.P. Dutton), Gazzaniga said that animals in the 800s and for many hundreds of years thereafter could be arrested and imprisoned (in the same jails as humans) for overturning a cart, biting a person, or some other offense. The accused creatures were appointed lawyers and had to stand trial in a civil court. If an animal were convicted of buggery, "both it and buggerer were put to death." (This is the place for all sorts of comments I won't bother making.)

Back then, accused human criminals could be tortured and - if they didn't confess - have their sentences lessened. Likewise, Gazzaniga said, in the spirit of consistency, animals could be tortured and - if they didn't confess, which nobody expected - the animals' punishment could be reduced.

Humans' tendency to assign agency to animals and other things remains in place in our brains, of course. But these days, I'm glad some genius came up with the idea of just saying: "Bad dog!"

Wednesday, May 5, 2010

Ever heard of these guys?

Great educators, unlike soldiers, don't just fade away. Something lives after them.

A case in point is Alcuin (732-804), a British scholar recruited by Charlemagne, who had united Europe and managed to get crowned "Holy Roman Emperor" (thus establishing the "divine right" of kings). Charlemagne, possibly illiterate himself, wanted all his subjects - sons of Lords and sons of serfs - to learn to read and write. He asked Alcuin to make it so. Alcuin created Abbey Schools everywhere - an idea that grew over the next few centuries into something new in the world: universities.

Then there is Bishop Robert Grosseteste of Oxford (1175-1253). The guy was a standard medieval religious dude, but he was as interested in science (as yet unnamed) as in theology. His experiments in optics led his best student, Roger Bacon, to advocate what he called "experimental physics."

The rest, as they say, is quantum mechanics, dark matter and dark energy, and whatever the hell is next.

(I left out Peter Abelard (1079-1142), probably the most revered teacher of the period, because he's too famous. Recall the tragic story of his love affair with the brilliant Heloise: Think of a nunnery. Think of Heloise's uncle, a mistaken idea that Abelard had abandoned her, and an emasculation by goons hired by the uncle. Sorry to leave you out, Pete.)

Tuesday, May 4, 2010

Religious freedom

At a time when the latest Islamic nutcase has been arrested for allegedly trying to blow up Times Square, it probably is useful to think briefly about Islam's contribution to the world.

Back in the early 600s A.D., Muhammad got a message from God and so knew - KNEW! - that he knew it all. Of course, many people before and since have gotten the same call. (Different messages, but, hey.) Still, it needs to be recognized that Muhammad's vision - unlike that of Christianity - contained not only religious and ethical teaching but also plenty of room for intellectual freedom.

And it is that intellectual freedom that delivered Western Civilization from the "Dark Ages."

Of course, people living in the West roughly between 650 and 850 A.D. didn't think of themselves as being in the "Dark Ages." But, judging from that time's output of philosophic and intellectual ideas, there they were. Heads befuddled by the church - not to mention attacks from Vandals, Goths, Muslims, you name it - the fact is, they were going nowhere.

Meanwhile, Islamic scholarship thrived. It was Islamic translations of Greek thinking about science, logic, ethics, and even psychology, that eventually turned Western Civilization back on in the centuries to come.

In many ways, Islam ended up passing the intellectual baton to the West. But still, it retained its intellectual nature. Never mind those jihad nincompoops. (Ever heard of the Inquisition? Christianity isn't exactly your bastion of justice or smarts.)

I say, let jihad nincompoops rot in jail. But give Islam the rights of religious freedom that have kept our country whole. Whenever somebody else gets a message from God - and of course somebody will, time and again - let's just make sure they and their followers get their happiness. Unless, of course, they want to blow up a federal building or something. We should do no less for the followers of Islam ... unless, of course ...

Monday, May 3, 2010

Flower power

This afternoon, I happened to be looking out of the window of my front door when a guy on a bicycle rolled by. He was wearing a pointed plastic helmet, and his pant legs were bound with rubber bands. He was staring at my door. As he wheeled on by, he craned his neck, still staring! Huh?

So I opened my front door, and the screen door, and peered outside. What was the guy looking at?

Oh! I'd forgotten. There, hanging on my doorknob, was a paper basket, with paper flowers (with pipe-cleaner stems) growing out of it. The message glued to the basket from a grade school a couple of blocks away said: "We're celebrating the return of Spring! It's a beautiful day in our neighborhood and we're glad that you're our neighbor! Have a happy May! From the Students and Staff at Central School."

Sure, now I remembered. Central School does this every spring. A May Day basket!

Of course, I realized that the timing isn't exactly a random thing. As it turns out, voters go to the polls tomorrow to vote on the latest school levy.

But I prefer to think about a little girl, whose name is Auriana (it was printed, in a childish hand, on the basket), sitting at her desk with her tongue sticking out of one side of her mouth, carefully cutting colored contact paper to make a flower. For someone like me.

Conservative justice, liberal justice, and the common good

"Fundamentalists rush in where liberals fear to tread." That's a comment that Harvard professor Michael J. Sandel, author of "Justice: What's the Right Thing to do?", might regret. It's just too easy, and tempting, to quote out of context.

I can't stuff hundreds of pages of careful, witty, and smart philosophical writing into a little blog. But here's a brief summary: After demolishing libertarianism as a source of justice (What? We have no ethical responsibility toward our fellow men besides not harming them?), Sandel takes up liberal thought. He chooses Kant and John Rawls, an important late 20th century thinker, to discuss such issues. Among many, many other things, these liberal thinkers believe that there are two basic categories of moral responsibility: natural duties (like not hurting others) and voluntary obligations (those that you have specifically agreed to).

Sandel (and others) say there is a third category: Obligations of solidarity. Think family obligations, patriotism, etc. Aristotle certainly wouldn't have tried to keep such ideas out of the equation!

(I'm simplifying like a mad man; if Sandel ever reads this, I'm going to have to go out and eat deer poop.)

Anyway, he asks questions like: "If you think most moral obligations can't be REAL obligations unless you've consented to them - hell, your freedom is at stake! - what about moral obligations to a family member, or to a community member, or to a fellow member of your country? You didn't consent to those obligations, after all. You were born into them."

And, says Sandel, a politics "emptied of substantive moral engagement" - as ignoring such obligations would do - would impoverish civic life.

It would, he suggests, become "an open invitation to narrow, intolerant moralisms." Say goodbye to justice.

As I re-read this, I see I need about another 20 pages. Or 200. But, hey, think about it anyway. And read Sandel's book for a decent conclusion.