Thursday, September 30, 2010

Ever heard of this guy?

Every once in a while, like a pie in the face, my ignorance arrives with an unexpected splat. (You'd think I'd be used to it by now.) Anyway, I've just started reading a cool biography/history about the search for the origins of kindness. The fact of altruism has long been a huge thorn in the paw of evolutionary theory - face it, if natural selection means ever-fitter organisms out-competing their rivals, where does being nice fit in?

The book, "The Price of Altruism" by Oren Harman, is both a biography of George Price, a pretty-much unknown scientist who worked out mathematical equations that go a long way toward explaining altruism in evolutionary terms, and a history of the whole problem. But, interestingly, it starts in part with a dude named Prince Peter Alekseyevich Kropotkin. He was an anarchist who rejected both capitalism and socialism in favor of a no-government system in which people simply cooperated for the common good. He was famous. (Splat.)

A young man of high aristocratic birth in mid- to late-19th-century Russia, he quickly became disillusioned with life at court and sympathetic to the serfs. He arranged to be sent far east to Siberia, where he conducted important geographical studies but also observed amazing cooperation among wild animals that contradicted the Darwinian "red in tooth and claw" view. Prey animals posted sentinels that warned of predators by jumping around, putting themselves at risk. Birds helped each other on nests. Wolves joined together to hunt in communal packs. How can such cooperation be reconciled with natural selection? Kropotkin saw a cooperative anarchism as the human answer.

The author starts his book this way to zero in on the problem of altruism for evolutionary theory. It's a cool way to begin. I just wish I'd heard of this Kropotkin guy before.

Wednesday, September 29, 2010

Degrees of danger: cells and dumbness

It takes one's breath away. Still, in the year 2010, articles keep appearing about a possible connection between cell phone use and brain cancer! This in spite of the elementary fact that the radiation from cell phones is simply far too weak to do any such thing.

Here's a mental experiment. Imagine your head is a giant, coal-fired power plant somewhere on the plains of America. Suppose someone (a hater of coal power?) walks up to an outside wall of the plant and slaps it with his hand, as hard as he can. Do you expect the slap to affect the plant in any way? Even come close to affecting it? Of course not - the energy of the slap is, one might say with a certain understatement, rather inadequate. So it is with cell phones and the brain.

Here are some numbers: To break the molecular bonds inside brain cells enough to cause mutations, and thus possibly cancer, you need big-time radiation such as x-rays, gamma rays and ultraviolet light - radiation with energies greater than 480 kilojoules per mole of photons. Green light carries about 240 kJ/mole, and it can't quite break certain molecular bonds in our eyes. A cell phone emits radiation of less than 0.001 kJ/mole - at least 480,000 times weaker than the bond-breaking threshold!
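For the curious, these figures can be sanity-checked with a few lines of arithmetic. This is just a back-of-the-envelope sketch; the 500 nm wavelength for green light and the 1.9 GHz cell-phone frequency are my own assumed round numbers, not figures from the original article.

```python
# Back-of-the-envelope check of photon energies, per mole of photons.
# E_per_mole = N_A * h * f, where f is the radiation's frequency
# (equivalently N_A * h * c / wavelength).

N_A = 6.022e23   # Avogadro's number (photons per mole)
h = 6.626e-34    # Planck's constant (J*s)
c = 3.0e8        # speed of light (m/s)

# Assumed: green light at ~500 nm, a cell phone transmitting at ~1.9 GHz.
green_kj = N_A * h * c / 500e-9 / 1000   # kJ per mole of photons
phone_kj = N_A * h * 1.9e9 / 1000        # kJ per mole of photons

threshold_kj = 480.0  # kJ/mole, roughly where molecular bonds start breaking

print(f"green light: {green_kj:.0f} kJ/mole")   # about 240 kJ/mole
print(f"cell phone:  {phone_kj:.5f} kJ/mole")   # well under 0.001 kJ/mole
print(f"phone photons fall {threshold_kj / phone_kj:,.0f}x short of the threshold")
```

Run as-is, the sketch reproduces the roughly 240 kJ/mole figure for green light and shows a phone's photons falling short of the bond-breaking threshold by a factor of several hundred thousand.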

Next time you come across cell-phone radiation on the beach, kick sand on him. He can't do a damn thing about it.

Scare-mongers (and journalists) are relying on a "better to be safe than sorry" theory of public health. But such a theory is being misused when it makes no sense.

All of this draws me to an item I noticed in a recent "Scientific American." The magazine says Conservapedia, an online encyclopedia run by conservative lawyer Andrew Schlafly, implies that Einstein's theory of relativity is part of a liberal plot. This sort of thing could impel me to pontificate on the inverse relationship between conservatism and intelligence, but let's be positive. After all, isn't it rather suspicious that Albert Einstein actually hated fascism?

Tuesday, September 28, 2010

My brain made me do it

One major tradition in psychology says that mental illness doesn't exist because the mind is not an organ and so can't be diseased. But the brain sure can.

Here is where I learned something that I can't believe I hadn't come across before.

The background: There exists in rare cases a malady called "temporal lobe focal epilepsy." (Picture the temporal lobe by imagining that the brain, viewed from the side, looks like a right-handed boxing glove. The "thumb" is the temporal lobe. Focal simply means that the problem is sharply limited to one tiny area of the lobe. The term epilepsy is used because the disease causes that area to give off brain waves similar to those that cause epileptic seizures - but these waves have a far different result.)

The problem is the growth of a tiny calcified mass in a certain spot. It can change a person, for as long as a full day, from a normal personality into that of an angry but otherwise seemingly highly organized criminal. Victims might lash out at others, smash property, or even take off on a senseless killing spree. When the episode passes, they have absolutely no memory of what they have done. And removal of the growth removes the disease forever.

Here is what blew me away. If you've been around long enough, you vividly remember news reports of the young man who grabbed a rifle, climbed atop a tower overlooking the center of the University of Texas, and shot his fellow students at will. Perhaps that should be, at "will," because an autopsy found that he had been a victim of temporal lobe focal epilepsy.

I don't know how it is that I never bumped into this information. It certainly has been around for a long time, and is used by those who argue that perhaps all "mental" illness is caused by a potentially treatable brain dysfunction. I'd never go so far as to suggest that all criminals are victims of a brain disorder and so should not be punished. That's ridiculous and would turn our justice system on its head. But for seemingly inexplicable crimes, maybe it's worth more consideration by the courts.

Monday, September 27, 2010

Ever heard of this guy?

His name is William Caxton (c. 1420-c. 1491), and he may be the single most significant reason that you speak the language you do.

Of course, there are a zillion ways in which early English dialects transformed themselves over the centuries, both randomly and inexorably, into the English we speak today. The most successful dialects, as they say, had a navy. (I know, by the way, that I'm thinking about the English I speak here in Montana in 2010. I'm certainly not thinking about Brooklynese, southern drawled English, or (heaven forbid) that thing the Brits speak.) But look back a thousand years or so. The Germanic "Old English" of the island was suddenly eclipsed (after 1066 A.D. and the Battle of Hastings) by Norman French. Old English, Germanic as it was, still was English, and it was nearing extinction as the French language took over all important writing and culture. From the 1000s to the 1200s, "English" texts are hard to find.

But that didn't mean "English" didn't survive. It's just that few people thought to (or were able to) write it down. This vernacular English evolved into Middle English, including French influences galore but still English. Still, Middle English couldn't coalesce into what soon would become the "modern" English of Shakespeare until the printing press began to fix the vernacular into the language that would evolve into the English you use today to tell your girlfriend that you really will explode unless she puts out.

And the first person to use a printing press to publish in English was none other than William Caxton (in 1476).

Caxton was a successful English businessman who thrived in the Low Countries of Europe and who learned printing in Germany before bringing a press back to his native country. It can be argued that the period of "Middle English" lasted from the Battle of Hastings to 1476, when the printing press helped begin to standardize English (not French) into what (in only about 100 years) became a language that could say, "To be or not to be - that is the question."

English was to be. Thanks, William Caxton. (At least for the version spoken in Montana.)

Saturday, September 25, 2010

Idioms and poetry

Idioms must be the bane of people trying to learn English as a second language. (These days, even native-born Americans can have a hard time keeping up with them.)

But imagine a foreign English-learner forced to deal with a phrase like "pull someone's leg." They must be taught what it means, or they are out of luck. Suppose they learn it, only to hear someone the next day tell a friend heading into a difficult test of some kind to "break a leg."

Some idioms can be figured out. "Head over heels in love" doesn't make a lot of sense, but you get the idea. To "spill one's guts" isn't especially lucid either; still, in context, an English student can figure it out.

What, however, about those archaic phrases that all English speakers know, even if they no longer know what the words mean? I'm thinking of idioms like "kit and caboodle." Ask any middle-school kid what it means, and they'll know - call it the "whole ball of wax," to use another idiom. ("Kit" refers to all the necessary things a soldier carries in his kit; "caboodle" harkens back to "boodle" - a gambler's entire stake - and probably became "caboodle" as an alliteration with "kit.")

But some idioms reach a certain level of poetry, however unconsciously. For instance, to "leave high and dry" would seem to involve escaping a flood - hardly the meaning. To "kick the bucket" wouldn't immediately evoke the gruesome sight of someone kicking out in one last death throe. You have to make the connection yourself.

One of the idioms I find most interesting can be attributed to some relatively affluent girl in some California suburb in the 1980s. It is this: "Gag me with a spoon." It may be deservedly disappearing, but I still can't hear it without thinking about spoiled adolescents and their silver spoon. Is this just a combination of idioms? Sure. Is the connection just taking place in my head? Of course. But the resonance of Valley Girls and the silver spoon they were born with won't go away. Isn't that what poetry is?

Thursday, September 23, 2010

Little kids and smarts

Back in the very early 1970s, I committed a grievous sin against the education of our youth. Here's what happened:

I had read somewhere that up to a certain age - five or six or so - little kids are invariably bamboozled by a simple experiment. Pour the contents of a low, wide glass into a tall, narrow glass. Ask the kid which glass has the most liquid. Up to a certain stage of development, the child - in some way certain that the higher the liquid, the more there is - always will point to the tall, narrow container. Well, I decided to try the test on my four-year-old boy, Donny. Yep, his answer was the glass with the higher level - right in step with the research. I went on to explain to him that pouring a fixed amount from one glass to another didn't change the amount, no matter how "high" it ended up. He looked at me with big eyes that said: "Cool!"

Sure enough, within a week or two the gods of psychology school pulled one of their whimsical tricks and caused our neighbor, a young woman taking a psychology course at nearby Carroll College, to ask if she could administer a few tests to our son. Sure enough, one of the tests was the water-pouring thing. Sure enough, Don blew her away with his precociousness. I felt like a cad, and of course explained what had happened.

But as I was remembering this incident, I also recalled learning about studies that show that while kids have to pass through various stages of mental development, they also can blow us away with what they know, and with how early they know it. I'm thinking of a test given to four-month-old infants. The babies are shown a picture of a box sitting on a table. They look at it briefly. Then they are shown a picture of a box sitting right at the edge of the table. They look at it a little longer, but not much. Then they are shown a box hovering in the air beside the table. The infants stare at it for a long, long time.

Way back in 1997, a lecture by Professor Daniel N. Robinson of Georgetown University ended with a quip: "The more we study infants, the smarter they get." Earlier this year, in the spring of 2010, I noticed a magazine article about how much smarter babies are than we think. It seems Robinson was on to something. So, apparently, are infants.

Wednesday, September 22, 2010

Silent filibusters

As I was reading an AP story about how a defense bill that included an end to the ban on gays serving openly in the military was defeated in the Senate, 56 for and 43 against, I was thinking: shades of so many filibusters, over so many years, against civil rights legislation.

But wait a minute. The story never mentioned a filibuster. Yet the bill was defeated on an apparent cloture vote needing 60 affirmative votes to pass. What gives?

That's when I remembered a New York Review of Books article I read a month or so ago. It said that traditional filibusters (a la James Stewart in "Mr. Smith Goes to Washington") are rare now because things have changed. When hearing noise about a filibuster, the side favoring a threatened piece of legislation will file a cloture motion. Cloture can stymie a filibuster, but only if at least 60 senators vote for it. (So the 56-43 vote on the issue Tuesday came four votes short.) National news stories just talk about Senate rules that require 60 votes, not wanting to bore their readers with the details.

The result is that Democratic voters have to wonder why a government run by a Democratic president and a Democratic Congress is too inept to pass its agenda. Republicans, who in recent years have forced cloture votes at roughly twice the rate Democrats did over the past decade, just might have figured this out.

This week's gay-rights Senate vote certainly is another in a long history of conservatives working to put down people they perceive as unlike themselves - and using exceptions to majority rule to do so. It is vital to remember that the whole idea of a filibuster is, on important occasions when Congress goes nuts, a lifeline. But the idea of using it as one more oft-used obstructionist tool is something else. How many of us really want a dumb-head minority blocking every needed change?

Monday, September 20, 2010


Kaboom. Another of my beliefs bites the dust - this one involving the moon.

Of course the moon has regularly been the subject of delusions. Probably the most common has been that a full moon somehow causes insanity, or at least wild, irrational behavior. (I once knew a very capable county sheriff who was absolutely convinced that crime rose sharply on the night of a full moon.)

But my comeuppance involves an illusion common to us all: The fact that the moon just above the horizon appears to be far larger than the moon at its zenith. We know that the moon remains the same size, and that its orbit keeps it pretty much the same distance from us at all times, so this apparent difference in size does indeed have to be an illusion of some kind. But what causes it?

Like many of us, I came to conclude that it's a matter of context. We know the size of big buildings or mountains beyond which the low moon can be seen, and somehow that makes us exaggerate the moon's size. Straight up in the sky, there is no context, so the illusion is lost. (Put more scientifically, the difference is between using "distal" cues - we know the sizes of buildings and mountains - and relying only on "proximal" cues from the retina, when no other cues are available.)

Other explanations have included the idea that looking at the moon near the horizon involves seeing through a longer column of atmosphere (or smog), thus somehow scattering its light and making it look bigger, or the suggestion that the "angle of regard" (looking straight ahead at the moon, or looking way up) has something to do with it.

Well, according to Professor Daniel N. Robinson, in a section on studies of perception in his course on "Great Ideas in Psychology," all these ideas are ruled out by studies that eliminated all such elements as context, atmosphere and angle of view. He says the damn moon STILL looks bigger on the horizon than high in the sky.

Obviously there remains a lot to be learned about this sort of thing. For now, I think I'll just call it a mild kind of, well, lunacy.

Saturday, September 18, 2010

Extremism and presidential elections

When I was a high school kid in 1964, I had a long-distance friendship with Jim, who lived in a small town in southwestern Wisconsin. (A sign leading into town proclaimed it "The best town by a dam site," although I never figured out where the dam was.) We knew each other because our parents, recipients of the GI Bill, lived together in subsidized housing in "Badger Village," a Quonset-hut married-housing community at the University of Wisconsin, and became life-long friends.

That summer I took a bus to visit him and stay with his family. We hit it off once again, except for one thing: He was for Goldwater in the 1964 election for president. Gack!

Of course, I wasn't alone. Goldwater lost by a landslide. It turned out that Goldwater wasn't exactly a savvy campaigner (he was known for speaking at nursing homes while demanding an end to Social Security). His biggest blunder was his resounding assertion that "extremism in the defense of liberty is no vice!" This at a time when extremism was equated with campus radicals and inner-city riots, not to mention the fear that extreme anti-communism might lead to nuclear annihilation.

I think of this now as I read polls suggesting big GOP gains in November. This is hardly surprising - Americans have almost always voted against the party in power during bad economic times. But in a presidential election, extremism seldom plays very well. True believers, such as my friend Jim, tend to be rejected. One wonders if America's Tea Party types, heirs of Goldwater, will get the message.

Friday, September 17, 2010

Intolerance gets dangerous

In a way very different from many European nations, which are threatened by extremists among impoverished Muslim immigrants, the U.S. has enjoyed a harmonious relationship with its Muslim population. Muslims, more ethnically diverse than any other significant religious group in the country, have fit in. They hold more graduate and postgraduate degrees than the national average, and so also have a lower unemployment rate and more professional jobs.

Over the years, Americans may have thought Islam exotic and odd, but hardly worth getting excited about. For instance, I can remember when the popular singer Cat Stevens gave up his music to become a Muslim. Americans were sad - not about the conversion, but about losing his songs.

Of course, Americans seldom need much of a spark to ignite their racial or religious prejudices. Witness their treatment of Native Americans, hatred of Irish immigrants (and the concurrent anti-Catholicism), the quotas that choked off southern (but not northern) European immigration in the early 1920s, the internment of Japanese Americans, the century of overt discrimination against African Americans, and today's freak-out over immigration from south of the border. So it is hardly surprising that simmering anguish over 9-11 didn't need much of a spark, either.

It actually started in small steps, as when the state Legislature of Oklahoma (Muslim population 0.81 percent) passed a law last spring banning judges from using shari'ah law in their decisions. But when a Muslim group, long established in the New York neighborhood of the Twin Towers, decided to build a Muslim center devoted to improving relations with other U.S. religions and opposing extremism by their example, it unsurprisingly came under fire from professional and amateur haters, opportunistic politicians, and such demagogues as are to be found on Fox News and the like.

Never mind that it is exactly this kind of moderate Islam - and an adherence to our republic's principles - that is our best hope for combating terrorism - especially the home-grown variety.

In the past, American prejudice in this regard really hasn't threatened (other than morally) the country. Or its soldiers. Today, that's not true anymore.

Wednesday, September 15, 2010

On turtles and strippers

Stephen Hawking, one of the greatest physicists of his generation (wheelchair, computer voice system and all), began his 1988 book "A Brief History of Time" with an old, no doubt apocryphal, tale of turtles. The tale has many versions, sometimes involving Hindu myth, often naming various 19th Century scientists. Basically, after a lecture on the nature of the solar system and the galaxy, a woman stands up to object that the world is a flat disc. The lecturer asks what holds up the disc. "A turtle," she says. And what holds up the turtle? "It's turtles all the way down."

I personally like a version that I'm told (by Wikipedia) was aired in a Season 5 episode of the sitcom "The Office." In it, an employee wants to organize a birthday surprise for her boss. She suggests a stripper jumping out of a cake, holding a smaller cake, out of which a smaller stripper jumps, holding a still smaller cake. And so on. When asked what the next stripper will be holding, she says: "A cupcake. It's cupcakes and strippers, all the way down."

For some reason, this "infinite regression" problem (call it the Turtle Problem) came to me while reading about an early Ionian school of "science" that believed the world could be reduced to principles that can be understood. Thales of Miletus (about 624 to 546 BC) is credited with predicting a solar eclipse. Empedocles (roughly 490 to 430 BC) discovered there was a thing we call air. Democritus (roughly 460 to 370 BC) came up with atoms. Through careful measurements, Aristarchus (roughly 310 to 230 BC) concluded that the sun must be hugely larger than the earth, becoming the first to argue that the earth must not be the center of the solar system, but orbits the sun.

This Ionian thinking lost out (Aristotle, for instance, rejected atomism because he couldn't stomach the idea that humans are made of soulless bits) and it wasn't revived for about 2,000 years. So, during all that time, it basically was turtles (or cupcakes and strippers) all the way down.

Monday, September 13, 2010

The myth of Reconstruction

For young people growing up during the Civil Rights Movement - young people with liberal tendencies such as myself - the collapse of Reconstruction following the Civil War was an insufferable blot on U.S. history. That collapse ended up completely subverting the 13th, 14th, and 15th Amendments and set in motion the Jim Crow world of not only segregation but blatant discrimination that persisted for so many generations.

But, since the aftermath of the Civil War, quite another image of Reconstruction has persisted in this country - the image of corrupt Blacks, carpetbaggers (northerners who moved south to take advantage of white southerners), and scalawags (southern turncoats) taking over state governments and woefully oppressing what had been the confederacy.

This image was a myth. Although at the beginning, Blacks constituted 80 percent of Republican voters in the south (most whites had refused to register), it was whites who dominated the party. And, no more corrupt than their counterparts among state and local governments in the north, these Republican state governments compiled a credible record, enacting social, judicial and government reforms despite terrorist opposition from such groups as the Ku Klux Klan.

Still, the image lingered, helping to perpetuate racial discrimination. (It was abetted by such movies as "Birth of a Nation," in which Klan members were the heroes, and "Gone With the Wind," in which a Black and carpetbagger politician rode to confiscate Tara, the camera zooming in on the carpetbagger's ... carpetbag.) From my school-boy history studies in the late 1950s, the picture that remains most vivid is that of carpetbaggers and their exploitation.

Back then, I had yet to learn the myth was a lie. But despite that, all I really knew was that Rosa Parks still, close to a century later, remained legally obligated to sit in the back of the bus. No twisting of history was going to change that grim fact.

Sunday, September 12, 2010

The sounds of silence

Sometimes, despite my laser-like ability to zero in on what is cool in 2010, my geezerhood breaks through. Not in pausing to rub my aches and pains, not in scampering (as best I can) away from a mirror, wondering who the hell had let that old guy into my house, but in suddenly feeling an overpowering need to listen once again to Simon and Garfunkel.

So I'd stand in front of my stereo, an album in each hand. One would be "Bridge over Troubled Water." I'd gaze at it in awe. Their last album, and their best. But wait! What could be better than "Sounds of Silence"? "Hello darkness, my old friend, I've come to talk with you again." Dither, dither. Of course, I'll end up playing them both.

My generation, like every generation, was musically fickle to the point of sticking its head in the sand. Nothing but the new was cool - the Beatles, the Stones - while "old" stuff like Buddy Holly got left behind. And, of course, in the mid- to late 1960s, nothing was cooler than Simon and Garfunkel.

As always seems to happen, a generation that adored Sinatra gave way to a generation that ignored him. Some of the coolest jazz seemed to disappear. And the best songs of our century - the efforts of the Great American Songbook (Porter, Berlin, etc.) - simply were not heard.

Of course, the same fate awaited the pop and rock of the 1960s. New fads came and went. Hip Hop seemed to have little to do with the past - and it didn't. But somehow, when Simon and Garfunkel staged their reunion concert in Central Park in 1981, the park was filled. The older guys, anyway, hadn't forgotten.

But in 2010, says the old guy in the mirror, the end is in sight. As Simon and Garfunkel sang in the 60s, "Time hurries on. And the leaves that are green turn to brown." The old guy and I unite in lamenting these songs turning into the sounds of silence.

Friday, September 10, 2010

A dumb war

If you are like me, all you learned about the War of 1812 in your high school history classes is that the British burned Washington, that they sailed on up the river but failed to take Fort McHenry (an event prompting Francis Scott Key to write "The Star-Spangled Banner"), and that Andrew Jackson heroically defended New Orleans, albeit after the war was officially over.

But how many of us can answer a few simple questions?:

A. Who started the war?
B. How did the U.S. think it could win against the vastly superior British military?
C. Who won the war?

The answers:
A. The U.S.
B. It thought it could waltz into Canada and take it over, thus stopping British provocations.
C. Nobody, although the U.S., by bankrupting itself, didn't exactly come out even.

The reason the War of 1812 is largely skipped over in high school history classes, I've learned, is that the politics involved are hellishly complicated. (Just explaining how the Republican Party, instigator of the war, has evolved into the entirely different thing it is today would take about a full semester.)

Suffice it here to say that the War of 1812 had to be the dumbest war the country has ever started. Arguably, anyway.

Wednesday, September 8, 2010

The problem with words

After years of reading, a fine K-12 and undergraduate education, and a childhood in which anything but standard, grammatical speech was forbidden (ain't isn't a word!), I found myself in the real world as a cub reporter. I knew I had a lot to learn - the grown-up realm of government, law enforcement, politics and all the rest - but I never thought that words, which were the basic tools of my trade, would be a problem. Oh, boy.

First came simple spelling. For instance, I always had pronounced disdain as though it had a "t," so that's how I tried to spell it. That and many other small embarrassments - accommodate will accommodate all the c's and m's you stuff into it, dammit; you spell Calvary differently than cavalry, for God's sake - constituted an uncomfortably common part of my early career. However, after a few years, not to mention some tattered dictionaries, I pretty much mastered spelling. But, in daily small-town journalism, woes never cease. I'll never forget my mortification when a copy editor changed my nit-picking into knit-picking.

But I also quickly learned that readers reacted strongly when a word baffled them. What kind of elitist crap were we trying to pull? That's why journalists try to be careful. There's no sense in doing a Bill Buckley with one's vocabulary. Which led to the copy-editing blunder of which I am most ashamed.

A reporter had written a story that used the word segue (moving from one thing to another without pause). I'd never heard of the term, assumed it was some foreign word inappropriate for our paper, and cut it. The reporter looked at me as though I were the dolt that I was.

But even simple, uncontroversial words can cause grief. Once I used the term pay matrix referring to teachers' salaries. I got an irate call. I patiently proceeded to explain a grid with length of service down one side and degree of education across the top. Why, asked the disgusted caller, didn't you just call it a pay scale? Another caller, an elderly man almost in tears, was upset that I had used the word prescient in an editorial. How the hell, he wanted to know, was he supposed to know what it meant?

What got me thinking about all this today was remembering the day, not too many years ago, when my editor called me into his office to complain about a call he'd just taken from somebody blasting him for my use of the word kids. Obviously channeling some didactic schoolmarm, my editor insisted that a kid is nothing but a young goat. I rather think I kept my cool, explaining that using the term kids for children is hundreds of years old and is never misunderstood, while eyeing him with disdain as the dolt that he was.

But the point here, of course, is that words are always going to be a problem.

Monday, September 6, 2010

Bill of Rights

The Bill of Rights is seen by most Americans as an iconic document - a fitting coda to a long and difficult war for independence and the recent ratification of a brand new - and radically modern - Constitution. The document also was viewed at the time as a belated victory for Anti-federalists, who had strenuously opposed that Constitution. It was, however, a bittersweet victory at best.

A quick refresher: Federalists argued for the need for a strong central U.S. government (with checks and balances); Anti-federalists feared that such a government was a sure road to despotism and wanted a much greater sharing of power between the central government and the states, which had a closer relationship to the governed. (I recently watched a 12-lecture series called "The Great Debate: Advocates and Opponents of the American Constitution" by Thomas L. Pangle, professor at the University of Texas at Austin, so I am well aware that the issues were more complicated. But do you want to sit through a rehash of 12 lectures?)

Anyway, Anti-federalists hoped the Bill of Rights would include a healthy worry about big government, a guarantee of greater states' rights, and strong exhortations to civic virtue and religious piety. Their hopes were dashed. Bill of Rights author James Madison made sure the amendments did not dilute government powers, limiting the rights to such basic things as the press and religion. (The convoluted 2nd Amendment was perfectly understood at the time. A major issue was the wisdom of a standing U.S. military. The amendment reassured states that their militias would remain vital to America's defense. Gun control, as we now understand it, simply wasn't on anybody's mind.)

What's interesting to me is not the coolness of the Bill of Rights - which is cool indeed - but the issues that were left out of the document. More than 200 years later, those same issues remain central to much of U.S. politics.

Saturday, September 4, 2010

Remembering Mercy

Certain names from the early history of the United States - call them the Founding Fathers - will remain, indelible, in the American consciousness. But others fade from view as the decades and now centuries go by. That's doubly true of women. History, after all, has long been written mainly by men whose interest in women's accomplishments, especially in the 19th Century, was mainly limited to whether the house was clean and meals were hearty.

This week, for instance, for the first time in more than 40 years I came across the name of Mercy Otis Warren. I had a vague memory that she was an important propagandist against the British and Loyalists prior to the Revolutionary War and had the ear - and the respect - of the likes of Patrick Henry, John Hancock, Thomas Jefferson and George Washington. But that was all. So I looked around.

Warren was born in 1728, married in 1754, had five sons, and died in 1814. An excellent writer, she produced many satiric patriotic poems and plays that gave heart to revolutionaries as the war approached. She was, in fact, the colonies' first female playwright. Late in life, she wrote a three-volume work called "The History of the Rise, Progress, and Termination of the American Revolution" - a history with a definite Jeffersonian viewpoint as she retained her strong Anti-Federalist leanings to the end.

She was a significant factor in American history, and I hope it was only my helter-skelter reading habits that kept her under my radar for so long.

Thursday, September 2, 2010

Cheap books might set you free

Emanuel Haldeman-Julius, the son of a Russian Jewish immigrant bookbinder, was both a representative of the last gasp of the freethought movement as a distinct part of American life and a rags-to-riches story of capitalist success.

Haldeman-Julius (1889-1951) had a dream of providing high-quality literature, political thought, and iconoclastic new writing at prices the masses could afford. He got started in 1919 by offering people 50 books in pamphlet form for $5 in advance. Enough people took him up on it that before long he was printing - at his most prolific - nearly a quarter million books a day.

When I first started reading about him this afternoon in Susan Jacoby's 2004 book "Freethinkers: A History of American Secularism," I thought, "Huh. Here's yet somebody else I've never heard of." Then, a few hours later, my feeble brain gave a rusty click as I remembered that these "little blue books" were the very books that flooded America during the 1920s, 30s, and 40s. I may not have known the name of the publisher, but I certainly knew of the phenomenon. (From J-school? Earlier? Who knows?)

Anyway, Haldeman-Julius, a Democratic Socialist who hated communism and fascism with equal passion, succeeded both in bringing real, thoughtful writing to millions of Americans and in making some big bucks. Even as the freethought movement was splintering into countless separate causes, Haldeman-Julius had been laying an important foundation.