Saturday, April 2, 2011

The End

Today my father, author of this blog, passed away from complications related to lung cancer.

It was only about 1 ½ years ago that I set up this blog for him as a birthday present. He was always impossible to find a gift for, and I was hoping that this might help him fill the void left when he was laid off from his editorial writer job at the paper. It turned out better than I had ever dared hope.

Soon he was writing three to four times a week and was so excited anytime his hit counter incremented. During one of our many phone conversations he even mentioned that it was the best present I had ever given him.

Through this blog I learned things about him I never knew. Sometimes I felt like he was writing just to me, to tell me something he never got around to or did not know how to say in person. But really he was just a great writer and a truly amazing guy who had some entertaining things to say.

I am sure he would want his readers to know he had not abandoned them but instead had been forced into yet another form of retirement. I in turn want to thank everyone who pushed his hit counter up. You cannot know how much enjoyment he got out of writing these posts and thinking that maybe someone out there was enjoying reading them too.

Monday, March 14, 2011

Wishful thinking

I've been suffering from insomnia of late, but I'm fighting back not with pills - no big luna moths latching onto my back - but with a lecture series called "The Modern Intellectual Tradition: From Descartes to Derrida."

I'm only partway into the course - learning about Descartes, Locke, Berkeley, Spinoza, Leibniz, Rousseau, those early guys of "modern" philosophical thought - but as soporifics they fit the bill for a sleepless guy like me. Don't get me wrong. These philosophers remain smarter than 99 percent of everybody in the world today - you could say they invented much of the world of today - but they each suffered a fatal flaw. Their ultimate goal was to prove intellectually, beyond doubt, the existence of a perfect God.

(As your neighborhood far-right-Christian will tell you, it is faith, not the intellect, that has to do the trick.)

Anyway, let's just take Descartes, of "I think, therefore I am" fame. Descartes decided to discover if anything could survive his doubt. He rejected all experience and feelings about the world, noting that it all could be concocted by an evil imp just to fool him. But the mere fact that the imp tried to fool him meant that he had to have a mind. (What else could the imp be trying to fool?) His problem, however, was that he mainly wanted to prove the existence of God the same way, without relying on experience. So Descartes reasoned thus: The fact of his mind's existence implies its creation by an infinite, perfect being. There IS a perfect God! And because science is part of mind, and a perfect God could not be a deceiver, there must be a material world that the scientific method deals with. This "solution" to the mind-body problem has bedeviled philosophers for centuries.

Yikes! I'm supposed to stay awake through this drivel? Who says God isn't the imp? Who says our conception of perfect is anywhere near right? Who says that just as human scientists test lower animals in a deceptive maze, God doesn't do the same with us? In other words, Descartes (and all the philosophers mentioned above) are doing little more than wishful thinking.

As I ponder such things, my head starts to droop, and with tsunamis and plagues slowly circling my mind and my body like so many imps, I fall asleep in my chair.

Saturday, March 12, 2011

A magic hand

The other day I discovered (or rediscovered - how could I have missed such a simple visual illusion?) that I have three hands.

Here's the trick: place your open hands, palms toward you, over your face. Keep your little and ring fingers pressed together on your forehead, but leave your palms apart just enough to make room for your nose. If you're doing it right, you can see past your nose about 45 degrees into a small part of the room to your left with your right eye open, the opposite with your left eye open. (This is how magicians who "blindfold" a subject trick the audience.)

But now open both eyes, and slowly move both hands directly away from the center of your face. There between your hands is a third, misshapen hand. It is skinny enough to look sort of like a foot. But, damn it, it shouldn't be there!

Well, of course it should. It's a normal way for your binocular vision system to handle such a situation.

But viewing my odd, ghostly third hand, I thought of Oliver Sacks's new book, "The Mind's Eye." In it, Sacks offers a series of essays about how normally talented people have coped with strange kinds of vision disruptions. Sacks has been hooking us for years with such tales - as in his "The Man Who Mistook His Wife for a Hat" - but this time he has a further surprise: Sacks himself suffers from a visual malady that makes him unable to recognize people he knows when they are out of their usual context.

Anyway, a reviewer of the book, Israel Rosenfield in the current Harper's magazine, thinks it opens a "deep and surprising" view of reading, perceiving, and understanding. He refers to a man with good vision - a writer and musician - who after a stroke could no longer read words or music, or even, at first, distinguish individual letters or multi-digit numbers. The man eventually learned to distinguish them (but not read his own writing) by sweeping his arm to trace out a letter. The reviewer called it reading by the act of writing ... or by movement.

Rosenfield notes that only animals - not plants - have brains. This is because animals move. Their world keeps changing, and they require a brain to keep track of this. He goes on to suggest that the stroke victim's very movement in tracing the shape of letters tapped into his brain's most basic reason for existence.

I imagine sitting before a group of young, wide-eyed children, showing them their very own third hands, and spinning a tale of its magic abilities. (Yes, I have such daydreams.) I'm not sure how this relates to Oliver Sacks, but it should.

Wednesday, March 9, 2011

The American way

Somewhere back in the golden days shortly before the boom dropped on the economy, my daughter was convinced she should buy a house. She was tired of paying rent. Never mind that she couldn't afford one - buying a house was an investment! Her community could do nothing but grow!

She lived in the house for a few months, until the nasty monthly payments took their toll. Then she was out of her dream house, out of her dream, and out of a whole bunch of money. But the paperwork had been done. The mortgage was off to wherever such mortgages go. I didn't know and didn't much care where that mortgage went. The episode was over.

Of course, what I didn't know at the time was that the mortgage reached Wall Street, was tossed together with a bunch of other bum mortgages into a mysterious bond package and sold as a solid investment. (Rating agencies like Moody's and Standard and Poor's were apparently contemplating their navels.) And then, more magic: those packages were cut up and resold as derivatives, such as the misleadingly named "collateralized debt obligations." When the junk couldn't be repaid, the market bombed, Wall Street bankers and traders pocketed big money, and other Americans paid for the losses, not to mention suffering the monster recession that followed.

Now the Republican House is bent on gutting a new (and weak) reform law. We can't afford it, is the GOP refrain, hiding behind the budget so they don't have to argue that nothing should restrain those rich traders from stealing.

My daughter is renting again. She lost her waitress job when the recession proved too much for her truck-stop employer to survive. I hope she'll ignore most of her rural Western neighbors and refuse to vote Republican in 2012.

Sunday, March 6, 2011

Information overload

Newsweek's March 2 cover was dire indeed: a person whose head was encased in ice. The headline shouted: "Brain Freeze." The subhead: "How the deluge of information paralyzes our ability to make good decisions."

The article - by Sharon Begley, one of the country's better science reporters - said that today, with Twitter and Facebook and countless apps fed into our smart phones, the flow of facts and opinions never stops. And trying to drink from that firehose of information has harmful cognitive effects. We didn't evolve to handle such a rush, and it shows in many ways. For instance, we tend to latch on to the latest news from among the mad rush of data, whether or not it is important. The brain's working memory can only hold so much.

This is a big deal. Begley is a big deal. Making good decisions is a big deal. So why am I grinning?

It probably dates back to an ancient newscast by Tom Brokaw in which he wrapped up a story about a new thing called a "cell phone" with a look at the camera that said, unmistakably, "Who the hell would want to carry a phone around in his pocket?" I remember that look well, because I was wearing it, too.

I've never had a cell phone, smart or otherwise. Apps don't seem to connect with my landline phone. Twitter? Facebook? Huh?

Maybe I'm just being an out-of-step snot. But it turns out I've known for years about information overload. As a writer of short-form text - newspaper editorials and now a blog - I know well the evils of too much data. Every topic groans with different aspects, political slants, trainloads of dross impossible to stuff into a short comment. The solution is what I think of as a clank. Based on what you know about the topic - rather a lot, one hopes - decide what you want to write. Then, clank!, down comes the door on everything else. Wait! There's breaking news! Clank.

And then, when you've written enough, you stop.

Thursday, March 3, 2011


Both times I heard a gun advocate say on the news that we'd be better off if every law-abiding citizen carried a concealed weapon, I thought of Chris Dana.

The advocate let us into his dream world where he imagined pulling his weapon from beneath his coat and wasting bad guys. In his fantasy no one else is hurt, or even in danger. I see him blowing away the last smoke from his barrel and shoving his handgun back behind his belt.

I first heard of Chris Dana after a sudden, shocked silence had descended on our newspaper office. One of our ad reps, a middle-aged woman, had dropped her phone and run out of the building, sobbing that "Chris shot himself."

Dana, 23, was one of those National Guardsmen who returned from Iraq with invisible wounds. Ever since his arrival home in 2005, his family knew something was wrong. They tried to get him help, but there was little to be had. On March 4, 2007, Dana quit his job at Target, cleaned his car, shut himself in his room, pulled a blanket over his head, and shot himself dead with a .22 caliber rifle. Sometime later his dad, Gary, found a letter from the Montana National Guard in a wastebasket, near a Wal-Mart receipt for .22 shells. The letter announced Dana's dismissal from the Guard for skipping drills.

Since then, the Montana Guard has taken many steps to improve its handling of such cases. But today I read a Time magazine article featuring a returned Guardsman, Matthew Magdzas, also 23, of Superior, Wis., who shot not only himself with his 9-mm semiautomatic pistol but his wife, her near-term baby, his 13-month-old daughter, and his three dogs. The magazine said he was one of 113 Guard members to commit suicide in 2010, up 450 percent from 2004.

We'll never know if Dana or Magdzas ever daydreamed about being a hero, killing off bad guys with a cool, NRA nonchalance. But you can bet they came home knowing more about guns than that guy I heard on the national news.

Sunday, February 27, 2011

Foxy pets

Natural selection may be a dirty word for certain evolution deniers, but artificial selection - that is, the domestication and designing of animals (and crops) by people - has at least a 15,000-year history.

Take animals. Wolves were domesticated and turned into dogs 15,000 years ago in Asia. Between about 12,000 and 10,000 years ago, sheep, cows, goats, pigs and cats were domesticated, and the chicken came about 8,000 years ago. Somewhere between 6,000 and 4,500 years ago came the llama and alpaca, the horse, the dromedary and Bactrian camel, and the water buffalo and the yak. About 3,000 years ago in North America, Indians domesticated the turkey.

But people have failed to domesticate any other animal, of all the many thousands of species in the wild, ever since. Their genes won't let it happen. But, starting about 50 years ago, some Russians began domesticating the fox. I know better than to suggest a fox could ever replace a dog as man's best friend. But it just might join them. Early in the program, researchers in Siberia were surprised to find some foxes so doglike that they would leap into a person's arms and lick their faces.

The research isn't aimed at making pets, but at tracing the animals' genes. And it's tough work. Even today, much effort remains to tease apart behavior traits - fear, boldness, passivity and curiosity - into individual genes or sets of genes that actually make domestication possible. (Incidentally, the research involves not only selecting and breeding the "nicest" foxes, but also the meanest. These foxes, kept in a separate compound, would just as soon tear your head off.)

Among the mysteries remaining is how the nice foxes, which no longer would last long in the wild, also got smart. They can engage with people using social cues like pointing and eye movement. But most of all, like dogs, they really LIKE people. It's enough to get some researchers wondering: How did the earliest pre-human hominids get domesticated?

Thursday, February 24, 2011

A singular idea

There's an idea floating around that some find very nearly obvious and others see more as a kind of scientific myth. But unlike the myths of old, this new idea is embraced not by the uneducated masses, but by some of the smartest scientists and engineers of today. The idea: a "singularity" that will change mankind forever is quickly coming our way.

The main man behind the idea is scientific entrepreneur and futurist Ray Kurzweil, who in 2005 published "The Singularity is Near: When Humans Transcend Biology." The basic idea is that technology continues to expand exponentially - on a graph it is quickly shooting toward infinity - and within only decades computers, say, will be as intelligent as (and soon much more intelligent than) humans. This revolutionary new situation will mean the computers can help us exterminate aging - or let us hook up with the machines themselves. Either way, man will become utterly changed, and immortality - and the whole universe - will be his.
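The "expanding exponentially" claim is really just compound doubling, which is why projections a few decades out look so dramatic. Here's a minimal sketch; the two-year doubling period is my own illustrative assumption, not a figure from Kurzweil's book:

```python
def growth(years, doubling_period=2.0):
    """Multiplier on today's capability after `years` of steady doubling."""
    return 2 ** (years / doubling_period)

# Steady doubling turns a few decades into astronomical multipliers -
# exactly the curve that skeptics expect rising costs to flatten.
for y in (10, 20, 30, 40):
    print(f"after {y} years: ~{growth(y):,.0f}x today's capability")
```

Whether the curve keeps climbing, of course, is the whole argument.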

Some scientists and engineers buy this because they, of all people, understand how very fast technology is advancing. They see a merger of technology and intelligence as not only inevitable, but not very far away. Yet many - most - others among their peers scoff. Could there be things happening in our brains that can't be duplicated electronically? By banishing death, would our lives still have meaning?

Chris Edwards, a skeptic and author, has a couple of big doubts. For one, who's to say technology will continue to grow so quickly? With all complexity comes cost, which can stop growth in its tracks, much like progress in flight - from Kitty Hawk to a trip to the moon - has stalled in the face of the costs of flight to other stars. Another doubt calls into question Kurzweil's contention (based on the anthropic principle) that because a singularity has obviously not happened anywhere else, earthlings must be in the lead. But while such a principle might be used to help explain highly unlikely events in the past, it can't be used to confidently predict future unlikely events.

This is not to say that some sort of singularity can't lie somewhere ahead, but that possibility doesn't guarantee that a leap to a whole new kind of humankind is only 20 or 30 years away.

Tuesday, February 22, 2011

Teens and risk

Many adults, if they are honest, will admit they were lucky to get out of their adolescence alive. Or without serving jail time. Or without screwing up their lives in countless other ways. Adolescence is a minefield, but it's a minefield laid by none other than the adolescent himself.

None of this is exactly breaking news, but new studies appear to be tossing an additional piece of the puzzle into the mix.

Studies of adolescent rats show that they are eager to spend less time with their parents and more with other young rats. They want to explore their world and grab what they want, and they are willing to take risks to do so. For instance, a "teen" rat (about seven weeks old) who wants squirts of sweetened condensed milk will press a lever far more times than younger or older rats - even though they are paying way too much in energy expended for the squirts.

Similarly, human teens facing card games or other tests in the lab will take ridiculous risks trying to win. Now, it happens that our reward circuitry in the brain has separate systems for trying to win and for assessing risks. Teens' thinking about risk may lag behind their ability to think about rewards.

This evolutionary goad to help get teens out of the nest may be backfiring in modern times, when risks such as fast driving, drug use and unsafe sex are a lot more dangerous than risks used to be. Mother Nature may be out of sync with today's real world.

Sunday, February 20, 2011

Lady, you NEED more shoes

As many neuroscientists dream about brain-wave control of machines to not only let people rise out of their wheelchairs but eventually lead to mind melds and thought downloads, others of us worry about mind control. If we can interact mentally with machines, how will machines (and their evil masters) be able to mess with our minds?

Well, we've got time to worry about it. Unless, that is, we're worms.

A team at Harvard University has built a computerized system to manipulate the behavior of worms by stimulating their neurons individually with laser light. The team can make the worms start and stop, give them the sensation of being touched, and even make them lay eggs, all while the worms are swimming freely in a Petri dish.

The researchers want to understand how neurons work together, eventually learning how to help people with neurological problems. But all I can think of is worms wiggling lock-step into a wormy shopping mall, buying whatever their masters decree. Sure, worms aren't all that bright. Their brains are smaller than a grain of sand. But people are so much smarter?

Saturday, February 19, 2011

Bison attack!

Most Americans love bison - they call them buffalo - from afar. But there are plenty who trek to Yellowstone National Park each year. The visitors may or may not see a bear or a moose, but bison are sure to be enjoyed - provided people don't do the dumb tourist thing and walk up close to a bison for a better picture, only to get the potentially fatal sharp-horned heave-ho.

Now a scientist suggests bison may also have given the final heave-ho to most of America's now-extinct big mammals. Beavers the size of bears, mammoths, horses, camels and saber-toothed cats all were gone by 11,000 years ago. The main cause was climate change that reduced the food and water supply (human hunters may have helped, too) but, suggests Eric Scott, curator of paleontology at the San Bernardino County Museum in Redlands, Calif., it was the ever-growing bison herds that may have forced a "tipping point" for the doomed species.

Bison would have had advantages over other large herbivores, such as their multiple stomachs that probably allowed them to obtain maximum nutrition from their food. Their population growth since migrating from Asia may have malnourished nursing mothers of other big species just enough to cause their numbers to collapse. And, with far fewer large herbivores to feed on, dire wolves, lions and other big carnivores would have starved as well.

It wasn't until Plains Indians obtained horses - and Buffalo Bill and his fellows went to work - that bison too finally faced the threat of extinction.

Thursday, February 17, 2011

Pushing my buttons

"I know cell phones aren't likely to give anybody brain cancer," the man said. But, he added darkly, "I expect waves of skin cancer to break out any time now." (A quote I read or heard somewhere recently.)

Aargh! I know, the cell-phone/cancer thing pushes my buttons. It's a stand-in for all the dumbness out there. After all, the Legislature is in session here in Montana's capital city. I'm lucky if I don't go spurting off like a venting balloon, ending up somewhere in a wrinkled heap.

Anyway, here's yet another way to look at microwaves. All electromagnetic waves are made of photons, which span a gigantic range of energy. It's not until they reach the microwave energy that they have any impact at all. In a cell phone, all microwaves could do is warm you up. But that wouldn't matter, because by then you'd long since be dead and buried.

Way at the far end of the visible spectrum, nearly into the invisible ultraviolet, photons can spark skin cancer. That's why you wear sun block at the beach. Photons at this energy level are a million times more energetic than microwaves, yet they still can't get below the skin.
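A back-of-the-envelope check with Planck's relation, E = hf, bears out the comparison. The frequencies below are my own illustrative picks - roughly a cell-phone band and the violet/near-UV edge of visible light - and the ratio lands within shouting distance of "a million times":

```python
PLANCK_H = 6.626e-34  # Planck's constant, in joule-seconds

def photon_energy(freq_hz):
    """Energy of a single photon: E = h * f, in joules."""
    return PLANCK_H * freq_hz

cell = photon_energy(1.9e9)    # ~1.9 GHz, a typical cell-phone frequency
uv = photon_energy(7.5e14)     # violet/near-UV edge of visible light
print(f"cell-phone photon: {cell:.2e} J")
print(f"near-UV photon:    {uv:.2e} J")
print(f"ratio: roughly {uv / cell:,.0f} to 1")
```

The ratio comes out to a few hundred thousand, which is the point: even photons energetic enough to cause skin cancer can't get below the skin, and cell-phone photons carry vastly less energy still.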

So, next time you read about microwave radiation causing cancer, just keep it away from me, OK?

Sunday, February 13, 2011

Health reform and the Constitution

Regardless of where their political leanings take them, many Americans still wonder whether the new health reform bill is or is not constitutional. Does the "individual mandate," the part of the plan that penalizes people who could pay for health insurance but don't buy it, somehow make the whole bill unconstitutional? Is it beyond the government's power to make people buy health insurance?

According to a recent New York Review article by David Cole, a law professor at Georgetown, that's easy. He says precedents over the past 70 years make it clear that the Commerce Clause and the Necessary and Proper Clause give the government full power to regulate all salient aspects of the insurance business - including taxing free riders who duck paying their share to the detriment of the whole system.

Congress certainly can tax to provide health insurance - it does so already through Medicare and Medicaid. Likewise, Congress has ample authority to enact the individual mandate. Absent a return to a constitutional jurisprudence that has been rejected for more than 70 years, Cole says, "the individual mandate is plainly constitutional."

(Incidentally, I recently learned that such a mandate has been a central part of Republican health-care thinking since 1991 - a response to the dreaded "single-payer" reform favored by liberals. Only after the GOP failed to stop Obama's reform bill did they decide to try to convince the courts it was a new and unconstitutional regulation.)

Thursday, February 10, 2011

Living in a soap

I got hooked on the television series Mad Men a couple of years ago after buying DVD sets of the first two seasons. (I'd walk a mile to avoid a TV commercial.) I later bought the third season, and await the fourth, due March 29.

So I eagerly read a review of the series about advertising men in the early 1960s by Daniel Mendelsohn, an essayist and teacher at Bard. Yikes. His New York Review article, going against most critics, trashes the series as a mere "soap opera" rather than a thoughtful drama that attempts to say something about the human condition.

But, confesses Mendelsohn, he's hooked on Mad Men, too. How can that be, he wonders.

I like his answer. The basic attraction of Mad Men, he says, is not the period accuracy or the titillation of bygone mores, but the quiet bewilderment of the kids on the show. That makes a soap opera - not a drama that explores the adult characters' actions - just what these children are seeing and trying to understand as they grow up. And the kids of the early 1960s are none other than the 40- and 50-year-olds who watch the show with such fascination today.

I would only add that kids' puzzlement is not limited to one generation. Sometime in 2050 or so, a similar show might well air, attracting people who were growing up around 2011. They, like the children of Don Draper in the 60s, also are growing up trying to make sense out of a senseless soap opera.

Monday, February 7, 2011

Warming and life

Not too long ago we visited "Snowball Earth," a period when most of the planet's land was covered by glaciers between 750 and 635 million years ago, and we concentrated on how it was a kind of runaway global warming that finally brought the icy era to an end.

But the warming may have done a lot more than that. By melting the glaciers, it may have kick-started life as we know it.

The line between the Precambrian and Cambrian eras, about 600 million years ago, is marked by what is called the "Cambrian Explosion" - suddenly, all over the world, the fossils of countless brand-new creatures began showing up. And, in a mere 85 million years or so, animals evolved and radiated over much of the world's land and oceans. What made this possible?

Scientists have long suspected that the melting of the glaciers, which had dragged along nutrients scraped from the Earth's surface and now was releasing them into the sea, had provided nutrients like phosphorus to spur a huge algae growth. The algae, in turn, would have given off enough oxygen to allow the quick spread of air-breathing animals.

But for years, there was little proof of this. Now, however, scientists have figured out how to use iron-rich deposits from ancient, low-oxygen oceans to estimate how much phosphorus was in the water. (Iron scavenges phosphorus in a predictable way.) It turns out phosphorus spiked at just the right time.

Gosh ... yet another reason to thank at least one episode of global warming.

Friday, February 4, 2011

Jefferson's moose

Thomas Jefferson was pissed. He wanted a moose. He wanted it big. And he wanted it dead.

OK, enough already with the cheapo narrative tricks. All that stuff in the first paragraph is true, but it hypes a story that's not all that exciting. However, it's a good bet that many people have never heard the story of Jefferson's moose.

The tale begins in the late 1700s, when Georges-Louis Leclerc (known as Count Buffon, the most influential natural scientist of his century) published his 36-volume "Natural History." In it, Buffon said America was a degenerate place, as shown by its weak and stunted flora, fauna, and people.

Jefferson, fearing such an impression could impede the economic and cultural maturation of the U.S., wanted to bring Buffon the bones of a huge moose, hoping its size would convince him to drop his degeneracy theory.

Before Jefferson left to be ambassador to France, he wrote to friends pleading for hunters to procure for him the skeleton of a giant moose. In France, meanwhile, Jefferson met with Buffon, telling him that a European "reindeer could walk under the belly of our moose." He left the meeting with the impression that Buffon would change his mind if he could see such a creature.

Finally, after many delays, Jefferson received the moose bones. Here the story kind of peters out. Jefferson did show Buffon the moose and felt he was suitably impressed, but Buffon was ill and died before he could change his book.

The author of the Scientific American article I've used for this - historian Lee Dugatkin - said that some form of the idea of degeneracy in the New World lasted "for at least another six decades before withering and leaving only a dried husk of general anti-Americanism."

Wednesday, February 2, 2011

Mandatory vaccines?

Back in 1998 British doctor Andrew Wakefield published a study in The Lancet claiming to have found a link between the measles-mumps-rubella vaccine and autism. In Britain and the U.S. vaccination of children dropped sharply, and the incidence of preventable diseases has exploded.

It was all a hoax. No one has been able to replicate Wakefield's findings, and the report has been retracted by The Lancet. Wakefield has been stripped of the right to practice medicine in Britain, and recent reports call him a fraud who altered data and took more than half a million dollars from a law firm planning to sue vaccine makers. He also allegedly was involved in schemes to profit by offering services and analysis to fearful parents.

Despite all the evidence, the damage he has caused may be long-lasting. Many parents are convinced vaccines are dangerous and show no interest in changing their minds. This is a serious threat to the nation's health.

The success of vaccines depends on high immunization rates - up to 95 percent in some cases - to protect people who are not immune. Not taking vaccines endangers not only a misguided parent's own children, but those too young to be vaccinated, those whose immune systems are compromised, and even vaccinated youngsters, because all vaccines fail to protect a certain percentage of people.
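That "up to 95 percent" figure falls out of simple herd-immunity arithmetic: if one case infects R0 others in a fully susceptible population, outbreaks fizzle once more than 1 - 1/R0 of people are immune. The R0 values below are commonly cited textbook ranges, not numbers from this post:

```python
def herd_immunity_threshold(r0):
    """Fraction of the population that must be immune: 1 - 1/R0."""
    return 1.0 - 1.0 / r0

# Highly contagious diseases like measles push the threshold into the 90s,
# which is why even a modest dip in vaccination rates invites outbreaks.
for disease, r0 in [("measles", 15.0), ("pertussis", 14.0), ("polio", 6.0)]:
    print(f"{disease}: R0 ~ {r0:.0f} -> {herd_immunity_threshold(r0):.0%} must be immune")
```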

Today, all but two states (Mississippi and West Virginia) allow parents to opt out of vaccinating their kids for religious or philosophical reasons. That means 48 state legislatures need to get on the stick and make vaccination mandatory. Parents' rights end where they become a proven threat to the public welfare.

Monday, January 31, 2011

Football worries

When the Green Bay Packers take the Super Bowl field Sunday, you can bet I'll be in front of the TV, tuned to the dreaded Fox, ready to root them on. They may have taken the boy out of Cheese Head Land, but put the Pack into the championship, and the youthful thrill returns.

Still, in the back of my mind, worries already are growing. Packer quarterback Aaron Rodgers already has suffered two concussions this season. Will this be the game that sets Rodgers on the road to chronic traumatic encephalopathy or even Alzheimer's?

Rodgers, in true football fashion, has asserted that he won't alter his scrambling style in hopes of avoiding another concussion. But brave talk can't change a growing undercurrent of concern that head hits - from under the Friday night lights to Sunday afternoon extravaganzas - may be signaling the end of football as we know it.

So far, studies offer plenty of cause for worry. But much remains unproved. For instance, scientists don't even know if 50 minor head hits are as dangerous as two or three concussions. But it is clear that college football players suffer over 1,000 such hits a year in games and practice.

My boyhood conviction that the Packers are My team is back this year. But I don't want my players turning into punch-drunk losers in life.

Saturday, January 29, 2011

Saying goodbye to helium

Looking for more to worry about? Well, how about helium? The world is running out of it.

There's something absurdly ironic about this. Throughout the universe, helium accounts for 24 percent of the mass of the elements. It was created during the first three minutes of the Big Bang, when it still was hot enough to act as a cosmic furnace synthesizing the simplest of atomic nuclei. Our sun is stuffed with the stuff. So are Jupiter and the other gas giants.

But the Earth is too small. Practically all helium free in the atmosphere heads up and away, straight into space. That's why it is rare. In fact, the element was discovered in the sun in 1868, before it was found on Earth.

It turns out that helium is found trapped in natural gas that has formed around radioactive decay. (Helium is a decay product.) It was mostly produced in the U.S., on the Great Plains.

Now, according to a National Geographic article, the National Research Council says we're running out. The magazine said the U.S. began stockpiling helium in 1960 but later decided to sell it off. When it is gone, most production will be in Russia, Algeria and Qatar. And that could be just 40 years worth.

That's bad news. Helium is crucial for cooling things like MRI scanners, purging rocket engines, and much more.

But worst of all: Can you imagine a kid's birthday party without helium balloons?

Wednesday, January 26, 2011

Why beer batter is better

It is time for something light and fluffy ... something like beer batter.

My late father-in-law was a good guy who was raised on a farm, had no education after high school, and was a swing-shift laborer at a tire factory for most of his life. But when it came to finding, catching and cooking walleyes, he was a genius.

He made his own beer batter - alternately swigging and pouring in search of perfection - before dipping the boneless fillets and gently dropping them into the hot oil.
The result was a kind of bliss. The fish tasted better than a fish has any right to taste.

So what was the secret? Now I know.

It turns out that it really is the beer. The beverage is saturated with carbon dioxide, which (unlike salt or sugar) doesn't dissolve well in hot liquids. Instead, it releases bubbles that expand the batter mix and give it that lacy, crisp texture.

But if the bubbles just flew off, like champagne bubbles, they wouldn't do much good. Beer, however, has foaming agents that not only give a glass of beer its head, but keep the bubbles in the batter. The foam also insulates the meat so it can cook gently while the batter turns golden brown.

The alcohol helps, too. It evaporates faster than batter made of water or milk, so it doesn't have to cook as long. And the faster the batter dries, the lower the risk of overcooking the food.

I don't know whether or not my father-in-law knew any of this. He'd never been to cooking school. But, being a genius, maybe he did.

Monday, January 24, 2011

Time to redefine 'universe?'

I can't read anything by physicist Steven Weinberg without learning something (albeit in a limited, popular-science kind of way). In a review of a 2010 book called "The Grand Design" by Stephen Hawking, Weinberg notes that there is a new and startling necessity for "fine tuning" of the universe to allow for life.

It has to do with dark energy, the energy of "empty" space that is driving the accelerating expansion of the universe. The problem is that, via quantum mechanical calculations, dark energy should be so powerful that the expansion would prevent galaxies, stars and planets from forming. Obviously, this is not the case, and other factors must cancel out almost all of that speed. The thing is, those other factors - not well enough understood to calculate - must be fine-tuned to about 56 decimal places.

That's a lot of fine tuning! But if we live in a "multiverse" where a gazillion different universes exist, only a tiny fraction of which could support life, the fine tuning would be moot. A universe like ours could exist - would almost have to exist - along with all those other universes with entirely different laws of nature. And it is no big whoop that we happen to live in a good one.

The multiverse idea, much discussed among physicists these days, remains speculative, but multiple lines of scientific thought support it.

I've been interested in this stuff since I was about 10 and read George Gamow's book explaining general relativity. I wish I could rewind back to 10 years of age ... just to keep up on whatever unfolds next.

Saturday, January 22, 2011

Language and thought

Scientific paradigms come and go. An idea will emerge and flower, often to be overshadowed by a newer, better idea, and find itself left to wither and die. But sometimes, rarely, the old idea stirs in its grave, threatening to rise from the dead.

A case in point is the idea by linguists Edward Sapir and Benjamin Lee Whorf that a language may determine how its speakers are able to think. Their most famous example came from studying the Hopi language, which they said lacked many markers for past, present and future, leading them to speculate that Hopis don't think of time in the way that we do.

The Sapir-Whorf hypothesis gained a big following in the 1930s and '40s, but by the 1970s it was all but abandoned. The problem? A near-complete lack of evidence to support the claims. And, it turned out, Sapir and Whorf didn't know enough Hopi to realize that the language handled tense in other ways. Linguists came up with a new idea: thought is universal.

Enter an article in the current Scientific American by Lera Boroditsky, an assistant professor at Stanford and editor in chief of "Frontiers in Cultural Psychology." Now, she said, researchers have the evidence.

She's armed with plenty of examples, such as the little girl from an Australian Aboriginal tribe whose language has no words for "left" and "right," only absolute cardinal directions. The girl can instantly point due north from anywhere (you try that away from home). Or the Amazonian language that lacks numbers and has only words for "few" and "many."

Examples abound, and although more work is needed, the Sapir-Whorf hypothesis seems intuitively correct. But here's a self-test: If you were told that "grue" is a color halfway between green and blue, would you start noticing the color grue?

Thursday, January 20, 2011

A sad tale

A transit of Venus was a really big deal in astronomical circles during the last few centuries - timing the planet's apparent passage across the surface of the sun was the only way to get a better estimate of the Earth's distance from its very own star.

Transits visible from the Earth are rare. They occur in an odd pattern: eight years apart, then 121.5 years, then eight years, then 105.5 years. (The next one happens in 2012, then we wait more than a century.)

Usually we read about successful measurements. Here's the other side of the coin.

Guillaume Le Gentil (Guillaume Joseph Hyacinthe Jean-Baptiste Le Gentil de la Galaisière to his friends) was born in 1725 in Coutances, a city in the northwest part of France near the English Channel. Le Gentil trained for the church, but became fascinated by astronomy. He did well, and discovered many Messier objects (galaxies, star clusters, etc., that looked like blobs in contemporary telescopes).

As the 1761 transit of Venus approached, he was one of many astronomers who scattered all over the world to view the event. Le Gentil headed for India, but conflict with the British and other problems delayed him. He ended up watching the transit from the rolling deck of a ship, making accurate measurements impossible.

Then he made the fateful choice to stick around exploring the region until the next transit in 1769. When the big day arrived, clouds obscured his view. Rats.

Dejected, he headed home, only to run into more delays - storms, dysentery, you name it. When he finally got back to France, he discovered he had been declared dead. His wife had remarried, and his relatives had divided up his property.

(There is a happier ending. He remarried, regained his property via lawsuits (and some help from the king) and got his job back. He died at 67.)

Tuesday, January 18, 2011

That old-time constitution

Jill Lepore, a New Yorker staff writer and professor of history at Harvard, began her recent piece on the U.S. Constitution by quoting Benjamin Franklin urging delegates to sign the document. "I confess that there are several parts of this constitution which I do not at present approve," he said. But he hoped that all delegates with reservations "would with me, on this occasion, doubt a little of his own infallibility, and to make manifest our unanimity, put his name to this instrument."

Lepore was contrasting Franklin's open-mindedness with "originalism," the idea that judges must interpret the constitution solely by determining the Founding Fathers' intent. This is nothing new. In 1916, conservatives blasted Woodrow Wilson's concept of a "living constitution." In 1921 Warren Harding called the constitution "divinely inspired." Then along came the New Deal and, a little later, civil rights legislation to further stir the conservative pot.

The fight goes on. After Ronald Reagan nominated Antonin Scalia to the Supreme Court, Thurgood Marshall was moved to comment: "I do not believe that the meaning of the constitution was forever fixed at the Philadelphia Convention."

Liberal legal scholars are quick to note that the writers of the constitution are long dead. And even if they could be brought to life, they would be nothing like us. Columbia law professor Jamal Greene wonders how rote obedience to views more than 200 years old, in a time of "wildly different racial, ethnic, sexual, and cultural composition," can be justified on democratic grounds.

If the Founding Fathers were revived, they would understand so little and be baffled by so much. Just their views on the place of women and blacks would render them pariahs.

Originalism will march on in the hearts of many conservatives. And remember: For many of them, their stance is no great leap. After all, many are pretty sure the Bible is infallible, too.

Sunday, January 16, 2011

An icy lesson

It didn't take a kid growing up in northwest Wisconsin long to realize that, pleasant as the respite from the cold might be, a January Thaw was bad news. We may have been throwing snow into higher and higher piles for a couple of months, but it was snow. Ice was a different story.

Icy, dangerous sidewalks. Icy, slippery roads, with the ice often of the no-see-um black variety. A January Thaw never lasted long, and the people were glad. Let new snow bury all that ice.

Saturday morning, as I backed out of my garage into the alley, ice was far from my mind. That changed in a hurry when my futilely spinning tires sang: "Abandon all hope." Before long I managed to end up crossways across my alley, my tires trapped in icy ruts made all the more slippery by the film of water freed by the miserable 45-degree temperature.

Abandoning pride as well as hope, I called Helena Towing Service. The guy said he could help, but warned it probably would cost at least $75. Had he been within reach, I would have answered with a hug.

He arrived 45 minutes later with a big bag of something called FloorDry, made for cleaning up machine-shop spills. He had me turn the wheels right and left as he spread the stuff, and suddenly I could back into a neighbor's yard, pull into the alley (heading downhill this time!) and scoot off to run my errand.

It was an expensive lesson, but I learned it well: FloorDry. Never back out of your garage without it.

Thursday, January 13, 2011

A pipe dream

It probably is a pipe dream of the highest order to even hope that current talk of greater inter-party cooperation in Congress might lead to reform of that body's most noxious cancer - filibusters.

Before the new Congress convened this year, all of the returning Democratic senators urged Majority Leader Harry Reid to do something about filibusters (and about "holds," by which an anonymous legislator can block a nomination without giving a reason). Few paid any attention. But now ... after Tucson?

Let's dream on for a moment. After all, it was filibusters and holds that kept the healthcare law from including a public alternative to private insurance, kept financial reform from being able to prevent another meltdown, and kept nearly 200 executive and judicial nominees in limbo. While totally abolishing filibusters is beyond even dreams, what if there were new rules that required filibusters to actually be conducted, not just threatened? What if a rule would limit them to final passage of a bill, not every little stage a bill goes through? Such things would go a long way toward meaningful reform ... and cooperation.

Hey, I can dream. Would you pass the opium?

Tuesday, January 11, 2011

Out of it in TV land

Probably because of all my years as a print journalist, I never formed the habit of watching the local TV news. After all, these people didn't have the staff or the airtime to cover much of anything. And, unlike bigger cities, small-town Montana seldom had fires or bleeding with which to lead the broadcast.

But about a month ago I re-caught the Jeopardy bug. "THIS is Jeopardy!" finds me moving to the front of my seat, ready (or not) for the challenge.

Jeopardy is broadcast on a Great Falls station, Channel 9, just after the local news. But while Great Falls is 90 miles to the north, I discovered that the local news is HELENA local news! (It's the same on the Butte station, 65 winding mountain miles to the south.) Have people in Great Falls and Butte suddenly developed a craving for news from the capital city? I rather doubted it.

So I did some digging. It didn't take long. It turns out the new reality dates back to the 1996 Telecommunications Act, which required planning to begin on changing TV stations from analog to digital. It's taken a while, but you might remember those scary warnings that as of June 12, 2009, your analog TV is junk without a conversion box. One of the purposes of the act was to free up broadcast spectrum space, and analog takes far more space than digital does. Now, broadcasters have room to play. That's why those mini-channels are popping up.

I must report that local TV news hasn't improved much, wherever it comes from. But I finally took a close look at the Channel 9 logo. On my TV, it says Channel 9.1.

Sunday, January 9, 2011

Doing time at a traffic light

I drove southwest on Helena Avenue through falling snow, pulling to a stop behind two other cars at Malfunction Junction - a five-legged monster further complicated (the way I was going) by yet another traffic light just a few hundred feet ahead and quick to turn red.

I was sitting at the longest red light in town. I knew this because, about a quarter century ago, I wrote a feature story about Helena's traffic signals. Settling in for at least a two-minute wait, I began recalling some history.

The first changing traffic signal arrived years before cars came along. In London in 1868, red semaphore arms and a red gaslight told carriage drivers to stop. Green meant caution - go ahead carefully. In 1912 Salt Lake City installed traffic lights for its streets - designed to be wide enough for a long mule train to turn around - but each signal required an operator. Two years later Cleveland installed the first electric lights.

By 1921 Detroit had the first automated, four-way, three-light signals, installed on towers in the middle of the street. But the Detroit police should be more famous for another feat. In 1915 some genius there invented the stop sign.

Whoa - the cars in front of me were pulling out. But as they chugged along up the snow-covered little hill, hope of making the next light dimmed. But maybe ... maybe ... rats!

Thursday, January 6, 2011

Let's walk for sex!

If you want to watch a paleoanthropologist blow a gasket, just ask him about the "missing link."

"The term is wrong in so many ways, it's hard to know where to begin," said Tim
White of the University of California, Berkeley. "Worst of all is the implication that at some point there existed something halfway between a chimp and a human. That's a popular misconception that has plagued evolutionary thought from the beginning, and one Ardi should bury, once and for all."

"Ardi" - short for Ardipithecus ramidus - is the 4.4 million year old adult female fossil White and his team found about 15 years ago that may be the oldest hominid on the books. It's more than a million years older than the famous "Lucy," which was 3.2 million years old. (It certainly is NOT a link to the far-off creature that was an ancestor to both apes and humans.)

But what's ironic is that nowadays, Ardi may be just what many people have in mind when they think about a missing link.

Ardi is a curious mix of ancient traits far older than the apes of its time and modern hominid traits that lead straight toward humans. For instance, take Ardi's foot. It allowed the creature to walk upright, if not easily, and walking upright is the main definition of a hominid. The foot is built much like Lucy's - who strolled with ease - but it also had a big toe that stuck out to the side - great for climbing trees.

And then there's Ardi's pelvis, which National Geographic author Jamie Shreeve called a perfect example of a "primitive primate caught in the act of becoming human." Lucy had hips made for walking; chimps have hips made for climbing (they walk with a huge lurch); but Ardi's hips are (dare I say it?) half and half.

So why walk at all? Some think it had to do with sex. Apes have giant, sharp canines to fight off other males for mates. Maybe Ardi's menfolk, with their small canines, made sex-for-food deals with females and had to have hands free to carry the food home.

Now, there's the beginning of a human trait!

Tuesday, January 4, 2011

History and memory

An essay reviewing "The Whites of their Eyes: The Tea Party's Revolution and the Battle Over American History" by Jill Lepore opened my own eyes a little. (Apparently they only open so far.)

Gordon S. Wood, professor emeritus at Brown, wrote that Lepore came down a bit too hard on the Tea Party, which is only doing what Americans (and people around the world) have been doing forever - taking inspiration from THEIR version of history. That version says nothing of slavery, the subjugation of women, and all that. It's all about white people in white wigs fighting against taxation and for freedom on the individual level.

But the Right is hardly the only side to exploit history for its own uses. For example, when tried in 1970 for blocking a military base, the radical historian Howard Zinn told the court he was acting "in the grand tradition of the Boston Tea Party."

Freethinkers love to point out that Jefferson was a deist and that the founders deliberately excluded God from the Constitution. Unfazed, the fundamentalist Right praises God that the United States has been a Christian nation from the very beginning. Never mind that either point of view leaves out a whole lot of history.

Wood says there is a big difference between the job of professional historians - to dispel myths - and the popular memory that acts as a touchstone for people's beliefs. For instance, a 1996 biography of Sojourner Truth correctly pointed out that she never said, "A'n't I a woman?" Fans of the woman hated the book. They felt blindsided by the debunking of a beloved myth.

Many think that kind of collective memory, true or not, is essential for fostering community, identity, and continuity. At the least, it deserves respect for its role in the understanding of the emotional lives of a people.

Sunday, January 2, 2011

Be happy

If love is a mystery, happiness is a conundrum. What is it? What causes it? Why do some people have it, while others don't?

Many of us know people - a family subsisting on food stamps, a cheerful waitress earning the minimum wage - who seem to be among the happiest folks in town. We also know other people, much better off financially, who never seem happy at all.

Here's another question: How much should we value happiness? A thought experiment by the philosopher Robert Nozick imagines a machine that can give you any experience you want by stimulating your brain (while the real you floats in a tank). You think and feel that you are accomplishing great things, or winning the love of your life. You could preprogram enough such wonderful experiences to last your whole lifetime. You would experience lifelong bliss, never knowing you were hooked up to a machine. Would it be right to say that you had a happy life?

Social scientists have long conducted surveys on happiness. One consistent finding is that, beyond a modest level of sufficiency, people's reports on their own happiness aren't strongly correlated with income. This suggests that people tend to find happiness in their actual circumstances, as long as they aren't too dire. Another finding, unwelcome to liberals, is that economic inequality has little bearing on happiness. Nor does the amount a nation spends on social welfare programs.

Maybe the U.S. should focus less on economic inequality in general and more on specific causes of unhappiness - things like inadequate medical protection, chronic pain, and depression.

Or maybe we're just stuck with singing along with Bobby McFerrin: "Don't Worry, Be Happy."