Sunday, February 27, 2011

Foxy pets

Natural selection may be a dirty word for certain evolution deniers, but artificial selection - that is, the domestication and designing of animals (and crops) by people - has at least a 15,000-year history.

Take animals. Wolves were domesticated and turned into dogs 15,000 years ago in Asia. Between about 12,000 and 10,000 years ago, sheep, cows, goats, pigs and cats were domesticated, and the chicken came about 8,000 years ago. Somewhere between 6,000 and 4,500 years ago came the llama and alpaca, the horse, the dromedary and Bactrian camel, and the water buffalo and the yak. About 3,000 years ago in North America, Indians domesticated the turkey.

But in all the millennia since, people have failed to domesticate a single additional animal out of the many thousands of species in the wild - the animals' genes won't let it happen. Then, starting about 50 years ago, some Russians began domesticating the fox. I know better than to suggest a fox could ever replace dogs as man's best friend. But it just might join them. Early in the program, researchers in Siberia were surprised to find some foxes so doglike that they would leap into a person's arms and lick their faces.

The research isn't aimed at making pets, but at tracing the animals' genes. And it's tough work. Even today, much effort remains to tease apart behavior traits - fear, boldness, passivity and curiosity - into individual genes or sets of genes that actually make domestication possible. (Incidentally, the research involves not only selecting and breeding the "nicest" foxes, but also the meanest. These foxes, kept in a separate compound, would just as soon tear your head off.)

Among the mysteries remaining is how the nice foxes, which would no longer last long in the wild, also got smart. They can engage with people using social cues like pointing and eye movement. But most of all, like dogs, they really LIKE people. It's enough to get some researchers wondering: How did the earliest pre-human hominids get domesticated?

Thursday, February 24, 2011

A singular idea

There's an idea floating around that some find very nearly obvious and others see more as a kind of scientific myth. But unlike the myths of old, this new idea is embraced not by the uneducated masses, but by some of the smartest scientists and engineers of today. The idea: a "singularity" that will change mankind forever is quickly coming our way.

The main man behind the idea is scientific entrepreneur and futurist Ray Kurzweil, who in 2005 published "The Singularity Is Near: When Humans Transcend Biology." The basic idea is that technology continues to expand exponentially - on a graph, the curve shoots toward infinity - and within only decades computers, say, will be as intelligent as (and soon much more intelligent than) humans. This revolutionary new situation will mean the computers can help us eliminate aging - or let us hook up with the machines themselves. Either way, man will become utterly changed, and immortality - and the whole universe - will be his.
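For the curious, here's a back-of-the-envelope sketch in Python of why the exponential does all the work in the argument. The numbers are placeholder assumptions of my own choosing - a capacity that doubles every two years and a brain pegged at a commonly cited rough figure of 10^16 operations per second - not figures from Kurzweil's book:

    # Toy model of exponential growth in computing capacity.
    # All numbers are illustrative assumptions, not Kurzweil's own:
    # capacity doubles every 2 years; a human brain is pegged at a
    # commonly cited rough estimate of ~1e16 operations per second.

    BRAIN_OPS = 1e16        # assumed brain-equivalent ops/second
    capacity = 1e12         # assumed starting machine capacity (ops/second)
    DOUBLING_YEARS = 2      # assumed doubling time

    years = 0
    while capacity < BRAIN_OPS:
        capacity *= 2
        years += DOUBLING_YEARS

    print(f"Brain-scale capacity in about {years} years")  # -> about 28 years
    # Under steady doubling, even a 10,000-fold gap closes in decades.
    # The catch, as the skeptics below point out, is whether the
    # doubling actually continues.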

Some scientists and engineers buy this because they, of all people, understand how very fast technology is advancing. They see a merger of technology and intelligence as not only inevitable, but not very far away. Yet many - most - others among their peers scoff. Could there be things happening in our brains that can't be duplicated electronically? By banishing death, would our lives still have meaning?

Chris Edwards, a skeptic and author, has a couple of big doubts. For one, who's to say technology will continue to grow so quickly? With complexity comes cost, which can stop growth in its tracks - much as progress in flight, from Kitty Hawk to a trip to the moon, has stalled in the face of the cost of flying to other stars. Another doubt calls into question Kurzweil's contention (based on the anthropic principle) that because a singularity has obviously not happened anywhere else, earthlings must be in the lead. While such a principle might help explain highly unlikely events in the past, it can't be used to confidently predict unlikely events in the future.

This is not to say that some sort of singularity can't await us somewhere ahead - only that the possibility doesn't guarantee a leap to a whole new kind of humankind is just 20 or 30 years away.

Tuesday, February 22, 2011

Teens and risk

Many adults, if they are honest, will admit they were lucky to get out of their adolescence alive. Or without serving jail time. Or without screwing up their lives in countless other ways. Adolescence is a minefield, but it's a minefield laid by none other than the adolescent himself.

None of this is exactly breaking news, but new studies appear to be tossing an additional piece of the puzzle into the mix.

Studies of adolescent rats show that they are eager to spend less time with their parents and more with other young rats. They want to explore their world and grab what they want, and they are willing to take risks to do so. For instance, a "teen" rat (about seven weeks old) that wants squirts of sweetened condensed milk will press a lever far more times than younger or older rats will - even though it is paying way too much in energy for each squirt.

Similarly, human teens facing card games or other tests in the lab will take ridiculous risks trying to win. Now, it happens that the brain's reward circuitry has separate systems for pursuing wins and for assessing risks. Teens' thinking about risk may lag behind their ability to think about rewards.

This evolutionary goad to help get teens out of the nest may be backfiring in modern times, when risks such as fast driving, drug use and unsafe sex are a lot more dangerous than risks used to be. Mother Nature may be out of sync with today's real world.

Sunday, February 20, 2011

Lady, you NEED more shoes

As many neuroscientists dream about brain-wave control of machines to not only let people rise out of their wheelchairs but eventually lead to mind melds and thought downloads, others of us worry about mind control. If we can interact mentally with machines, how will machines (and their evil masters) be able to mess with our minds?

Well, we've got time to worry about it. Unless, that is, we're worms.

A team at Harvard University has built a computerized system to manipulate the behavior of worms by stimulating their neurons individually with laser light. The team can make the worms start and stop, give them the sensation of being touched, and even make them lay eggs, all while the worms are swimming freely in a Petri dish.

The researchers want to understand how neurons work together, eventually learning how to help people with neurological problems. But all I can think of is worms wiggling in lockstep into a wormy shopping mall, buying whatever their masters decree. Sure, worms aren't all that bright - their brains are smaller than a grain of sand. But are people really so much smarter?

Saturday, February 19, 2011

Bison attack!

Most Americans love bison - they call them buffalo - from afar. But there are plenty who trek to Yellowstone National Park each year. The visitors may or may not see a bear or a moose, but bison are sure to be enjoyed - provided people don't do the dumb tourist thing and walk up close to a bison for a better picture, only to get the potentially fatal sharp-horned heave-ho.

Now a scientist suggests bison may also have given the final heave-ho to most of America's now-extinct big mammals. Beavers the size of bears, mammoths, horses, camels and saber-toothed cats all were gone by 11,000 years ago. The main cause was climate change that reduced the food and water supply (human hunters may have helped, too) but, suggests Eric Scott, curator of paleontology at the San Bernardino County Museum in Redlands, Calif., it was the ever-growing bison herds that may have forced a "tipping point" for the doomed species.

Bison would have had advantages over other large herbivores, such as their multiple stomachs that probably allowed them to obtain maximum nutrition from their food. Their population growth since migrating from Asia may have malnourished nursing mothers of other big species just enough to cause their numbers to collapse. And, with far fewer large herbivores to feed on, dire wolves, lions and other big carnivores would have starved as well.

It wasn't until plains Indians obtained horses - and Buffalo Bill and his fellows went to work - that bison too finally faced the threat of extinction.

Thursday, February 17, 2011

Pushing my buttons

"I know cell phones aren't likely to give anybody brain cancer," the man said. But, he added darkly, "I expect waves of skin cancer to break out any time now." (A quote I read or heard somewhere recently.)

Aargh! I know, the cell-phone/cancer thing pushes my buttons. It's a stand-in for all the dumbness out there. After all, the Legislature is in session here in Montana's capital city. I'm lucky if I don't go spurting off like a venting balloon, ending up somewhere in a wrinkled heap.

Anyway, here's yet another way to look at microwaves. All electromagnetic waves are made of photons, which span a gigantic range of energies. Climbing up from the bottom of that range, it's not until photons reach microwave energies that they can affect you at all - and all that microwaves can do is warm you up. At a cell phone's trickle of power, you'd be long dead and buried before that warming ever amounted to anything.

Way at the far end of the visible spectrum, nearly into the invisible ultraviolet, photons can spark skin cancer. That's why you wear sunblock at the beach. Photons at this energy level are a million times more energetic than microwaves, yet even they can't get below the skin.
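If you want to check that "million times" figure yourself, Planck's relation E = hf makes it a one-line calculation. Here's a quick Python sketch; the 900 MHz cell band and the 7.5 x 10^14 Hz violet/near-UV frequency are representative values I picked, not numbers from any particular study:

    # Photon energy E = h * f, compared across the spectrum.
    PLANCK_H = 6.626e-34       # Planck's constant, joule-seconds
    JOULES_PER_EV = 1.602e-19  # joules in one electron-volt

    def photon_energy_ev(freq_hz):
        """Energy of a single photon at the given frequency, in eV."""
        return PLANCK_H * freq_hz / JOULES_PER_EV

    cell_phone = photon_energy_ev(900e6)   # ~900 MHz, a typical cell band
    near_uv = photon_energy_ev(7.5e14)     # violet light, edge of the UV

    print(f"cell-phone photon: {cell_phone:.1e} eV")   # ~3.7e-06 eV
    print(f"near-UV photon:    {near_uv:.1f} eV")      # ~3.1 eV
    print(f"ratio: {near_uv / cell_phone:,.0f} to 1")  # ~833,000 to 1
    # Breaking chemical bonds - the first step toward mutations and
    # cancer - takes photons of a few eV. Microwave photons miss that
    # mark by roughly a factor of a million, no matter how many arrive.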

So, next time you read about microwave radiation causing cancer, just keep it away from me, OK?

Sunday, February 13, 2011

Health reform and the Constitution

Regardless of where their political leanings take them, many Americans still wonder whether the new health reform bill is constitutional. Does the "individual mandate" - the part of the plan that penalizes people who could pay for health insurance but don't buy it - somehow make the whole bill unconstitutional? Is it beyond the government's power to make people buy health insurance?

According to a recent New York Review article by David Cole, a law professor at Georgetown University, that's easy. He says precedents over the past 70 years make it clear that the Commerce Clause and the Necessary and Proper Clause give the government full power to regulate all salient aspects of the insurance business - including taxing free riders who duck paying their share to the detriment of the whole system.

Congress certainly can tax to provide health insurance - it does so already through Medicare and Medicaid. Likewise, Congress has ample authority to enact the individual mandate. Absent a return to a constitutional jurisprudence that has been rejected for more than 70 years, Cole says, "the individual mandate is plainly constitutional."

(Incidentally, I recently learned that such a mandate has been a central part of Republican health-care thinking since 1991 - a response to the dreaded "single-payer" reform favored by liberals. Only after the GOP failed to stop Obama's reform bill did they decide to try to convince the courts it was a new and unconstitutional regulation.)

Thursday, February 10, 2011

Living in a soap

I got hooked on the television series Mad Men a couple of years ago after buying DVD sets of the first two seasons. (I'd walk a mile to avoid a TV commercial.) I later bought the third season, and await the fourth, due March 29.

So I eagerly read a review of the series about advertising men in the early 1960s by Daniel Mendelsohn, an essayist and teacher at Bard. Yikes. His New York Review article, going against most critics, trashes the series as a mere "soap opera" rather than a thoughtful drama that attempts to say something about the human condition.

But, confesses Mendelsohn, he's hooked on Mad Men, too. How can that be, he wonders.

I like his answer. The basic attraction of Mad Men, he says, is not the period accuracy or the titillation of bygone mores, but the quiet bewilderment of the kids on the show. A soap opera - not a drama that explores the adult characters' motives - is exactly what these children are seeing and trying to understand as they grow up. And the kids of the early 1960s are none other than the 40- and 50-year-olds who watch the show with such fascination today.

I would only add that kids' puzzlement is not limited to one generation. Sometime around 2050, a similar show might well air, attracting people who were growing up around 2011. They, like Don Draper's children in the '60s, are growing up trying to make sense of a senseless soap opera.

Monday, February 7, 2011

Warming and life

Not too long ago we visited "Snowball Earth," the period between 750 and 635 million years ago when most of the planet was covered by glaciers, and we concentrated on how a kind of runaway global warming finally brought the icy era to an end.

But the warming may have done a lot more than that. By melting the glaciers, it may have kick-started life as we know it.

The line between the Precambrian and the Cambrian, roughly 540 million years ago, is marked by what is called the "Cambrian Explosion" - suddenly, all over the world, the fossils of countless brand-new creatures began showing up. And, in a mere 85 million years or so, animals evolved and radiated over much of the world's land and oceans. What made this possible?

Scientists have long suspected that the melting glaciers - which had scraped nutrients from the Earth's surface and were now releasing them into the sea - provided enough nutrients, phosphorus in particular, to spur a huge bloom of algae. The algae, in turn, would have given off enough oxygen to allow the rapid spread of oxygen-breathing animals.

But for years, there was little proof of this. Now, however, scientists have figured out how to use iron-rich deposits from ancient, low-oxygen oceans to estimate how much phosphorus was in the water. (Iron scavenges phosphorus in a predictable way.) It turns out phosphorus spiked at just the right time.

Gosh ... yet another reason to thank at least one episode of global warming.

Friday, February 4, 2011

Jefferson's moose

Thomas Jefferson was pissed. He wanted a moose. He wanted it big. And he wanted it dead.

OK, enough already with the cheapo narrative tricks. All that stuff in the first paragraph is true, but it hypes a story that's not all that exciting. However, it's a good bet that many people have never heard the story of Jefferson's moose.

The tale begins in the late 1700s, when Georges-Louis Leclerc (known as Count Buffon, the most influential natural scientist of his century) published his 36-volume "Natural History." In it, Buffon claimed America was a degenerate place, as shown by its weak and stunted flora, fauna and people.

Jefferson, fearing such an impression could impede the economic and cultural maturation of the U.S., wanted to bring Buffon the bones of a huge moose, hoping its size would convince him to drop his degeneracy theory.

Before Jefferson left to be ambassador to France, he wrote to friends pleading for hunters to procure for him the skeleton of a giant moose. In France, meanwhile, Jefferson met with Buffon, telling him that a European "reindeer could walk under the belly of our moose." He left the meeting with the impression that Buffon would change his mind if he could see such a creature.

Finally, after many delays, Jefferson received the moose bones. Here the story kind of peters out. Jefferson did show Buffon the moose and felt he was suitably impressed, but Buffon was ill and died before he could change his book.

The author of the Scientific American article I've used for this - historian Lee Dugatkin - said that some form of the idea of degeneracy in the New World lasted "for at least another six decades before withering and leaving only a dried husk of general anti-Americanism."

Wednesday, February 2, 2011

Mandatory vaccines?

Back in 1998, British doctor Andrew Wakefield published a study in "Lancet" claiming to have found a link between the measles-mumps-rubella vaccine and autism. In Britain and the U.S., vaccination of children dropped sharply, and the incidence of preventable diseases exploded.

It was all a hoax. No one has been able to replicate Wakefield's findings, and the report has been retracted by "Lancet." Wakefield has been stripped of the right to practice medicine in Britain, and recent reports call him a fraud who altered data and took more than half a million dollars from a law firm planning to sue vaccine makers. He also allegedly was involved in schemes to profit by offering services and analysis to fearful parents.

Despite all the evidence, the damage he has caused may be long-lasting. Many parents remain convinced vaccines are dangerous and show no interest in changing their minds. This is a serious threat to the nation's health.

The success of vaccines depends on high immunization rates - up to 95 percent in some cases - to protect people who are not immune. Skipping vaccines endangers not only a misguided parent's own children, but also those too young to be vaccinated, those whose immune systems are compromised, and even vaccinated youngsters, because no vaccine protects everyone who receives it.
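Where does a figure like 95 percent come from? Epidemiologists use a standard rule of thumb tying the herd-immunity threshold to a disease's basic reproduction number, R0: the fraction that must be immune is 1 - 1/R0. A quick Python sketch; the R0 values below are commonly cited textbook estimates, not numbers from this post's sources:

    # Herd-immunity threshold: the fraction of a population that must be
    # immune so each case infects, on average, fewer than one new person.
    # Textbook formula: threshold = 1 - 1/R0.

    def herd_immunity_threshold(r0):
        return 1.0 - 1.0 / r0

    # Commonly cited R0 estimates (illustrative, not from the original post):
    for disease, r0 in [("measles", 15), ("pertussis", 14), ("polio", 6)]:
        print(f"{disease}: ~{herd_immunity_threshold(r0):.0%} must be immune")
    # measles: ~93% - which is why coverage targets run as high as
    # 95 percent, leaving very little room for opt-outs.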

Today, all but two states (Mississippi and West Virginia) allow parents to opt out of vaccinating their kids for religious or philosophical reasons. That means 48 state legislatures need to get on the stick and make vaccination mandatory. Parents' rights end where they become a proven threat to the public welfare.