2007-04-07

My Blog Against Theocracy

The list of other blogs is here, but I was told to link to www.firstfreedomfirst.org so I did :-). And a shout-out to A Bird and A Bottle, where I saw mention of this.

Anyway, I'll keep it short and simple (rare for me). Theocracy, I know instinctively, fits nowhere in my grand plan, my grand project. As I got to in one of my first (and, sadly, few) posts on my main project, it seems like people ought to relate to one another on the basis of aptitudes and interests and respect. Why ought they do so? Well, fine, maybe because God says so. Or the Gods say so. But you know what? That's as far as one needs to go.

If you need to use religion to justify and explain why the right way to behave is, indeed, the right way to behave, then you're more than welcome to do so. But understand that others may not need to resort to religion to justify their behavior. Or they may not resort to your religion. Or they may not need to resort to any justification at all.

Your religion may be fascinating. We may want to talk with you about it. We may even solicit an invitation to partake of it. But for god's sake (heh!), we probably don't. And we certainly don't want to live according to all the extra rules and regulations and myths and fables that your religion has.

Use your religion, if you must, to ensure you're just and respectful. But remember that being just and respectful does not require you to foist the rest of your doctrine on me! People were good long before any of today's religions existed, and they'll be good long after the last practitioners decay away.

So, I pray, keep your religion out of my government. And I'll keep my government out of your religion.

2007-04-06

A biological element of morality?

On 22 March 2007, the New York Times ran this article (behind the Select firewall, accessible if you have a .edu email address). In "Brain Injury Said to Affect Moral Choices," Benedict Carey wrote about the discovery that damage to a particular part of the brain affected the moral or ethical decisions people made.

Basically, the utilitarian aspect of their minds continues to function: they'd let one person die if it meant saving five. And the injury seemed to preserve some sense of moral hierarchy: they wouldn't send a daughter to the porn industry to fend off poverty, and wouldn't kill a child they couldn't care for (I'm not inclined to think either of those decisions is actually right, but that's what individuals both with and without the injury decided).

It did, however, seem to destroy the distinction between doing and allowing: those with the injury were far more likely than healthy people to actually directly kill one person to save five or to kill a crying baby to save a hiding group from discovery and death.

This distinction between doing and allowing troubles me because it smacks of a kind of moral superiority -- it's okay for someone else to kill that crying baby, I just can't bring myself to do it.

The article quotes one of the injured people as saying with horror that he realized he'd become "a killer". So it's not just that normal people find it harder to kill a baby or murder one person to save five than they do to let the baby die or indirectly cause the death of one person to save the lives of five. It's also that all the people in the study regard those who would kill directly as "killers" and don't so regard those who would let people die.

Such condemnation is still universal, and thus either originates in another part of the brain or originates in society and is stored in another part of the brain. The behavioral aversion is not universal -- it's traced directly to the injured part of the brain. Those people with the injury now have a mental process that lets them kill, even as they are aware that killing is bad.

It's an interesting mapping -- it seems to say that the utilitarian cost-benefit analysis mechanism is distinct from the deontological behavior assessment mechanism, and that both are in effect in healthy people. It also seems to say that our ability to pass judgment on others for their actions, or at least to feel shame for our own actions, is distinct from the mechanism that assesses the value of those actions. If they were the same process, I would assume that the injured people wouldn't feel bad about what they did -- they would have lost both processes.

Anyway, it's interesting. And I'm blathering.

But when I try to fit it in with a class I took a while back on cognitive neuroscience, I get caught up in the possibilities. I'm captivated by the idea that this aspect of morality is neurological and akin to senses like depth perception, the ability to see curves, the ability to recognize faces, the ability to associate words and images, and various other particular abilities that have been traced to parts of the brain via similar studies.