Kabbalah Madonna

Jun 21st, 2004 11:59 pm | By

A kind reader, by which I mean Norm Geras, emailed me to point out this absurd piece by Mary Kenny in the Guardian. Norm has already made some pointed comments about it, so I’ll try not to go over the same bit of ground. But there’s really quite a lot to say, because there’s quite a lot wrong with the piece (and the pervasive way of thinking it typifies), so I think I’ll manage to find a few words.

But first I’ll point out one of Norm’s most amusing remarks, in reply to Kenny’s utterly ridiculous ‘Faith is a feminine thing.’

I have some questions here. First, how does Kenny know that faith is feminine? She doesn’t say. But I can think of a few counter-examples: the Pope, Desmond Tutu and a Jehovah’s Witness I once made the mistake of inviting through my front door for a chat. I’m compiling a more comprehensive list but won’t be able to post it till… I’ll have to get back to you on that.

Yeah, it does take some brain-cudgelling, doesn’t it. Hmm, hmm, let’s see, male-type people in the religion game. The Pope? Oh, Norm already said that. Umm – gosh this is hard – oh, how about the Archbishop of Canterbury? Yes, that’s one. Err – that guy on Oxford Street with the ‘End is Nigh’ sign? Is he still there?

So anyway. More seriously.

They want to give their children values. And they quite often feel a stirring of these transcendent values themselves, at about the same time…If you don’t believe me, look at the evidence, and visit a church, chapel or synagogue on a day of worship: you will find that at least two-thirds of the worshippers present are women, and 90% of these are mothers.

How the hell does she know what percentage of the women she sees in various random (note the indefinite article: a church, not my church or St. Boniface-on-the-Green’s church, but any old church) religious gathering places are mothers? Eh? Do they wear badges? Are they marked in some way? Or is she just extrapolating from statistics on what percentage of women are mothers? But that’s not safe – in fact it’s question-begging. For all she knows, none of the women in those religious gathering places are mothers; they may have come in either to rejoice at their freedom or to pray for conception. She doesn’t get to assume that 90% of any given gathering of women consists of mothers and then tell us ‘See? Look at all the mothers!’

But of course I also wanted to quote the stark nonsense about ‘transcendent values’ even though Norm already has. Note the quick assumption that values are ‘transcendent’ values, and also that church or synagogue attendance has some obvious connection with wanting to give children ‘values.’ And then yawn violently and think about something else.

Then there’s this absurdity:

It is a fairly well-kept secret that feminism originally arose among religious women in the 19th century: from Hannah More and Josephine Butler in Britain to Susan B Anthony and Elizabeth Cady Stanton in the US, feminism was an offshoot of evangelical Christianity, and that spiritual energy still hovers.

How could it possibly be a secret? Were there a great many atheists in the 19th century? Especially among women? (No, I’m not making Kenny’s point for her. The rarity of atheism among women in the 19th [as well as earlier] centuries is a contingent historical fact, not nonsense about the inherent ‘spirituality’ of women.) Of course feminism arose among (mostly) religious women in the 19th century – what other kind of women would it arise among? All those emancipated intellectual women living in their own book-lined flats in London and New York? News flash – there weren’t a lot of women like that in the 19th century. Naturally most 19th century feminist women were religious. It doesn’t follow that feminists have to go on being religious now.

For many women, perhaps even most women, some form of religious sensibility is what gets them through the night, and helps them lead the examined life, too.

Possibly. And possibly the same is true of many, perhaps even most, men too. So what? People can always learn to lead the examined life in a secular manner, after all. People change – even women do.



Damn Elitists!

Jun 18th, 2004 8:57 pm | By

I watched part of an old ‘Frontline’ on tv the other evening. ‘Frontline’ is one of the few fairly good shows on US public tv – actually one of the two, I would say, ‘Nova’ being the other. US public tv is so mediocre it’s painful. (And public radio is even worse. But that’s a separate subject.) It was about ‘Alternative’ Medicine. One part of it I found particularly extraordinary – an interview with Utah Senator Orrin Hatch. I’ve always disliked Hatch, frankly. He’s very conservative, and he has an irritating voice. He sounds like someone who’s trying to soothe a rowdy room full of six-year-olds – in fact I suppose he sounds a bit like Mr Rogers. Mr Rogers was a very nice fella, but I’m afraid those soothing calming bland voices make me want to punch something.

But that’s neither here nor there. Hatch could have an irritating voice and still be a good Senator. (Though perhaps not one of the best. It may be that a really good voice is basic equipment for a Senator. That’s an interesting question…but not the one I want to look at right now.) But there’s more wrong with him than the voice. The excerpt from the interview was about a 1994 bill he sponsored that de-regulated ‘dietary supplements,’ which means that the FDA (the Food and Drug Administration) cannot monitor dietary supplements in the way it can (and must and does) monitor drugs. It can only act after a supplement has been shown to cause harm, after it has gone on the market. Here is what Hatch says on the matter:

We had to take on the whole FDA and the whole raft of left-wing groups that believe that everything in our lives should be regulated and that we can’t– we’re so stupid as a people, we can’t make our own decisions and that we’re so dumb that we don’t know what’s good for us. It’s the attitude that government should tell you everything you should do. You don’t have any right to make any choices yourself. And they threw everything but the kitchen sink at us, but we had the people with us. And the reason we had the people is because a hundred million people have benefited from dietary supplements.

I’ve heard a lot of infuriating right-wing rhetoric in my time (as we all have) but that takes the biscuit. Though it certainly is impeccably conventional – the right does just love to pretend that any form of safety regulation amounts to assuming that people are stupid. But Hatch of course doesn’t bother explaining how all these brilliant people are supposed to know what’s in the bottles on the shelves. What – we just know by looking that the contents are safe? Are what they claim to be? How? How, exactly, do we know that? How do we look at a heap of gleaming capsules and divine what is inside them? Do we carry a laboratory with us when we go to the store and buy our vitamins and other supplements?

And I was reminded of Hatch’s comments when I read this Guardian article in which the Health Secretary, John Reid, makes a similar kind of claim.

The health secretary, John Reid, angered health campaigners and anti-smoking groups when he said yesterday that smoking is one of the few pleasures left for the poor on sink estates and in working men’s clubs. Mr Reid said that the middle classes were obsessed with giving instruction to people from lower socio-economic backgrounds and that smoking was not one of the worst problems facing poorer people…He said he was an advocate of informed choice for adults, rather than bans, describing himself as favouring empowerment, rather than instruction. Mr Reid fears advocates of a ban are behaving as if members of the public are incapable of coming to their own sensible decisions.

He favours empowerment rather than instruction? What can that mean? Are the two in tension? Are they mutually exclusive? Does learning something disempower people? If so, how? But that’s a trusty bit of rhetoric. If there’s something you disagree with, if you can manage to frame it as someone assuming other people are stupid, you’re on your way to victory, however nonsensical the claim may be.



Summer and Autumn

Jun 17th, 2004 11:17 pm | By

Horrible day here. In the upper 80s. The air quality doesn’t look too bad – the sky at the horizon is not brown – but it smells terrible outside all the same. It always does once it gets this hot. Heated-up car exhaust, I assume. I don’t like summer much.

But never mind that. The Dictionary gets printed next week. Once that happens, you see, it will be a book. Rectangular thing, open on three sides, pages with printed words on them. Something one can hold in the hand. Something one can read more or less anywhere – on the bus, in the park, in the checkout line at the supermarket, on the treadmill. That’s much harder to do with a stack of pages open on all four sides, a stack that can blow all over the room if a breeze comes in the window. No doubt that’s why some clever inventor thought of binding – fastens the thing down, you see, and makes it easy to turn the pages without making a mess. Wonderful invention, books.

I know, you’re thinking I’m very naive and fatuous, going on and on about one little old book. All very well for you, of course, you write books every day, but it’s all new to me. Well plus there’s the fact that I am naive and fatuous, of course; that has something to do with it.

So it will be printed and then before long it will be published, and then you will be able to read it. I’ll sign your copy for you if you like. I might zoom over to London when it comes out, just so that I can jump up and down and squeal and generally act like a fool. I might as well, after all, because it’s not as if I’m not one. The weather will be cooler by then, too.



Freedom From Atheism

Jun 16th, 2004 1:47 am | By

Update on last Comment –

And there is the Supreme Court decision (or non-decision) in the Pledge of Allegiance case. Students will go on invoking the deity in public (state) schools for now, thus making sure we don’t go overboard with this separation of church and state stuff. Don’t forget, freedom of religion does not mean freedom from religion, any more than Adam and Eve means Adam and Steve. No, religion is still mandatory in God’s country. (Of course, the Pledge is not actually required, so young atheists can just refuse to recite it and laugh cheerily when their devout classmates beat them up in the playground. Ain’t liberty grand.)



High Tension

Jun 15th, 2004 11:40 pm | By

A lot of vexed religious issues around at the moment. There is the Vardy foundation which wants ‘to take over seven comprehensives and turn them into Christian Academies promoting Old Testament views of the world’s creation. This includes the claim that it was made in six days, 10,000 years ago.’ There is the never-ending stampede of both political parties in the US to outdo each other in god-bothering. There is the prospect of Shari’a in Ontario (and the campaign against it). There is a group forming to ‘defend’ the hijab. And there is the Begum case, which is under discussion at Crooked Timber.

So, one way and another, there is a lot of debate and discussion of this question of special rights for religion and religious believers, especially in matters of education. One thing that doesn’t seem to get discussed much, no doubt because of the very reluctance to challenge religion head-on that I’m talking about here, is that there is (surely) an inherent tension between education and religion. At least, depending on how one defines both terms. But surely education that really is education is not supposed to teach counterfactuals. Nobody wants schools teaching that the French Revolution happened in the 14th century and the Black Death happened in 1927. A lot of education is not as straightforwardly factual as that, of course; not as answerable with a yes or no, true or false. But still – schools usually distinguish between fiction and the other thing; they don’t teach Jane Eyre as a biography. So where does that leave religion? Religion can of course be taught as a subject without asserting anything about supernatural entities – but religion as religion can’t. In short, it seems to me there is a radical tension between schools’ responsibility to refrain from teaching falsehoods, and religions’ commitments to their version of the truth. This is no doubt why religions want special rights, but it’s also why they shouldn’t have them.



Special Rules

Jun 13th, 2004 11:39 pm | By

And on a more serious note, on the same David Aaronovitch column – he does make a number of important points.

His argument seems to be that it’s a human right to attend a denominational school and given these may be further away from home than the local school, parents should not be subject to the same penalties as those whose child’s journey results purely from choice. In other words, a religious choice in education is a matter of freedom of conscience, whereas any other kind of choice isn’t. Steam emerges from every orifice at this. Especially when the barrister adds: ‘When I got married we promised to bring up our children in the Catholic faith and so we put them through a Catholic school.’ This is the non sequitur upon which he bases his claim to be accorded superior treatment. Perhaps he would like a little sticker for his car that reads ‘Free parking for monotheist pupils only’.

Well, he probably would like exactly that. Religious believers often seem to take the idea of their ‘special’ status and special rights so for granted that they are unable to see how odd that idea is, no matter how carefully anyone tries to explain. But why? Why should people have special rights because they believe in a deity? It is a pervasive (increasingly so, I think) notion, but one that I have a hard time seeing the logic of. Is it kind of like endangered species legislation? That things that are vulnerable need special protection? And belief in a deity is vulnerable because it depends on ‘faith’ as opposed to evidence and logic? Is that it? That’s the only reason I can think of, really. But if so…surely the reductio is pretty obvious. Should we give special rights to astrologers and people who think there’s a Disneyland on Jupiter, and withhold them from people who try not to believe six impossible things before breakfast? That could end up having some unfortunate results, one would think.

What is going on here, I think, is an attempt to protect the young from modernity…One proselytiser for Muslim education who sends out letters to the media captures this very well. When there was a conviction for an ‘honour killing’ in London last autumn, this campaigner argued that the victim, killed by her father, ‘was educated to be a Westernized woman, instead of a Muslim’…This is a social agenda, as much as a religious one. It was argued by a pro-faith school columnist that at least the two great faiths – Catholicism and Islam – permit equality to believers and co-religionists. But they don’t. If they did there would be women priests and women imams. My fear is that this emphasis on faith schooling is an attempt, albeit unconscious – to return us to the days before feminism, an attempt which affects all of us.

But it’s difficult to talk honestly about the subject, in part precisely because of the ‘special rights’ idea – because believers think their beliefs should be protected from discussion or question. And some believers, I have reason to know, seem to think that the very fact that they are believers means that nothing they do can be wrong – pretty much by definition. So they feel perfectly cheerful about launching torrents of sexist, obscene raving at wicked unbelievers like me. I should know, I have the spittle-flecked (virtually speaking) emails to prove it. (I have a feeling I get a double if not triple dose because of being a female. Uppity women just do piss some people off, you know…)



Punk Eek

Jun 13th, 2004 9:55 pm | By

I can’t resist – because it made me laugh too hard just now when I read it. An update on the comma question – another example of the ‘eats, shoots and leaves’ phenomenon. This is from a column by David Aaronovitch in the Guardian:

This week a local barrister is looking into whether the scheme breaches human rights legislation according to the Hampstead and Highgate Express.

Oh? But why? Why does anyone care about HR legislation according to the Ham and High? And what about the Brixton Tribune or the West Kilburn Times? What’s their take on human rights legislation, eh?

Well you see what I mean. What a difference a comma can make.



Belief

Jun 12th, 2004 10:16 pm | By

Quite a lot of atheist material lately. There is this review of Nicholas Everitt’s The Non-Existence of God in The New Humanist:

…some theists maintain that asking for reasons to believe in God’s existence is beside the point. The demand for reasons in this context is, they say, either blasphemous or vacuous. As Kierkegaard put it, echoing Luther, belief in God is a matter of faith; it’s not like our ordinary belief in the existence of things like tables and chairs, which can be justified or shown to be false. Everitt is impatient with such manoeuvres, and dispatches them rather effectively.

Good. I wonder if he also dispatches the maneuver we’ve noticed a lot in these arguments – what one might call the having it both ways maneuver. Claim that God is ineffable, transcendent, beyond our understanding or anything we can say about it, etc etc, but nevertheless be more than willing to say all sorts of things on the matter. What it seems to mean in practice is: God is ineffable therefore atheists can’t say anything on the matter, but theists on the other hand can and should say whatever it occurs to them to say.

Two sets of rules, one might say. The author of this article on discrimination against atheists might say, for example. Apparently there is a general belief that there is really no such thing as discrimination against or ill-treatment of atheists, but Margaret Downey has researched the question and found otherwise. She has also found a likely reason the problem is not recognized:

One would think that any atheist who had experienced discrimination would be eager to submit an affidavit. Instead, the fear of suffering further discrimination as a “whistleblower” was widespread. Some victims told me that they did not want to go public lest still more hatred come their way. This is the trauma of discrimination, just the sort of intimidation that discourages discrimination reports and makes it difficult to find plaintiffs for needed litigation.

Downey presents a few examples of small-town persecution – harassment, threats, firings, pictures of Jesus left on one’s desk, organized shunning, stalking with a butcher’s knife. I read somewhere recently – I forget where, but I think it was in something I linked to – about the nice old tradition of the much-loved atheist in every US village. That’s bullshit. In most of the US, atheists are greeted with venom and hostility unless they maintain complete silence on the matter (and sometimes even then).

And finally there’s this article on Bush’s superstition by Edmund Cohen, who seems to have taken a surprisingly long time to notice.

Until recently, I had not seriously thought that supernaturalism or superstition could be an issue of concern as regards the second Bush presidency…Surely that establishment must have vetted its candidate well enough to rule out nominating an unstable religious eccentric. When he speaks in churchly terms, surely he is only employing regional idiom and one cannot take him literally.

Er – no. The Republican establishment does a staggeringly bad job of ‘vetting’ its candidates. The Democratic establishment doesn’t do any better, mind you – because it’s not about vetting, especially now that the primary system is so much more important than it once was.

According to [Bush confidant] Robison, there are but two worldviews: Biblical Christianity and Relativism. Biblical Christianity represents the “Absolutes.” By “Relativism,” he means complete lack of criteria for distinguishing right from wrong or truth from falsity. All those who are not Bible-believers are ipso facto Relativists. For Robison, liberal Democrats, Islamist terrorists, and all others who are not Christian Bible-believers count as Relativists and are therefore all interchangeable with one another.

Yep, I know the type, I’ve even (to my sorrow) had conversations with one or two. I’ve been informed that people who ‘acknowledge’ no higher authority have no ability to feel remorse – which is quite an interesting idea. No wonder the believers go in for shunning and threats.



Nussbaum

Jun 12th, 2004 2:10 am | By

This was a nice little coincidence, or confluence, or something, this morning. I started reading Martha Nussbaum’s new book Hiding from Humanity and then when I got on the computer I found this interview with her. It’s an interesting and amusing interview, too.

As for philosophers, I find Mill the most soothing because I imagine him as a friend to whom one would like to talk. Most male philosophers of the past are not the friends of women, but Mill is.

I like Mill a lot. And come to think of it, one of the things I like in him is one of the things I like in Nussbaum, too: they’re both extremely lucid.

The interviewer asks ‘Is it the legal expert, the academic, or the philosopher in you that gets angry about specious arguments (say, Judith Butler or Allen Bloom)?’

I really don’t like bad arguments, but what I especially dislike are bad arguments put forward cultishly, with an in-group air of authority. I think that philosophy should stick to its Socratic roots, as an egalitarian public activity open to everyone. Thus even some admittedly great philosophers, e.g. Wittgenstein, inspire me with unease because they allowed a cult to grow up around themselves and wrote undemocratically. Heidegger was guilty of the same, but he is a much less distinguished philosopher than Wittgenstein, and he also did bad things in politics.

Exactly – ‘bad arguments put forward cultishly, with an in-group air of authority.’ That’s exactly it, that’s why it gets up my nose so when people worship Butler. It’s that cultish, in-group thing – it drives me insane. And that’s probably why I love Mill and Nussbaum, because they are as I said so lucid. They do the exact opposite of what Butler does. She makes a few small ideas obscure; Mill and Nussbaum make an ocean of large ideas utterly clear. They make philosophy ‘an egalitarian public activity open to everyone’ rather than a smelly little orthodoxy just for the trendy few. Down with cultishness, up with lucidity.

The new book is enthralling so far. And in another bit of serendipity, it’s also very relevant to this discussion about the relationship between Theory of Mind and empathy, and my suggestion that empathy and related qualities are cognitive before they’re emotional. Nussbaum talks about exactly that subject:

…it is quite unconvincing to suggest that all emotions are ‘irrational.’ Indeed, they are very much bound up with thought, including thoughts about what matters most to us in the world. If we imagine a living creature that is truly without thought, let us say a shellfish, we cannot plausibly ascribe to that creature grief, and fear, and anger. Our own emotions incorporate thoughts, sometimes very complicated, about people and things we care about.

So there you are, you see – I went to all that trouble to say something Nussbaum had already said. She goes into the matter further in an earlier book, Upheavals of Thought, which I’ve looked into but not read yet.



Mattering and Meaning

Jun 10th, 2004 9:51 pm | By

We were talking about meaning the other day. I read something in Daniel Dennett’s Consciousness Explained that seems relevant:

So the conscious mind is not just the place where the witnessed colors and smells are, and not just the thinking thing. It is where the appreciating happens. It is the ultimate arbiter of why anything matters…It stands to reason – doesn’t it? – that if doing things that matter depends on consciousness, mattering (enjoying, appreciating, suffering, caring) should depend on consciousness as well.

Mattering is about caring – therefore (surely?) meaning is related to caring – perhaps is another word for the same thing, or both words name the same thing but from different angles. I said much the same thing in the Comment – ‘Yes of course, we want to think our lives (hence the world they take place in) matter, have significance and importance, ‘mean’ something – something more than what they mean to us.’ Meaning is about what matters to us: what matters to us is what we care about. (At least, that seems to be part of what meaning is. I’m not claiming it’s an exhaustive account, and I don’t think it is, I think there’s more to it. But it’s a part.) All these words and ideas circle around a common knot or core. What is important and significant is what we care about, what matters to us, what means something to us. We could think of meaning, caring, importance, as sorting-devices: this item matters and that one doesn’t, because of what I care about, what is important to me. All a bit circular and subjective, obviously, but then that was my original point: that subjective is exactly what meaning is, and therefore it’s a bit of a dodge to claim that religion ‘gives’ meaning – it only gives it because we decide it does.

Caring is also interesting in a slightly different (though related) way: as motivation, as the engine that keeps our forward momentum going. This is (I take it) what Damasio is talking about in Descartes’ Error: people who have a kind of brain damage that impairs their ability to care, even though it leaves cognitive abilities intact, can’t function properly. They don’t do anything, because they can’t decide among possibilities – even though they can understand and state pros and cons – because they don’t care. Indifference is a paralyzer, it seems. Which we all probably know from experience with depressed people or with depression. Depression plays hell with motivation.

We also know it because we know that ‘I don’t care’ can be a terrible, an appalling thing to say. It’s mildly rude even as an answer to trivial questions (What shall we make for dinner? Coffee or tea? Red or white?), and it’s brutality or worse as an answer to non-trivial questions or statements – ‘I’m frightened,’ ‘she needs help,’ ‘you hurt him when you said that,’ ‘there’s a genocide going on.’ Or for that matter ‘I love you,’ ‘she won first prize,’ ‘he’s safe.’ There’s a reason ‘Don’t care was made to care, don’t care was hanged’ was such a popular nursery saying. We need to care ourselves, and we need the people we care about to care too, or at least not to tell us they don’t. About some things we need everyone on the planet to care.



Punctuated Equilibrium

Jun 9th, 2004 10:45 pm | By

I find this a little bit amusing. Not the whole thing, just one part of it. The whole thing is a discussion of Eve Garrard’s second piece on Amnesty International at Normblog. That’s not particularly amusing, turning as it does on the murder, torture and general pushing-around of millions upon millions of people around the world. No, not an amusing subject. What amused me was just one item at the end of Chris’ post.

Finally — and I’m picking nits now — Eve writes that “the idea that the force of an argument should be materially altered by an (allegedly) misplaced comma is … delightful and charming.” It may be, but my complaint focused not on the force of the argument but on its meaning, and it is pretty commonplace that commas can and do alter the meaning of sentences: Eats, Shoots & Leaves.

Well there you are, you see. It’s not only tiny words (she not he, here not there, on not in) that can alter the meaning of sentences, it’s little marks that don’t even represent a vocalization, that represent at most a pause or a tone of voice (? sounds one way, ! sounds another), but can separate an adjective from a noun or change a noun to a verb or otherwise change the meaning of a sentence.

I’m all the more aware of this because it comes up in proofreading, at least it does when I’m the proofreader. The editors of TPM like to make fun of me for adding a comma at the end of a list. Well, ha ha, very droll, but I have my reasons – because commas do make a difference. The one at the end of a list is optional, it’s true, but I often like to exercise the option and insert it, especially when the list in question is a list of phrases rather than single words. A list like ‘this, this, this, this and this’ is not too bad, but a list like ‘this does that, that does this, those did these and these did those’ can be confusing – it can be unclear whether the last clause is actually two clauses separated by ‘and’ or all one clause with an ‘and’ in the middle. Unless you add a comma before the ‘and’ – which is why I often do just that. So mock mock mock all you like, but it does make a difference. As, of course, Eats, Shoots & Leaves has reminded everyone lately.

But then other times – for instance when I’m writing as opposed to proofreading – I leave commas out with wild abandon. I perpetrate chaotic unpunctuated headlong sentences of a kind that one is taught not to perpetrate when one is twelve or so. Not invariably, but it’s something I have a tendency to do. Some sentences just seem to need to be uttered all in one breath, without punctuation (i.e. without pauses), so I write them that way. Then on reading them I sometimes realize – they will work if readers hear them exactly the way I heard them in my head – but what is the likelihood of that? So sometimes I decide to punctuate them in a more conventional manner. But not always. Yes, that’s nice; and your point is? Nothing – just that even commas, even those little tiny silent marks, are something one can lavish thought on, and that can alter the meaning of sentences. Odd, isn’t it.

I wonder if commas have Theory of Mind.



Fantasyland

Jun 8th, 2004 11:00 pm | By

I’m still pondering this link between Theory of Mind and – and a lot of things: imagination, social cognition, lying, pretending. And via those things it links to even more things – empathy, story-telling, literature and art, religion, politics, manipulation, coalitions – really pretty much everything that has to do with humans as conscious intentional reflective social beings. It all starts with this ability to realize that Other Minds are other minds.

This all raises a number of thoughts or questions. A reader (who has a post on a related subject on his own blog) mentioned this article by Pascal Boyer.

Social interaction requires the operation of complex mental systems: to represent not just other people’s beliefs and their intentions, but also the extent to which they can be trusted, the extent to which they find us trustworthy, how social exchange works, how to detect cheaters, how to build alliances, and so on…Now interaction with supernatural agents, through sacrifice, ritual, prayer, etc., is framed by those systems. Although the agents are said to be very special, the way people think about interaction with them is directly mapped from their interaction with actual people.

Boyer doesn’t use the term but he’s talking about Theory of Mind there. Very interesting notion. ‘What’s she thinking? What are they thinking? What are you thinking that I’m thinking about what you’re thinking?’ And all that applied also to supernatural beings – so there is no body language, no gestures, no facial expressions, no rocks flung or sticks brandished, no conversation, no shouts or swearing or name-calling. Nothing to go on, one would think – except maybe the weather and the odd earthquake.

One thing that interests me about the subject is that it means (surely) that some (maybe all?) basic virtues are really cognitive first. Maybe that seems self-evident? But I don’t think so – I think we think of virtue as rooted in love. That love comes first and creates sympathy. But if I understand all this correctly, surely it’s perfectly possible to ‘love’ others without understanding that they have their own minds, and therefore that they’re not feeling or thinking what we are. Surely we can’t even begin to have virtues like empathy, compassion, responsibility, generosity, kindness, fairness, until we understand that others have thoughts and feelings different from our own. This basic ability that other animals apart from chimps apparently don’t have (though Frans de Waal for one would disagree with that) is absolutely required for empathy to even exist. Theory of Mind is the same thing as empathy. And it’s not so much a virtue or an emotion as a mental ability.

Another thing that interests me is the way ToM connects with imagination, fantasy, pretending. Empathy is not the only thing that ToM makes possible; lying is another. Children learn that other people can have false beliefs, so the next step is to create them. Autistic children never do either, nor do they pretend.

They will not play with dolls, pretending they are people (when they know that they are not really alive); they will not pick up a telephone and hold a conversation with an imaginary person at the other end of the line; they never pretend to be asleep in order to play a joke on someone else. In short they live in a world that is absolutely real as it stands: they cannot conceive of the situation being other than exactly how it is. And that in turn means that they cannot lie. [Robin Dunbar: The Trouble With Science]

I suppose one reason that interests me so much is that I was a really dedicated pretender when I was a child. It was like a career, a calling. I never knew any other children as deeply into pretending as I was – and I always thought they were eccentric for not being. It seemed to me the only way to play. How do you play? You go outside (or the attic or the basement if it’s raining) and you pretend to be someone else – Jo March, Mary Lennox, Davy Crockett, whoever – for hours and hours. That’s how. What else would you do? I kindly taught friends to play the same way – but I don’t suppose they kept it up anywhere else. But why not? Why not? That’s what I never understood. It’s so much fun. You get to live in another century, in another place, doing unfamiliar things, living in a different story. I used to think children who don’t pretend must be slightly stunted, mentally. (Of course, I’m only two feet tall, so I shouldn’t talk.) I’m not sure I still think that, and yet I do think the ability to fantasize, to imagine things as other than they are, is one that ought to be fostered. At least as much as the ability to play soccer.



Names

Jun 5th, 2004 10:42 pm | By

There is a review by Mary Midgley of a new book by Judith Butler in the Guardian. Midgley has a special place in our affections here at B&W, since in a sense she named it. In another sense of course she didn’t, Al Pope did, because she was quoting him, but in the sense that matters she did, because her use of the quotation is what the Namer of B&W had in mind. Actually the Namer and I have had many violent brawls on the subject, with books thrown and fists pounded on desks and screams screamed and horrible wounding insulting things said. No, not really, I’m only joking, because it’s Saturday. But it’s almost true. I have received many emails from readers upbraiding us for not citing Pope, and (until I finally learned better) I used to forward them and ask whiningly why we couldn’t just add two little words – ‘quoting Pope’ – to the About page. Only to receive in reply a blistering indictment of my pedantry, elitism, sucking-up tendencies, docility, sycophancy, conformity, timidity, lack of imagination, tunnel vision, and general fatuity. No, not really, I’m just amusing myself. But it was almost like that. Anyway, Midgley named B&W by using the quotation in accusing someone else of a foolish misunderstanding when in fact the misunderstanding was, not to put too fine a point on it, her own. But all the same, she is at least somewhat skeptical of the profundity of Butler.

Although she does go a bit wrong in the very first sentence –

This little book contains five fairly indignant essays by the distinguished Californian feminist and literary critic Judith Butler…

Distinguished? What’s so distinguished about her? I’m serious. That’s not a jokey question, it’s a real one. There is, as I have noted here in the past, a great deal of inflated praise of Butler kicking around – she is always being called famous, important, significant, etc. But why? On the basis of what? Is her work really so conspicuously better than that of hundreds of her colleagues? There are a lot of knowledgeable people who think it is in fact much worse. So it’s a bit irritating to see her called ‘distinguished’ for no apparent reason. One can’t help thinking that’s just a sort of meme (now that would be ironic), picked up because of all those people who call her famous and important. People can become famous (as of course we all know) just because a lot of other people say they are famous – it can be a self-fulfilling prophecy.

But Midgley does note some flaws, so that’s better than nothing.

I found a large part of the book unhelpful because it is so abstract. It consists of arguments about Foucault’s doctrine of a transition from “sovereignty” to “governmentality” in the structure of states, and about Levinas’s notion of “the face” as the factor that makes us able to see people as vulnerable fellow humans…Discussion of these ideas leads into hair-splitting of the kind that often develops when prophets such as Foucault and Levinas have deliberately used paradox to make an unfamiliar point. Scholars pile in afterwards, trying to domesticate the paradox to fit it for students’ essays. Nietzsche, who started the paradox game, would have been rather cross to see the kind of theorising to which it now leads. And readers might reasonably ask why this theorising is relevant to the moral case against American foreign policy. The trouble is that that case can obviously be stated in perfectly familiar terms – terms widely shared, terms that the transgressing parties themselves already officially acknowledge. Is there anything to be gained by translating it into new and exotic language?

Well, I wouldn’t think so, and that’s exactly why I don’t think Butler is distinguished. I think she’s much more pseudo-distinguished – much more keen to impress the credulous by way of Levinas and Foucault and baroque theoryspeak than to actually say something or enlighten anyone. And that’s exactly why the whole ‘distinguished’ thing is so annoying. That’s not what academics should be doing – writing in a show-offy, obscure-for-the-sake-of-being-obscure way. Necessary obscurity, unavoidable obscurity, obscurity that is inherent in the subject, that’s one thing, but obscurity used to impress and to get oneself called famous and distinguished is another. And I defy anyone to read a few pages of Butler without thinking that is exactly what she’s doing.



Mind Your Peas and Kews

Jun 4th, 2004 9:17 pm | By

Here’s an amusing bit of serendipity. I just added a quotation to Quotations and only after posting it (and doing various other tasks) realized it’s highly relevant to a little argument we were having the other day about the importance and value of precision in language. My colleague posted a Comment which made much of the difference between saying ‘a something’ and ‘the something.’ He also pointed out that ‘Precision of language matters, if you want to be understood.’ That seems like such an obvious, incontrovertible statement, doesn’t it? But people do attempt to controvert it. People in fact actually mocked the idea of making anything of the difference between ‘a’ and ‘the’.

Very well. Behold that Stanley Fish quotation (and he’s a US academic, last I heard, so maybe it’s not a US-UK thing; as I said, I certainly hope it isn’t):

Everything follows from the statement that the pursuit of truth is a — I would say the — central purpose of the university. For the serious embrace of that purpose precludes deciding what the truth is in advance, or ruling out certain accounts of the truth before they have been given a hearing, or making evaluations of those accounts turn on the known or suspected political affiliations of those who present them.

Italics his. So…he seems to think there’s a difference, a difference worth remarking on in an interjection, a difference worth emphasizing with italics – between a central purpose and the central purpose. He doesn’t seem to think it’s obsessive or peculiar to notice the difference.

I’ve seen a couple of other good remarks on the value and necessity of precise language in the past couple of days. One is somewhat indirectly relevant, but it’s suggestive. It’s by Robin Dunbar, in The Trouble with Science (page 106). He’s talking about strong inference, and the way it has accelerated the progress of science in various fields.

Precisely formulated hypotheses are compatible with a very much narrower range of empirical results than more loosely formulated ones.

He’s not actually talking about language there, but the point is the same. Woolly language allows a much wider range of meaning, which can be nice in poetry (though precise meaning can be very good in poetry too) but is not nice at all in substantive discussion.

The other is from Susan Haack in Manifesto of a Passionate Moderate (page 53). She is discussing Peirce’s view of science and inquiry.

…’studying in a literary spirit’…implies a preoccupation with what is aesthetically pleasing that diverts attention from inquiry and pulls against what ought to be the highest priorities of philosophical writing: not elegance, euphony, allusion, suggestiveness, but clarity, precision, explicitness, directness.

So there you are. Keep the wool for knitting sweaters and guillotines, and be precise when using language.



Theory of Mind

Jun 4th, 2004 1:12 am | By

Animal cognition seems to be in the air this month. I read a review by Frans de Waal of two books on the subject a few days ago, and today I find that one, along with two more, at SciTech. Each of the other two is about one of the books that de Waal reviews, so the three together make an interesting comparative package, and they’re all interesting in themselves.

This one on Clive D.L. Wynne’s Do Animals Think? is not only interesting but also quite amusing.

Students in the first-year university philosophy classes that I teach often believe that their dogs, cats, budgies, and goldfish are thinking pretty much the same thoughts they are. Unfortunately, some of them are right, I point out – but I point it out only when I’m in a grumpy mood…Ditto for tales about dolphins using “an elaborate language among themselves that we are not smart enough to decode,” to say nothing of whale songs, weeping elephants, and loyal hounds.

The weeping elephant item is of course a sly reference to Masson’s book When Elephants Weep. I especially like the dig because I had a similar one in the Fashionable Dictionary, but had to take it out on grounds of obscurity. Masson and the book are not well known enough, so the joke would have fallen flat. But I can put it in the FD on the site. I’ll have to do that one of these days.

And another item.

Do Animals Think? contains a series of intermittent chapters in which Wynne describes and enthusiastically marvels over honeybee hive life, bat echolocation techniques, and pigeon homing methods.

That word ‘echolocation’ appears in one of the FD definitions – one of the original ones, so it’s already on the site. My colleague wondered if it was a real word – it looked like something to do with virtual chocolate. Well, see, that’s the difference between sociologists and zookeepers. He is familiar with words like functionalism and Durkheim, and I’m familiar with words like echolocation and shovel. Anyway, there is the word, big as life, and used by someone other than me, which I take to be pretty good evidence that it’s a real word.

The other review, of Intelligence of Apes and Other Rational Beings, is not particularly amusing, but it is interesting.

Robin Dunbar was on Start the Week last week, and he was so interesting that I was inspired to re-read his excellent book The Trouble With Science. (There is a paragraph on the book In the Library.) He talks about social cognition, and whether animals have a theory of mind – chimpanzees have some, the equivalent of a five-year-old child’s, but they are left in the dust by a child of six, and dolphins have none at all. Then he discusses what an elaborate theory of mind humans actually do have, that we can actually go five levels (she thinks that he thinks that they think that you think that I think), and that doing that takes an enormous amount of brain power, which seems wasteful. What is it for? (Wasteful things seem to need explaining, of course, because it seems as if they would be selected against.) He has a suggestion – Andrew Marr thought it was ‘religion’ but Dunbar corrected him: not quite. Imagination is what he thinks it’s useful for: imagination, which makes two things possible, religion and story-telling. Both of those, he thinks, make social cohesion possible. Humans live in groups, with an implicit social contract, which means they have to sacrifice immediate benefits for long-term ones, at times, which is a situation exploitable by cheaters (you know: Prisoner’s Dilemma, game theory, all that). So religion works to discourage cheaters – if Dunbar is right, at least. At any rate it makes for a very interesting discussion. Marr asks if he thinks that that means religion will always be with us. ‘I hope not!’ says Dunbar, and everyone laughs a good deal. And they talk about the way that religion makes small group cohesion possible and by the same token makes people want to kill people who believe differently. Yes, doesn’t it though. Well now I’ve told you nearly all of what was said, but never mind, listen anyway.



Meaning

Jun 2nd, 2004 10:00 pm | By

I’ve been thinking about religion and the arguments people use to defend it, again. Or more likely I’ve never stopped. It’s a line of thought that shrinks or expands, that takes up a position in the middle of the living room or creeps into the back of a closet, depending on what I’ve heard or read lately, but it probably never goes away entirely, never actually packs the wheely suitcase and marches away into the sunset (which would be inadvisable from here, actually, because you would drown). Anyway I’ve been thinking about it. I’ve been thinking about the idea that religion has something to do with humans’ desire for meaning – that religion does something about that desire. Satisfies it, answers it, solves it in some way. We see that general idea (and it is general) expressed a lot in these disagreements over religion. It’s always (at least in my experience) expressed in a very vague, hand-waving fashion. I’m tempted to say (so I will) a carefully vague, hand-waving fashion – because it can’t really be expressed in any other way, because there’s not really anything non-vague to say. At least so it seems to me.

I was browsing and I found this old Comment on the subject, prompted by a review (now subscription, unfortunately) of Richard Dawkins’ A Devil’s Chaplain by Allen Orr and the similarity of something he said to something Michael Ruse said about the same book. Orr:

You might argue that what conflicts did occur between science and religion were due to misunderstandings of one or the other. Indeed you might argue that Dawkins’s belief that science and religion can conflict reflects a misconstrual of the nature of religious belief: while scientific beliefs are propositions about the state of the world, religious beliefs are something else—an attempt to attach meaning or value to the world. Religion and science thus move in different dimensions, as Gould and many others have argued.

Ruse:

People like Dawkins, and the Creationists for that matter, make a mistake about the purposes of science and religion. Science tries to tell us about the physical world and how it works. Religion aims at giving a meaning to the world and to our place in it.

In that old Comment I focused on the truth-claim aspect, but now I want to take a look at the ‘meaning’ question. One obvious question is, what does it mean to give a meaning, to attach meaning to the world? Give or attach meaning how? How does religion do that? By making factual claims, that’s how. By saying there is a god of a certain kind, and that that god is what makes the world meaningful. Well – factual claims are factual claims, and anyone and everyone can query them, emphatically including scientists; therefore that sharp separation is completely bogus.

And then, what does that phrase even mean? Give a meaning? Surely it’s obvious that that’s simply a very human idea, a human want, reflecting human thoughts and feelings. Yes of course, we want to think our lives (hence the world they take place in) matter, have significance and importance, ‘mean’ something – something more than what they mean to us. And we suspect that actually sub specie aeternitatis they don’t. So we want something – something non-factual, anti-factual, non-empirical, counter-empirical – to ‘give’ the world meaning for us. For the something to ‘give’ meaning it has to be anti-factual because we already know the facts aren’t going to do it for us – that’s the problem, that’s why we talk about ‘giving’ meaning in the first place. The facts are that we live brief lives and in three or four generations at most are as forgotten as if we’d never lived. We might as well give meaning to the life of the chicken we just ate for dinner or the ants we stepped on as we crossed the street. The meaning doesn’t seem to be in the brute facts; we want meaning; so we invent a non-factual magical Something that gives meaning for us. Very well. That’s consoling for many. But the fact remains that the trick only seems to work if this supposed extra-factual transcendent magical Something is after all quite mixed up with The Facts – with factual claims, that is, as opposed to real facts. It has to be there and to have certain attributes for the meaning to stick. So we decide it is and does.

There are other ways of ‘giving’ meaning, ways that don’t rely on a special external supernatural Something to ratify the meaning – but of course they are so much the less consoling. They make smaller, humbler claims, they don’t deny death and extinction, they settle for human versions of meaning.

But that’s the nature of the enterprise. Either ‘meaning’ is a limited, temporal, human creation and interpretation – the world has meaning because of love, or art, or progress, or hope, or beauty, or all those – or it’s a timeless transcendent non-human creation ratified by a supernatural being – in which case it is making factual claims that are entirely open to dispute and investigation.

It’s a dishonest form of sleight of hand to try to blur the two and then pretend the whole thing is off-limits to inquiry.



Majority-Minority

Jun 1st, 2004 8:30 pm | By

There is a lot lurking behind this question (as there so often is with questions of this kind) about what is more interesting – the widespread acceptance of a given social practice or custom, or the minority dissent from it. For one thing there is the comparison or analogy with everyday life and with present politics, reform, ideas of progress and improvement. Looked at in that way, it may be said that at least in some ways the reformist side is more interesting than the pro-status quo side. That’s almost a truism, or what Jerry S calls, in that scholarly way of his that I can never hope to emulate, an argument by definition. Imagine to yourself a conversation. X says ‘the way this is done is all wrong and could be done much better’ and Y says ‘nonsense, the way it’s done now is exactly right and should be left as it is.’ Which is more interesting? Which offers more of an opening for further conversation, for thought and research, for something to do and plan and hope for? (Other things being equal, of course – assuming it’s not the case that the reformer is a monumental bore and windbag while the status quo-ist is not. Sadly, that is not always the case in the real world.) Or to put it another way, X says ‘the way this is done is unjust and an outrage and causes needless misery to millions of people’ and Y says ‘Oh? I’ve never thought about it’ and changes the subject to last night’s game. Which person (ceteris paribus again) seems more boring?

So in that sense it may seem true that dissent from the majority way of doing things is more interesting than mindless or conformist acceptance of it. Thoreau certainly seems a great deal more interesting than the quietly desperate people around him. Huck is more interesting, with his despairing decision to go to hell, than the self-regarding slaveowning hell-avoiders around him. The great Bartolomé de las Casas is more interesting than the murderous Indian-abusers he exposed. Montaigne is more interesting in his views on Indians in ‘On Coaches’ than the indifferent people around him. As people, dissenters and reformers are generally more interesting – though that also of course depends on our views of what it is they’re reforming. On exactly what the majority and minority practices are. I’m not sure I find people who rail about the dangers of women running around in the streets with their hair uncovered just doing whatever they want to do without asking a man, particularly fascinating. So there’s another complication, another place we have to be careful to distinguish between some and all. But in addition to that, another reason the conformist view is also interesting is because it is not just personal. It is also for instance a moral question. I for one find it fascinating to read about the rationalizations that Southern slaveowners were able to come up with, because it is interesting to know how people can justify to themselves what now seem to us intolerable cruelties. I find it interesting to read the Jefferson-Adams letters, for instance, partly for this reason. Jefferson is a fascinating study (and I’m obviously not the only one who thinks so: there have been a good few books on the subject lately). He had all the equipment to see things otherwise, including his friendship (broken for many years then restored) with both Adamses, yet he didn’t. Surely the reasons are interesting! Was it just because he wanted the wine and the books and the upkeep of Monticello? Would he have thought the same if he hadn’t owned any slaves? If not, can any of us trust ourselves to make disinterested judgments on moral questions? And that’s just one example.

Well it’s a large subject, and I don’t have time to write a book on it, so that will do for now.



Is the Ubiquitous Interesting?

Jun 1st, 2004 1:55 pm | By

Some people find inter-blog disputes tedious, other people fun. And no doubt many people who claim to find them tedious actually find them fun. But this at least is a dispute about a substantive matter…

So to business. Ralph on Clio. He claimed, a while ago, on B&W:

“When something is ubiquitous, the interesting question isn’t ‘how could it have been tolerated?’ because it was commonly and widely accepted.”

I think this is very silly. Ralph objects to my thinking it very silly. He says:

I made the claim in the context of a discussion of slavery and its ubiquity in the early modern world. Explaining the presence of pro-slavery arguments in a world in which slavery was ubiquitous is less interesting, I think, than explaining how an anti-slavery argument emerged in the face of slavery’s ubiquity. It is important to understand received frameworks and institutions and, beyond that, to understand how even a ubiquitous institution like slavery varied from place to place. But history’s drama is not found in received frameworks and institutions. Rather, it is found in the emergence of subversive challenges to and contentions with them. So, the interesting question is how anti-slavery emerged in the face of slavery’s ubiquity or, as certainly, how feminisms emerged to challenge the ubiquity of patriarchal “known worlds.”

So let’s unpack this paragraph.

I made the claim in the context of a discussion of slavery and its ubiquity in the early modern world.

Yes, but the claim was not a specific one about slavery. "When something is ubiquitous…" It would have been very easy to have phrased it in a more restricted way (e.g., "what was interesting about slavery"). Precision of language matters, if you want to be understood.

Explaining the presence of pro-slavery arguments in a world in which slavery was ubiquitous is less interesting, I think, than explaining how an anti-slavery argument emerged in the face of slavery’s ubiquity.

Sure. I can agree with that. But it isn’t the same claim. The fact that something is "less interesting" doesn’t mean it is not interesting. But the definite article in the first claim ("the interesting question") suggests that other issues are not interesting at all. Again, precision of language counts.

It is important to understand received frameworks and institutions and, beyond that, to understand how even a ubiquitous institution like slavery varied from place to place.

Agreed.

But history’s drama is not found in received frameworks and institutions.

There’s a hint of an argument by definition here. If the claim is that only drama is of interest in historical terms, then that’s just wrong.

So, the interesting question is how anti-slavery emerged in the face of slavery’s ubiquity or, as certainly, how feminisms emerged to challenge the ubiquity of patriarchal “known worlds.”

And we’re back to the definite article again. The interesting question…



Jokes and Conversations

May 30th, 2004 11:20 pm | By

One or two or more unrelated items of interest. One from Normblog, a moment of dialogue with a very sophisticated theorist of some sort –

All Googling, and that includes self-Googling, is culturally specific and also gendered. There’s an excellent paper on it by Lesley DeTrobe. He-or-she – that’s Lesley I’m talking about, who has renounced maleness and femaleness since June 1999 – there deconstructs the notion of a universalizing universalism, showing this to be a grwelphdoop. The concept of grwelphdoopism is one of DeTrobe’s most illuminating accomplishments. Think Foucault, think Derrida, think Dr Susie Nupledor Jr and her black dog Melvy. Of course, ‘dog’ is itself one of the very grwelphdoops sent packing by DeTrobe, but in the language game ‘humans and animals’ Melvy is a dog.

In the language game animals and humans, on the other hand, no doubt Melvy is a human and Nupledor is a dog – or perhaps Nupledor is just That One and the beloved Nylabone is a human. What would Wittgenstein do? Anyway. There was also a great deal of innocent fun to be had in reading this article about Colin Wilson. I also heard him interviewed on Front Row the other day. What a dork he does sound! I’ve never been the smallest bit tempted to read him, I must say, especially not after reading Kingsley Amis’s and Dwight Macdonald’s lovingly savage essays on The Outsider, both of which make him sound like the biggest pseud of all time. This article merely reinforces that impression – which makes it a very amusing read.

Where it drags is when he gets on to his ideas. His philosophy is basically existentialism with non-rational excrescences and characterised by bizarre nomenclature – Faculty X, Upside Downness, Peak Experiences, Right Men, The Dominant Five Per Cent, King Rats. It seems to constitute an attempt to classify human feelings and behaviour as written by a Martian who has never met an Earthling…As a child he was so introverted, so uninterested in other people, he might have been diagnosed today with Asperger’s syndrome…In a way, sex was his salvation – he wanted to sleep with girls so was forced to talk to them.

You know – I’m not sure that’s all that Asperger’sish. As a matter of fact I have an idea it’s quite common. But maybe the men are just odd here in Antarctica.

And then there was this Thinking Allowed with Deborah Cameron. I commented on another Thinking Allowed with Deborah Cameron a longish time ago – perhaps a year or so – and this one is very good too. It’s about the globalization of nicey-nice talk and why that’s not an entirely brilliant idea. Then there was this more recent one about shyness, which deals with some of the same issues. Sometimes not talking is simply a choice rather than a social handicap or a disease, but that possibility seems to get overlooked. Interesting stuff.



The Gulf

May 29th, 2004 8:31 pm | By

It’s all quite interesting, as I said – albeit in a rather depressing way. It shows what a great yawning gulf there can be between secular modes of thought and the non-secular variety; between rational discussion and irrational discussion; between unbelief and belief; between atheism and theism. I’m not sure I think that’s always the case. I think it may be possible for atheists and theists to find some common ground at least sometimes. At least I think I think that – but maybe I only hope it, or would like to think it, or feel as if I ought to think it, because it seems so rude not to. (Which of course is a bad principle, and just the sort of thing B&W is supposed to be against, but it’s not always easy to extirpate every vestige of bad thinking from one’s equipment, is it.) But I must say sometimes I’m just not sure. When I think airily in the abstract about generalities and likelihoods, it seems possible that the two sides can talk (surely?), but then when we get down to it, it rapidly becomes apparent that we can’t.

At least, not unless we maintain a polite silence. Not unless atheists are bashfully, tactfully, politely quiet about their atheism. Not unless we in fact pretend that we don’t actually disagree with our theist pals about anything. If we pretend it’s all just a matter of taste or inclination – you like blue, I like red; you like pizza, I like curry – then it’s all right. If the atheists pretend, or at least allow it to be assumed, that they’re atheists because they haven’t thought about it, or they’re lazy, or they just haven’t met the right god yet, then that’s okay, and we can all get along. But if atheists are so aggressive and unfriendly as to say they’re atheists for a reason, and to say what that reason is – then they are ‘strident’. And that’s at best – that’s almost a compliment. ‘X is strident but not actually a homicidal maniac.’ To the really committed theist, atheists are indeed – apparently by definition – homicidal maniacs. Atheists are to be called Madame Defarge and fans of Stalinist outrages – and this by the very people who complain that secularists don’t always want to work with theists! It’s really quite hilarious, in one way.

But in another way it isn’t, in another way it’s a symptom of the entrenchment of irrationalism in sectors where it ought to have faded away a long time ago. And this is another reason – or another version of the same reason – secularists don’t want to work with theists. It’s because theism depends on irrationalism, and secularists are just never sure that the theists can keep their irrationalism confined to that one area. How can we be sure? How could we? If you let yourself believe something against the evidence in one area, why would you not do so in other areas? What is the difference? What is the bright line that tells people ‘faith okay here, not okay there’? How do people know when to be credulous and when not to be? That’s what secularists don’t know, and that’s why we’re always a little leery of working on secular political issues with people who are not committed to secular ways of thinking. Maybe that’s wrong, maybe there is a bright line, maybe the theists are perfectly capable of keeping things separate. But it’s hard to believe that sometimes.