Proliferation

Sep 16th, 2003 12:32 am

It’s interesting how ideas can go off in unexpected directions. Sort of a six degrees of separation thing – it can seem as if any given idea can lead to any other in three or four steps, however remote they may seem at the beginning. I noticed it yesterday, for instance: I started writing my TPM essay thinking it was going to be about one thing, and after the first paragraph found myself talking about something quite different. I started out thinking the idea led into one subject (and it did) but in the writing found that it also led into another, so followed it there instead.

The core idea was that of competing goods. A familiar enough idea: that many desirable things are incompatible with many other desirable things. Equality and freedom, just for one example. So I meant to do a semi-jokey rant about the unfairness of the arrangement, but found myself instead doing an ironic rant about the unfairness of various other arrangements, and never got to the competing goods aspect at all. Then this morning I was reading a bit of Wayne Booth’s The Rhetoric of Fiction, for no particular reason (I often read things and bits of things for no particular reason), and he said something interesting about ‘the tension between sympathy and judgment,’ which made me want to write the first essay again. We’re never finished with ideas. We take them on, we think about them, we wrestle with them, we come up with a further idea or two, we think that’s that. And then a day or two later a new thought occurs, and we realize that’s not that after all. And so we keep ourselves occupied.



Just a Bit More

Sep 14th, 2003 11:36 pm

Just a little more about the religion article. Because there really is a lot of nonsense in that piece. I only talked about some of it, and I find there’s another bit I just can’t leave alone, in the last paragraph.

It is often said that science answers “how” questions while religion asks “why”, but that is simplistic. The greater point lies in their scope. Religion, properly conceived, attempts to provide an account of all there is: the most complete narrative that human beings are capable of. Science, by contrast, is – as the British zoologist Sir Peter Medawar put the matter – “the art of the soluble”. It addresses only those questions that it occurs to scientists to ask, and feel they have a chance of answering. The account it provides is wonderful. It has shown that the universe is incomparably more extraordinary, and altogether more glorious, than could ever be conceived by the unaided imagination. Yet it succeeds by narrowing its focus, as a matter of strategy. The story that science tells us, then, does not stand in contrast to that of religion (properly conceived). It is embedded within it.

The longer you look at that the more ridiculous it becomes. First of course there’s the obvious point, that religion can ‘ask why’ all it wants to, but it shouldn’t be forgotten that it can’t answer the question any more than anyone else can. It claims to answer it, of course, but as I keep saying, that’s another matter. Claiming isn’t doing; the word is not the deed and shouldn’t be taken for it. But that’s a comparatively minor point next to the really absurd last three sentences. Science is somehow inferior or subordinate to religion because it narrows its focus, it addresses only the questions scientists feel they have a chance of answering. Oh, I see – that’s a problem, is it? It’s better to do what religion does, and ask questions it doesn’t have a chance of answering? And then answer them anyway, by the simple expedient of making it up? That’s better, is it? Ask impossible questions and then make up answers instead of finding pesky old evidence? Thus coming up with the most complete [however fictional] narrative that human beings are capable of? What about those of us who don’t actually want a ‘narrative’ (which is a nice way, i.e. stealth rhetoric, of saying myth or fairy tale or story) but instead want an explanation or a hypothesis? Are we ‘embedded’ in the story that religion (properly conceived) tells us too? I refuse, I refuse to be embedded.

A reader emailed me the witty suggestion that the article is a Sokal-ish hoax. Interesting thought. ‘Perhaps funnier though mortifying if he really meant it. I hope it is an attempt to expose how “religious tolerance” allows utter drivel to not just be printed but thought.’ Indeed. Religious tolerance has a lot to answer for.



Another Stack of Jumpers

Sep 14th, 2003 1:39 am

Oh good, more fuzzy-headed nonsense about religion. There does seem to be an inexhaustible supply of it out there. This one is so full of odd, vague, fuzzy statements it’s hard to know where to begin.

One of the highlights of this week’s meeting of the British Association for the Advancement of Science was a discussion on why, although the existing religions do not capture all of what’s out there in the universe, some at least of their endeavours must be taken seriously.

Well what on earth does that mean? ‘Endeavours’? What do you mean endeavours? For that matter, what do you even mean by ‘capture all of what’s out there in the universe’? What do you mean by ‘capture’, what do you mean by ‘all of what’s out there’? Oh never mind. But ‘endeavours’…that could cover a lot of territory. Kindly hand-waving, or telling people they’re damned to hell if they masturbate, or flying planes into tall buildings – the existing religions have had a part in all those ‘endeavours’. Yes I know the planes thing isn’t an official part of the religion in question, but that’s just it, that’s why we need some precision of language here. But vaguest of all – what do you mean ‘must be taken seriously’? I do take religion very seriously indeed, I assure you; I think it can be very dangerous, and I also think it does a lot of more subtle harm even when it’s not actually killing or damning people. But perhaps what is meant by ‘taken seriously’ is ‘believed’. One of those bits of stealth rhetoric like ‘on its own terms’ which we discussed a few weeks ago. Only people don’t want to come right out and say ‘There are some at least parts of religion which must be believed’ because that would seem a bit much. Ever so slightly coercive. So instead there’s just some vague unmeaning rigamarole about taking some (unspecified) endeavours seriously. Well sure, we can all agree to that, right, hon? On account of how we don’t know what it means.

Watson’s and Dawkins’s atheism is rooted at least in part in a mistake. They seem to assume that serious interest in religion must be fundamentalist.

No they don’t. You just assume they do, perhaps because you don’t think anyone can actually disagree with a little harmless theism. But you assume incorrectly. Dawkins is by no means talking only about fundamentalism. He is among other things talking about this line of nonsense:

Religion of course can be discussed from many angles, but the absolute and immediate importance of religion lies in its contribution to morality.

What contribution? What contribution does religion make that philosophy and other kinds of secular thought cannot make? What qualifies religious people to pronounce on morality? The Bible? No, because we pick and choose which bits of the Bible we admire and which we don’t, so clearly we’re using our own judgment on that issue (and besides Tudge has already told us he doesn’t mean fundamentalism, so that lets out Biblical inerrancy). Long practice in thinking about the subject? But moral philosophers can make the same claim, and so can thoughtful amateurs. What then? Just tradition and authority, custom and habit, as far as I can tell. And those aren’t good reasons.

And then there’s this ridiculous assertion:

Religion, properly conceived, attempts to provide an account of all there is: the most complete narrative that human beings are capable of. Science, by contrast, is – as the British zoologist Sir Peter Medawar put the matter – “the art of the soluble”. It addresses only those questions that it occurs to scientists to ask, and feel they have a chance of answering.

Stealth rhetoric again. ‘Complete.’ Well yes, no doubt (although really I would think people like Tolkien and his many imitators who write those great, thick books full of imaginary worlds would give some stiff competition), but since the narrative isn’t true, why is that such a stunning achievement? Are we all supposed to be little children who just want to be told a Story and let it go at that? There’s a very, very great deal of merit to the art of the soluble – to not offering answers to questions you don’t in fact think you have a chance of answering without resorting to ‘narrative’. There’s something deeply unattractive about trying to persuade people that it’s better to have complete but untrue narratives rather than incomplete but well-supported explanations of soluble questions.



Rigorous What?

Sep 12th, 2003 8:06 pm

There’s a bizarrely idiotic argument from a commentator on NPR here. The subject is religion, and the Brights, and Dennett’s editorial again. The commentary starts off with the name, which I have no intention of defending: I think it’s absurd, and I’d rather be nibbled by sharks than call myself a Bright. But then it goes on.

55% of people with post-graduate degrees (lawyers, doctors, dentists, and the like) believe in the Devil. 53% believe in Hell. 72% believe in miracles. Remember these are people with post-graduate educations. 78% of them believe in the survival of the soul after death. 60% believe in the virgin birth. And 64% believe in the resurrection of Christ. You can’t get a post-graduate degree without being taught rigorous examination of evidence – figuring out which symptoms indicate a particular disease, or what facts could justify a lawsuit.

Sure you can, if you pick the right subject. I believe there are PhDs in theology, for instance, and in Critical Theory. And besides that, learning rigorous examination of the evidence that applies to one field is not automatically the same thing as learning what counts as evidence in general. Don’t we all know that? Don’t we all know people who are expert in their own field and lost in the fog as soon as they leave it?

Skeptics would say that the human need for something beyond the realities we can touch is so strong that even highly educated people end up manufacturing delusional belief systems. But there is another possibility – that some of these rationally oriented people have found actual proof for their beliefs. Maybe they’ve had a personal supernatural experience with prayer that makes them believe in God or an afterlife. Maybe they’ve found a compelling logic to their views. Perhaps they’ve looked at the universe and said, “something made the big bang happen.” For some highly educated people, faith is not a matter of faith. Rather, they see around them evidence. Evidence that is, to be sure, hard to explain or prove to others, but is nonetheless quite compelling to them.

Our commentator, for instance, seems to be pretty much lost in the fog. Just for a start, ‘actual proof’? A ‘personal supernatural experience with prayer’ constitutes ‘actual proof’? Is that the rigorous examination of evidence Steven Waldman was taught when he got his postgraduate degree? First, to say proof when he means evidence; second, to take someone’s ‘personal supernatural experience’ as evidence; and third, to claim that evidence that can’t be explained or proven to others is nevertheless evidence? Isn’t that a bit of an oxymoron? If evidence is convincing to no one but the person who presents it as ‘evidence’ then it really isn’t evidence, is it, it’s something else. By definition. One would think that would be one of the very first things one would learn when being taught this ‘rigorous examination of evidence’ Waldman says all these highly educated believers in the Devil and Hell and the resurrection of Jesus are in fact taught. Perhaps Waldman skipped class that day.



2001, 1973, 1953

Sep 11th, 2003 7:06 pm

Well, I know, it’s all too obvious, everyone is saying it, it’s in the air this year (it wasn’t last year, as far as I remember). Maybe because the anniversary is a round 30 instead of 29, and clearly also because of Peter Kornbluh’s book. But however obvious it is, I’m going to say it anyway. September 11 is a horror-anniversary for Chile as well as for the US. And, stomach-churningly for an American, the Chilean horror show was in large part caused and helped and funded and backed by the US. Which emphatically does not mean that I’m saying we got what we deserved or that Osama bin Laden and his disgusting pals were avenging Allende. But it does mean that the US has done its share (well no, more than its share really) of massacring innocents, abusing human rights, turning a blind eye to murder and torture, and overthrowing democratically elected governments.

And that’s not widely known here. We’re famously amnesiac about history, and history is anything that happened more than about five years ago. But it’s not entirely our fault: it’s not as if the popular media remind us every few days of Pinochet’s coup against Allende and the role Nixon and Kissinger played in the whole thing. And if Chile isn’t discussed a lot, what happened in Iran two decades earlier is even less well-known – and that CIA coup could possibly have more connection to al Qaeda than the Chilean one does. If Iran had been allowed to keep its democratically elected government in 1953 instead of having it snatched away by the CIA and replaced with the Shah and his secret police…who knows what might have happened, who knows whether the idea might have caught on and spread or not. We seem to think that democracy in Iraq will be catching, so maybe it would have been equally catching in Iran fifty years ago. Maybe the whole Middle East would have become democratic, peaceful, prosperous, happy – and thus gutted the pool of recruits for al Qaeda. Who knows. I don’t know, but I do wonder. So even apart from the obvious moral questions, it also seems highly likely that the Iranian coup was a truly terrible idea on consequentialist grounds, for us as well as for the people of the region.

There’s an interesting discussion of the Chilean matter at Crooked Timber today.



Happy Birthday to Us

Sep 9th, 2003 8:50 pm

Well, just think – Butterflies and Wheels is a year old. Yes, it’s our birthday. It depends on when you start counting – if it’s from the first day there was anything at all on the site, then the birthday was a few days ago. But I’ve decided to count from the first articles we posted in News, and that was September 10. Three hours from now in London where one leg (or wing) of BandW is, and eleven hours from now in Seattle, where the other leg is. Close enough.

It’s funny – we expected a good deal of hostility and criticism. In fact one of us was looking forward to it, and has been disappointed that we’ve had so little. But then again it’s very good to know that there in fact are more than three or four people out there who are not charmed by Nonsense. So apart from the non-appearance of entertaining insults and abuse, it’s been a good first year.

And much of that is down to the stars who sent us articles or interviews right at the beginning, when we were a completely unknown quantity. Mary Lefkowitz, Richard Evans, Norman Levitt, Steven Pinker, Allen Esterson, Simon Blackburn, Daniel Dennett, Robert Nola. They got us off to a very strong start, and we love them all.



Fair and Unbalanced

Sep 8th, 2003 10:40 pm

There is an interesting post and discussion on Crooked Timber today, on the tension between trying to work out a reasoned position on issues like global warming, and the political commitments of some (or all?) of the sources one relies on to make such judgements. It grabbed my attention because of course that tension is what B and W is all about. Also because I bump against it (can one bump against a tension? never mind, two idioms collide) all the time in going about my daily task of finding news and other links. ‘Hmm, interesting article, makes some good points, but do I really want to link to the Washington Times/Reason/the Telegraph?’ Sometimes I do, sometimes I don’t. Well…actually I don’t think I ever have linked to Reason or the Washington Times. So far I haven’t found anything brilliant enough to over-rule my intense distaste for both of them. Other right-wing sources I don’t mind so much. But strictly speaking, perhaps I ought not to think about it that way (if it can be called thinking – it’s more like a reflex). Perhaps I ought to be sublimely unaware of the source, and link or not link purely on the merits. If this article would be good enough from the Guardian or the Independent, it ought to be good enough from Fox News or Rush Limbaugh. But I’m not, and it’s not. I do set higher hurdles for the very right-wing stuff. If there were a shortage of material I might not, but that’s not the case. So that’s my bias at work: now you know.



Doubt is Possible

Sep 7th, 2003 8:22 pm

This is an interesting little case study in the use and abuse of evidence, investigative techniques, language and rhetoric, inference and conclusion. One of those (all too familiar) occasions when attention-seeking and self-aggrandizement dress themselves up in scientific (or pseudo-scientific) vocabulary and give the whole enterprise a bad name.

Dominique Labbé, a specialist in what is known as lexical statistics, claims that he has solved a “fascinating scientific enigma” by determining that all of Molière’s masterpieces…were in fact the work of Pierre Corneille…”There is such a powerful convergence of clues that no doubt is possible,” Mr. Labbé said. The centerpiece of his supposed discovery is that the vocabularies used in the greatest plays of Molière and two comedies of Corneille bear an uncanny similarity…Mr. Labbé contends he has infallible statistical evidence of Corneille’s “fingerprints” all over Molière’s greatest works.

Well that’s a bad sign right there, a ‘scientist’ saying that no doubt is possible. That’s a pretty dodgy thing to say even with overwhelming evidence – in fact one could say it’s simply nonsensical, because doubt is always possible. And in real science (which avoids words like ‘infallible’) it’s also essential. A putative scientist who tries to rule it out in advance, even (or no, perhaps especially) when talking to journalists, is letting the side down. And Zanganeh’s article does an excellent job of collecting quotes from other scholars (scientists, honest inquirers) in other fields that point out what a lot of doubt is possible and why.

“Lexical statistics can be useful as an exploratory tool with a descriptive and investigative goal…In no way can it be used as a proof.” In a nutshell, attribution of authorship necessitates a convergence of presumptions. Joseph Rudman, a professor of applied statistics at Carnegie Mellon, agrees that even the best authorship-attribution studies could yield only probabilities. “You can never say definitely, just like in a DNA result,” he said…Indeed, at the heart of this debate lies a more fundamental question about the use and abuse of scientific tools in the field of letters.

But it could be argued that the way Labbé has used it, his method isn’t really a ‘scientific tool’ at all. Obviously these matters are always contested and debated and argued over, but surely one hallmark of good science is not producing evidence of one narrow fact (that Molière and Corneille used very similar vocabularies in their plays) and extrapolating from that to much broader claims. And surely another is not relying on only one narrow piece of evidence to support (let alone ‘prove’) a claim which would require evidence from a number of other fields – a ‘convergence of presumptions’. Ignoring, apparently not even noticing, what kind of evidence one needs to make a case, is hardly good science.
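The point about one narrow statistic is easy to demonstrate. Here is a toy sketch – emphatically not Labbé’s actual intertextual-distance method, and the two snippets are invented for illustration – showing how a crude vocabulary-similarity measure can score highly for reasons that have nothing to do with authorship:

```python
# Toy illustration: vocabulary overlap is easy to measure but proves nothing
# about authorship on its own. A deliberately crude stand-in (Jaccard
# similarity of word sets) makes the point.

def vocabulary(text):
    """Lower-case the text and return its set of distinct words."""
    return set(text.lower().split())

def jaccard(a, b):
    """Jaccard similarity of two vocabularies: |A & B| / |A | B|."""
    va, vb = vocabulary(a), vocabulary(b)
    return len(va & vb) / len(va | vb)

# Two hypothetical snippets by different "authors" writing on the same
# subject can overlap heavily: shared topic words and common function
# words inflate the score.
text_a = "the king commands the army to march on the city at dawn"
text_b = "at dawn the army will march on the city as the king commands"

score = jaccard(text_a, text_b)
print(f"Jaccard similarity: {score:.2f}")  # prints "Jaccard similarity: 0.75"
```

The high score here reflects shared subject matter and shared function words, not shared authorship – which is exactly why such measures are exploratory and descriptive, and why an attribution claim needs the ‘convergence of presumptions’ that Labbé’s critics insist on.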



Our Banner: No Consensus for Loonies

Sep 5th, 2003 5:15 pm

Here’s an item for all you students of artful rhetoric: an article about pagans, Wiccans and other ‘alternative’ groups and the use they make of Stonehenge and similar sites. Pure wool from beginning to end – enough wool there to make jumpers for the entire Butterflies and Wheels staff.

Spiritual site-users, specifically Pagans and Travellers, have traditionally been negatively represented by the media…However, this report outlines the growing need for recognition of the rights of Pagans, who come from all walks of life…Pagan and other spiritual site-users believe that the spirits and energies of the land can be most strongly felt at sacred sites enabling connections to be made with our ancestors.

Yes, and? So what? What if I believe that the spirits and energies of [your choice of entity here] can be most strongly felt at the library, or the nearest art museum, or the theatre, or your living room, and so go there and carry on in a manner of my choosing? Will two Doctors at Sheffield Hallam University do research that outlines the growing need for recognition of the rights of me to do whatever I want to because I come from all walks of life? If so, why? And apart from that, apart from the consequentialist aspect of the question, what of the epistemological one? What about this fatuous nonsense that ‘spiritual site-users’ believe or claim to believe? Why the studied refusal to mention the nonsensicality of the belief?

Increasingly, they are campaigning to be able to engage with the sites in their own ways…Activities such as the lighting of fires and graffiti on the monuments, which have taken place as part of rituals, are in opposition to the preservation wishes of heritage managers. An extreme example of destructive behaviour was demonstrated by one such alternative group who decided that a stone circle was not positioned correctly and so they attempted to move it. Whereas Pagans are very active site users, heritage managers tend to promote a passive visitor experience, which includes accessing information on display and utilising the visitor centre. The research discusses how, for Pagan and spiritual site-users, the land isn’t something to visit or marvel at in this way, it is a living landscape and to be at the site is like ‘coming home’.

Sly, no? ‘Active’ versus ‘passive’ – now we know active is good and passive is bad, right? So obviously these energetic, lively, engaged, take charge Pagans are the Good People, and the spineless supine limp passive weak ‘heritage managers’, otherwise known as archaeologists and historians and tiresome pedantic people like that – they’re obviously the Bad People. The dear Pagans improve boring old places like Stonehenge by scribbling on them, setting fire to them and moving the stones around. Now that’s what I call initiative! Whereas the dreary old scholars just stand around and study the thing. Yuk, so left-brain, so linear, so Eurocentric and rational. And yet nobody represents them ‘negatively’ in the media.

The recent news that Stonehenge is to get a £57m makeover to improve its visitor centre and facilities is a step in the right direction, but further demonstrates this passive view of the visitor experience held by heritage managers. This research argues that although the views of heritage managers, archaeologists and spiritual site-users appear irreconcilable, the future of these sacred sites depends upon education, negotiation and consensus between all groups.

Why does the future of these ‘sacred’ sites depend on those things? Why is negotiation required? And especially why is consensus? Why can’t they just be ignored? Why can’t people who hold (or pretend to hold because it makes life interesting and gets them some attention even if it is ‘negative’) ridiculous unfounded ideas simply not be taken into account if their ideas entail destructive practices? What if a group decided it could only properly appreciate Shakespeare by eating the four remaining copies of the First Folio – would we have to negotiate and reach consensus, perhaps allow them to eat two, or chew all of them but then spit them out again? Or would we just tell them to go away. Let’s do more of that.

P.S. Thanks to PM, here is ‘sacred sites’ for your exploring pleasure.



Psychology and Psychiatry

Sep 4th, 2003 9:29 pm

We had a discussion/disagreement recently about the validity or otherwise of psychiatric diagnoses or labels, designer drugs, and the DSM [see Comments on the N&C ‘Opinion’ on 26 August if you’re interested]. I was browsing my disorderly collection of printed-out articles this morning and so re-read this article by Carol Tavris that I posted in News last March. What she says is highly pertinent to the discussion/disagreement. In fact, it raises a whole set of questions that are very much B and W territory: what is science and what isn’t, what is pseudoscience, what kind of evidence is reliable and what isn’t and why, what kind of harm can be done by taking shaky evidence as more reliable than it is. And perhaps above all, the strange way the less reliable, less well-founded, less evidence-based branch of a discipline has become dominant in the public realm while the more cautious, skeptical, research-based branch is comparatively ignored.

Yet while the public assumes, vaguely, that therapists must be “scientists” of some sort, many of the widely accepted claims promulgated by therapists are based on subjective clinical opinions and have been resoundingly disproved by empirical research conducted by psychological scientists…Indeed, the split between the research and practice wings of psychology has grown so wide that many psychologists now speak glumly of the “scientist-practitioner gap”…Unfortunately, the numbers of scientifically trained clinicians have been shrinking. More and more therapists are getting their degrees from “free-standing” schools, so called because they are independent of research institutions or academic psychology departments. In these schools, students are trained only to do therapy, and they do not necessarily even learn which kinds of therapy have been shown to be most effective for particular problems.

And so we come to the DSM – the Diagnostic and Statistical Manual, that is, the ‘bible’ of US psychiatrists. The DSM is a product of the clinician side rather than the research side of this debate. There is a review-article here that discusses the same issues Tavris does while reviewing Science and Pseudoscience in Clinical Psychology (which has a foreword by Tavris): the differences (in all senses) between clinical psychologists and research psychologists, what kind of evidence they rely on, which treatment techniques are effective, which are ineffective, and which are actually harmful, and the importance of distinguishing between science and pseudoscience. The book also discusses the controversies that can erupt over these issues. And the DSM.

These two drastically divergent conclusions demonstrate not only how varying standards of evidence can result in vastly different perceptions of treatment effectiveness, but also how some standards, such as clinical-anecdotal literature and clinical acceptance, are inappropriate measures. These conclusions also raise concern about relying on the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (DSM-IV), which is also based on practitioner consensus rather than empiricism (American Psychiatric Association [APA], 1994). This concern is reinforced in the chapter titled, “The Science and Pseudoscience of Expert Testimony,” by Joseph T. McCann, Kelley L. Shindler, and Tammy R. Hammond, which observes that the APA acknowledges that DSM-IV diagnoses do not rise to the standard of legal evidence (APA, 1994).

Well there you have it. All those syndromes and disorders in the DSM rely on practitioner consensus (sounds like what Susan Haack calls ‘vulgar Rortyism’) rather than empiricism, and the diagnoses do not rise to the standard of legal evidence. Just what I’ve long thought, so it’s good to see it said so explicitly. And the Candace Newmaker case is discussed too, not surprisingly.

Tavris is not alone among the contributors in suggesting that these practices are not only unscientific, but harmful; at worst, they can lead to inappropriate custody decisions and jury verdicts. Lilienfeld and his coeditors expand upon this concept in their opening chapter, identifying varying degrees of harm that may result from pseudoscientific techniques. Some treatments are truly dangerous, such as the “rebirthing” techniques that gained attention only after the suffocation death of Candace Newmaker; “memory recovery,” which caused many innocent people to be accused-and some convicted-of heinous crimes; and “critical incident stress debriefing,” which exacerbates the trauma it is purported to mitigate.

This is a large subject, and one we’ll be turning our attention to.



Reading

Sep 4th, 2003 8:23 pm

Erin O’Connor says some very interesting things in this article in the Chronicle of Higher Education. They’re things I’ve been thinking for some time myself.

But almost everyone agrees with the astounding premise that it’s reasonable to use the freshman reading program to stage a political debate…On both sides of the debate, a book’s politics are assumed to matter more than its scholarly merit or literary quality…The tacit assumption by both liberals and conservatives that Chapel Hill’s summer reading program is more about politics than about reading should give us pause. We ought to be asking what it means to read opinionated works as either a confirmation or negation of identity — but instead we are fighting endlessly about whose identity gets top billing when readings are assigned.

Just so. One of the things that has soured or curdled or at least altered my leftist views or commitments – I still have them, but they tend to be hedged about with sighs or snarls or rolled eyes these days – is just that claustrophobic idea that politics is the only way to think. That if one is not thinking politically one is not thinking at all, and then that politics boils down to identity politics. What a deadly combination. First, come up with a radically diminished impoverished and in many ways regressive idea of what politics is about, and then make everything be about (that version of) politics. Then run absolutely everything in life through that dreary wringer and see the result: we’re not allowed to read Shakespeare or Austen any more without getting an endless turgid point-missing lecture on their failure to be as right-on about Colonialism or queer theory as we are. Yawn.

Yes, they are exposing the program’s considerable liberal slant, but only on the way to revealing their own embarrassingly impoverished concept of reading…Both liberals and conservatives should remember that there is no book worth reading that is not somehow partial to something, and that there is no education worth having that does not involve exposure to partialities other than one’s own.

And even that there are partialities that have little or nothing to do with politics, especially with politics in the narrow boring parochial sense in which we’ve taken to defining it.



Gloating

Sep 2nd, 2003 11:22 pm

I knew I was right to like ‘Queer Eye for the Straight Guy’! [see N&C August 27 if you care] Drone about stereotypes all you like, but hey, if it pisses off Brent Bozell, it’s right up there with Euripides and Chekhov, as far as I’m concerned.

“I want to vomit,” L. Brent Bozell, president of the Parents Television Council, which monitors TV content, wrote of Bravo’s smash “Queer Eye” in his weekly column last month. “Ever seen a show more dedicated to a ‘straight-bashing’ proposition? … Try this idea for a show and tell me how many seconds it would last in a Hollywood pitch session: ‘A team of five fabulous straight guys teach a masculinity-deprived gay man how to throw a football, hunt for game, drink something manlier than fruity wine coolers and appreciate the fiction of Tom Clancy.'”

Haaa! Suck it up, Brentster! You’re dead right, and that calls for champagne on the house. The dreary boring football-throwing game-slaughtering Tom Clancy-appreciating side doesn’t get to have a funny dishy silly giggly show on Bravo, oh isn’t that too bad. Well guess what, that’s because Tom Clancy is awful, hunting for game is boring and destructive, throwing a football is boring – in fact you’re kind of insulting your own sex with all those stereotypes, dude. Especially the Tom Clancy one – that takes more than being a straight guy, you have to be dead between the ears for that one.

And as an added benefit, Alan Wolfe seems to be irritated, too.

So how does marriage fit in? Not comfortably — at least not yet, says Boston College political scientist Alan Wolfe, who directs the Boisi Center for Religion and the American Public Life. “Americans make a pretty sharp distinction between things in private and things in public, and right now the bottom line is that sexuality is nobody’s business, but marriage is public.”

As the director of the OB Margin for Noreligion and American Peculiar Life, I say anything that annoys Brent Bozell and Alan Wolfe has a lot going for it.



High what? What brow?

Sep 2nd, 2003 10:55 pm | By

Perhaps this is cruel, or petty, but I think it needs saying. Rather often, actually, because we have here one of those incomprehensibly inflated reputations that the world is better off for deflating.

If NPR is the Promised Land of high-brow book publicity, what do you call an author who snags not one shot at public radio’s upscale, book-loving audience, but a recurring gig to talk about a book that he hasn’t even written yet?

Listen, if NPR is the Promised Land of highbrow anything at all, what to call some author is the least of our problems. (Not to mention the slight oxymoron of ‘highbrow’ [what an obnoxious word] publicity, but never mind that.) NPR (the US’s National Public Radio, that is) is about as highbrow as ‘Who Wants to Marry a Millionaire?’. People here only think it’s ‘highbrow’ because everything else on the radio is so much worse. But that is setting a pitifully low standard! NPR insults all our intelligences every time it opens its mouth. Highbrow, indeed! It’s about as highbrow as Fox is fair and balanced.



Broad Brush

Sep 2nd, 2003 7:39 pm | By

Well, clearly we at B and W take it as our self-appointed mission to say, with varying degrees of mockery and rudeness, when we think our fellow leftists are being silly, but there is a limit. Which is to say we try to do it with a certain amount of precision and accuracy – in fact accuracy broadly construed is the whole point of the enterprise: when ideology or political commitment is in conflict with the truth, it ought not to be the truth that gives way. That applies all around, not just to them there pesky leftist intellectuals. All of which is to say there is a very sloppy article in Prospect that doesn’t worry enough about precision and accuracy.

To look back at the responses which the murder evoked from the literary and political intelligentsia is to see something more than many clever and famous people making fools of themselves…But it was writers-with-a-W who really excelled…Imaginative writers are distinguished not by a sweeter character (too often very much not), greater intellectual honesty, or even deeper intelligence…If the old Leninist left was buried politically in the rubble of the Berlin wall, the literary-academic intelligentsia disappeared morally in the ashes of ground zero.

What on earth is he talking about? Writers? What writers? All writers? That’s a hell of a lot of people, not all of whom say the kind of thing he’s complaining of, to put it mildly. Does he mean fiction writers, or fiction-and-poetry writers? At times he seems to, with ‘imaginative writers’ (which is a silly phrase), but then at others he talks about the ‘literary-academic intelligentsia’. Well, which is it? Surely he doesn’t think the two are identical, does he? Does he mean both? If so, then why mention ‘imaginative writers’? And in any case, again, that’s much too broad a brush, because there are plenty of academics who don’t fit his category, and even some ‘imaginative’ writers as well. So what does he mean? Not really much, apparently. Some novelists and some academics, is what it boils down to; but then why not say that? Why on earth write the thing as if all ‘writers’ or all ‘imaginative’ and academic writers were guilty of saying stupid things about September 11? Anti-intellectualism perhaps? But if so, what in hell is an anti-intellectual piece of claptrap doing in a magazine like Prospect?

[Confusing me, for a start. There I was railing at the wrong magazine. Selective attention, or cognitive miserhood, or some other nice alibi for stupidity.]



Influenza

Sep 1st, 2003 11:48 pm | By

As you may have noticed, I have a perennial or chronic or obsessive interest in the question of what one might call cultural influence. Or one might call it memes, or fashion, or groupthink, or conformity, or any number of things. And in being interested in that, I also become interested in the self-fulfilling prophecy. That is to say, I’m interested in the way people (especially influential people) say things like ‘Most Americans believe in God/family values/the market’ and the statement becomes a little bit more true for having been said. I say Americans partly because I am one so I hear more of the American version than the UK one, and partly because I think there is probably more of that kind of thing here than there is there. It may be no accident (she said darkly) that the word ‘bloody-minded’ isn’t part of the American idiom. I think it’s one of those things like the apocryphal 900 Inuit words for snow, or however many it’s supposed to be. Sapir-Whorf. That we don’t have the word and so we can’t really quite imagine the state of mind. If people keep nagging us endlessly to have ‘faith’ or to admire dimwits in high office, well, there’s a bit of a water-on-stone effect. Drip drip drip. We like to think we’re terrifically independent and individualistic, but…that nice herd of sheep does look awfully cozy and happy and comfortable, I think I’ll just sidle over and kind of insinuate myself among them.

So when a lot of Deep Thinkers and think tank occupants and Religious Leaders and Senators and actors and suchlike important people make what seem to be simple factual descriptive comments on what a religious people we are, they’re not just describing or stating the facts, they’re also telling us to go and do likewise. Most people do and therefore you ought to too, obviously. Most people do and so who the hell do you think you are doing something different? Most people do and therefore they must be right, because most people are never wrong. Most people do and so you have to because this is a democracy, remember?

It’s not usually explicit, that kind of thing. But then that makes it all the more insidious. Perhaps if it were explicit, even we would be bloody-minded enough to resist. But since it’s not, the drip drip has its effect. We start to feel a little uneasy. ‘Who do I think I am? Can that many people all be wrong while I’m right?’ And so the statistics inch up and up, until pretty soon the opinion polls will be telling us that a full 99% of Americans believe in God and that last 1% will just have to move to a cabin in the mountains or emigrate or something.



Pedantry

Sep 1st, 2003 2:25 am | By

Well it’s shooting fish in a barrel, but I just have to say something. I know it’s an easy target, people getting university degrees in video games. But so what? Did I ever sign the International Agreement on Not Shooting at Easy Targets? Not that I remember.

And there is actually a serious point to the whole matter – which is that people seem to have no idea that there is, or there can be, or it is possible to imagine that there is, any difference between education as vocational training and education as a good in itself. If vocational training is the only purpose of education, then fine, teach people to design video games, there’s good money in it. But if it has anything to do with ideas about valuing understanding and knowledge as intrinsic goods for humans, then teaching people to design video games at university might not be such a brilliant idea. Maybe that would be a better subject for technical colleges. But meditation on such possibilities seems a bit scarce in video game circles.

More such research will boom, says Janet Murray at Georgia Tech’s School of Literature, Communication, and Culture. ‘There is this critical need for the game designers of the future to be broadly educated in the liberal arts,’ she says. ‘It’s not surprising that several people working in game design at higher levels hold degrees in film.’

So…broadly educated in the liberal arts means having a degree in film? Not history, not philosophy, not French or German, but film? Will education in the future be carried on entirely by means of pictures? With the slight limitations that implies? One can’t help wondering. The fish are a little too comfortable in their barrel.



Shrill? Moi?

Aug 29th, 2003 8:31 pm | By

But then it’s the fashion, Humpty Dumptyism is. Or perhaps that’s wrong, perhaps it’s never not been the fashion, in which case it’s not the fashion, it’s just what humans do. No more a fashion than eating or breathing. But it’s hard to believe that it’s not at least a little more pervasive and evident and popular now in the age of mass media and incessant communication and non-stop information – not to mention democracy. Henry VIII and Louis XIV didn’t have a lot of need to persuade the farm laborers and weavers and sturdy beggars of their world to love and admire and vote for them, so that must have cut back on the amount of word play right there. And then there’s selling, too. If there’s not much to sell – ‘You can have the brown cloth or the other brown cloth’ – there’s not much need for advertising language, is there. No, surely it’s fair to say there’s more Humpty Dumptyism around now than there was in, say, 1479 or even 1979.

And I must say, I’m enjoying a good malicious laugh at Fox News and its version of the practice. Fair and balanced indeed – and thinking it owned the words! What next! I think we should start suing everyone who uses the words ‘butterfly’ and ‘wheel’ and ‘nonsense’. I don’t know which is funnier, Fox claiming to be fair and balanced, or Fox claiming that people would be confused about Franken’s meaning, or the mental picture of the courtroom squealing with helpless laughter as the judge questioned Fox’s lawyer, or Fox calling Franken ‘shrill and unstable’. Okay, I’ve decided: that last bit is the funniest. Fox calling other people shrill! No wonder the courtroom was falling about!

It’s good to have a source of laughter and derision from the right now and then, when the left seems to spend so much time making a fool of itself. The item about Brown University’s ‘Third World Training Program’ I posted yesterday is enough to make one want to join some third direction that hasn’t been named yet. Not mainstream, thank you very much, not moderate, no, it’s not radicalism I object to, it’s bloody silliness.



Redefine

Aug 29th, 2003 7:36 pm | By

It’s interesting how willing people often are to redefine religion in order to defend it, and how thoroughly they’re willing to redefine it for that purpose. In fact they do such a thorough job of it that one would have thought there was nothing left that needed defending. Who would bother to argue against feelings of awe or wonder, or an appreciation of stories and myths and poetry? I certainly wouldn’t, in fact I think those are fine things. But they’re not what I take religion to be, and I don’t think they’re what people generally mean when they talk about religion, either. If that’s what religion means, then what do we call what I mean by religion, to wit: belief in the existence of a supernatural being who created the universe, and perhaps personal immortality for humans?

Richard Dawkins discussed this issue in his usual incisive way a few years ago in an article that is also included in his most recent book, A Devil’s Chaplain. I urge you to read the article, it makes my point for me. I feel like quoting the whole thing but will restrain myself.

If you count Einstein and Hawking as religious, if you allow the cosmic awe of Goodenough, Davies, Sagan, and me as true religion, then religion and science have indeed merged, especially when you factor in such atheistic priests as Don Cupitt and many university chaplains. But if the term religion is allowed such a flabbily elastic definition, what word is left for conventional religion, religion as the ordinary person in the pew or on the prayer mat understands it today–indeed, as any intellectual would have understood it in previous centuries, when intellectuals were religious like everybody else?

Just so. Very well, if I’m quite wrong about what the word ‘religion’ means, and it’s really just a word for some attitudes and emotions rather than a set of supernatural truth claims, fine. That’s not what I’m talking about then in the ‘Science and Religion’ In Focus. I’m talking about something else – you know – that familiar stuff about God and Jesus and Allah, prayers and the soul and heaven, resurrection and immortality and sin and atonement. I don’t know what the right word for that is if it’s not religion, and I’m not at all convinced that people who claim that’s not what the word ‘religion’ refers to are correct, but at any rate that is the subject I’m talking about.

If God is a synonym for the deepest principles of physics, what word is left for a hypothetical being who answers prayers, intervenes to save cancer patients or helps evolution over difficult jumps, forgives sins or dies for them? If we are allowed to relabel scientific awe as a religious impulse, the case goes through on the nod. You have redefined science as religion, so it’s hardly surprising if they turn out to ‘converge.’

Just so, again. It’s sheer Humpty Dumptyism, is what it is. ‘Religion is whatever I say it is for the purposes of this discussion so that I can claim that atheists and secularists are silly and shallow, dogmatic and ignorant, stubborn and perverse.’ Only in Looking-glass Land where words don’t mean what they mean.



Most People

Aug 28th, 2003 9:15 pm | By

And so back to this nagging question of majority opinion and how coercive it can be. One issue is what one might call mission creep – the way we extend democracy and majoritarianism from the political, electoral realm to other areas where it is arguably less useful, where it is in fact arguably harmful, such as opinion, education, culture. This creep or extension may or may not be a good idea, but the question whether it is or not doesn’t get enough discussion, because people don’t really notice when the extension is happening. The border between politics and everything else gets ignored: everything is political, and majority opinion is right and should be heeded in all areas of life, not just in who gets elected president or councillor. The erasure or at least re-positioning of that border is in many ways a good thing, because it is for instance a political question who does all the housework and why. But in many other ways that border-shift is a very bad thing, and in fact B and W was founded in order to point out some of the ways politics obscures issues instead of clarifying them. So in that sense it’s all one phenomenon we’re talking about here: using one kind of thought when another kind is what’s needed.

So with using the majority as a cattle prod to keep people in line. ‘Most people don’t think what you think, therefore what you think must be wrong.’ Well, no, not necessarily. It’s not completely unknown in history for most people to think things that are in fact not true. It’s not crystal clear that majority opinion is even able to choose the best person to vote for in any given election, so why would it be unerring on any other subject? ‘Two million people bought that book, therefore it must be good.’ Well, no, not necessarily. Maybe the book had a lot of publicity. Maybe it depends how we define ‘good’. Maybe the particular two million people who bought that book wouldn’t know a good book from a plate of chopped liver. ‘Most people around here believe in God.’ Well, maybe, but then they’ve been listening to people like you say that most people believe in God all their lives, haven’t they, so maybe that constant repetition has influenced what they believe, and furthermore maybe they feel embarrassed or ashamed to say they don’t believe what most people believe, so that it all becomes a self-fulfilling prophecy. Could that have anything to do with it?



How Does That Look?

Aug 28th, 2003 12:02 am | By

On a less frivolous note. There is this little matter of the Bush administration’s repeated, insistent attacks on the International Criminal Court, which I find massively depressing and disgusting. The Clinton administration wasn’t a great deal better, but I’m not sure they would have acted quite so aggressively as the Bush team, for instance actually bullying countries that don’t exempt the US from the Court’s jurisdiction. And I’m not sure they would have threatened to veto a UN resolution to protect humanitarian workers simply because it had the unmitigated temerity to mention the Court.

Yes, I know the rationale: they’re afraid such a court would bring ‘frivolous’ prosecutions against US soldiers. Yes, but what if US soldiers do commit war crimes? What if there is another My Lai, for example? And then there’s the way the whole idea of international law is undermined if the most powerful country on the planet refuses to be subject to it. If we won’t, why should anyone else? And how do we think it makes us look to all those others? Like people who are planning to commit war crimes and want immunity in advance, is what it makes us look like. Appearances do matter.