Old Red

Feb 20th, 2005 8:32 pm

The promised more on Janet Browne’s Darwin biography. A couple of sentences down on the same page (page 141, to be exact):

And when Sedgwick arrived he tried to entertain him in an appropriately geological fashion by telling him of the gravel pits near Shrewsbury. But Darwin’s story of the labourer who found a tropical shell in the gravel brought only a peal of laughter and the remark that this could not be true. If the shell were genuinely embedded there, said Sedgwick, it would overthrow everything that was known about the superficial deposits of the Midland counties…Recounting the story later, Darwin remembered being astonished that Sedgwick was not more delighted by his strange fact. ‘Nothing before had ever made me thoroughly realise, though I had read various scientific books, that science consists in grouping facts so that general laws or conclusions may be drawn from them.’ What Sedgwick went on to explain to him was that there must be a great deal of mutually supportive material for scientific theories of all denominations. Once such theories were established, it took more than an isolated shell to change them.

A simple point, but interesting, I think. Interesting partly that he hadn’t realized it before, and that one incident made it so clear to him. ‘And from the way Darwin continued to hold this salutary episode in mind,’ Browne goes on to say, ‘it evidently had a marked effect on his scientific practice.’ One shell, one story, one peal of laughter. So learning takes place.

And 142-143. They are on the field trip. According to Greenough’s map, there should be Old Red Sandstone underlying an escarpment where Sedgwick saw no sign of it. He sent Darwin to search for signs on one side while he searched on the other.

On his own for the first time since leaving Shrewsbury, Darwin could not find any trace of the desired rock. He was more than a little anxious by the time he returned to Sedgwick, because it was easy to miss details in the field and hard to contradict an acknowledged authority like Greenough. He had scoured the countryside for elusive corroborative signs. Yet Sedgwick was very pleased with him…explaining how his researches would require the revision of a major portion of the national map. Sedgwick too had not seen ‘a particle’ of Old Red…[F]ew professors would have accepted a major negative claim like Darwin’s without backtracking to check on the data.

I like that because it’s the black swan thing, and because it made Darwin anxious. The black swan makes for instance UN weapons inspectors anxious, too, because however hard they look they can’t know, and they know they can’t know, that they have searched everywhere. In fact in that case they know perfectly well they haven’t.

His chagrin at Sedgwick’s brusque response to the tropical shell in the gravel pit was transformed into a fleeting but thoroughly practical awareness of the philosophical structure of science. He went on his way to Barmouth with his wits sharpened and with a good deal more intellectual purpose…

Interesting, don’t you think?



Practice

Feb 19th, 2005 10:07 pm

Time to say a few words in praise of lateral reading. I’m a great fan of lateral reading – not just via links but also in books. You know how that goes – you read an essay which sends you to a book which sends you to two more books and you find connections you didn’t know about. This is why (at last it can be revealed) I know absolutely nothing about anything in any depth: because I read laterally rather than vertically. I’ve never read an entire book from beginning to end in my life, but I’ve read two pages of a million or so. But never mind – I comfort myself with Johnson’s retort when Elphinston said ‘What, have you not read it through?’ – to wit: ‘No, Sir, do you read books through?’ Also with Bacon’s ‘Some books are to be tasted, others to be swallowed, and some few to be chewed and digested.’ Mind you, I skip the chewed and digested part, but I’m a great taster.

So. I read an essay of Philip Kitcher’s the other day, ‘A Plea for Science Studies.’ It said some interesting things about Martin Rudwick’s The Great Devonian Controversy and it also mentioned and quoted from a review of same by Stephen Jay Gould in An Urchin in the Storm – a book I happen to have an old copy of, with a hedgehog (is an urchin a hedgehog?) in front of a tornado on the cover. So I read the review, and that confirmed the impression Kitcher had already given me that I really ought to read this Rudwick book without delay. It was this comment that did it:

After a superficial first glance, most readers of good will and broad knowledge might dismiss The Great Devonian Controversy as being too much about too little. They would be making one of the biggest mistakes of their intellectual lives.

Well that’s the sort of admonition I can never ignore, so I got The Great Devonian Controversy out of the library along with Philip Kitcher’s Science, Truth and Democracy (and a few other items, but that needn’t concern us here – just a few more books to read two pages of). Read parts of the Kitcher book – Chapter One, Chapter Eight, Chapter Eleven, more or less simultaneously as opposed to sequentially. Laterally, you see. The book has three bookmarks poking out of it now. Chapter Eleven caused me to read another Gould essay, this one in Ever Since Darwin, which I have in a nice old Pelican with a whimsical moose on the cover – ‘Biological Potentiality v Biological Determinism.’ It’s interesting stuff – and The Great Devonian Controversy is, just as everyone said, highly interesting. It’s about a disagreement over geology in the 1830s…so of course as one reads one keeps thinking ‘Darwin. He must have known about all this…’ So (being lateral) I put down The Great Devonian Controversy and picked up the first volume of Janet Browne’s brilliant biography of Darwin. Read it? Not that I have, but I’ve read quite a lot of it, at various times. I remembered from previous incomplete readings that Darwin had been interested in Lyell, and I’d been intrigued by something Rudwick says about Lyell’s having created the myth of a split between catastrophists and uniformitarians. I knew I’d read about that in Browne’s book so wanted to refresh my memory – so found Lyell in the Index and started with him, but then kept getting pushed back earlier and earlier to find the beginning, with Henslow and Sedgwick.

So there’s a very long preamble – to lead up to the fact that I wanted to mention one or two items from the Browne biography, simply because I think they’re interesting, and I wanted to explain how I got there. Now you know. It’s spring 1831, Darwin is a student at Cambridge, his friend Henslow, the professor of botany at Cambridge, has got his friend Adam Sedgwick, the professor of geology at Cambridge, to take Darwin along on a geology field trip.

Darwin was hardly complacent either. He secretly practised his geology in the fields around home before Sedgwick got there, hoping to impress him before they took to the hills together, and was chastened to find it a great deal harder than he expected.

I find that interesting in various ways. The fact that he was chastened, the fact that he hadn’t expected it to be so hard, the fact that it was (and is) hard. And it resonates interestingly with something Rudwick says that snagged my attention. (This is where the lateral reading comes in. The comment was fresh in my mind because I’d just read it, whereas it wouldn’t have been if I’d read the whole book before picking up the Browne. That, I think, is why I like reading laterally. It seems to make it easier to see such connections.) What Rudwick says (on page 10) is in a section titled ‘Research as Skilled Craftsmanship’ (see? it connects already – young Darwin was trying to improve and practice his craftsmanship, and finding it not easy). He talks about Michael Polanyi’s emphasis on the communal framework of tacit knowledge –

like the skills of the craftsman, they are learned not from textbooks but by working alongside a more experienced practitioner within a living communal tradition. This picture of scientific work as skilled craftsmanship…jarred…against the fiercely held convictions of many philosophers…Even now, its validity would be more widely appreciated if those who analyze scientific work were not generally such narrow bookish people, and if they had firsthand experience not only of scientific research itself but also of skilled manual crafts outside the intellectual or academic sphere altogether.

There. I like the way those two things resonate with each other – and with further things, like the low status scientific subjects had in British education for years and years. There’s a scene in ‘Breaking the Code’ in which the adolescent Turing says that one of the masters at his school still refers to science as ‘stinks’. Science had low status, I read somewhere once, don’t ask me where, precisely because it was manual work, because it did involve getting the hands dirty. It was all too much like just plain labour. (Although geology was also, confusingly and complicatingly, like Good Clean Sport, and just the thing for gentlemen; Rudwick’s book is all about the gentlemen aspect. But there are complications. Some gentlemen geologists got mistaken for laborers at times, breaking rocks with their hammers. Then there’s the fact that Robert Darwin was a doctor – which was not a high-status profession at the time. Keats and his pills, Dr Johnny, you know. But that’s by the way.)

Another thing it resonates with is this lovely post at Pharyngula yesterday.

I don’t know exactly what the answer is, but the root of it has to lie in teaching kids to enjoy figuring things out. One geeky personal example: I got introduced to model rocketry when I was in fifth grade, and I was a member of the model rocket club at my school up through junior high. I think, though, that I built precisely two rockets and launched them just once. The first time I’d watched these things, the instructor had handed me some gadget that I looked through and measured the angle to the rocket at the top of its flight, and showed me how to calculate how far it went. That was it for me. Who cared about balsa wood and cardboard when there was geometry and trigonometry to do? I thought Calvin’s problem was the fun part!

There’s more. There’s the shell in the quarry and how Sedgwick laughed, and what an impression his laughter made on young Darwin. But this is long enough for now, and I have to rush away.



Circling Skeptics

Feb 17th, 2005 6:17 pm

The second meeting of the Skeptics’ Circle has taken place at Orac’s site. The first, at St. Nate’s, was two weeks ago. And the archive site with schedule of future meetings is here. As St. Nate said –

But we are not content to rest on our laurels! I want this Circle to endure and to keep getting better and more popular. I want to expand our membership! The blogosphere still remains a cesspool of the paranormal, pseudoscience, and quackery! We’ve had one success–a good start–but we must not let up now!

Go, Skeptics! (Also skeptics – you go too.)



The New Big Twenty

Feb 17th, 2005 4:09 am

So, they’re making up a new Ten or rather Twenty Commandments, eh. Without the participation of the Archbishop of Canterbury. Well you can’t blame him, can you. More than his job’s worth, probably, trying that on. Would be kind of like Charles suddenly up and throwing out all the ermine and gold carriages and sceptres and whatnot and drawing up a new plan of action. All these huge houses bursting with Rembrandts and Rollses and gewgaws to be turned into community centres. All Royals to get jobs as maintenance workers in housing estates or driving buses. Parliament henceforth to be opened by Sandra ‘Doc’ Tudge of 47 Ribena Lane, Kidderminster. Factories, hospitals, bridges and suchlike to be opened by the cast of The Archers on a rotating schedule. Well it wouldn’t fly, would it. The Queen would just tell him ‘I don’t think so’ and send him on a peace mission to that little island about seventy kilometres from that other little island that you can just barely see with a powerful telescope from that uninhabited island off Tierra del Fuego. Same with the Archbish. His boss wouldn’t be particularly pleased and flattered to have him re-writing the rules, would he. Kind of implies he didn’t do a good job the first time. Which of course he didn’t – Alabama judges to the contrary notwithstanding – but he doesn’t expect his own servants to tell him that, does he.

Whatever. That’s his problem. Nothing to do with us.

Thou shalt not own or drive or buy or covet or admire an SUV.

Neither shalt thou talk on thy cell phone [mobile] whilst driving thy small automobile.

Thou shalt not put pineapple on pizza.

Thou shalt not talk loudly, caper, squeal, grimace or argue whilst walking about in public.

Thou shalt not wear thy hair in the manner of Donald Trump.

Thou shalt not wear purple and yellow together, nay, not even if thou art a ‘Husky fan.’

Thou shalt not wear lycra spandex undergarments outside thine own house unless they are augmented with a seemly outer garment. Thou shalt not make a display of thy buttocks, whether on a bicycle, or running, or standing in a supermarket checkout lane.

Thou shalt not expectorate on the public right of way.

Thou shalt not make unseemly gestures with thy hands whilst at the wheel of thy small unobtrusive automobile.

Thou shalt not call women female dogs, nay, not even if thou art flushed with rage, or beside thy wits, or singing a rhythmic tuneless song, or pretending to be a ‘homey’.

Thou shalt not turn up the volume and bass on thy small car’s sound system such that it causes passers-by to totter and bump into walls.

Thou shalt not serve sushi to guests who are not expecting it.

Neither shalt thou serve calamari, nor oysters, nor peanut butter and grape jelly on Wonder bread.

Thou shalt not go on a low-carb diet.

Thou shalt not talk about carbs and carb-counting.

Neither shalt thou serve thy guests low-carb meals that leave them hungrier after eating than they were before. Thou shalt provide pasta or rice or bread (not of the Wonder clan) or potatoes as I have laid it down for thee.

Thou shalt not tell stories about thy children, neither about thy dog. Thou shalt talk about interesting subjects, or be silent.

Thou shalt not floss thy teeth in the living room whilst guests are present, nor yet when they are absent. Thou shalt never floss thy teeth in the living room.

Thou shalt not vote for any present or former motion picture thespian for any political office whatsoever, nay, not even if it be county assessor in a rural county in South Dakota.

Thou shalt not take it upon thyself to invent new deities. Thou hast more than enough to deal with already.



Squeaky Wheels

Feb 15th, 2005 11:57 pm

This is good. Now if lots of people start saying the same thing, maybe one of these days it will begin to sink in.

In case it isn’t already obvious, competition has broken out between the religious elements of our society for the label of ‘Most Sensitive’. Every time someone gets offended, it has become standard policy to complain that followers of other faiths are treated with more respect…[B]roadcasters, production companies and even theatre houses can fall into a trap of trying to keep the ‘representatives’ happy. In an environment where they’re evidently competing with each other, this is a dangerous policy because there is no way back. With Behzti for example, it gave the impression to those being consulted that they had editorial control over the final product. For news organisations it can mean bias in reporting. For young British Asians who want to tell their own stories through theatre, it can mean facing an environment where censorship is imposed on them by their own community…The worry is that in the desire to be politically correct, British institutions end up listening only to highly vocal and organised religious groups. There is a tendency to assume they represent everyone in their respective communities.

Yup, there is. In fact – that bit about ‘young British Asians who want to tell their own stories through theatre, it can mean facing an environment where censorship is imposed on them by their own community’ – that reminds me of something – gosh, what is it – it’s hovering right there – oh yes! I remember now. Just change the word ‘theatre’ to ‘literature’ or ‘fiction’ and you have the situation Salman Rushdie found himself in. And still does, since the fatwa was touchingly renewed the other day. Gives communitarianism a whole new meaning, that kind of thing.

Harry has a post on the subject at his Place.



Why Are You so Silent?

Feb 14th, 2005 10:35 pm

Hmm. There’s an odd statement in here – in the AAUP’s statement on the Ward Churchill fuss. Well, that’s not surprising, I guess. Pretty much whenever people start talking about freedom of speech and academic freedom, odd statements get made. It seems to be a subject that inspires odd statements – no doubt because there are so many competing goods at issue, and because people don’t always notice the competitive aspect, so they’ll cheerfully make contradictory statements from one sentence to the next.

Needless to say, the AAUP thinks Churchill should not be fired for writing the ‘little Eichmanns’ article, no matter how livid the right-wing pundits get. Needless to say, I agree with them, however much I may mock Churchill’s Billy Jack routine. But there are some oddities, all the same.

One of them is utterly routine and predictable, but it’s one that always makes me wonder a good deal.

Freedom of faculty members to express views, however unpopular or distasteful, is an essential condition of an institution of higher learning that is truly free. We deplore threats of violence heaped upon Professor Churchill, and we reject the notion that some viewpoints are so offensive or disturbing that the academic community should not allow them to be heard and debated.

The thing that always bothers me about statements like that is that they leave out a real problem – thus making the free speech position seem a lot easier than it really is. Because there are views and viewpoints that are not just unpopular or distasteful or offensive or disturbing – they are dangerous or harmful. That’s where a lot of the disagreement takes place, obviously. That’s the issue that’s central to the disagreement over the incitement to religious hatred law in the UK – whether such a law can, in principle and in fact, distinguish between speech that is unpopular or distasteful or offensive or disturbing, and speech that is dangerous – or (more complicated still) potentially dangerous. And surely the idea of danger is behind laws against incitement to racial hatred. The point is not that such speech is offensive, it’s that it has the potential to get people killed. And yet – free speech statements so seldom talk about the subject in those terms. That seems to me to be an evasive way of proceeding. I think I think the religious hatred law is a bad idea, but I also think that it’s quite true that it is possible to incite hatred and violence by means of speech about religion. Competing goods, you see. I think there are competing goods here (as there usually are, after all), as opposed to all good versus all bad. Statements endorsing free speech that pretend the worst speech can do is offend or disturb people are stacking the deck. (Which is not, just in case it’s not clear, to say that I think Churchill’s article is dangerous; I don’t; the point is a general one. I’m not making a ‘don’t you know there’s a war on?’ argument against Churchill.)

In fact the tension is visible right inside the statement. ‘We deplore threats of violence heaped upon Professor Churchill.’ Yup. But threats of violence are speech too. But they go beyond unpopular or distasteful or offensive or disturbing. I think that should have been mentioned somewhere in that statement, if only as a parenthetical stipulation. ‘Freedom of faculty members to express views, however unpopular or distasteful (provided they fall short of threats or incitement),’ perhaps. There’s a large snake-swallowing-tail element in all this, because people often use their freedom of speech to make threats against other people in order to shut them up. As we saw in Birmingham a few weeks ago. Well that’s how free speech is, isn’t it – there’s a huge de facto element. The powerful have more free speech than the powerless; those who own newspapers and radio stations have more free speech than those who don’t; the rich who can buy advertising and bribe politicians have more than the poor who can’t; and so on. ‘Sure, honey, you have a constitutional right to say whatever you like, and if you say it I’m going to punch you in the face. Go ahead.’



High Tension

Feb 11th, 2005 8:50 pm

A couple of further thoughts on the Taboo question. There is a lot of tension in all this – because there are some rational, non-ostrich-like, non-fingers-in-ears, non-You Can’t Say That reasons for worry about, for instance, saying that a particular identifiable set of people may have, in however small a statistical sense, less of a given ability than another set or sets. One such reason is the self-fulfilling prophecy. The worry is that if you tell people – especially and all the more so if you tell them officially academically scientifically studies have shownically – that they are, or they belong to a group or subset of the population that is, statistically, however slightly and tail end effectly, innately less good at X, there is very often a strong tendency for the people in question to give up on X as a result. To relax their efforts, to decide it’s hopeless, to give themselves permission not to bang their heads against a wall.

A book on US education, The Learning Gap, by Harold Stevenson and James Stigler, discusses one aspect of this problem in chapter 5, Effort and Ability. They argue that Americans put more emphasis on innate ability while Chinese and Japanese people put more on persistent effort. ‘In sum, the relative importance people assign to factors beyond their control, like ability, compared to factors that they can control, like effort, can strongly influence the way they approach learning. Ability models subvert learning…’ I have a friend who teaches high school math, and she is apt to go off like a bomb when anyone says maybe girls and women find math more difficult than boys and men. She spends much of her working life trying to counter that idea in her girl students: she says they believe it, and the result is that they don’t try. I find that highly plausible, since that was my own attitude to math when I was in school – I decided very early that I hated it and was no good at it, so I never tried hard enough.

So you can see where such ideas can be disastrous. Group X is good at A, B, and C. I belong to group X: I’m good at A-C, less good at D-W. What follows is not only ‘I’ll do better at A-C, I might fail at D-W, A-C will be easier,’ and the like. There is also the even more insidious thought that ‘I won’t be an authentic X if I try to do D-W. Xs don’t do D-W, it’s not their scene, it’s a Y thing, a Z thing, not an X thing. I’m proud to be an X, I don’t want to imitate Ys or Zs – even or especially if Ys and Zs are above Xs in the social hierarchy. That’s all the more reason to be a loyal X, an authentic X. Ys and Zs are successful, rich, important, powerful, sure, but-therefore, they are wicked, heartless, selfish, materialistic, phony, money-mad, alienated, too clever by half. I will never desert my people – I will do X things.’

So…one can see why people would want everyone to just shut up about the possibility of a statistical tail end effect in women’s math ability, even if it is or may be true. But at the same time one can also see that wanting everyone to shut up about something is generally incompatible with scholarship and inquiry. So there’s a tension. It makes my head hurt. Kind of the way algebra used to.



Taboo U

Feb 9th, 2005 8:04 pm

There is a lot of discussion of the Taboo mentality going on right now – which is good, in the sense that the dangers of the taboo mentality are being pointed out, but it’s bad, in the sense that there is also a lot of Taboo mentality around right now. Is it worth it to have some people thinking badly to give an occasion for other people to explain what’s wrong with bad thinking? Wouldn’t it be better and simpler just to have everyone thinking clearly to begin with? Yes, probably, but since that’s not going to happen, it’s a good thing there are people around to do some nudging.

Salman Rushdie at Open Democracy, for example.

At Cambridge University I was taught a laudable method of argument: you never personalise, but you have absolutely no respect for people’s opinions. You are never rude to the person, but you can be savagely rude about what the person thinks. That seems to me a crucial distinction: people must be protected from discrimination by virtue of their race, but you cannot ring-fence their ideas. The moment you say that any idea system is sacred, whether it’s a religious belief system or a secular ideology, the moment you declare a set of ideas to be immune from criticism, satire, derision, or contempt, freedom of thought becomes impossible.

There’s a lot of disagreement over that thought, but it seems right to me. Declaring a set of ideas immune from criticism or satire does seem like the one thing you don’t want to do with a set of ideas. You could say that that’s what the word ‘God’ is for – a kind of imaginary rubber stamp or strongbox or chastity belt serving to render a particular set of ideas undiscussable, unchangeable, non-negotiable. Given what human ideas can be and what they can do, that seems like a very risky approach.

Steven Pinker in The New Republic is talking about the same general idea, though in a different instantiation.

To what degree these and other differences originate in biology must be determined by research, not fatwa. History tells us that how much we want to believe a proposition is not a reliable guide as to whether it is true…

And not only history. An easy thought-experiment can show us the same thing. Let’s see…I want it to be true that there is a steaming-hot pizza with feta, pesto and artichokes on the table. But there isn’t, and it isn’t. I guess my wanting isn’t all that powerful, then.

What are we to make of the breakdown of standards of intellectual discourse in this affair–the statistical innumeracy, the confusion of fairness with sameness, the refusal to glance at the scientific literature? It is not a disease of tenured radicals; comparable lapses can be found among the political right (just look at its treatment of evolution). Instead, we may be seeing the operation of a fascinating bit of human psychology. The psychologist Philip Tetlock has argued that the mentality of taboo–the belief that certain ideas are so dangerous that it is sinful even to think them–is not a quirk of Polynesian culture or religious superstition but is ingrained into our moral sense.

As a matter of fact, it was reading Pinker on Tetlock and others on taboo that inspired my colleague to create the ‘Taboo’ game. Of course, some ideas are ‘so dangerous’ – the ideas that swirl around ethnic cleansing, genocide, purity, eugenics, generally cleaning up humanity by thinning it out radically, are an obvious example. But as Pinker puts it –

Unfortunately, the psychology of taboo is incompatible with the ideal of scholarship, which is that any idea is worth thinking about, if only to determine whether it is wrong…The tragedy is that this mentality of taboo needlessly puts a laudable cause on a collision course with the findings of science and the spirit of free inquiry.

Yes. Books published by Taboo University Press are not the ones that promise a searching look at which ideas are wrong for what reasons. I’ll order mine from Free Inquiry Press, thanks.



How Do I Look in This Beret?

Feb 7th, 2005 11:21 pm

Norman Levitt has some very pointed things to say about Harvard.

Harvard University, the oldest in the USA and the wealthiest in the world, thinks very well of itself…It is an open secret that [Summers] was handed the helm at Harvard out of a growing sense that the place had grown stale, complacent, and narcissistic. Too many Harvard professors had settled into the habit of assuming that any old doctrine, opinion, or casual observation they chanced to utter was, ipso facto, profound and epochal merely because it issued from the great faux-Georgian citadel on the Charles. In truth, the place had grown somewhat dowdy, intellectually speaking, and, even worse, had proved itself susceptible to the vagaries of academic fashion…In some areas, Harvard had not only tolerated trendy mediocrity, but actively embraced it. Summers’ task, then, was to shake things up and to restore a relentlessly meritocratic ethic to the process of hiring and rewarding faculty where mere piety and sentimentality had previously been permitted to call the shots.

It’s funny how exactly like the New York Times that description sounds – at least to me. Thinks very well of itself; stale, complacent, and narcissistic; profound merely because it is itself; intellectually dowdy; mediocrity; piety. The Times has a dreadful habit of announcing that it’s the best newspaper in the world – which apart from anything else simply can’t be true, can it? Surely in the entire world there are better newspapers than the Times – aren’t there? If not I think I’ll have to join the French Foreign Legion.

But that’s a digression – except it’s not entirely: because the phenomenon of the complacently mediocre top of the heap is interesting, and it’s part of what Levitt is talking about. His account sounds plausible to me because I’ve seen the same sort of smugness in other institutions with excessively solid reputations. Or in people with the same things. Remember the Cornel West fuss?

Summers lost no time in taking up the challenge. Early in his regime, he notoriously confronted Black Studies eminence Cornel West, essentially accusing him of goofing off with flashy and trivial projects (like voice-overs on hip-hop CDs) rather than turning out scholarly work of real substance. The touchy West promptly picked up his marbles and headed for Princeton, where a certain soft-heartedness still reigns. Many Harvard students, bred on the platitudes of ‘diversity’ and greatly susceptible to West’s showmanship, were outraged…But though some still blame Summers for ‘losing’ West, the prevailing opinion – most often stated anonymously, of course – is that Summers did the university a favour by cleverly easing out a dubious academic ‘superstar.’

Showmanship. Just so. That’s a serious occupational hazard for academics, you know. It comes of spending much of one’s waking time telling callow ignorant young people what’s what. (We have a joke about it in the Dictionary. ‘Socratic deformation or elenchusitis.’) Men are especially prone to it. Yes they are; don’t argue. Come on, you know they are. It’s the sex thing. They know their students are going to get crushes on them – how can they help strutting a little? Whereas women mostly know their students are not going to get crushes on them, and are mostly not all that flattered if they do. (Why? Because young men are repellent, while young women are attractive. Next question.) Then add some flashy ‘radical’ politics and ‘Indian’ credentials (however bogus), and you’ve got yourself a first-class Che-wannabe. Tweak the ingredients and you’ve got Cornel West. Tweak again and you’ve got Judith Butler (I said men are especially prone, not exclusively). Particularly at a time when there are a lot of people around inexplicably willing to call some academics ‘superstars’ – the temptation is clearly almost overpowering. But how nice it would be if the dears would resist. They kind of discredit the whole enterprise when they preen themselves in public. They feed into the suspicion of people like Fox News anchors that universities are nothing but theatrical settings for people who like to dress up as wevowutionawies and frighten the bourgeoisie.

I was actually going to talk about the substance of Levitt’s article but I got sidetracked by the style question. But that’s just it: in a lot of cases I think the style is the substance. Looking at and reading Ward Churchill, I find myself convinced that he doesn’t really mean any of it, that he just says the most ‘radical’ thing he can manage to think of, for the sake of saying it. To show off, basically. I knew people like that when I was at university – boy, did I. They were so much more into posing than they were into really thinking about what they were talking about. Vanity, vanity, all is vanity, saith the preacher. Well he was a showoff too.



Churchillian

Feb 6th, 2005 10:05 pm | By

This Ward Churchill guy is quite funny. I shouldn’t say that, I suppose, but he is. He’s so…obvious. The hair, the shades, the jaw, the flocks of doting students. You can tell he thinks he’s Nick Nolte crossed with Russell Means with just a dash of Springsteen. Yeah dude you’re just like totally cool man.

Ward L. Churchill has been angry for years, shaking a clenched fist at American power from the streets of Denver and the lecterns of academia.

Where it’s both safe and profitable to do so, one can’t help noting.

Born near Peoria, Ill., Churchill has a master’s degree in communications and is a U.S. Army veteran.

He’s also a full professor. Usually people need a PhD to get to be full professors. The Nolte schtick seems to have paid off. But not everyone buys it.

But others see him differently, including some Native Americans angry over his claims to be one of them. At the top of his resume, Churchill lists his enrollment in the United Keetoowah Band of Cherokee Indians. Yet the chief of the Oklahoma tribe, George Wickliffe, said they “had no association with Churchill in any capacity whatsoever.” Churchill says he is three-sixteenths Cherokee. Suzan Shown Harjo — president of the Morning Star Institute, a Native American rights group in Washington, D.C. — has Census data showing Churchill as born to parents listed as white. She said he had not shown up on the rolls of the tribes he said he belonged to. “This is not a Native person. He goes around college campuses, saying he was at the occupation of Alcatraz, Wounded Knee and at the Bureau of Indian Affairs takeover in 1972. But no one can remember him being there,” she said. “I was at the BIA takeover as a reporter, and I never saw him.”

Add a dash of Kevin Costner to the mix. You can see why I find him funny. It’s the Walter Mitty stuff, the Billy Liar routine, the Zelig business. He was everywhere, man – Alcatraz, Wounded Knee – Little Big Horn, the Trail of Tears, in the audience when Brando refused the Oscar, aboard the Titanic, at Wat Tyler’s side…

David Bradley, a well-known Indian artist in Santa Fe, earned Churchill’s wrath by championing federal legislation that required those selling their work as Indian art to be able to prove their tribal ties. “In the 1980s, money was flying like confetti around here. You had dozens of people pretending they were Indian and selling their art,” Bradley said. “We had everything stolen from us for 500 years, and I wasn’t going to let them take our art as well.” Churchill, who is also a painter, took issue with the effort. “He wrote this slanderous attack about me. He tried to impugn my motives,” Bradley said. “He ought to be fired. Shame on CU [University of Colorado] for giving this con man a job.” Bradley believes Churchill opposed the law because it affected his ability to sell his paintings. Churchill attacked the 1990 Indian Arts and Crafts legislation, saying it gave rise to “witch hunts” among tribes looking for phony Indians and put undue importance on racial purity.

Oh yeah? Undue importance? Well what are you doing teaching ‘ethnic studies’ then?

The American Indian Movement, based in Minnesota, has called for his dismissal from the university, saying he “fraudulently represented himself as an Indian” to build his career.

Hey, I’m three sixteenths Cherokee, which is good enough, because racial purity is not important, so please can I be a professor of ethnic studies at this nice university, with my MA in communications and all?

To build his career? Oh, surely not!

Timothy Burke has an excellent post here on Churchill.

In that context, it becomes awfully hard to defend the comfortably ensconced position of someone like Churchill within academic discourse, and equally hard to explain an invitation to him to speak anywhere. There’s nothing in his work to suggest a thoughtful regard for evidence, an appreciation of complexity, a taste for dialogue with unlike minds, a proportionality, a meaningful working out of his own contradictions, a civil ability to engage in dialogue with his colleagues and peers in his own fields of specialization. He stands for the reduction of scholarship to nothing more than mouth-frothing polemic. We cannot hold ourselves up as places which have thoroughly and systematically created institutional structures that differentiate careful or thoughtful scholarship from polemical hackery and then at the same time, have those same structures turn around and continually confirm the legitimacy of someone like Churchill.

And Margaret Soltan has another – in fact she has a whole series. It appears that the University of Colorado has been covering itself with non-glory for some time.

UD doesn’t want to kick CU while it’s down, but all you need to do is type University of Colorado in that Search thing up there to find in her blog endless accounts of sports and alcohol and academic fuckupery on campus…The spokesperson would then announce a series of real changes that will now take place. Those changes could involve firing the entire board of regents, shutting down fraternities, shutting down the sports programs, and pressuring some of the hundreds of bars adjacent to the campus to leave. They could, more immediately, involve shutting down the ethnic studies program, which, this spokesperson will admit, is a disgracefully shoddy academic unit. “We have been asleep at the wheel,” this person will conclude; “and Ward Churchill was the crash that ensued. I assure you that we at this university are now fully awake. This proud institution, which we love, will shake itself off and find its way home again.”

Very interesting.



Happy Birthday, Ayn

Feb 3rd, 2005 7:38 pm | By

Okay am I missing something here? Am I just, like, hopelessly out of touch? Why are people taking Ayn Rand seriously? What do they mean by it?

Carlin Romano, for instance. What’s up with that? Carlin Romano’s not an adolescent or a Wall Street Journal addict or the chairman of the Fed, so why is he treating Rand like someone who is worth paying attention to?

Well he is cagey about it. He doesn’t actually say he thinks she’s any good himself – he just says other people do. She’s famous, she’s influential, she made a big noise. But a hasty reader might get the impression that he agrees with those other people. It’s actually a somewhat interesting bit of journalistic sharp practice, I think. ‘Write us a thousand words on Ayn Rand for the centenary.’ ‘Oh hell – do I have to?’ ‘Come on – she’s important. Just hold your nose and do it. You don’t have to say she’s a second Hume or anything.’ ‘All right, all right – I’ll tell them how damn influential she was. I can certainly say that without lying! Unfortunately.’ And that’s all he does say. It’s quite amusing in a way. Like any bit of fan-dom. Hey, she’s famous, she’s really famous, she’s very very famous. The end.

No one, however, now doubts that she pulled off a major, enduring American career as both novelist and thinker, and that her influence and popularity have persisted among readers since her death in 1982…A 1991 survey by the Library of Congress found Atlas Shrugged to be the American novel most influential on readers’ lives. Her books have sold more than 30 million copies around the world and sell hundreds of thousands every year in the United States.

Err…yeah. Notice how consistent all that (and the rest of the article) is with thinking her books are piles of crap. Notice how resolutely Romano never says what he actually thinks. Quite funny in a way.

Scott McLemee is much less coy.

My own outlook, of course, being more of the “first, let’s tax the rich to death” variety. We’re all doomed — doomed, I tells ya — until there is a rigorous program of confiscation of incomes above (let’s say for starters) a million dollars. If that is a political fantasy, certainly it is no more so than Rand’s utopian capitalism. The more I think about it, the more her worldview resembles a Soviet era socialist-realist novel with the word “communism” scratched out and “capitalism” written in. The joke has it that they were “boy meets tractor” romances. In her case, it’s more like “masochistic girl meets skyscraper.”

That’s more like it.



Autonomy Revisited

Feb 1st, 2005 11:50 pm | By

In a N&C (Circumstances) a few days ago I asked a lot of questions about the relation (if any) between ethical commitments and autonomy. About whether it’s possible to have ethical commitments (as opposed to rules) at all without autonomy. I don’t know the answer. But I am skeptical about the possibility, and I think that problem (if it is one) gets overlooked too easily, when people think about religion as a source of ethical commitments and ideas.

I happened on some relevant remarks this morning, so thought I would add them to the mix. They’re by Susan Moller Okin in Is Multiculturalism Bad for Women? pp. 129-130.

Even the most prominent ‘political liberal’ of all, John Rawls, who rejects the imposition on religious sects that ‘oppose the modern world’ of the requirement that their children be educated so as to value autonomy and individuality, also argues that the liberal state should require that all children be educated so as to be self-supporting and be informed of their rights as citizens, including freedom of conscience…Nussbaum, who also endorses political liberalism, says that, while it respects nonautonomous lives, it ‘insists that every citizen have a wide range of liberties and opportunities; so it agrees…that a nonautonomous life should not be thrust upon someone by the luck of birth…Many parents belonging to religions or cultures that do not respect autonomy would (and do) very strongly resist their children’s being exposed to any religious or cultural views but their own. But, like Nussbaum (and to a lesser degree Rawls) I do not think that liberal states should allow this to happen. I believe that a certain amount of nonautonomy should be available as an option to a mature adult with extensive knowledge of other options, but not thrust on a person by his or her parents or group, through indoctrination – including sexist socialization – and lack of exposure to alternatives.

There. Exactly what I was thinking. ‘Lack of exposure to alternatives.’ The argument that religion should be treated with special consideration because ethical commitments are an especially valued and valuable part of individual identity seems dubious to me not only because religion is not the only source of such commitments, but also because for many people ethical commitments can be a kind of closed loop. They can be a closed loop and still be a valued part of identity…but does that matter? Or at least does it matter more than other considerations that come to mind? Like considerations about the merit of the ethical commitments in question (maybe the children of Mafiosi, of white supremacists, of warlords, grow up with strong ethical commitments to extortion and murder and genocide), and about whether the people who hold them have ever actually thought about them, and whether this question relates to the previous one. In other words, does it make any difference whether an ethical commitment is something you’re just born to, or whether it’s something you’ve consciously made? Surely it does. If we never think about ethical commitments, how do we separate the decent ones from the disastrous ones? Ethical commitments are not just adornments for our identities, they’re motivations to treat people well or badly; so they’re not just personal private concerns, they’re also public ones.



Ether

Feb 1st, 2005 8:36 pm | By

Speaking of radio (there’s a deft transition for you), I keep meaning to recommend this In Our Time from last month. It’s on the Mind-Body Problem, and the contestants are – no, that’s not right – the people doing the talking are Sue James, Anthony Grayling and Julian Baggini. It gets very amusing toward the end when Julian and Anthony Grayling get in a punch-up. No, I’m only joking. But Grayling says a rude word to Julian in Latin, and Julian laughs – rough stuff for philosophy! No not really, philosophy is actually very aggressive; it’s more aggressive than squash. No, not really, nothing is more aggressive than squash. They were talking about how aggressive the squash game in Ian McEwan’s new novel is, on Saturday Review the other day – and Tom Sutcliffe was also talking about how boring it is (the squash game, not the novel, which they all liked a lot) (apart from Tom Sutcliffe and the squash game – he didn’t like that part). Anyway this Latin punch-up is all the more amusing because the rude word Grayling calls Julian just happens to be the title of one of Julian’s Bad Moves. Is that a staggering coincidence or what! Well maybe it’s not a coincidence. Maybe Grayling carefully studied the B&W page that has the most recent Bad Moves listed by title, and memorized a few so that he could inject one into the discussion. Anyway, you should give it a listen, it’s very interesting as well as amusing.

Speaking of Julian (another deft transition for you), I tripped over this article of his in the Guardian a couple of days ago. What I was doing was, I was looking (via Google) for that review of the Dictionary I’d been told about, just in case it had smuggled itself online somewhere after all – and I found this article. It’s about internet interaction, a subject which has interested me for a while – ever since I started internet interacting, I suppose. Before that it bored me senseless. The three of us talked about it over dinner that time – you remember: last October, when I was over there. I told you all about it at the time.

This is the part that interests me:

Leading philosophers who have written on the web, such as Hubert Dreyfus and Gordon Graham, have argued that this is largely because face-to-face we interact with the whole person, whereas in virtual environments we only have access to a small part of what they choose to reveal. Therefore a purely online relationship can never be with a whole human being. It’s an intuitively plausible argument, but even if the most intimate of relationships require physical contact, in our working lives, we do not and need not deal with the whole person. For example, many people behave very differently at work to how they do at home. Personal identity is extremely malleable, and people play different roles at different times of the same day. That means it is simply not necessary to know “the whole person” in order to have a good working relationship with them.

And there’s a little more to it than that, I think. It may be that some people actually reveal less of ‘the whole person’ in the real world than they do in virtual or written interactions. It may be that real world interactions bring out hostility, aggressiveness, shyness, inhibition, suspicion, awkwardness, inarticulacy and similar qualities that are not helpful to social interaction, that are not really particularly central or important to the ‘real’ nature of the person who shows them. John Carey says something to this effect in his introduction to the Everyman edition of Orwell’s essays. After a few sentences about how prickly and ill at ease friends remember Blair as being, he says:

George Orwell the writer, by contrast, is confident, relaxed, open, democratic. This is not to claim that his writing misrepresents his ‘true’ self. You could just as easily argue that the true self was masked by shyness or awkwardness in life and came out in the writing.

Just so. It’s an interesting question. Is the person that other people see more real than the one we experience from inside? In some ways, it seems reasonable to say yes. Some of those ways are related to recent studies on self-esteem that indicate people tend to over-estimate many of their abilities. It’s possible that we all think we’re kinder, pleasanter, more considerate, less rude and selfish and me-firsty, than we in fact are. (It’s also possible that a lot of us think we’re less boring than we are. I’m pretty sure of that. I have only to think of various people who have bored me into near-comatose states to realize that. They surely didn’t know how boring they were, did they? If they had surely they would have shut up, somewhere along the way.) In that sense, and no doubt many similar ones, people who can see us from the outside do know us better than we know ourselves. But in others…they don’t, at least not necessarily. There is always at least the possibility of a gap between appearance and reality, between what shows externally and what is going on internally. It’s the other minds problem. We think we know; especially in very close, intimate relationships, we think we know. And maybe we do. But, then again…



Tel Hits One Out of the Park

Feb 1st, 2005 6:58 pm | By

Update – I decided to move this one too, since the discussion is still going on. Chris M supplied this link and this one.

Oh, jeezis. I saw a reference to Terry Eagleton’s piece in the Guardian at Normblog earlier today, but didn’t read it. I saw another reference just now at Harry’s place, and this time I did read it. It was – very horrible. Way more horrible than I expected. I’m not sure why. There’s just something about the preening, lit-critty, self-admiring tone of it all, of the aesthetic approach to mass murder, that just made my gorge rise. It’s as if he’s, I don’t know, admiring his reflection in a pool of blood, or combing his hair with someone’s blown-off hand. He’s not really making a political argument, that’s what’s weird – he’s doing some sort of languid, semi-ironic literary criticism. Literary criticism of suicide bombing – just what the world needs. What can he think he’s playing at?

Like hunger strikers, suicide bombers are not necessarily in love with death. They kill themselves because they can see no other way of attaining justice; and the fact that they have to do so is part of the injustice…People like Rosa Luxemburg or Steve Biko give up what they see as precious (their lives) for an even more valuable cause. They die not because they see death as desirable in itself, but in the name of a more abundant life all round. Suicide bombers also die in the name of a better life for others; it is just that, unlike martyrs, they take others with them in the process. The martyr bets his life on a future of justice and freedom; the suicide bomber bets your life on it. But both believe that a life is only worth living if it contains something worth dying for. On this theory, what makes existence meaningful is what you are prepared to relinquish it for. This used to be known as God; in modern times it is mostly known as the nation. For Islamic radicals it is both inseparably.

How about that ‘just’ in ‘it is just that, unlike martyrs, they take others with them’? That’s quite a ‘just’! Oh is that all – well silly me then not to think of the suicide bombers as just like Steve Biko and Rosa Luxemburg. And then notice how quickly he forgets the thing about taking others – ‘But both believe that a life is only worth living if it contains something worth dying for.’ Not just dying for, Bub: killing for. Making other people die for. Imagine a fiery-eyed student popping into your office and locking the door and telling you he was about to give you the glory of dying along with him for something that makes life worth living. Would you take quite such an aesthetic view of the matter then? And he does it again – ‘what you are prepared to relinquish it for’. No! Pay attention, dammit. What you are prepared to make others relinquish it for. It’s not about you, it’s about them. Can’t you get that? Are you so caught up in this stupid dandyish word-spinning that you can’t hang on to such an obvious thought for two sentences?

See, this is what I’m saying about the ethical commitments thing, the identity thing. It’s not just about the damn ego, it’s about what you do to people.

And there are three more paragraphs of really disgusting verbal pirouetting, just as if he were droning about Henry James or Dostoevsky (oh yes, so he is), about the meaning of suicide bombing – ending up at this rich mess:

Blowing himself to pieces in a packed marketplace is likely to prove by far the most historic event of the bomber’s life. Nothing in his life, to quote Macbeth, becomes him like the leaving of it. This is both his triumph and his defeat. However miserable or impoverished, most men and women have one formidable power at their disposal: the power to die as devastatingly as possible. And not only devastatingly, but surreally. There is a smack of avant garde theatre about this horrific act. In a social order that seems progressively more depthless, transparent, rationalised and instantly communicable, the brutal slaughter of the innocent, like some Dadaist happening, warps the mind as well as the body. It is an assault on meaning as well as on the flesh – an ultimate act of defamiliarisation, which transforms the everyday into the monstrously unrecognisable.

Honest to fucking Christ. Is that cute or what? Can cultural theorists spin a metaphor or can they not. If that doesn’t make you sick, you have a stronger stomach than I do.



Comfort-Myth

Feb 1st, 2005 1:56 am | By

There are a lot of bizarre remarks in this piece in the LRB.

Within the limits he sets himself, Sharpe’s book is admirable…He takes pride in bringing to his task the skills of a professional historian, determined to ‘get history right’. He sets out to expose the stories told about Turpin since his death as factually incorrect…Sharpe is uncomfortable with myths.

Um…why should Sharpe not be ‘uncomfortable’ with myths? (That sentence is a good example of why ‘comfortable’ is one of the first words that was defined in the Fashionable Dictionary – the original one, the one on B&W. ‘Comfortable’ is such a weasel word. What’s comfort got to do with anything? It’s not about bums on seats, or even about elevated heart rate and sweaty palms. It’s about critical thinking, epistemology, rational inquiry. Thinking that myths are out of place in ‘professional’ history is not a giveaway of pathetic nerdy insecurities, it’s simply a reasonable idea of what history is supposed to be: to wit, evidence-based and logically sound.) Why should he not be ‘determined’ to ‘get history right’? What should he be determined to do, get it wrong?

What interests Sharpe about this story (which he has read in the much abbreviated fifth edition) is that it is false: what should have interested him is that Ainsworth’s readers (and the book was an enormous bestseller) thought it was true.

I beg your pardon? Why is that what should have interested him? If that’s not the book he’s writing, then why should it interest him? It may well be a highly interesting question, why Ainsworth’s readers thought the Black Bess story (that Turpin rode a horse 200 miles in 12 hours) was true, but it’s not the only possible subject. David Wootton does go on to say some interesting things on this question, but it doesn’t follow that Sharpe ought to have written them instead of what he did write.

This doesn’t occur to Sharpe. His idea of the historian as someone who gets at the facts means that he can give a fine account of the activities of Turpin and the Essex gang, but it makes him quite unfitted to be a reader of Rookwood.

Um…so? So on earth what? You read Rookwood if you want to, but that doesn’t mean everyone has to. And what is with this absurd scorn for the idea of ‘ the historian as someone who gets at the facts’? Well we know what’s with it. Alas.

Sharpe could have been provoked by his subject into reinventing the idea of what history is: instead, his conclusion, ‘Dick Turpin and the Meaning of History’, retreats to the old cliché that the business of the historian is to deal in facts…The language of fact and fiction, critical and uncritical thinking, is useful if one wants to address the question of whether Turpin was a thug. But it hardly helps one address the question of why Rookwood appealed to the imagination of its readers.

The old cliché – it’s a shame, isn’t it, the way historians will go on thinking that they ought to deal in facts rather than myths. Unless, of course, they are in fact doing histories of myth, which some historians do. But they don’t all do that, and why should they? (And if they do, one hopes they do it with some reference to facts somewhere along the way, lest they tell us about Navajo myths that actually belonged to the Chinese, and Egyptian myths that in fact originated with the people of Tierra del Fuego.) And some historians inquire into the history of literary taste; but, again, not all of them. If the question of why Rookwood appealed to the imagination of its readers was not Sharpe’s subject, it’s not clear why he should have addressed it.

Guess what David Wootton is writing about. It made me snort with laughter when I saw it. Can you guess? I’ll tell you. ‘He is writing a history of the body from Hippocrates to Foucault.’ Attaboy! No chance of any old clichés there! That’s some fresh untilled cliché-free ground, all right. I think I’ll review it when it comes out, and keep asking why he didn’t write a history of underwear from Nefertiti to Adorno instead.



Order, Design, Whatever

Feb 1st, 2005 1:55 am | By

I heard a classic example of the journalistic habit of translation that I have pointed out a few times in the past, earlier today on the BBC World Service. It was a discussion of creationism and the pressure to get it taught in US schools, between Peter Atkins and creationist Donald DeYoung. At one point DeYoung (or else the journalist) mentioned ‘design’ and Atkins said ‘There is no design in nature.’ DeYoung didn’t hear, and Atkins repeated with great distinctness and emphasis, ‘There. is. no. design. in. nature.’ DeYoung, a physicist, disagreed and talked about the weight of the proton: if it had been just a tiny amount heavier, etc (the anthropic principle, in short). The journalist cut that off, as being too far afield, and said ‘You [meaning Atkins] say there is no order in nature – ‘ ‘That’s not what he said,’ I shouted at the radio. ‘That’s not what I said,’ Atkins said without shouting (well he was nearby). ‘I said there is no design in nature.’ ‘Same thing,’ said the journalist. ‘It’s not the same thing!!’ I shouted even louder. ‘It’s not the same thing,’ said Atkins.

I mean. Come on. The guy thinks order and design are the same thing?! And all [I mean many – not all – I haven’t checked them all, have I!] journalists, apparently, think ‘no evidence that’ is the same thing as ‘proof that not,’ and so they use the two interchangeably with gay abandon. It’s an outrage.

Really. Journalists ought to be licensed, or something. And they ought to learn some basic vocabulary and concepts before they get that license. I mean that literally – well except of course that if they did, Julian would run out of Bad Moves. As it is the supply seems to be infinite.



Good Morning, Senator

Jan 31st, 2005 3:47 am | By

The ‘Academic Bill of Rights’ issue seems to be warming up. Unfortunately. Because the idea seems as full of holes as a colander. It seemed leaky when I wrote an In Focus on the subject a year and a half ago (it doesn’t seem that long, but it was), and it seems leaky now. The difficulties seem so obvious…I mentioned some –

Who would decide the law was being violated? What would the criteria be? What would constitute evidence? Would the testimony of students be sufficient? If so, what of the possibility that for instance a student who’d received a C, or one who’d been bored, or one who simply disagreed with a teacher would file charges? If student testimony would not be sufficient, would administrative staff sit in on classes? Would they go undercover? Might that lead to an underage hence underexperienced and underqualified administration?…And what about all the issues that don’t divide along a neat left-right axis? What about points of view that aren’t really particularly political at all but that the legislature doesn’t happen to agree with? So perhaps we begin to see the advantage of dealing with such issues by means of discussion, debate, argument, books and articles and websites, rather than handcuffs and subpoenas.

And Brian Leiter mentions similar ones

The problem is that it is plainly within the purview of academic freedom for instructors to determine what counts as “serious scholarly opinion” and what counts as “human knowledge.” The only way, then, that this provision can have any bite is if it authorizes others (who exactly? the Ohio legislature?) to override the instructor’s academic freedom in setting the curriculum…suppose the student’s political, ideological, or religious beliefs with respect to the subject matter of the course are false, i.e., contradicted by “serious scholarly opinion” and “human knowledge” as assessed by the instructor? Does the creationist get a free pass for her ignorance in evolutionary biology simply because she has a “religious belief” that conflicts with the scientific content of the course? Is the NeoNazi entitled not to be graded down when he writes a paper for his Anthropology or Biology class defending his “ideological belief” that “the Jewish race is biologically inferior”?

Just so. Question after question. But that hasn’t stopped the Ohio legislature, as Leiter usefully points out. Coming soon to a university near you: teaching entirely taken over by political hacks, lobbyists, bribers and bribe-takers, and religious zealots who have the ear of the political hacks. Spiffy. That will certainly improve things.



Felicide

Jan 30th, 2005 8:42 pm | By

Any of you read the TLS? An informant told me via email that ‘apparently’ there is a review of the Fashionable Dictionary in the latest one, but that it doesn’t seem to be available online. I asked a few questions, such as who wrote it, but the informant didn’t answer, so I’m thinking it was probably a joke. I love jokes. So – if any of you do read the TLS – is there a review of the DFN in there? Silly of me to be so curious, I know, but – well it’s probably an American thing.



The Clash

Jan 28th, 2005 8:18 pm | By

This article intersects with a couple of issues we’ve been talking about lately. (Well, I say ‘we’ – I’ve been talking about them. I know that. It’s just me, going jaw, jaw, jaw. I realize that. But I think of it as a discussion anyway – I think ‘we’re’ talking about them. Because…because of a lot of things. Comments, and emails I get, and that tiny little high-pitched voice that no one else hears, and – what meds? I’m fine, cut it out, get your hands off me – )

Sorry. Where was I. A couple of issues. The one about various tensions between cherished goals and ideas, and the one about special treatment of religion.

In the bitter controversy that followed, the Christian Legal Society sued Ohio State, charging that the university’s nondiscrimination policy violated the group’s First Amendment right to freedom of religion by forcing it to accept unwanted members. This past fall, without ever going to court, the group won a complete victory when Ohio State changed its policy to exempt student groups formed to promote “sincerely held religious beliefs.”…Requiring a Christian-student association to admit non-Christians or gay people, “would be like requiring a vegetarian group to admit meat eaters,” asserts Jordan Lorence, a senior lawyer at the Alliance Defense Fund, which is based in Scottsdale, Ariz. “It would be like forcing the College Democrats to accept Republicans.”…Emotionally charged conflicts like the one at Ohio State have forced colleges to choose which of two basic principles is more important: freedom of religion, guaranteed by the First Amendment, or equal protection under the law, as established by the 14th Amendment. “There are times when constitutional rights come into conflict with one another,” says Jeffrey Gamso, legal director of the American Civil Liberties Union of Ohio.

Aren’t there just. And such times force one to think hard about which rights, goals, values, ideas are more important (or valuable or basic or non-negotiable or central or various other terms indicating which is less possible to give up) and which are less so. And often such thinking gets one nowhere but at a stalemate, an ‘I don’t know.’ And I don’t know. Because groups are discriminatory, aren’t they. If you have a book group, you discriminate against people who never read. If you have a cooking group, you discriminate against people who eat out of tins and foil bags. If you have a runners’ group, you discriminate against people who prefer to amble or sit or lie flat. And so on. But you don’t really want to see the same principle extended to the Anti-Semites’ group, or the Misogynists’ Club, or the Homophobia Alliance. So you seem to want to judge groups on the merits of their principles of discrimination – and what a can of worms that would be to get into! What a lot of time and trouble that would be. As all these college administrators and litigators remark in the article.

Critics of the change are particularly concerned that the settlement exempts only religious student groups from nondiscrimination rules, which may represent an unconstitutional favoring of religious groups over nonreligious ones, says Ruth Colker, a professor of constitutional law at Ohio State. She predicts that the decision could lead to future lawsuits if nonreligious groups are denied recognition because they practice some form of discrimination.

Well, exactly. Here we are again. Why do ‘sincerely held religious beliefs’ get special consideration when other kinds of sincerely held beliefs don’t? There seem to be a lot of reasons – habit; religion is consoling; religion is taken to be central to people’s identity; religious people are willing to sue; and no doubt more. But none of them really seems like a knock-down argument or reason, does it. Which is why people aren’t always pleased when sincerely held religious beliefs get special treatment that sincerely held secular beliefs don’t.



Circumstances

Jan 27th, 2005 6:34 pm | By

I said I was going to drone some more about ethical commitments. Why? Because the subject interests me, especially now, when there is so much pressure to take religion seriously, to be sympathetic towards religion, to give religion the benefit of the doubt, to be careful not to dismiss religion ‘lightly’ or ‘contemptuously’ or quickly or any other way that doesn’t involve the aforementioned taking it seriously. I don’t say there is no merit to those suggestions and urgings, but I do think they are too much in fashion right now, and the other view is too much out of fashion. So I think it’s useful to take a look at the underpinnings of the idea. I take the thought that religion is one important source of ethical commitments to be one of those underpinnings. Hence the utility of poking away at the thought.

Ethical commitments are not just general, they are also particular. Let’s look at them in particular – as the ethical commitments of people in particular situations. Because what situation one is in makes a considerable difference to how one forms one’s ethical commitments. The phrase has very different meanings depending on whether the person who holds them is: autonomous; responsible for others; in control of, authority over, or power over others; or subordinate to, owned by, obliged to others.

The phrase also has different meanings depending on past history. Has the person who holds them ever been in a position to think about and choose them, to consider alternatives, to look around them on all sides, to decide? Has the person been issued the ethical commitments from birth? Were they issued as orders and commandments, as imperatives and mandates? Were they issued as mandates by a person or people who (as part of this ethical commitment) was/were in a position of unquestionable power, authority, ownership over the person? Are the ethical commitments the commitments of everyone the person has ever met, heard, seen, read? Are they actually commitments rather than an inherited conglomerate? Does it matter?

The meaning also depends on context, environment, geography, social and national history as well as personal history. Also on what kinds of indoctrination, socialization, education the person has had access to. Is the person able to go to a library? Can she go by herself and read freely, anything she wants to? The meaning depends on the nature of the schools, religious institutions, libraries, television, radio, newspapers, magazines, books that are available. Bookshops, coffee houses, gathering places. The police, security service, military. Vigilante groups, gangs, militias. Security cameras, and who is watching them.

So, to put it more concretely and specifically. The ethical commitments of a single man or woman in London or New York who had secular liberal parents and went to state schools and a secular university, have been arrived at in different ways from the ethical commitments of a married woman in a village in Bangladesh who never went to school and was married off at age twelve.

And not only formed differently but ontologically different. Different all the time. It seems reasonable to wonder if some people – people in certain situations, situations not of their own making or choosing, situations that are controlled and given and coercive – are in a position to have anything that can be called ethical commitments at all. Or if they can be called that, can the people who hold them be said to hold their own? Or do they hold other people’s? In which case ‘ethical commitments’ would be an oxymoron, wouldn’t it? So they don’t have them. They have rules; they obey. A ‘commitment’ can’t be something imposed on them from the outside, can it? Or can it. It seems like a contradiction in terms. Commitment is voluntary or else it’s not commitment, it’s something else. Isn’t it?

It’s obvious what I’m getting at (the female pronoun is one clue). Some people have much more freedom, power, authority, responsibility, to have ethical commitments than other people do. And, furthermore, some people are in a position to impose their own ethical commitments on other people, while other people are in a position such that they are required to accept them. Others again are somewhere between these two extremes.

Some people have more of this freedom because they are of a gender, class, status, caste, and in an environment, such that they are allowed autonomy, they are allowed to think for themselves. Other people are not. In many contexts and situations, a male parent has the ethical commitments for everyone in the household. He has the freedom to have them (except that within many religions he actually doesn’t), and he also has the power, authority, and responsibility to impose them on all the people below him in the hierarchy. He would be considered to be failing in his responsibilities if he didn’t impose them. The responsibility to impose his ethical commitments is in fact an ethical commitment itself. (As, it could be argued, is all teaching and persuasion. Yes. But it does surely make a difference how mandatory the teaching is. What backs it up. [Whipping? Incarceration? Disapproval? Regret?] Who does it, what the relations between the two parties are.)

The man has the freedom and duty. What about the others? Does the woman in such a situation (subject to a male head of household) have ethical commitments? And is there any way to know whether she does or not? Is there any way to judge whether she is simply doing what she is told, or actually agrees with every commandment she is given?

And the same questions for the children, doubly so for the girls. The boy children expect to grow up to impose ethical commitments on other people, the girls expect to grow up to have them imposed on them. That must make a large difference in the meaning of ethical commitments for each.

Which is not to claim that ethical commitments can be formed in a vacuum. Obviously they come from somewhere in all cases, they’re not created ex nihilo, there’s always influence, from parents, teachers, friends, books, tv, music, all sorts. But all the same. Some choices are freer than others. Some people are in a position to choose among alternatives while others are not; with variations in between.

So, in a sense, one can see what is meant by concern for ethical commitments, but even so, it seems relevant that not all ‘ethical commitments’ are ethical commitments. It may be worth asking if it is simply naive to take them all as on a par with one another – all as formed in the same way, by people with fully equal freedom and opportunity to form them, consider them, question them. What can an ‘ethical commitment’ be if you never actually had the chance to form it yourself but merely had it inserted into your head like a coin in a parking meter?