Follies of the Wise
I’m reading Frederick Crews’s Follies of the Wise, which is terrific; don’t miss it. I thought I would give you a bit that resonated strongly with me.
When I began distancing myself from Freudianism around 1970, it was because of a growing, and personally vexing, sense that psychoanalytic ‘knowledge’ is acquired and certified by fatally lax means. I realized at that juncture that my deepest loyalty was not to any particular doctrine but to empirical rationality itself – the ethos that characterizes not just science but every investigative discipline worthy of the name. Ever since then, I’ve been fascinated by irrationalist movements that make a strong appeal to educated people who ought to know better. [page 344]
Well. It may be obvious why that resonates with me. It’s a pretty succinct and eloquent statement of the point of B&W. First the fact that my deepest loyalty is not to any particular doctrine but to empirical rationality itself, and then the fascination with educated people who ought to know better (and who teach other people, so ought to be especially careful and responsible) playing with irrationalist movements and failing (often flatly and explicitly refusing) to give their deepest loyalty to empirical rationality itself. That’s B&W, in a nutshell.
That has prompted me to ponder a little the question of why my deepest loyalty is not to any particular doctrine but to empirical rationality itself. It’s perhaps a slightly strange way to assign one’s deepest loyalty – loyalty usually seems like the kind of thing that is owed to more passion-inspiring entities than empirical rationality. It usually seems like the kind of thing that goes with inspiring doctrines but not so much with methods of inquiry. And yet deepest loyalty is the right phrase; that does describe it; it’s cognitive but also emotional; the two are thoroughly entwined. So the question is: why is that? I’ve come up with one version of an answer; I might write a book around it; but I’m not sure I’ve completely explored the question. We’ll see.
Write that book, explore that question. I think it has something to do with having both your emotions and your rationality going in the same direction, like adding vectors or something, but it isn’t just that. [To the vector go the spoils…] It’s that the rationality doesn’t switch off when the emotion switches on, maybe?
I began distancing myself from Fraudianism almost as early as Crews, though I was just a teen. I simply couldn’t believe the lameness of someone else trying to tell me what was in my subconscious, which they had never seen, any more than I had. I encountered this sort of thing more than once with college counselors – but the tide began to turn the day I figured out that if I couldn’t prove a blasted thing, neither could they. I made couch-stuffing out of one of them before the story was done. But it sure would have been easier if I had had Crews’s books. Now my library has them, thanks to my urging, and maybe some student now might have an easier time than I did.
Loyalty not to any particular doctrine but to empirical rationality itself–that sounds like it is higher up the abstraction ladder or something, more of a view from the top, and so more freedom in the long run. That’s just a guess; it’s been a long day…
I’m in the middle of it, too (just finishing Darwin’s Sunday School and getting into Zen). It’s been particularly useful with Freud, where I had previously felt a bit out of my depth. And of course, the point isn’t whether all psychoanalysts are benign or evil, it’s whether there is actually anything behind their methods. If claims are made that memories are recoverable from an age prior to our neurological ability to store them, one can come to certain conclusions. No wonder so many papers have languished under lock and key for so long; releasing them could be a death sentence for a golden goose.
But what most of the essays seem to have in common is the exposure of the opposite of the rational enquiry point you make. And maybe that is one of the most important answers to your “why” question. If you’re not guided by empirical rationality, or are guided by something that either ignores it or goes against it, doesn’t it have to mean that you’re guided by something that someone, either recently or long ago, simply made up? How many other possibilities are there? What’s left is merely the question of whether you choose to follow the path offering wishful thinking, sold with persuasiveness, endowed with seeming authority, or simply enforced through intimidation. Am I wrong in putting it in such dichotomous terms? There are lots of unsupported, unevidenced beliefs out there, but they all have in common what they don’t have in common with the method that limits itself to that for which there is evidence. It seems to me one can draw a pretty clear line between everything for which that point is paramount and everything for which it isn’t.
Just to respond in very general terms, I suspect it may have had something to do with being brought up to certain basic values, such as telling the truth, and expecting other people to do the same. Confronted later in life with the convolutions and distortions of various belief-systems and theories, a certain subset of humanity remains honest enough to cry ‘Enough! This is rubbish, you’re making no sense…’
The transition from ‘truth’ to ‘sense’ is an interesting one, of course, but they are undoubtedly entwined there somewhere.
I think being so wedded to empirical reason is at least partly due to inherent personality attributes. While no personality theory is wholly satisfactory, several of them identify a thinking/feeling continuum as one of the important aspects of personality. This may be why some people who are brought up in a deeply unreasoning environment eventually come to see that religion is a purely anthropological phenomenon, and also why some people brought up in perfectly reasonable families become drawn to all sorts of touchy-feely nonsense.
“That has prompted me to ponder a little the question of why my deepest loyalty is not to any particular doctrine but to empirical rationality itself.”
I don’t think it’s strange at all – it’s a basic belief in openness, in the primacy of process over destination.
Personally, while for example my atheism is important to me, it is quite subordinate to the (I like to think) logical reasoning that has led me there. My endorsement of various theories about how the world does and doesn’t work is secondary to my endorsement of the scientific method that has developed these theories. (And, partly analogously, my support for any given political party or policy depends on my support for democracy as a means for selecting governments and policies.)
What this amounts to is a shift from believing “X, Y and Z are right” to believing “A, B and C are the best ways to assess rightness”. We may still agree with X, Y and Z, but we remain open to being persuaded (via A, B and C) otherwise.
Progress depends on reassessing and revising old beliefs; freedom depends on being able to question authority and received wisdom. To feel passionately about the process is no oddity.
Great comments. I’m glad I asked!
I think that’s right, Stewart. If your deepest loyalty isn’t to e.r. then you’re always vulnerable to being deluded, tricked, fooled, led astray. And something in me (and in a lot of people – though not enough) just fights that, hates it, recoils at it. I think that’s personality more than upbringing, as Mike says.
What I was calling slightly strange was the idea of loyalty – the fact that I agree with Fred that it is loyalty. But – “freedom depends on being able to question authority and received wisdom. To feel passionately about the process is no oddity.”
Well, yes. Just so. Not so strange then. How interesting.
It ties up with Sen’s book, as a matter of fact. That’s one of his key points in Identity and Violence – in fact it’s the key point. Freedom depends on being able to think about and reason about what we identify with, in order to be able to choose what to prioritize at any given moment. And freedom matters. Feeling loyalty to the idea (and the reality) of freedom doesn’t seem at all odd, so neither should the idea of loyalty to empirical rationality.
“Just to respond in very general terms, I suspect it may have had something to do with being brought up to certain basic values, such as telling the truth, and expecting other people to do the same”
The funny thing is that telling the truth was a big part of what I took from my religious (Catholic) upbringing, and that this concern for truth is the very thing that eventually led me away from all religious beliefs. I think that’s a tension that many people brought up in organized religion feel, although they might not feel comfortable thinking about it too much.
The question for me, similar to OB’s point, is why the concern for truth came to trump almost everything else for me, when it hasn’t done so for other people from similar backgrounds. After all, most of us were brought up with the same core values, but secular rationalists are still about as common as Komodo dragons. It’s an interesting question as to why that is.
Phil
Me too! That is, the question of how I got to be a committed rationalist/skeptic type fella has been nagging at me for some time, and I keep coming round to the question without being satisfied by the answers. When I concluded that I simply didn’t believe any of this god nonsense at age twelve-ish (I was merely dubious about it before that age), I don’t think I’d ever even heard or read the word “atheist” – let alone personally encountered any freethinkers. How does a nice Midwestern boy brought up with a fairly typical amount of religious nonsense and with no particular training in or exposure to critical thinking turn out to be an atheist before puberty? I dunno.
I hope you continue your speculations on that question in public, because I’d certainly like to read more of others’ insights. The discussion has been interesting so far.
So we all wonder then (she said, generalizing grandly from a sample of eight) (a highly non-random sample at that). How interesting.
I think in my case it’s partly to do with the fact that I find it an imposition to be expected or urged or ordered to believe things that lack evidence or good arguments. I take it personally, in a weird way (possibly weird). Or not really – I don’t literally take it personally (unless it’s personal – addressed to me rather than to people or atheists or arrogant intellectuals in general) – but I take it as if personally. It’s a fault to take things personally that aren’t personal; an acquaintance rebuked me for doing that the other day; that’s part of what is so repellent and irksome about Motoons and Satanic Verses and Brick Lane fits; and yet…
And yet things (qualities) like that can be irksome faults in one light and motivations for useful allegiances in another.
But the real point is not the taking it personally but the imposition. That’s one place where the goats split off from the sheep, I think – people who don’t mind having nonsense thrust upon them, and people who do.
I tend to think it is something very basic about temperament – I suppose because it matches with what I’ve always known about my own temperament, even from childhood. That it’s bloody-minded, no-saying, resistant, obstinate.
I tend to think that, but only tend. I’m very tentative about the tending.
But I’m not at all obstinate, so that explanation doesn’t help me at all.
(If any of you knew me in what the kids these days call the meat world, you’d find the previous sentence spitting-your-drink-onto-your-screen funny.)
Okay, so there seems to be something to what I’ll arbitrarily label the “innate personality traits” hypothesis. I never did leave the stage of childhood where every claim an adult makes is followed by a “Why?” – and I never did learn to accept any variation on “Because I said so!” as an answer. But I can’t help feeling more than a little unsatisfied by an answer that hinges on the supposed innateness of the behavior/attitudes. It isn’t really an explanation, is it? It amounts to saying “I’m this way because this is the way I am.”
So what would be a more satisfactory explanation? Seems to me that, firstly, there needs to be some clearer grasp on exactly what constellation of behaviors and attitudes we’re trying to explain. Allegiance to empirical reasoning is a phrase that gestures towards what we’re talking about, but doesn’t seem to characterize it very fully.
This is what an education in philosophy does to a guy: First thing I want to do is define the terms.
But wait. I was the kind of person who wanted clear, explanatory definitions of important terms BEFORE I studied philosophy. Which is probably part of the constellation of behaviors and attitudes we’re trying to characterize.
Anyway, I suggest that the first step to explaining how dedicated critical thinkers/empirical reasoners get to be that way is to define more clearly exactly what collection of behaviors and attitudes we’re trying to explain. We all tend to use terms like critical thinker, skeptic, free thinker, rationalist, and so on and so forth, but none of those labels seems to get all that close to the core of what we’re talking about.
If I come up with anything, I’ll get back to you.
:-)
Ah, no, it isn’t really an explanation; I certainly didn’t intend it as complete. It was just meant to be one possibility about one aspect. I’m interested in the idea that it’s partly emotional and that it’s deep-rooted – at least in some people. I’m interested in the way temperament hooks up with cognitive choices.
(By the way, yeah, I could get exactly the same reaction from people who know me by saying exactly the same thing. But hey – obstinacy is a good thing!)
But I wouldn’t want to say temperament is either necessary or sufficient. I especially wouldn’t want to say it’s necessary, because I think the stance can be taught – at least up to a point. (I’m thinking the point up to which it can be taught is the priority – the deepest loyalty aspect. I’m guessing that could be really basic. It’s a guess though.)
“Allegiance to empirical reasoning is a phrase that gestures towards what we’re talking about, but doesn’t seem to characterize it very fully.”
No – but the thing that especially grabbed my attention was just that – the deepest loyalty to empirical rationality. The two things look like such a mismatch – and yet for us they’re not. I find that compelling. Look at them – deepest loyalty: empirical rationality. You might as well say mad passion: logical positivism. And yet it’s right on target.
Welcome to Nerds’ Corner.
I see what you mean about the attention-grabbing. My favorite paradoxical-sounding phrase that just isn’t a contradiction no matter how much it looks like one is this: absolute commitment to fallibilism.
Does this make us fundamentalists about reasoning?
:-)
G
One rarely comes to conscious answers without first thinking of the questions. While there may be those with an innately greater tendency to be questioning in that way, I think there is one factor belonging more to the “environmental” explanations of which I, certainly, am very conscious. Many of us succeed in getting through life without being directly confronted with certain problems and, for that very simple reason, don’t devote thought to them. I daresay, for example, that however nasty the general reaction of the Christians in the “Jew boy in Delaware” story, it is extremely likely that there will be some (no matter how quiet they keep it) who suddenly realize how wrong some of the behaviour is, though they would never have given the issue a thought had it not played out in front of their noses. I have certainly had the experience (more than once) myself of expressing my thoughts on an issue, only to be met with “I never actually thought about it.” When that has happened, my realization has been: of course, you never had to; I was confronted with it in a way that left me no choice.
We all emerge from certain environments and are brought up by someone, whether parents or substitutes for them. I suspect it is far more common for those doing the bringing up to believe their main task (in this area) is to inculcate certain values or moral attitudes (which may or may not include religion) than it is for them to believe it is important to instil a questioning attitude that will enable the child to reach independent opinions.
I suppose the saying for this ought to be something like “Teach a child to fish and he’ll become a fisherman. Teach a child to think and he may well end up questioning the morality of fishing.” All that said, I don’t think upbringing can either force or prevent intellectual attitudes. There are too many examples around to contradict any rules one would like to find.
I think one of my above points needs sharpening. It isn’t just that educating children to question seems to be lagging behind inculcation; much of what is being inculcated is explicitly intended to be beyond questioning.
There’s a very good analysis of all this in the last chapter of a new book titled “Why Truth Matters”