cognitive bias

William liked the bit in my last post where I said that most believers are carrying a map of the real world somewhere, because they know in advance what excuses to make for the apparent absence of gods and dragons. Of course, I stole it from Overcoming Bias (mentioned previously here). Carl Sagan’s point in the original invisible dragon story is about falsifiability. The crew over at Overcoming Bias use it another way: to think about what’s going on in a dragon-believer’s head when they know enough to anticipate the results of testing for the dragon, but not enough to say “there’s no dragon”.

It’s that sort of keen observation that keeps me going back to Overcoming Bias despite all the stuff about freezing your head when you die. The aim of the game for Biasers is to have a map which matches the territory, and to be able to read it aloud. They’ve started Less Wrong, a new site where anyone can contribute something they think will help achieve this aim. It’s based on the code for Reddit, where users can vote stories up or down, though at Less Wrong, the editors manually promote stories to the front page, and there’s a separate page where you can view stuff that’s merely popular. You can follow Less Wrong on LiveJournal by adding less_wrong to your friends list.
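To make that moderation model concrete, here’s a minimal sketch in Python (my own illustration, not Less Wrong’s actual code, which is a fork of Reddit’s; the names are made up): stories accumulate votes from users, but only an editor’s say-so gets them onto the front page, while the “popular” listing runs on votes alone.

```python
from dataclasses import dataclass

@dataclass
class Story:
    title: str
    votes: int = 0          # net score from user up/down votes
    promoted: bool = False  # flipped manually by an editor

def front_page(stories):
    """Front page: editor-promoted stories only, highest-scoring first."""
    return sorted((s for s in stories if s.promoted),
                  key=lambda s: s.votes, reverse=True)

def popular_page(stories):
    """The separate 'popular' listing: everything, ranked by votes alone."""
    return sorted(stories, key=lambda s: s.votes, reverse=True)

stories = [Story("Cached Thoughts", votes=42, promoted=True),
           Story("Open thread", votes=99)]
# "Open thread" tops the popular page, but stays off the front page
# until an editor sets promoted=True.
```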

The community is working pretty well so far. Watching the decline of Kuro5hin makes me worry that community-moderated sites will turn to crap (although there’s still some good stuff over at k5, such as an article about the tendency of community-moderated sites to turn to crap), but having real humans in charge of promoting articles might mitigate that. The system has given some new voices a chance, notably Yvain. Here are some of my favourite articles so far:

I’ve made a few comments over there, although nothing earth-shattering: sympathising with someone whose girlfriend left him for Jesus, or talking about Bernard Woolley and irregular verbs.

I’ve been thinking about posting some more about what I’ve got out of Overcoming Bias and Less Wrong here on LJ. It’s all very well ranting about religion, but rationality isn’t graded on a curve. Don’t worry, religion-rant fans: I’ve got a few more of those lined up too.

Some of you already read Overcoming Bias, the blog of Oxford’s Future of Humanity Institute (I’ve seen gjm11 commenting there, and wildeabandon mentioned it, I think). I’ve been reading quite a bit of the archives recently, as evidenced by the number of comments I’ve made referring to old postings there.

The bias of the title is cognitive bias, the psychological term for systematic mistakes human minds tend to make. The purpose of the blog is self-help and social change: “If we know the common patterns of error or self-deception, maybe we can work around them ourselves, or build social structures for smarter groups. We know we aren’t perfect, and can’t be perfect, but trying is better than not trying.”

Eliezer Yudkowsky is one of the main contributors there. He’s an interesting character: heavily invested in ideas about the Singularity and Friendly AI. His stuff on Overcoming Bias touches on those interests, but is worthwhile even if you consider such ideas silly (I’m not sure whether I do or not at this point: my instinctive reaction that this stuff is far-fetched may be an example of bias).

What I like about his writing is that it’s usually clear and incisive. He shows a passion for reason (contrary to Star Trek, a passion for reason isn’t a contradiction in terms) and almost a reverence for it. You get the feeling that his SF stuff about Bayesian masters undergoing the Ritual Of Changing One’s Mind isn’t just an illustrative analogy. Coming so soon after I read Anathem, I see the blog as one place where this world’s avout hang out. Stuff like Diax’s Rake would be right up their alley.

livredor once told me that one of my biases is to latch on to someone very clever and align my beliefs with theirs (I think this bias is a common one among technical people who have taught themselves some philosophy), so I ought to be a little careful when I read Yudkowsky’s stuff. His faults are that he’s also self-taught, so he needs his (likewise very clever) commenters to point out when he’s covering old ground, has missed the standard arguments against his position, or isn’t using the standard definitions of some terms (as when he argues that his moral views are not moral relativism, for example). Some of the postings where he talks about how he used to think a whole load of wrong stuff and now doesn’t can get tedious (ahem). In some cases he’s made extended series of posts where I don’t understand the conclusion he’s trying to draw (the series on morality is an example).

Still, I’m very much enjoying articles like his piece on staging a Crisis of Faith (which isn’t ultimately about losing religious faith, but about changing long-held opinions; it’s a good introduction to the blog as a whole, as it links to many other good articles at the bottom), Cached Thoughts, Are Your Enemies Innately Evil? (shades of Bartlet’s “They weren’t born wanting to do this” there), Avoiding Your Belief’s Real Weak Points, Belief in Belief (not quite your standard Dennett argument), and his argument that Elijah conducted the original scientific experiment.

I recommend the blog to you lot. If you like reading blogs on LJ, you can find it at overcomingbias.