Overcoming Bias: how can we obtain beliefs closer to reality?

Some of you already read Overcoming Bias, the blog of Oxford’s Future of Humanity Institute (I’ve seen gjm11 commenting there, and wildeabandon mentioned it, I think). I’ve been reading quite a bit of the archives recently, as evidenced by the number of comments I’ve made referring to old postings there.

The bias of the title is cognitive bias, the psychological term for systematic mistakes human minds tend to make. The purpose of the blog is self-help and social change: “If we know the common patterns of error or self-deception, maybe we can work around them ourselves, or build social structures for smarter groups. We know we aren’t perfect, and can’t be perfect, but trying is better than not trying.”

Eliezer Yudkowsky is one of the main contributors there. He’s an interesting character: heavily invested in ideas about the Singularity and Friendly AI. His stuff on Overcoming Bias touches on those interests, but is worthwhile even if you consider such ideas silly (I’m not sure whether I do or not at this point: my instinctive reaction that this stuff is far-fetched may be an example of bias).

What I like about his writing is that it’s usually clear and incisive. He shows a passion for reason (contrary to Star Trek, a passion for reason isn’t a contradiction in terms) and almost a reverence for it. You get the feeling that his SF stuff about Bayesian masters undergoing the Ritual Of Changing One’s Mind isn’t just an illustrative analogy. Coming so soon after I read Anathem, I see the blog as one place where this world’s avout hang out. Stuff like Diax’s Rake would be right up their alley.

livredor once told me that one of my biases is to latch on to someone very clever and align my beliefs with theirs (I think this bias is a common one among technical people who have taught themselves some philosophy). So I ought to be a little careful when I read his stuff. Yudkowsky’s fault is that he too is self-taught, so he needs his (likewise very clever) commenters to point out when he’s covering old ground, has missed the standard arguments against his position, or isn’t using the standard definition of a term (as when he argues that his moral views are not moral relativism, for example). Some of the postings where he talks about how he used to think a whole load of wrong stuff and now doesn’t can get tedious (ahem). In some cases he’s written extended series of posts where I don’t understand the conclusion he’s trying to draw (the series on morality is an example).

Still, I’m very much enjoying articles like his piece on staging a Crisis of Faith (which isn’t ultimately about losing religious faith, but about changing long-held opinions; it’s a good introduction to the blog as a whole, as it links to many other good articles at the bottom), Cached Thoughts, Are Your Enemies Innately Evil? (shades of Bartlet’s “They weren’t born wanting to do this” there), Avoiding Your Belief’s Real Weak Points, Belief in Belief (not quite your standard Dennett argument); and his argument that Elijah conducted the original scientific experiment.

I recommend the blog to you lot. If you like reading blogs on LJ, you can find it at overcomingbias.

8 thoughts on “Overcoming Bias: how can we obtain beliefs closer to reality?”

  1. Yes, OB is frequently very good. High-quality comments, too.

    I liked your allusion to “Saunt Eliezer” in a recent post.

    It may be worth saying explicitly here that his article about a “Crisis of Faith” *isn’t* primarily about (ir)religious ones; in fact, it largely assumes that his readers are irreligious to begin with.

  2. Subject: Less Wrong
    User pw201 referenced your post from Less Wrong, saying: […] absence of gods and dragons. Of course, I stole it from Overcoming Bias (mentioned previously here). Carl Sagan’s point in the original invisible dragon story is about falsifiability. The crew over at Overcoming Bias use it another way, … […]
