No more crying then

Listen, I tell you a mystery: We will not all sleep, but we will all be changed – in a flash, in the twinkling of an eye, at the last trumpet. For the trumpet will sound, the dead will be raised imperishable, and we will be changed.

The idea of the Singularity comes from an article by Vernor Vinge. Vinge says that, what with the progress in genetics and computer hardware (such as the trend computer people know as Moore’s Law), it’s likely that we will eventually create entities with greater than human intelligence. When this happens, those entities will know how to create entities even more intelligent than they are, and so on. With ever increasing speed, the curve of intelligence against time will move upwards, and our successors will pass beyond our comprehension (the name “Singularity” comes from the mathematical term for the place on a curve where an infinity occurs, such as at x=0 in the graph of 1/x). As Vinge writes:

This change will be a throwing-away of all the human rules, perhaps in the blink of an eye – an exponential runaway beyond any hope of control. Developments that were thought might only happen in a million years (if ever) will likely happen in the next century.

The essence of the Singularity is a change in human nature which is incomprehensible to those who, either by choice or lack of opportunity, are not part of the change.

In my wanderings around the web, I came across various sites belonging to people who would like to bring about the Singularity. Now, Vinge’s original article is mostly about how the Singularity could go wrong. Evil computers could take over the world, or the aggressive evolutionary heritage of enhanced humans could bring them down (though I’d like to believe that even a superintelligent being which was as selfish as a human would have a better idea of what was good for it than we do right now). So this enthusiasm was a little surprising at first.

I heard a loud voice from the throne saying, “Now the dwelling of God is with men, and he will live with them. They will be his people, and God himself will be with them and be their God. He will wipe every tear from their eyes. There will be no more death or mourning or crying or pain, for the old order of things has passed away.” He who was seated on the throne said, “I am making everything new!”

When I look at the pages advocating the Singularity, I see something like a Christian eschatology for techies. Not for nothing has Charles Stross called the Singularity “The Rapture of the Nerds”. I’m an engineer, of sorts. I make things. I look at the world and want to fix it. Things which are broken, and which could be fixed if only someone would put their mind to it, frustrate me, and apparently others too.

I have had it. I have had it with crack houses, dictatorships, torture chambers, disease, old age, spinal paralysis, and world hunger. I have had it with a planetary death rate of 150,000 sentient beings per day. I have had it with this planet. I have had it with mortality. None of this is necessary. The time has come to stop turning away from the mugging on the corner, the beggar on the street. It is no longer necessary to look nervously away, repeating the mantra: “I can’t solve all the problems of the world.” We can. We can end this. — Staring into the Singularity, Eliezer S. Yudkowsky

Cory Doctorow writes that the Singularity idea gives you that tingly, numinous feeling when you read about it on the web because the Google-guided flow of the net puts you in the right mood for the experience. I don’t agree with his other objections: there’s no reason why you can’t simulate the body, glands and all; Penrose might be right, but, unless I’ve missed out on some very big news, there’s no particular reason to think so right now; and there’ll be another fad in AI when genetic algorithms have paid out. True, the AI cheque always seems to be in the post (allow 30 years for delivery), but biotech, and so what Vinge calls IA rather than AI, might be going places.

Could I say truly that this generation will not pass away until these things have come to pass? I’m not sure I quite share the Singularitarians’ beliefs: Doctorow is right in saying that it sounds too good to be true. The dot-bomb should have taught nerds caution when extrapolating trends which seem to be good for them. But sooner or later, if we survive, I suppose I or my descendants might face some questions about whether to join the change.

And being a bit of a techie, I find it easier to believe in a transformed life beyond that particular veil than in Heaven (though both of them have the problem of not quite being able to imagine how I’d still be me). I don’t believe that suffering is essential to humanity (if I was the protagonist at the end of Greg Egan’s short “Reasons to be Cheerful“, I’d have cried for a bit and then moved the sliders: some emotions are best experienced in small doses). And, Dr Asimov, although it’s true that eyes do more than see, their successors should be better yet. So, if I’m still around, sign me up.

9 Comments on "No more crying then"


  1. It’s an interesting idea… though if I’m allowed to be pedantic, I think it’s misnamed. Each entity’s ability to create more intelligent entities being proportional to its own intelligence, the growth would be exponential, so you wouldn’t reach infinity in a finite time. And it certainly won’t happen this generation if you’re relying on genetic manipulation. The trouble with things that grow really fast is that they tend to be very sensitive to initial conditions. We couldn’t expect to tell now (and with mortal intelligence) whether the end result will be evil machines who take over the world, or universal peace and harmony.


    1. Each entity’s ability to create more intelligent entities being proportional to its own intelligence, the growth would be exponential, so you wouldn’t reach infinity in a finite time.

      True. But Singularity sounds cooler than Exponentiality, I suppose.

      And it certainly won’t happen this generation if you’re relying on genetic manipulation.

      Depends how long a generation is once you’ve been genetically manipulated, but yes, that’s the slower way in (but also the way that sounds more likely to me). Your standard William Gibson/Matrix neural interface might be another way: one can imagine how someone who’s a virtuoso on that particular instrument would effectively be more intelligent than someone without that advantage.

      We couldn’t expect to tell now (and with mortal intelligence) whether the end result will be evil machines who take over the world, or universal peace and harmony.

      Yes. The argument from people who want this to happen is that technological societies must either destroy themselves or transcend. I’m not sure I believe this: you’ve got vaguely plausible SF scenarios like the Culture (yes, it’s them again), with people surfing the breaking wave of technology without falling off or just going for all-out Godhood, even where that option’s available (Banks’s “Subliming” sounds pretty much like a Singularity to me).


    2. you wouldn’t reach infinity in a finite time

      I was going to point this out, but decided it would be too mathmo.

      I’m not convinced that an entity’s ability to create more intelligent entities is proportional to its intelligence. For starters, there’s a minimum intelligence at which it is possible to create entities. Personally, I would expect it to be super-linear, making a finite-time infinity possible, especially as the amount of time required to create an entity might decrease as intelligence increases.
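      The difference between the two growth laws can be made explicit with a toy model (my own illustration, not anything claimed by the commenters): if the rate of improvement is merely proportional to intelligence, the growth is exponential and finite at every time; a super-linear law, by contrast, really does blow up in finite time.

      ```latex
      % Linear feedback: exponential growth, finite for all t
      \frac{dI}{dt} = kI \quad\Longrightarrow\quad I(t) = I_0 e^{kt}

      % Super-linear feedback: the solution diverges at t^* = 1/(k I_0)
      \frac{dI}{dt} = kI^2 \quad\Longrightarrow\quad I(t) = \frac{I_0}{1 - k I_0 t}
      ```

      So the "too mathmo" point is exactly right: whether the Singularity is literally a singularity depends on whether the feedback is linear or super-linear in intelligence.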


        1. What about regression toward the mean?

          Isn’t the point of singularity that you’re going away from the mean, rather than towards it?

          In order to go towards the mean, you need to require that above a certain level of intelligence, you could only produce entities that are less intelligent than yourself…

          Now there’s a thought. Perhaps this is, in fact, the case, and this “mean” intelligence is what we erroneously call “omniscience”. An omniscient entity can only create other omniscient entities, or sub-omniscient entities.


          1. See, it might be the point of singularity, but it might be the point at which the model collapses. One generation could probably produce a next generation of greater intelligence (physical fitness, disease-freeness, beauty), but the generation after would probably regress like billy-o. Just to spite them. Ha.


  2. Could I say truly that this generation will not pass away until these things have come to pass?

    Why do we want this generation to pass away? Historically, this is one of the best times to be alive. And of course there are problems. But instead of doing away with an entire “generation” wouldn’t it be better to solve these problems in a practical way? So, instead of saying “Lets build machines that can solve the problem”, we could actually solve the problems ourselves. Don’t turn away from the beggar on the street. Send food to someone who’s starving. Give money to the poor. Become a doctor and cure old age.

    This way, instead of staring blindly at what could be in the future or wishing for a fantastic life for our great-great-great grandchildren, we could enjoy the lives we have now.


    1. Why do we want this generation to pass away?

      Well, I don’t, seeing as I’m part of this generation (I’m using “pass away” to mean die here). It’d be a bummer to be in the last generation that dies of old age, rather than in the first which doesn’t. So you can see how the Singularity people would predict that it’ll occur within the lifetimes of people living now. (And I’ve phrased it like that because I’m quoting yet another apocalyptic bit of the Bible).

      we could actually solve the problems ourselves

      We can make a difference to that starfish, but I’m at a loss to know how I’d feed all the hungry or even arrange for it to be done. Perhaps I’m too pessimistic, but it often seems that there’s nothing I can do which will make a big enough difference to make it worth doing.

      The attraction of apocalyptic stuff is the idea that wrongs are righted at a stroke (punishing of evil-doers in lakes of fire optional). You can understand why: Daniel and Revelation both seem to be written to persecuted communities and have been adopted by the persecuted ever since, according to Melvyn Bragg and his Radio 4 posse.


  3. Perhaps the super-intelligent beings derived from modern-day humanity in the far future will discover a LJ backup tape at an archaeological site (let’s say Earth) and will use their God-like infinite powers to recreate us all, extrapolating from our LJ posts? 🙂

