“2035. Probably earlier.”

There’s fast, and then there’s … something more

Eliezer Yudkowsky now categorizes his essay ‘Staring into the Singularity’ as ‘obsolete’. Yet it remains among the most brilliant philosophical essays ever written. Rarely, if ever, has so much of value been said about the absolutely unthinkable (or, more specifically, the absolutely unthinkable for us).

For instance, Yudkowsky scarcely pauses at the phenomenon of exponential growth, despite the fact that this already overtaxes all comfortable intuition and ensures revolutionary changes of such magnitude that speculation falters. He is adamant that exponentiation (even Kurzweil’s ‘double exponentiation’) only reaches the starting point of computational acceleration, and that propulsion into Singularity is not exponential, but hyperbolic.

Each time the speed of thought doubles, time-schedules halve. When technology, including the design of intelligences, succumbs to such dynamics, it becomes recursive. The rate of self-improvement collapses with smoothly increasing rapidity towards instantaneity: a true, mathematically exact, or punctual Singularity. What lies beyond is not merely difficult to imagine, it is absolutely inconceivable. Attempting to picture or describe it is a ridiculous futility. Science fiction dies.

“A group of human-equivalent computers spends 2 years to double computer speeds. Then they spend another 2 subjective years, or 1 year in human terms, to double it again. Then they spend another 2 subjective years, or six months, to double it again. After four years total, the computing power goes to infinity.

“That is the ‘Transcended’ version of the doubling sequence. Let’s call the ‘Transcend’ of a sequence {a₀, a₁, a₂…} the function where the interval between aₙ and aₙ₊₁ is inversely proportional to aₙ. So a Transcended doubling function starts with 1, in which case it takes 1 time-unit to go to 2. Then it takes 1/2 time-units to go to 4. Then it takes 1/4 time-units to go to 8. This function, if it were continuous, would be the hyperbolic function y = 2/(2 – x). When x = 2, then (2 – x) = 0 and y = infinity. The behavior at that point is known mathematically as a singularity.”
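The quoted construction is easy to check numerically. Below is a minimal sketch (the function name and unit constants are my own, not Yudkowsky’s): each step doubles the value while the waiting interval before the next doubling is the reciprocal of the current value, so elapsed time piles up toward the singular point x = 2 even as the value grows without bound.

```python
def transcended_doubling(steps):
    """Yield (elapsed_time, value) pairs for the Transcended doubling sequence.

    The value doubles at each step (1, 2, 4, 8, ...), while the interval
    before each doubling is inversely proportional to the current value
    (1, 1/2, 1/4, ...), as in the quoted definition.
    """
    value = 1.0
    elapsed = 0.0
    for _ in range(steps):
        yield elapsed, value
        elapsed += 1.0 / value  # interval inversely proportional to a_n
        value *= 2.0            # the sequence itself keeps doubling

points = list(transcended_doubling(50))
final_time, final_value = points[-1]
print(final_time)   # just under 2.0 -- the singularity of y = 2/(2 - x)
print(final_value)  # 2**49: the value has exploded in finite time
```

After fifty steps the value exceeds 10¹⁴, yet total elapsed time remains below 2: the hallmark of a hyperbolic, rather than merely exponential, trend.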

There could scarcely be a more precise, plausible, or consequential formula: doubling periods halve. On the slide into Singularity — I. J. Good’s ‘intelligence explosion’ — exponentiation is compounded by a hyperbolic trend. The arithmetic of such a process is quite simple, but its historical implications are strictly incomprehensible.
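The arithmetic really is simple. Using the figures from the quoted example above (a first doubling taking 2 years, each subsequent doubling taking half as long), the total time to the Singularity is a geometric series summing to 4 years. A quick sketch, with the variable names my own:

```python
# "Doubling periods halve": 2 years + 1 year + 6 months + 3 months + ...
# The series is geometric with ratio 1/2, so it converges to 4 years.
period = 2.0  # first doubling period, in years (per the quoted example)
total = 0.0
for _ in range(60):  # 60 terms is far more than double precision can resolve
    total += period
    period /= 2.0
print(total)  # converges toward 4.0 years
```

Infinitely many doublings, finite total time: that is the whole trick of the hyperbolic schedule.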

“I am a Singularitarian because I have some small appreciation of how utterly, finally, absolutely impossible it is to think like someone even a little tiny bit smarter than you are. I know that we are all missing the obvious, every day. There are no hard problems, only problems that are hard to a certain level of intelligence. Move the smallest bit upwards, and some problems will suddenly move from ‘impossible’ to ‘obvious’. Move a substantial degree upwards, and all of them will become obvious. Move a huge distance upwards…”

Since the argument takes human thought to its shattering point, it is natural for some to be repulsed by it. Yet its basics are almost impregnable to logical objection. Intelligence is a function of the brain. The brain has been ‘designed’ by natural processes (posing no discernible special difficulties). Thus, intelligence is obviously an ultimately tractable engineering problem. Nature has already ‘engineered’ it whilst employing design methods of such stupefying inefficiency that only brute, obstinate force, combined of course with complete ruthlessness, has moved things forward. Yet the tripling of cortical mass within the lineage of the higher primates has taken only a few million years, and — for most of this period — a modest experimental population (in the low millions or less).

The contemporary technological problem, in contrast to the preliminary biological one, is vastly easier. It draws upon a wider range of materials and techniques, an installed intelligence and knowledge base, superior information media, more highly dynamized feedback systems, and a self-amplifying resource network. Unsurprisingly, it is advancing at incomparably greater speed.

“If we had a time machine, 100K of information from the future could specify a protein that built a device that would give us nanotechnology overnight. 100K could contain the code for a seed AI. Ever since the late 90’s, the Singularity has been only a problem of software. And software is information, the magic stuff that changes at arbitrarily high speeds. As far as technology is concerned, the Singularity could happen tomorrow. One breakthrough – just one major insight – in the science of protein engineering or atomic manipulation or Artificial Intelligence, one really good day at Webmind or Zyvex, and the door to Singularity sweeps open.”

[Tomb]