A coupla weeks back, EDGE did a whole thing where they asked scientists to sound off on what they consider the most dangerous* ideas of our day. Ignoring the obvious questions (dangerous to whom, for one), the results were predictable: specialists will specialize, as they are wont to do, regardless of how relevant their field actually is to anything that affects anyone. Long and short, their ideas just weren't that dangerous.**
Browsing Google Blog Search, I stumbled onto the webpage of a certain Ran Prieur and, therein, a true Dangerous Idea:
"The Singularity" is the biggest idea in techno-utopianism. The word is derived from black hole science -- it's the point at the core where matter has contracted to zero volume and infinite density, beyond the laws of time and space, with gravity so strong that not even light can escape. The line of no return is called the event horizon, and the word "singularity," in techno-utopianism, is meant to imply that "progress" will take us to a place we can neither predict, nor understand, nor return from.
The mechanism of this change is the "acceleration." Techies invoke "Moore's Law," which says that computer power is increasing exponentially -- in fact it's now increasing faster than exponentially. But Moore himself never called this a law, because it isn't -- it's a behavior of the present system, and it's anyone's guess how long it will continue.
But they imagine that it is somehow built into history, or even metaphysics. They trace the acceleration back into the Paleolithic, or farther, and trace it speculatively forward to computers that are more complex than the human brain, that are more aware and smarter and faster than us, that keep improving until they replace humans or even biological life itself ...
A question they never answer is: why? They seem to believe it's self-justifying, that density/speed of information processing is valuable as density/speed of information processing. They might argue that just as the biosphere is better than the universe by being more densely complex, so a computer chip is better than the biosphere.
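A quick aside on what "exponential" cashes out to here. Moore's actual 1965 observation was about transistor counts, revised in 1975 to a doubling roughly every two years. Here's a toy projection from the Intel 4004's ~2,300 transistors in 1971 -- illustrative arithmetic only, not a forecast, and certainly not a law:

    # Toy sketch of Moore's observation: transistor counts doubling
    # roughly every two years, projected from the Intel 4004 (1971,
    # ~2,300 transistors). Pure extrapolation -- the kind of curve
    # Prieur is skeptical of, not a measured dataset.
    def transistors(year, base_year=1971, base_count=2300, doubling_years=2.0):
        return base_count * 2 ** ((year - base_year) / doubling_years)

    for year in (1971, 1981, 1991, 2001):
        print(year, round(transistors(year)))
        # -> 2300, 73600, 2355200, 75366400
        # (order-of-magnitude right for real Intel parts of those eras)

The point of the exercise: nothing in that arithmetic tells you why the doubling happens, or how long it keeps up -- which is exactly Prieur's objection.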
Dangerous to whom, you ask? Me, of course.
More seriously, the guy's premise (as detailed in other essays) is that western civilization as we know it is an accretionary cancer on the biosphere in particular and on humans in general. Nothing new there, I know, but he not only talks a very good game, he apparently lives it too.
Much respect for that - even if he's failed to convince me the Singularity isn't a valid future. I'll address that now. We agree on many things:
Evolution is a biological process in which the totality of life grows more diverse and complex, and then apparently gets cut down by some catastrophe every 60 million years, and then rebuilds itself, maybe better than the time before, maybe not. Evolution is not about one life form pushing out another, or we wouldn't still have algae and bacteria and 350,000 known species of beetles. It's not about "survival of the fittest" unless fitness is defined as the ability to add to the harmonious diversity and abundance of the whole.
e.g., that evolution tends toward cooperation - or at least, toward interaction through complexity. (I'll set aside the idea that evolution is a solely biological phenomenon.) However, Ran Prieur fails on four points:
1) He's not thinking big enough.
2) With one exception, he lumps all singularitarians and transhumanists together.
3) He completely ignores the possibility (and possibilities) of superintelligence.
4) Left to conventional natural selection, human nature won't change quickly enough to avoid ending up right back here.
What do I mean by 1)? Find out in the sequel.
-------------------------
*And "Dangerous" is crap. Interesting, yes. Food for thought, sure. Unsettling ... maybe, at best.
**Well, with the exception of Richard Dawkins, Thomas Metzinger, Geoffrey Miller, and maybe two or three others.