I'm fascinated by the fact that raising a question also raises doubts. The World Future Society's annual conference was last weekend in Toronto. Sometimes I wonder: "Is the WFS still relevant?" As a kid, I loved thinking about all the futuristic flying cars and moon trips, but somewhere along the way, I became jaded with the pages of Omni magazine and moved on. My stereotypical futurist: an old guy who wants to live forever in a cryogenic body on Mars.
The frost on my interest melted a bit a while back when I heard that Ray Kurzweil was going to keynote at the 2004 WFS gathering. I splurged for the full conference proceedings on MP3 and listened straight through (well, there were breaks). Turns out, the society does still attract a few edge cases (not necessarily in the same warm sense in which Liz Lawley meant when she said it about Scoble), but there are many more great folks who have perspectives well worth considering, integrating and forwarding.
The breadth of interests is wider than any reasonable person should expect, and there is sure to be someone who you just hafta meet (though it may take a bit more haystack diving than it would at SXSW-interactive). There is a kind of curmudgeonly patina to the whole WFS event, which can be kind of endearing (for short stints).
In one evening, I listened to Ervin Laszlo on one stage, and then Ray Kurzweil on another. How I would love to watch these two reconcile the space between their positions. Anyone following Kurzweil lately knows his latest book (The Singularity Is Near) highlights the exponential growth that has been occurring for a while, and which is reaching a point where changes will be very surprising within very short time spans. One of his biggest claims is that Moore's law will yield a chip as powerful as the human brain by the 2020s, and that 20 years beyond that, a single chip will have the processing power of all of humanity.
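The extrapolation is easy to play with yourself. Here's a minimal sketch; the specific numbers (a chip at roughly 10^12 operations per second today, a constant two-year doubling time, and 10^16 ops/sec as one common estimate of brain-scale processing) are my assumptions for illustration, not Kurzweil's exact figures:

```python
import math

def years_until(target_ops, current_ops=1e12, doubling_years=2.0):
    """Years of steady doubling needed for current_ops to reach target_ops."""
    doublings = math.log2(target_ops / current_ops)
    return doublings * doubling_years

BRAIN = 1e16             # assumed brain-scale estimate, ops/sec
HUMANITY = BRAIN * 6e9   # ~6 billion brains, by simple multiplication

print(round(years_until(BRAIN)))     # ≈ 27 years under these assumptions
print(round(years_until(HUMANITY)))  # ≈ 92 years with a *constant* doubling time
```

Note that with a fixed doubling period the second milestone lands much more than 20 years after the first; Kurzweil's shorter gap depends on the doubling time itself shrinking (his "law of accelerating returns").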
Kurzweil is the first to admit that processing power alone does not equal intelligence, but he says that real intelligence as a human technological artifact is not far behind. I don't quarrel with this contention, as long as we realize what we're talking about. If we view the human brain as a closed container that produces intelligence purely from internal processes, it won't matter whether that container is built of silicon or carbon.
But is this a fair assumption? Kevin Kelly claims that the Singularity Is Always Near. He makes some good points. One in particular echoes my own concern. While listening to Kurzweil, I couldn't help thinking that his argument was oversimplified in one important respect. In his model, the jump in processing power from one human brain to all human brains is simple arithmetic: the processing power of 6 billion brains is just 6 billion times the processing power of a single brain.
Let's look for an analogy to ground this thought: neurons within the brain. What is the processing power of a single neuron? How many neurons are there in the brain? Is the processing power of a brain a simple multiple of the number of neurons in the brain, or do all of those synaptic connections matter? It's a bit trite, but the whole really is greater than the sum of the parts. That's the thing with emergent systems. It seems obvious that the same kind of emergent property is true for humans as well. The number of connections between humans means that the total processing power of all humanity is increasing as the internet matures, even if the total human population remains constant or even declines some.
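To make the contrast concrete, compare the shapes of the two curves: summing isolated brains grows linearly with n, while the number of possible pairwise connections grows as n(n-1)/2. (Counting raw pairs is my own Metcalfe's-law-style illustration, not a claim about how brains actually combine.)

```python
def total_power_isolated(n, per_brain=1):
    """Kurzweil-style arithmetic: n brains are just n times one brain."""
    return n * per_brain

def pairwise_connections(n):
    """Possible links among n connected brains: n choose 2."""
    return n * (n - 1) // 2

for n in [10, 100, 1000]:
    print(n, total_power_isolated(n), pairwise_connections(n))
# 10    10    45
# 100   100   4950
# 1000  1000  499500
```

The isolated-sum column grows tenfold at each step while the connection count grows roughly a hundredfold, which is the whole point about emergence: whatever the connections contribute, it isn't captured by multiplication.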
Before this past weekend, my suspicion of the techno-savior hypothesis (an AI that will save humanity from itself) was intuitive and vague. Thinking through the issue, and reading Kelly's essay, has sharpened it considerably. As computers take over some of the grunt work from us, we won't simply relinquish our thinking to machines. Rather, it seems much more likely that we will awaken new talents for thinking in much more abstract realms. Realms that computers will struggle to keep up with.
Technology does proceed faster than the changes wrought by biological evolution, but I suspect that evolution has already seeded our biological wiring with capacities that are still dormant and just starting to come online. The singularity is already here. The past is receding at an accelerating rate. Maybe it's the case that it always has and always will. I don't believe that the brain contains intelligence, but is more like an intelligence resonator. The more you connect, the more you can connect.
Well, anyway. I enjoyed the conference immensely, and look forward to next year's meeting. Hopefully, the society will have caught up with the 1990's by then. That'd be super neato.