What if we are living at the edge of changes and breakthroughs that will lead us into an unknown stage of development?
What if we are living now at the edge of tremendous changes and breakthroughs that will lead humankind to evolve faster than ever over the course of the twenty-first century? What if we are at the edge of a tipping point in the history of humankind?
Mathematician and sci-fi writer Vernor Vinge wrote, back in 1993, that the acceleration of technological progress has brought us to “the edge of change comparable to the rise of human life on Earth.” Vinge, one of the first writers to envisage cyberspace, argued in that essay, written for a NASA symposium and later published by the Whole Earth Review, that the cause of this sweeping change would be “the imminent creation by technology of entities with greater than human intelligence.” He called this tipping point a singularity.
A singularity, in physics and astronomy, is a point where the laws of physics as we know them break down. My friend Ulisses Leitão, a professor of physics and Linux evangelist, tells me that the greatest singularity is the Big Bang moment: “density would be so high, tending to infinity, that present Physics would not be able to describe its physical behavior. It would require a unified theory of all fundamental forces of Nature: electromagnetic, strong nuclear, weak nuclear and gravitational. A theory encompassing Quantum Physics (small dimensions), Relativity (high energies), and Universal Gravitation (large distances). This theory doesn’t exist; we’ve been searching for it for the last 60 years…” If the search is possible, i.e. if we have the tools to search for it, then, in the long run, the theory is possible. That’s the point to me.
Vinge thinks that the technological singularity marks a moment beyond which huge but unpredictable changes occur, as John Hind explained in a 2002 article for the Guardian. In his original presentation of the singularity, Vernor Vinge said that “when greater-than-human intelligence drives progress, that progress will be much more rapid.” He went even further: “there seems no reason why progress itself would not involve the creation of still more intelligent entities, on a still-shorter time scale.” That is thinking about the future as creatively as possible.
Envisaging inevitable surprises we can anticipate without ever knowing their consequences for us in advance, as Peter Schwartz proposes in his 2003 book Inevitable Surprises: Thinking Ahead in a Time of Turbulence, is also a form of creative thinking. These are two different and equally valid ways of thinking about a “history” for the future that is boldly visionary and technically sound.
The inevitable surprises we can anticipate, as hypothesized by Peter Schwartz, are not in contradiction with the possibility that we are plunging into a singularity, a whirlpool of vertiginous change. The former tells us about changes we can anticipate without knowing their consequences. The latter tells us about a tipping point after which change will accelerate beyond imagining, arriving at a quantum leap in human evolution that goes beyond everything we’ve known so far.
The difference is that one way of thinking points to the possibility of anticipating the changes that could create the means for the emergence of a singularity. The other invites us to try to anticipate the broader consequences of these events. Vinge invites us to look at the time “where our old models must be discarded and a new reality rules.” From the human point of view, this change will be “a throwing away of all the previous rules, perhaps in the blink of an eye, an exponential runaway beyond any hope of control.”
Can you imagine how much controversy this idea created in academic and intellectual circles more than 15 years ago? Reaction to the singularity hypothesis was widespread. Supporters have multiplied as well. Social scientist Robin Hanson once collected several comments on Vinge’s singularity. One comment has direct implications for the whole idea of looking into the future: it stated that nothing is certain, that we are always dealing with hypotheses. Nick Bostrom, director of the Future of Humanity Institute, said that he did not “regard the singularity as being a certainty, just one of the more likely scenarios”.
The singularity has raised controversy since the first time Vernor Vinge used the idea, fictionally and rather diffusely, in a novel, Marooned in Realtime. In the story, a character says at a certain point in the plot: “It was the Singularity, a place where extrapolation breaks down and new models must be applied. And those new models are beyond our intelligence.” It is a breaking point, a paradigm shift beyond the concepts we are used to, similar to the passage from the Middle Ages to the Enlightenment.
It is easy to understand all the controversy. We are talking about two orders of unknowns, and neither is easy to look into. The future is an entertaining idea until we start to realize it points to our ineluctable finitude. We have to make ourselves comfortable with the idea of looking beyond ourselves and our loved ones. The singularity radicalizes this vision: it points beyond human dominance in the universe. Not comfortable at all. Ray Kurzweil and several others took this idea much further, into the realm of transhumanism. But that is far beyond my view.
Stewart Brand has a point worth recalling in The Clock of the Long Now: time is asymmetrical to us. We can see the past but we can’t change it. Yet we still argue about the past, I’d add. We cannot see the future, he continues, but we can influence it. He is not implying we can control the way future events will unfold. It is not about trying to control the future, but about giving it, i.e. future generations, the tools to help themselves.
Isn’t that precisely what we are trying to do about climate change? We know, or most of us know, that we cannot control natural laws. There is very little we can do with today’s tools about the amount of greenhouse gases we’ve already sent into the atmosphere, or the global warming we’ve already bought with the carbon emitted so far. We can, however, develop tools to adapt human society to these very likely events. We can develop tools and the required means of governance to reduce future emissions and avoid worst-case scenarios.
None of these challenges is about certainty, about knowing beyond any doubt. Certainty will always fall within the realm of our finitude. They are about uncertainty, risk, chances we should not take. We can estimate probabilities and make educated guesses about probable consequences. To do that we must look into the future, and doing it with art, creativity, imagination and boldness helps a lot. Worse than revealing the good and bad things that might be brewing in our future would be making these views dull and obvious.
As far as climate change is concerned, I would rather be warned of risks greater than what is most likely to happen than be informed of risks that might fall short of probable outcomes. The same is true for me regarding the future history of this century. I’d rather think that by 2100 humankind will have overcome its frailties, brutality and insensitivity; learned solidarity with the sufferings of those different from oneself; and domesticated the propensity of the powerful to oppress and of the rich to amass far more wealth than they can manage, than imagine it will all be the same.
There is a tipping point looming on the horizon of our future. It may have nothing to do with Vinge’s singularity. We can be sure of only one thing: change will be overwhelming, our old models will have to be discarded, and a new reality will rule.
Tags: climatechange, futures, globalwarming, scenario, scifi, singularity, tippingpoint