Tuesday, 14 July 2009

ACCELERATING CHANGE


Some singularity proponents argue its inevitability through extrapolation of past trends, especially those pertaining to shortening gaps between improvements to technology. In one of the first uses of the term "singularity" in the context of technological progress, Ulam (1958) tells of a conversation with John von Neumann about accelerating change:

One conversation centered on the ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue.

Hawkins (1983) writes that "mindsteps", dramatic and irreversible changes to paradigms or world views, are accelerating in frequency as quantified in his mindstep equation. He cites the inventions of writing, mathematics, and the computer as examples of such changes.
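The claim that such events are "accelerating in frequency" can be made concrete with a toy model (the constants below are illustrative only, not Hawkins' actual mindstep parameters): if the gap between successive events shrinks by a fixed ratio, the event dates form a geometric series and converge to a finite limit date.

```python
def mindstep_dates(first_date, first_interval, ratio, n):
    """Return n event dates whose gaps shrink geometrically by `ratio`."""
    dates, gap = [first_date], first_interval
    for _ in range(n - 1):
        dates.append(dates[-1] + gap)
        gap *= ratio
    return dates

def limit_date(first_date, first_interval, ratio):
    """Limit of the series: first_date + first_interval / (1 - ratio)."""
    return first_date + first_interval / (1 - ratio)

# Hypothetical run: first event in 3000 BC, first gap 2000 years,
# each subsequent gap 40% of the previous one.
dates = mindstep_dates(first_date=-3000, first_interval=2000, ratio=0.4, n=8)
print([round(d) for d in dates])
print(round(limit_date(-3000, 2000, 0.4)))  # the year the dates pile up toward
```

The point of the sketch is only that geometrically shrinking intervals imply a finite accumulation date, which is the qualitative shape of Hawkins' argument.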

Ray Kurzweil's analysis of history concludes that technological progress follows a pattern of exponential growth, following what he calls The Law of Accelerating Returns. He generalizes Moore's Law, which describes geometric growth in integrated semiconductor complexity, to include technologies from far before the integrated circuit.
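The pattern Kurzweil generalizes can be sketched numerically. The function below assumes simple geometric growth with a fixed doubling period; the two-year period and the roughly 2,300-transistor Intel 4004 starting point are stock Moore's-law illustrations, not figures taken from Kurzweil.

```python
def projected_count(initial, years, doubling_period=2.0):
    """Geometric growth: the quantity doubles every `doubling_period` years."""
    return initial * 2 ** (years / doubling_period)

# Ten doublings over 20 years multiply the starting value by 1024:
print(projected_count(2300, 20))  # 2300 * 2**10 = 2,355,200
```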

Whenever technology approaches a barrier, Kurzweil writes, new technologies will cross it. He predicts paradigm shifts will become increasingly common, leading to "technological change so rapid and profound it represents a rupture in the fabric of human history" (Kurzweil 2001). Kurzweil believes that the singularity will occur before the end of the 21st century, setting the date at 2045 (Kurzweil 2005). His predictions differ from Vinge's in that he foresees a gradual ascent to the singularity, rather than a rapidly self-improving superhuman intelligence.

By this reasoning, an artificial intelligence capable of improving on its own design would itself face a singularity. This idea is explored by Dan Simmons in his novel Hyperion, where a collection of artificial intelligences debate whether or not to make themselves obsolete by creating a new generation of "ultimate" intelligence.

The Acceleration Studies Foundation, an educational non-profit foundation founded by John Smart, engages in outreach, education, research and advocacy concerning accelerating change (Acceleration Studies Foundation 2007). It produces the Accelerating Change conference at Stanford University and maintains the educational site Acceleration Watch.

Presumably, a technological singularity would lead to rapid development of a Kardashev Type I civilization. On the Kardashev scale, a Type I civilization has achieved mastery of the resources of its home planet, Type II of its planetary system, and Type III of its galaxy.[3] Given that, depending on the calculations used, humans on Earth will reach 0.73 on the Kardashev scale by 2030 or sooner, a technological singularity between now and then would push us rapidly past that threshold.
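Fractional ratings like 0.73 come from Carl Sagan's continuous interpolation of the Kardashev scale, K = (log10 P − 6) / 10, with P the civilization's total power use in watts; a minimal sketch:

```python
import math

def kardashev(power_watts):
    """Sagan's continuous Kardashev rating: K = (log10(P) - 6) / 10.

    On this interpolation, Type I corresponds to about 10**16 W,
    Type II to about 10**26 W, and Type III to about 10**36 W.
    """
    return (math.log10(power_watts) - 6) / 10

# A rating near 0.73 corresponds to roughly 2e13 W of power use:
print(round(kardashev(2e13), 2))
```

The exact wattage behind humanity's 0.73 figure depends, as the text notes, on the calculations used; the function only shows how a power level maps to a rating.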

