The technological singularity, often simply called the singularity, is a hypothetical event in which technological growth accelerates beyond human control, producing unpredictable changes in human civilization. According to the most popular version of the singularity hypothesis, I. J. Good's 1965 intelligence explosion model, an upgradable intelligent agent could eventually enter a positive feedback loop of successive self-improvement cycles: each more intelligent generation would appear more rapidly than the last, causing an explosive increase in intelligence that culminates in a powerful superintelligence far surpassing all human intelligence.
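As a minimal illustrative sketch of such a feedback loop (not part of Good's original argument), suppose a system's intelligence I(t) grows at a rate that itself increases with intelligence, say \(\frac{dI}{dt} = k I^{\alpha}\) for assumed constants \(k > 0\) and \(\alpha > 1\). Solving this separable equation gives

\[
I(t) = \left[ I_0^{\,1-\alpha} - k(\alpha - 1)\,t \right]^{\frac{1}{1-\alpha}},
\]

which diverges at the finite time \(t^{*} = \frac{I_0^{\,1-\alpha}}{k(\alpha - 1)}\): under these assumptions, faster-than-exponential growth reaches a mathematical "singularity" in finite time rather than merely growing without bound. For \(\alpha \le 1\) the same equation yields only polynomial or exponential growth, so the explosive outcome hinges on the assumed strength of the feedback.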
Some scientists, including Stephen Hawking, have expressed concern that artificial superintelligence could result in human extinction. The consequences of a technological singularity and its potential benefit or harm to the human species have been intensely debated.
