The Human Importance of the
Intelligence Explosion
Eliezer Yudkowsky
Singularity Institute for Artificial Intelligence
singinst.org
"Intelligence explosion:"
• Concept invented by I. J. Good (famous name in Bayesian statistics) in 1965.
• Hypothesis: The smarter you are, the more creativity you can apply to the task of making yourself even smarter.
• Prediction: A positive feedback cycle rapidly leading to superintelligence. (A toy numerical sketch follows below.)
(Good, I. J. 1965. Speculations Concerning the First Ultraintelligent Machine. Pp. 31-88 in Advances in Computers, 6, F. L. Alt and M. Rubinoff, eds. New York: Academic Press.)
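A toy numerical sketch of the feedback hypothesis (my own illustration; the update rule and constants are arbitrary, chosen only to show the shape of the curve):

```python
# Toy model: compare a constant rate of improvement against one in which
# current capability feeds back into the rate of further improvement.

def fixed_progress(steps, rate=0.1):
    """Capability grows by a constant increment per step."""
    x = 1.0
    for _ in range(steps):
        x += rate
    return x

def self_improving(steps, k=0.1):
    """Each increment scales with capability squared: smarter systems
    apply more creativity to making themselves smarter."""
    x = 1.0
    for _ in range(steps):
        x += k * x * x
    return x

for n in (5, 10, 15):
    print(f"steps={n:2d}  fixed={fixed_progress(n):5.2f}  "
          f"self-improving={self_improving(n):10.2f}")
```

Under this rule the self-improving series crawls at first, then blows up; that is the qualitative shape of Good's prediction.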
The intelligence explosion hypothesis does not imply, or require, that:
• More change occurred from 1970 to 2000 than from 1940 to 1970.
• Technological progress follows a predictable curve.
• "Real AI" is even possible! (An intelligence explosion could happen with augmented humans.)
"Book smarts" vs. cognition:
"Book smarts" evokes images of:
• Calculus
• Chess
• Good recall of facts
Other stuff that happens in the brain:
• Social persuasion
• Enthusiasm
• Reading faces
• Rationality
• Strategic ability
The scale of intelligent minds: a parochial view.
[Diagram: a scale running from Village idiot to Einstein]
A more cosmopolitan view:
[Diagram: a much wider scale: Mouse, Chimp, Village idiot, Einstein]
The power of intelligence:
• Fire
• Language
• Nuclear weapons
• Skyscrapers
• Spaceships
• Money
• Science
One of these things is not like the other...
• Space travel
• Extended lifespans
• Artificial Intelligence
• Nanofactories
Intelligence:
• The most powerful force in the known universe; we see its effects every day.
• The most confusing question in today's science: ask ten scientists, get ten different answers.
• Not a complete mystery: there is a huge library of knowledge about mind, brain, and cognition, but it is scattered across dozens of different fields!
If I am ignorant about a phenomenon,
this is a fact about my state of mind,
not a fact about the phenomenon.
Confusion exists in the mind, not in reality.
There are mysterious questions. Never mysterious answers.
(Inspired by Jaynes, E.T. 2003. Probability Theory: The Logic of Science. Cambridge: Cambridge University Press.)
For more about intelligence:
• Go to http://singinst.org/
(Or google "Singularity Institute")
• Click on "Summit Notes"
• Lecture video, book chapters
The brain's biological bottleneck:
• Neurons run at 100 Hz
• No read access
• No write access
• No new neurons
• Existing code not human-readable
Relative difficulty:
• Build a Boeing 747 from scratch.
Versus:
• Starting with a bird,
• Modify the design to create a 747-sized bird,
• That actually flies,
• As fast as a 747,
• Then migrate the actual living bird to the new design,
• Without killing the bird or making it very unhappy.
The AI Advantage (for self-improvement)
• Total read/write access to own state
• Absorb more hardware (possibly orders of magnitude more!)
• Understandable code
• Modular design
• Clean internal environment
Biological bottleneck (for serial speed)
• Lightspeed is >10^6 times faster than signal propagation along axons and dendrites.
• A synaptic spike dissipates >10^6 times the thermodynamic minimum heat, though today's transistors do worse. (See the sketch after this list.)
• Transistor clock speeds are >>10^6 times faster than neuron spiking frequencies.
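For reference, here is the thermodynamic floor the second bullet compares against: Landauer's limit, kT ln 2, evaluated at body temperature. The 10^6 multiplier is the slide's claim; the rest is textbook physics.

```python
# Landauer's minimum heat per irreversible bit operation, k_B * T * ln 2,
# at body temperature. The slide claims a synaptic spike dissipates more
# than 1e6 times this minimum.
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 310.0            # approximate body temperature, K

landauer_J = k_B * T * math.log(2)
print(f"Landauer limit:  {landauer_J:.2e} J")        # ~2.97e-21 J
print(f"1e6 x Landauer:  {landauer_J * 1e6:.2e} J")  # ~2.97e-15 J
```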
• It is physically possible to build a brain at least 1,000,000 times as fast as a human brain
• Even without shrinking the brain, lowering its temperature, quantum computing, etc.
• Drexler's Nanosystems argues that a sensorimotor speedup of >>10^6 is also possible
• 1 year → 31 seconds (worked out below)
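At a flat 10^6 serial speedup (the slide's assumption), the conversion checks out:

```python
# How much real time does one subjective year take at a 1e6 speedup?
SPEEDUP = 1e6
SECONDS_PER_YEAR = 365.25 * 24 * 3600   # ~3.156e7 seconds

real_seconds = SECONDS_PER_YEAR / SPEEDUP
print(f"1 subjective year ≈ {real_seconds:.1f} real seconds")  # ≈ 31.6
```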
[The mind-scale diagram again: Mouse, Chimp, Village idiot, Einstein]
10,000 years to nanotech? (for superintelligence)
• Solve a chosen special case of protein folding
• Order custom proteins from online labs with 72-hour turnaround
• Proteins self-assemble into a primitive device that accepts acoustic instructions
• Use it to build 2nd-stage nanotech, 3rd-stage nanotech, etc.
• Total time: 10,000 subjective years ≈ 4 real days (worked out below)
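The same 10^6-speedup assumption yields the 4-day figure:

```python
# At a 1e6 speedup, how many subjective years fit into 4 real days?
SPEEDUP = 1e6
REAL_DAYS = 4

subjective_years = REAL_DAYS * 24 * 3600 * SPEEDUP / (365.25 * 24 * 3600)
print(f"{REAL_DAYS} real days ≈ {subjective_years:,.0f} subjective years")  # ≈ 10,951
```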
Respect the power of creativity, and be careful what you call "impossible".
[Diagram: the scale of technological change: Hunter-gatherers, Agriculture, Ancient Greeks, Renaissance, Industrial Revolution, Electrical Revolution, the Nuclear, Space, Computer, Biotech, and Internet Revolutions, Molecular nanotechnology]
vs.
[Diagram: the same timeline with kinds of minds added for comparison: Bees, Chimps, Hunter-gatherers, Internet]
Can an intelligence explosion be avoided?
• Self-amplifying once it starts to tip over
• Very difficult to avoid in the long run
• But many possible short-term delays
• Argument: A human-level civilization occupies an unstable state; it will eventually wander into either a superintelligent region or an extinct region.
Fallacy of the Giant Cheesecake
• Major premise: A superintelligence could create a mile-high cheesecake.
• Minor premise: Someone will create a recursively self-improving AI.
• Conclusion: The future will be full of giant cheesecakes.
Power does not imply motive.
Spot the missing premise:
• A sufficiently powerful AI could wipe out humanity.
• Therefore we should not build AI.
• A sufficiently powerful AI could develop new medical technologies and save millions of lives.
• Therefore, build AI.
Spot the missing premise:
• A sufficiently powerful AI could wipe out humanity.
• [And the AI would decide to do so.]
• Therefore we should not build AI.
• A sufficiently powerful AI could develop new medical technologies and save millions of lives.
• [And the AI would decide to do so.]
• Therefore, build AI. (Both arguments turn on the bracketed premise; a formal sketch follows below.)
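A minimal formalization of the pattern (my own sketch, not from the talk, with hypothetical proposition names): from capability alone the conclusion does not follow; adding the bracketed premise about motive makes the argument valid.

```lean
-- Sketch: power alone does not entail the outcome; the argument goes
-- through only once the bracketed motive premise is supplied.
example (Capable Motivated Outcome : Prop)
    (h1 : Capable)                          -- "the AI could do X"
    (h2 : Motivated)                        -- "[and the AI would decide to]"
    (bridge : Capable → Motivated → Outcome) : Outcome :=
  bridge h1 h2
```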
Design space of minds-in-general
[Diagram: a vast space of possible minds, in which "All human minds" is one small region among alien clusters labeled Bipping AIs, Freepy AIs, and Gloopy AIs]
AI isn't a prediction problem, it's an engineering problem.
We have to reach into mind design space, and pull out a mind such that we're glad we created it...
Challenge is difficult and technical!
"Do not propose solutions until the problem has been discussed as thoroughly as possible without suggesting any."
-- Norman R. F. Maier
"I have often used this edict with groups I have led - particularly when they face a very tough problem, which is when group members are most apt to propose solutions immediately."
-- Robyn Dawes
(Dawes, R. M. 1988. Rational Choice in an Uncertain World. San Diego, CA: Harcourt Brace Jovanovich.)
What kind of AI do we want to see?
Much easier to describe AIs we don't want to see...
"Friendly AI"...
(the challenge of creating an AI that, e.g., cures cancer, rather than wiping out humanity)
...looks possible but very difficult.
The intelligence explosion:
Enough power to... make the world a better place?
Someday, the human species has to grow up.
Why not sooner rather than later?
In a hundred million years, no one's going to care who won the World Series, but they'll remember the first AI.
For more information, please visit the Singularity Institute at http://singinst.org/