<head>
<script>
// Send a named thinker's percentile estimates to the parent frame.
// The k5/k50/k95 keys hold the 5th, 50th, and 95th percentile values for
// question 8, given at the start year (2010) and the stop year (2070).
function loadPercentile(name, startK5, startK50, startK95, stopK5, stopK50, stopK95) {
  var args = {
    "caption": "Loaded our interpretation of " + name + "'s probability.",
    "Q8.y2010.0k5.0": startK5,
    "Q8.y2010.0k50.0": startK50,
    "Q8.y2010.0k95.0": startK95,
    "Q8.y2070.0k5.0": stopK5,
    "Q8.y2070.0k50.0": stopK50,
    "Q8.y2070.0k95.0": stopK95
  };
  // Hand the distribution off to the containing page's loader.
  top.loadData(args);
}
</script>
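<!-- Illustrative sketch only (not part of the original page): the hard-coded
     arguments the buttons below pass to loadPercentile appear to be
     log10-scale values (for example, -2.3010 is log10 of 0.005, i.e. 0.5%).
     The helper below shows that interpretation under that assumption; its
     name is hypothetical and nothing on the page calls it. -->
<script>
// Hypothetical helper: convert a log10-scale percentile value into an
// ordinary probability, assuming the button arguments are log10 values.
function percentileToProbability(logValue) {
  return Math.pow(10, logValue);
}
// Example: percentileToProbability(-2.3010) is roughly 0.005, i.e. 0.5%.
</script>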
</head>
<body>
<P CLASS="western" STYLE="margin-bottom: 0in">What is the probability that a disaster not related to nuclear weapons will set progress toward human-level AI back decades or centuries? For example, consider runaway climate
change, a biotechnologically engineered plague, self-replicating
nanomachines, economic collapse, a planetary totalitarian government
that restricts technology development, or something unknown.</P>
<UL>
<LI><P CLASS="western" STYLE="margin-bottom: 0in">
<B>Claim: </B>"I
think it is no exaggeration to say we are on the cusp of the further
perfection of extreme evil, an evil whose possibility spreads well
beyond that which weapons of mass destruction bequeathed to
nation-states, on to a surprising and terrible empowerment of
extreme individuals. An immediate consequence of the Faustian
bargain in obtaining the great power of nanotechnology is that we
run a grave risk — the risk that we might destroy the biosphere on
which all life depends. If our own extinction is a likely, or even
possible, outcome of our technological development, shouldn't we
proceed with great caution?"<BR>
<B>Implication: </B>The chance
of civilization-destroying disaster could be significant.<BR>
<B>Source:
</B>Joy, Bill. <U><A TARGET="_blank" CLASS="western" HREF="http://www.wired.com/wired/archive/8.04/joy.html">"Why the future doesn't need us"</A></U>. <I>WIRED</I>, Issue 8.04, April 2000.</P>
<LI><P CLASS="western">
<B>Claim:</B> It's highly unlikely for us,
as observers, to be among the first tiny fraction of humans ever
born.<BR>
<B>Implication:</B> The total number of humans who will ever live is probably
comparable to the number already born, implying a major
population-limiting catastrophe sometime within the next few
thousand years. One estimate puts the chance of such a catastrophe
between now and 2070 at 4% to 8% (95% confidence), or about
0.67% - 1.3% per decade if the probability is spread evenly over
time (this arithmetic is worked through in the sketch after this
list).<BR>
<B>Source:</B> Leslie, John. <U><A TARGET="_blank" CLASS="western" HREF="http://www.amazon.com/End-World-Science-Ethics-Extinction/dp/0415184479/">The
End of the World: The Science and Ethics of Human Extinction</A></U>.
New York: Routledge, 1998.</P>
<LI><P CLASS="western">
<B>Claim:</B> Sir Martin Rees, Astronomer
Royal and President of the Royal Society, puts the odds for human
extinction this century at 50%.
<input type="button" onclick="loadPercentile('Rees', -2.7796, -2.3010, -1.8239, -2.7796, -2.3010, -1.8239);" value="Load distribution"><BR>
<B>Source: </B>Rees, Martin. "The
Energy Challenge." The World Question Center 2007. Retrieved 9
Aug. 2008. <<U><A TARGET="_blank" CLASS="western" HREF="http://www.edge.org/q2007/q07_15.html">http://www.edge.org/q2007/q07_15.html</A></U>>.</P>
<LI><P CLASS="western">
<B>Claim:</B> Oil production will peak soon,
which will not kill off the species but will put an end to
technology development.
<input type="button" onclick="loadPercentile('Heinberg', -2.6227, -1.6990, -0.8036, -2.7786, -2.3021, -1.8274);" value="Load distribution"><BR>
<B>Implication: </B>Our species will
survive, albeit with a more primitive technological base. Progress
towards AI would essentially cease.<BR>
<B>Source:</B> Heinberg,
Richard. <I>The Party's Over</I>. British Columbia: New
Society Publishers, 2003.</P>
<LI><P CLASS="western">
<B>Claim:</B> The end of the world has been
predicted many, many times before, and it's never
happened.
<input type="button" onclick="loadPercentile('Nelson', -6.0, -4.6990, -4.3979, -6.0, -4.6990, -4.3979);" value="Load distribution"><BR>
<B>Implication:</B> The human species will continue in
the near future, pretty much as it has been.<BR>
<B>Source:</B>
Nelson, Chris. "A Brief History of the Apocalypse." 13
Oct. 2005. Retrieved 9 Aug. 2008. <<U><A TARGET="_blank" CLASS="western" HREF="http://www.abhota.info/">http://www.abhota.info/</A></U>>.</P>
<LI><P CLASS="western">
<B>Claim: </B>We may systematically
underestimate the likelihood of a humanity-destroying disaster
because of selection effects. Since we're obviously here and alive,
we necessarily live in a time when a humanity-destroying disaster
hasn't occurred. It's impossible to learn from catastrophic
disasters that wipe out humanity, because when they occur, the
observers are eliminated. So species-destroying threats are always
theoretical — by the time they actually happen, it's too late to
learn from mistakes.<BR>
<B>Implication: </B>The probability of
disaster may be higher than many people think.
<input type="button" onclick="loadPercentile('Bostrom', -4.0, -2.6990, -1.4057, -4.0, -2.6990, -1.4057);" value="Load distribution"><BR>
<B>Source:</B> Bostrom, Nick. "Existential Risks: Analyzing Human
Extinction Scenarios". <I>Journal of Evolution and Technology</I>,
vol.9, March 2002. <<U><A TARGET="_blank" CLASS="western" HREF="http://nickbostrom.com/existential/risks.html">http://nickbostrom.com/existential/risks.html</A></U>></P>
<LI><P CLASS="western">
<B>Claim: </B>We may underestimate extinction
risks because they are presented in generalities rather than highly
detailed, specific scenarios. Psychological studies have
demonstrated an effect called the conjunction fallacy, in which
subjects believe that the joint occurrence of two events, A and B,
is more likely than the occurrence of A (with or without B). For
instance, the vivid image of nuclear war between the US and China
over Taiwan may seem more probable than the abstract idea of nuclear
war for any reason. <BR>
<B>Implication: </B>The probability of
disaster may be higher than many people think.<BR><B>Source:
</B>Yudkowsky, Eliezer. <U><A TARGET="_blank" CLASS="western" HREF="http://www.singinst.org/upload/cognitive-biases.pdf">"Cognitive
biases potentially affecting judgment of global risks"</A></U>. <EM>Global
Catastrophic Risks</EM>, eds. Nick Bostrom and Milan Cirkovic. Oxford University Press, 2008.
</P>
</UL>
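<P CLASS="western" STYLE="margin-bottom: 0in">The per-decade figures in the
Leslie entry above come from spreading the 2010-2070 total evenly over six
decades. A minimal sketch of that arithmetic follows; the variable names and
the constant-hazard variant are illustrative, not taken from the source.</P>
<PRE CLASS="western">
// Spread a total catastrophe probability for 2010-2070 evenly over decades.
var totalLow = 0.04, totalHigh = 0.08;     // 4% to 8% between now and 2070
var decades = 6;                           // 2010 through 2070

// Even split: divide the total by the number of decades.
var perDecadeLow  = totalLow / decades;    // about 0.0067, i.e. 0.67%
var perDecadeHigh = totalHigh / decades;   // about 0.0133, i.e. 1.3%

// A constant-hazard reading gives nearly the same answer at these magnitudes:
// 1 - Math.pow(1 - totalLow, 1 / decades) is also roughly 0.0068.
</PRE>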
</body>