Posted by Michael83 on September 10, 2007, at 21:43:47
In reply to Re: I have good news...and some more good news., posted by caraher on September 10, 2007, at 19:05:39
I believe the risk from a machine that has been described as having the potential to "destroy the entire universe" is not scalable (even if that risk has been described as literally "infinitely small"). Given the enormous potential consequences, no matter how small the statistical likelihood (even if it's extremely low), any risk is an immediate forfeit of the reward.
I'm not basing it on my own personal fear. I'm basing it on the fact that this machine carries the "ultimate risk." That is a single exception that negates all reward, no matter how great, in my opinion.
It's not even closely comparable to, say, a nuclear bomb or a terrorist attack. If an event like that were to occur, we could rebuild. The human race would go on. This is different. There is nothing above it in terms of consequences. It's an exception.