Posted by caraher on September 12, 2007, at 22:40:23
In reply to I also want to add to this, caraher... (important), posted by Michael83 on September 10, 2007, at 22:02:01
I appreciate the logic of the argument that if you multiply an infinitely awful outcome by an appreciable probability, it's hard for any expected good to outweigh the risk of harm.
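Just to make that expected-value point concrete (my own illustration, with completely made-up numbers, not an actual risk estimate):

# Expected value when one possible outcome is infinitely bad
p_doom = 1e-9                 # assumed tiny probability of catastrophe
benefit = 1e6                 # assumed finite scientific payoff (arbitrary units)
loss = float('-inf')          # the "infinitely awful" outcome
expected = p_doom * loss + (1 - p_doom) * benefit
print(expected)               # -inf: any nonzero p_doom swamps any finite benefit

That's the structure of the argument, and you can see why it feels compelling on paper.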
But I also believe there are some things that carry essentially zero risk, and this is among them. Here's why I don't worry. First, there are natural high-energy cosmic rays, gamma rays, and other phenomena far more energetic than any particle humans can accelerate. Our powers are feeble compared with what's at work elsewhere in the universe. If concentrating that much energy in a tiny volume of space posed any genuine danger of destroying the universe, some natural process would very likely have done it already.
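To put rough numbers on that, here's my own back-of-the-envelope in Python, using the roughly 3x10^20 eV "Oh-My-God" cosmic ray detected in 1991 as the natural benchmark and the LHC's design energy for the man-made side:

from math import sqrt

E_cosmic = 3e20       # eV, highest-energy cosmic ray ever observed (approx.)
E_lhc_beam = 7e12     # eV, LHC design energy per proton beam
m_p = 0.938e9         # eV, proton rest energy

# Approximate center-of-mass energy when the cosmic ray hits a proton at rest
E_cm_cosmic = sqrt(2 * E_cosmic * m_p)
E_cm_lhc = 2 * E_lhc_beam            # two 7 TeV beams colliding head-on

print(E_cm_cosmic / 1e12)            # ~750 TeV
print(E_cm_lhc / 1e12)               # 14 TeV
print(E_cosmic / E_lhc_beam)         # lab-frame ratio, roughly 40 million to 1

Even in the fairest comparison (center-of-mass energy), nature has been running collisions tens of times more violent than anything we plan to, and it has been doing so for billions of years all over the universe.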
A variation on the "it probably would have happened already" theme depends on what you think of the Drake equation, basically a formula for estimating how many alien civilizations might exist in a galaxy. The number you get varies with your guesses for things like how common habitable planets are, but for most reasonable inputs it suggests the universe is such a big place that Earth is very unlikely to be the only home of a civilization. If you accept that we are not alone, I'd expect some civilizations to be more adventurous scientifically and some less. By now, someone "out there" has probably tried these kinds of experiments, and it seems we're still here!
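For anyone curious, the equation itself is easy to play with; here's a quick Python version with my own (entirely debatable) guesses plugged in:

# Drake equation: N = R* x fp x ne x fl x fi x fc x L
# Every value below is an illustrative guess, not an established number.
R_star = 7       # new stars formed per year in our galaxy
f_p    = 0.5     # fraction of stars with planets
n_e    = 2       # habitable planets per planet-bearing star
f_l    = 0.3     # fraction of those where life appears
f_i    = 0.1     # fraction of those where intelligence evolves
f_c    = 0.1     # fraction of those that develop detectable technology
L      = 10000   # years a civilization stays detectable

N = R_star * f_p * n_e * f_l * f_i * f_c * L
print(N)         # ~210 civilizations in our galaxy, under these guesses

Plug in more pessimistic numbers and N drops, but multiply even a small N by the hundreds of billions of galaxies out there and "we're the only ones" starts to look like a long shot.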
My own view is that doomsday scenarios based on accelerator physics are wildly speculative. There are far more serious scientific and technological risks to worry about. When experiments start probing the universe at the so-called Planck scale, that's when I'd start taking theories of an accelerator-induced doomsday seriously.
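For a sense of how far away that is, here's a rough sketch of my own; the Planck energy falls straight out of the fundamental constants:

from math import sqrt

hbar = 1.055e-34        # J*s
c    = 2.998e8          # m/s
G    = 6.674e-11        # m^3 kg^-1 s^-2
eV   = 1.602e-19        # joules per electron-volt

E_planck = sqrt(hbar * c**5 / G) / eV    # Planck energy in eV
E_lhc_cm = 14e12                         # eV, LHC design collision energy

print(E_planck)                # ~1.2e28 eV (about 1.2e19 GeV)
print(E_planck / E_lhc_cm)     # ~1e15: accelerators fall short by fifteen orders of magnitude

We are nowhere near that regime, and I don't expect to be in my lifetime.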
poster:caraher
thread:781229
URL: http://www.dr-bob.org/babble/social/20070827/msgs/782572.html