Posted by Adam on February 23, 2000, at 12:18:42
In reply to Re: Out walking the dogma..., posted by Elizabeth on February 22, 2000, at 22:22:37
> I think this is necessary (but not sufficient) if we're to believe in such a thing as "free will" (nondeterminism). Then again, as you point out, it's not so obvious that we have "free will" in this sense (regardless of whether we would *like* to have it).
I hadn't heard the free will argument, but it is interesting. I guess it all depends on what you define as "free". I would imagine all living organisms are "free" in that there's no good evidence for an invisible puppet master pulling the strings. However, as an ant appears to be an automaton to us, so we might appear to a more advanced species, and would differ in our enslavement to "hardwired" impetus only by a matter of degree.
One argument for the presence of a quantum computer might go back to Turing again. I can't remember all the details of his universal computer, but I do recall that there is one question no such machine can answer in general: whether an arbitrary program will eventually halt or loop forever (the halting problem). For that Turing imagined an "Oracle", an undescribed component that could tell the machine in advance whether it would get stuck. I guess Turing and others felt we have this "Oracle". For example...
10 print "this is absurd"
20 goto 10
You can just look at this and know it will go on forever, but no program can reliably make that assessment for every possible program, and I don't think anyone is sure how we do it. You could, for instance, do this...
10 let N=0
20 print "this is absurd"
30 let N=N+1
40 if N=1000000000 then 60
50 goto 20
60 print "this is really absurd"
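Turing's actual argument for why no program can decide halting in general is a short self-reference trick, and it can be sketched in a few lines. Here is a minimal sketch in Python (rather than the BASIC above, for brevity); the function names are my own invention. The pretend `halts` decider can only ever commit to some answer, and `troublemaker` is built to do the opposite of whatever `halts` predicts about it:

```python
# Sketch of the diagonal argument behind the halting problem.
# `halts` stands in for a hypothetical perfect decider; here it simply
# commits to one fixed answer, and `troublemaker` is constructed to
# contradict whatever `halts` predicts about troublemaker itself.

def halts(f):
    # Pretend decider: claims f() will never halt.
    return False

def troublemaker():
    if halts(troublemaker):
        while True:   # predicted to halt, so loop forever
            pass
    # predicted to loop forever, so halt immediately

prediction = halts(troublemaker)  # False: "it will loop forever"
troublemaker()                    # ...but it returns immediately
print("prediction was", prediction)
```

Flip the pretend decider to return True and `troublemaker` instead loops forever; either way the prediction is wrong, which is why the Oracle can't be built out of ordinary computation.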
I know this is a gross oversimplification, but it does seem that the only way you can stop an infinite loop in a program from happening is to include some sort of explicit instruction. But what if you need to go through a gazillion iterations of something before you get the right answer and you don't know how large N will be before you have that answer? I guess then you just set an upper limit for N and hope you get what you want. Again, the computer can't make those decisions on its own, so it doesn't appear to think, just calculate. Hence the brain seems to many to have some special property. This might fit in with "free will".
I'm not so sure this is true, though. The first program I showed above is so simple, why not suppose the brain has just evolved to run the program until "boredom" sets in and then quit, deciding, perhaps erroneously in some cases, that the program would go on forever? If a program is sufficiently complex, the programmer might not be in any better shape than a computer: she runs the program, and if an infinite loop exists, hits the kill switch after some arbitrary time and debugs. She does an experiment, gets a result, and acts accordingly. I can't see why a digital computer couldn't be programmed to do the same thing, if it were sophisticated enough.
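That "boredom" strategy is easy to make concrete. Here is a minimal sketch in Python (rather than BASIC, for brevity), with invented names and a toy program format of my own: simulate a tiny BASIC-like program one step at a time and simply give up after a fixed step budget, guessing that it would have looped forever:

```python
# A toy "boredom" heuristic: run a tiny BASIC-like program step by
# step, and give up once a step budget is exhausted, guessing that
# the program would have run forever. The program format (a dict of
# line number -> instruction) is invented for illustration.

def run_until_bored(program, max_steps=10_000):
    """program maps line numbers to ('print', text) or ('goto', target).
    Returns 'halted' or 'bored' (our guess at an infinite loop)."""
    lines = sorted(program)
    pc = lines[0]
    for _ in range(max_steps):
        op = program[pc]
        if op[0] == 'goto':
            pc = op[1]
        else:  # 'print' (output suppressed); fall through to next line
            idx = lines.index(pc)
            if idx + 1 == len(lines):
                return 'halted'
            pc = lines[idx + 1]
    return 'bored'  # guess: it would have gone on forever

# The two-line loop from earlier in the post, and a one-line program:
looping = {10: ('print', 'this is absurd'), 20: ('goto', 10)}
straight = {10: ('print', 'this is absurd')}

print(run_until_bored(looping))   # bored
print(run_until_bored(straight))  # halted
```

The verdict "bored" can of course be wrong for a slow program that would eventually finish, which is exactly the kind of mistake a bored human programmer makes too.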
So, again, maybe all you need is a really huge but fundamentally simple neural network computer that doesn't require "oracular" powers to do everything we do.
> > Maybe people think the mind is more complex than it really is.
> Probably; we like to flatter ourselves. :-)
Exactly. What is a little scary is that maybe the next step in our evolution, or at least our creative endeavors, is to create a faster and more robust array of molecular switches and turn it loose to decide what to do on its own. There's no good reason to believe that it wouldn't be conscious, have emotions, or have a desire to procreate and improve on itself, and it could do all that better than we could. It might also be able to readily download the contents of its brain to the next generation, giving its offspring something even better than instinct: experience (though I suppose from an evolutionary/genetic standpoint, there might be no fundamental difference except in the way information is transferred). I imagine something like that would evolve at an exponential rate until it hit the limits set on computation by uncertainty and the speed of light. Where would we fit in such a scenario? Maybe it would be compassionate and wouldn't kill us off, but our thoughts might seem as interesting to it as a bacterium's are to us. Then we'd be left aware of our own pathetically limited status as carbon-based bags of saline while some god-like entity was out there rearranging the cosmos. I can only hope we could somehow download our own thoughts and become part of this new species, while agreeing to abstain from sexual reproduction so as not to perpetuate the existence of a species with an intractable inferiority complex.