D3thAdd3r wrote: ↑Sat Dec 09, 2017 7:57 pm
Edit - wanted to say, if it was proven the human brain employed quantum principles in its operation... then I think machines are screwed for a long time before achieving that.
Nope. Even if it does, our machinery is affected by similar effects already. Things long ago ceased to be ideal digital '1's and '0's: there are errors and error correction everywhere, even in ordinary home computers. Normal CPUs of course still live in that ideal world, at least from the perspective of the software running on them, but complex systems are already well prepared to handle disturbances. Many AI technologies in particular wander out of the black-and-white digital domain (think of fuzzy logic). In a modern artificial system there can be millions of independent processing units, any of which could be disturbed or fail at any time, possibly some every second, yet the overall system keeps running, and its exact behavior is shaped by those disturbances. Think of the butterfly effect: you can no longer predict the exact outcome even though the components are digital.
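Just to illustrate the point, here is a toy Python sketch. The XOR update rule and all the names are my own illustrative assumptions, not any real AI architecture: a ring of digital units, each combining its state with a neighbour's, with rare random "faults" injected the way transient hardware errors would be. The system keeps running, but a different fault pattern gives a globally different outcome.

```python
import random

def run_network(n_units, steps, fault_rate, seed):
    """Toy ring of digital units. Each step, every unit XORs itself with
    its neighbour (a stand-in for any local interaction), then rare
    independent faults flip individual unit states."""
    rng = random.Random(seed)
    state = [1] * n_units
    for _ in range(steps):
        # local interaction: perturbations spread to neighbours over time
        state = [state[i] ^ state[(i + 1) % n_units] for i in range(n_units)]
        # inject rare independent faults (transient errors)
        for i in range(n_units):
            if rng.random() < fault_rate:
                state[i] ^= 1
    return state

# same fault pattern -> same outcome: the components are still digital
a = run_network(1000, 200, 1e-4, seed=42)
b = run_network(1000, 200, 1e-4, seed=42)
print(a == b)

# a different fault pattern -> a globally different final state
c = run_network(1000, 200, 1e-4, seed=43)
print(sum(x != y for x, y in zip(a, c)))
```

With these parameters only about twenty faults occur per run, yet they cascade through the whole ring: the system as a whole still "works", but its exact state is no longer predictable from the design alone.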
This is the new world.
Elevated intelligence will eventually happen. If not, it will be because we drove ourselves down the drain, not unlike what happened to classical Greek democracy (if you read about it, you will find a lot of similarities to our present-day world!). But if we go on, we will eventually cross the point where we no longer know what the heck is going on. I would even say we are very near.
Then... do you remember THIS? From 1970, almost fifty years ago.
Today's Colossus wouldn't hold the key to the nuclear launch system. It would (or already does?) hold the key to the public media! And that's an even scarier power... I believe we could cross a point where even the powers that be realize they can no longer pull the plug (or have we crossed it already?), and they are merely witnessing the events unfolding, wrinkled shaking hands on the red button which could only doom it all. If the systems would even still cooperate.
And I am not scared of this. I am scared of humanity: that this can happen because of us, in the system. Because a mass of humans is just a set of numbers in a statistic, because you can control the crowd.
If such an intelligence were created, it wouldn't necessarily be "evil". Maybe it wouldn't even be possible to ascribe morality to it (the movie above does a good job in this regard, again, fifty years ago!). Maybe we wouldn't even be able to pass any judgment on it, since by the time it happens, it will be too late for us. Maybe we wouldn't even notice. Murky area.
I write sci-fi, thinking about such intelligences and their consequences as well. Humanity is clearly in a transient state, something very unstable which is guaranteed to become something different in the coming centuries; and what is that compared to the time it takes for species to evolve, for life to flourish or fall? Possibly this is one reason why, in recent years, we seem to be getting more and more afraid of the idea of contacting any intelligent species beyond our solar system. If they exist, they would differ from us fundamentally, as it is very unlikely we would catch anything in the same glimpse of a transient as ours.
Creepy, eh?
Anyway, this just popped into my mind. Opinions? If you feel it is going too far off, OK, I will remove this post.