I'm sure everyone here has seen "Terminator" or maybe the movie "AI". My question is: what are your thoughts on AI life? Mine: I dislike it, and I believe we shouldn't even explore the possibility of it. Robots should be controlled by humans, not themselves.
It's an interesting topic, robots taking over the world. Robots would not have the capability of AI without human programming. Yes, they may be able to adapt, but that adaptation is itself human programming.
If push comes to shove and robots do find a way to take over the world, we can still take them down. As Chiliad said... EMPs will take them down in a heartbeat. And just to be safe, we could always put self-destruct devices in robots so we can just hit a big red button and all the robots go *BOOM!*
There's a weakness with this: were robots to become "self-aware," they would inevitably become aware of their own operating parameters, including said quick-release.
Given the nature of AI as we see it functioning, AI wouldn't evolve; it would adapt at a rate far faster than could be described as evolution. They'd probably devise EMP shielding, or form a surveillance network that directly thwarted our opportunities to use an EMP.
I don't think it really matters that much whether we are able to "grant" AI the drive to adapt. The alternative I see to granting it to them would be that they "evolve" it themselves: we'd generate some kind of organic micro-unit, toss it into a community with selective pressures, and see which ones survive...
AI can give us a lot, but we cannot let it go too far. If everything is done with AI, then nothing will be done by humans. Ever read the book Dune? I don't think it will get that far, and I'm not calling on any part of the book as an argument; I just wanted to get an idea across.
Human knowledge and technology are increasing at an exponential rate. Twenty years ago we made a breakthrough or serious advancement every seven years; now it's every seventeen months. Someone said that we'd have AI in 500 years. If the human species is still around in 500 years, I believe we will already have populated other planets and will have advanced spacecraft, and probably even made contact with alien species (by which I mean us going to them). Going by the rate of advancement and where AI currently stands, I'd say we'll probably have it in about 40 years.
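Just to make that extrapolation concrete, here is a toy Python sketch built only on the numbers quoted above (every seven years then, every seventeen months now). Those figures are rough guesses, so this is purely illustrative, not evidence for the 40-year estimate:

# Toy extrapolation of the breakthrough-interval claim above.
# Assumptions (all taken from the post, none rigorous):
#   - 20 years ago: one major advance every 7 years (84 months)
#   - today: one major advance every 17 months
#   - the interval keeps shrinking at the same exponential rate
interval_then = 84.0   # months between advances, 20 years ago
interval_now = 17.0    # months between advances, today
years_elapsed = 20

# Annual shrink factor implied by those two data points
shrink = (interval_now / interval_then) ** (1 / years_elapsed)

for years_ahead in (10, 20, 40):
    projected = interval_now * shrink ** years_ahead
    print(f"In {years_ahead} years: one major advance every {projected:.1f} months")

Running it, the projected interval drops below a month within about 40 years, which is the kind of curve the post is gesturing at; whether real progress actually follows that curve is another question.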
I've never seen "AI", but I've seen the Terminator movies and I, Robot, so all I know is that no robot can think for itself. If they get a virus, they will just do nothing until they run out of battery, or shut down, or blow up... so yeah, I guess if they blow up then they can be dangerous... but they will never kill anyone unless that's what they're designed to do... so I would never recommend the army start making robots, because they might get a virus and start killing us.
For an AI to be classed as life, it must fulfill the seven requirements of life, among them reproduction. Not just making copies of itself, because that produces no variation, so it isn't really reproduction. This would be hard for an AI to achieve. So can an AI ever really be life?
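For what it's worth, "copying with variation" is easy enough to sketch in software; whether that counts as the biological kind of reproduction is the real question. A minimal, purely illustrative Python sketch, where the "genome" is just a made-up list of numeric parameters and the mutation rate and size are arbitrary choices:

import random

def reproduce(genome, mutation_rate=0.1, mutation_size=0.05):
    """Copy a parent 'genome' (a list of numbers) with small random variation.

    This is copying-with-variation, the thing the post says a bare copy lacks.
    The genome contents, rate, and size here are arbitrary illustrative values.
    """
    child = []
    for gene in genome:
        if random.random() < mutation_rate:
            gene += random.gauss(0, mutation_size)  # occasional small mutation
        child.append(gene)
    return child

parent = [0.5, 1.2, -0.3, 0.8]   # made-up parameters
offspring = reproduce(parent)
print(parent)
print(offspring)                 # nearly identical, but not an exact copy

So the mechanical part is trivial; the harder part of the argument is whether variation produced by a programmed random-number generator counts the same way heritable variation does in living things.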