My question is, what do you think would happen if robots became more intelligent than us? Please say why you think that would happen. (BTW, they gained this intelligence through a bottom-up approach, and nothing was preprogrammed into them other than the technology to gain and use intelligence, with no help from a human.)
Uhm, robots are already more intelligent than us. But my particular robot of interest would never harm me, Siri <3.
They actually aren't more intelligent than us; they just store memory. They are all preprogrammed to do things, and what they do doesn't even compare to what we can do. I think the most intelligent robot is one that is programmed to catch its own food (flies) and runs completely on that.
Btw, just researched that Siri thing you mentioned, and in my mind that actually doesn't have much intelligence, since it is just a voice recognition system and nothing else.
Don't talk to Siri like that!
And if u didn't notice, "robots" is an acronym: Really Obvious Bomb Of Tech Smart, or in normal talk, a bomb that goes to the school of tech smart.
what do you think would happen if robots became more intelligent than us?
We would have to make a robot that would be able to teach itself how to do stuff. That would be the only way a robot could become smarter than a human.
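To give a rough idea of what "teaching itself" could look like, here is a minimal sketch of a program that is never told the answer, only whether each guess was right, and improves by trial and error. It learns the logical AND function with a classic perceptron; all names here are illustrative, not from any real robot.

```python
def train_perceptron(samples, epochs=20, lr=0.1):
    """Learn weights for a 2-input threshold unit from feedback alone."""
    w = [0.0, 0.0]   # connection weights, start knowing nothing
    b = 0.0          # bias term
    for _ in range(epochs):
        for (x1, x2), target in samples:
            guess = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            error = target - guess      # feedback: was the guess right?
            w[0] += lr * error * x1     # nudge weights toward correct
            w[1] += lr * error * x2
            b += lr * error
    return w, b

# The AND function: output 1 only when both inputs are 1.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
learned = {x: (1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0) for x, _ in data}
```

Nothing about AND is preprogrammed here; the rule emerges entirely from the feedback signal, which is the spirit of the bottom-up approach the question describes (scaled down to a toy).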
Uhm, robots are already more intelligent than us. But my particular robot of interest would never harm me, Siri <3.
They know more than us if they have access to a database and are programmed to be able to access that data. Intelligence and smarts (for lack of a better word) are two different things.
It depends. If we allowed these robots to multiply and spread and act out in a physical form, then we would be idiots. But if we controlled them, then there wouldn't be a problem.
They can be as smart as they want, as long as they don't possess any characteristics that could pose a threat, such as lasers or giant teeth, or things like that.
And if u didn't notice, "robots" is an acronym: Really Obvious Bomb Of Tech Smart, or in normal talk, a bomb that goes to the school of tech smart.
Wait... Please rephrase that, because I couldn't understand it at all.
If they did, I would think that they would eventually take over.
But why WOULD they want to take over?
They can be as smart as they want, as long as they don't possess any characteristics that could pose a threat, such as lasers or giant teeth, or things like that.
But why would they want to do any harm? Why would a sudden burst of intelligence make them want to kill and control? The only reason we do that is survival of the fittest, kill or be killed, but robots don't have that problem.
If they became smart enough to take over jobs, people would kill them. Just saying.
The proper term for this is "sentience", meaning that one day computers will become self-aware. And unless being able to answer questions programmed into you means "conquering the world", I don't think robots are conquering anything any time soon. =)