Robot colonies have learned and can communicate!
- 10 Replies
Start arming yourselves people! Little flashing, lying robots are coming to get us. They might be telling the truth, but we've got to shoot first and ask questions later.
I love it! I did my senior thesis on strong and weak AI. There is this robot they made a few years back (can't remember his name) that they always left on at night and let "watch" TV. Basically, the scanners that served as his eyes moved back and forth, tracking the movement.
So, one night they left him on and forgot to turn on the TV. When they came back he was waving his robotic arm in front of his face because he got BORED!
That is also scary. The last thing we need is AI to get bored!
Maybe you can help me out then, Ash. People here are using words like "bored," "lying," and "hero" to describe these robots' actions. Boredom can be reductively explained as a lack of stimulation of some kind, but a robot lying, or waving its arm in front of its face to alleviate boredom, implies a conscious decision - at least I think it does.
Is there some programming explanation for what a lie would be? And if so, can that explanation be used in people?
Okay, Cog was the robot I was talking about. Kismet is a newer one built around social learning rather than more purely cognitive learning. I think reading this might answer some questions. I'm not really sure how to explain the lying, but Cog got bored because he lacked the stimulus he had learned to expect every night.
I am waiting for people to do the same kind of experiment later on with more advanced robots...one of them pretends to break down. When a human walks in, it pounces! Along with all the other ones! Kinda like tribbles. But meaner. And more roboty.
Or something like that.
That article is great, Ash. I've started looking at another robot called Joey Chaos. It looks great, but isn't nearly as sophisticated as other robots.
Well, the only thing I have to say here is my definition of thinking, which I have been using in other threads. It fits AI as well, and it is as follows: reacting to a stimulus, or choosing the best course of action using several factors under any circumstance.
Devoidless is funny.
I wrote a 7-page paper on AI and came up with two methods of building it:
1: bottom up: remake the brain, cell for cell.
2: top down: use a complex answer web.
I have had very little experience programming AI myself, but I did take a Lego Mindstorms class and learned how to make a pong AI.
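For what it's worth, a pong AI really can be that simple: each frame, the paddle just chases the ball's vertical position. A minimal sketch in Python (the function name and the numbers are my own, not from any Mindstorms kit):

```python
def pong_ai_step(paddle_y, ball_y, speed=4):
    """Move the paddle at most `speed` units toward the ball each frame."""
    diff = ball_y - paddle_y
    if diff > 0:
        return paddle_y + min(speed, diff)   # ball is below: move down
    return paddle_y - min(speed, -diff)      # ball is above (or level): move up

# Chase a ball sitting at y=100 from a paddle starting at y=60
y = 60
for _ in range(12):
    y = pong_ai_step(y, 100)
```

That's the whole "AI": a reaction to a stimulus, which fits the definition given above.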
Haha, I am sorry. I am not adding anything to the conversation. I just wanted to point out I got my own paragraph about me being funny! Granted, it was only a one-sentence paragraph, but a sentence nonetheless.
No, it wouldn't. And if it does, great.
Their programming evolved so that they were the only ones left alive. Remember, they had 50 generations.
Whoa. Just thirty genes? That seems like very little. From what I remember, humans have around 20,000 genes, many of which contribute to our behavior. Can they really develop the concept of "lying" for their own personal gain? Nah. That would require them to develop a sense of self; a consciousness.
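On the thirty genes and fifty generations: in experiments like this, the "genes" are usually just numeric weights in a tiny controller, and "evolving" means keeping the fittest robots and mutating copies of them. A toy sketch of that loop in Python (the population size, mutation settings, and fitness function here are all invented for illustration, not taken from the actual experiment):

```python
import random

GENOME_LEN = 30    # thirty "genes", as discussed above
POP_SIZE = 20      # invented population size
GENERATIONS = 50   # fifty generations, as mentioned above

def fitness(genome):
    # Toy stand-in for "food found minus poison touched":
    # here we simply reward genes that are close to 1.0.
    return -sum((g - 1.0) ** 2 for g in genome)

def mutate(genome, rate=0.1, scale=0.2):
    # Each gene has a small chance of a small random nudge.
    return [g + random.gauss(0, scale) if random.random() < rate else g
            for g in genome]

random.seed(0)
pop = [[random.uniform(-1, 1) for _ in range(GENOME_LEN)]
       for _ in range(POP_SIZE)]
initial_best = max(fitness(g) for g in pop)

for _ in range(GENERATIONS):
    pop.sort(key=fitness, reverse=True)
    survivors = pop[:POP_SIZE // 2]   # only the fittest stay "alive"
    pop = survivors + [mutate(random.choice(survivors)) for _ in survivors]

best = max(pop, key=fitness)
```

The point is that nothing in this loop needs a sense of self: behavior that looks like "lying" can fall out of pure selection pressure on a handful of numbers.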
Thread is locked!