For those who don't know, AI = artificial intelligence: basically, a machine that reacts as the human brain does. This means it has to be able to learn and adapt to change. What I want to know is: what does AG think of it? Anyone automatically thinking of The Terminator and The Matrix?
Yes. Counting in binary isn't random; mapping it to text is. There may be a pattern to the numbers they use for each letter, but that's just for simplicity's sake; it could have been anything. That website's explanation of binary code isn't actually binary code. They're calling it that to make it simpler (which is terribly misleading); it's actually just teaching you how to count in binary.
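To illustrate the distinction being made here: counting in binary follows a fixed positional rule, nothing arbitrary about it. A minimal sketch in Python (my own choice of language, not from the thread):

```python
# Counting in binary is deterministic: each bit position is a power of 2,
# so every integer has exactly one fixed-width binary representation.
def to_binary(n: int, width: int = 8) -> str:
    """Return n as a zero-padded binary string of the given width."""
    return format(n, f"0{width}b")

for n in range(4):
    print(n, to_binary(n, 4))
# 0 0000
# 1 0001
# 2 0010
# 3 0011
```

Assigning those numbers to letters, by contrast, is a separate convention layered on top.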
It says "This is accomplished by assigning a bit string to each particular symbol or instruction".
If we are assigning it, there doesn't need to be logic behind it. We can assign "a" to 01100001, or we can give it another string of numbers such as 01011011.
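The point about assignment being a convention can be sketched like this (the alternative table below is made up purely for illustration):

```python
# Standard ASCII happens to assign 'a' the bit string 01100001, but the
# mapping is a shared convention: any consistent table would work equally well
# as long as both the writer and the reader use the same one.
ascii_code = {c: format(ord(c), "08b") for c in "abc"}

# A hypothetical alternative encoding: same letters, different arbitrary bits.
made_up_code = {"a": "01011011", "b": "11100010", "c": "00010111"}

def encode(text: str, table: dict) -> str:
    """Encode text as space-separated bit strings using the given table."""
    return " ".join(table[c] for c in text)

print(encode("a", ascii_code))    # 01100001 (the ASCII convention)
print(encode("a", made_up_code))  # 01011011 (equally valid, just not standard)
```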
That's the same as saying Braille reading is random.
It pretty much is. We have "A" represented by a dot lowered then a dot raised (I believe), but why couldn't we have had it as a raised dot then a lowered dot? Would it have made a difference? No; blind people would just have had to learn that it was raised then lowered rather than lowered then raised.
It actually is a 4-bit system, simplified to make it easier for people to understand.
Once again, math in binary isn't binary code.
It's not misleading; it's explaining the basics.
It's explaining the basics of binary math, not binary code, so it is misleading.
If it were a 64- or 128-bit system, it would become too hard to explain because of the sheer number of digits.
It's still irrelevant, but it's not harder to explain; you just keep adding a bit.
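The "just keep adding a bit" point, sketched quickly (again assuming Python):

```python
# Widening the representation doesn't change the counting rule at all: each
# extra bit doubles the range, and the same number just gains leading zeros.
n = 200
for width in (8, 16, 32):
    print(width, format(n, f"0{width}b"))
# 8 11001000
# 16 0000000011001000
# 32 00000000000000000000000011001000
```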
Anyway, it seems you're not willing to understand that EVERYTHING a computer does is based on this, so I won't try either.
Why does it seem like that? I've written computer programs that can convert text to binary. I've made a calculator in Minecraft (it still uses circuitry). I already understand all of this; you are just ignoring my arguments.
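For reference, a text-to-binary converter like the one mentioned takes only a few lines. A minimal Python sketch (function names are mine, not from the thread):

```python
def text_to_binary(text: str) -> str:
    """Convert each byte of the text to its 8-bit binary representation."""
    return " ".join(format(b, "08b") for b in text.encode("utf-8"))

def binary_to_text(bits: str) -> str:
    """Inverse: parse space-separated 8-bit groups back into text."""
    return bytes(int(group, 2) for group in bits.split()).decode("utf-8")

msg = "Hi"
encoded = text_to_binary(msg)
print(encoded)  # 01001000 01101001
assert binary_to_text(encoded) == msg
```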
Getting a bit off topic, aren't we? The fundamentals of binary were decided by the machines that use it. Essentially, binary is not made by us. Computers made it for themselves. Early AIs
binary is not made by us. Computers made it for themselves
Forms of binary have existed since 200 BC. In 1605, Francis Bacon discussed a system whereby letters of the alphabet could be reduced to sequences of binary digits.
Yes, but let's see humans rattle off the code as fast as the machines it was meant for. If it was invented by man, we'd program in binary. All programming. But all we use is text-conversion programs. Now, a bit off topic: AIs, not binary.
It says "This is accomplished by assigning a bit string to each particular symbol or instruction".
If we are assigning it, there doesn't need to be logic behind it. We can assign "a" to 01100001, or we can give it another string of numbers such as 01011011.
So now explain to me why it matters which order the 1s and 0s are in if we assigned them ourselves, and explain why it would have mattered if we had assigned them in a different order. Or you can use the usual WEPR tactic of not responding to an argument you can't win and claiming it's for another reason.
Yes, but let's see humans rattle off the code as fast as the machines it was meant for. If it was invented by man, we'd program in binary. All programming. But all we use is text-conversion programs. Now, a bit off topic: AIs, not binary.
We technically are, because all code is compiled down to machine language in the CPU's instruction set, which sends different voltage levels (more commonly referred to as 1s and 0s) through the rest of the processor.
Old info. Can we get back to the topic of AIs instead of arguing about binary? Because it really is pointless. We'll create a thread for binary if we have to, but we should get back to the main topic of this thread first.