Question:
What is the history of binary code?
Austin
2008-10-25 19:59:58 UTC
Who discovered that you can store and communicate any form of information imaginable through a series of ones and zeros? To me, that's simply amazing. How did people figure this out, and, come to think of it, how does binary code actually end up representing anything in the first place if it's just a series of ones and zeros?
Eight answers:
d0tc0mguy
2008-10-25 20:03:36 UTC
The ancient Indian writer Pingala developed advanced mathematical concepts for describing prosody, and in doing so presented the first known description of a binary numeral system.[1][2]

A full set of 8 trigrams and 64 hexagrams, analogous to the 3-bit and 6-bit binary numerals, was known to the ancient Chinese in the classic text I Ching. An arrangement of the hexagrams of the I Ching, ordered according to the values of the corresponding binary numbers (from 0 to 63), and a method for generating the same, was developed by the Chinese scholar and philosopher Shao Yong in the 11th century. However, there is no evidence that Shao understood binary computation; the ordering is also the lexicographical order on sextuples of elements chosen from a two-element set.
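
To see the correspondence concretely, here is a minimal Python sketch (an illustration added here, not part of the historical record) that lists all 64 six-bit patterns in numeric order and checks that this matches the lexicographical order just described:

    # List all 64 six-bit patterns in numeric order (0 to 63).
    # Read each bit as a broken (0) or solid (1) line to get one
    # hexagram per pattern.
    patterns = [format(n, "06b") for n in range(64)]
    for n, bits in enumerate(patterns):
        print(n, bits)

    # Numeric order and lexicographic order coincide for fixed-width strings:
    assert patterns == sorted(patterns)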

Similar sets of binary combinations have also been used in traditional African divination systems such as Ifá as well as in medieval Western geomancy.

In 1605 Francis Bacon discussed a system by which letters of the alphabet could be reduced to sequences of binary digits, which could then be encoded as scarcely visible variations in the font in any random text. Importantly for the general theory of binary encoding, he added that this method could be used with any objects at all: "provided those objects be capable of a twofold difference only; as by Bells, by Trumpets, by Lights and Torches, by the report of Muskets, and any instruments of like nature".[3] (See Bacon's cipher.)
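
As a rough sketch of the idea (a simplification: Bacon's actual table used a 24-letter alphabet with I/J and U/V merged, and wrote the two symbols as "a" and "b" rather than 0 and 1), each letter can be reduced to a 5-bit code like so:

    # A minimal sketch of Bacon's cipher: each letter becomes a 5-bit code.
    def bacon_encode(text):
        return " ".join(format(ord(c) - ord("A"), "05b")
                        for c in text.upper() if "A" <= c <= "Z")

    print(bacon_encode("Bacon"))  # 00001 00000 00010 01110 01101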

The modern binary number system was studied in depth by Gottfried Leibniz in the late 17th century and fully documented in his 1703 article Explication de l'Arithmétique Binaire. Leibniz's system used 0 and 1, like the modern binary numeral system. As a Sinophile, Leibniz was aware of the I Ching and noted with fascination how its hexagrams correspond to the binary numbers from 0 to 111111, and concluded that this mapping was evidence of major Chinese accomplishments in the sort of philosophical mathematics he admired.[4]

In 1854, British mathematician George Boole published a landmark paper detailing an algebraic system of logic that would become known as Boolean algebra. His logical calculus was to become instrumental in the design of digital electronic circuitry.
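
To make "an algebraic system of logic" concrete, a small sketch: Boolean algebra works over just two values, and its basic operations are exactly what logic gates later implemented in hardware.

    # Boolean algebra over {0, 1}: AND acts like multiplication,
    # OR like capped addition, NOT like complementation.
    for x in (0, 1):
        for y in (0, 1):
            print(x, y, "AND:", x & y, "OR:", x | y, "NOT x:", 1 - x)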

In 1937, Claude Shannon produced his master's thesis at MIT that implemented Boolean algebra and binary arithmetic using electronic relays and switches for the first time in history. Entitled A Symbolic Analysis of Relay and Switching Circuits, Shannon's thesis essentially founded practical digital circuit design.
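
The bridge between Boolean algebra and binary arithmetic can be seen in a half adder, the circuit that adds two one-bit numbers. This sketch illustrates the principle (it is not code from the thesis):

    # A half adder: two Boolean operations are enough to implement
    # one step of binary addition. XOR gives the sum bit, AND the carry bit.
    def half_adder(a, b):
        return a ^ b, a & b  # (sum, carry)

    for a in (0, 1):
        for b in (0, 1):
            s, c = half_adder(a, b)
            print(f"{a} + {b} -> carry {c}, sum {s}")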

In November 1937, George Stibitz, then working at Bell Labs, completed a relay-based computer he dubbed the "Model K" (for "Kitchen", where he had assembled it), which calculated using binary addition. Bell Labs thus authorized a full research program in late 1938 with Stibitz at the helm. Their Complex Number Computer, completed January 8, 1940, was able to calculate complex numbers. In a demonstration to the American Mathematical Society conference at Dartmouth College on September 11, 1940, Stibitz was able to send the Complex Number Calculator remote commands over telephone lines using a teletype. It was the first computing machine ever used remotely over a phone line. Some participants of the conference who witnessed the demonstration were John von Neumann, John Mauchly, and Norbert Wiener, who wrote about it in his memoirs.
kendry
2016-12-28 14:50:34 UTC
History Of Binary Code
?
2016-09-29 09:11:32 UTC
Who Invented Binary Code
anonymous
2016-03-13 04:45:43 UTC
That's like asking how a lock "understands" a key. Each binary code (there are only 65,536 possibilities in a computer that uses 16-bit opcodes) is a set of on and off states, 16 of them. Those states cause the computer to do something. (It's like turning on certain LEDs in your monitor: how can one monitor "understand" every single language, and know every single picture, in the world?) The CPU (that's what reads the codes) does what the code calls for, because the code turned certain switches on (the 1s) and certain other ones off (the 0s). Maybe it calls for the computer to load the next 3 sets of binary codes, or to start running the code from a different place in memory, or to put a certain code pattern out to the screen. That's what it does.
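
To illustrate the switch idea (a toy sketch with made-up opcodes, not how any real CPU is wired), a lookup table can play the role of the decoder: the bit pattern doesn't "mean" anything by itself, it just selects which hard-wired action runs.

    # A toy instruction decoder. The opcodes and actions are invented
    # for illustration only.
    def load(): print("load the next value from memory")
    def jump(): print("continue from a different place in memory")
    def draw(): print("put a pattern out to the screen")

    DECODER = {
        0b0000000000000001: load,
        0b0000000000000010: jump,
        0b0000000000000011: draw,
    }

    opcode = 0b0000000000000010  # 16 on/off states, read as one number
    DECODER[opcode]()            # prints: continue from a different place in memory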
ME!!!
2008-10-25 20:05:22 UTC
Binary code is used because of electronic signal transfer, where a 1 represents a signal and a 0 is blank. So if the code was 1100101, that's like saying "on on off off on off on". Sometimes a scheme includes a third state, a null or space, to separate groups: 1 and 0 each get their own signal type and "off" marks the gaps, so 1100 10 01 reads "++-- off +- off -+".
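
Written out as a small sketch of that idea, the translation from bits to on/off states is just:

    # Translate a bit string into the on/off states it stands for.
    signal = "1100101"
    print(" ".join("on" if bit == "1" else "off" for bit in signal))
    # prints: on on off off on off on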
Chame_chato
2008-10-25 20:03:15 UTC
Binary code is the language that the computer understands.
Shellback
2008-10-25 20:03:59 UTC
011101 10010 01 101? 011100 01000 01 10!!! 0001110 001010 011.
Fox
2008-10-25 20:02:55 UTC
http://en.wikipedia.org/wiki/Binary_numeral_system#History

This content was originally posted on Y! Answers, a Q&A website that shut down in 2021.