Now yin, now yang, that is tao; what makes it continue is goodness, what completes it is nature.

^{1}

Underlying much of Chinese philosophy is the principle of yin and yang, which imagines the balance of opposites as fundamental to the structure of matter and existence. Hot is balanced by cold; light is balanced by dark; high is balanced by low; masculine is balanced by feminine; life is balanced by death. Importantly, for anything to exist at all, its opposing force must also be present: a coin cannot have a head without a tail; wetness cannot exist without dryness; earth cannot exist without the heavens; the active cannot exist without the passive; movement cannot exist without stagnation.

In 1703, the German thinker Gottfried Wilhelm Leibniz (1646-1716) argued that reckoning in progressions of twos opened new doors in mathematics.

reckoning by twos, that is, by 0 and 1… is the most fundamental way of reckoning for science… as numbers are reduced to the simplest principles, like 0 and 1, a wonderful order is apparent throughout.

^{2}

His idea became the basis for binary numbers. Leibniz, an avid student of Chinese culture, noted with great enthusiasm that his “arithmetic by 0 and 1” paralleled ancient Chinese hexagrams, which were underpinned by the states of yin and yang. Like yin and yang, Leibniz was convinced that 0 and 1 could represent the “switches” of the natural world.
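Leibniz’s “arithmetic by 0 and 1” can be sketched in a few lines of modern code: repeatedly halving a number and keeping the remainders yields its binary digits. (This is an illustration of the idea, not Leibniz’s own notation or tables.)

```python
def to_binary(n: int) -> str:
    """Return the base-2 representation of a non-negative integer."""
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(str(n % 2))  # remainder is the next binary digit
        n //= 2
    return "".join(reversed(digits))  # digits emerge least-significant first

print(to_binary(13))  # 1101
```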

Building on several inventions in the 1700s, the French merchant Joseph Marie Jacquard (1752-1834) unveiled a loom that relied on “a chain of cards punched according to the pattern to be produced on the cloth.”^{3} The punched cards simply contained holes and solid spaces, which correspondingly raised or lowered the threads to reproduce the pattern. By using the “programmable” up/down “code” on the punch cards, the loom revolutionized the textile industry, enabling identical copies of the same complex patterns to be reproduced again and again.

Jacquard’s loom represents an incredible realization for humanity: any design imaginable could be abstracted into a sequence of holes and solid spaces. Once coded in a punch card, a picture could be reproduced in fabric.
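A toy sketch makes the insight concrete (the card layout and symbols below are invented for illustration, not Jacquard’s actual card format): a pattern reduces to rows of holes (1, thread raised) and solid spaces (0, thread lowered).

```python
# Each string is one punched card; each column controls one thread.
card_chain = [
    "10101",
    "01010",
    "10101",
]

for card in card_chain:
    # A hole raises the thread ('#'); a solid space lowers it ('.').
    row = "".join("#" if h == "1" else "." for h in card)
    print(row)
```

Running the chain of cards reproduces the same checkered pattern every time, which is exactly the point: the design lives in the cards, not in the weaver.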

In 1836, the American painter Samuel Morse (1791-1872) devised a brilliantly simple way of utilizing electricity for communication purposes. He proposed a “language” of short *dots* and long *dashes* that would be used to send and interpret electrical pulses. For the first time in history, messages could travel faster than the messenger. The telegraph was born.
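As a rough illustration, using a small excerpt of the standard International Morse table (which postdates Morse’s original code), text translates directly into dots and dashes:

```python
# A few letters of the International Morse code table, for illustration.
MORSE = {
    "S": "...", "O": "---", "E": ".", "T": "-",
    "A": ".-", "N": "-.",
}

def encode(message: str) -> str:
    """Translate letters into dots and dashes, separated by spaces."""
    return " ".join(MORSE[ch] for ch in message.upper())

print(encode("SOS"))  # ... --- ...
```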

In 1847, the English thinker George Boole (1815-1864) demonstrated that math itself depends on an underlying logic system, which he described as *binary*. Just as a light switch can only be in one of two states – *on* or *off* – every “equation is reducible to a system of equations” that depends on “binary products,” which form a “general doctrine in Logic with reference to the ultimate analysis of Propositions…”^{4}
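Boole’s “binary products” can be glimpsed in a short sketch: when variables are restricted to the values 0 and 1, ordinary multiplication coincides with logical AND.

```python
# When x and y take only the values 0 and 1, the arithmetic product x*y
# agrees with the logical conjunction "x AND y" on every input.
for x in (0, 1):
    for y in (0, 1):
        assert (x * y) == (x and y)
        print(f"x={x}, y={y}: x*y = {x * y}")
```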

Yin and yang.

0s and 1s.

Holes and solids.

Dots and dashes.

On and off.

Our power to abstract, store, and transmit data rests in our ability to convert what we see in reality into strings of binary.
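That conversion is easy to demonstrate. The sketch below uses the modern 8-bit UTF-8 encoding (a convention that long postdates this history) to turn a message into a string of binary:

```python
def to_bits(text: str) -> str:
    """Render each byte of a message as eight binary digits."""
    return " ".join(format(byte, "08b") for byte in text.encode("utf-8"))

print(to_bits("Hi"))  # 01001000 01101001
```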

“It is possible to invent a single machine which can be used to compute any computable sequence,” Alan Turing famously proposed in the mid-1930s.^{5} Turing’s idea of an a-machine – an automatic machine – suggested that binary logic could be used to build a computational device that could do mathematics without relying on a human’s mental or reasoning ability:

The “computable” numbers may be described briefly as the real numbers whose expressions as a decimal are calculable by finite means… If an a-machine prints two kinds of symbols, of which the first kind (called figures) consists entirely of 0 and 1 (the others being called symbols of the second kind), then the machine will be called a computing machine.

The American mathematician and engineer Claude Shannon (1916-2001) continued this line of thinking. In his 1948 paper, *A Mathematical Theory of Communication*, he proposed measuring information in a logarithmic base of two: a device with “two stable positions, such as a relay or flip-flop circuit” can “store one bit of information”. Such on/off states “may be called binary digits, or more briefly bits…”^{6}
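Shannon’s logarithmic measure can be sketched directly: N equally likely states require log2(N) bits, so a two-state device stores exactly one bit.

```python
import math

# With a base-2 logarithm, a device with two stable positions stores
# log2(2) = 1 bit; N equally likely states require log2(N) bits.
for states in (2, 4, 256):
    print(f"{states} states -> {math.log2(states)} bits")
```

This is why a byte of eight two-state switches can distinguish 2^8 = 256 values.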

Fast forward to today. We might absentmindedly transmit music, text, and images through bits, but beneath all the communications we take for granted lies a remarkable principle: information is a tangible aspect of reality. It is interwoven into the universe. Even you are made of information: your genetic code is essentially a string of switches. Like information, life is organized by two-state systems. Furthermore, your comprehension of these words is the result of millions of neurons ‘firing’ between two states, on and off.

Chou Tun-i, I-t’ung shu. Wilhelm, Hellmut. (1995). Understanding the I Ching: The Wilhelm Lectures on The Book of Changes. Princeton: Princeton University Press. p. 113 ↩

Leibniz, Gottfried Wilhelm. (1703). Explanation of binary arithmetic, which uses only the characters 0 and 1, with some remarks on its usefulness, and on the light it throws on the ancient Chinese figures of Fuxi. Mémoires de l’Académie Royale des Sciences. Source: Die mathematischen Schriften von Gottfried Wilhelm Leibniz, Vol. VII. C. I. Gerhardt (ed). pp. 223-227. p. 225. Trans. Lloyd Strickland 2007. http://www.leibniz-translations.com/pdf/binary.pdf ↩

Newton, William (ed). (1866). Newton’s London journal of arts and sciences; being a record of the progress of invention as applied to the arts. New Series. Vol. XXIII. London: Newton and son. p. 334 ↩

Boole, George. (1847). The Mathematical Analysis of Logic: being an essay towards a calculus of deductive reasoning. Cambridge: MacMillan, Barclay, & MacMillan. London: George Bell. p. 76 ↩

Turing, Alan. (1936). On computable numbers, with an application to the Entscheidungsproblem. Proceedings of the London Mathematical Society. November 12, 1936. pp. 230-265. Quote from p. 241; blockquote from pp. 230, 232. ↩

Shannon, Claude E. (1948). A Mathematical Theory of Communication. The Bell System Technical Journal. Vol. XXVII. July, 1948. No. 3. p. 380. Also see Shannon, Claude E. & Weaver, Warren. (1964[1949]) The Mathematical Theory of Communication. Urbana: University of Illinois Press. ↩