In The Information (2011), James Gleick gives us a history of how humans have stored, transferred and theorized information. His overall purpose is to show that our dealings with information didn’t begin with the advent of the ‘information age,’ though they have certainly intensified. He develops a narrative that begins with pictographs and talking drums, continues through the invention of the alphabet, libraries, dictionaries and telegraphy, and arrives at the information technologies we use today. The narrative holds together as one continuous thread through a methodological technique: Gleick takes the principles and ideas of information theory (whose development began in the 1940s) and shows how they have always been what is at issue in human dealings with information, even before we had terms for them – principles and ideas like bandwidth, signal/bit size, noise, redundancy and entropy. I’m not going to explain these here; rather, I’ll just consider a few general notions that keep popping up:
Everything is being informationalized. Information theory has its roots in electrical engineering (early telephone systems at Bell) and early computer science, but already, in its short life-span, it has been fruitfully applied to physics (thermodynamics can be understood in information-theoretic terms), genetics (the genetic code is information that is translated into a particular phenotype), neurology (the brain as a neuronal network of information), and the study of complex systems (information theory presents a way to understand chaos and randomness). There are even some who maintain that information (in the form of ‘bits’) is the fundamental constituent, the most basic building-block of reality.
The increasing ubiquity of informational ways of thinking is propelled by the fundamental insight of information theory, proposed by Claude Shannon: the discovery that information can be quantized, that is, broken down from a continuous series into a set of discrete parts (bits), which makes it measurable, manipulable in mathematical formulas, formalizable. Gleick elaborates:
For the purposes of science, information had to mean something special. Three centuries earlier, the new discipline of physics could not proceed until Isaac Newton appropriated words that were ancient and vague – force, mass, motion and even time – and gave them new meaning. Newton made these terms into quantities, suitable for use in mathematical formulas. Until then, motion (for example) had been just as soft and inclusive a term as information. For Aristotelians, motion covered a far-flung family of phenomena: a peach ripening, a stone falling, a child growing, a body decaying. That was too rich. Most varieties of motion had to be tossed out before Newton’s laws could apply and the Scientific Revolution could succeed. In the nineteenth century, energy began to undergo a similar transformation: natural philosophers adapted a word meaning vigor or intensity. They mathematized it, giving energy its fundamental place in the physicist’s view of nature.
It was the same with information. A rite of purification became necessary. And then, when it was made simple, distilled, counted in bits, information was to be found everywhere (8-9).
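To make the idea of counting information in bits concrete, here is a small sketch of my own (not from Gleick’s book): Shannon’s entropy formula measures the average number of bits per symbol that a source of messages carries, based purely on how often each symbol occurs.

```python
import math
from collections import Counter

def entropy_bits(message: str) -> float:
    """Shannon entropy of a message's symbol distribution, in bits per symbol."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# A source that is an even split between two symbols carries exactly 1 bit per symbol:
print(entropy_bits("HTHTHTHT"))  # → 1.0

# A lopsided, highly predictable source carries much less:
print(entropy_bits("AAAAAAAB"))  # → ~0.544
```

The second message is “smaller” in informational terms than its eight characters suggest, which is exactly the kind of measurability the quantization into bits buys us.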
What’s a bit? Here’s the interesting part: it’s a fundamental particle, and it’s not just invisible to the naked eye, it’s completely immaterial – we can say it’s virtual or abstract. A bit is a binary choice, or a virtual switch that has two states: 1 or 0, yes or no. Of course, a bit requires some kind of physical medium to be embodied in, but it doesn’t have to be a transistor. 6 billion bits in the form of DNA make up a human being. Some even believe that bits are irreducible, that bits make up matter, and even space and time. Physicist John A. Wheeler put it like this:
It from bit. Otherwise put, every ‘it’—every particle, every field of force, even the space-time continuum itself—derives its function, its meaning, its very existence entirely—even if in some contexts indirectly—from the apparatus-elicited answers to yes-or-no questions, binary choices, bits. ‘It from bit’ symbolizes the idea that every item of the physical world has at bottom—a very deep bottom, in most instances—an immaterial source and explanation; that which we call reality arises in the last analysis from the posing of yes–no questions and the registering of equipment-evoked responses; in short, that all things physical are information-theoretic in origin and that this is a participatory universe. (Wheeler 1990: 5)
This position is called digital ontology, and it’s a new concept for me – an interesting one, too, in that it’s fundamentally immaterialist yet tempting, and I’m a rather committed materialist… but it’s too soon for me to say anything interesting about it. It seems, though, that Luciano Floridi, who founded the subfield of philosophy of information, doesn’t agree, and argues instead for an ‘informational ontology’ (I’ll get to Floridi’s book in a few months, but here’s an article for the intrigued: http://www.philosophyofinformation.net/publications/pdf/ado.pdf).
My other thought regarding quantization is: what remains to be usefully quantized that hasn’t yet been? I can’t offer even a tenuous lead at this point.
Information, as technically conceived in bits by Shannon, is not concerned with meaning at all. The meaning of the bits sent in an information transfer is inessential to understanding the success or failure, quality or decay of the transfer. The same analysis applies to speech as to drumbeats or to the data a guided missile uses, Gleick explains: all can be treated in the same way. I see this meaninglessness as lending information a sort of universality, a potential for the knocking-down of borders or a horizontalizing, a potential for dehierarchization.
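This indifference to meaning can be shown in a small sketch of my own (not from the book): Shannon’s measure depends only on the frequencies of symbols, so relabeling every symbol – turning words into drumbeats, say – leaves the quantity of information untouched.

```python
import math
from collections import Counter

def entropy_bits(symbols) -> float:
    """Shannon entropy of a sequence's symbol distribution, in bits per symbol."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

speech = "yes no yes yes no"
# Replace every word with an arbitrary drumbeat label (hypothetical labels):
drums = speech.replace("yes", "DUM").replace("no", "dak")

# The measure sees only symbol frequencies, not what the symbols mean:
print(entropy_bits(speech.split()) == entropy_bits(drums.split()))  # → True
```

The function never inspects what ‘yes’ or ‘DUM’ stands for; only the pattern of occurrences matters, which is the universality I have in mind.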
Vague notions in this vein led me to think about how apolitical Gleick’s text is. Granted, it’s a history of information, not a philosophical/political argument, and one can’t cover every possible facet of anything in one book. Yet I wish he had, at the end at least, considered the more political deployments of information technology, especially in recent years. He touches on this when discussing the strange new world of digital (informational) property, but doesn’t have much to say about the control of information, or about theories of how the flow of information in society ought to be directed, promoted or stifled. What is revolutionary information handling? What is the opposite?
Maybe some of my questions will be answered in my next read about information: “Glut: Mastering Information through the Ages” by Alex Wright. http://www.amazon.com/books/dp/0801475090
Wheeler, John A. A Journey Into Gravity and Spacetime. Scientific American Library. W.H. Freeman & Company, 1990.