Preliminary Discourse on the Potato Chip and Seasonings Thereof

The time has come to settle a seemingly interminable debate: whether the potato chip ought to be seasoned or not (it should be understood that here “not seasoned” refers to completely unseasoned chips, as well as those to which only salt, and the necessary oil for cooking, have been added – such chips will be referred to in the course of this work as “plain”). Years have passed since the subject was first broached in my company and a satisfactory resolution to the dispute has yet to be obtained, nor has even the glimpse of one on a distant and savory horizon been descried. This document will, it is hoped, open debate upon this trying subject.

Antiseasoningist Wysman (2012, personal correspondence with the author) and seasoning-skeptic Schiller (2015, forthcoming) have suggested that the potato chip requires naught but salt for a rewarding gustatory experience, but their reasoning has not been made sufficiently clear. The author awaits their forthcoming works and here undertakes to dispel some of the more commonly held antiseasoningist positions, in order to “clear the ground,” as it were.

A ready first argument in favour of plain chips is the historical one, also called the pre-synthetic argument. It begins by invoking the image of the primal potato, newly drawn from the earth, which clearly is not seasoned. However, the potato must undergo chopping and frying before it constitutes potato chips. Neither chopping nor frying adds anything to the potato, it is said. Thus, the plain potato chip has not been synthesized with foreign substances and therefore remains a true potato chip. The synthesis of potato chip and seasoning produces something that is not a potato chip but rather an entity that may be defined schematically in this way:

Post-synthesis “potato chip” (PSPC) = (potato chip + s, where s = anything edible)

This stands opposed to:

True Potato Chip (TPC) = (potato)

While compelling in its parsimony, this argument fails to take into account the minimal quantity of salt that is added to almost all plain chips, and further, to consider the residual oil that is absorbed into the potato during the frying process. For, if salt and oil are accepted as somehow not counting as s, or as being somehow “pre-s,” it can only be so on the basis of a highly ad hoc rule. It remains unclear how ketchup, for example, could be excluded from the pre-s situation under the same logic.

Second, the content-garnish argument is also worth noting. This argument holds that the content of a potato chip is the potato and that seasoning plays a secondary role as garnish. This is a value-theoretical argument in that it holds potato as a primary value and seasoning as valuable only in a secondary or derivative way, which is dependent on the primary value. The seasoning is only valued, such puritans argue, because it accompanies the potato. Alone, the seasoning would be akin to the flavor packets found in instant noodle packages – unappetizing and blatantly carcinogenic.

This position does have an intuitive appeal, as it is certainly true that one does not go about eating Mr. Noodles seasoning packets sans noodles. However, this argument unjustifiably presupposes the primacy of the potato in its valuation schema. Is it not true that there exist chips made of non-potatoes? The exotic corn-based chips known as “tortilla” chips attest to this fact. Further, chips fashioned from pita bread are said to exist in the Far East. Therefore: not all chips are potato chips. Thus, chips are not eaten because they taste like potato (supplementary: no one has ever eaten an ungarnished potato, except in a survival scenario – the same can be said for bread, which is well-known to be a potato-analogue). One is forced to conclude that, if potato chips are eaten, it is precisely because of some non-potato element in the potato chip complex. The frying oil seeming a rather unlikely candidate, we are led to conclude that the motivation for eating potato chips lies in the seasoning applied to them. This assertion has the happy corollary of explaining the existence of chip dip, which is merely a sublimated form of direct seasoning. In chip dip, seasoning is delayed and the anticipation is heightened. The chip is left bare until mere microseconds before mastication. It might be asserted that chip dip functions in a way analogous to the Freudian eros, which seeks to prolong the ecstatic process towards release. On the quantitative level of chips consumed with dip, it then seems that dip represents a massive indulgence and simultaneous submission to restriction – a decadent sadomasochism of snacking.

Meditation on the content-garnish argument eventually leads a sincere thinker to an inversion of the model proposed by the historical/pre-synthetic argument. The fact that non-potato chips enjoy great popularity suggests that the potato is arbitrary, despite it being included in the name “potato chip”. If the potato is non-essential, it is merely included in the potato chip as a vehicle for seasoning. The author therefore proposes what he terms the Vehicle Theory Model of the True Potato Chip:

True Potato Chip (TPC) = (seasoning + v, where v = anything edible)

The particularity of the vehicle is shown to be completely arbitrary, in reality. Hence the proliferation of non-potato chips, as well as the propensity for humans to dip non-chips in chip dip. However radical this may sound, the author asks you to prepare yourself for an even more radical additional consequence. If one follows the Vehicle Theory logic to its extreme, leaning outside the realm of Vehicle Theory to draw on developments in theoretical physics, one is forced to admit that the vehicle itself is totally superfluous, as represented in this formula:

True Potato Chip (TPC) = (seasoning)

As Feynman (1999) argues, nothing of what we know about physics makes engineering at the molecular level impossible. Indeed, it seems highly probable that we will eventually invent machines that can work at this level. From then on, slipshod macro-level construction of things will be replaced by novel techniques of creating things from the very bottom up. Molecular assembling technology, as it is called, holds out the opportunity of creating chips solely out of seasoning, using the molecular bonds of the seasoning’s constitutive molecules to craft chips of unheard-of shapes and textures – without the interference of the potato. There are a few peaks from which the future appears unexpectedly bright – this may be the brightest.

The third argument continues the topic of texture that the second ended on. Antiseasoningists often point out the ruffled potato chip as evidence of the worthlessness of seasonings. “Why would plain, ruffled chips ever have come into being,” a prominent antiseasoningist was recently overheard slurring in an “exotic” massage parlour lobby, “if not because plain chips have been so evolutionarily successful as to have spawned a sister-species? How could anyone think otherwise?” The argument, as far as the author comprehends it, is that were there no demand for more plain chips, a ruffled format would never have appeared. Indeed, the continued thriving of two species of plain chips suggests that they are considered desirable.

I cannot deign to call this argument appealing, for the facts are diametrically at odds with it. The ridges and valleys of the rippled chip exist only to increase the chip’s function as a vehicle for seasoning. One may find evidence for this by looking inside the human body. The human brain, as is well-known, appears wrinkled on its surface – the ridges known as gyri and the valleys known as sulci. This shape increases the surface area of the cortex which is able to fit within the skull cavity, enabling the advanced level of reflexive thought that humans enjoy. Similarly, rippled chips have an increased surface area for improved retention of seasoning, allowing an increased delivery of the payload. It was this technological advancement in potato chip seasoning capacity that produced the plain, rippled chip as epiphenomenon. Indeed, recent studies have shown (Steinhoff 2015, forthcoming) that plain, rippled chips exist only because currently employed seasoning machines inevitably miss many of their targets, due to difficulties in modulating the rippled texture to agree with the vast dimensional fluctuations in post-GMO potatoes, and thus leave many chips unseasoned.

What more can be said? What can the antiseasoningists possibly marshal to counter such a devastating, yet genteel critique?


Quantizing consciousness, informational dualism. Notes on Koch’s Consciousness

In my last post, I wondered what sort of unexpected things might be quantized as information. My latest read has provided me with a somewhat startling answer: consciousness. Christof Koch’s Consciousness: Confessions of a Romantic Reductionist (2012) suggests that consciousness can be quantized, and thus measured, as precisely as the data streaming through your internet connection. In fact, there are several points of interest in this book, which is a sort of hybrid memoir/pop-science book from the renowned neuroscientist.

First, the theory of integrated information. This theory, developed by psychiatrist/neuroscientist Giulio Tononi, is based on two propositions. The first is that every conscious state is differentiated, or that every conscious state represents a massive amount of information, determined not only by what is perceived in the conscious state (your computer screen, for example) but also by everything that that conscious state does not consist of (a view of the moon, the title scroll of Star Wars, a blue jay, etc.). Koch puts it this way: [a particular] “subjective experience implicitly rules out all these other things you could have seen, could have imagined, could have heard, could have smelled. This reduction in uncertainty (also known as entropy) is how the father of information theory, electrical engineer Claude Shannon, defined information. To wit: Each conscious experience is extraordinarily informative, extraordinarily differentiated” (125). Shannon’s idea was that information exists only where there is uncertainty. If you were to communicate with a single binary switch that can only be in one of two states, 1 or 0 (or with a fair coin toss), the uncertainty, or Shannon entropy, is 1 bit. I’ll let Wikipedia elaborate for me:

Entropy is a measure of unpredictability or information content. To get an informal, intuitive understanding of the connection between these three English terms, consider the example of a poll on some political issue. Usually, such polls happen because the outcome of the poll isn’t already known. In other words, the outcome of the poll is relatively unpredictable, and actually performing the poll and learning the results gives some new information; these are just different ways of saying that the entropy of the poll results is large. Now, consider the case that the same poll is performed a second time shortly after the first poll. Since the result of the first poll is already known, the outcome of the second poll can be predicted well and the results should not contain much new information; in this case the entropy of the second poll results is small.

Now consider the example of a coin toss. When the coin is fair, that is, when the probability of heads is the same as the probability of tails, then the entropy of the coin toss is as high as it could be. This is because there is no way to predict the outcome of the coin toss ahead of time—the best we can do is predict that the coin will come up heads, and our prediction will be correct with probability 1/2. Such a coin toss has one bit of entropy since there are two possible outcomes that occur with equal probability, and learning the actual outcome contains one bit of information. Contrarily, a coin toss with a coin that has two heads and no tails has zero entropy since the coin will always come up heads, and the outcome can be predicted perfectly. Most collections of data in the real world lie somewhere in between.
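Shannon’s measure is easy to play with directly. Here are a few lines of Python (my own sketch, not from Koch or the quoted article) that reproduce the coin-toss numbers above:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit
print(shannon_entropy([1.0, 0.0]))  # two-headed coin: 0.0 bits
print(shannon_entropy([0.9, 0.1]))  # biased coin: ~0.47 bits, "somewhere in between"
```

A heavily rerun poll works the same way as the biased coin: the more predictable the outcome, the closer the entropy falls toward zero.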


Keeping that in mind, the second proposition of integrated information is that conscious states are highly integrated. Conscious states are gestalt wholes, irreducible to parts that can be experienced independently. You can’t make yourself start seeing in black and white, Koch says, nor can you will certain sound waves out of your auditory perception.

These two premises form the ground of the theory. Any system that is “a single, integrated entity with a large repertoire of highly differentiated states” is thus conscious, to some extent. Tononi “posits that the quantity of conscious experience generated by any physical system in a particular state is equal to the amount of integrated information generated by the system in that state above and beyond the information generated by its parts. The system must discriminate among a large repertoire of states (differentiation) and it must do so as part of a unified whole, one that can’t be decomposed into a collection of causally independent parts (integration)” (126). The quantity of conscious experience is measured as Φ (phi), which is expressed in bits as “the reduction of uncertainty that occurs in a system, above and beyond the information generated independently by its parts, when that system enters a particular state” (127).  So Φ expresses the amount of reduction of uncertainty in your brain/central nervous system as it enters a conscious state and can do the same for any system that exhibits the two necessary properties, integration and differentiation. Animals, artificial intelligences and computers can be measured in the same way. Koch even suggests that “the Web may already be sentient. By what signs will we recognize its consciousness? … The implications don’t stop there. Even simple matter has a modicum of Φ. Protons and neutrons consist of a triad of quarks that are never observed in isolation. They constitute an infinitesimal integrated system” (132). This notion, of course, leads to the entire universe having consciousness – panpsychism – a proposition I’ve always found repugnant. I might be forced to reconsider. Integrated information is appealing to me in that it dethrones human consciousness from a place of privilege (a mainstay of Western religion). 
If successful, the theory would institute a thorough demystification, like Newton’s quantization of mass, force, etc. did for Aristotelean physics.
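Tononi’s actual Φ calculus is far more involved than anything I could sketch here, but the “whole beyond its parts” intuition can be caricatured in a few lines of Python. The toy “integration” measure below is my own drastic simplification (it is really just mutual information), not Tononi’s formula: it is zero when two parts of a system vary independently, and positive when the joint state carries information beyond what the parts carry alone:

```python
import math
from collections import Counter

def entropy(samples):
    """Shannon entropy (bits) of the empirical distribution of the samples."""
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in Counter(samples).values())

def integration(pairs):
    """Toy 'integration' measure: H(X) + H(Y) - H(X, Y), i.e. mutual information.
    Zero when the two parts are independent; positive when the whole's state
    carries information beyond what the parts carry on their own."""
    xs = [x for x, _ in pairs]
    ys = [y for _, y in pairs]
    return entropy(xs) + entropy(ys) - entropy(pairs)

# Two binary 'neurons' observed over four moments:
independent = [(0, 0), (0, 1), (1, 0), (1, 1)]  # each part varies on its own
integrated = [(0, 0), (1, 1), (0, 0), (1, 1)]   # the parts are locked together
print(integration(independent))  # 0.0 bits: the whole is just the sum of its parts
print(integration(integrated))   # 1.0 bit: the whole exceeds its parts
```

Φ proper is computed over the causal structure of a system, not raw co-occurrence like this, but the direction of the comparison – whole versus decomposed parts – is the same.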

Related to the resultant panpsychism of the consciousness-as-information theory that Koch endorses is his endorsement of an informational substance dualism. He asserts that “subjectivity is too radically different from anything physical for it to be an emergent phenomenon” (119). The feeling of subjectivity doesn’t just emerge from a complex network of neurons (my belief), he is saying. He instead makes a surprising claim: “I believe that consciousness is a fundamental, an elementary, property of living matter. It can’t be derived from anything else; it is a simple substance, in Leibniz’s words” (119). Just as positive or negative electrical charge is an intrinsic property of protons and electrons, and there are no uncharged particles waiting to be charged, consciousness is intrinsic in “all organized chunks of matter. It is immanent in the organization of the system. It is a property of complex entities and cannot be further reduced to the action of more elementary particles” (120). Thus, Koch holds what he admits is “a form of property dualism: The theory of integrated information postulates that conscious, phenomenal experience is distinct from its underlying physical carrier … The conscious sensation arises from integrated information: the causality flows from the underlying physics of the brain, but not in any easy-to-understand manner. This is because consciousness depends on the system being more than the sum of its parts” (152).

This is really a bewildering theory, to my mind. In truth, I can’t wrap my head around it just yet. I’m very accustomed to thinking of consciousness as an emergent property of networks, and partial, in particular, to Douglas Hofstadter’s ‘strange loop’ theory. Anyhow, it turns out philosopher of mind David Chalmers holds a similarly dualistic view to Koch, which he terms “double-aspect theory,” in which information has two aspects – physical and phenomenal. I am HIGHLY uncomfortable with such ideas and cannot figure out how information is meant to be understood as immaterial. I mean, it is always embodied in some sort of matter (neurons, circuits, text on paper) and requires some sort of energy to transfer/communicate – even proponents of such theories have to admit this – but they also assert that “Information is neither matter nor energy”. So what is it? It must be a pattern, but a pattern requires a medium in which it is instantiated to exist, doesn’t it? When and how would there be a pattern which exists with no material instantiation? If you can help this prejudiced materialist, or recommend reading, please do.

Koch also touches on free will. His position is interesting. He first rules out strong determinism (Ligotti’s puppet-nature), due to quantum indeterminacy in the basic structure of matter. But he also rules out strong free will, in the Cartesian or Christian sense, the kind that says that in exactly the same situation, were it repeated, you could will yourself to act differently. This is because physics rules out any action of an immaterial will that escapes material laws and somehow compels the physical brain to act. He also admits that experiments show the brain decides on actions (relatively) long before consciousness becomes aware of the decisions. Or that “the sensation of agency or authorship – is secondary to the actual cause. Agency has phenomenal content, or qualia, just as sensory forms of conscious experience do … How the decision is formed remains unconscious. Why you choose the way you do is largely opaque to you” (111). Feeling in control is just that – a feeling. And it can be manipulated by shooting electricity into certain parts of the brain. It is certain, he says, that a large part of consciousness is really just “zombie agents” or unconscious circuits that we attribute agency to only after the fact. He doesn’t come out and say it explicitly – he asserts a kind of compatibilism – but it seems he’s essentially admitting there’s very little free will, if any. He asserts that “Yet we cannot rule out the possibility that quantum indeterminacy … leads to behavioural indeterminacy … evolution might favour circuits that exploit quantum randomness for certain acts or decisions” (101). It seems he keeps a shred of free will by optimistically assessing the linkage between quantum events and the actions of conscious systems, a connection I was under the impression is tenuous at best (though I’m no expert). Yet he also says, “Personally, I find determinism abhorrent … (Of course, my personal feelings on this matter are irrelevant to how the world is)” (101).
It would behoove him, then, to spend more time explaining his compatibilism, because as it stands, it seems unjustified. Particularly when taken in combination with the closing chapter of the book, in which he waxes hyperoptimistic and says “My tribulations are not meaningless – I am no nihilist … I am less free than I feel I am … Yet I can’t hide behind biological urges or anonymous social forces. I must act as if “I” am fully responsible, for otherwise all meaning would be leached from this word and from the notions of good and evil” (164). Again, the conspiracy, the lie… As Cioran writes in The Trouble With Being Born: “Lucidity is the only vice that makes us free – free in a desert” (12).


Gleick’s ‘The Information’ – a few thoughts

In The Information (2011), James Gleick gives us a history of how humans have stored, transferred and theorized information. His overall purpose is to show that our dealings with information didn’t begin with the advent of the ‘information age,’ though certainly they have intensified. He develops a narrative that begins with pictographs and talking drums, continues on to the invention of the alphabet, libraries, dictionaries, telegraphy, and onto the information technologies we use today. His narrative remains one continuous thread through a methodological technique of taking the principles and ideas of information theory (which began development in the 1940s) and showing how they have always been what is at issue in human dealings with information, even before we had terms for them – principles and ideas like bandwidth, signal/bit size, noise, redundancy and entropy. I’m not going to explain these here; rather, I’ll just consider a few general notions that keep popping up:

1)      Ubiquity.

Everything is being informationalized. Information theory has its roots in electrical engineering (early telephone systems at Bell) and early computer science, but already, in its short life-span, information theory has been fruitfully applied to physics (thermodynamics can be understood in information-theoretic terms), genetics (the genetic code is information which is translated into a particular phenotype), neurology (the brain as a neuronal network of information), and the study of complex systems (information theory presents a way to understand chaos and randomness). There are even some who maintain that information (in the form of ‘bits’) is the fundamental constituent, the most basic building-block of reality.

2)      Quantization.

The increasing ubiquity of informational ways of thinking is propelled by the fundamental insight of information theory, proposed by Claude Shannon: the discovery that information can be quantized, that is, broken down from a continuous series into a set of discrete parts (bits), thus making it measurable, manipulable in mathematical formulas, formalizable. Gleick elaborates:


For the purposes of science, information had to mean something special. Three centuries earlier, the new discipline of physics could not proceed until Isaac Newton appropriated words that were ancient and vague – force, mass, motion and even time – and gave them new meaning. Newton made these terms into quantities, suitable for use in mathematical formulas. Until then, motion (for example) had been just as soft and inclusive a term as information. For Aristoteleans, motion covered a far-flung family of phenomena: a peach ripening, a stone falling, a child growing, a body decaying. That was too rich. Most varieties of motion had to be tossed out before Newton’s laws could apply and the Scientific Revolution could succeed. In the nineteenth century, energy began to undergo a similar transformation: natural philosophers adapted a word meaning vigor or intensity. They mathematized it, giving energy its fundamental place in the physicist’s view of nature.

It was the same with information. A rite of purification became necessary. And then, when it was made simple, distilled, counted in bits, information was to be found everywhere (8-9).


What’s a bit? Here’s the interesting part: it’s a fundamental particle, and it’s not just invisible to the naked eye, it’s completely immaterial – we can say it’s virtual or abstract. A bit is a binary choice, or a virtual switch that has two states: 1 or 0, yes or no. Of course, a bit requires some kind of physical medium to be embodied in, but it doesn’t have to be a transistor. 6 billion bits in the form of DNA make up a human being. Some even believe that bits are irreducible, that bits make up matter, and even space and time. Physicist John A. Wheeler put it like this:

It from bit. Otherwise put, every ‘it’—every particle, every field of force, even the space-time continuum itself—derives its function, its meaning, its very existence entirely—even if in some contexts indirectly—from the apparatus-elicited answers to yes-or-no questions, binary choices, bits. ‘It from bit’ symbolizes the idea that every item of the physical world has at bottom—a very deep bottom, in most instances—an immaterial source and explanation; that which we call reality arises in the last analysis from the posing of yes–no questions and the registering of equipment-evoked responses; in short, that all things physical are information-theoretic in origin and that this is a participatory universe. (Wheeler 1990: 5)


This position is called digital ontology and it’s a new concept for me. An interesting one too, in that it’s fundamentally immaterialist, yet tempting, and I’m a rather committed materialist… but it’s too soon for me to say anything interesting about it. Though it seems that Luciano Floridi, who has founded the subfield of philosophy of information, doesn’t agree, and argues instead for an ‘informational ontology’ (I’ll get to Floridi’s book in a few months, but there is an article for the intrigued).

My other thought regarding quantization is: what remains to be usefully quantized that hasn’t yet been? I offer not even a tenuous lead at this point.

3)      Meaninglessness.

Information, as technically conceived in bits by Shannon, is not concerned at all with meaning. The meaning of the bits sent in an information transfer is unessential to understanding the success or failure, quality or decay of the information. The same qualities apply to speech as do to drumbeats or the data a guided missile uses, Gleick explains. All can be analyzed in the same way. I see this meaninglessness as contributing a sort of universality to information, a potential for the knocking-down of borders or a horizontalizing, a potential for dehierarchizations.
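The meaning-blindness is easy to demonstrate: under Shannon’s measure, a sentence and a scramble of the very same characters are equally ‘informative.’ A quick sketch (mine, not Gleick’s):

```python
import math
from collections import Counter

def entropy_per_char(text):
    """Shannon entropy in bits per character of the string's character distribution."""
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in Counter(text).values())

meaningful = "the drum speaks across the valley"
scrambled = "".join(sorted(meaningful))  # same characters, meaning destroyed
# Both strings carry identical Shannon information, though only one means anything.
print(round(entropy_per_char(meaningful), 4))
print(round(entropy_per_char(scrambled), 4))
```

The measure sees only the statistics of the symbols, never what they are about – which is exactly the purification Gleick describes.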

4)      Apoliticism

Vague notions in this vein led me to think about how apolitical Gleick’s text is. Granted, it’s a history text about information, not a philosophical/political argument, and one can’t cover every possible facet of anything in one book. Yet, I wish he would have, at the end at least, considered the more political deployments of information technology, especially in recent years. He touches on this when discussing the strange new world of digital (informational) property, but doesn’t have much to say about the control of information, or theories on how the flow of information in society ought to be directed, promoted or stifled. What is revolutionary information handling? What is the opposite?

Maybe some of my questions will be answered in my next read about information: “Glut: Mastering Information through the Ages” by Alex Wright.

Wheeler, John A. (1990). A Journey Into Gravity and Spacetime. Scientific American Library. W.H. Freeman & Company.

