University of Melbourne
Abstract: In the contemporary era, everything is digital and the digital is everything. Everything is digitised to data, then modulated between storage and display in an endless network of protocol-based negotiation that both severs any link to the data's semantic source and creates an ever-growing excess of data weirdly related to, but ontologically distinct from, its originating data source. Since the very concept of ‘medium’ means that there are media, plural, i.e., differentiated media, and since the digital converges all media into a single state (that is to say, digital data), then by definition the concept of media disappears. Instead of media, there are simulations of media. This is the ‘event’ that needs to be thought through. In this paper, we construct an ontology appropriate to the era of digital networks and draw out several consequences for the relationship between humans and digital networks.
Everything is Digital
Today, everything is digital and the digital is everything.
Surely such a totalising, yet reductive, assertion can’t be right, even if we accepted that it had any sense? What about rocks and stones and trees? The great eighteenth-century literary critic Dr Johnson famously responded to Bishop Berkeley’s idealist, ‘immaterialist’ philosophy broached in A Treatise Concerning the Principles of Human Knowledge by kicking a boulder and declaring “I refute it thus!” (Boswell, 238). Such a naturalist rejoinder has found its contemporary avatars in the field of media studies under a variety of names, ranging from Bruno Latour and Actor-Network-Theory to Luciano Floridi’s philosophy of information (Latour, 2007; Floridi, 2014). Despite their manifest, manifold differences, what unites such projects is their commitment to fundamentally naturalistic redescriptions of the complex interactions of trans-human agents. They are directed at displacing ‘the human’ from the centre of action, multiplying the sites, forces and functions that are to be analysed, while treating these factors as part of a sole and single natural world. For such redescriptions, then, the grand assertion with which we began not only fails basic evidentiary testing, but lacks any real pragmatic or intellectual interest; at best, such assertions would be precisely symptoms of an outmoded or misguided approach.
Something similar would go for those who attend to the new technologies of our time from a social, historical and political perspective. ‘History,’ under this description, is considered as a non-teleological temporal becoming ruled by contingencies: whatever happens might also not have happened; yet what happens also happens as a result of human intentions and actions; finally, it is with respect to the consequences for the latter that what happens comes to matter at all. Insofar as this is the case, it is the task of scholars to attend to the complex local and temporal interactions of the heterogeneous sites, expertise, interests and acts that have led to the development, say, of the microprocessor or HTTP or Web 2.0. What is now widely called ‘software studies’ would be exemplary in this regard (although it is not of course the only way of attending to media under this general rubric). The very title of ‘Chapter 1’ of Adrian Mackenzie’s Transductions (2002) summarises the fundamental axioms of this approach directly: “Radical contingency and the materialisations of technology.” Similarly, Lev Manovich attends to the multiple forces driving new media developments in a socio-political frame; in doing so, he necessarily points to the impact upon the means and vocabulary with which we attend to the new phenomena. “[T]hese very terms,” Manovich writes, “content, cultural object, cultural production, and cultural consumption—are redefined by web 2.0 practices” (2009: 326). Or, as he puts it in another text, software studies “has to investigate the role of software in contemporary culture, and the cultural and social forces that are shaping the development of software itself” (Manovich, 2013: 10). For such accounts, then, there can also be no fundamental interest in big ontological claims. Rather, there is a commitment to minutely tracking developmental processes that are integrally mediated through the human, along with their social and political implications. Ontology, when it enters at all, can only do so as an historically-circumscribed concern.
Yet there is another field—that of contemporary speculative philosophy—which, given that it should in principle take an immediate interest in grand ontological assertions as they pertain to the digital, has nonetheless shown few signs of doing so with the requisite attention or detail. The ‘return to ontology’ reignited by Alain Badiou’s Being and Event in 1988 (English translation 2005) has seen the emergence in the 2000s of a project broadly denominated as ‘speculative realism,’ whose major representatives include Ray Brassier, Graham Harman and Quentin Meillassoux. What broadly characterises this trend is its commitment to a return to a full-blown metaphysics of ‘being,’ outside of any subjective or human ‘correlation.’ Perhaps the best-known tributary of this movement is ‘object oriented ontology’ (OOO), expressly named as such by Harman on the model of ‘object oriented programming’—but just as expressly without any further relation to computing than that. What is certainly notable about all these ontologies is their radicalisation of what Martin Heidegger (1996) phrased ‘the ontico-ontological difference’: that is, the difference between ‘beings’ and ‘being.’ For Heidegger, this distinction had been constitutionally forgotten by Western metaphysics, such that metaphysics came to consider being itself as one being among others, albeit as the highest or supreme kind of being (such as ‘God’ in Christian theology)—to the point where the very forgetting had itself been forgotten. To escape metaphysical encapsulation, then, Heidegger attempted by a variety of means to reinvigorate the ontological difference; he found that he had to abandon received uses of predication and description as the appropriate means of doing so. In their place, he began to offer an extraordinary meta-theory that can itself be seen as a radical form of media theory, that is, by way of a return to language as the opening of any possible revelation (“language is the house of being”), with poetry as its privileged witness in our destitute times, governed as they are by modern technology (Heidegger, 2000: 83). Despite their own difficulties with Heidegger, the speculative realists share his anti-descriptivist rage in their constructions of systems of real objects utterly indifferent to any human concerns. In doing so, however, they are also concerned to attend to the abstract problematics of transmission, that is, of ‘media’ in the most rigorous way.
It is noteworthy that these three major trajectories (the naturalist, the socio-historical and the object-oriented-ontological) in contemporary media studies are incommensurable, if in perhaps unexpected ways. The naturalists and sociologists share a descriptivist approach, although they differ strenuously on the place that they assign to the human; the naturalists and ontologists share a hostility to the human, but differ strenuously regarding the status of description; the sociologists can only take up the ontologists as a supplement, whereas the reverse does not hold; and the ontologists, ironically, fail entirely to think of or about the actual status of the new media upon which they are nonetheless clearly dependent, except by recourse to sociological or naturalistic motifs which then undermine (or even overmine) the ontology.
Certainly, we have characterised these trajectories both briefly and broadly, as a form of what Max Weber might have called ‘ideal types.’ In fact, many of the studies we invoked above are more hybrid in practice, mixing and matching elements according to situational and pragmatic demands. Yet this hybridity hardly vitiates the tension between the trajectories, which, as we have indicated, is irreducible; this tension leads to certain opacities or blind-spots regarding the status of media for each trajectory, which the others can supply only at the cost of their own blindness. Indeed, hybrid practices themselves tend to obscure the consequences and real stakes of the incommensurability insofar as it cannot simply be a matter of picking-and-choosing from each; such an option repeats rather than resolves the difficulties. What we propose in this paper, then, is to use elements from each of these tendencies in a way that none of them can do alone; in doing so, we will construct a specifically digital ontology which, while tied in an integral way to the new media of our times, also exceeds their current forms; this construction will enable us to show that the modalities of differentiation in new media occur not only at the level of display or of programming, but in a genuinely ontological way. This ontology will be at once historicist, inhuman, and anti-descriptivist. It will be processual, multiple, and without objects. Yet it will be able to account for the genesis and transmission of all sorts of digital entities. It will, finally, have a probative value in that it is able to reassign some of the descriptive claims that seem to be made about media as moments of different levels of different kinds of operations upon being.
Digital ontology is the event of the end of media
For anything to appear in the digital realm—here, in the usual sense of ‘digital media’—it must first be digitised to data, then modulated between storage and display in an endless protocol-based negotiation that both severs any link to the data’s semantic source and creates an ever-growing excess of data weirdly related to, but ontologically distinct from, its originating data source. This distinction between digital data and its display has been investigated by many contemporary thinkers, often in a manner indebted to the Platonic concepts of ‘anamnesis’ and ‘hypomnesis.’ The German ‘post-media’ scholar Friedrich Kittler famously relies on this split to assert that there are no longer any media: “with numbers, everything goes […] a digital base will erase the very concept of medium” (Kittler, 1999: 34). Since the very concept of media by definition presumes that there are media, plural (that is, differentiated media), and since the digital converges all media into a single state (that is to say, digital data), then by definition the concept of media simply disappears. In other words, data is the Great Leveller. There still seem to be media in the world, because this plastic data state is eminently able to be modulated into arbitrary display states, but these states are now rather a simulation of media. This is not to say that data is itself simply undifferentiated, only that its essentially modulable plasticity cannot be considered under the received rubrics of formal or predicative differences. To the contrary, as we explain further below, it will be necessary to reconsider differences themselves on the basis of a radicalised understanding of the digital base. Kittler’s argument is directed towards the concept of media as intervening agencies or materials that must precisely be differentiated from each other to be considered as media.
Because digitisation places the emphasis on a plurality of modulations of the same material, just as Spinoza conceives of a single substance expressed in an infinity of modes, these modulations are no longer media in any traditional sense. One indication of this, as many commentators have underlined, is that even quite ordinary uses of digital media—one example being current social media—make it impossible to assign their operations to traditional categories of media studies. What is the ‘sender,’ ‘producer,’ ‘receiver,’ ‘message’ of a simple Facebook post? The digitisation process creates an excess of digital data through its own operations, an actual excess greater than the sum of simple meta-media and the retroactive virtuality of the media being digitised as virtual content. Once modulated into a display state, the reconstituted data simulates media differentiation, and can therefore be analysed in terms of McLuhan’s nostalgic rear-view mirror (McLuhan and Fiore, 2001: 75). Indeed, Kittler acknowledges this (1999: 2). The point, however, remains: we are not in a media situation, but in a simulated-media situation. It is not that contemporary media saturate us with simulations, but that these media are themselves simulations. This is the ‘event’ that needs to be thought through.
Data must be modulated
For, insofar as it is digitised or digitisable, there is no meaningful distinction between, say, an image and a sound, or a video and a stock market price, until that data is once again modulated into a display register. Moreover, there is no necessary reason—ontological or otherwise—why any given set of digital data should be modulated into any given display state. Indeed, it is this contingency that founds the possibility of software studies. For example, a digitised image need not be modulated into a visible display register as an image; instead, it could be modulated into figures populating a spreadsheet. It is a (relatively) simple matter to construct a protocol that will ‘make sense’ of the data in the context of a spreadsheet, and such operations are regularly seen in the form of ‘information visualisation’ graphics (a highly fraught practice). As stated above, the fact that there still seem to be media in the world, apparently differentiating themselves from each other, is entirely due to this protocol-driven modulation process to and from display states. This virtualisation of media in the digital represents the simultaneous apotheosis and demise of McLuhan’s prescient aphorism that the content of any new medium is all prior media (McLuhan and Fiore, 2001: 75). The age of media is over, for there is now only one medium; and one medium is no medium at all.
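The contingency of modulation described above can be sketched in a few lines of code. The following is a purely illustrative example (the byte values, the 4x4 ‘image’ and the CSV ‘spreadsheet’ are our own assumptions, not drawn from any particular system): the very same run of bytes is modulated once as greyscale pixel rows and once as figures populating a spreadsheet, with nothing in the data itself privileging either display state.

```python
import csv
import io

# A hypothetical run of raw digital data: as data, it is just bytes,
# with no intrinsic claim to being an image, a sound, or a price.
raw = bytes(range(16))

# Modulation 1: treat the bytes as greyscale pixel values in a 4x4 'image'.
pixels = [list(raw[i:i + 4]) for i in range(0, 16, 4)]

# Modulation 2: treat the very same bytes as figures in a 'spreadsheet'.
buffer = io.StringIO()
csv.writer(buffer).writerows(pixels)
spreadsheet = buffer.getvalue()

print(pixels[0])                    # first 'scanline' of the image
print(spreadsheet.splitlines()[0])  # first row of the spreadsheet
```

Neither display state is more faithful to the data than the other; the ‘protocol’ (here, simply the choice of interpretation) does all of the differentiating work.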
Modulation as Simondonian Becoming
The English word modulate is derived from the Latin modulari, meaning to regulate. Modulari is itself derived from modus, meaning measure, rhythm, song or manner, and sometimes mood. The word first appeared in English in the seventeenth century in relation to music, where (then and now) it is a technical term meaning to change key during a performance or composition, and sometimes to change volume. It has come to denote a similar performative action in speech. The sense of regulated change into a different register, condition or form that is present in these usages has allowed the word to take on a broader meaning in general language, closer to its original Latin, so that the first definition of the verb modulate in the Oxford Dictionary is simply, “exert a modifying or controlling influence on.”
The word was given another specific technical meaning when it was chosen, in the very early twentieth century by pioneering radio engineers, to describe modifying a carrier wave with a signal, or information, wave. In this mode, the word takes on a definite sense of shaping, even of sculpting, since the shape of the carrier wave is modified by the signal for transmission, and this is the means by which regulation occurs. This technical meaning has persisted in telecommunications, where it is now a fundamental transmission technique, and also in electronics.
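The engineers’ sense of modulation can be made concrete with a minimal sketch of amplitude modulation (the frequencies and sample rate below are arbitrary assumptions, chosen only for illustration): a low-frequency information wave ‘sculpts’ the amplitude envelope of a high-frequency carrier wave.

```python
import math

def amplitude_modulate(signal, t, carrier_freq=1000.0):
    """Shape a high-frequency carrier with a low-frequency signal wave."""
    carrier = math.cos(2 * math.pi * carrier_freq * t)
    # The signal regulates (modulates) the amplitude envelope of the carrier.
    return (1.0 + signal(t)) * carrier

# A hypothetical 5 Hz information wave riding a 1 kHz carrier,
# sampled 8000 times over one second.
def info(t):
    return 0.5 * math.sin(2 * math.pi * 5.0 * t)

samples = [amplitude_modulate(info, n / 8000.0) for n in range(8000)]
```

The carrier is reshaped, moment by moment, by the signal; demodulation at the receiver recovers the envelope, and hence the information wave.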
In this article we wish the term modulation to ring with all of these meanings working together, in a manner that is close to Simondon’s usage of the word. The Simondon scholar Muriel Combes puts it best when she says that modulation is “the putting into relation of an operation and a structure” (Combes, 2013: 15). Here, she is using modulation to describe the nature of Simondon’s subtle concept of the allagmatic, his ontogenetic theory of operations, or transduction, which describes the constant bringing together of disparate structures in a creative process of individuation. Combes maintains that such individuation is modulation (2013: 5). Simondon’s understanding of modulation brings together all senses of the term and extends the electronics/telecommunications sense, of which he was very much aware, to give equal weight to both participating elements in the modulatory operation, each influencing the other. He does this to oppose any sense of hylomorphism, where form is actively given to passive matter, preferring a kind of dialectical process in which both elements influence each other equally and recursively in cycles of constant change. Adrian Mackenzie says that, for transduction to work, there must be “some disparity, discontinuity or mismatch within a domain; two different forms or potentials whose disparity can be modulated” (2002: 25). Deleuze and Guattari, in A Thousand Plateaus, acknowledge Simondon’s understanding of modulation in relation to the hylomorphic idea of molding when they quote from his The Mode of Existence of Technical Objects: “modulation is molding in a continuous and perpetually variable manner” (Deleuze and Guattari, 1987: 562 n92).
Just as Simondon complicates and deepens the notion of modulation, avoiding any simple oppositions, so he does with the notions of being and becoming. He understands these two terms not as fundamentally opposed to each other, but rather as dimensions or modes of each other:
The being in which individuation comes to fruition is that in which a resolution appears by its division into stages, which implies becoming: becoming is not a framework in which the being exists; it is one of the dimensions of the being, a mode of resolving an initial incompatibility that was rife with potentials. (Simondon, 1992: 301).
In a footnote to that sentence, Simondon further explicates the role of becoming, saying, “in a certain sense, ontogenetic development itself can be considered as mediation” (Simondon, 1992: 317 n2). In this mediating sense, we can understand modulation as becoming, although for Simondon mediation never refers to a third entity acting between two structures; rather, he understands mediation purely as the process of interactive communication between the two structures, a process which always amplifies (Simondon, 1992: 304).
Finally, to understand Simondon’s system—what Marie-Pier Boucher calls “the onto-epistemology of the emergence of living techniques as biotechnical individuals” (Boucher, 2013: 92)—we need to grasp his understanding of the term information, yet another term to which Simondon added depth and nuance, and which is crucial to his philosophy. This understanding is distinct from the cybernetic understanding of information—an understanding heavily influenced by Claude Shannon’s theories—which Simondon accepted within a certain technological framework but wished to deepen. Rather than the technological idea of information as message, Simondon saw information as “the meaning that arises on the heels of a disparation” (Simondon, 1992: 316). He considered transduction to conserve information: in other words, when two disparate structures are modulated, all information is conserved and magnified in the amplifying process that individuates a new being structured from the previously disparate structures, creating new informational structures in which meaning inheres. Here, we are asserting that we may, to a certain useful degree and with certain important differences, analogise Simondon’s overall system with the digital-data-display continuum, where digital data in its undifferentiated state is the preindividual metastable system from which any displayed phenomenon is individuated. Broadly speaking, this is because Simondon’s ontological assertion, with which we mostly agree, is that the “principles of the excluded middle and of identity are inapplicable at the level of the being since at this point individuation has not yet occurred” (Simondon, 1992: 312). We agree with his assertion about identity here, but we believe that, rather than the excluded middle, it is the principle of contradiction that is too narrow a concept to be properly applicable, since both individual and milieu are created, preserved and amplified all at the same time.
We discuss the implications of the principles of excluded middle and contradiction, in relation to the digital, in more detail below. In the meantime, if there were any doubt remaining as to what Simondon is saying here, it is dispelled when he says, “it could be said that the sole principle by which we can be guided is that of the conservation of being through becoming” (Simondon, 1992: 301, emphasis in original).
The Technogenesis of Digital Ontology
Somewhat similarly, Mark Hansen sees Kittler’s thesis as indebted to an overly literal reading of Shannon’s famous work that formed the foundation of information theory, in which information and meaning are separated. Following the ideas of Donald MacKay (a contemporary of Shannon’s), for whom the interpretation of information is inseparable from its technical structure, Hansen prefers to see the differentiation of media as “inseparable from the cognitive activity of the brain,” with their role being to accord an expanded scope to embodied human agency (Hansen, 2004: 3; Hansen, 2006: 3). This technogenetic attitude is echoed, slightly more objectively, by Katherine Hayles, who calls on the ideas of both Gilbert Simondon and the contemporary cognitive scientist and philosopher Andy Clark, among others (Hayles, 2012: 13, 103). While these technogenetic ideas somewhat acknowledge the levelling nature of digital data, they do not sufficiently attend to the ontological significance of the data/display split, even in relation to the object-oriented network of dynamic interactions they see as continually “cross-connecting machine processes with human responses” (Hayles, 2012: 13). Citing a Simondonian concept of epigenetic evolution, where humans co-evolve with technology in an assemblage involving complex temporalities, Hayles does not go quite as far as Bernard Stiegler, who repurposes the Epimetheus myth in order to show that humans have no essence separable from the technologies they require for life (Stiegler, 1998). Stiegler complicates temporality even further by asserting that technics, “far from being merely in time, properly constitutes time” (1998: 27).
The logical conclusion of this train of thought is that ‘the human’ emerges as a post-facto image from particular technological situations (and not from all of them!), a conclusion that Hayles’s version of Simondon’s and others’ approaches does acknowledge without fully accepting, by insisting on the adaptive approach of epigenetic evolution (Hayles, 2012: 90). Francisco Varela’s work with organic living systems, abstracted to apply to the assemblages formed between technical systems and organic beings, also strongly informs this mode of thought. Varela talks of “embodied cognitive structures” and models of understanding based on “microworlds and microidentities,” as well as of knowledge that is “built from small domains” (Varela, 1992: 334). He defines embodied cognition as the experience of a body with sensorimotor capacities that are “themselves embedded in a more encompassing biological and cultural context” (Varela, 1992: 329). Deleuze’s reading of Spinoza’s concept of a body makes it possible to apply this theory to digital environments, and many contemporary theorists of new media and affect have done just that, including Anna Munster, Claire Colebrook and especially Luciana Parisi in her book Abstract Sex. Parisi uses such readings to move beyond the dichotomy of embodiment and disembodiment. She also calls on Donna Haraway’s famous explication of the cyborg, reminding us of the need to revisit Haraway’s thesis in the light of the contemporary era of cyborgian digital networks (Parisi, 2004: 135).
Consequences of the realisation of the universal machine
In both a Simondonian concept of technical beings and the contemporary turn towards object-oriented ontology, there is a risk of ignoring the crucial smoothness, plasticity and generic non-objectness of digital data-as-data, that is, data not modulated into some display state. To speak of technical objects is to imply single-purpose machines (for example, Global Positioning System (GPS) trackers, Radio Frequency Identification (RFID) tags, mobile phones, social media websites, word processing software, and so on), ignoring the ontological significance of digital computers: that they represent the realisation—for the first time in human history—of the universal machine (Ceruzzi, 2012: 27). In other words, we now have machines that can become any other machine, and they do this by modulating digital data into some specific display register. As we saw earlier in relation to Kittler and media, it is only once digital data has been modulated into a display state that it can be said to be differentiated, and therefore to constitute an object that has qualities: a technical being in Simondon’s terms or a temporal object in Stiegler’s. Yet, if time is constituted in technics and the human is constituted in co-evolution with technical beings, then must the concept of technology itself go the way of Kittler’s media? We can set this question aside, since the same logic of differentiation or simulated differentiation applies here, and since we are interested not in ‘technology’ but in the ontology of digital data specifically. In any case, it is clear that digital computers are therefore ‘virtual’ machines, highlighting the paradox inherent in Stiegler’s claims: namely, that humans are only, and have only ever been, ‘virtually’ human.
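The claim that a universal machine ‘becomes’ other machines by reading data can be illustrated with a deliberately toy interpreter (the operation names and programs below are our own invention, not any real instruction set): the ‘program’ is itself nothing but digital data, yet it determines which machine the one machine behaves as.

```python
def universal_machine(program, x):
    """A toy interpreter: one machine that becomes many by reading data."""
    value = x
    for op, arg in program:
        # The program, itself mere data, selects the machine's behaviour.
        if op == "add":
            value += arg
        elif op == "mul":
            value *= arg
    return value

# The same mechanism 'becomes' two different special-purpose machines:
doubler = [("mul", 2)]
incrementer = [("add", 1)]
print(universal_machine(doubler, 21))      # 42
print(universal_machine(incrementer, 41))  # 42
```

In an actual computer the same point holds at every level: processor, operating system and application are all modulations of one plastic substrate of data.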
The virtual is not the digital
Even though the term ‘virtual’ has fallen out of fashion in media studies (much like ‘simulation,’ in the wake of Baudrillard)—mainly because it is mistakenly seen as a remnant of a failed disembodiment experiment in the 1990s—popular usage of the term has continued to grow in the era of digital social networks. Usually, it is used to denote anything that happens on a digital network and causes affect in the non-digital world, for example ‘virtual sex’ or ‘virtual environment.’ However, when we say that media has become virtualised, we do not mean it in the technical or digital sense, but rather in the Deleuzian sense that Anna Munster uses:
[T]he virtual dimension for corporeal experience evoked here lies in the way it poses the potential for embodied distribution as a condition of experience for information culture by dislocating habitual bodily relations between looking and proprioception. Virtual forces are vectors that pulse through the contours and directions of matter (Munster, 2006: 90, emphasis in original).
Munster’s words help us understand the apparent conflict between the levelling nature of digital data and the instances of specific differentiation caused by modulation into display, by thinking of all elements in the modulation process as interdependently transformative negotiations of flows rather than assimilations of one thing into another. Later, we will see how this could also help us understand immanently digital entities and their relationship to the world. For the moment, though, we can see parallels between Munster’s attempts to understand the levelled nature of the digital and Claire Colebrook’s, when the latter calls on Deleuze and Guattari’s concept of ‘desiring machines’:
It is naïve and uncritical to see the analogue as a pure and continuous feeling or bodily proximity that is then submitted to the quantification of the digital, a digital that will always be an imposition on organic and vital life. There is, however, an inorganic mode of the analogue that is not a return to a quality before its digital quantification, but a move from digital quantities or actual units to pure quantities, quantities that are not quantities of this or that substance so much as intensive forces that enter into differential relations to produce fields or spaces that can then be articulated into digits (Colebrook, 2010: 124).
Both Munster and Colebrook are concerned with interpreting Deleuze’s nuanced philosophical concept of ‘the virtual’ in the light of the digital era and, like the Simondonian thinkers above in relation to technical objects, with trying to understand the relationship between the digital and the non-digital (Munster, 2013: 8). We provided above a loose homology between the Spinozan metaphysics of substance-modes and the relation between data and storage/display states, which is undoubtedly also part of the appeal of Deleuze for scholars of new media (we will modify this homology below). What we can agree with is the resolutely anti-phenomenological approach taken by these writers. Data has no possible phenomenal presentation until it has been modulated into a display, and its very differentiation into such a state detains the excess of its not-thatness.
Digital entities are performances
Even if we cannot simply assent to the accounts of embodiment generated by such Deleuzian-inspired accounts, we are in complete accord with their anti-phenomenological tenor. Precisely because of the necessity of modulation, no individual experience of any kind can offer more than an anecdotal testimony to the powers of new media. Rather, the problem is to reconsider the ontogenesis of the entities ‘we’ work with at the level of the interface, and this has to be constructed by way of absolutely non-phenomenal technical concerns. Yet there is still a value in showing the consequences of modulation upon the structuring of experience. Hence, as Boris Groys has pointed out in relation to digital images, the ‘experience’ of the digital becomes one of process, a performance (Groys, 2008: 84). As we have already implied, being qua data proceeds from its operations. When modulating some digital data into a display state and experiencing it as an image (say, in order to show a picture of your child to a friend on your phone), you can no more say that it is the same, or even a copy of, the image you showed a different friend yesterday than you could say the D flat played by Martha Goldstein in her 1970 performance of Chopin’s “Etude Op. 25 No. 8” was the same D flat that Hermann Scholtz played in his 1879 performance of the same work, let alone the same D flat that Abel Tesfaye sang in The Weeknd’s 2011 performance of their song “The Knowing”.
We are careful here not to conflate this understanding of the performative nature of the digital with a too-simplistic assertion of digital data as script or score. Certainly, the use of protocols is an attempt to ensure a predictable display result, but, as Sha Xin Wei recognises, performance is always privileged over the “instructions to the maker for use in the making” (Sha, 2013: 45). This is a properly Simondonian privileging of the process over the product.
Modulation as contingent singularisation of data
Within any digital system, the environment itself is digital data, with only the modulation into specific display states via various protocols allowing a chimeric, phenomenological differentiation. Because of this, the environment itself could at any time become input data to remodulate back into itself, and vice versa. Any given element or assemblage (a web page, an image, a sound, an animation, a status update, a tweet) is only ‘produced’ at the time it is modulated into a corresponding display state: at all other times it exists as unmodulated digital data with no clear ontological state. This is true of all digital data. Such elements may perhaps be termed ‘immanently digital entities,’ a term that extends to any and all of the excess of data created by the use of digital data and networks. ‘Likes’ and ‘friends’ and ‘photos’ and ‘text’ and ‘links’ and ‘Tweets’ and ‘followers’ and all other ostensibly differentiable digital phenomena–some of these may have identifiable provenance in the non-digital world, and some may be uniquely generated by and in the digital sphere, but all can be said to be ‘immanently digital entities.’ It is this problem, unique to the non-medium of the digital, that leads thinkers such as those mentioned above to concepts of ‘inorganic life’ and technogenesis, and scientists like Stephen Wolfram and Edward Fredkin to posit—in a move emblematic of Kittler’s assertion that “media determine our situation”—a hypothesis of the universe as a digital computer (Kittler, 1999: xxxix; Chaitin, 1999: 108).
The digital is other than the media that express it
The Wolfram-Fredkin hypothesis—that the universe is itself a kind of digital computer—is of extreme interest in this context, for a number of reasons. First of all, we can agree with one aspect of their motivations for this claim: digital media do indeed reveal something previously unexpected about the natural structure of our universe. To summarise this aspect of their implicit reasoning as simply as possible, it is that the unprecedented powers of thought and action that derive from the electrification of Boolean algebra first established by Shannon, and now incarnated globally in the form of contemporary computing, must have some fundamental anchoring in ‘being’ for it to function at all (Shannon, 1937). Second, this means that to speak of ‘digital media’ is not simply to speak of a set of hardware and software components developed by a particular species on a particular planet at a particular time using particular materials: it is rather to be given a new access to being itself, and one which must thereafter guide our thinking of natural processes more generally. Third, if there remains something unthought in the Wolfram-Fredkin hypothesis, it is simply that there is something preprogrammed, indeed too representational, about the direct projection of a contemporaneously-dominant media paradigm onto being itself. Moreover, if the social conditions of such a projection are occluded, then we should expect such an occlusion to create certain symptoms too; not least the immediate carrying-across of a number of features of contemporary computing to nature itself in an unjustified manner. What we wish to do here, then, is radicalise the Wolfram-Fredkin hypothesis along logical lines. Above all, we agree that when we speak about the ‘digital,’ this must have an extension far greater than simply referring to the actualities of new media, at the same time that these new media must simultaneously function as our primary mode of access to this recognition. 
But we disagree that the universe is a computer. We believe, rather, that being is digital, if in a very particular sense.
Digital ≠ Numerical
At this point we would do well to remember the difference between the digital and the numeric, quantitative or mathematical. Kittler and Deleuze (among many others) conflate the two, but in fact the digital is not necessarily numeric. The digital is rather a binary enactment of logical switches. The ‘zeroes and ones’ of popular parlance are not really numbers being fed to a machine (indeed, it is difficult to rationalise either zero or one as purely numbers anyway); rather, they are symbolic placeholders for binary switches: on/off, +/-, yes/no, is/is-not (Chun, 2011: 139). The move to pure quantities is far easier to understand when we accept the numerical as simply another parameter in the modulation process between data and its display, and may help us move closer to an understanding of the relationship between the contemporary technical interdependence of virtual/material and the Deleuzian interdependence of virtual/actual (Nash, 2012).
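The distinction between switch and number can be made concrete in a short sketch (our own illustration; the function name and the choice of positional base-2 encoding are assumptions of the example): the switch states themselves are purely logical values, and a quantity appears only once a further protocol interprets them numerically.

```python
# Switch states as logical values, not numbers: on/off, is/is-not.
switches = (True, False, True, True)

# A number appears only once a protocol interprets the switches;
# here, positional base-2 encoding, most significant switch first.
def as_quantity(states):
    value = 0
    for s in states:
        value = value * 2 + (1 if s else 0)
    return value

print(as_quantity(switches))  # prints 11: the switches modulated into a quantity
```

The same four switches could equally be modulated by a different protocol into a nibble of a character code or a flag set; the ‘number’ is an artefact of the interpretation, not of the switches.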
The digital renders number as rational ideology
In the present instance, then, the digital is not the numerical, but the two seem to enter a zone of indistinction, and the irreducibility of this indistinction is one primary mode of contemporary ideology. Numbers are themselves essentially ideological in a digital framework, because they appear as the only viable way of characterising the contingencies of actualisation (modulation and associated sub-operations, such as aggregation) that escapes pure assertion. What is peculiar about this ideology is that it is also essentially true: numbers (in the form of statistics, the modelling of rates of change on a mass scale, the correlation of data from an enormous range of different sources, etc.) are the only way to ensure a minimally rational comparability and consistency of data sets. Outside of such modellings, we have only the phantasms of opinion and self-interest. Unfortunately, being true does not at all prevent numbers from being iniquitously ideological; and they are ideological in this new sense, that they are produced on the basis of absolute binary operations whose workings vanish in the presentation of numbers, thereby also remodulating the data they present. The very organisation of data through various forms of modulation puts all sorts of pressures on the numbers that numbers themselves cannot say (Mackenzie, 2012: 335–350). We are thus committed here to understanding the digital as prior to number. Above all, we use this fact as a hint in our construction of a digital ontology. We maintain that it is vital to understand that to construct a digital ontology is to have recourse to a logical and not a mathematical ontology. But what does this mean?
Towards a digital onto-logy
In his two major treatises on being and existence, Alain Badiou has offered a series of exceptional, ingenious propositions and arguments concerning the differing status of pure mathematics (in the form of ‘set theory’) and pure logic (in the form of ‘category theory’). If one wants to argue according to reason—as distinct from proffering statements that are reducible to one kind of opinion or another—one has to take as one’s model those discourses that take the structure of reason itself as their object. Mathematics and logic are those discourses. Yet their absolute affirmation of reason, or rather forms of reasoning, doesn’t mean that there aren’t still differences to be identified or arguments to be made; quite to the contrary, any attempt at reason necessitates an encounter with a rift at the heart of reason itself. This is where something like philosophy becomes necessary to evaluate and to decide on the distribution of rational tasks. For Badiou, ‘mathematics is ontology,’ which means simply that pure systematic mathematics is the discourse concerning being (Badiou, 2006: 4). Taking up the axioms of ZFC set theory, Badiou directly transliterates these axioms into meta-ontological theses, which establish for example the void as the proper name of being, and the existence of infinite infinities.
Precisely because set theory is ‘pure’ (that is, devoid of any direct empirical referent), ‘foundational’ (if only in the sense that all prior forms of mathematics can be rewritten without loss in its own terms), and ‘declarative’ (in the sense that it makes absolute assertions about the status of what entities exist), it can function as the grounds for a general ontology.  Logic, on the other hand, is strictly definitional without declaration; it speaks of, enables, the rigorous construction of potentially infinite different kinds of possible worlds (which cohere or consist), but doesn’t prescribe their actuality. In making this distinction, Badiou is forced to clarify that it also entails another claim: that ontology (the thinking of being as being) must conform to classical logic, that is, it is ultimately founded on the two principles of non-contradiction and excluded middle; whereas tendencies in contemporary logics, such as those that are broadly designated by ‘intuitionism’ or ‘paraconsistency,’ offer, respectively, their definitions on the basis of the primacy of non-contradiction without excluded middle, or on excluded middle with a modified principle of non-contradiction.  This for Badiou is not only because modern mathematics is axiomatised in its quest for consistency, and, by such axiomatisation, thereby also necessarily asserts what exists for it and how; it is because classical logic requires that something be or not be, without gradations or paradoxes within existence, in order to meet the criterion of univocity.
We agree with Badiou on several fundamental points. First, that any consistent thought of ontology requires the rigour of mathematics or logic in order to be something more than one or another form of theology (disguised or not). Second, that the deployment of mathematics/logic must be undertaken in a foundational, and not in a descriptive sense; that is, what it prescribes cannot simply be ‘read off’ or projected onto the (phenomenal, empirical) world as we find it, and vice-versa. Third, that these procedures establish ‘objectivities’ prior to any subjective or phenomenological apparition. But we disagree, fourth, that ontology must be classical, and, fifth, that logic can only describe (and therefore does not prescribe) existents. So it is time for us to bring together all the points we have made above into a clear and distinct summary of digital ontology.
The binary is not necessarily Boolean
In Shannon’s famous master’s thesis, he identifies an “analogue between the calculus of propositions and the symbolic relay analysis,” which runs both ways: at once to represent electrical circuits by logical relations, and logical relations by electrical circuits (Shannon, 1937: 16). As James Gleick summarizes Shannon’s achievement:
Like Boole, Shannon showed that he needed only two numbers for his equations: zero and one. Zero represented a closed circuit; one represented an open circuit. […] Circuits in series, he noted, corresponded to the logical connective and; whereas circuits in parallel had the effect of or. An operation of logic that could be matched electrically was negation, converting a value into its opposite. As in logic, he saw that circuitry could make “if…then” choices (Gleick, 2011: 174).
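Shannon's correspondence can be sketched as a minimal model (our own illustration in Python; the function names, and the convention that a `True` switch conducts, are assumptions of the sketch and invert the open/closed numbering in Gleick's summary): series composition of switches behaves as logical AND, parallel composition as logical OR, and a relay that conducts exactly when its input does not realises negation.

```python
# A switch either conducts current or it does not.
def series(*switches):
    return all(switches)     # current flows only if every switch conducts

def parallel(*switches):
    return any(switches)     # current flows if any branch conducts

def negate(switch):
    return not switch        # a relay closing exactly when its input is open

# The electrical calculus and the propositional calculus agree on every input:
for a in (False, True):
    for b in (False, True):
        assert series(a, b) == (a and b)
        assert parallel(a, b) == (a or b)
```

The exhaustive check over all inputs is the point of the analogy: the two calculi are not merely similar but extensionally identical, which is why the translation runs both ways.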
We note that, as essentially binary, Boolean logic is a classical mathematized logic: that is, it establishes an analysis of symbolic propositions which conform to non-contradiction and excluded middle. Yet recent developments in logic—those broadly denominated ‘paraconsistent’—have attempted to construct logical systems in which contradictions are not necessarily ‘explosive.’ In traditional propositional logic, everything follows from a contradiction, but variants of paraconsistent logic propose otherwise. As Greg Restall explains:
Paraconsistent logics are distinctive in that they do not mandate explosion. […] Instead, for paraconsistent logics the entailment fails […] in the semantics for these logics there are interpretations in which A and -A may both be taken to be true, but in which not everything is true (Restall, 2006: 76).
It is essential here to understand that paraconsistency separates out contradiction from consistency, such that certain contradictions might be true, without all of them being so. Whereas consistency and the foreclosure-of-contradiction are identical in classical and intuitionist logics, this is not the case for paraconsistent ones. Moreover, this situation establishes the actuality that there may well be many different modes of constructing logical systems, even a kind of logical pluralism. This reopens the old question regarding the foundations of logic in a radical new fashion.
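The failure of explosion can be made concrete in a toy model of Priest's three-valued logic LP (our own sketch; the numeric encoding of the values, and the variable names, are assumptions of the illustration): a contradiction can hold in an interpretation without every sentence thereby holding.

```python
# Truth values of LP: true, "both true and false", false.
T, B, F = 1.0, 0.5, 0.0
DESIGNATED = {T, B}          # both T and B count as "true enough" to assert

def neg(x):
    return 1.0 - x           # negation swaps T and F, and fixes B

def conj(x, y):
    return min(x, y)         # conjunction takes the "worse" value

# Counterexample to ex falso quodlibet: let A be both true and false,
# and let Q be an arbitrary, plainly false sentence.
A, Q = B, F

premise = conj(A, neg(A))    # the contradiction: A and not-A
assert premise in DESIGNATED # the contradiction holds in this model...
assert Q not in DESIGNATED   # ...yet the arbitrary Q does not follow
```

On the classical fragment (values restricted to T and F) the same tables behave classically; it is only the third value that blocks explosion, which is exactly the separation of contradiction from incoherence described above.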
What we want to underline is that the instantiation of Boolean logic in post-war computing led very quickly to the appearance of a vacillation in the data computers were handling, such that paraconsistent logics initially came to be developed “to prevent computers, such as expert medical systems, from deducing anything whatsoever from contradictory data… because of the principle of ex falso quodlibet” (Meillassoux, 2009: 76).  Unlike regimes governed by classical logic, then, such a digital ontology would render pure difference (not identity) fundamental; unlike intuitionist logic, digital ontology could also affirm actual infinities. One corollary is the possibility, even actuality, perhaps even necessity, of true contradictions; another is the patency of contingency in any modulation.
Is digital ontology monist or dualist?
If, moreover—as we began by saying—everything is digital and the digital is everything, it now turns out that this is not quite true. The problem here goes back to one of the most fundamental of all metaphysical disputes: monism or dualism? Monism, in its classical form, holds that there is one and only one substance—that is, everything has the same ontological status—and differentiation must therefore be accounted for in terms other than those provided by substance itself. For monism, equality or univocity of being is primordial; differentiation is to be explained genetically. Monism tends to think immanently, but also totally, for it thinks in terms of ‘everything.’ In doing so, it presumes the whole. For dualism, on the other hand, there are fundamentally two different substances or a breach within substance—whether this is, as with the ancient atomists, the two ‘substances’ of ‘atoms’ and the ‘void,’ or, as with Descartes, ‘mind’ and ‘body.’ Differentiation is primary; relations thereby become problematic. Dualism tends to think transcendentally, in the sense that it posits incommensurable orders of Being, but it also thinks partially, because division splits totality. The digital ontology we are proposing here is of especial interest in such a case, for a number of reasons—not least because being qua data (‘substance’) proceeds from its operations (‘the Two’), and not vice versa. So to speak of undifferentiated data is, strictly speaking, false: to continue the line we have already broached by speaking of digital ontology as founded on pure differences established by the primacy of excluded middle, data should be considered a hyperdifferentiated consistency without identity. We thereby reiterate and extend our fundamental point about digital data, which is that it scrambles inherited metaphysical polarities. The principle of excluded middle rules the foundations of the digital universe, not the principle of non-contradiction. 
Digital ontology is paraconsistent, not classical or intuitionist.
Differences of difference: from absolutely minimal difference to existents
So far, we have seen that what are usually referred to as digital entities are phenomenally differentiated only inasmuch as they are modulated into a display state and the resultant boundaries between them and their environment, or each other, are more or less arbitrarily determined by protocol. This is because such differentiated entities and their environment are made of the same stuff, and any such differentiation therefore only holds in the nostalgic McLuhanist sense, raising interesting questions about its ontological state. This is particularly true when considering the physical environment of digital data, that of the ferromagnetic material on the surface of the disk that stores the data. Illustrating our point about digital data not being ‘numerical,’ the binary switches of bits of data are represented on the disk by the direction of magnetism, either positive or negative. Initially we may be tempted to conclude that this is the environment in which the being of digital data occurs, but this is confounded by two facts. First, it need not be this magnetic material that stores the data—it could be punched cards, scrawls on a piece of paper (indeed, Alan Turing’s original concept of the universal machine involved an infinitely long roll of paper tape with readable, writable and erasable symbols), or billions of egg cartons with each cavity either holding an egg or empty (Ceruzzi, 2012: 27). All that is required is a protocol to dictate how these recorded switches are modulated into a display state. Nonetheless, the fact remains that our experience of digital data in the world only involves ferromagnetic material and electronics, and we have never experienced digital data stored in billions of egg cartons, not least because speed of operation is a crucial factor. Yet our point here is that, at base, the digital is a model of logic, not a specific technology (Chun, 2011: 140).
Second, and just as importantly, because the physical reality of the computer is an electronic and magnetic enactment of this logic, it is impossible to ever identify any specific being of digital data, since the ‘movement’ of data back and forth between disks, RAM (Random Access Memory), caches and registers on the CPU (Central Processing Unit), is in fact a constant process of modulation between states of magnetic polarity or electric charge in these physical objects. Because of this, it is impossible to say that any given bit of digital data is even the same as itself, or point to its localisation or appearance in the world as a criterion or determinant of its identity, laying bare the fundamental spuriousness of the concept of a ‘copy.’ Data’s ‘identity’ is a pure, non-phenomenal, distributed-cohering-across-materials. And, ‘underneath’ that, there are simply absolutely minimal differences or pure binaries—which are thus differences-without-identity, not subject to the laws of non-contradiction.
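The spuriousness of the ‘copy’ can be glossed with a trivial sketch (an illustration of the claim, not part of its machinery; the container types chosen are our own assumption): equality of data is equality of pattern, with no appeal to any shared physical location or object identity.

```python
# Two physically distinct containers holding the same bit pattern.
a = bytes([1, 0, 1, 1])
b = bytes(list(a))           # constructs a genuinely new object in memory

assert a == b                # indistinguishable as data: the pattern coheres
assert a is not b            # yet no shared physical locus underwrites this
```

Which of `a` or `b` is the ‘original’ is a question the data itself cannot answer: identity here is exactly the pure, non-phenomenal, distributed-cohering-across-materials described above.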
Conclusions: Digital-Data-Display
On the basis of the current world domination of digital media, we have wanted to discern, albeit in non- or anti-Heideggerean terms, a new ‘destining of being.’ What this means is to identify within the actuality of the new media certain key points that have introduced unprecedented differences into the thinking and manipulation of nature. These differences subtend the antagonism that Heidegger saw as operative between the ‘planetary reign of technology’ and the precarious dis-closures of the poem. In doing so, we precisely targeted the digital aspect of these new technologies, and, even more precisely, their binary nature. Taking this binary nature with the utmost seriousness, we then suggested how it might subtend classical logics, including Boolean algebra; moreover, that such a structure could be foundational and not simply descriptive or representational. From there, we proposed a new abstract ontology that is integrally linked to contemporary computing and yet exceeds the restriction to its particular material situation. The digital universe comes to be founded upon local pure binaries without identity. Data, in turn, becomes minimally consistent sets of digital differences which do not themselves have a differentiated identity, yet which are essentially modulable. Display arises from modulating this data: what are usually called ’media’ are the diverse modulations of this data according to patently-contingent technical display/storage protocols. To put this another way: logic enables us to propose absolutely minimal difference (in the form of pure binaries prior to content and thus contradiction) as pure being itself; mathematics to establish how such differences are given a necessary minimal coherence in actuality (in the abstract-concrete form of Boolean algebras); technics to produce and evaluate their instantiation as operatory machines (Shannon-Turing machines). 
Or, to put this differently, we have argued that: logic is the medium of being, insofar as it inscribes the necessity of pure minimal differences before contradiction; mathematics the medium that concretises minimal differences into consistency-without-phenomenal-identity as the possibility of any actualisation; technics the actualising medium of modulating these consistencies in turn. This tripartite distinction—difference, consistency, modulation—entails that all phenomenal presentations are at once as infinitely variable as they are entirely constrained in specific ways.
To return to our beginning: we hope that it is now clear why we can draw from naturalist, historical and ontological accounts at the same time that we believe we can show in what regards they mistake certain crucial aspects of the nature and consequences of the event of contemporary digital technologies. Moreover, our ontological construction enables the application of a set of discriminators that makes all talk of ‘originals’ and ‘copies’ otiose, as well as establishing the sub-structuring of the ‘necessary ideologies’ of such contemporary phenomena as big data.
We not only live a digital ontology — we are a digital ontology.
Author Biographies
Justin Clemens is currently working on a number of projects, including an ARC Discovery Project on ‘Avatars & Identities’ with Tom Apperley and John Frow, and a Future Fellowship on contemporary Australian poetry. He is based at the University of Melbourne.
Adam Nash is an artist, composer, programmer, performer and researcher in digital virtual environments as audiovisual performance spaces, data/motion/affect capture sites, artificially intelligent, evolutionary and generative platforms. His work has been exhibited in prestigious galleries, festivals and online worldwide. He is director of the Playable Media Lab in the Centre for Game Design Research, and program manager of the Bachelor of Design (Digital Media), both at RMIT University.
Acknowledgement
The authors would like to thank the two anonymous reviewers for their detailed and stringent feedback on the initial submitted version of this article, which has enabled them to make significant improvements. Clemens’ contribution to this article has been supported by the research project DP140101503 Avatars and Identities (Apperley, Clemens & Frow 2014–2016) that is funded by the Australian Research Council, with research assistance from Nicholas Heron.
-  Despite the title of Latour’s book, Reassembling the Social, perhaps suggesting a sociological analysis, this is precisely not the case. As one of Latour’s interlocutors puts it in an interview: “I got the feeling you’re engaging in a full-scale war against all of the sociologists of the social…” To which Latour replies: “I want to develop one argument, which is completely orthogonal to sociology; which is connected to this obsession about non-humans and controversy,” Anders Blok and Torben Jensen, Bruno Latour: Hybrid Thoughts in a Hybrid World (New York: Routledge, 2011), 160.
-  See also: “Software such as codecs poses several analytical problems. Firstly, they are monstrously complicated. Methodologically speaking, coming to grips with them as technical processes may entail long excursions into labyrinths of mathematical formalism and machine architecture, and then finding ways of backing out of them bringing the most relevant features. In relation to video codecs, this probably means making sense of how transform compression and motion estimation work together. Second, at a phenomenological level, they deeply influence the very texture, flow, and materiality of sounds and images. Yet the processes and parameters at work in codecs are quite counterintuitive. Originating in problems of audiovisual perception, codecs actually lie quite a long way away from commonsense understandings of perception. Third, from the perspective of political economy, codecs structure contemporary media economies and cultures in important ways. This may come to light occasionally, usually in the form of an error message saying that something is missing: the right codec has not been installed and the file cannot be played. Despite or perhaps because of their convoluted obscurity, codecs catalyze new relations between people, things, spaces, and times in events and forms,” Adrian Mackenzie, “Codecs”, in Matthew Fuller, Software Studies: A Lexicon (Cambridge: MIT Press, 2008), 48.
-  See, inter alia, Ray Brassier, Nihil Unbound (Houndmills: Palgrave Macmillan, 2007); Levi Bryant et al. (eds.), The Speculative Turn: Continental Materialism and Realism (Melbourne: Re.press, 2011); Q. Meillassoux, After Finitude: An Essay on the Necessity of Contingency, trans. R. Brassier (London and New York: Continuum, 2009).
-  This is, once again, the professed basis of “software studies.” In Manovich’s words: “My book discusses what I take to be the key part of these ‘machines’ today (because it is the only part which most users see and use directly): application software,” Lev Manovich, Software Takes Command (New York: Bloomsbury, 2013), 10. While we completely agree that such studies are necessary and desirable, here we want to underline that such an approach is still, despite its own anathemas against, say, Kittler’s alleged ‘modernism,’ itself a modernist type of phenomenological humanism. This also goes for the field of so-called ‘media archaeology’ more generally, from Kittler himself through Jussi Parikka and beyond. Even if there is a constant tension in such studies, between the places of the human and the non-human, they still retain a sort of minimal commitment to a form of ideology-critique, e.g., a la Jussi Parikka, What is Media Archaeology? (Cambridge: Polity, 2012). We would also include such writers as Alexander Galloway and Eugene Thacker here, for whom technical issues are constantly flowing into issues of power: see A.R. Galloway, Protocol (Cambridge: MIT, 2004) for several fundamental assertions in this regard. One crucial element in such studies is their absolutely vital emphasis upon the motivated contingencies that govern the complex stratifications (“layerings,” “protocols,” etc.) of contemporary control, which exceed and rebuke the inherited terms in which they are so often characterised; another is their (sometimes implicit, sometimes explicit) collapsing of the logical and mathematical bases of new technologies into technology itself. In all these cases, moreover, the phenomenology of modulation is taken as primary; and above all, the programming of modulation insofar as it primarily concerns ‘us,’ contemporary societies and users, at the level of interaction.
-  See also Tom Boellstorff, Coming of Age in Second Life: An Anthropologist Explores the Virtually Human (Princeton: Princeton University Press, 2008), 237.
-  See Alain Badiou, Being and Event, esp. 6–9; A. Badiou, Logics of Worlds, trans. A. Toscano (London and New York: Continuum, 2009), e.g., “just as Being and Event drastically transformed the ontology of truths by putting it under the condition of the Cantor-event and of the mathematical theory of the multiple, so Logics of Worlds drastically transforms the articulation of the transcendental and the empirical, by putting it under the condition of the Grothendieck-event (or of Eilenberg, or Mac Lane, or Lawvere…) and of the logical theory of sheaves,” 38. It is crucial that, in both cases, mathematics and logics establish the formalization of objectivities prior to any subject whatsoever.
-  “ZFC set theory” is shorthand for “Zermelo-Fraenkel set theory with the axiom of choice.” For a detailed technical discussion of some of the philosophical hallmarks of Badiou’s transliteration of axioms into philosophese, see J. Clemens, “Doubles of Nothing: The Problem of Binding Truth to Being in the Work of Alain Badiou,” Filozofski Vestnik, Vol. XXVI, No. 2 (2005), 21–35.
-  For more extended discussion of Badiou’s use of set theory, see J. Clemens and O. Feltham, “Introduction” in Alain Badiou, Infinite Thought (London and New York: Continuum, 2003), 1–38; P. Hallward, Badiou: A Subject to Truth (Minneapolis: University of Minnesota Press, 2003); C. Norris, Being and Event: A Reader’s Guide (London and New York: Continuum, 2009).
-  Badiou’s own brief summation of the differences between these formations is very clear: “A classical logic simultaneously validates the principle of the excluded middle and the principle of non-contradiction (the truth of the statement p and that of the statement non-p cannot be given at the same time). An intuitionist logic validates the principle of non-contradiction, but not the principle of the excluded middle. A para-consistent logic validates the principle of the excluded middle, but not the general form of the principle of non-contradiction. In each case, we are dealing with important variations in the definition and the meaning of the operator of negation,” Logics of Worlds, 183. For an influential recent variation of para-consistent logics, see the work of Graham Priest on “dialetheism,” for which a system can be inconsistent (i.e., generate irreducible contradictions from operations such as self-reference) but not be incoherent (i.e., not unusable), e.g., G. Priest, Beyond the Limits of Thought (Cambridge: Cambridge University Press, 1995) and G. Priest, In Contradiction: A Study of the Transconsistent (Dordrecht: Martinus Nijhoff, 1987). For his part, Newton C.A. da Costa, one of the ground-breaking logicians of paraconsistency, has recently written, with Décio Krause, first, that the development of paraconsistent logics does not make classical logic wrong (the former is a supplement to a field which now appears more restricted than previously) and, second, that paraconsistent logics may be of more use in explaining certain physical phenomena in an applied frame, see Costa, Newton C.A. da and D. Krause, ‘Remarks on the applications of paraconsistent logics to physics,’ available at philsci-archive.pitt.edu/1566/1/CosKraPATTY.pdf, downloaded 18 November 2014.
-  In Badiou’s own words, “A fundamental example of a classical world is ontology, or the theory of the pure multiple, or historical mathematics. This is essentially because a set is defined extensionally: a set is identified with the collection of its elements. This definition really only acquires meaning if one rigorously accepts the following principle: given an element, either it belongs to a set, or it does not. There is no third possibility,” Logics of Worlds, 185. As he also puts it: “A classical world is a world whose transcendental is Boolean,” Logics, 188 (emphasis in original).
-  Note that Meillassoux himself is very hostile to paraconsistent logics, insofar as he believes they apply only to certain peculiarities in the handling of data, rather than to real states of affairs. For him, the law of non-contradiction is the key operator that enables thought to construct radical ontological propositions; for our part, we interpret his restrictions in this regard as symptomatic of a hostility to the revelatory powers of new technologies. Data is a real part of the world, not just a subclass of information about that world.
-  See C.A. Middleburg’s “A Survey of Paraconsistent Logics,” in which some of the key systems and approaches are surveyed, with their fascinating differences, e.g., with three truth values (true, false and both-true-and-false), relevance (the antecedent of an implication must be relevant to its consequent), non-truth-functional negation, non-adjunctive (A ^ B from A and B fails), annotated truth values according to belief.
- Badiou, Alain. Being and Event. London and New York: Continuum, 2006.
- Badiou, Alain. Logics of Worlds. London and New York: Continuum, 2009.
- Blok, Anders and Torben Elgaard Jensen. Bruno Latour: Hybrid Thoughts in a Hybrid World. New York: Routledge, 2011.
- Boellstorff, Tom. Coming of Age in Second Life: An Anthropologist Explores the Virtually Human. Princeton: Princeton University Press, 2008.
- Boswell, James. The Life of Samuel Johnson. Ware: Wordsworth, 1999.
- Boucher, Marie-Pier. “Infra-Psychic Individualization: Transductive Connections and the Genesis of Living Techniques” in Gilbert Simondon: Being and Technology. Edinburgh: Edinburgh University Press, 2012.
- Brassier, Ray. Nihil Unbound. Houndmills: Palgrave Macmillan, 2007.
- Bryant, Levi et al. (eds.). The Speculative Turn: Continental Materialism and Realism. Melbourne: Re.press, 2011.
- Ceruzzi, Paul E. Computing, a Concise History. Cambridge: MIT Press, 2012.
- Chaitin, Gregory. The Unknowable. Singapore: Springer-Verlag, 1999.
- Chun, Wendy Hui Kyong. Programmed Visions: Software and Memory. Cambridge: MIT Press, 2011.
- Clemens, Justin. “Doubles of Nothing: The Problem of Binding Truth to Being in the Work of Alain Badiou,” Filozofski Vestnik, Vol. XXVI, No. 2 (2005), 21–35.
- Clemens, Justin and Oliver Feltham. “Introduction” in Alain Badiou, Infinite Thought. London and New York: Continuum, 2003, 1–38.
- Clemens, Justin and Adam Nash. Seven theses on the concept of ‘post-convergence’. Australian Centre of Virtual Art website, accessed June 20th, 2012. https://www.acva.net.au/blog/detail/seven_theses_on_the_concept_of_post-convergence
- Colebrook, Claire. Deleuze and the Meaning of Life. London: Continuum, 2010.
- Combes, Muriel. Gilbert Simondon and the Philosophy of the Transindividual. Cambridge: MIT Press, 2013.
- Costa, Newton C.A. da and Décio Krause. “Remarks on the Applications of Paraconsistent Logics to Physics.” Available at philsci-archive.pitt.edu/1566/1/CosKraPATTY.pdf, accessed 18 November 2014.
- Damasio, Antonio. The Feeling of What Happens: Body, Emotion and the Making of Consciousness. London: Vintage, 2000.
- Deleuze, Gilles and Félix Guattari. A Thousand Plateaus. Minneapolis: University of Minnesota Press, 1987.
- Floridi, Luciano. The Fourth Revolution. Oxford: Oxford University Press, 2014.
- Galloway, Alexander. Protocol. Cambridge: MIT Press, 2004.
- Gleick, James. The Information: A History, a Theory, a Flood. London: Fourth Estate, 2011.
- Groys, Boris. Art Power. Cambridge: MIT Press, 2008.
- Hallward, Peter. Badiou: A Subject to Truth. Minneapolis: University of Minnesota Press, 2003.
- Hansen, Mark. Bodies in Code. New York: Routledge, 2006.
- Hansen, Mark. New Philosophy for New Media. Cambridge: MIT Press, 2004.
- Haraway, Donna. “A Manifesto for Cyborgs: Science, Technology, and Socialist Feminism in the 1980s,” in The Postmodern Turn: New Perspectives on Social Theory, Edited by Steven Seidman. Cambridge: Cambridge University Press, 1994.
- Harman, Graham. The Quadruple Object. Winchester: Zero Books, 2011.
- Hayles, N. Katherine. How We Think: Digital Media and Contemporary Technogenesis. Chicago: University of Chicago Press, 2012.
- Heidegger, Martin. Being and Time. Trans. J. Stambaugh. Albany: SUNY, 1996.
- Heidegger, Martin. “Letter on Humanism,” Global Religious Vision, Vol. 1, No. 1 (2000), 83–109.
- Heidegger, Martin. Poetry, Language, Thought. Trans and Intro. A. Hofstadter. New York: Harper and Row, 1971.
- Kittler, Friedrich. Gramophone, Film, Typewriter. Stanford: Stanford University Press, 1999.
- Latour, Bruno. Reassembling the Social: An introduction to Actor Network Theory. Oxford: Oxford University Press, 2007.
- Mackenzie, Adrian. “More Parts than Elements: How Databases Multiply.” Society and Space, Vol. 30 (2012), 335–350.
- Mackenzie, Adrian. Transductions: Bodies and Machines at Speed. London and New York: Continuum, 2002.
- Manovich, Lev. “The Practice of Everyday (Media) Life: From Mass Consumption to Mass Cultural Production?” Critical Inquiry, Vol. 35, No. 2 (Winter 2009), 319–331.
- Manovich, Lev. Software Takes Command. London and New York: Bloomsbury, 2013.
- McLuhan, Marshall and Quentin Fiore. The Medium is the Massage: An Inventory of Effects. Corte Madera: Gingko Press, 2001.
- Meillassoux, Quentin. After Finitude: An Essay on the Necessity of Contingency. Trans. R. Brassier. London and New York: Continuum, 2009.
- Middelburg, C.A. “A Survey of Paraconsistent Logics.” Available at https://arxiv.org/pdf/1103.4324, accessed 18 November 2014.
- Munster, Anna. Materializing New Media: Embodiment in Information Aesthetics. Hanover: Dartmouth College Press, 2006.
- Munster, Anna. An Aesthesia of Networks. Cambridge: MIT Press, 2013.
- Nash, Adam. “Affect and the Medium of Digital Data,” Fibreculture Journal, issue 21, 2012.
- Norris, Christopher. Being and Event: A Reader’s Guide. London and New York: Continuum, 2009.
- Oberman, Lindsay M, Pineda, Jaime A and Vilayanur S. Ramachandran. “The human mirror neuron system: A link between action observation and social skills,” Social Cognitive & Affective Neuroscience, Vol. 2 (2007), Issue 1, 62–66.
- Parikka, Jussi. What is Media Archaeology? Cambridge: Polity, 2012.
- Parisi, Luciana. Abstract Sex: Philosophy, Bio-technology and the Mutations of Desire. London: Continuum, 2004.
- Priest, Graham. Beyond the Limits of Thought. Cambridge: Cambridge University Press, 1995.
- Priest, Graham. In Contradiction: A Study of the Transconsistent. Dordrecht: Martinus Nijhoff, 1987.
- Priest, Graham, et al., eds. The Law of Non-Contradiction: New Philosophical Essays. Oxford: Clarendon, 2006.
- Ramachandran, Vilayanur S. and Eric Lewin Altschuler. “A Simple Method to Stand Outside Oneself,” Perception, Vol. 36 (2007), 632–634.
- Ramachandran, Vilayanur S. and Eric Lewin Altschuler. “The Use of Visual Feedback, in Particular Mirror Visual Feedback, in Restoring Brain Function,” Brain, No. 132 (2009), 1693–1710.
- Sha, Xin Wei. Poiesis and Enchantment in Topological Matter. Cambridge: MIT Press. 2013.
- Shannon, Claude. “A Symbolic Analysis of Relay and Switching Circuits.” Master’s Thesis, MIT, 1937.
- Simondon, Gilbert. “The Genesis of the Individual” in Incorporations. New York: Zone, 1992.
- Stiegler, Bernard. Technics and Time, 1: The Fault of Epimetheus. Stanford: Stanford University Press, 1998.
- Varela, Francisco. “The Reenchantment of the Concrete,” in Incorporations. New York: Zone, 1992.
- Virno, Paolo. “Natural-Historical Diagrams: The ‘New Global’ Movement and the Biological Invariant” in L. Chiesa and A. Toscano, eds. The Italian Difference: Between Nihilism and Biopolitics. Melbourne: re.press, 2009.