Coming Soon: Post-Cinema, edited by Shane Denson and Julia Leyda

Post-Cinema: Theorizing 21st-Century Film is the working title of a collection of essays that Julia Leyda and I have proposed to REFRAME Books, a new branch of Catherine Grant’s open-access publishing ventures (you may know Catherine Grant, of Film Studies at the University of Sussex, from her excellent blog Film Studies For Free; REFRAME itself includes the innovative journal Sequence and a variety of other great publications and platforms).

Our proposal has been well received by REFRAME’s reviewers, so if all goes well (i.e. pending review of the completed chapters) the collection should be appearing sometime in the near future. We are particularly excited to be working with REFRAME on this project, as this means that the book will appear in a variety of open-access formats (PDF, epub, Mobi), free of charge and freely distributable! Ours will be among the first full-length edited collections to appear with REFRAME, whose publications are sure to make waves in scholarly publishing in the coming years. We are very proud to be in on the ground floor!

While there is still quite a bit of work ahead of us on this project, and though not all of the details have been finalized yet, we couldn’t wait to announce the collection; we are very excited about the group of contributors we have assembled (more info soon), and we are confident that the volume will make an important contribution to the still emerging discussion of post-cinema.

According to an anonymous reviewer for REFRAME:

“The proposed collection promises to be a landmark publication by bringing together some of the most important critical essays that have discussed recent developments in film and media cultures and a number of original essays that develop in innovative ways the perspectives and provocations of those earlier interventions.”

We will do our best to live up to these high expectations, and we will be sure to provide further details about the project in due time!

News and Reviews: Transnational Perspectives on Graphic Narratives

Up to now, Transnational Perspectives on Graphic Narratives (which I co-edited with Christina Meyer and Daniel Stein) has only been available in a prohibitively expensive hardback edition, but luckily that’s about to change: a much more affordable paperback is set to appear in September, and it is now available for pre-order on amazon.com (here), amazon.ca (here), amazon.co.uk (here), and amazon.de (here). If you can’t wait and you’re OK with reading from a screen, there’s also a Kindle edition available for a couple of dollars/pounds/euros/etc. less.

Recently, a brief review of the book appeared in Journal of Graphic Novels and Comics 5.2 (2014); according to the reviewer, Ralf Kauranen, “Transnational Perspectives on Graphic Narratives offers a wealth of concepts and perspectives for the study of the transnational in comics research … [and] signals the arrival of the ‘transnational turn’ in comics studies.”

And in case you missed it, you might want to check out the interview that Michael Chaney (professor at Dartmouth and contributor to the volume) conducted with Christina, Daniel, and me about the book and our experiences and interests in comics: An Interview with the Editors of Transnational Perspectives on Graphic Narratives.

Postnaturalism, with a Foreword by Mark B. N. Hansen: Forthcoming 2014

Having hinted at it before, I am pleased now to announce officially that my book Postnaturalism: Frankenstein, Film, and the Anthropotechnical Interface will be appearing later this year (around Fall 2014) with the excellent German publisher Transcript, with US distribution through Columbia University Press.

I am also very excited that Mark B. N. Hansen has contributed a wonderful foreword to the book. Here is a blurb-worthy excerpt in which he identifies the philosophical and media-philosophical stakes of the book:

Shane Denson’s Postnaturalism develops [an] ambitious, wide-ranging, and deeply compelling argument concerning the originary operation of media in a way that sketches out a much-needed alternative to destructive developments which, expanding the darker strains of poststructuralist anti-humanism, have pitted the human against the material in some kind of cosmological endgame. Postnaturalism will provide a very powerful and timely addition to the literature on posthuman, cosmological technogenesis. Perhaps more clearly than any other account, it reconciles the irreducibility of phenomenality and the imperative to move beyond anthropocentrism as we seek to fathom the postnatural techno-material “revolutions” that have repeatedly remade – and that will no doubt continue to remake – the environments from which we emerge and to which “we” belong before we become and as a condition of becoming human subjects.

Now, as I put the finishing touches on the manuscript and prepare for it to leave my control — to go forth, monstrously, and (who knows?) prosper — I can only hope that the book will live up to Hansen’s estimation of it and, above all, that it will make a worthy contribution to the debates over nonhuman agency and human-technological co-evolution that have recently defined some of the more exciting strands in media theory, science studies, and speculative realism, among others.

Post-Cinematic: Video Essays

For their final projects in my 21st-century film class, three of my students chose to make video essays, which they have now made available on a blog that they set up especially for this purpose. Over at 21stcenturycinema.wordpress.com, you will find Jesko Thiel’s exploration of transmedia storytelling, Christopher Schramm’s analysis of editing techniques in videogame-based “fragmovies,” and Andreas Merokis’s look at violence as narrative and/or spectacle in contemporary cinema. Take a look and leave them a comment!

Post-Cinema / Post-Phenomenology

Following my talk last week at the Texas State Philosophy Symposium, details have now been finalized for another talk at Texas State: this time in the context of the Philosophy Department’s Dialogue Series, where I’ll be talking about post-cinema (i.e. post-photographic moving image media such as video and various digital formats) and what I’ve been arguing is an essentially post-phenomenological system of mediation (see, for example, my talk from the 2013 SCMS conference or these related musings). For anyone who happens to be in the area, the talk will take place on Monday, April 14, 2014 at 12:30 pm (in Derrick Hall 111). UPDATE: The time has been changed to 10:00 am.

Philosophy of Science De-Naturalized: Notes towards a Postnatural Philosophy of Media (full text)

As I recently announced, I was invited to give the keynote address at the 17th annual Texas State University Philosophy Symposium. Here, now, is the full text of my talk:

Philosophy of Science De-Naturalized: Notes towards a Postnatural Philosophy of Media

Shane Denson

The title of my talk contains several oddities (and perhaps not a few extravagances), so I’ll start by looking at these one by one. First (or last) of all, “philosophy of media” is likely to sound unusual in an American context, but it denotes an emerging field of inquiry in Europe, where a small handful of people have started referring to themselves as philosophers of media, and where there is even a limited amount of institutional recognition of such appellations. In Germany, for example, Lorenz Engell has held the chair of media philosophy at the Bauhaus University in Weimar since 2001. He lists as one of his research interests “film and television as philosophical apparatuses and agencies” – which, whatever that might mean, clearly signals something very different from anything that might conventionally be treated under the heading of “media studies” in the US. On this European model, media philosophy is related to the more familiar “philosophy of film,” but it typically broadens the scope of what might be thought of as media (following provocations from thinkers like Niklas Luhmann, who treated everything from film and television to money, acoustics, meaning, art, time, and space as media). More to the point, media philosophy aims to think more generally about media as a philosophical topic, and not as mere carriers for philosophical themes and representations – which means going beyond empirical determinations of media and beyond concentrations on media “contents” in order to think about ontological and epistemological issues raised by media themselves. Often, these discussions channel the philosophy of science and of technology, and this strategy will indeed build the bridge in my own talk between the predominantly European idea of “media philosophy” and the context of Anglo-American philosophy.

OK, but if the idea of a philosophy of media isn’t weird enough, I’ve added this weird epithet: “postnatural.” The meaning of this term is really the crux of my talk, but I’m only going to offer a few “notes towards” a postnatural theory, as it’s also the crux of a big, unwieldy book that I have coming out later this year, in which I devote some 400 pages to explaining and exploring the idea of postnaturalism. As a first approach, though, I can describe the general trajectory through a series of three heuristic (if oversimplifying) slogans.

First, in response to debates over the alleged postmodernity of (Western) societies at the end of the twentieth century, French sociologist and science studies pioneer Bruno Latour, most famous for his association with so-called actor-network theory, claimed in his 1991 book of the same title that “We have never been modern.” What he meant, centrally, was that the division of nature and culture, nonhuman and human, that had structured the idea of modernity (and of scientific progress), could not only be seen crumbling in contemporary phenomena such as global warming and biotechnology – humanly created phenomena that become forces of nature in their own right – but that the division was in fact an illusion all along. We have never been modern, accordingly, because modern scientific instruments like the air pump, for example, were simultaneously natural, social, and discursive phenomena. The idea of modernity, according to Latour, depends upon acts of purification that reinforce the nature/culture divide, but an array of hybrids constantly mix these realms. In terms of a philosophy of media, one of the most important conceptual contributions made by Latour in this context is the distinction between “intermediaries” and “mediators.” The former are seen as neutral carriers of information and intentionalities: instruments that expand the cognitive and practical reach of humans in the natural world while leaving the essence of the human untouched. Mediators, on the other hand, are seen to decenter subjectivities and to unsettle the human/nonhuman divide itself as they participate in an uncertain negotiation of these boundaries.

The NRA, with their slogan “guns don’t kill people, people kill people,” would have us believe that handguns are mere intermediaries, neutral tools for good or evil; Latour, on the other hand, argues that the handgun, as a non-neutral mediator, transforms the very agency of the human who wields it. That person takes up a very different sort of comportment towards the world, and the transformation is at once social, discursive, phenomenological, and material in nature.

With Donna Haraway, we could say that the human + handgun configuration describes something on the order of a cyborg, neither purely human nor nonhuman. And Haraway, building on Latour’s “we have never been modern,” ups the ante and provides us with the second slogan: “We have never been human.” In other words, it’s not just in the age of prosthetics, implants, biotech, and “smart” computational devices that the integrity of the human breaks down, but already at the proverbial dawn of humankind – for the human has co-evolved with other organisms (like the dog, who domesticated the human just as much as the other way around). From an ecological as much as an ideological perspective, the human fails to describe anything like a stable, well-defined, or self-sufficient category.

Now the third slogan, which is my own, doesn’t so much try to outdo Latour and Haraway as to refocus some of the themes that are inherent in these discussions. Postnaturalism, in a nutshell, is the idea not that we are now living beyond nature, whatever that might mean, but that “we have never been natural” (and neither has nature, for that matter). Human and nonhuman, natural and unnatural agencies are products of mediations and symbioses from the very start, I contend. In order to argue for these claims I take a broadly ecological view and focus not on discrete individuals but on what I call the anthropotechnical interface (the phenomenal and sub-phenomenal realm of mediation between human and technical agencies, where each impinges upon and defines the other in a broad space or ecology of material interaction). This view, which I develop at length in my book, allows us to see media not only as empirical objects, but as infra-empirical constraints and enablers of agency such that media may be described, following Mark Hansen, as the “environment for life” itself. Accordingly, media-technical innovation translates into ecological change, transforming the parameters of life in a way that outstrips our ability to think about or capture such change cognitively – for at stake in such change is the very infrastructural basis of cognition and subjective being. So postnaturalism, as a philosophy of media and mediation, tries to think about the conditions of anthropotechnical evolution, conceived as the process that links transformations in the realm of concrete, apparatic media (such as film and TV) with more global transformations at a quasi-transcendental level. Operating on both empirical and infra-empirical levels, media might be seen, on this view, as something like articulators of the phenomenal-noumenal interface itself.

So the more I unpack this thing, the weirder it gets, right? Well, let me approach it from a different angle. Here’s where the first part of my title comes into play: “Philosophy of Science De-Naturalized.” Now, I mentioned before that postnaturalism does not postulate that we are living “after” nature; what I want to emphasize now is that it also remains largely continuous with naturalism, conceived broadly as the idea that the cosmos is governed by material principles which are the object, in turn, of natural science. And, more to the point, the first step in the derivation of a properly postnatural theory, which never breaks with the idea of a materially evolving nature, is to work through a naturalized epistemology, in the sense famously articulated by Willard V. O. Quine, but to locate within it the problematic role of technological mediation. By proceeding in this manner, I want to avoid the impression that a postnatural theory is based on a merely discursive “deconstruction” of nature as a concept. Against the general thrust of broadly postmodernist philosophies, which might show that our ideas of nature and its opposites are incoherent, mine is meant to be a thoroughly materialist account of mediation as a transformative force. So the “Philosophy of Science De-Naturalized,” as I put it here, marks a particular trajectory that takes off from what Ronald Giere has called “Philosophy of Science Naturalized” and works its way towards a properly postnatural philosophy of media.

Giere’s naturalized philosophy of science is of interest to me because it aims to coordinate evolutionary naturalism (in the sense of Darwin) with revolutionary science (in the sense of Thomas Kuhn). In other words, it aims to reconcile the materialism of naturalized epistemology with the possibility of radical transformation, which Kuhn sees taking place with scientific paradigm shifts, and which I want to attribute to media-technical changes. Taking empirical science as its model, and taking it seriously as an engagement with a mind-independent reality, an “evolutionary epistemology” posits a strong, causal link between the material world and our beliefs about it, seeing knowledge as the product of our biological evolution. Knowledge (and, at the limit, science) is accordingly both instrumental or praxis-oriented and firmly anchored in “the real world.” As a means of survival, it is inherently instrumental, but in order for this instrumentality to be effective – and/or as the simplest explanation of such effectivity – the majority of our beliefs must actually correspond to the reality of which they form part. But, according to Kuhn’s view of paradigm shifts, “after a revolution scientists work in a different world” (Structure of Scientific Revolutions 135). This implies a strong incommensurability thesis that, according to critics like Donald Davidson, falls into the trap of idealism, along with its attendant consequences; i.e. if paradigms structure our experience, revolution implies radical relativism or else skepticism. So how can revolutionary transformation be squared with the evolutionary perspective?

Convinced that it contains important cues for a theory of media qua anthropotechnical interfacing, I would like to look at Giere’s answer in some detail. Asserting that “[h]uman perceptual and other cognitive capacities have evolved along with human bodies” (384), Giere’s is a starkly biology-based naturalism. Evolutionary theory posits mind-independent matter as the source of a matter-dependent mind, and unless epistemologists follow suit, according to Giere, they remain open to global arguments from theory underdetermination and phenomenal equivalence: since the world would appear the same to us whether it were really made of matter or of mind-stuff, how do we know that idealism is not correct? And because idealism contradicts the materialist bias of physical science, how do we know that scientific knowledge is sound? According to Giere, we can confidently ignore these questions once the philosophy of science has itself opted for a scientific worldview. Of course, the skeptic will counter that naturalism’s methodologically self-reflexive relation to empirical science renders its argumentation circular at root, but Giere turns the tables on skeptical challenges, arguing that they are “equally question-begging” (385). Given the compelling explanatory power and track record of modern science and evolutionary biology in particular, it is merely a feigned doubt that would question the thesis that “our capacities for operating in the world are highly adapted to that world” (385); knowledge of the world is necessary for the survival of complex biological organisms such as we are. But because this is essentially a transcendental argument, it does not break the circle in which the skeptic sees the naturalist moving; instead, it asserts that circularity is an inescapable consequence of our place in nature. In large part, this is because “we possess built-in mechanisms for quite direct interaction with aspects of our environment. The operations of these mechanisms largely bypass our conscious experience and linguistic or conceptual abilities” (385).

So much for the evolutionary perspective, but where does revolutionary science fit into the picture? To answer this question, Giere turns to the case of the geophysical revolution of the 1960s, when a long established model of the earth as a once much warmer body that had cooled and contracted, leaving the oceans and continents more or less fixed in their present positions, was rapidly overturned by the continental drift model that set the stage for the now prevalent plate tectonics theory (391-94). The matching coastlines of Africa and South America had long suggested the possibility of movement, and drift models had been developed in the early twentieth century but were left, by and large, unpursued; it was not just academic protectionism that preserved the old model but a lack of hard evidence capable of challenging accepted wisdom – accepted because it “worked” well enough to explain a large range of phenomena.

The discovery in the 1950s of north-south ocean ridges suggested, however, a plausible mechanism for continental drift: if the ridges were formed, as Harry Hess suggested, by volcanism, then “sea floor spreading” should be the result, and the continents would be gradually pushed apart by its action. The discovery, also in the 1950s, of large-scale magnetic field reversals provided the model with empirically testable consequences (the Vine-Matthews-Morley hypothesis): if the field reversals were indeed global and if the sea floor was spreading, then irregularly patterned stripes running parallel to the ridges should match the patterns observed in geological formations on land. Until this prediction was corroborated, there was still little impetus to overthrow the dominant theory, but magnetic soundings of the Pacific-Antarctic Ridge in 1966, along with sea-floor core samples, revealed the expected polarity patterns and led, within the space of a year, to a near complete acceptance of drift hypotheses among earth scientists.

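To make the logic of this prediction concrete, here is a minimal illustrative sketch in Python (my addition, not part of Giere’s account): assuming a hypothetical half-spreading rate and rough durations for a few geomagnetic polarity intervals, it computes the stripe widths that sea-floor spreading would be expected to produce on either side of a ridge. All numerical values are approximate placeholders, chosen only to show that the prediction is quantitative and therefore testable against magnetic soundings.

    # Illustrative sketch (hypothetical values): if the sea floor spreads
    # symmetrically away from a ridge while the geomagnetic field reverses,
    # each polarity interval should be recorded as a stripe of predictable width.

    half_spreading_rate_cm_per_yr = 3.0  # assumed half-rate; real ridges vary roughly between 1 and 10 cm/yr

    # Approximate durations (in millions of years) of a few polarity intervals,
    # used here only for illustration.
    polarity_intervals_myr = [
        ("normal interval", 0.78),
        ("reversed interval", 1.00),
        ("short normal event", 0.07),
    ]

    for name, duration_myr in polarity_intervals_myr:
        # width = spreading rate * duration; convert cm (cm/yr * yr) to km
        width_km = half_spreading_rate_cm_per_yr * duration_myr * 1e6 / 1e5
        print(f"{name}: expected stripe width of about {width_km:.1f} km on each side of the ridge")

On these rough assumptions, polarity intervals translate into stripes ranging from a few kilometers to a few tens of kilometers in width, a pattern that the instruments of the 1960s could either confirm or fail to find.
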
According to Giere, naturalism can avoid idealistic talk of researchers living “in different worlds” and explain the sudden revolution in geology by appealing only to a few very plausible assumptions about human psychology and social interaction – assumptions that are fully compatible with physicalism. These concern what he calls the “payoff matrix” for accepting one of the competing theories (393). Abandoning a pet theory is seldom satisfying, and the rejection of a widely held model is likely to upset many researchers, revealing their previous work as no longer relevant. Resistance to change is all too easily explained. However, humans also take satisfaction in being right, and scientists hope to be objectively right about those aspects of the world they investigate. This interest, as Giere points out, does not have to be considered “an intrinsic positive value” among scientists, for it is tempered by psychosocial considerations (393) such as the fear of being ostracized and the promise of rewards. The geo-theoretical options became clear – or emerged as vital rather than merely logical alternatives – with the articulation of a drift model with clearly testable consequences. We may surmise that researchers began weighing their options at this time, though it is not necessary to consider this a transparently conscious act of deliberation. What was essential was the wide agreement among researchers that the predictions regarding magnetic profiles, if verified, would be extremely difficult to square with a static earth model and compellingly simple to explain if drift really occurred. Because researchers shared this basic assumption, the choice was easy when the relevant data came in (394).

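As a rough illustration of what such a “payoff matrix” might look like, here is a small Python sketch (my own toy reconstruction, not Giere’s formalization; it adds a simple expected-value comparison that goes beyond what Giere spells out). The numerical payoffs are arbitrary placeholders standing in for the social and psychological rewards and costs described above.

    # Toy "payoff matrix" for the geophysical revolution (illustrative only;
    # the numbers are arbitrary stand-ins for relative satisfaction and cost).

    payoffs = {
        # (choice, actual state of the world): payoff to the researcher
        ("accept drift", "drift is real"): 10,          # being objectively right
        ("accept drift", "earth is static"): -5,        # abandoned a workable theory for nothing
        ("retain static model", "drift is real"): -10,  # previous work rendered irrelevant
        ("retain static model", "earth is static"): 5,  # vindicated, but no great gain
    }

    def expected_payoff(choice, credence_in_drift):
        """Expected payoff of a choice, given the researcher's credence that drift is real."""
        return (credence_in_drift * payoffs[(choice, "drift is real")]
                + (1 - credence_in_drift) * payoffs[(choice, "earth is static")])

    # Before the magnetic-profile evidence, credence in drift was low and retaining
    # the static model looked reasonable; once the predicted polarity patterns were
    # confirmed, credence rose sharply and the choice became easy.
    for credence in (0.2, 0.9):
        best = max(("accept drift", "retain static model"),
                   key=lambda choice: expected_payoff(choice, credence))
        print(f"credence in drift = {credence}: best option is to {best}")

The sketch is only meant to show how widely shared assumptions about what the evidence would mean, once the relevant data came in, could make the collective shift both rapid and rational without any appeal to researchers inhabiting “different worlds.”
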
But the really interesting thing about this case, in my opinion, is the central role that technology played in structuring theoretical options and forcing a decision, which Giere notes but only in passing. The developing model first became truly relevant through the availability of technologies capable of confirming its predictions: technologies for conducting magnetic soundings of the ocean floor and for retrieving core samples from the deep. Indeed, the Vine-Matthews-Morley hypothesis depended on technology not only for its verification, but for its initial formulation as well: ocean ridges could not have been discovered without instruments capable of sounding the ocean floor, and the discovery of magnetic field reversals depended on a similarly advanced technological infrastructure. A reliance on mediating technologies is central to the practice of science, and Giere suggests that an appreciation of this fact helps distinguish naturalism from “methodological foundationism” or the notion that justified beliefs must recur ultimately to a firm basis in immediate experience (394). His account of the geological paradigm shift therefore “assumes agreement that the technology for measuring magnetic profiles is reliable. The Duhem-Quine problem [i.e. the problem that it is logically possible to salvage empirically disconfirmed theories by ad hoc augmentation] is set aside by the fact that one can build, or often purchase commercially, the relevant measuring technology. The background knowledge (or auxiliary hypotheses) are embodied in proven technology” (394). In other words, the actual practice of science (or technoscience) does not require ultimate justificational grounding, and the agreement on technological reliability ensures, according to Giere and contra Kuhn, that disagreeing parties still operate in the same world.

But while I agree that Giere’s description of the way technology is implemented by scientists is a plausible account of actual practice and its underlying assumptions, I question his extrapolation from the practical to the theoretical plane. With regard to technology, I contend, the circle problem resurfaces with a vengeance. As posed by the skeptic, Giere is right, in my opinion, to reject the circle argument as invalidating naturalism’s methodologically self-reflexive application of scientific theories to the theory of science. Our evolutionary history, I agree, genuinely militates against the skeptic’s requirement that we be able to provide grounds for all our beliefs; our survival depends upon an embodied knowledge that is presupposed by, and therefore not wholly explicatable to, our conscious selves. But as extensions of embodiment, the workings of our technologies are equally opaque to subjective experience, even – or especially – when they seem perfectly transparent channels of contact with the world. Indeed, Giere seems to recognize this when he says that “background knowledge (or auxiliary hypotheses) are embodied by proven technology” (394, emphasis added). In other words, scientists invest technology with a range of assumptions concerning “reliability” or, more generally, about the relations of a technological infrastructure to the natural world; their agreement on these assumptions is the enabling condition for technology to yield clear-cut decision-making consequences. Appearing neutral to all parties involved, the technology is in fact loaded, subordinated to human aims as a tool. Some such subordinating process seems, from a naturalistic perspective, unavoidable for embodied humans. However, agreement on technological utility – on both whether and how a technology is useful – is not guaranteed in every case. Moreover, it is not just a set of cognitive, theoretical assumptions (“auxiliary hypotheses”) with which scientists entrust technologies, but also aspects of their pre-theoretically embodied, sensorimotor competencies. Especially at this level, mediating technologies are open to what Don Ihde calls an experiential “multistability” – capable, that is, of instantiating to differently situated subjectivities radically divergent ways of relating to the world. But it is precisely the consensual stability of technologies that is the key to Giere’s contextualist rebuttal of “foundationism.”

Downplaying multistability is the condition for a general avoidance of the circle argument, for a pragmatic avoidance of idealism and/or skepticism. This, I believe, is most certainly the way things work in actual practice; (psycho)social-institutional pressures work to ensure consensus on technological utility. But does naturalism, self-reflexively endorsing science as the basis of its own theorization, then necessarily reproduce these pressures? Feminists in particular may protest on these grounds that the “nature” in naturalism in fact encodes the white male perspective historically privileged by science because embodied by the majority of practicing scientists. What I am suggesting is that the tacit, largely unquestioned processes by which technological multistability is tamed in practice form a locus for the inscription of social norms directly into the physical world; for in making technologies the material bearers of consensual values (whether political, epistemic, psychological, or even the animalistically basic preferability of pleasure over pain) scientific practice encourages certain modes of embodied relations to the world – not just psychic but material relations themselves embodied in technologies. It goes without saying that this can only occur at the expense of other modes of being-embodied.

More generally stated, the real problem with naturalism’s self-reflexivity is not that it fails to take skeptical challenges seriously or that it provides a false picture of actual scientific practice, but that in extrapolating from practice it locks certain assumptions about technological reliability into theory, embracing them as its own. While it is contextually – indeed physically – necessary that assumptions be made, and that they be embodied or exteriorized in technologies, the particular assumptions are contingent and non-neutral. This may be seen as a political problem, which it is, but it is also more than that: it is an ontological problem of the instability of nature itself – not just of nature as a construct but of the material co-constitution of real, flesh-and-blood organisms and their environments. Once we enter the naturalist circle – and I believe we have good reason to do so – we accept that evolution dislodges the primacy of place traditionally accorded human beings. At the same time, we accept that the technologies with which science has demonstrated the non-essentiality of human/animal boundaries are reliable, that they show us what reality is really, objectively like. This step depends, however, on a bracketing of technological multistability. If we question this bracketing, as I do, we seem to lose our footing in material objectivity. Nevertheless convinced that it would be wrong to concede defeat to the skeptic, we point out that adaptive knowledge’s circularity or contextualist holism is a necessary requirement of human survival, that it follows directly from embodiment and the fact that the underlying biological mechanisms “largely bypass our conscious experience and linguistic or conceptual abilities” (Giere 385). But if we admit that technological multistability really obtains as a fact of our phenomenal relations to the world, this holism seems to lead us back precisely to Kuhn’s idealist suggestion that researchers (or humans generally) may occupy incommensurably “different worlds.” If we don’t want to abandon materialism, then we have to find an interpretation of this idea that is compatible with physicalism.

Indeed, it is the great merit of naturalism that it provides us with the means for doing so; however, it is the great failure of the theory that it neglects these resources. The failure, which consists in reproducing science’s subordination of technology to thought – in fact compounding the reduction, as contextually practiced, by subordinating it to an overarching (i.e. supra-contextual) theory of science – is truly necessary for naturalism, for to rectify its oversight of multistability is to admit the breakdown of a continuous nature itself. To consistently acknowledge the indeterminacy of human-technology-world relations and simultaneously maintain materialism requires, to begin with, that we extend Giere’s insight about biological mechanisms to specifically technological mechanisms of embodied relation to the world: they too “bypass our conscious experience and linguistic or conceptual abilities.” If we take the implications seriously, this means that technologies resist full conceptualization and are therefore potentially non-compliant with human (or scientific) aims; reliance on technology is not categorically different in kind from reliance on our bodies: both ground our practice and knowledge in the material world, but neither is fully recuperable to thought. Extending naturalism in this way means recognizing that not only human/animal but also human/technology distinctions are porous and non-absolute. But whereas naturalism tacitly assumes that the investment of technology with cognitive aims is only “natural” and therefore beyond question, the multistability of non-cognitive investments of corporeal capacities implies that there is more to the idea of “different worlds” than naturalism is willing or able to admit: on a materialistic reading, it is nature itself, and not just human thought or science, that is historically and contextually multiple, non-coherently splintered, and subject to revolutionary change. Serious consideration of technology leads us, that is, to embrace a denatured naturalism, a techno-evolutionary epistemology, and a material rather than social constructivism. This, then, is the basis for a postnatural philosophy of media.

Philosophy of Science De-Naturalized: Notes towards a Postnatural Philosophy of Media

I am very honored to have been invited to deliver the keynote address at the Texas State University Philosophy Department’s annual philosophy symposium on April 4, 2014. Since I studied as an undergraduate at Texas State (which back then was known as Southwest Texas State University, or SWT for short), this will be something of a homecoming for me, and I’m very excited about it!

In fact, one of the first talks I ever delivered was at the 1997 philosophy symposium — the very first year it was held. My talk back then, titled “Skepticism and the Cultural Critical Project,” sought to bridge the divide between, on the one hand, the analytical epistemology and philosophy of science that I was studying under the supervision of Prof. Peter Hutcheson and, on the other hand, the Continental-inspired literary and cultural theory to which I was being exposed by a young assistant professor of English, Mark B. N. Hansen (before he went off to Princeton, then University of Chicago, and now Duke University).

In a way, my effort back then to mediate between these two very different traditions has proved emblematic for my further academic career. For example, my dissertation looked at Frankenstein films as an index for ongoing changes in the human-technological relations that, I contend, continually shape and re-fashion us at a deeply material, pre-subjective, and extra-discursive level of our being. The cultural realm of monster movies was therefore linked to the metaphysical realm of what I call the anthropotechnical interface, and my argument was mounted by way of a lengthy “techno-scientific interlude” in which I revisited many of the topics in Anglo-American epistemology and philosophy of science that I had first thought about as an undergrad in Texas.

Thus, without my knowing it (and it’s really only now becoming clear to me), my talk back in 1997 marked out a trajectory that it seems I’ve been following ever since. And now it feels like a lot of things are coming full circle: A book based upon my dissertation, for which Mark Hansen served as reader, is set to appear later this year (but more on that and a proper announcement later…). In addition, as I announced here recently, I will be moving to North Carolina this summer to commence a 2-year postdoctoral fellowship at Duke, where I will be working closely with Hansen. Now, before that project gets underway, I have the honor to return to the philosophy symposium in San Marcos, Texas and, in a sense, to revisit the place where it all started.

I thought it would be appropriate, therefore, if I delivered a talk that continued along the trajectory I embarked upon there 17 years ago (wow, that makes me feel old…). My talk, titled “Philosophy of Science De-Naturalized: Notes towards a Postnatural Philosophy of Media,” takes a cue from Ronald N. Giere’s “Philosophy of Science Naturalized” — which sought to reconcile Thomas Kuhn’s idea of revolutionary paradigm shifts in the history of science with W. V. O. Quine’s notion of “Epistemology Naturalized,” i.e. a theory of knowledge based more in the material practice and findings of natural science (especially evolutionary biology) than in the “rational reconstruction” of ideal grounds for justified true belief. As I will show, my own “postnaturalism” — which is ultimately a philosophy of media rather than of knowledge or science — represents not so much a break with such naturalism as a particular manner of thinking through issues of technological mediation that emerge in that context, issues that I then subject to phenomenological scrutiny and ultimately post-phenomenological transformations in order to arrive at a theory of anthropotechnical interfacing and change.

DAAD Postdoctoral Fellowship at Duke University

At long last, I am excited to announce that my application for a 2-year postdoctoral fellowship at Duke University has been approved for funding through the DAAD (German Academic Exchange Service). At Duke, I will be working closely with Mark B. N. Hansen and other scholars of media and culture to develop a media-archaeological perspective on serialization processes in video games and digital media culture more generally. The fellowship, which runs from August 2014 to July 2016, will allow me to conduct archival research in the US that will supplement and expand my work in the project “Digital Seriality” that I co-direct with Andreas Jahn-Sudmann in the context of the DFG Research Unit “Popular Seriality — Aesthetics and Practice.” Needless to say, I am very excited about this, and I will continue to post updates here! More soon…

CFP: Digital Seriality — Special Issue of Eludamos: Journal for Computer Game Culture

I am pleased to announce that my colleague Andreas Jahn-Sudmann and I will be co-editing a special issue of Eludamos: Journal for Computer Game Culture on the topic of “Digital Seriality.” Here, you’ll find the call for papers (alternatively, you can download a PDF version here). Please circulate widely!

Call for Papers: Digital Seriality

Special Issue of Eludamos: Journal for Computer Game Culture (2014)
Edited by Shane Denson & Andreas Jahn-Sudmann

According to German media theorist Jens Schröter, the analog/digital divide is the “key media-historical and media-theoretical distinction of the second half of the twentieth century” (Schröter 2004:9, our translation). And while this assessment is widely accepted as a relatively uncontroversial account of the most significant media transformation in recent history, the task of evaluating the distinction’s inherent epistemological problems is all the more fraught with difficulty (see Hagen 2002, Pias 2003, Schröter 2004). Be that as it may, since the 1990s at the latest, virtually any attempt to address the cultural and material specificity of contemporary media culture has inevitably entailed some sort of (implicit or explicit) evaluation of this key distinction’s historical significance, thus giving rise to characterizations of the analog/digital divide as caesura, upheaval, or even revolution (Glaubitz et al. 2011). Seen through the lens of such theoretical histories, the technical and especially visual media that shaped the nineteenth and twentieth centuries (photography, film, television) typically appear today as the objects of contemporary digitization processes, i.e. as visible manifestations (or remnants) of a historical transition from an analog (or industrial) to a digital era (Freyermuth and Gotto 2013). Conversely, despite its analog pre-history, today’s digital computer has primarily been addressed as the medium of such digitization processes – or, in another famous account, as the end point of media history itself (Kittler 1986).

The case of digital games (as a software medium) is similar to that of the computer as a hardware medium: although the differences and similarities between digital games and older media were widely discussed in the context of the so-called narratology-versus-ludology debate (Eskelinen 2001; Juul 2001; Murray 1997, 2004; Ryan 2006), only marginal attention was paid in these debates to the media-historical significance of the analog/digital distinction itself. Moreover, many game scholars have tended to ontologize the computer game to a certain extent and to treat it as a central form or expression of digital culture, rather than tracing its complex historical emergence and its role in brokering the transition from analog to digital (significant exceptions like Pias 2002 notwithstanding). Other media-historiographical approaches, like Bolter and Grusin’s concept of remediation (1999), allow us to situate the digital game within a more capacious history of popular-technical media, but such accounts relate primarily to the representational rather than the operative level of the game, so that the digital game’s “ergodic” form (Aarseth 1999) remains largely unconsidered.

Against this background, we would like to suggest an alternative angle from which to situate and theorize the digital game as part of a larger media history (and a broader media ecology), an approach that attends to both the representational level of visible surfaces/interfaces and the operative level of code and algorithmic form: Our suggestion is to look at forms and processes of seriality/serialization as they manifest themselves in digital games and gaming cultures, and to focus on these phenomena as a means to understand both the continuities and the discontinuities that mark the transition from analog to digital media forms and our ludic engagements with them. Ultimately, we propose, the computer game simultaneously occupies a place in a long history of popular seriality (which stretches from pre-digital serial literature, film, radio, and television, to contemporary transmedia franchises) while it also instantiates novel forms of a specifically digital type of seriality (cf. Denson and Jahn-Sudmann 2013). By grappling with the formal commensurabilities and differences that characterize digital games’ relations to pre-digital (and non-ludic) forms of medial seriality, we therefore hope to contribute also to a more nuanced account of the historical process (rather than event) of the analog/digital divide’s emergence.

Overall, seriality is a central and multifaceted but largely neglected dimension of popular computer and video games. Seriality is a factor not only in explicitly marked game series (with their sequels, prequels, remakes, and other types of continuation), but also within games themselves (e.g. in their formal-structural constitution as an iterative series of levels, worlds, or missions). Serial forms of variation and repetition also appear in the transmedial relations between games and other media (e.g. expansive serializations of narrative worlds across the media of comics, film, television, and games, etc.). Additionally, we can grasp the relevance of games as a paradigm example of digital seriality when we think of the ways in which the technical conditions of the digital challenge the temporal procedures and developmental logics of the analog era, e.g. because series installments that once appeared successively are now increasingly available for immediate, repeated, and non-linear forms of consumption. And while this media logic of the database (cf. Manovich 2001: 218) can be seen to transform all serial media forms in our current age of digitization and media convergence, a careful study of the interplay between real-time interaction and serialization in digital games promises to shed light on the larger media-aesthetic questions of the transition to a digital media environment. Finally, digital games are not only symptoms and expressions of this transition, but also agents in the larger networks through which it has been navigated and negotiated; serial forms, which inherently track the processes of temporal and historical change as they unfold over time, have been central to this media-cultural undertaking (for similar perspectives on seriality in a variety of media, cf. Beil et al. 2013, Denson and Mayer 2012, Jahn-Sudmann and Kelleter 2012, Kelleter 2012, Mayer 2013).

To better understand the cultural forms and affective dimensions of what we have called digital games’ serial interfacings and the collective serializations of digital gaming cultures (cf. Denson and Jahn-Sudmann 2013), and in order to make sense of the historical and formal relations of seriality to the emergence and negotiation of the analog/digital divide, we seek contributions for a special issue of Eludamos: Journal for Computer Game Culture on all aspects of game-related seriality from a wide variety of perspectives, including media-philosophical, media-archeological, and cultural-theoretical approaches, among others. We are especially interested in papers that address the relations between seriality, temporality, and digitality in their formal and affective dimensions.

Possible topics include, but are not limited to:

  • Seriality as a conceptual framework for studying digital games
  • Methodologies and theoretical frameworks for studying digital seriality
  • The (im)materiality of digital seriality
  • Digital serialities beyond games
  • The production culture of digital seriality
  • Intra-ludic seriality: add-ons, levels, game engines, etc.
  • Inter-ludic seriality: sequels, prequels, remakes
  • Para-ludic seriality: serialities across media boundaries
  • Digital games and the limits of seriality

******************************************************************************

Paper proposals (comprising a 350-500 word abstract, 3-5 bibliographic sources, and a 100-word bio) should be sent via e-mail by March 1, 2014 to the editors:

  • a.sudmann[at]fu-berlin.de
  • shane.denson[at]engsem.uni-hannover.de

Papers will be due July 15, 2014 and will appear in the fall 2014 issue of Eludamos.

*******************************************************************************

References:

Aarseth, Espen. 1999. “Aporia and Epiphany in Doom and The Speaking Clock: The Temporality of Ergodic Art.” In Marie-Laure Ryan, ed. Cyberspace Textuality: Computer Technology and Literary Theory. Bloomington: Indiana University Press, 31–41.

Beil, Benjamin, Lorenz Engell, Jens Schröter, Daniela Wentz, and Herbert Schwaab. 2012. “Die Serie. Einleitung in den Schwerpunkt.” Zeitschrift für Medienwissenschaft 2 (7): 10–16.

Bolter, J. David, and Richard A. Grusin. 1999. Remediation: Understanding New Media. Cambridge, Mass.: MIT Press.

Denson, Shane, and Andreas Jahn-Sudmann. 2013. “Digital Seriality: On the Serial Aesthetics and Practice of Digital Games.” Eludamos. Journal for Computer Game Culture 7 (1): 1-32. http://www.eludamos.org/index.php/eludamos/article/view/vol7no1-1/7-1-1-html.

Denson, Shane, and Ruth Mayer. 2012. “Grenzgänger: Serielle Figuren im Medienwechsel.” In Frank Kelleter, ed. Populäre Serialität: Narration – Evolution – Distinktion. Zum seriellen Erzählen seit dem 19. Jahrhundert. Bielefeld: Transcript, 185-203.

Eskelinen, Markku. 2001. “The Gaming Situation.” Game Studies 1 (1). http://www.gamestudies.org/0101/eskelinen/.

Freyermuth, Gundolf S., and Lisa Gotto, eds. 2012. Bildwerte: Visualität in der digitalen Medienkultur. Bielefeld: Transcript.

Glaubitz, Nicola, Henning Groscurth, Katja Hoffmann, Jörgen Schäfer, Jens Schröter, Gregor Schwering, and Jochen Venus. 2011. Eine Theorie der Medienumbrüche. Vol. 185/186. Massenmedien und Kommunikation. Siegen: Universitätsverlag Siegen.

Hagen, Wolfgang. 2002. “Es gibt kein ‘digitales Bild’: Eine medienepistemologische Anmerkung.” In Lorenz Engell, Bernhard Siegert, and Joseph Vogl, eds. Archiv für Mediengeschichte Vol. 2 – “Licht und Leitung.” München: Wilhelm Fink Verlag, 103–12.

Jahn-Sudmann, Andreas, and Frank Kelleter. 2012. “Die Dynamik serieller Überbietung: Zeitgenössische amerikanische Fernsehserien und das Konzept des Quality TV.” In Frank Kelleter, ed. Populäre Serialität: Narration – Evolution – Distinktion. Zum seriellen Erzählen seit dem 19. Jahrhundert. Bielefeld: Transcript, 205–24.

Juul, Jesper. 2001. “Games Telling Stories? – A Brief Note on Games and Narratives.” Game Studies 1 (1). http://www.gamestudies.org/0101/juul-gts/.

Kelleter, Frank, ed. 2012. Populäre Serialität: Narration – Evolution – Distinktion. Zum seriellen Erzählen seit dem 19. Jahrhundert. Bielefeld: Transcript.

Kittler, Friedrich A. 1986. Grammophon, Film, Typewriter. Berlin: Brinkmann & Bose.

Manovich, Lev. 2001. The Language of New Media. Cambridge, Mass.: MIT Press.

Mayer, Ruth. 2013. Serial Fu Manchu: The Chinese Supervillain and the Spread of Yellow Peril Ideology. Philadelphia: Temple University Press.

Murray, Janet H. 1997. Hamlet on the Holodeck: The Future of Narrative in Cyberspace. Cambridge, Mass.: MIT Press.

Murray, Janet H. 2004. “From Game-Story to Cyberdrama.” In Noah Wardrip-Fruin and Pat Harrigan, eds. First Person: New Media as Story, Performance, and Game. Cambridge, Mass.: MIT Press, 2-10.

Pias, Claus. 2002. Computer Spiel Welten. Zürich, Berlin: Diaphanes.

Pias, Claus. 2003. “Das digitale Bild gibt es nicht. Über das (Nicht-)Wissen der Bilder und die informatische Illusion.” Zeitenblicke 2 (1). http://www.zeitenblicke.de/2003/01/pias/.

Ryan, Marie-Laure. 2006. Avatars of Story. Minneapolis: University of Minnesota Press.

Schröter, Jens. 2004. “Analog/Digital – Opposition oder Kontinuum?” In Jens Schröter and Alexander Böhnke, eds. Analog/Digital – Opposition oder Kontinuum? Beiträge zur Theorie und Geschichte einer Unterscheidung. Bielefeld: Transcript, 7–30.