2009 Annual Conference: Decodings

Full Program With Abstracts and Online Materials

Updated Nov 3, 2009



Session 1 - Thurs 4:30pm - 6pm

Session 1 (A) Bennett
SLSA Creative Writers Read I
Susan Allender-Hagedorn

Robert Martinez.
L'Harapientu - El Farrapientu - The Raggedy Man
Reading a short story that explores the encoding and decoding of memory and - especially - how our reality might be altered if we could select which memories to keep and which to eliminate. Our realities, after all, are composed of memories, sometimes accurate and sometimes not. How we read and interpret our memories is not necessarily the same as reality (just as the genetic code can be read and misread in a variety of ways). As usual, I set my stories in an Asturian context. 

Susan Allender-Hagedorn; Cheryl Wood Ruggiero.
Reading “Caveat Anthem”
A humorous short story dealing with future longevity and genetic theft.

Janine DeBaise.
River birch: Touching sky
Reading creative non-fiction that explores my connection to the landscape of upstate New York, including the woods behind my house, filled with brittle non-native Scotch Pines planted in the 1930s by the CCC; a meromictic lake that I've known since childhood; and my rural front yard, where I plant native river birches as a symbol of growth. My ecofeminist writing meshes scientific information, taken from geology, ecology, or botany, with personal, aesthetic, and spiritual responses to the landscape. I invite my readers to consider the genre of nature writing as a kind of decoding. Is it possible for humans to use language to decode nature? Or is metaphor a code that artists, scientists, and writers impose on nature?

Session 1 (B) Crescent
Science Fact/Fiction I
Chair: Patrick B. Sharp

Luis Arata.
From Science Fiction to Science: Kepler’s Use of Modeling and Simulation in Somnium
This paper gives a brief description of Kepler’s Somnium to show how this precursor of science fiction played out key concepts of physics and astronomy in its narrative. The Somnium constitutes a thought experiment that models and simulates relativity of motion and inertia. In a long body of notes that expand its fictional narrative, Somnium also speculates on gravitation and the influence of the moon on tides. Using Kepler’s work of fiction, the paper discusses how imagination, through the interface of models, is at the heart of science. My work on modeling examines how imagination is formalized across the disciplines to yield effective theories, simulations, designs, architectures, narratives, and works of art. I argue that such modeling further cultivates the imagination to produce constructions that blur the boundaries between the actual and the imagined. Such interaction both filters and shapes our realities.

Patrick B. Sharp.
The "Indispensable Woman": Early Twentieth-Century Representations of Gender, Technology, and Evolution
This paper focuses on the impact of Darwin's concept of "sexual selection" on discourses of national identity in the United States of the early twentieth century. In particular, this project looks at the influence of Darwin's account of sexual dimorphism in humans, which emphasized how male and female bodies had been shaped differently through natural selection. Darwin argued that male bodies were naturally selected to invent and use technology, and therefore had larger brains and more adept hands than women. On the other hand, he argued that female bodies were naturally selected to be beautiful and nurturing in order to fulfill what he saw as their biological role as mothers. This paper explores how this evolutionary formulation of gender circulated in political discourse, popular science journals, and science fiction magazines between 1900 and 1930. Stories such as Philip Francis Nowlan's "Armageddon 2419 A. D.," the original Buck Rogers story, exemplified a pervasive anxiety about women as soldiers and masters of technology. I argue that such stories went to great lengths to make sure that their women soldiers and engineers embodied traditional feminine characteristics and were safely contained within the logics of sexual selection. In this regard, these science fiction stories were consistent with political discourse and evolutionary science as it was distributed through "popular" publishing outlets.

Session 1 (C) Whitman B
Elizabeth Wilson
The paradigm of stimulus and response is vital to the production of scientific knowledge. It constitutes a practical framework for establishing knowledge about a variety of phenomena, particularly within the life sciences. Importantly, stimulus and response also form a theoretical-conceptual framework for interpreting specific biological phenomena, from molecular interactions to neuronal information transfer. As an organizational logic, stimulus/response relies on, and reproduces, a series of perceptual and analytic distinctions (subject/object, interiority/exteriority, action/reaction) that have significant implications for how we read biological phenomena, epistemologically and ontologically, and how the body or subject is characterized. Our panel brings together a number of different disciplinary perspectives on the ideas of stimulus and response, in order to speak generally about these concepts as a paradigm in scientific knowledge production. How do stimulus and response shape the perception of biological events? How do they reinforce or challenge traditional models of objectivity in scientific practice? And how are different conceptualizations of the body/subject constrained or enabled by this logic?

Astrid Schrader.
Reaction, Response, and Responsibility in Experimentation with Toxic Dinoflagellates
“And say the animal responded?” is the title of a lecture and a question Jacques Derrida puts before the entire tradition of Western philosophy, which, according to him, has always said the same: that ‘the animal’ cannot speak and therefore it cannot respond. Response must be distinguished from a reaction. Nobody doubts that nonhuman animals can react to environmental stimuli, instinctively or programmatically according to their ‘genetic program’. A genuine response, however, it is said, is proper only to the human subject; it requires reflexivity, history and memory. In this paper, I explore how such a seemingly fundamental distinction between humanity and animality guides scientific experimentation with toxic marine microorganisms and how it is displaced as soon as the scientific object is granted ‘historicity.’ Drawing on research with toxic dinoflagellates that thrive in polluted coastal water and periodically kill a large number of fish, I discuss how a distinction between reaction and response is reinscribed differently in experiments that seek to provide evidence for their toxicity. In the case of the fish-killing dinoflagellate Pfiesteria piscicida, toxicity is not only environmentally induced but crucially depends on how the boundaries between the microorganisms and their environment are experimentally enacted, that is, how the dinoflagellates’ ‘agencies’ are taken into account. Experimental efforts to locate a source of bioactivity within presumed boundaries of a potentially toxic species fail to solicit a toxic response. If reaction implies the existence of a bounded object before it begins to act, the provocation of a genuine response, I argue, requires taking responsibility for the specific experimental relations enacted, which cannot presuppose that a species (whether microscopic or human) is definable ‘as such’.

Michelle Jamieson.
The Ecological Origin of the Immune Response
Within the discipline of immunology, the immune response is typically conceptualised as a cause and effect relation between two discrete entities: a stimulus and a response (or an antigen and an organism). As a phenomenon, it is interpreted in terms of a linear narrative of infection, in which the physical integrity of an organism is breached by the penetration of a foreign entity or substance. This perception of the immune response as an encounter between pre-existing entities views the complementarity of stimulus and response as an effect of their meeting. However, this model does not explain how organism and antigen come to exist in a relation as different or opposed, and yet biologically correlative and implicated. That is, conventional causal interpretations of stimulus/response pairings cannot account for how a stimulus comes to be physiologically provocative for an organism that is already receptive to this specific provocation. In short, there is no sense that the unique coupling of a stimulus and a response is symptomatic of their larger, ecological entanglement. In order to complicate the view of stimulus/response relations as a confined relation between two existent things, this paper critically considers how an organism’s capacity to respond to a stimulus is triggered or animated in the first instance. Taking allergic reactions as its example, it examines the phenomenon of sensitisation: the becoming sensitive or responsive of the organism to a specific foreign substance. Focusing on Clemens von Pirquet’s theory of allergy and his extensive experiments into the events of sensitisation, I argue for a view of stimulus and response as ontologically implicated phenomena that cannot be meaningfully, or materially, disaggregated from this relation.
Using Pirquet’s work to demonstrate that there can be no immunological body outside its deep ecological contextualisation, I argue that the properties of stimulus and response – which we take to be materially inherent to antigens and organisms – are characteristics of matter that arise only through lived relation.

Lyle Muller.
The Logic of the Receptive Field
Science requires objects of fixed identity to maintain order within its fields of study, often setting objects into a mechanistic model of causation through systematic localization of stimulus and response. While this process works seamlessly in the physical sciences – which produce knowledge of objects as fixed in identity, such as molecules of water or photons of light – in neuroscience, the sharp distinction between stimulus (which is imaged on a screen placed before an experimental animal) and the neural response (illustrated in color maps of neural activity) becomes unclear, both conceptually and technically. What ontological positions are implicit when a neuroscientist describes the brain as re-presenting the world? In this paper, I develop a rigorous analysis of the paradigm of the receptive field – the dominant theoretical framework used to explain neural representation. I connect the reading of the relation between stimulus and response contained within the receptive field paradigm to theories of cognition and, ultimately, to conceptions of selfhood in relation to these scientific data. I argue that receptive fields, current theories of cognition, and conceptions of scientific selfhood exemplify several instances of the same underlying logic, and that a re-conceptualization of both neural representation and the subject’s relation to it are necessary for a critical response to modern neuroscience. Indeed, by analyzing neural representation in light of concepts of representation found in critical theory, it is possible to construct an active theoretical position in relation to modern neuroscience. We must not simply respond to new scientific data in defining new conceptions of the body and selfhood, but rather, we must critically engage ourselves with the process of knowledge production in the life sciences. 
With the current and future deluge of data stemming from technical advances in the neurosciences, this active theoretical position in relation to its data is becoming increasingly vital.

Session 1 (D) Woodruff A
Before Germ Theory: Contagion in Early Modern Europe
Lucinda Cole
This panel offers three different but mutually-illuminating approaches to the puzzling issue of contagion in early modern Europe. Sharon Nell DeWitte reports on forensic evidence from medieval burial sites in France; Lucinda Cole examines seventeenth-century descriptions of rodent infestation, emphasizing the relationships between naturalistic and theologically-driven descriptions of plague; and Robert Markley explores the "interlaced rhetorics of climate and mortality in late seventeenth-century India, particularly on the notoriously unhealthy island of Bombay."

Sharon Nell DeWitte.
Death and Dying in the Middle Ages: The Black Death of 1347-1351
During the Black Death (1347-1351), mass burial grounds were established throughout Europe to accommodate the victims who rapidly overwhelmed local parish cemeteries. One such Black Death cemetery, the East Smithfield cemetery in London, is one of the few excavated cemeteries that has clear documentary and archaeological evidence linking it to the mid-fourteenth-century epidemic. Most, if not all, of the individuals buried in East Smithfield died during the epidemic, and the skeletons excavated at the East Smithfield site thus provide an excellent opportunity to explore questions about the patterns of Black Death mortality and how survivors responded to that mortality. By comparing the East Smithfield Black Death cemetery (n = 491) to a normal (i.e. non-epidemic) skeletal sample (n = 290), I found important similarities between Black Death and normal mortality patterns. Analysis of the risk of death associated with pre-existing health conditions suggests that the Black Death was selective with respect to health, such that individuals in poor health were more likely to die during the epidemic than their healthier peers. This is similar to the pattern of selectivity observed under conditions of normal mortality – i.e. those in poor health are at highest risk of mortality from non-epidemic causes of death. Comparison of the burial patterns in the East Smithfield cemetery to those in normal mortality cemeteries also reveals important similarities; despite the devastating mortality experienced during the Black Death, survivors treated the dead with the same care and respect shown under conditions of normal mortality. The similarities between the Black Death and normal mortality challenge assumptions that catastrophic mortality, such as that associated with the Black Death, and the human response to that mortality are very different from the patterns of and responses to normal mortality.

Lucinda Cole.
Of Mice and Moisture
This paper discusses the relationships among famine, theocentric accounts of pestilence, and recorded rodent invasions in the early modern period. While food shortages were sometimes associated with what Giambattista Della Porta called “armies of myce” which razed crops for miles around, within Galenic theory mice and rats were often perceived as a byproduct of hot, wet air from which, increasingly, both famine and disease were thought to arise. Within this naturalistic strain of thinking about pestilence, rats function as what Bruno Latour calls “quasi-objects,” as warm-blooded competitors for food but also as experimental subjects whose births and deaths were used as historical predictors of climatically-produced disease.

Bob Markley.
“A Putridness in the Air”: Bioregionalism, Climate, and the Etiology of Disease
In an era before modern conceptions of the etiology of infectious diseases, biohazards typically were identified with specific regions, whether in the familiar locales of Northwestern Europe, trading outposts in the tropics, or colonies in the Americas. But in the monsoon regions of South and Southeast Asia, the “medical environmentalism” that dominated understandings of human health came under increasing stress as British merchants and physicians struggled to identify and analyze the conditions that killed westerners with often terrifying frequency. Examining works by John Arbuthnot, John Ovington, Alexander Hamilton, and John Fryer, I explore the interlaced rhetorics of climate and mortality in late seventeenth-century India, particularly on the notoriously unhealthy island of Bombay. The debates about whether the English could acclimate to this alien environment offer a means to analyze the crucial role that climatological thinking played in British perceptions of South Asia.

Reception - Thurs 6-7pm
Mojito Bar/Restaurant

Guest Scholar Session - Thurs 7pm
Woodruff A & B

Creative Evolution: Science and Belief
Wendy Wheeler

Session 2 - Fri 8:30am - 10am

Session 2 (A) Bennett
Military Code
Chair: Robert Blaskiewicz

Jason Ellis.
Decoding the Origins of the Tank and “The Land Ironclads”: Sir Ernest Dunlop Swinton and H. G. Wells
The first popular, and widely cited, fictional account of the military tank is H. G. Wells’ 1903 short story, “The Land Ironclads,” published in the widely circulated Strand Magazine thirteen years before the British tank was unveiled to the world at Flers and Courcelette on 15 September 1916, during the First World War’s Battle of the Somme. Wells, however, was not involved in the actual development of the tank; many historians instead point to Major-General Sir Ernest Dunlop Swinton as the single person most responsible for convincing the British military to design the tank and commit invaluable wartime resources to its development and utilization in the Great War. Interestingly, these two men--Wells and Swinton--carried on a public debate in print and other media over who was most responsible for the invention of the tank, a debate that eventually led to Swinton’s libel suit against Wells. The purpose of this presentation is to highlight their public debate and uncover how the public reacted to the two men’s claims. From this very public argument it will be possible to decode the meaning of such claims to invention, and the early history of Science Fiction, which was in part buttressed by imaginative futurology.

John Freeman.
A Secret Communication System of Variable Frequency Carrier Oscillators, Modulators, Detectors, and Rectifiers for Decrypting the Life of H. K. Markey
Patented in 1941, H. K. Markey’s Secret Communication System for guiding torpedoes to their targets was so complex and ahead of its time that it was not deployed until twenty years later, for coded communication among U.S. warships during the Cuban missile crisis. Two piano rolls set in synchronous motion between the sender and the torpedo would allow the latter to be guided accurately to its target. Any enemy attempting to jam the radio signals would have to decode what pattern among the eighty-eight possible frequencies was guiding the torpedo—an impossible task. As with so many innovations, its initial development as a wartime aid led to many peacetime applications. Indeed, its use of frequency hopping principles for conveying signals ushered in the shift from analog to digital communication in what is now known as spread spectrum technology. In my multimedia presentation, I will employ the mechanism described in the 1941 patent as a means of mixing and sampling the shifting frequencies that defined the life of this inventor and Hollywood screen star otherwise known as Hedy Lamarr.
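The hopping scheme the abstract describes can be sketched very loosely in a few lines of Python. Here a shared pseudo-random seed stands in for the pair of synchronized piano rolls; the function name, seed values, and sequence length are illustrative inventions, not details from the 1941 patent, which used mechanical rolls rather than code:

```python
import random

NUM_FREQUENCIES = 88  # one slot per piano-roll key, as in the patent

def hop_sequence(shared_seed, length):
    """Derive a pseudo-random sequence of frequency slots.

    Sender and receiver derive the identical sequence from the same
    seed, playing the role of the two synchronized piano rolls.
    """
    rng = random.Random(shared_seed)
    return [rng.randrange(NUM_FREQUENCIES) for _ in range(length)]

# Sender and torpedo hold identical "rolls" (the shared seed),
# so their hop schedules stay in lockstep.
sender_hops = hop_sequence(shared_seed=42, length=10)
receiver_hops = hop_sequence(shared_seed=42, length=10)
assert sender_hops == receiver_hops

# A jammer guessing a different seed produces a different schedule;
# by chance only about 1 slot in 88 will coincide at any instant.
jammer_hops = hop_sequence(shared_seed=7, length=10)
matches = sum(s == j for s, j in zip(sender_hops, jammer_hops))
print(f"jammer matched {matches} of 10 hops")
```

The jam-resistance follows directly: without the shared sequence, an adversary would have to guess which of the eighty-eight frequencies carries the signal at each moment.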

Nicholas Knouf.
Mining the Military-Academic-Industrial Complex in a Poetic-Serious Fashion
Firefox extensions or add-ons are an example of free/libre/open source software (FLOSS) art that has a long history within new media, but is now being approached from a more critical perspective, as exemplified by festivals (Piksel), organizations (Medialab-Prado, Sarai), and edited collections (_FLOSS+Art_). Extensions present a relatively low barrier to entry into the development of web- and browser-based artistic projects. While I do not want to discount the level of programming knowledge necessary to build them, they are still based on a libre platform and can be developed within a rather large community of programmers, programmers whose own work is, by default, available for inspection and study. (What I mean here is that all of the source code for an extension is contained within the extension itself, making it easy to learn from the work of others.) This is in marked contrast to previous, contemporaneous, and future strands of work that might have valorized the use of Director, Shockwave, Flash, or Java, the first three being expensive, proprietary, and closed platforms, and the last being an open programming language, but one where the actual source code is often difficult to get to if it is not provided directly by the artist. This partially explains the recent flurry of add-on development both poetic and serious, including pieces that prevent various types of search engine tracking (Track-me-not), replace ads on web pages with art (Add Art), graphically turn sites into their mid-1990s equivalents (Timemachine), and enable web browsing from behind the Chinese firewall (China Channel). My contribution to this development is MAICgregator, an extension that aggregates information about colleges and universities embedded in the military-academic-industrial (MAIC) complex.
It searches government funding databases, private news sources, private press releases, and public information about trustees to try to produce a radical cartography of the modern university via the replacement or overlay of this information on academic websites. One advantage of MAICgregator, and Firefox extensions in particular, is the ability in many places to (at least temporarily) install these plugins on public machines at colleges, universities, libraries, or internet cafes. The ease with which extensions can be installed---and the ability a programmer has to modify pages at will once the extension is installed---makes them an ideal vector for the propagation of radical or alternative perspectives to those that are fixed on the web page itself. This is especially the case within the modern university, where schools carefully control what types of information make it to their front page or internal portals, or where students consume their news in computer-generated chunks via Google News, absent marginalized or alternative voices. Add-ons provide one way to break open this lock on web-based media, combining disparate sources together in a montage that is at once both serious and poetic. MAICgregator and the current batch of Firefox extensions portend a new means of producing, one that potentially gives artists unlimited ability to modify others' webpages---all without the possibility of jail time! This reappropriation of the space of the browser re-opens the consideration of exactly what the "web" is, given its continued atrophy into staid configurations of a mass-media-controlled or capitalist-informed semiotics. And it is finally a performance of what we might call a "poetic austerity": the use of whatever means are available to us---absent the possibility of funding through traditional sources, given their decrease in this time of "crisis" and of focusing on the "essentials"---to respond to power on our own terms and using the means at our disposal.

Session 2 (B) Crescent
What Is at Work in a Work of Digital Literature?
Chair: Marjorie C. (M.D. Coverley) Luesebrink
In keeping with the theme of SLSA 2009, this panel looks at decoding with respect to the practice of electronic fiction and poetry. A finished electronic piece is the end result of various decisions about technology and the coding that accompanies this production. In some cases the reading of a piece partially decodes the assemblage; in other works, the coding structure remains hidden. The members of this panel will look at both phenomena as an aspect of investigating the work of coding/decoding in Digital Literature.

Mark Marino.
Critical Code Studies of Queer Technology
Zach Blas' "Queer Technology" hacks into heteronormative cultural systems by means of a fictional software development kit, delivering criticism in the form of a set of Macintosh computer files called "transCoder." In what he calls a "Queer Programming Anti-Language," Blas has created a declarative pseudo-language to reprocess assumptions about gender and sexuality. In this paper, after first outlining the basic tenets of critical code studies (CCS), I present a CCS close-reading Blas' work as an intervention, as a gesture toward freeing programmers (and citizen/agents) from the dominant protocols/paradigms. Blas' work is about reprogramming the machine even as it develops a new mode of cultural criticism that builds on the work of the codework artists, such as John Caley and Mez.

John Zuern.
Primitive Poetics, Elemental Ethics: Minimal Units of Meaning (and Morals) in Ask Me For the Moon
Produced in Flash, Ask Me for the Moon: Working Nights in Waikiki is at once a chapbook of poems and a philosophical essay, combining animated text and images, scholarly citations, and audio to reflect on the politics of hospitality labor in Hawai‘i. Its goal is to align the principles of efficient ActionScript coding with the restricted syllabic and lexical parameters of the poetry as well as with the ethical demands of the theme. The project engages the question of the minimal unit or “primitive” of semiotic systems (and moral codes) that include non-verbal and temporal elements, a question I approach via the philosophers Peirce, Levinas, and Badiou.

Stephanie Strickland.
Reading Refigured: The Shifting Meaning of Reading in slippingglimpse
slippingglimpse is a born-digital poem, a collaboration between Stephanie Strickland, who wrote and designed it; Cynthia Lawson Jaramillo, who coded it in Flash and co-designed the interface; and Paul Ryan, who contributed the video. They all wished to explore (1) the human relationship with the earth as part of the earth’s evolution; (2) non-hierarchical structures of communication; (3) turbulence as a source of information; (4) the type of information that videotaping and editing capture; (5) shifts from binary to triadic systems (deriving from an interest in Bateson’s cybernetics and Peirce’s semiotics); and (6) the water-flow patterns called chreods in René Thom’s catastrophe theory. The resulting poem involves a refiguring of the reading and translation process.

Marjorie C. (M.D. Coverley) Luesebrink.
The Management of Time and Text
Egypt: The Book of Going Forth by Day is a novella-length fictional narrative about ancient and modern Egypt. In this work, the “manuscripts” or “pages” were envisioned as “endless” – as in the concept of the “infinite page.” The story was created in Director. Director, however, is time-based authoring software, and its basic matrix is the time frame. In order to configure text in time, it is helpful to look more closely at the way that the coding and decoding of time shapes the access to the elements of each “movie” file, how these files are structured, and how these “movies” are translated into the metaphor of the manuscript.

Session 2 (C) Doggett
Spatial Decodings in Theory and Practice
Marcel O'Gorman
This panel explores a variety of public research/creation projects that graft spatial theory onto an urban setting by means of geocaching techniques. Projects range from situationist détournements inspired by Rem Koolhaas to sousveillance tactics that rely on QR codes and biofeedback devices. What these projects achieve is a complex interplay between urban and virtual worlds and between spatial theory and practice.

Jennifer Doyle.
Decoding Eco-Longing in Architectural Design
This paper will explore a collaborative public art project that engages with decoding ecological longing in architectural and digital space. This piece of intervention art entitled “Dream Pharm: Seeding the City, EcoCache 2009” was created to initiate a decoding of green space pinings embodied by the University of Waterloo’s downtown Pharmacy building. The most evident expression of eco-longing is embedded in the sprawling images of flowers, bees and insects that adorn the windows of the space. The term Eco-Cache, in this instance, refers to an adoption of the basic principles of geocaching with minor variations. The Eco-Cache contains seed packets and instructions for planting, as well as directions to go to our blog and join the online community; mirroring and mapping the real world greening taking place through an interventionist seeding of the city. Participating in the project involved the evolution of a digital community, a real world treasure hunt (that functioned to call attention to human intervention in natural seeding processes), a space for planting, and a return to digital space to connect and provide an overarching impression of regional impact, mappings, and the decoded desire for flourishing green space in the former industrial core of Kitchener-Waterloo. The impetus behind this form of art project lies in engaging with manmade environments in a way that calls attention to eco-longings, specifically in digital and architectural space. By applying spatial theory and ecocritical concerns, the project calls attention to the latent desires inherent to the Pharmacy Building’s architecture, and more significantly the way we negotiate urban space through ecotopic longings. In my SLSA presentation, I will combine Kristeva’s discussion of chorography, Derrida’s examination of the pharmakon, and the work of various ecocritical theorists to explore how this project engages with encoding physical space through a deft interweaving of physical and digital sites.

Emily Freeman.
Mirrors, GPS, and a Tautological Scavenger Hunt: A Paranoid Critical Approach to Geo-Caching.
When Derrida and Eisenman set out to design a garden for the Parc de la Villette in Paris, the garden never materialized; Plato’s notion of chora was yet again something that could be read about but never actually seen. I wanted to create a theoretical art project that actually moved from the realm of theory to practice. Influenced by Breton and paranoid critical methodology, I completed a surrealist geocache project that involved mapping (via GPS coordinates) several locations throughout Waterloo, Ontario. Each of these locations had a clue and password that directed participants back to a blog site, where they discovered surrealist blurbs that had little, if any, relationship to the spaces they supposedly were referencing. We wanted to resist the type of logical spatial narratology often found in projects that combine geography, technology, and narrative. The primary goal was to disrupt and decode the participants’ relationship to both geography and story and also physical and digital space. Participants had to go back and forth from computer interface to physical space during the hunt. What could have been a continuous stroll was broken up by the need to return to the computer and receive the next set of coordinates. Ultimately, the task itself was a tautological romp designed to put a few clicks on the pedometer and some dirt under the fingernails. In other words, the project pushed the digital subject back out again—-from the glare of the monitor into the glare of the sunlight. This paper will document my art project and the complications involved in moving between the often incongruous border of physical and digital space.

Karina Graf; Marcel O'Gorman.
Play For Your Life: Designing a Mobile, Embodied, and Socially Active Digital Game
Immersion isn’t necessarily good for you. Unless you’re a long-distance runner, the experience of immersion--or of “flow,” for that matter--almost necessarily entails a sedentary indoor activity. This is especially true in the case of digital immersion, which is most commonly achieved while sitting motionless in front of a screen. What’s more, as stupidly simple as this may sound, immersive digital environments tend to compete with “off-screen” environments, preventing us from engaging mindfully with our physical surroundings. Some teenagers, for example, can find their way through 500 kilometers of mythical digital battlefields, but have no idea that a hiking trail runs right through their backyard. This situation is not ideal for fostering either physical wellbeing or environmental stewardship. With this specific problem in mind, might it be possible to transfer the immersive qualities of digital games into an activity that requires individuals to explore their built environments? In this presentation, we will explore this very possibility by discussing the results of a mobile computing project currently underway in the Critical Media Lab at the University of Waterloo. This project crosses health studies research, geo-informatics, narrative theory, and game design. The game itself combines geocaching, social networking, and biofeedback technologies. Players of this experimental game (which in Phase 1 includes Grade 7 students at a Kitchener public school) will be given a Blackberry equipped with a camera, GPS and accelerometer, linked via bluetooth to a heart rate monitor. During each day of a three-week gaming period, players will be sent coordinates for a specific feature in their local built environment, from a “Neighborhood Watch” sign to a chestnut tree in the nature preserve adjacent to the school. 
Players have to locate the feature, photograph it using the BlackBerry, and upload it to a social networking site, which will then reveal feedback about each feature’s relationship to environmental wellbeing. The first player to complete a “grid” of environmental features is the winner. Players will be given power-ups (early access to new coordinates) based on increases in their activity levels, as measured by their heart rates and spatial movements. Over the course of the gameplaying period, the researchers will track the movement patterns of the players using custom BlackBerry Enterprise Server software. This, and other comparative measures built into the study, will help determine whether the game: a) increased the activity level of players vs. their non-playing peers; b) improved the players’ understanding of environmental issues; and c) influenced the players’ media-related habits over the long term. In short, the research will examine whether this game replaced sedentary screen immersion with immersion in the built environment.
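The power-up rule described in this abstract (early access to new coordinates when a player's activity rises above their own baseline) can be sketched in a few lines. The `ActivitySample` type, field names, and threshold values below are invented for illustration and are not part of the Critical Media Lab's actual software:

```python
from dataclasses import dataclass

@dataclass
class ActivitySample:
    """One play session's activity reading for a player (hypothetical schema)."""
    avg_heart_rate: float   # beats per minute, from the Bluetooth monitor
    distance_km: float      # spatial movement, from the GPS log

def earns_power_up(baseline: ActivitySample, today: ActivitySample,
                   hr_gain: float = 0.05, dist_gain: float = 0.10) -> bool:
    """Award a power-up when heart rate or distance exceeds the player's
    own baseline by the given fractional thresholds."""
    hr_up = today.avg_heart_rate >= baseline.avg_heart_rate * (1 + hr_gain)
    dist_up = today.distance_km >= baseline.distance_km * (1 + dist_gain)
    return hr_up or dist_up
```

Comparing each player against their own baseline, rather than against a fixed target, keeps the incentive reachable for less active players, which fits the study's stated comparison of players with their non-playing peers.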

Bryn Choppick; Devon McDonald.
“Waterloo Watchmen”: Decoding Surveillance Space at the Crossroads of Digital Public Art and Critical Theory
“Waterloo Watchmen” is a collaborative public digital art project by Bryn Choppick, Devon McDonald and Adeel Khamisa. The aim of this project is to highlight the prevalence of surveillance technology on the University of Waterloo campus for our peers, providing a starting point for discussion of disciplinary observation. Our interest lies in how surveillance is tied to spatiality and how those spaces can be complicated through the intervention of art and critical theory. “Waterloo Watchmen” is presented as a blog-style website concentrating on surveillance and counter-surveillance on the University of Waterloo campus. Our collective markets the “Waterloo Watchmen” project as a game that asks students to locate six different spaces marked for counter-surveillance on campus. In each space, a poster employing QR Code technology allows students with “smart cell phones” to scan a barcode and receive a link giving immediate access to a blog post about the specific space they are standing in, thus working to decode the physical/digital divide. The mandate of the “Waterloo Watchmen” project is to avoid dictating how surveillance culture should be perceived; instead, the project offers access to various theoretical perspectives on surveillance, aiming to broaden public understanding of the cultural, social and psychological effects of surveillance, while questioning its ubiquity as a presence in postmodern life. This mandate provided the opportunity to collaborate using distinct theoretical approaches, and as such, Bryn Choppick and Devon McDonald will each present their different critical interpretations of this project. McDonald will first address the project from the perspective of psychoanalytic and feminist theory, focusing on sound in relation to surveillance.
In Discipline and Punish, Foucault states that within the panoptic society the subject "is seen, but he does not see; he is the object of information, never a subject in communication" (200). Although the surveilled subject’s limited visual ability has garnered much academic attention, Foucault’s latter point, that this same subject is also silenced, offers a largely untapped arena for investigation. Focusing on the communicative elements intrinsic to surveillance space reveals a number of key questions: When and how does the visual shift to the vocal? How is the surveilled subject silenced, and how can this silence be “heard”? If the underprivileged subject is silenced, does the privileged observer have a voice? And, if so, what does this voice communicate? McDonald will describe the project as designed both to expose the voice of the surveiller (and decode his message) and to give a voice to the surveilled. McDonald’s intention is to elicit a response from the subject—a voice—that would be added to the clamour of the Internet. He hopes thus to translate the voice of the surveiller and decode the silence of the surveilled subject. This reveals much regarding how both sight and sound function to facilitate surveillance spaces, and will answer the aforementioned questions regarding who is given a voice and why, exposing what a communication-focused study of surveillance technology suggests about its function as both an effective tool for managing anti-normative behaviour and, conversely, its role in the reinforcement of injurious and bigoted social hierarchies. Choppick will then describe how the Situationist tactics of détournement and psychogeography can be applied to surveillance spaces to transform them into locations for critical thinking about surveillance as a larger cultural practice. He will also pursue what functions surveillance and counter-surveillance perform in a society that has learned to constitute reality and pleasure through the mediation of images.
Choppick will then explore the exhibitionistic/voyeuristic dialectic as a cultural phenomenon, thus further suggesting the possibility of deriving pleasure from surveillance. Following the Situationist position that poststructuralism ultimately describes the spectacle’s self-fulfilling prophecy about itself, these theories are drawn on to describe the process of normalizing surveillance, before demonstrating how Situationist tactics can decode such structures. In doing so, Choppick seeks to restore the values of adventure, scandal and play in surveillance society.
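The QR-code mechanic at the heart of the "Waterloo Watchmen" project (scan a poster, receive the blog post for the space you are standing in) amounts to a mapping from encoded URLs to posts. The paths, space names, and post summaries below are invented placeholders, not the project's actual addresses:

```python
from urllib.parse import urlparse

# Hypothetical mapping from QR-encoded URL paths to posts about
# the six marked counter-surveillance spaces.
POSTS = {
    "/spaces/library-entrance": "Post on camera coverage at the library entrance",
    "/spaces/student-centre": "Post on monitored seating in the student centre",
}

def post_for(scanned_url: str) -> str:
    """Resolve the URL a phone decoded from a poster to its blog post."""
    path = urlparse(scanned_url).path
    return POSTS.get(path, "No post found for this space")
```

Because each poster's QR code simply encodes a URL, the "decoding" the project stages is physical first (finding the space) and digital second (the lookup above).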

Session 2 (D) Mitchell
Science Fact/Fiction II
Chair: John Bruni
These panels propose that the most interesting theoretical work that remains to be done lies in the space between science and literature--between fact and fiction. Not only has objective reality given way to the observation of observation (as in systems theory); the limits of scientific truth have themselves come into question. Karen Barad, for instance, argues for an agential realism, in which science and culture are co-constituted. We wish to investigate how both scientific and literary writing are forms of social communication whose meanings are informed by cultural noise and pollution.

Clarissa Ai Ling Lee.
EroScitopia: or a guide on how to code an alternate universe that will legitimize science as free play and where no ideas are disciplinarily implausible
In the move from normal to extraordinary science, in Kuhnian terms, epistemology construction within the theoretical sciences often results from bizarre and occasionally outlandish thought experiments; outlandish because the scientific establishment is excited or shaken by their very notion. I am interested in exploring new epistemological strategies in the construction of scientific theories that would allow me to excavate more productive ways of studying consciousness, particularly scientific consciousness. I argue that the way one can encode a ‘Unified’ theory of science is through the interrogation of human and non-human consciousness in a manner that circumvents disciplinary resistances and stakes. I intend to decode the physical-science and mathematical narratives at work in constructing the Large Hadron Collider as a site of both the real and the imaginary, through the kind of product it is able to generate (the hadronic particle) and the work that goes into interpreting the data stream generated by the LHC’s ability to encode exotic particles within an analyzable locus. I intend to wander through a permutation of scientific universes represented by the LHC with the philosophical and scientific idea-logical underpinnings coded in “The Turing Machine” by Christos H. Papadimitriou, “Radiant Cool” by Dan Lloyd, “Einstein’s Dreams” by Alan P. Lightman, and Karen Barad’s “Meeting the Universe Halfway: Quantum Physics and the Entanglement of Matter and Meaning,” in order to show that in a non-realist paradigmatic approach to constructing scientific theories, fact and fiction are easily blurred, and that the fictive and the factual can be simultaneously encoded in thought experiments that constitute the makeup of the ontogeny of information, to borrow the title of Susan Oyama’s seminal work.

John Bruni.
Popular Science, Evolution, and Global Information Management
This paper examines evolution in the magazines Scientific American and Popular Science Monthly from 1895 to 1910. I look at the cultural pressures that destabilize a pre-formed consensus about the social meanings of evolutionary theories. As I demonstrate, the development of new academic disciplines, such as anthropology, questions how and to what degree evolution can be seen as revealing truths about human development. At the same time, national expansion and immigration compel scientific reports, addressed to a popular audience, about the biological origins of citizenship and about overseas regions as possible sites for economic and political control by the U.S. These reports are shaped by evolutionary ideas about progress and the popular understanding of race, gender, and national belonging. Furthermore, such social reporting feeds into an emerging global system of information management. My larger goal, then, is to address how the concept of popular science is created through the editorial policies and practices of Scientific American and Popular Science Monthly.

Ludo Hellemans.
The Dutch author and biologist Dick Hillenius (1927-1987) deserves to be mentioned in the history of biology in the Netherlands because of his role in promoting public acceptance of evolution and human ethology in the 1960s and 1970s. Hillenius, a herpetologist, was curator of the University of Amsterdam’s zoological collection, housed in Artis, the zoo of Amsterdam. He became well known to the public through his playful essays, newspaper columns and poetry, as well as his Dutch translations of some important books on ethology, for example Konrad Lorenz’s famous book ‘On Aggression’ (‘Das sogenannte Böse’), which examined aggression in animals and humans. Hillenius once described himself as a new variant of the human species, equipped with ‘a biological eye for almost everything’. ‘Almost everything’, however, included not only human (social and intimate) behavior, but also art and culture. The Netherlands, more than other European nations, was known for its deep-rooted religious tradition, which imbued all matters cultural and societal. After the Second World War, however, and especially during the ‘roaring sixties’, the Netherlands went through a far-reaching process of secularization. This was accompanied by what can be called the naturalization of public views on nature, humankind and society. In this context, Hillenius was quite influential in his time: by consistently adopting a biological stance when writing on a vast array of topics of general interest, he efficaciously promoted the secularization of public conceptualizations and opinions in matters cultural and societal.

Session 2 (E) Whitman A
Virtual Geographies I

Mary Sanders Pollock.
Earth Island: An Ecological Trope
In David Quammen's Song of the Dodo (1996), the island microcosm serves as a trope for perils facing the entire planet. Indeed, this trope is pervasive in contemporary science writing. Science and the literary imagination meet in Gerald Durrell’s novel The Mockery Bird (1981). The plot of this ecological fable is familiar: a proposed airfield would destroy an isolated valley on the island of “Zenkali." The heroes discover that the amela tree, the island’s most lucrative export, depends for pollination on a bird-tree-moth association based in the valley. “People tend to think we’re just fussy animal lovers, who want to put animals before humanity,” comments one of the heroes. “That is not the case at all, for the protection of Nature, the protection of the world is the protection of humanity” (217). Fortunately, on this fictional island, connections between human survival, economic stability, and environmental responsibility become clear—the plan is foiled, and everyone lives happily ever after. Literary and scientific works informed by the island trope are essential twenty-first-century reading: they show that economics, human survival, and ecology coincide on a planetary scale.

Leigh Schwartz.
Exploring Our Virtual World
As new virtual worlds join the ranks of existing imaginative environments, such as Disneyland, Las Vegas, or even MUDs, imagination, myth, and fantasy continue to play prominent roles in our lives. And exploration and discovery, which are important historical bases of geographic knowledge and techniques, are also imaginative processes of creating knowledge about unknown spaces and places. However, geographers have largely overlooked the connections between exploration, myth, and fantasy in imaginative environments. And understanding of the cultural geography of online gaming and virtual worlds also remains limited. This research addresses these gaps by investigating exploration and discovery in the imaginative virtual environments of eGenesis’ innovative A Tale in the Desert IV (2008). Using qualitative research methods, this study specifically focuses on the interactions of player exploration, scientific discovery, and code in the collaborative representations of this imaginative virtual environment.

Amy Clary.
Accidental Tourists: Invasive Species, Ecotourism, and the Desire for Wilderness
The debate over the treatment of wild land in the U. S. has until recently been dominated by two opposing possibilities: development or preservation. The growing popularity of ecotourism and sustainable development projects has complicated this distinction, suggesting that it may be possible to simultaneously preserve wild landscapes and develop them for tourism. The possibility of such a compromise highlights the importance of understanding Americans’ desire for wild lands. The desire that is satisfied by ecotourism and sustainable development projects is not simply a longing to know that wild lands exist, but rather a desire to see wild areas and experience them firsthand. However, this desire raises a number of important questions, such as: what sorts of firsthand experiences and scenic vistas will satisfy the public’s urge for wilderness? How are those desires for wilderness created, and what is the material impact of those desires on wilderness itself? In addressing these questions, this essay will examine the nature of Americans’ desire for wilderness and the unintended consequences that can result from even the most responsible kinds of ecotourism, such as the inadvertent spread of non-native seeds carried by the car wheels, boot soles, and pants cuffs of ecotourists. Drawing upon Thoreau’s “Ktaadn” and William Cronon’s “The Trouble with Wilderness,” this essay will examine the inadvertent contamination of wilderness with invasive species as a reflection of Americans’ complex and conflicted relationship with wilderness.

Thomas Parks Fair.
Theodicy of the Prairie: Decoding the Frontier of Eliza Farnham’s Life in Prairie Land
Social activist and feminist Eliza Wood Farnham perceives a metaphysical undercurrent to the complex forces shaping the largely hostile environment of the nineteenth-century American frontier. Farnham’s Life in Prairie Land (1848) details her travels around the Illinois territory from 1835 to 1840 and weaves together local color realism, popular travel literature, and a romantic response to nature through evocative descriptions of the rivers, wildlife, and prairie. Farnham’s observations of and reflections on her encounters with the natural forces of the frontier present a way of decoding and comprehending physical nature as a manifestation of both the beautiful and the sublime. Her observations of nature suggest a force at work, an apparent consciousness that encompasses human endeavor as part of a greater process. She describes a natural interface of the mind and nature, a comprehension of the beautiful and the sublime that ultimately reveals either human foolishness or human potential.

Session 2 (F) Whitman B
Contemporary Literary Decodings

Kersti Tarien Powell.
The Agents of Decoding: Scientific Instruments in Stoppard and Banville
This paper examines the role of scientific instruments as agents of decoding through which quests for personal identity and epistemological questions are being explored. Instruments such as telescopes and lenses become (metaphorical) agents for decoding the world and allowing the characters to negotiate their personal space and decode the borders between internal and external worlds. My analysis centers around two examples: the use of telescopes and microscopes in Tom Stoppard's Galileo - originally conceived as a film script and then adapted for the stage - and the role of windows and lenses in the first novel of John Banville's science tetralogy, Doctor Copernicus.  

Jason Embry.
Revolution through the Lost Language of Nature in Chuck Palahniuk's Lullaby
Chuck Palahniuk, in his novel Lullaby, imagines that the discovery of a lost book of spells could lead to revolution, both ecological and social, if its users were not divided in their utopian aims. Four people argue throughout the novel for their individual solutions to the problems of modern society, armed with spells that tap into the organizing structure of nature, allowing these individuals to make changes in ways the modern technological world cannot. The central problem of the text reveals the different weight individuals give to social ills and their difficulty in compromising for the good of all. Palahniuk explores the utopian impulse armed with powers over nature and its impact on a world steeped in technology and far removed from nature. I argue that understanding the basic code of nature gets us no closer to constructing a better world so long as the various visions of that better world are in conflict with one another. Decoding nature only serves to illustrate the need to decipher the code to humanity.

Cat Yampell.
I’m Rejectin’ My Abjection: The Animal as Respite from the Horrors of Humanity
Rooted in the Western European folktale tradition, human-animal to animal metamorphoses are punishment, creating the ultimate outsiders or dejects (Kristeva 8).  These characters are “radically separate, loathsome.  Not me. . . . A ‘something’ that I do not recognize as a thing” (2). Young Adult novels abound with abjection, mirroring many adolescents’ reality as outsiders.  While some of these novels end in death with “the corpse” which Julia Kristeva defines as “the utmost of abjection” (4), I suggest that a state of abjection exists that transcends that of the corpse:  an “eleven” abjection (for those familiar with Spinal Tap) or an über-abjection.  In Paul Gallico’s The Abandoned and Gillian Cross’s Pictures in the Dark, the protagonists transgress Kristeva’s utmost state of abjection.  Expelling themselves through their bodily fluids, these protagonists then expel their humanity and their species as they become animal, perceiving life as human-animals, not death, as the eleven- or über-abjection.     Works Cited: Kristeva, Julia.  Powers of Horror:  An Essay on Abjection.  Trans. Leon S. Roudiez.  New York:  Columbia UP, 1982.

Session 2 (G) Woodruff A
Decoding Nineteenth-Century Narrative

Mary Rosner.
Revising a Victorian Convention: Alan Paton and the Final Journey of David Livingstone
When David Livingstone died, some of the Africans with him carried his body more than a thousand miles through multiple dangers. At the end of this trek, English officials took charge of the corpse and transported it to England where it was buried with honors at Westminster Abbey. To them, he was a hero; the Africans who suffered to bring his body to the coast were almost entirely ignored. The liberal African writer and politician Alan Paton did not focus on Livingstone at all in his retelling of that final journey. His heroes are the Africans who chose to take on this "epic journey"; his villains are the English who trivialized their actions. This paper will analyze how Paton's unpublished play "David Livingstone" reconstructs Victorian Englishmen in this context.

Amanda R. LaRoche.
Embracing the Wild: Pearl Prynne’s Natural Humanity
Nathaniel Hawthorne's The Scarlet Letter is an exploration of the repercussions of sin and guilt set in Puritan Boston, Massachusetts. Hester Prynne is the community's example of the sinfulness of lust; however, Prynne functions for the reader as an example of honor and devotion, both to God and to love. The union between Prynne and the unknown man results in the conception of her daughter, whom Prynne names Pearl. Described through the citation of Matthew 13:45-46, Pearl stands as a reminder of the price paid by Prynne for her sinful ways, both in Pearl's existence and in her name. Pearl is not simply a biblical metaphor for the price of sin, but a connection to the natural world, as displayed by the child's delight in nature. Prynne believes the turmoil in her soul during pregnancy resulted in the elfish, impish, and perhaps evil nature of Pearl's being; Pearl is wild and uncontainable. Like the pearl of an oyster, luminescent, light, and vibrant, Pearl is the light of the narrative's otherwise dark and sinful burden. The use of the forest as the setting for conversations between Prynne and Dimmesdale, the two most heavily burdened characters, mirrors the shame felt by both. Pearl, who has always rejected the community and associations with civilization, revels in her time in the forest and draws the natural world to her. Pearl's wildness and connection with nature through her rejection of civilization prove her to be the most humane and fully human of all the characters. Choosing wilderness over civilization and wild behavior over conformity is often perceived as a rejection of humanness in terms of tradition and belonging. This paper examines the ways in which nature and coexistence with nature have formed the idea of a person as an outlier of mainstream ideology.
In essence, embracing nature and the natural world can cause an individual to be rejected from society, encouraging the notion that to be human one must belong to a traditional community with its associated social constructs and ideals.

Atia Sattar.
Scientific Naturalism: Zola and Pater’s Experimental Aestheticism
Naturalism, an aesthetic mode seeking to uncover the laws of nature, reached its literary apotheosis at the end of the nineteenth century in the works of Émile Zola. In an 1889 article in Nineteenth Century magazine, British aesthete Walter Pater, while acknowledging naturalism’s debt to Zola, describes it as “an element of all living art.” Insofar as we can refer to Pater’s own writings as naturalistic, this paper examines the experimental endeavors of these two authors, whose attempts to discover natural laws were decidedly scientific. In his 1880 essay, “The Experimental Novel,” Zola presents a “scientific” method for naturalism. Taking as his model experimental physiologist Claude Bernard’s "Introduction to the Study of Experimental Medicine," Zola emphasizes the role of experimentation for naturalist writers. Much of the text consists of quotes taken directly from "Introduction," with the word “doctor” replaced with “novelist.” Drawing on the principles and language of scientific inquiry, Zola asserts that the experimental novelist should begin with an empirical observation of life and then place his/her characters into scenarios that unfold as an experiment. In a similar fashion, Pater’s "The Renaissance" (1873) likens the work of an aesthete to that of a chemist, regarding physical life as “a combination of natural elements.” Only through an examination of one’s sense impressions—precise measurable units of experience—can an aesthete successfully come to know his/her self, surroundings, or literary subjects. Is naturalism then inherently scientific? And, if so, how do we explain a naturalist science that subsumes the experimental method into personal literary experience?

Session 2 (H) Woodruff B
Reading with Animals
Martha Kenney
This panel takes up the conference theme of decoding/encoding through attention to the practice of ‘reading’ in multiple ways. Rather than take writing/reading to be a unidirectional and strictly human activity, these papers engage with the complex languages that pass between and constitute humans and non-humans. We read and write with pit bulls, rats, and caribou in order to get at better understandings of how these literary acts configure and reconfigure race, gender, place and knowledge. Through our readings of human/non-human encounters in media representations, scientific literature, and environmental impact statements, we argue that modes of "reading" are always active ways of making, remaking, and unmaking relationships between different species.

Allison Athens.
Living With Caribou: Real and Figural Time in the Arctic
My paper discusses “time” and its appearance in translating cross-species exchanges, especially in relation to humans and non-humans in ecologically sensitive or designated wilderness areas. In my studies of narratives that figure “caribou” and “adaptation” in one way or another, I have noticed that different iterations of time keep cropping up, or perhaps it is the same iteration in different guises, depending on who is doing the reading. My paper proposes that through temporal literacy, we can understand inter-species adaptation differently. “Reading” the proximity and movement of another species well necessitates the acknowledgement of a different conception of time and draws attention to who is being left behind by whom in the film Being Caribou: the filmmakers by the caribou or the caribou by modernity.

Martha Kenney.
The Ichthyologist and the Rats: An Epistemological Fable
In a 1933 article, ichthyologist E.W. Gudger (1866-1956) investigates whether rats carry eggs using the following technique: one rat lies on her back and clutches the egg in her paws, while another pulls her tail, turning the first rat into an “egg-wagon.” Although the story was thought to be merely apocryphal, Gudger takes this “unnatural history” seriously. Surprisingly, he reads illustrations of Jean de la Fontaine’s egg-wagon fable carefully for clues. Taking my cue from Gudger, I turn his story into a fable about wonder and empiricism. I read his natural and unnatural histories not as opposed, but as part of a practice of marvelous realism. This is a story about how wonder is a mode of attention and how the obsessive empiricist is also an enchanting fabulist. By reading and writing scientific fables, I experiment with a literary style of accountability for the specific ways that empirical objects are constituted.

Mary Weaver.
White People Saving ‘Brown’ Dogs from Brown Men: Reading the Vick Case with Spivak
This paper thinks through the public and media response to the recent Michael Vick case, arguing that it reveals a shift in the links between racialization and non-human animals in contemporary U.S. culture. The media’s change in focus to the dogs-as-victims, coupled with visual rhymes between public protests aimed at Vick and revelations of lynching-like practices of dog killing at Bad Newz Kennels, reconfigured the manner in which discourses of 'dangerous dogs' and black masculinities co-constitute each other. I think through the eerie resonance between the ‘saving’ of Vick’s dogs by mostly white rescue organizations and Spivak’s theoretical work regarding “White men saving brown women from brown men.” Where and how this resonance fails helps me get at a mode of reading and writing with ‘dangerous dog’ discourses that suggests a careful and caring ethics of response attentive to the connections and disjunctures between signification, race, and non-human animals.

Session 3 - Fri 10:30am - noon

Session 3 (A) Bennett
Decoding Digital Humanities

Adam Frank.
"Some Mad Scientists": An Audiodrama
I have recently completed a dramatic musical adaptation of an ancient Greek text which, according to Mikhail Bakhtin, is the first work to depict the laughter of the mad scientist. The story is as follows: the townspeople of Abdera invite Hippocrates to come cure their most famous resident, Democritus, who they believe to be mad because he laughs at everything, both good news and bad. They are worried that this madness will infect them and damage the city’s reputation. Hippocrates accepts, but believes it likely that the townspeople are morally or emotionally ill rather than Democritus. Hippocrates arrives to discover the philosopher surrounded by heaps of dissected animals: he is writing a treatise on insanity and seeks its cause in the animal bodies strewn around him. After Democritus delivers his philosophy of laughter as a diatribe, Hippocrates leaves convinced of his wisdom; a reader, however, may be left wondering if a joke has just been played, and if so on whom.

Stephanie Boluk.
Database Aesthetics and
At the crossroads between concepts of the literary and emergent forms of database aesthetics lies a contemporary model for theorizing serial production. This paper investigates the under-examined concept of seriality and the way it has been reconfigured in digital media. Using Homestar Runner as the central case study, I provide a survey of issues surrounding the literary, the database, and seriality in this work of popular media. I will specifically trace electronic literature's propensity for what has been described as a technologically-conditioned melancholia and relate this to the serial constructs within Homestar Runner.

Karen Head.
Coding, Decoding, and Recoding: Connecting Traditional Text-based Poetry and Digitally-enhanced Poetry
In my research, I have discovered that poets tend to be either traditional or digital in their approaches; rarely do poets attempt to bridge these sub-genres, especially in the U.S., which is one reason I spend as much time as possible engaging with European scholars and artists. The benefit to my integrated approach is that it encourages scholars, artists, and readers from the otherwise isolated sub-genres to make transitions and explore other formats. Ultimately, I want to encourage an aesthetic dialogue about what poetry in the 21st century can be, and use technology to transcend all imaginable boundaries. More generally I will discuss a broad range of practices in the mediation of the poetic experience. These practices include: poetic readings enhanced with digital video or music, poetic performances in virtual or augmented environments, real-time interactive poems, and location-based poetry delivered through phones or other mobile devices.

Helen J Burgess.
How to read an electric poem
Sometimes it takes trying to explain something in simple terms for us to be able to articulate what it is about a genre that makes it compelling or worthy of study. John Miles Foley, in How to Read an Oral Poem, noted that the "challenge of explaining the structural and artistic dimensions of oral poetry" was matched by the "continuing challenge of teaching and lecturing"; his aim was to produce an "interpretive tool kit for reading oral poetry". In this paper, I will try to articulate a similar "interpretive tool kit" for electronic poetry, concentrating on several key features -- such as the loop and the layer -- that appear in electronic literature. These features, I argue, are not just structural but operational; that is, they do something: they execute, they feed on information and transform it. This liveliness of action -- this electricity -- provides us with opportunities to approach and identify features of electronic poems in ways that can be adapted for both scholarly and classroom use.

Session 3 (B) Crescent
Media Environments and World Games
Chair: Colin Milburn

Nancy Anderson.
“We All Want to Change the World: John Lennon’s ‘Revolution’ and Buckminster Fuller’s World Game”
In 1964 Buckminster Fuller proposed a computer game, the World (Peace) Game, to be installed in his geodesic dome for the U.S. Pavilion at Expo ’67 in Montreal. The World Game would have consisted of a computer database holding information on all the world’s physical resources and human needs and trends, which would then be made available to players at consoles situated on a balcony overlooking the game board, a huge football-field-sized Dymaxion world map inlaid with thousands of light bulbs. “To begin playing,” Fuller explained, “an inventory of world population areas is flashed on the map… in red dots. Then a player might show an inventory with the locations of industrial technology centers on the map in yellow. An obvious problem… is revealed by this visual overlay. Several areas of red (population) are without areas of yellow (industry). Now, one must develop strategies to alleviate the problem.” The point was to take technologies of the “world warring game” (computers, satellite tracking devices, etc.) and turn them to solving global issues of living (housing, energy sources, pollution). This paper will consider the computer game as emerging medium and Fuller’s utopian vision of human-machine collaborations for “making the world work” for all humanity. Anderson is interested, however, in placing Fuller’s techno-utopianism in the context of 1960s political utopianism and social turbulence. The contemporaneous text Anderson will use to do this is John Lennon’s 1968 song “Revolution,” a detailed description and sharp critique of radical thought and violence of the time. Lennon (The Beatles), of course, communicated through established, but still thought of as revolutionary, “mass interception media” (Kittler): rock music (phonograph, radio). Thus, what Anderson will contrast here are media: Fuller’s sincere futuristic digital utopianism and the hardcore guitar to “shoo-be-do-wah,” revolution now to “free your mind instead” riffs of the White Album.

Colin Milburn.
"Mondo Digital"
This paper will examine the formal strategies developed in the 1960s and 1970s Italian mondo movies and their imitators for conjuring the “worldness” of our world on a planetary scale. Beginning with the surprise international success of 1962’s globetrotting shockumentary Mondo Cane by filmmakers Paolo Cavara, Gualtiero Jacopetti, and Franco Prosperi, the mondo genre soon cohered around a certain set of formal conventions: intermixing brief scenes of everyday life from all around the world; usually focusing on lurid, unusual, or spectacular behaviors; posing as a mode of visual anthropology but often staging or fabricating the actual content of the scenes; and relaying a sense of the worldwide commonality of human culture or human nature, across national differences, through a montage technique that took the form of radical modularity. Without regard for linearity or continuity—indeed, precisely in resistance to such notions—the mondo film in its classical form (which has been inherited today by certain “reality TV” programs) performs the “whole earth” precisely through its atomization, its disintegration, its digitization. Breaking the analog whole into discrete modular units that can be reprogrammed and recombined (the clips from mondo films are not only regularly repurposed for other mondo films but often themselves have been snagged from other sources), the logic of mondo resonates with similar forms of high-modernist collage and cut-up that parcel out the analog world and resynthesize the bits into an in/coherent whole. Indeed, the medium of moving film as such can be seen in this way as inherently constituted through the interplay of the analog and the digital, well in advance of digital computing technologies. 
But it was not until the arrival of the mondo genre, whose technical ability to assemble worldwide footage was made possible through increasingly globalized networks of air travel and televisual distribution of international film and video materials, that the world picture—the image of the world in its worldness—was rendered digital, and left digital without final analog resolution or refolding. Quite distinct from other images of the whole earth in the same period—NASA images of “Blue Marble” and “Earthrise,” for example—which replicate the analog worldness of the world, mondo films instead perform an anticipation or “premediation” of the digital worldview, in which the world is not “whole” but is rendered world precisely through its irreducible molecular divisions and differences. If today we have grown accustomed to this picture through such entities as Google Earth, or the new, high-def NASA “Blue Marble” project, returning our attention to the pre-history of the digital world in Mondo Cane and its kin may also help to account for our increasing sense that this world is, once again, going to the dogs.

Rob Mitchell.
"Media Environments, Media Vitality"
Recent theories of media point toward a shared effort to understand media less as channels that transmit content and more as environments or worlds that serve as conditions of possibility for innovation and transformation. Brian Larkin’s conceptualization of media as “infrastructures,” for example, draws in part on an understanding of infrastructures as networks that employ the material properties of matter in order to create predictable and enduring links between people, but he also explores ways in which rivalrous social groups are able to exploit “breakdowns” in the material functioning of official infrastructures in order to create new communities. Working from a different perspective and conceptual reservoir, W. J. T. Mitchell has proposed that we think of media as “environments,” within which images (for example) would “reside . . . in the way organisms reside in a habitat.” Finally, to take a third example, Matthew Fuller’s recent description of media in terms of “ecologies” is intended to focus our attention on “the massive and dynamic interrelation of processes and objects, beings and things, patterns and matter” that media enable as well as the dynamic tendency of these interrelationships (that is, their tendency to produce new constellations of “processes, objects, beings and things, patterns and matter”). Mitchell’s paper seeks to encourage this effort to understand media as all-encompassing—as an infrastructure, environment, or ecology, for example, rather than as a channel—while at the same time emphasizing the need to think this “surround” in terms of vitality and life. These latter terms, he suggests, are in the final analysis more expansive and open-ended conceptual breeding grounds for understanding the potentials of media, for they encourage the invention of tactics that encompass, but also exceed, structures, “-logies,” and systems.

Session 3 (C) Doggett
Performance: a Method of Decoding / Decoding: a Method of Performance
Katherine Behar; Frenchy Lunning
Recent years have seen a tremendous amount of fertile and provocative scholarship addressing, on the one hand, how technologies code human performances (human-computer interaction), and on the other, how code itself performs (critical code studies, speech act theory). Acknowledging the importance of such investigations, this experimental panel brings together diverse viewpoints in an attempt to ask a slightly different question. What happens when we appropriate performance studies methodologies, in an act of cross-disciplinary decoding, to decode performances of and around technologies? In short, how can performance studies serve as a methodology for looking at digital culture? Taken collectively, each of these papers seeks to decode some aspect of emergent performance practices. We address performances around technologies, as well as performances of technologies. In the former category, we look to the performances of human users – the acquired behaviors, habits, and hacks that percolate through, rub up against, and, as frequently happens, overwrite the performance "script" provided by a hard-coded software or hardware interface. In the latter category, we explore how technologies are programmed to "behave" – the politics and technosocial relations that are enacted by recent technological artifacts and that become coded into normativity. Papers include a discussion of Cosplay as a performance around technology, and a study comparing restrictive interfaces in Cybernetics and in BDSM subcultural body modification as a performance of technology.

Jon Cates; Jake Elliott.
0UR080R05 - Noise, Live Coding, Art Hacking and Realtime Audio Video Performance
In 2002 jonCates and Jake Elliott began a realtime audio-video noise performance project that completed itself in 2007. This project, called 0UR080R05 (a translation of "Ouroboros," the serpent-god who swallows its own tail), represents 5 years of Performance Art in the context of New Media Art, experimental musics and Noise venues. 0UR080R05 has been performed collaboratively using digital video and audio processing (in realtime) via self-modifying programs, speculative Software Art, fictional operating systems and impossible technologies. These programs contain algorithms that feed audio, video and data streams back into each recurrent performance to cause Head-Tail Recursions and infinite palindrome loops from the future.

Katherine Behar.
Command and Control: Restrictive Interfaces in Cybernetics and BDSM
This paper is an experiment in decoding through a textual and visual, comparative, close reading of protocological code and BDSM (Bondage & Discipline/Dominance & Submission/Sadism & Masochism) visual culture in pornography and product advertisements. By overlaying human and nonhuman systems, each becomes a decryption key for decoding the other. We are all familiar with how, from the perspective of human users, novel technologies serve as pet fetish objects. However, this paper attempts to conceive of how technologies perform as nonhuman sexual subjects in fetish culture. To this end, I explore correlations between restrictive interfaces in computational systems and restrictive interfaces in BDSM culture. In the visual culture of BDSM, emphasis is placed on impenetrable surfaces and on highly controlled, specifically regulated articulations of penetrability. The same can be said for digital communications technologies which enforce strict protocols (VoIP, TCP/IP, SMTP, etc.) for the passage of information. Digital technologies appear to us as objects and present us with an illusion of mastery. In reality, technologies are active subjects and we, their "users," must bend to their requirements. Gaming scholars Ted Friedman and Alex Galloway refer to the process by which users must first internalize machinic logic in order to win mastery over a machine as "learning the algorithm." Indeed, cybernetics, the science of command and control through communication, has much in common with sexual power dynamics. Both involve getting a partner to do what one wants and to not do what one doesn't want. The dominant consumerist relationship with technologies is already sexually charged. But in order to imagine an alternative, it becomes crucial to ask where power accumulates and how power functions in our interactions with devices. In a given moment of Human-Computer Interaction, who or what is a master and who or what is a slave?

Frenchy Lunning.
Cosplay, Drag and the Performance of Abjection
The subjects of Shojo (Japanese girl culture) -- women and homosexual men -- have been marked by mainstream culture through their transgression of gender priorities as abject subjects. In their abject profile, these subjects have accumulated a taxonomy of loss: of identities thrust aside like empty costumes, which have collectively piled up within the confines of a culture that systematically requires adherence to certain identities, systems and orders. The shojo, mining this pile of abject detritus, operates as a mime of abjection: she performs not so much as a being, but rather as a constellation of ambiguous abjected identities and desires, states and objects, processes and agents, all of which are animated through her symbolic narrative enactments. The shojo is perhaps the most complex and profound of possible fan subjects. Approached through her most obvious manifestation, she reveals her abject state through her visual morphology in representation. Those representations appear in various forms of fan behavior, but most particularly through the practice of Cosplay. The morphology of the Cosplayer is read from the body of manga and anime, and the bodies represented in manga and anime: bodies that are in no way stabilized, and in no way actual; in fact, they dangerously swivel in gender to the extent that gender becomes a fictive notion, in favor of a magical state of shape-shifting and in the absence of an originating gender state. Although drag performances are usually of homosexual men as adult women, there is a way in which the shojo culture has appropriated the practice. Cosplay, in the enactment of shojo characters, performs text from manga and anime, or in the case of Loli, a voguing of a Victorian pastiche. In each case, both male and female fans carefully construct an identity for the purpose of performance of a shojo character, much in the same sense as drag queens.
Part of the jouissance experienced from shojo drag lies in the parodic and rebellious play between gender identities. Rather than mourn, cosplayers use the masquerade of the anime convention as a theater in which gender normativity becomes the pivot of parodic play, subverting that normativity rather than despairing of it.

Session 3 (D) Mitchell
Eating, Viewing, and Living in a (De)Coded Environment: How We Come to Depend on Scientific, Technological, and Cultural Practice to Situate Our Lived
Justin Lerberg
This panel examines the decoding and encoding of nature and its processes through the use of technology and cultural practices. Broadly conceived, we believe that cultural norms inform and produce coded knowledges about the natural world and the human relationship to it. On the one hand, constructed messages produce meaningful and positive relationships between the human and nonhuman world such as information that helps identify environmental risks. On the other hand, the entanglement of science, politics, and technology can sometimes produce contradictory information, especially for a general consumer/citizen. Our panel focuses on three particular areas in which coded knowledges inform individual choices: agriculture, animal-human relationships, and climate change. We argue that in each area individual consumers must decode layers of political, scientific, and cultural knowledge. However, the intricate layers call into question the complexity of identifying an “informed choice” because of the complicated codes that are often in conflict.

Justin Lerberg.
Global Warming 2.0: A (Re)Mixed Reality of Science, Popular Culture, Techno Theory, and Consumer Generated Content
This presentation addresses the inability to “see” global warming without scientific explanation and/or visual data. Whether through virtual climate models, digital satellite imagery, or other visual media, there inevitably needs to be some visual component in order for the general public to grasp the impact of global warming. Through the aid of new media applications, the reality of global warming is a “mixed reality,” a reality that is coded by digital and virtual media. In the realm of new media, perhaps expertise is not measured by the actual science of global warming, but by the ability to navigate and decode complex and conflicting information about the issue.

Matthew Lerberg.
Out of Focus: Confronting the Spectacle of the Other (than Human) in Film
This paper addresses the intersection of animal studies, film, ethics, and environmental and critical theory in the Sony Pictures Classics films Winged Migration and The Making of Winged Migration, which illustrate important and complex connections between animal and human in film. Juxtaposing the films reveals that man and animal are intimately intertwined with each other and greatly influenced by their multiple environments, especially landscapes; furthermore, it reveals the way in which film history and techniques complicate these connections. This intimate relationship provides critical approaches for conceiving human and animal, because both are simultaneously a part of and apart from each other, as well as culturally, materially, and discursively defined through a juxtaposition of both films.

Tracey-Lynn Clough.
Science Fiction: Making Sense of the Politics of Agricultural Technologies, the Modern Industrial Diets and the Food of Our Imagination
Individual food choices are culturally and historically contingent practices that arise through an amalgamation of often hidden political, scientific, and economic policies that shape desire and influence access. This presentation explores the conundrum of food—particularly agricultural crops—and food technologies through an examination of the ways in which technologies are politically, economically, and culturally constructed and provides a framework for better understanding the intersections of science, technology, politics and culture. These intersections are better illuminated through an examination of the ways in which industry and non-industry writers bring these technologies into the public discourse through counterrevolutionary scientific texts, essays and fictional techno-fantasies of assault. In many ways, these multiple discourses inundate modern consumers with conflicting ideas about nutrition and formulate food choices that are at once more transparent and illusive. Together they demonstrate that the food of our imagination is very rarely the food on our plates.

Session 3 (E) Whitman A
Decoding Gender
Chair: Andrew Logemann

Laura Blankenship; Anne Dalke.
Coding for “Possibility Spaces”: Designing and Teaching a New Course in Gender and Technology
“Decoding,” the process of translating received messages, presumes a cipher that can be cracked, a script, already written, but needing clarification. Conventional pedagogical skills--such as those encoded in the legislation of “No Child Left Behind”--focus on students learning to "decode" the information given to them by their teachers, in order to answer the sorts of questions posed to them on standardized tests. Co-designing and teaching a new course on Gender and Technology, which we offered @ Bryn Mawr College in spring 2009, meant a semester spent “decoding,” first, the very terms of our title, the conventional understandings of “gender” and “technology.” It also meant “decoding” a series of pedagogical assumptions, shared across disciplines, about performing competence. In this course, which was cross-listed in Computer Science, English, Film Studies and Gender and Sexuality, we met those challenges by insisting that our students do their own “coding,” by re-designing the variety of “possibility spaces” in which they were learning. We will tell the story of this course, as a means of arguing that higher education should be revamped to create more such spaces: moving away from older curricula, changing (as gaming companies do) with the needs of our customer-students, and inviting them to use the content and discussions in their courses to open up new ideas and opportunities for themselves. Laura Blankenship is a freelance consultant in technology and education. Anne Dalke is a literary critic interested in emergent pedagogies, feminist theory and narrative traditions, revisionary work in the canon of American literatures, and the intersections between science and literature.

Jennifer S. Tuttle.
Romancing the Rest Cure: Illness and Assimilation in the Work of Emma Wolf
Among literary critics, the “rest cure” for nervous women is widely associated with nerve specialist S. Weir Mitchell, famously disparaged by Charlotte Perkins Gilman in “The Yellow Wall-Paper” (1892) for his misogynist medical theories and his paternalistic bedside manner. Less familiar to modern readers, though no less compelling, is Jewish American writer Emma Wolf’s portrait of the rest cure in her novel Other Things Being Equal (in which the rest cure doctor is the romantic lead), published mere months after Gilman’s iconic story. This essay interprets Wolf’s portrait of the rest cure, along with her use of a constellation of other medical conventions, as therapeutic codes deployed in order to argue for intermarriage and assimilation as well as women’s self-determination in relations with men.  

Andrew Logemann.
Scientific Lyricism: H. D., Feminist Epistemologies, and Scientific Praxis
This paper will apply the methodologies of Latourian science studies to the work of American imagist H. D., examining her novel HERmione (1927) in the context of debates over the legitimacy of scientific practices in the period, and establishing that H. D. self-consciously constructs a scientific mode of literary production designed to critique received, positivist notions of inquiry. Building on the methodological consequences of Niels Bohr’s complementarity, H. D.’s writing reflects a systematic rejection of Newtonian mechanics in favor of a knowledge-building practice that embraces scientific subjectivism.  This paper will develop the implications of H. D.’s efforts to use literature as an epistemologically-viable alternative to the black-boxed scientific method, arguing that the structure of her writing reflects these motivations in productive ways.

Min-Jung Lee.
Decoding Femininity: Miscoded Lily Bart in the Filmic Rendition of The House of Mirth (2000)
In this paper, I explore how Terence Davies decodes Femininity through miscoding Edith Wharton’s character Lily Bart in The House of Mirth (1905). The film follows the major plot line, but it fails to represent the psychological task of the redemption of the Femininity that Lily experiences. The film presents Lily as Wharton originally intended: a heroine who “fails” in her traditional role. Wharton refused to have Lily fulfill the role of the traditional novels’ protagonist who is brave, decisive, strong, and accomplished. Wharton set up Lily to fail and to suffer, blocking her at every point in her efforts to realize the social ambitions for which she strives. Wharton, however, made it clear that Lily does not want a conventional notion of success, and it is her very “failure” in the social game that makes Lily a heroine. I discuss how Davies’s filmic representation decodes the novel’s thematic concerns: “the central truth of existence” and “real relation to life.” Furthermore, this paper attempts to reveal its miscoding of the unpopular but powerful truth that Wharton expressed in Lily’s innate power of Femininity through her quest for soul. I will use the notion of “the feminine soul,” which Carl Jung named “anima,” taken from the Latin for “soul.”

Session 3 (F) Whitman B
Before Beginning

Catherine Belling.
Always About to Begin: The Recursive Structure of Hypochondria
Physicians often disparage hypochondriac patients because of their seemingly irrational resistance to reassurance: they refuse to accept that a doctor’s inability to find evidence of organic disease means they are well. I explore hypochondria’s disruption of the conventional narratives that structure clinical medicine. The ideal trajectory--diagnosis, action, and completion, either as cure or as palliative management--is restructured when the patient's visceral authority and the doctor's scientific-professional authority are at odds, the patient repeatedly positing a pathological beginning too recondite (for now) to be diagnosed. Hypochondria anxiously resists the narrative closure promised by the acute and the terminal, its chronic uncertainty directly threatening the evidence-based epistemology of medicine. Using fictional and clinical accounts, I show how the plots of illness anxiety threaten contemporary Western medicine’s functional denial of the truth that sooner or later for every patient, even the “worried well,” medicine will, by its own accounting, fail.

Cynthia Current.
Panic!! and the Evil Child: Decoding Determinism in William March's _The Bad Seed_
William March’s The Bad Seed, published in 1954, not only became a bestselling novel but was also produced as a successful Broadway play and an Academy Award-nominated film. Certainly, for audiences at the time, March’s creation of a child serial killer was shocking, but the shock ultimately resonates at the level of the seeming paradox of panic and biological determinism. Rhoda Penmark functions as a child who is the direct embodiment of a genetic code, of a blueprint that requires only minor developmental adaptations. Rather than appearing as an anomaly, she invokes instead a chilling stability (and intense conservatism) amidst the chaos of Cold War politics, homophobia, and the darkly comic misappropriation of psychoanalysis. Most important, however, is the contrast between Rhoda and her mother Christine. Christine works through patterns of kinship, heredity, and her own disabling panic in the face of the overdetermined behavior of her child. This presentation explores how the novel enunciates genetic determinacy while simultaneously rendering panic itself as particularly deterministic and classifiable. Cybernetics, psychopharmacology, and molecular biology develop as disciplines nearly simultaneously. The behavior of Rhoda and Christine, then, should be explored as parallel rather than opposing forces, the dynamics of which reflect upon empiricism and regulation in a world concerned with the rise of individual and global violence.

Ada Smailbegovic.
Poetics of Liveliness: Theories of Embryological Development and Gertrude Stein's The Making of Americans
In her lecture “Portraits and Repetition,” Gertrude Stein writes that if anything is alive, it is not repeating, but continuously varies in its insistence. In order to illustrate this statement, she evokes the example of a living organism: “It is very like a frog hopping,” she writes, “he cannot ever hop exactly the same distance or the same way of hopping at every hop” (Stein, “Portraits and Repetition” 100). Following these insights about biological organisms, Stein attempts to recreate the liveliness of her subjects by composing a series of nearly repeating sentences, which, through their incremental variations or shifts in insistence, enact the continuously changing, developing nature of living organisms. My paper will position Stein’s interest in the liveliness of organisms, particularly in her early novel The Making of Americans (1911), in relation to theories that have shaped understandings of biological development. Specifically, I am interested in interrogating the preformationist premise that genes in isolation contain the information necessary for development and that developmental processes can be understood by decoding the genes. I am proposing an analogy between the resistance exerted by Stein’s compositions towards critical approaches that attempt to “decode” the “meaning” of her texts, and the way questions of development have remained resistant to explanations premised solely on decoding genetic information. Instead, both living organisms and Stein’s compositional recreations of them constitute systems involved in continuous processes of epigenetic change, guided not by an encoded developmental program located in the genes but by a complex set of relations between an organism and its environment.

Session 3 (G) Woodruff A
Untimely Animals and Spectral Deaths
Ron Broglio

Richard Nash.
Animal Death and Wordsworth's Hart-Leap Well
In The Animal That Therefore I Am, Derrida takes up directly the postmodern "question of the animal" and in particular revisits Descartes' separation of reaction and response. There is already a burgeoning and proliferating secondary literature generated by this particular meditation on what was already a topic of expanding conversation in posthumanist theory. His discussion centers on the challenge of thinking pity: "War is waged over the matter of pity. This war is probably not ageless but, and here is my hypothesis, it is passing through a critical phase." Donna Haraway has responded to Derrida by asking us not to rest with pity, but to think beyond it, towards joy: "the question of suffering led Derrida to the virtue of pity, and that is not a small thing; but how much more promise is in the question, Can animals play? Or work? And even, can I learn to play with this cat? Can I, the philosopher, respond to an invitation or recognize one when it is offered? What if work and play, and not just pity, open up when the possibility of mutual response, without names, is taken seriously as an everyday practice available to philosophy and to science? What if a usable word for this is joy?" Here I will return to consider animality and sentiment at the end of the eighteenth century, the dawn of Derrida's war, and do so to juxtapose didactic verse with Wordsworth's "Hart-Leap Well" in order to pay special attention to how that poem configures the relationship between pity and joy, and how those states converse.

Ron Broglio.
Curious Cases toward an Animal Revolution
Incident in the animal revolution: Chuck the groundhog bites New York City mayor Michael Bloomberg's finger. Chuck has redefined Groundhog Day. It is about the groundhog, not the humans. Stay out of the animal lair. Days later, the Chief of State is the one bitten: "Former French President Jacques Chirac was rushed to a hospital after being mauled by his pet dog who is being treated for depression." This and other curious cases will be explored alongside the theoretical un/groundings of an animal revolution. Animals in death remain on without being mourned, and their remainder haunts us. Animals are not mourned because we do not know how to incorporate the animal body into the social body. Indeed, to do so would mean to rethink the human and the non-human community. This haunting through the animal body opens up the animality of the human in ways that trouble good sense and common sense. As Giorgio Agamben recognizes, we have yet to come to terms with human animality as biological limits over and against our social being. What is scandalous about the animal body and the animality of the blunt human body, what is indeed the seed of revolution, is that these bodies abandon reason, sensibility, and civility as the modes of discourse and point to an-other register altogether. Enter then the idiot (companion to the ‘dumb’ animal) who makes use of the blunt body to disrupt the social register by a flight of thought into corporality—as outlined by Jean-Luc Nancy in The Birth to Presence. The final section of this presentation speculates on what is meant by saying that the animal revolution is ‘to come.’ Foremost, revolution is untimely. Working at a register outside of the social, there is no sense in which a revolution should abide by human clockwork. The animal revolution comes from ‘left field’ or, to quote Walter Benjamin, ‘All decisive blows are struck left-handed’.

Kari Weil.
Peeing and Time: Bill Viola’s Aesthetics of Attunement
Thinking, Derrida suggests, may begin with our “nakedness” under the gaze of an animal who looks at us. Indeed, Derrida’s attempt to come to grips with the self as it is caught in the gaze, not of another person, but of his cat, has engendered a widespread critique not only of a philosophical tradition that has supported the objectification of animals, but also of a tradition of representation that pictures animals as objects to be studied and measured according to human instruments of thought. And yet, to focus on the animal’s gaze is to risk both anthropomorphism and anthropocentrism, especially in so far as we remain the objects and focus of that gaze. Bill Viola’s early documentary video, “I Do Not Know What It Is I Am Like,” offers an alternate pathway to representing animality as well as “nakedness,” exposed in the film less in terms of image than of temporality. Described as “an epic journey into the inner states and animal consciousness we all possess,” the film brings us to consider not how we look to animals, but rather how we may be marginal to their gaze and their worlds. Indeed, through a series of what John Berger would describe as missed encounters with the animal’s gaze we begin to realize that looking (and by extension thinking) is insufficient for acceding to any knowledge or experience of animality. And without that knowledge, without the look of the animal to recognize or confirm the viewer, neither is there recognition of what it is to be human—hence the title of the film. Through inordinately long shots of buffalos peeing in fields or fish breathing in water, Viola focuses not on the so-called inner humanity that we look for in (some) animals, but on the exposed “bare life” of naked animality, attuned to a world that the camera may witness, but not reveal.

Session 3 (H) Woodruff B
Bodies, Gender, and Sexuality: A Systems Approach
Chair: Kelly Ball
Feminist science studies has developed new analytic tools for understanding and making political decisions about issues involving gender and sexuality. Thinkers like Susan Oyama (The Ontogeny of Information, 2000) and Karen Barad (Meeting the Universe Halfway, 2007) have developed a way of approaching problems we call “feminist systemic thinking.” The papers on this panel use feminist systemic thinking to approach different issues involving gendered and sexualized bodies (transsexuality in children, pharmacological treatments for sexual dissatisfaction, fMRIs and the treatment of women with psychological distress, and the phenomenon of multiple births). In each paper, systemic thinking provides a novel way to approach and understand these issues and leads to new, and sometimes unexpected, political choices.

Megan Friddle.
Decoding Gender-Variance in Children
 Using recent texts on gender variance in children, this paper aims to address several emerging questions: Given existing nature/nurture dichotomies in discussions of transsexuality and gender identity, how can one discuss gender-variant children without falling into a trap of gender essentialism versus constructionism? How does feminist and/or queer theoretical work fail to adequately account for the complexity of child development and development of the individual over a lifetime? Does the recourse to self-knowledge and self-identification in relation to child gender-variance cause as many problems as it solves? In this paper I will offer provocative ways to think systemically about gender and childhood—outside the nature/nurture binary.

Kristina Gupta.
Pharmacological Treatments for Female Sexual Dissatisfaction
Researchers have tried to develop a diagnostic category for female sexual dissatisfaction, called Female Sexual Dysfunction (FSD), and have sought to develop drug treatments for FSD. Some feminists have critiqued this “medicalization” of sexual dissatisfaction. Members of a feminist group testified to the FDA against a drug designed to treat FSD. In this paper, I argue that feminist opposition to pharmacological treatments for FSD is based on a linear understanding of drug interventions. Using a systems analytic (developed by scholars including Oyama and Barad), I argue that the introduction of these drugs will have complex, dynamic, multi-directional, and unexpected effects. Therefore, blocking drug treatments for FSD may not be the best political choice.

Stefanie Speanburg.
What's the Matter?: Feminist Re-vision of fMRI Research
Recent medical technologies, like fMRI and DNA testing, de-mystify interior neurological and biological processes of the body. By making what was once invisible visible, what was once internal external, once imagined imaged, the boundaries of bodily interiority and alterity become increasingly fluid and flexible. Karen Barad’s feminist material-discursive practice makes meaning of new and reconfigured boundaries of what were formerly conceived as internal and external, mind and body, subject and other. This paper will explore the implications of the influx of fMRI research in recent years for the material-discursive treatment of psychological distress in women.

Kira Walsh.
Developing Children from Multiple Births
In January 2009, Nadya Suleman gave birth to eight babies -- octuplets -- in California. Public response to the birth has been largely negative, accusing Suleman of engaging in irresponsible fertility practices and exploiting public interest in her children. Using the Dionne quintuplets of Canada as a comparison case study, I will draw on the work of feminist theorist Karen Barad to suggest that the words "quintuplet" and "octuplet" work as apparatuses in guiding public opinion surrounding multiple birth children and their mothers. Therefore, a potential first step for protecting the Suleman children may be to trademark the words "octuplet" and "Octomom" that have been used to describe the family in the media, not for monetary gain, but as a bid to protect the family from excessive media scrutiny.

Session 4 - Fri 1:30pm - 3pm

Session 4 (A) Bennett
An Animated Session

Naoto Oshima.
The Turing Legacy of Decoding and its Relevance to ThunderCats
Alan Turing once wrote that "Science is a differential equation. Religion is a boundary condition." ThunderCats follows the adventures of the eponymous team of heroes, cat-like humanoid aliens from the planet of Thundera. The series pilot pays homage to the origin of Superman, as the dying Thundera meets its end, forcing the ThunderCats (a sort of Thunderean nobility) to flee their homeworld. The fleet is attacked by the Thundereans' enemies, the Mutants of Plun-Darr, who destroy all the starships in the "ThunderFleet" except for the flagship containing the young Lord of the ThunderCats, Lion-O, his protectors, and the mystical Eye of Thundera, the source of the ThunderCats' power, embedded in the hilt of the legendary Sword of Omens. Though the Mutants damage the flagship, the power of the Eye drives them back, and Lion-O's elderly guardian, Jaga, pilots the ship to the safety of the world of "Third Earth." Unfortunately, he dies in the process; there were insufficient cryo-stasis pods aboard the ship, and the journey to Third Earth takes several decades even with advanced interstellar spacecraft. The pertinence of these themes to the phenomenon of decodings could not be more pressing in today's world.

Nancy White.
Uncanny Terrain: A Study of the Impact of the Environment on the Origins of Comic Book Superheroes
A tree falls in the forest. As it decomposes, shoots grow up from the rotting wood, and carve out their own place in the landscape. Likewise, when an animal dies, its body becomes host to myriad creatures, all nourished from the now deceased flesh. In nature, a balance exists whereby death creates new life. For George Bataille: Besides the external action of life (climatic or volcanic phenomena), the unevenness of pressure in living matter continually makes available to growth the place left vacant by death. It is not a new space, and if one considers life as a whole, there is not really growth but a maintenance of volume in general. In other words, the possible growth is reduced to a compensation for the destructions that are brought about. (Bataille, George. The Accursed Share: An Essay on General Economy. Volume 1: Consumption. New York: Zone Books, 1991. p.33) In the world of comic book superheroes, this balance still exists, but in an augmented form. Instead of the peaceful felling of an oak, the superhero narrative is filled with uncanny accidents, from chemical explosions to genomic mutations, radioactive spider-bites to plunges in vats of toxic waste. In response to these moments of hyperbolic destruction, equally exaggerated forms of life emerge: superheroes. This paper examines the trope of the superhero origin and its seemingly symbiotic relation to the environment.

Session 4 (B) Crescent
Beyond Likeness: Propositional Statements - Displacing/Replacing the Recognized Capacity of Portraits to Represent, Session 1
James McManus
In his book Portraiture, Richard Brilliant posits that portraits, presented in the form of propositional statements fundamental to everyday experience, capable of articulating our beliefs and offered as the predicates of a given subject, possess the capacity to elicit from the viewer the response “this is so and so.” Here, the portrait image supports the urge to identify the subject through connections with likeness referring to the actual person. Long held captive by the need to portray the subject's physiognomic likeness, the portrait, Brilliant contends, must present “some discernable connection between the visible image and the person portrayed in order to legitimize the analogy . . .” - a degree of resemblance. This necessity has historically imposed constraints on the image's freedom of reference, and “has brought about the term 'likeness' as a synonym for 'portrait'.” What happens when “likeness” is no longer the primary concern of the portraitist in the treatment of the subject? Such a paradigm shift expands exponentially the potential forms assumed by propositional statements offered as predicates of the chosen subject. Presenters in the two sessions will investigate various ways in which artists during the twentieth century moved beyond the long history of portraiture in which the artist had been consumed by the need to mirror the subject's physiognomic likeness. From early in the twentieth century, questions regarding the individual in an increasingly urbanized/industrialized society, as well as those related to identity, assumed a prominent position among modernist thinkers. Access to information regarding new technologies and sciences, along with a tectonic drift away from the cultural order of the previous century, created new opportunities - opportunities for avant-garde artists to invent new means, free of the need to achieve “likeness,” in pictorial discussions of their subjects.

James Housefield.
"Starry Messenger: Marcel Duchamp and Popular Astronomy circa 1920"
Marcel Duchamp shaved a comet into his head in 1921, transforming himself into a living artwork that influenced the pioneers of body art in the 1960s and '70s. This talk looks at Duchamp's haircut as a sign of his interest in astronomy rather than considering it as the avant-garde gesture many interpreted it to be. With his gesture, Duchamp embodies Galileo's title of the 1610 Sidereus Nuncius, physically transforming himself into a "starry messenger." "It is a very beautiful thing, and most gratifying to the sight," wrote Galileo, "to behold the body of the moon, distant from us almost sixty earthly radii, as if it were no farther away than two such measures." Galileo's treatise concerns the relationship of optics to our perception of the cosmos, a subject that inspired many of Duchamp's works. Duchamp's interest in optics is well documented, spanning from his time as librarian at Sainte-Geneviève, Paris (1913) through the unveiling of his final work, Étant Donnés (1944-68, on display in the Philadelphia Museum of Art since 1969). Little attention has been paid to the ways these interests coincided with his fascination with popular astronomy. This talk situates Duchamp's interest in astronomy in the context of literature (Stéphane Mallarmé), his fellow artists (Joseph Cornell), and contemporary developments in astronomy, including the construction of the Einstein Tower observatory near Potsdam, Germany, by Erich Mendelsohn (1919-21). Duchamp's interest in popular astronomy tells us much about the modern fascination with watching the skies in the decades preceding the race for space and the lunar landing of 1969.

M.E. Warlick.
"Magritte and Alchemy: Elemental Transformations"
Magritte painted illusionistic replicas of the natural world that subvert our expectations of reality. Scholars diverge on the degree to which they find symbolic, biographic, or psychoanalytic content in his imagery and the extent to which the words and objects within his paintings support or deconstruct a semiotic analysis. His frequent repetition of objects suggests there is an underlying code, but he always discouraged attempts to find or to decipher it. This paper proposes that within the diversity of his imagery, Magritte included natural elements that have alchemical associations, specifically the four basic elements of the western medieval alchemists – earth, air, fire and water. To these, he added other elements drawn from Chinese alchemy – wood and metal. Within a recurring theme of physical metamorphosis of one object into another, Magritte employed these alchemical elements as the building blocks of his natural world. Magritte’s first prolonged exposure to alchemy most likely occurred during his stay in Paris from 1927 to 1930, a time when Surrealism was taking a decided turn towards occult traditions, bracketed by the publication of Le Mystère des Cathédrales (1926) by the mysterious alchemist Fulcanelli and André Breton’s Second Manifesto (1929), which claimed that the goals of the Surrealists were not unlike those of the medieval alchemists. Following brief summaries of the hermetic interests of the Surrealists in the late 1920s and alchemical theories of the elements, this paper will focus on two significant time periods in Magritte’s career when alchemical imagery seems to cluster in his work, the late 1920s and the 1950s. Medieval philosophers debated the ability of alchemists to replicate the processes of nature within the laboratory. From the time of Aristotle, the four elements were thought to be the basic components of primal matter, the base material with which the work begins.
By moderating the four qualities of the elements -- dryness, moisture, coolness, and heat -- the alchemist could transform one element into the next. The goal of the work was the production of silver and gold, represented by the moon and the sun, whose sexual fusion in the glass vessel produced their child, The Philosophers’ Stone, a mysterious substance that enabled further transformation. In the twentieth century, alchemy had long since been replaced by empirical science, but the psychoanalytic approach to alchemical symbolism had opened a new level of interpretation to its processes of metamorphosis. Magritte repeated the western alchemical elements of earth (stones and landscape), water (the ocean), air (clouds), fire, as well as the eastern elements of wood (the forest) and metal (bells) within his landscapes and compartmentalized grids. In alchemical texts and images, symbolism was contradictory and multivalent to discourage the uninitiated from gaining access to its mysteries. Likewise, Magritte used a variety of strategies to complicate our understanding of his symbols. Recurring objects can be viewed alchemically, like hens’ eggs that recall the “Philosophic Egg,” or alchemical vessel. His stones can be compared to primal matter, or to the element Earth, or to the Philosophers’ Stone. A lion is an alchemical symbol of primal matter and fixed earth, while a red rose appears at the end of the alchemical operations to signify that perfection has been attained. Lions, roses, eggs, glass vessels, and fire all appear in Magritte’s paintings. In his Empire of Lights, 1954, the masculine Sun fuses to the feminine Moon as day and night collide. By suggesting these many alchemical associations, it is hoped that Magritte’s interest in alchemy can be established, however capriciously he played with its philosophic and visual traditions.

Anne Collins Goodyear.
“Duchamp’s Perspective”
"See Catalogue of Bibliothèque St. Geneviève/the whole section on perspective," writes Duchamp in a series of notations dedicated to the topic of "perspective" in A l'infinitif, a series of notes published in 1966. Duchamp's interest in n-dimensional geometry and other modes of recasting and testing traditional perspectival systems receives particularly close attention in this collection of notes, which appeared late in the artist's life. But the jottings represent Duchamp's thinking from the 1910s and early 1920s, when the artist was closely involved in the construction of The Bride Stripped Bare by her Bachelors, Even [The Large Glass] (1915-1923). Numerous artworks bear out Duchamp's interest in modes of visuality and particularly visual trickery or transformation, such as his Handmade Stereopticon Slide (1918-19), which transforms a two-dimensional image of a pyramid-like structure into a three-dimensional image when placed into a stereopticon viewer. The problem of perspective, the possibility that one thing can be many different things depending upon how one looks at it, fascinated Duchamp, and received close attention in his development of the Wilson-Lincoln system, a conceptual device based upon his encounter with a two-way changeable image (perhaps constructed with slats, or perhaps a lenticular photograph), that "seen from the left show[ed] Wilson seen from the right show[ed] Lincoln," as Duchamp wrote in a note included in The Green Box. This presentation examines Duchamp's fascination with optical "puns," demonstrating the degree to which they destabilized expectations, permitting Duchamp to recast both his own self-representation and the very definition of art-making itself within a perspectival system of his own making.

Session 4 (C) Doggett
Ethical Codes

Paul Sukys.
Decoding the Apocalypse: Literary Views on the Cloning of Christ
The Apocalypse has always been a popular theme in speculative fiction. Fictional portrayals of apocalyptic events include the works of Forrest Loman Oilar, Salem Kirban, and Gary Cohen, each of whom attempted to decode those prophecies within the context of contemporary events. Until recently, however, those literary decoding efforts have emerged from within a political, social, and military context. Thus, we see the military portrayal of apocalyptic events in the novels of Mel Odom, the social perspective in the works of Faith Hunter, and the political dimension in the stories of Neesa Hart. Lately, however, a new take on the subject has surfaced within the scientific context, specifically as it relates to speculation on the cloning of messianic figures from the past. The best example of this genre is a trilogy penned by James BeauSeigneur. The trilogy builds a persuasive scenario that begins with the alleged cloning of Christ, winds its way through the unraveling of future events as foretold in Revelation and other documents within ancient scripture, and ends in an unexpected, yet credible final confrontation. The paper argues that the work of BeauSeigneur represents not only the best-written version of the cloning hypothesis, but also its most credible and most accurate portrayal.

Rosemarie Garland-Thomson.
Decoding Disability: The Logics of Eugenic Euthanasia
This paper lays out a logic for eugenic euthanasia as a means of selecting who is included and excluded in human communities. The case study offered to investigate this logic is the material process and matrix of representation used by the Nazi regime to euthanize people with disabilities in Germany between 1939 and 1942. The justification for eugenic euthanasia rests on scientific authority, instrumentalizing human value, and reducing human variation.

Deboleena Roy.
Synthetic Lives: Recombinant DNA, Minimal Genomes and the Parasite Within
A recent series of (polymerase) chain reactions has led us to a rather strange place – a place that, amidst much static, must also be filled with the possibility of new entanglements and new relationships. Life is starting all over again, albeit on a minimal scale, as a genetically altered copy of a bacterial genital parasite is poised to become the new Eve. In January 2008, J. Craig Venter and a team of seventeen scientists at the J. Craig Venter Institute (JCVI) in Rockville, Maryland announced that they had successfully created the first synthetic bacterial genome (Gibson et al., 2008). Using a variety of molecular biology techniques including in vitro recombination, cloning, polymerase chain reaction (PCR), in vivo recombination in yeast and “shotgun” sequencing, Venter and colleagues synthesized, assembled and cloned the complete bacterial genome referred to as Mycoplasma genitalium JCVI 1.0. This chemically synthesized chromosome is now poised to become a part of the first artificially synthesized life form. But before Venter and other synthetic biologists “shotgun” our way into a newly synthesized world, it may be time to take a closer look at the machine(s) being used to transport us there. This paper uses a feminist science studies approach not only to understand the social impacts of synthetic biology on the lives of those individuals who are typically marginalized in our society due to their gender, race or economic class, but also to create a space for shared perplexities. Perhaps there is even an opportunity here to intervene and contribute to the encoding mechanisms being used in synthetic biology.

Session 4 (D) Mitchell
Decoding Discourses Encouraging Social Responsibility in Training, Media, and Public Policy
Rebecca E. Burnett

L. Andrew Cooper.
HORROR FILMS AND CONTAMINATION: Decoding Media Representations to Decode Violence
On December 8, 1980, Ed Blanche from the Associated Press reported that feminists in Leeds, England, terrified and outraged by the Yorkshire Ripper’s brutal crimes against women, stormed movie theatres showing horror films and “hurled red paint at the screens.” The protesters saw a clear connection between film violence and real-world violence; every screening of bloodshed encouraged carnage in the world outside the theatre. Similarly, a letter to the editor published in The New York Times on January 25, 1981 blames films such as Friday the 13th for an “epidemic increase” of violence against women. The message of such protests is clear: people believed that technologically mediated representations of violence could cause violence in the real world. On the weekend of Friday, February 13, 2009, a remake of the film Friday the 13th took first place in box office revenue with $42.2 million. Despite the film’s graphic content, which is arguably stronger than the original’s, one college newspaper labeled it a “kinder, gentler brand of horror” (East Carolinian, February 17, 2009). Of course neither the 1981 nor the 2009 response to Friday the 13th reflects a universal attitude toward film violence, but the difference between the responses reflects a shift in people’s attitudes toward film technologies: people in 2009 aren’t nearly as concerned as their 1981 counterparts about the risks the media pose. Why? One answer lies in the way evolving media have maintained horror films’ presence in the popular consciousness. Condemned as dangerous trash that could contaminate their viewers, the original Friday the 13th and films like it were nevertheless resurrected for constant circulation on VHS and then again on DVD. The dissemination of these technologies has therefore paralleled the development of television news, which evolved between Vietnam and Iraq from showing horrific violence at dinnertime to showing horrific violence all the time.
Many film historians link the violence of horror films in the 1970s and 1980s with the televised brutality of the Vietnam War. The remakes seem to reflect on the wars in Afghanistan and Iraq—sometimes self-consciously, sometimes not, but activists aren’t throwing paint on any screens. This change in horror films’ reception suggests an underlying transformation of audiences’ attitudes toward horror films and toward the sorts of violence they depict. Using newspapers, film criticism, and the films themselves, this paper demonstrates how contemporary horror remakes provide a lens on changing attitudes toward violence, both real and fictional. Analyzing this change in attitudes reveals a historical shift in the political and social agency attributed to media. As transnational dissemination of new technological media has made violent films more ubiquitous, the media themselves have been attributed less power. This seemingly counterintuitive change in beliefs about film technologies’ agency reflects an underlying ideological accommodation of the United States’ violent foreign policy. 

Rebecca E. Burnett.
POLITICS AND CONDOMS: Decoding the Gag Rule
On January 23, 2009, three days after President Barack Obama was inaugurated, he issued a Presidential Memorandum that rescinded three previous Presidential Memoranda comprising what is broadly known as the Mexico City Policy (also called the Global Gag Rule). Obama characterized this policy as “excessively broad” and “unwarranted” in its efforts “to promote safe and effective voluntary family planning programs in foreign nations.” Obama’s decision represents a reversal of the history played out in 40 of the last 48 years, in which the dissemination of information about condoms and the dissemination of condoms themselves have been influenced by Federal policies reflecting moral and political biases, despite the verifiable merit of the technology and the widespread social need for it. For nearly 50 years, the international circulation of condoms has been controlled by contentious policies that have depended less on efficacy than on perceived moral impact. Since 1961, the U.S. Foreign Assistance Act has limited the availability and circulation of information and technology related to reproductive health — for example, restricting the distribution of condoms, information about their efficacy and effectiveness, and instruction about their use. These restrictions were formalized in 1984 when President Reagan announced the Mexico City Policy at the United Nations International Conference on Population. He directed the United States Agency for International Development (USAID) to limit family planning and “withhold USAID funds from NGOs [nongovernmental organizations] that use non-USAID funds to engage in a wide range of activities [related to reproductive health].” The Mexico City Policy was withdrawn during President Bill Clinton’s administration. In 2001, President George W.
Bush reinstated and extended the Mexico City Policy by imposing even stricter limitations on USAID support for circulation of information and technology related to reproductive health. In examining the ways in which changes in the circulation of technology affect sexual safety, this paper draws on two broad categories of information. First, archival sources (such as Presidential Memoranda, Congressional hearings, related news reports, and editorial commentary representing a range of political perspectives) help to determine the timeline of events, identify and characterize the stakeholders, and trace the narratives told by these stakeholders. Second, policy briefings and fact sheets from international health organizations help to define the international impacts on health. This paper demonstrates that the timeline of changes in the dissemination of technology can be correlated with changes in world health. This paper uses the timeline as the background for analyzing stakeholder narratives in light of the actual health of individuals affected by the Mexico City Policy and argues that dissemination of health technologies is interdependent with sociopolitical values. This paper links health crises in many countries (e.g., Bangladesh, Brazil, the Dominican Republic, Egypt, Ethiopia, Ghana, Kenya, Nepal, Pakistan, Romania, Tanzania, Turkey, Zambia, Zimbabwe) directly to the Mexico City Policy. Fear of losing USAID funding has led to extreme caution in what is accepted in family planning and reproductive health care programs. According to Population Action International, an independent policy advocacy group, the Mexico City Policy forced “a cruel choice” on developing countries that needed US support. Foreign NGOs had two choices: accept USAID funds for highly restricted health services (including restrictions that sometimes risked the health of patients) or reject the policy and lose critical U.S. support. 
This paper decodes values, foreign policy decisions affecting dissemination of information and technologies in developing countries, and the health in those developing countries, demonstrating that no matter how valuable a technology, if it is not circulated, it cannot contribute to social/cultural advancement, including sexual safety.

Beverly A. Sauer.
USING GESTURE: Decoding Technoscientific Discourse in Post-apartheid South Africa
This paper examines how technoscientific discourses function in safety training in post-Apartheid South Africa. The research is based upon a longitudinal study of technoscientific discourse in SA coal mines, using psycho-linguistic analysis of speech and gesture to examine key dimensions of the discourses of risk in difficult cross-cultural and institutional contexts. This paper draws upon 51 subject interviews with South Africans who differed across language, education, coal mining experience, and ethnicity. Interview questions were based upon misunderstandings documented in a 1997 analysis of coal mine safety training at the Kloppersbos training site. This paper examines the question: “Why does ‘Imbawula’ [Fanakalo, =‘Methane’] burn?” In 1997 training sessions, miners did not understand why Imbawula burned. Trainers interpreted miners’ confusion as resulting from their lack of education and knowledge of gas mechanics. This interpretation assumed (reasonably) that miners were part of the so-called Lost Generation who received little or no formal science education during the struggle following the Soweto uprisings. SA educators refer to this cohort when they discuss current problems of scientific literacy as the root cause of Black poverty and lack of opportunity. In the 2005 interviews, however, educated Zulu-speaking South Africans affirmed that ‘Imbawula does not burn’ with striking confidence in their interpretation. Analysis of the 2005 subject interviews reveals that the 1997 miners were actually scientifically correct. Imbawula has a range of meanings dependent upon language, culture, and context. In SA townships, Imbawula is a small stove; by extension, Imbawula is also the gas that emerges from the small stove—carbon monoxide. Subjects were therefore correct in asserting that Imbawula (carbon monoxide) does not burn.
Analysis of gesture in these interviews shows that these diverse meanings are visibly encoded in subjects’ elicited and naturalistic (idiosyncratic) gestural representations of (a) a small stove or (b) the movement of methane. These gestures also reflect the speaker’s experience in a coal mine. Coal miners have a more unified sense of the meaning of Imbawula than non-coal mining subjects. French-speaking Congolese mining engineers interpret the term to mean ‘chimney’--the distinct cone-like shape of a diamond mine. In addition to its practical application in unpacking local misunderstandings in coal mine safety training, this paper has important theoretical interest by providing a new methodological approach for examining the reflexivity of technoscientific discourses and the construction of scientific literacy in a multilingual culture (cf. Starr, Verran, among others). This problem of reflexivity is not limited to South Africa, but it is made visible in the highly stratified differences in language, education, and culture that still affect how South Africans interact in the workplace.

Session 4 (E) Whitman A
Fiction, Gaming, Postmodernity

Brandon Jones.
Wholeness and the Individual: Reclaiming Order in a Postmodern World
Thomas Pynchon’s The Crying of Lot 49 portrays a postmodern world rich with information that can be overwhelming to its characters and its readers. Looking through a literary and scientific lens, I will examine how Oedipa and the Tristero network both take disorder and fragmentation as intrinsic prerequisites in their conflicting approaches to handling the surrounding information. In the end, the novel supports the Tristero’s approach, as its success allows them a semblance of order and leads to the disintegration of Oedipa as an individual entity. Contrasting these approaches with the non-traditional methods of physicist David Bohm and poet Walt Whitman, I consider the plausibility that order and wholeness are inherent rather than disorder and fragmentation, and that a coherent notion of individuality exists in a postmodern world.

Cheryl Wood Ruggiero; Susan Allender-Hagedorn.
When the Other is Us: Postcolonialism without the Guilt in Joss Whedon's Firefly.
Post-colonial? We're post-9/11! Postcolonial guilt went down with the Towers. Evidence from our popular culture mirror includes Joss Whedon's space western series Firefly and companion film Serenity. In Whedon's universe, there are no Others. Humans terraformed planets when Earth-that-was could not support life. The ruling Alliance's core worlds are rich and dominant; outer planets are the frontier, where the Alliance drops settlers with little more than seeds and cattle. In a classic colonial relationship, colony worlds export raw materials, and the metropolitan Alliance profits by sending them back as hi-tech medicine and foodbars. Real-world postcolonial guilt comes from oppression/destruction of "native" races. In Firefly, there are none. The Alliance's secret atrocity is the use of Pax, a drug meant to end aggression; Pax caused the death by lethargy of most people on planet Miranda and turned a minority with bad drug reactions into the super-violent, cannibalistic Reavers. But the victims were humans, not an oppressed native species. Thus Firefly fans sit in the moral catbird seat: we despise the metropolitan power, root for the colonials, but never suffer guilt over the natives. Serenity's crew is a perfect expression of what America would like to be, post-9/11: feisty and guilt-free.

John Sharp.
Situating Play in the Past
This paper examines four examples of how games and play were situated within earlier cultures. The four moments under consideration are the 17th-century establishment of the Go Academy in Japan by Shogun Tokugawa Ieyasu; Friedrich von Schiller's discussions of game play as a form of creative expression; Marcel Duchamp's engagement with chess in the 1920s and 1930s; and the Hingham Institute Study Group on Space Warfare's "Theory of Computer Toys" and the subsequent creation of Spacewar!. Looking back to moments in the past in which games and play factored prominently in culture is an exercise well worth the effort: it provides us with perspective on the value games and their play can have.

John Johnston.
Game-World Fiction: Halting State and Daemon
While imaginatively exploring the dynamics of the Internet game-world, two recently published novels offer detailed mappings of significant technological transformations currently in process. Charles Stross' novel Halting State (2007) focuses directly on the world of Massively Multiplayer Online Role-Playing Games (MMORPGs). By having the narrative originate in a "real crime" perpetrated -- and perpetuated -- across both virtual and actual spaces and networks, Stross raises the question of the reality status of the on-line game world, which has rapidly evolved from a quasi-autonomous activity into a dynamic network of inherent complexity and enormous influence in present social and economic reality. In Daemon (2008) Daniel Suarez shows how the on-line game world may provide a transformational matrix for a new and fully distributed and automated society engineered by a remorseless machine -- the Daemon of the title. Operating primarily through the actions of webbots triggered by specific events announced in the mass media and coordinated by a distributed AI game engine, the Daemon begins to extend its reach into and refashion the world according to the apparently unstoppable dictates of its own logic.

Session 4 (F) Whitman B
Decoding Narratives of Nanotechnology
Lisa Yaszek
The members of the panel use critical frameworks drawn from the humanities and social sciences to illuminate the rhetorical strategies used by government officials, game designers, and scientists to garner consensus for specific avenues of nanotech research and development. While the first two panelists use game and science fiction studies to examine the futures promised to us by present-day nanotech discourses, the final panelist uses literary theory and the sociology of science to rethink the history of nanotech research and development itself.

Colin Milburn.
"Have Nanosuit—Will Travel: Military Nanotechnology, Video Games, and the Crisis of the Digital Battlefield”
Military scientists frequently claim that nanotechnology will transform the future of warfare, integrating soldiers and machines into the “digital battlefield” at a molecular level and making combat programmable at every scale: war becomes a video game. At the same time, consumer video games increasingly feature simulations of “nanowar” based on the research agendas of real scientific institutions. In this paper, Colin Milburn examines the convergence of military nanotechnology with video game culture. Focusing on the recent game Crysis, which follows the adventures of a soldier equipped with a nanotechnology battlesuit (based on U.S. Army prototypes), he investigates how players navigate the speculative condition of nanowar. Crysis players typically adapt to the official rhetoric of military nanotechnology, which presents nanotechnology as a form of masculine empowerment. At the same time, these players often comprehend the functioning of military nanotechnology as occasioning a state of crisis: a crisis of gender and human embodiment. Therefore, even as video game players adjust to the concepts of nanowar, their engagement with the digital battlefield as an everyday playspace simultaneously opens the discourse of military nanotechnology up to other politics, other genders, and other futures.

Lisa Yaszek.
"Science, Fiction and American Public Policy"
In this presentation, Lisa Yaszek uses the critical apparatus of science fiction studies to make sense of public policy writing about nanotechnology. She begins by considering the surprisingly similar narrative imperatives between science fiction and public policy writing about emergent technosciences, exploring how authors in both domains strive to build “realistic” worlds of tomorrow by extrapolating from the utopian promises of emergent sciences and technologies, acknowledging the potential risks inherent in these sciences and technologies and explaining how creative and rational people can reduce those risks to conquer the future. Yaszek then reads four key documents in the construction of American public policy on nanotechnology through the lens of formalist science fiction criticism, demonstrating how the major story and character types commonly associated with science fiction inform national debates over the meaning and value of nanotechnology itself. It is precisely by telling tall but persuasive tales about the future, Yaszek contends, that public policymakers build that future in our own world today.

Michael Bennett.
"Simulation of Influence"
Currents of humanities and social science studies of the emergence of nanotechnoscience have recently turned to assessments of disciplinary and instrumental originary impacts: was the scanning tunneling microscope maximally seminal, they wonder, or were previously unknown avenues of materials science research initially decisive? In this presentation Michael Bennett contends that while these currents have considerable critical historical value, in isolation they also tend to radically discount important political aspects of “the birth of nanotechnology,” particularly concerning government funding. Deploying an amalgam of Bloomian antithetical literary criticism and the institutional sociology of science's insights into intra-discipline stratification, Bennett strives to revivify the sclerosed Feynman-Drexler origin story of nanotechnoscience by recasting it as a strategic and political expedient whose historical value outweighs the import of the recent revisionist disciplinary-instrumental debates.

Session 4 (G) Woodruff A
Rules of the Game: Perspectives on Hunting
Chair: Susan McHugh
In the literature of Animal Studies, what hunters say about what they do is most often greeted with responses ranging from disgust to accusations of self-serving rationalization. The scholarly tone is typically a kind of ironic knowing with which we are all generally familiar. This panel begins with the agreement that we will take the ideas and comments of hunters seriously. If a hunter says that shooting animals makes him or her feel closer to and even at one with nature, we want to understand that claim as it is. By sketching encounters with hunters over different times and spaces, and comparing these as well to the ideas expressed by those who hunt things other than animals, the panel hopes to begin a new kind of discussion about hunting practices.

Alissa Mazow.
Cy Twombly, John Cage, and the Art of Hunting Mushrooms
In the summer of 1964 the artist Cy Twombly hiked among the Dolomites of Italy, observing nature and hunting mushrooms. Some scholars argue that this activity inspired a series of lithographs he made ten years later—Natural History Part I Mushrooms (1974)—a portfolio of ten prints that marks an important, and overlooked, model for the intersection of art and science in later twentieth-century visual culture. In his fungal foray, Twombly integrates the collage and the sketch, the painting and the diagram, the image and the word, addressing such diverse themes as repetition and process, artistic and biological reproduction, life and death. A further consideration of these prints is that Twombly was hardly alone in his recognition of mushrooming as a possible route to artistic expression and to understanding human perception. The avant-garde composer John Cage, too, was taken with mushrooming throughout his life, his mycological activities providing a critical foil against which to read Twombly’s mushrooms. Twombly’s use of the mushroom to explore empiricism’s claims to truth is matched by Cage’s zeal for mycology as a subject that requires many of the same acute skills of perception as does music. Arguably, what Twombly, Cage, and a number of notable artists, writers, and scientists (including Vladimir Nabokov, R. Gordon Wasson, and Valentina Pavlovna Wasson) tell us is that, for so many, the journey of the mushroom hunt is as important as the proposed endgame. Compositions and mushroom forays provide a structure for thought and action, where participants embark on a quest, a hunt, to identify and uncover the subject: the art, the music, or the mushroom. But as the Fluxus artist Alison Knowles observed, so often this search leads to “other things.” This paper will explore the contemplative space opened up by mushroom hunting, in practice and in the arts, giving a degree of consideration to the ways that hunting mushrooms and hunting mammals are similar and/or dissimilar.

Nigel Rothfels.
Decoding the Rules of the Game

Karen Syse.
Stalking the Stag: Hunting for Nostalgia in Scotland
The Scottish red deer was first and foremost a beast of the forest. With the post-war afforestation of Scottish hill land, it has become the British Forestry Commission’s main concern. Deer like to eat Sitka spruce, and because of this they have a dual role in Scotland: they are a prized trophy for hunters, yet at the same time they are classified as vermin when they move into Sitka spruce plantations. Deer are also highly symbolic creatures, which help link the hunters to the past and let them tie scientific deer-management practice to what they consider to be a traditional way of life in rural Scotland. The deer have evolved from being a medieval aristocratic target for hunting, to a Victorian rite of passage for empire-building men, to finally achieving an ambivalent position as part vermin and part trophy for paying deerstalkers. Yet the deer, the Monarch of the Glen, holds its position as an important national symbol.

Garry Marvin.
‘... and that’s why the elk is left on the stairs’: Ritual, Respect and Dignity in Hunting
In this paper I will attempt to explore issues relating to the activities and experiences of hunters. Rather than taking a critical perspective from outside the event this paper will be based on anthropological research from within the cultures of hunting. The hunters with whom I work could be classed, in terms of Kellert’s classification of types of hunters, as ‘nature hunters’ - those who seek a close engagement with wild animals, and the landscapes they inhabit, through hunting. My research suggests that although the killing of an animal is often the end of what is regarded as a successful hunt the fact that an animal is killed is less significant than how that death was brought about. I am fascinated that a hunt without a kill can, nevertheless, be regarded as having been a satisfying event. The focus of this paper will be the nature of the experience that the hunters with whom I work seek and how that experience is created and generated out of the challenges that the landscapes, the conditions and the animals present. Fundamental to this experience is how one behaves and performs as a hunter in response to these challenges. For the hunters with whom I have spoken one must hunt with dignity and respect and in an appropriate manner. Without this one has not, truly, hunted nor hunted truly - and that is why the elk was left on the stairs. That elk was the end of a moral story told to me by a hunter and I will begin with that ending.

Session 4 (H) Woodruff B
Whitehead+Cosmopolitics I: novel ecologies of theory i
Steven Meyer
This is the first of four panels for the stream "Whitehead+Cosmopolitics" proposed by Steven Meyer and Sha Xin Wei. The abstract is identical for each panel, although the presentations differ. Most generally, this four-panel stream poses the question of the relation between the cosmopolitics envisioned by Isabelle Stengers in her “Cosmopolitical Proposal” (2005)—and back of that, the two-volume Cosmopolitiques (1997, 2003)—and the philosophy of social organism of Alfred North Whitehead. Both Stengers and Whitehead insist that any cosmological inquiry worthy of the name must take science (including technoscience) seriously, and that this means more than merely subjecting it to traditionally social and political as well as philosophical critique. For starters, it means cosmopolitics and it means societies of actual occasions. How to undertake the relevant cosmopolitical and Whiteheadian analysis is, to be sure, a complicated question; and the ten panelists approach it variously, in some instances directly, in others indirectly. The stream is divided between theoretical and practical approaches, although the frame is deliberately porous. As Andy Goffey remarks in an observation equal parts Whiteheadian and Stengersian, the idea is to develop more specific accounts of “the cultivation of thinking practices that evade the easy academic distinction between theory and practice."

Thomas Lamarre.
Cosmopolitics and Biopolitics
The inseparability of nature from politics has become a key problem for a number of thinkers in recent years, and we have seen the emergence of two manners of thought, cosmopolitics and biopolitics. Both manners differ from traditional political theory in their insistence on the dynamism of life and of nature. It is not a matter of a pre-established harmony (cosmology) that provides a foundation for natural rights or natural sovereignty. Rather, cosmopolitical and biopolitical theories confront the natural as “chaosmos,” reckoning with the fields of forces as a ground for political action. Yet, where cosmopolitics presents a proposal for new ways of thinking the political that draw on the insights of sciences and technologies, biopolitics identifies a fundamental impasse in modern thinking about the political. Reductively speaking, we might contrast Latour and Agamben. Latour proposes a sort of amplification of natural rights by according speech to quasi-objects and quasi-subjects who nonetheless require representation by humans. Agamben continually refers us to the reduction of the human to “naked life.” To work between such currents, I propose to explore what Whitehead’s rethinking of the human might contribute to such discussions, with an emphasis on his radical extension of the dynamics of human emotion toward the “chaosmos,” and with reference to Stengers’s and Deleuze’s readings of Whitehead. I also propose to consider how the cosmopolitical and biopolitical potentially intersect in a politics of the commons rather than a politics of rights.

Steven Shaviro.
The Actual Volcano: Whitehead, Harman, and the Problem of Relations
In the last several years, a group of philosophers known as "speculative realists" -- most notably Graham Harman, Quentin Meillassoux, Ray Brassier, and Iain Hamilton Grant -- have forced us to look at the status of modern thought in a new way. They have all questioned the anti-realist and human-centered assumptions of most post-Kantian thought, and opened up the possibility of new forms of philosophical speculation that are no longer restricted to "self-reflexive remarks about human language and cognition." Graham Harman, in particular, has invoked Alfred North Whitehead as an ally in this endeavor; for Whitehead is one of the few twentieth-century thinkers who dares "to venture beyond the human sphere," and to develop a metaphysics in which all entities, from God to "the most trivial puff of existence in far-off empty space," are placed upon the same ontological footing. Harman is close to Whitehead in many ways, but he also criticizes the "relationism" that is so central to Whitehead's cosmology. In this talk, my aim is both to show how Harman helps us to see Whitehead in a new light, and also to develop a Whiteheadian reading of Harman, one that incorporates Harman's insights but maintains Whitehead's doctrine of the importance of relations.

Jennifer Spiegel.
The Bios, Cosmos, and Ec(h)o of Political Praxis
As the refrains of ecological crisis become ubiquitous, the political stakes of how ecological relations are cast are increasingly coming under scrutiny. In The Three Ecologies, Guattari identifies the levels of ‘ecological crisis’ as the mental, the social, and the environmental. In so doing, he opens a host of questions concerning the relationship between the various discourses committed to these levels of ecological study. In her reading of Whitehead and science, Stengers draws attention to the bifurcated understanding of nature that continues to haunt ecological discussions: the “objective,” law-based understanding provided by the sciences, and the “subjective” understanding of the arts. Her reading of Whitehead suggests that a process of cosmological adventuring may be able to navigate this fracture, not by transcending or displacing it, but by focusing on the manner in which the cosmos is composed through an adventurous abstraction of the various elements that are salient. This process is inherently socio-political, insofar as the manner in which society endures depends upon the manner in which it takes place. Such cosmological adventuring would, however, seem to be hemmed in by what Foucault has called biopolitical regimes: processes controlling the sorts of material and conceptual adventuring that might take place. This paper places Stengers’s account of the cosmological adventurer into dialogue with Foucault’s biopolitics in order to shed light on the manner in which ecological relations are forged. It explores the tensions between the bio-political, the cosmo-political, and the eco-political, and proposes that the process of cosmological adventuring may, through repetition and transmutation of received social organizations, transfigure the domain of eco-political relations.

Session 5 - Fri 3:30pm - 5pm

Session 5 (A) Bennett
Autopoiesis, Gaia, Climate, and Time
Chair: Bruce Clarke

Bob Markley.
Embodied Time, Climatological Time, Sustainability
This talk explores a key conceptual tension between two notions of time: embodied or experiential time and climatological time. The latter, measured in centuries and millennia, exists beyond our daily, temporally bounded experiences of the weather, and beyond the duration of our lifetimes. Climatological time is dynamic, shaped and recalibrated, as Bruno Latour suggests, by the networks, alliances, and assemblages that collect, transmit, verify, interpret, and disseminate data; that then reaffirm or modify assumptions and values about the natural world; and that continually negotiate the vexed relationship between seemingly individual, embodied experience and scientific knowledge. In this respect, climatological time registers the theoretical relationships between qualitative experience and quantitative knowledge, between human history and the earth's history. The tensions between embodied and climatological time lie at the heart of the sociocultural, political, economic and ecological debates that shape both progressive and retrograde attitudes toward global warming, including efforts to galvanize the resources, capital, and political will needed to move urgently toward a paradigm of sustainability. Yet the ideal of sustainability that underlies most plans of collective action to address global warming rests on a host of unexamined, and in some cases ahistorical, perceptions of humankind's relationship to climatological time. In retheorizing sustainability, I argue that we need to examine critically the ways in which climatological time has emerged, developed, and been redefined since the sixteenth century.

Bruce Clarke.
Autopoiesis and the Planet
Along one line of development, the concept of autopoiesis has unfolded with second-order systems theory as a discourse of epistemological constructivism. Along another, autopoiesis has been brought up to the level of geobiological systems theories of planetary regulation, the Gaia theory of James Lovelock as elaborated by Lynn Margulis. Here I will extend a discussion I opened up at the 2007 meeting, about how Margulis has applied the discourse of autopoiesis to Gaia theory, by drawing out connections to the wider discourse of self-referential systems. I will argue that Margulis has effectively remediated Lovelock’s homeostatic or first-order cybernetic animism through a second-order systems-theoretical resolution of his overly “strong” form of the Gaia concept. Margulis has retraced for Gaia a metabiotic course parallel to that by which Niklas Luhmann has carried the theory of autopoiesis over into the metabiotic co-emergence of consciousness and communication. Additionally, a 2004 article by Paul Bourgine and John Stewart, “Autopoiesis and Cognition,” by emphasizing the functional rather than material instantiation of the boundary delineating the autopoietic system, also makes a case for Gaian autopoiesis.

Henry Sussman.

Session 5 (B) Crescent
Beyond Likeness: Propositional Statements - Displacing/Replacing the Recognized Capacity of Portraits to Represent, Session 2
James McManus
In his book Portraiture, Richard Brilliant posits the notion that portraits, presented in the form of propositional statements fundamental to everyday experience, capable of articulating our beliefs, and offered as the predicates of a given subject, possess the capacity to elicit from the viewer the response “this is so and so.” Here, the portrait image supports the urge to identify the subject through connections with likeness referring to the actual person. Long held captive by the need to portray the subject's physiognomic likeness, portraits, Brilliant contends, must present “some discernible connection between the visible image and the person portrayed in order to legitimize the analogy . . .” - a degree of resemblance. This necessity has historically imposed constraints on the image's freedom of reference, and “has brought about the term 'likeness' as a synonym for 'portrait'.” What happens when “likeness” is no longer the primary concern of the portraitist in the treatment of the subject? Such a paradigm shift expands exponentially the potential forms assumed by propositional statements offered as predicates of the chosen subject. Presenters in the two sessions will investigate various ways in which artists during the twentieth century moved beyond the long history of portraiture in which the artist had been consumed by the need to mirror the subject's physiognomic likeness. From early in the twentieth century, questions regarding the individual in an increasingly urbanized/industrialized society, as well as questions related to identity, assumed a prominent position among modernist thinkers. Access to information regarding new technologies and sciences, along with a tectonic drift away from the cultural order of the previous century, created new opportunities - opportunities for avant-garde artists to invent new means, free of the need to achieve “likeness,” in pictorial discussions of their subjects.

James McManus.
“Dr. O’Doherty’s ‘ HeArt Machine’: A Portrait of Marcel Duchamp”
Two incidents motivated the artist and former physician Dr. Brian O’Doherty to produce his sixteen-part Portrait of Marcel Duchamp (1966-67). One was Duchamp’s statement that works of art have short life spans and die with the artist - an argument O’Doherty vigorously sought to refute. The other was born out of Thomas B. Hess’s vicious attack on Duchamp in the essay “J’Accuse Marcel Duchamp,” published in the February 1965 issue of Art News. Hess opined that Duchamp “disastrously has confused art with life. . . . He has tried to turn himself into a masterpiece, and through his example, has been a corruptor of youth.” How to respond? O’Doherty would transform and “transplant” his subject’s heartbeat into a series of still and kinetic images. He knew exactly what he was looking for: turning Duchamp’s heartbeat into a work of art - a living heartbeat, mechanomorphed and boxed, a heartbeat that, under his control, would continue indefinitely. Like the fictitious Edison in Villiers de l’Isle-Adam’s Eve of the Future Eden, whose android Hadaly replaced Alicia Clary, and two figures from Fritz Lang’s classic film Metropolis - C. A. Rotwang, the inventor of the machine-human Maria, and Grot, supervisor of the “Heart Machine” that controlled the flow of electrical current to the city - O’Doherty had to fashion his android, his replacement for Duchamp, having it assume his subject’s identity, feeling of being, and sense of consciousness. He began by taking Duchamp’s electrocardiogram. Working with its data, he set out to design and build his machine, modeled on an oscilloscope, providing an optical presentation of the heartbeat. As a work of art it assumed its place in galleries and museums. O’Doherty would have Duchamp alive and in his hands, refuting the artist’s claim that his works would die with him.
This paper looks at the motives behind Portrait of Marcel Duchamp, O’Doherty’s approach to problem solving as an artist/scientist creating its sixteen parts, and the work’s implications following Duchamp’s death. At that time (1968) Man Ray declared, “His heart obeyed him and stopped.” - Or did it?

Hannah Wong.
“Portrait of a Lady: Humor and Francis Picabia's Mechanomorphic Object Portrait, Jeune fille américaine.”
On the occasion of his second visit to America in 1915, French-born artist Francis Picabia told a reporter for the New York Tribune that “almost immediately upon coming to America it flashed on me that the genius of the modern world is in machinery, and that through machinery art ought to find a most vivid expression.” It was during his time in New York that Picabia developed his mechanomorphic “object portraits,” which appeared in the publication 291. These “object portraits” referenced popular science and machinery, and scholars have identified many of the scientific and technological sources that served as Picabia’s inspiration. What scholars have yet to explore fully are the types of humor embedded in the clever visual puns and relationships drawn between the “sitter” and mechanical objects. This paper will explore Picabia’s Portrait d’une jeune fille américaine dans l’état de nudité [Portrait of a Young American Girl in the State of Nudity] from 1915, a “portrait” of a young woman depicted as a spark plug. Past Picabia scholarship already demonstrates that the cultural context of Picabia’s imagery enriches one’s understanding of his work. I will argue that cultural context as well as Picabia’s own oeuvre can enrich the viewer’s appreciation of visual jokes found in works such as Portrait.

Kate Dempsey.
“Seeing Double: Ray Johnson and Marcel Duchamp”
Ray Johnson and Marcel Duchamp complicated the viewing process. Duchamp detested what he called "retinal" art, yet studied optics intensely and addressed this interest in numerous works. Johnson’s manipulations of the optical were more conceptual than scientific. Many of his works take the act of viewing as their subject. One series of collages, for example, depicts various people "getting a better view of Étant donnés." Johnson saw everything through correspondences as is exemplified in his silhouette portraits—each sitter’s profile is filled with images that Johnson associated with that sitter. In focusing on a portrait of Duchamp by Johnson, I will highlight the serious consideration that both artists dedicated to understanding as well as challenging vision.

Session 5 (C) Doggett
Decoding Social Networks
Patrick Jagoda
As evidenced by the explosive recent popularity of online social networks such as Facebook, social relations have increasingly become digitally encoded. Yet how do the cryptographic methods employed by digital media interact with the equally (if not more) complex social codings already implicit in the networks of social relations they subsume? How can new technologies and interventions exploit the intersections and discontinuities inherent to such systems? In order to grapple with these questions, this panel will explore both the codings that produce contemporary social networks and the critical decodings necessary to make social sense of technologies of interconnection.

Zach Blas.
GRID: Viral Contagions in Homosexuality & the Queer Aesthetics of Infection
This paper seeks out a new queer viral aesthetic configuration that decodes and reconfigures the biosocialities of homosexuality. Today, two overarching grids can be identified that shape and define the biologies and cultures of homosexuality: 1) G.R.I.D. (Gay-Related Immune Deficiency), the term briefly given to AIDS in 1981, sutures itself perpetually to the concept of the homosexual body, encrypting it into a grid of viral contagion and disease, and 2) our contemporary grids of communication and capital have produced and replicated an image of the homosexual as always complicit within flows of consumption and nationalism. While Jussi Parikka defines this contemporary regime of capitalism as viral, “capable of continuous modulation and heterogenesis,” Hardt and Negri echo this claim when they write, “Empire’s institutional structure is like a software program that carries a virus along with it, so that it is continually modulating and corrupting the institutional forms around it.” Jasbir Puar suggests this biopolitical model produces the homonational or homonormative to subsume all homosexuals. The homosexual finds itself encrypted within the paradox of two interlocking viral grids: a biological grid that always already casts the homosexual body out as diseased, infected, dying, dead, and another cultural grid of capital that continuously modulates and reproduces the image and the very idea of homosexuality as anything but an other to heterosexuality and the nation. The homosexual as grid persists like a ghost, with a body disposed of and an image antithetical to all the bodies it “correlates” with that spreads at the rate of digital replication and infection. The grids that constitute homosexual biosocialities code the persistence, reproduction, replication, existence, and visualization of something called “homosexuality” through its viral logic. Indeed, this biopolitical GRID of modernity and globalization that shapes the homosexual is itself a virus/viral. 
As Galloway and Thacker write, “viruses are life exploiting life [. . .] exploit[ing] normal functioning of their host systems to produce more copies of themselves.” As GRID reproduces a biosocial monolith of homosexuality through a process of erratic replication as a way of “never-being-the-same,” a queer grid must aim toward a de-coding of GRID through an exploitation of the selves of its nodes to produce a replicated difference of never being-the-same. By turning to the work of the digital art and activist organization Queer Technologies (which I founded), I would like to consider its projects “GRID” and “Gay Bombs” as viral methodologies for reconstructing the dominant GRID of homosexuality. Here I employ Alan Liu’s call for destructive creativity—a creativity that goes “beyond the new picturesque of mutation and mixing to [. . .] the new sublime of ‘destruction.’ [. . . a] viral aesthetics.” Such an aesthetic tactic would be a viral de-coding of the homosexual’s self that is simultaneously an escaping of GRID and a reconstitution. The queer aesthetics of GRID relay throughout a repetitive stream of disidentifications, a queer cryptography, repetitively infecting/de-coding the infections of viral capitalism and homonormativity (a “cool” virus), at the risk of obliterating one’s own “hygiene”: an aesthetics that de-codes its own historical infections to vie for agency. Such a project would always be rooted in the biopolitical. If viral repetitions have been defined as “illegible and incalculable,” this project for a new aesthetics of queer grids must replicate and reproduce a queer affect as de-coded escape—nonhygienic ways of being, living, de-coding/re-coding—that charts the possibilities of queer world-making on and off the GRID.

Casey Alt.
VacilLogix™: When Social Networks Go Sociopathic
Founded in 2007, the social media startup company VacilLogix™ trumpets itself as the first entrepreneurial venture to fully embrace the limitless productive power of sociopathy. As brazenly stated in its corporate mission statement, VacilLogix™ is committed to the creation of social value derivatives and the universal democratization of sociopathic strategies. Their goal is complete deregulation of social value systems. As such, VacilLogix™ considers itself not just a software company or even a brand, but a social movement, a cultural revolution, a collective art performance to "Outcode our moral code." Its primary point of client contact is its Slightly Sociopathic Software™ line, designed to enhance and automate sociopathic interactions. The line consists of four software applications, including a utility to track and manage lies ("The Deceptionist™ — your personal deception receptionist!") and the StalkBroker™, which helps people discreetly surveil others' activities. Initially intended as a suite of online social networking applications, the first Slightly Sociopathic Software™ line will feature custom-designed Facebook, MySpace, and iPhone applications designed to help "corporations-of-one" better exploit and leverage the digitized relationships inherent to online social networks. This paper will explore the as-yet nascent phenomenon that is VacilLogix™ and argue for its immense critical significance as a harbinger of the next generation of social media applications.

Patrick Jagoda.
Wired: The Social Networks of David Simon's "Other America"
Beginning in the mid-twentieth century, the network emerged as the principal architecture and most multivalent metaphor of the globalizing world. Especially in recent years, various cultural anxieties have come to characterize distributed networks. Collective formations such as oppositional terrorist networks, volatile economic networks, uncontrollable computer networks, and threatening disease ecologies have been perceived as the most persistent objects of fear in an interconnected world. In an attempt to complicate this cultural perception of networks, this paper explores the recent HBO television show The Wire, which renders the terror inherent in a different type of social web. Through a narrative and visual analysis of the show, I contend that the fear of the network form, as such, obfuscates a deeper fear of a widespread social network of racial prejudice, unemployment, poverty, and crime. The Wire represents various nodes of this more expansive American capitalist network. Moreover, the television program decodes the operation of both material surveillance technologies and systemic technologies of social control.

Session 5 (D) Mitchell
Playing Chicken with Theory
Heidegger and Ragtime Joe: Poetics and Autopoietics
This talk puts Heidegger’s Poetry, Language, Thought into dialogue with the classic American country song “C-H-I-C-K-E-N,” where we encounter a Heideggerian poet in the guise of backwoods virtuoso Ragtime Joe. I use this discussion to explore ways in which poetics is related to autopoietics, the logic of complex systems.

Kevin Chua.
Haacke’s Luhmann: Rethinking Shapolsky et al.
This paper revisits Hans Haacke’s Shapolsky et al. Manhattan Real Estate Holdings, a Real-Time Social System, as of May 1, 1971 (1971) through Niklas Luhmann’s social systems theory. Haacke’s installation, which consisted of a number of photographs of tenement buildings in New York City plus information and diagrams of the Shapolsky et al. real estate holding company, matter-of-factly traced a system of corruption in the real estate economy in New York City by exposing the financial connections of various individuals in the Shapolsky scheme. The work provoked enormous ire, and famously resulted in the cancellation of Haacke’s planned retrospective exhibition at the Guggenheim Museum in 1971 – but thereby revealed a hidden nexus of art, money, and power that many did not want to see, much less acknowledge. My paper will examine Haacke’s seminal installation using Luhmann’s social systems theory (a theory whose purchase has mostly been sociological rather than relevant to a socio-historical aesthetics). I argue that Haacke’s installation provides a demonstration of Luhmann’s sociological aesthetics – especially in its specifically visual dimension. As much as Haacke’s work turns its gaze (back) on the real estate system in New York City, I propose to open the work up to the wider forces of the global financial economy in the 1970s. Do the epistemological limits of Haacke’s work stop at the real estate system, or do they extend toward the city, and the world? The aim of this paper is to ask whether we can effectively see global pressures of gentrification and spatial differentiation taking place in the “system” of the New York real estate economy in the early 1970s, on the ground – and in the tilted horizon of Haacke’s installation.

Susan Squier.
Inauguration: auguries of race, property and the law in the image of the black chicken thief
This talk, taken from the final chapter of my book on chickens as cultural and scientific objects to think with, examines the image of the black chicken thief as it functioned in a highly circulated image from the Obama presidential campaign era, the "Obama bucks."

Session 5 (E) Whitman A
Virtual Geographies II

Christopher Miller.
“Beginners Play Atari: Human Creativity in the Age of Procedurality”
Now that the coded nature of games is increasingly apparent, it would seem that the most skilled gamers are those who can most effectively “decode” the game: those who can think most like a machine. This presentation will argue that, on the contrary, it is precisely the most skilled gamers who, by pushing the game beyond its prescribed limits, can and must exert their creativity, and thus their humanity, most powerfully. In competitive gaming especially, innovations by players have been so revolutionary that gameplay is determined as much by players as it is by designers. Recognizing this requires that we dispose of old interpretive models based on print culture (models that, despite the rise in critical work on video games, most theorists still hold) and embrace new, more horizontal and open critical practices.

Alenda Chang.
“Mission Planet”: Games as Virtual Ecologies
Environmental trends are increasingly the subject of a variety of games that suggest surprising new approaches to both game studies and environmental advocacy.  Such games raise an interesting complex of questions: how do games model “nature” and relevant scientific theories, and how do code-based representations of nature differ from those in more traditional media?  Do games permit a better understanding of natural processes by moving past the visualization of data to procedural or algorithmic embodiment?  Digital games and networked media offer promising avenues not only for rendering the realities of environmental crisis—nature as problem space—but also for schematizing possible solutions in ways that leverage the unique affordances of the computer, the Internet, and player collectives.

Session 5 (F) Whitman B
From Encoding through Decoding to Transformation
Elizabeth McCormack; Anne Dalke; Ava Blitz
We are a literary critic, physicist and artist (representatives, respectively, of “literature, science and the arts”) who are intrigued by the mysterious similarity of structures, from the microscopic to the macroscopic, found in astronomy, geology, biology, and cultural forms such as art and literature. The more we look, the more blurred become the distinctions we see between fossil and artifact, between natural and cultural history, between the past, the present, and the future. Common to all these fields is an on-going process of exploration, experimentation, discovery, encoding and decoding, interactivities of interpretation that in turn create new, previously non-existent mysteries. We are proposing a panel to demonstrate this complex process of encoding, decoding and transformation. Ava will initiate our public conversation by describing the open-ended process whereby she encodes her “pixies,” landscape and sculpture projects. Liz and Anne will demonstrate different “decodings” of those evolving scripts of creativity. All of us will then join in an account that interprets the dynamic of encoding and decoding to describe the kind of transformation that can take place when art is made without known, desired or predictable outcomes. The decoding process, as we experience and explain it, highlights less the clarity than the mystery of interpretation. Ava Blitz is a visual artist who divides her time between studio work in sculpture, works on paper, photography, and public art. Anne Dalke is a literary critic interested in emergent pedagogies, feminist theory and narrative traditions, revisionary work in canon of American literatures, and the intersections between science and literature. Elizabeth McCormack is a physicist and graduate school dean. Her research uses techniques in nonlinear optical laser spectroscopy to study fundamental characteristics and excited state decay dynamics of small molecules.

Session 5 (G) Woodruff A
Whitehead+Cosmopolitics II: novel ecologies of theory ii
Steven Meyer
This is the second of four panels for the stream "Whitehead+Cosmopolitics," proposed by Steven Meyer and Sha Xin Wei. The abstract is identical for each panel, although the presentations differ. Most generally, this four-panel stream poses the question of the relation between the cosmopolitics envisioned by Isabelle Stengers in her “Cosmopolitical Proposal” (2005)—and back of that, the two-volume Cosmopolitiques (1997, 2003)—and the philosophy of social organism of Alfred North Whitehead. Both Stengers and Whitehead insist that any cosmological inquiry worthy of the name must take science (including technoscience) seriously, and that this means more than merely subjecting it to traditionally social and political as well as philosophical critique. For starters, it means cosmopolitics and it means societies of actual occasions. How to undertake the relevant cosmopolitical and Whiteheadian analysis is, to be sure, a complicated question; and the ten panelists approach it variously, in some instances directly, in others indirectly. The stream is divided between theoretical and practical approaches, although the frame is deliberately porous. As Andy Goffey remarks in an observation equal parts Whiteheadian and Stengersian, the idea is to develop more specific accounts of “the cultivation of thinking practices that evade the easy academic distinction between theory and practice."

Sha Xin Wei.
Morphogenesis: Whitehead's Concrescence, Alexander's Unfolding, and Serres' Noise
In this paper, I develop my consideration of morphogenesis, setting out from A.N. Whitehead, visiting Christopher Alexander’s recent work, and landing in Michel Serres. One of my fundamental motivations for this concern is in fact a return to a tissue- or field-based extension of biopolitics that respects the autopoietic qualities of systems but does not reduce to just mimicking biological, informatic, or graph-theoretic schema. In Process and Reality, Whitehead states his ontology unequivocally as a philosophy of process: “How an actual entity becomes constitutes what the actual entity is.” (PR 64) This process of concrescence is marked by the production of a novel occasion out of many occasions, the “production of novel togetherness.” (PR 22) Much space is devoted to what one might characterize as a synchronic description of morphogenesis, but Whitehead also dedicates the third quarter of PR fully to detailing the dynamic mechanism of his morphogenetic process of concrescence. To make this relatively self-contained I will briefly summarize Whitehead’s construction, although in this particular presentation I wish to focus on Whitehead’s fundamental appeal to the Principle of Least Action, and contrast it with four other morphogenetic principles. In his late, and philosophically most evocative work, The Nature of Order, Christopher Alexander arrived at the same central problem of morphogenesis from the practical art of building living structures via living processes exfoliating in space and place, explicitly revealing his relation to Whitehead and Leibniz. However, Alexander recognized several alternatives to the principle of least action—emergence in complex dynamical systems, richness through an evolutionary process of adaptation or selection (NO2 45). Alexander remarked that there must be a morphogenetic dynamic beyond those governed by “energetic” action, which he equated with the category of number-measure.
He therefore conjectured a fourth principle, a morphogenetic unfolding defined with respect to a slippery concept of wholeness that is thoroughly Whiteheadian. Against these four morphogenetic principles I would contrast a fifth: a Serrean noise of excess, which has nothing to do with the abstract randomness of information or complexity theory, but echoes the clamor of the sea. My presentation will end by asking what relations may exist between any of these principles of morphogenesis and Stengers’s cosmopolitical dynamics explored by the other panelists.

Joan Richardson.
Windows 2009 or Notations of the Wild, the Ruinous Waste
Whitehead: “We find ourselves in a buzzing world, amid a democracy of fellow creatures.” William James: “The baby, assailed by eyes, ears, nose, skin, and entrails at once, feels it all as one great blooming, buzzing confusion; and to the very end of life….” I want to keep the space of James’s sentence open for now, a window through which to look back to last year, when I attempted to score my remarks somewhat closer to a musical composition—chording, resting, returning again to the beginning—using additional kinds of punctuation to indicate Stop. Look. Listen to the overtones, “feeling-tones,” the “buzzing” of our synaptic connections making contact with the “‘cosmic’ event,” the residual elements of cosmological creation and other creatures resident still in our bodies. We are, in fact, the stuff that stars are made of and so can know “what the sleeping rocks dream” if we re-cognize “our bond to all that dust”: what Stengers, continuing the work of Whitehead, Deleuze and other “studious ghosts,” enjoins us to attend to under the rubric of cosmopolitique. I would like this year to add the voices of two more “studious ghosts” for our consideration within the stream: Jean Wahl’s (complicating the Deleuze connection as well as illuminating the Whitehead-Wallace Stevens link) and Gregory Bateson’s. In the pieces collected in Steps to an Ecology of Mind, written over the span of his career, Bateson references Whitehead, as well as Whitehead and Russell’s Principia, nine times, particularly the theory of sets. For Bateson, confusion of different registers of language/experience, combined with the lack of a grammar to express “primary process,” induces individual and cultural schizophrenia. We need a system of notation, set operations for accessing/translating/transforming what James called “the deeper, darker strata of character where we find real fact in the making,” the habitation of mystical beasts and holy fools.

Session 5 (H) Woodruff B
Animal Transgenics: Of Mice and Meat
Susan McHugh
Even narratives critical of the new economic structures of globalization focus on the rare genetically modified (GM) animal in a world in which markets for GM plants and plant products grow astronomically every year, a trend that suggests a broader representational problem: why do animals loom so large in the transgenic imaginary? Bringing together sociological, literary, and film studies perspectives in animal studies, this panel begins to address this question through explorations of issues specific to GM organisms in animal form, more precisely, the laboratory mice, livestock animals, and zoonotic viruses taking shape in fact and fiction as animal transgenics. Together, the panel frames specific questions for animal studies research: what are the special obligations of scientists, ethicists, and consumers regarding such creatures, and how do animals become conduits for voicing—and silencing—these concerns? What else is consumed and rendered (in Nicole Shukin’s broadly biopolitical sense) along with animal transgenics? How do laboratory science and fiction model transgenic agency across species forms? And with what implications for human subjectivity?

Sherryl Vint.
Disembodied Cuisine: Eating Well in The Mad and Oryx and Crake
In “‘Eating Well,’ or the Calculation of the Subject” (1991), his interview with Jean-Luc Nancy on the question of ‘who comes after the subject,’ Jacques Derrida argues that the Western metaphysics of subjectivity relies on an exclusion of the animal, on a distinction between a human and a non-human relation to self. Derrida calls for a radical rethinking of the discourse of the subject, trying to find a way through such distinctions and beyond them to some category not caught up in pre-existing discourses of law, morality and politics as a necessary task in the deconstructive project of radical responsibility. So long as the indeterminate ‘who’ following the subject retains this boundary between humans and the rest of life, Derrida argues, “we will reconstitute under the name of the subject, indeed under the name of Dasein, an illegitimately delimited identity, illegitimately, but often precisely under the authority of rights! – in the name of a particular kind of rights” and thus retain a residual humanism in our posthuman thought. This schema of the subject Derrida names “carno-phallogocentrism” and argues that it dominates a subject who “does not want just to master and possess nature actively … [but also] accepts sacrifice and eats flesh.” The ethics of who eats whom is central to the human/animal boundary and its ideological work. Key to this discourse is not only the understanding that humans can prey on animals but that animals are somehow ‘unnatural’ if they reverse this relationship. Further, a prohibition against cannibalism is part of the formation of human identity in most cultures. Attempts to find alternatives to slaughter without giving up a meat-based diet are complex and at times contradictory.
For example, animal tissue may be grown in a lab and harvested from a cell culture, without requiring an entire sentient animal in order to produce meat; yet the bio-art collective Tissue Culture and Art reveals that it is not so simple to avoid animal exploitation. There is currently no such thing as victimless meat because ‘current methods of tissue culture require the use of animal-derived products as a substantial part of the nutrients provided to the cells, as well as an essential part of various tissue culture procedures’ (Catts and Zurr 132). Beginning from this philosophical and material context, my paper will explore two recent Canadian sf texts – John Kalangis’ film The Mad (2007) and Margaret Atwood’s novel Oryx and Crake (2003). Each dramatises our anxiety about the exploitative relationships that structure our consumption of meat in the era of factory farming and genetically engineered animals. The former tells of the outbreak of ‘mad cow people’ who, like zombies, consume the flesh of other humans once they have been infected by eating cattle that have been dosed with massive amounts of ‘Genetic Remedy: Antibiotic Growth Hormone.’ The latter projects a future in which almost all human life has been wiped out by a virus, tracing the origin of this catastrophe to a culture which normalises the genetic manipulation of species. Raised in a context in which it is perfectly acceptable to produce pigoons (pigs with human DNA, engineered to provide replacement organs) and Chickie-Nobs (engineered chickens that are all meat, no nervous system), Crake has no moral qualms about engineering what he believes to be a superior Homo species, the Crakers, and wiping out the violent and territorial Homo sapiens. These texts’ interrogation of the human/animal boundary through the ethics of killing and consuming others suggests that they understand a metaphysics of subjectivity based on exclusion of the animal to be part of the problem of contemporary society.
They deconstruct to varying degrees the human/animal boundary that Derrida argues is the ‘conceptual machinery’ (‘Eating Well’ 109) for the human subject.

Susan McHugh.
Toward a Literary History of GM Animal Agency
Global consumption of meat is projected to double by 2050, provoking a crisis in what many already see as an unsustainable industry. Facing enormous pressures to meet rising consumer demands, producers who already are hampered by environmental and public health problems pin their hopes on technologically reconfiguring meat itself, producing “real artificial meat” in vitro rather than in whole-animal form. And they find unlikely allies in animal rights philosophers, policymakers, and activists, who claim that they are morally required to support what they see simply as a tech fix to the suffering of livestock, even before such products come to market. Thus would-be animal slaughterers and saviors alike share an assumption that tissue-cultured meat is produced (as the research group New Harvest’s website phrases it) “in a cell culture, rather than from an animal.” Such developments reveal a broad and deepening confusion about meat’s intercorporeal intimacies with animal agents and human subjects. Working against this trend, fictions of real artificial meat foreground shared human-animal lives in these conditions as points of access into public life, anticipating how the future of such creatures in agri-food flows hinges on perceptions as much as transfigurations of biopolitical life. Novels from Frederick Pohl and C. M. Kornbluth’s The Space Merchants (1954) through Margaret Atwood’s Oryx and Crake (2003) show how transgenic farm animals and other technological interventions into the global economies of meat do more than simply render the animal a deconstructive element within the sacrificial structures of human subjectivity. In these narratives, chicken- and pig-shaped transgenics in particular gain peculiar cultural resonances as what Philip Armstrong terms “feral agents,” more specifically, as animal forms of menace embedded in the tissues of biotech communities.
More surprisingly, these literary figures draw out collaboratively creative potentials, akin to biologist Lynn Margulis’s “symbiogenesis” and philosopher Gilles Deleuze’s “heterogenesis,” in stories and histories of real artificial meat, even as they body forth the greatest threats to the future of species life.

Tora Holmberg.
Transgenic Noise and Mouse Silences – Engineered Animals in an Age of Biological Control
There is, on the one hand, what can be called a constant noise when it comes to transgenic and GM organisms: in research and in public discourse, transgenic animals are portrayed as sources of future salvation from human illnesses. On the other hand, however, there are also striking silences when it comes to ethical and welfare concerns. In this paper, I will address how these silences are produced and how a reinstatement of the trans-concept may counteract such silences. Transgenic animals become dilemmatic at a cultural level, since they can be understood as boundary walkers – or crawlers – constantly balancing on the fine line between nature and culture, organism and innovation, reality and model, science and technology. They constitute forms of techno-scientific hybrids, and, as such, simultaneously challenge and confirm cultural categories and dichotomies. But at the same time, they are quite ordinary laboratory mice within the lab-ecology. They are, in short, ordinary treasures. The paper builds on field notes and interviews with laboratory workers and members of animal ethics committees in Sweden, and engages with feminist science studies, science and technology studies, and human-animal studies.

Reception / Business Meeting - 6pm - 7pm
GT Global Learning Center, 5th St.

Plenary Session - Fri 7pm - 8:30pm
Theater, GT Global Learning Center

Alien Phenomenology: A Pragmatic Speculative Realism
Ian Bogost

In recent years, a small cadre of rogue philosophers has assembled: thinkers who eschew both the analytical and continental traditions that drove most of twentieth-century thought. Loosely grouped under the name "speculative realism," after a symposium of that name held at Goldsmiths College in 2007, its key figures include Ray Brassier, Iain Hamilton Grant, Graham Harman, and Quentin Meillassoux. While neither these thinkers nor the others they have inspired adopt a unified approach, they do share one key principle: a critique of "correlationism," Meillassoux's name for the philosophical tradition since Kant. In his words, correlationism holds that "we only ever have access to the correlation between thinking and being, and never to either term considered apart from the other." Harman's version of the critique of correlationism is more general: he understands it as a resistance to philosophies that privilege humans and human experience over other entities, an approach that he calls "object-oriented philosophy."

The speculative realists represent a type of neorealism, albeit a very strange sort. Object-oriented philosophy in particular owes more to A.N. Whitehead and Bruno Latour than it does to Plato or Carnap. Their perspective, particularly Harman's, issues a challenge and an invitation to science studies and cultural studies. The challenge comes from the critique of correlationism, particularly as it applies to the anthropocentrism of social studies of science (including those descended from Latour's actor-network theory, despite the fact that Latour articulates an object-oriented philosophy) and of cultural studies, since both fields focus almost exclusively on human experience. The invitation arises from speculative realism's young and therefore still narrow approach. Harman and his colleagues are first-principles philosophers; their work addresses fundamental propositions, particularly metaphysical ones about the nature of things and their existence. Harman, for example, extends the givenness of Dasein to every entity, not just humans. Yet although Harman's ontology allows many more sorts of objects into the party of being, it doesn't give us much help in approaching particular objects.

This talk offers such an extension of speculative realism in general and Harman's object-oriented ontology in particular, contrasting it with the ontologies of Whitehead and Latour, as well as my own ontology of "unit operations." I offer a kind of "applied" speculative realism, one that would offer approaches to the speculative study of objects of all kinds, human, mineral, animal, inert, conceptual, and imagined. Building on Husserl's epoché, Harman's theory of vicarious causation, and Whitehead's panexperientialism, I devise an approach that I call "alien phenomenology." To illustrate this method, I offer three modes of practice: ontography (the authorship of works that reveal the existence and perception of objects), metaphorism (the authorship of works that speculate about the unknowable inner lives of objects), and carpentry (the construction of artifacts that illustrate the perspectives of objects). Discussions of each of these modes are concrete and diverse, including examples from science studies, photography, literature, videogames, and computing.

Session 6 - Sat 8:45am - 10:15am

Session 6 (A) Bennett
Decodings in Postmodern Fiction
Michaela Giesenkirchen
Postmodern fiction is known for its tendency toward self-conscious play with the underlying structures of its own signification. This panel explores various instances in which the idea of textuality negotiated in postmodern novels includes digital or biological communicative systems constituting our physical worlds. How do Atwood, DeLillo, Calvino, Gibson, and Danielewski variously decode such systems? How does such decoding relate to aesthetics, selfhood, embodiment, organicism, and globalism?

Michaela Giesenkirchen.
"Code and Romantic Organism in Calvino’s Anatomy of Reading"
Italo Calvino’s if on a winter’s night a traveler, a distinguished example of the postmodern meta-novel, presents us with a brilliant anatomy of both novel writing and reading. It self-reflexively enacts an array of contemporary theoretical approaches to textuality (such as Feminism, Marxism, Structuralism, and Poststructuralism), but, reaching beyond the critical climate of its day, also re-explores the possibility of narrative’s organic unity, on which the idea of anatomy rests. Perhaps more in the Pragmatist than in the Deconstructionist spirit, Calvino’s novel romantically seeks, and achieves, organic coherence through reiterative incoherences and aesthetic pluralism. Central to this negotiation of organic coherence is the doubling of the “Other Reader,” the respective roles that the sisters Ludmilla and Lotaria play in representing diametrically opposed modes of reading. In Ludmilla, who believes that “authors produce novels like a pumpkin vine produces pumpkins,” Calvino celebrates the ever-elusive, entangled and fragmented, bio-aesthetic romance of modernist organicism. Lotaria, in turn, who charts the numerical distribution of words with computer programs to find every word charged with programmatic meaning, represents mechanistic postmodern preoccupations with the unraveling of (political) codes. How does code relate to organism, programmatic language to poetic pumpkins? Is the body just another uniform, as Lotaria believes, or is it still the ontological foundation of all readerly activity, as the Reader’s encounter with Ludmilla suggests?

Mary Newell.
"Decoding Identity in Decomposing Environments in Don DeLillo’s White Noise and Margaret Atwood’s Oryx and Crake"
Since our nervous systems complete their development through experience in environments, our neural codes for interpreting the environment and our own embodiment develop in tandem. Comparing Atwood's and DeLillo's novels, I will explore challenges to identity when environmental catastrophe threatens the characters’ capacity to decode cognitive signals. DeLillo’s and Atwood’s characters suffer damage in parallel with negative changes in their environments, thereby losing the presumably stable reference points that allowed them to decode reality. After his exposure to a toxic chemical release, Jack in White Noise discovers that medical scanning equipment can “read” his body better than he can, and also that environmental indicators are unreliable. Snowman in Oryx and Crake survives the intentional release of a deadly virus, but is concerned with the destabilization of his personal and species boundaries through biotechnology and microbial invasion. Further, Atwood parallels a loss of species with a loss of vocabulary, suggesting the role of human cognition and memory in constructing an environment that in turn influences it. These characters’ struggle for a sense of identity opens a discussion of how to envision a model of human identity that is convergent with postmodern views of multiplicity and contemporary views of dynamic ecologies. For example, Donna Haraway suggests replacing the “chilling fantasy” of a self fortified against possible microbial invasions with a “semi-permeable self able to engage with others (human and non-human, inner and outer), but always with finite consequences; of situated possibilities and impossibilities of individuation” (“Biopolitics” 224). I will discuss the contribution of these novels to exploring decodings that support the interrelatedness of humans and their dynamic ecological contexts.

Doug Davis.
"The Aesthetics of Globalization: Decoding Transnational Digital Art in William Gibson’s Pattern Recognition and Spook Country"
In this paper I argue that in his two latest novels, Pattern Recognition and Spook Country, William Gibson uses the thematics of decoding, and specifically the praxis of artistic interpretation, to craft an aesthetics of globalization. Globalization is treated synecdochically in these two novels, its form latent in singular and enigmatic works of postmodernist art, “the footage” and “locative art.” When these novels’ protagonists finally decode these mysterious works of art they also come to an understanding of the forces and history of globalization. In these two recent novels Gibson rewrites the first two novels of his Sprawl Trilogy, Neuromancer and Count Zero, to provide literary expressions of transnational space instead of cyberspace. “Everyone says I foresaw cyberspace,” Gibson remarks in a recent Reuters Life interview about Spook Country, “but I did foresee the world of globalization.” While most critics focused on Gibson’s vision of the virtual world in his cyberpunk books, his science fiction also represented the material world of globalization. Fredric Jameson early on identified Gibson’s brand of cyberpunk as “the supreme literary expression if not of postmodernism, then of late capitalism itself.” Gibson no longer writes science fiction because, as he points out, contemporary reality has caught up with his globalized and computerized vision. Gibson accordingly turns his virtualized future into the commodified present. In place of hackers Gibson now writes about aesthetic detectives who, seeking the origins and meanings of works of digital art, enter vast global networks of cultural production in a post-9/11 world. These art historical research projects figure as a way for Gibson to retell the history and reveal the scope of globalization, both of which are always erased by commodities.
In Pattern Recognition, Cayce Pollard’s search for the source of the mysterious, globally popular internet footage takes her deep into the new economy of post-Soviet Russia. Likewise, in Spook Country, Hollis Henry’s investigation of the new media of locative art brings her into contact with the cold war’s third world diasporic peoples. Both novels ultimately decode their mysterious postmodernist artworks by demystifying the globalized world.

Session 6 (B) Crescent
Visual Decodings I

Drew Ayers.
Humans Without Bodies: DNA Portraiture and Biocybernetic Reproduction
With the development of 21st century genetic technologies, humanity can be represented, reproduced, and decoded in entirely new ways, and this paper uses an examination of “DNA Portraits” as a starting point for an analysis of contemporary conceptualizations, representations, and decodings of “humanness.”  Placing DNA portraits into a dialogue with traditional portraiture, this paper examines the importance of embodiment in cultural understandings of the “essence” of the human.  Using Bruno Latour’s idea of the hybrid, this paper analyzes the ways in which DNA portraiture serves as an example of the weakening binary between nature and culture in contemporary society.  Bridging the gap between aesthetics and science, DNA portraiture straddles the division between humanistic and scientific representations and decodings of humanness.

Maria Aline Ferreira.
“Decoding Genetic Portraits”
With advances in biotechnology artistic practice has also been undergoing a comparable paradigm shift, as artists engage with the sciences, collaborating with scientists, turning laboratories into their studios and using organic material. In addition, as a result of biomedical developments the notion of personhood, of the “I”, is inevitably undergoing conceptual changes, which derive from the perception that what makes us what and who we are is firmly grounded, to a substantial extent, in our genes, in our embodied consciousness, a belief inflected by the profusion of new diagnostic tools such as TMS, PET scans, EEG and fMRI which provide alternative views of the functioning brain. Using this framework, then, I wish to analyse artistic examples of alternative self-portraits, which as a rule take bodily materials as their artistic medium, and address this confluence of novel interpretations of the self, ranging from the neurosciences to cognitive sciences and neuropsychology. As representative instances of this inward trend in self-portraiture I will be looking at Helen Chadwick’s Self Portrait (1991), Steve Miller’s Genetic Portrait of Isabel Goldsmith (1993), Kevin Clarke’s Portrait of James D. Watson (1998-1999), Marc Quinn’s Sir John Sulston: A Genomic Portrait (2001) and Gary Schneider’s Genetic Self-Portrait: Irises (2002), amongst others. These artists engage with the question of genetic determinism and the influence of visual depictions of the inside of the body on the popular imagination, in addition to being concerned with the anxieties and ethical quandaries raised by biotechnologies and new diagnostic resources. I will draw on recent work on genetic self-portraiture as well as on genetics and biopolitics, areas that fertilize each other and interrogate the locus of self-identity from different but complementary perspectives.

Isabel Wunsche.
Exploring the Properties of Crystals: Otto Lehmann and Mikhail Matiushin
The complex structures and natural properties of crystals have fascinated physicists and artists alike. Beginning in the 1870s, the German physicist and father of liquid crystal technology Otto Lehmann (1855-1922) systematically investigated phenomena such as the “growth” of crystals and the changes in crystalline substances. Experimenting with substances such as cholesteryl benzoate and silver iodide, he discovered “flowing crystals”—a group of substances that display birefringence when viewed under polarized light. Based on Lehmann’s scientific studies, the Russian avant-garde artist Mikhail Matiushin (1861-1934) created a series of self-portraits based on the crystal form. By breaking up the face into parts in the same manner in which a crystal refracts white light into its different color components, he used the crystal form to simultaneously reflect multiple individual facets of the human personality. The exploration of the spatial complexity of the crystal’s atomic structure in his works represents an early stage in Matiushin’s spatial studies and coincides with his interest in Cubism. In this paper, I will explore aspects of crystallography that were of interest to the avant-garde’s artistic explorations into color and form in the early 20th century.

Christina Nguyen Hung.
Art In Vitro: microphotographic figures and landscapes
In my most recent work, hundreds of individual photographs are shot through a microscope and stitched together to represent the in-vitro environments that house live specimens. For the last year, I have been working with researchers in the Biophotonics Lab at Clemson University to create these images using neurons harvested from chick embryos. My use of the stitched panorama is unique in that it pictorializes fields of microscopic phenomena in a manner that goes far beyond common scientific visualizations and shifts into the realm of landscape. By rendering these artificial micro-environments visible, with all their “irrelevant” detail and the spatio-temporal incongruities which are the result of process, I am creating images that re-present elements of our world (neurons, soil samples, bacteria etc.) at a new, intermediate scale between the microscopic and the plainly visible.

Session 6 (C) Doggett
Decoding Detection

Melissa Littlefield.
Decoding the CSI: Effect: The Legacy of Scientific Detective Fiction 1909-1923
In legal scholarship and the popular media, the CSI: Effect has become a trope for the dangers of lay involvement with science. Lawyers and pundits alike have claimed that shows, including CBS’ Crime Scene Investigation (CSI:) franchise and FOX’s Bones, have affected juror expectations about evidence and the authority of science. In this paper, I argue that what we now call the CSI: Effect has a long legacy in the forensic sciences, scientific detective fiction, and the law. Decoding the CSI: Effect means recognizing the ways that scientific authority has been discursively coded at various cultural moments: in the early 20th century the Scientific Detective Fiction Effect kept scientists out of the courtroom; in the early 21st century, the CSI: Effect is helping to safeguard the authority of certain kinds of science in the courtroom.

» decoding detectives powerpoint

Kate Roach.
Decoding the Detectives
Links between developing scientific disciplines and the emerging genre of detective fiction have been well described to date. Foucauldian theory has been widely and profitably applied to explore how scientific world-views intersected with the shift to disciplinary power in the 18th and 19th centuries, in part creating a fictional genre which emphasizes control of self and others through the synergy of science and detection. Such work tends to focus on plot and form, but few studies have explored the representation of the detectives themselves and their relationship to fictional scientists. In this paper I want to propose an alternative method, which substantiates links between detection and science, but shifts focus onto these representations. I will suggest that a phenomenological approach, of the kind developed by Alfred Schutz to probe everyday reality, is helpful in understanding how such figures come into meaningful engagement with the social world of the reader. In Schutz’s terms a ‘Holmes-styled’ detective, or a ‘mad scientist’, constitutes a typification – a socially endorsed schema through which untried experience is measured, such that new objects or events can be understood and communicated, without prior knowledge. From its inception in the 19th century, Scotland Yard was persistently worried by its failure to attract intelligent and educated staff, yet novelists and journalists variously constructed detectives as knowing empiricists, meticulous observers, or razor sharp logicians. These fictions, I will argue, are typified structures concocted from elements of the gothic scientist, the magician and the picaresque hero. Decoding the detectives in this way shows how they drift into cultural consciousness as a ‘chimeric’ typification cast from the parts of others. Detective and scientific facets are aptly characterised as disciplinary, but picaresque and magic veer toward deviance, attractively so, in fact. 
Is this what makes the disciplinary detective so irresistible?

Kerstin Bergman.
Girls Just Wanna be Smart? The Function of Female Scientists in Contemporary Crime Fiction
In recent years the scientific presence in crime fiction has grown rapidly and gained increasing popularity. One important part of this presence is the scientist, who is often assisting the main detective, but sometimes also, as in the case of forensic crime fiction, appears as the primary investigator. In this paper, I will examine the role and function of the female scientist in particular in contemporary crime fiction. Her characteristics and function will be outlined and discussed, using representative (primarily Anglo-American) examples from novels, television, and film from the early 21st century. My thesis is that while they are an interesting and complex scientific addition to crime fiction, female scientists often simultaneously represent a feminist backlash.

Session 6 (D) Mitchell
Decoding Animism
Chair: Helena Feder

Elizabeth Swanstrom.
Animal, Vegetable, Digital: Decoding the Human in Digital Art
I consider digital works that operate at the intersection of animal, human, environmental, and technological terrains and suggest that such works mark an important shift in our conception of human nature. Throughout cultural history human identity has been likened to non-human entities. In the Phaedrus, for example, Plato uses the metaphor of the charioteer to describe the conflicting features of the human soul, which he describes simultaneously as human (the driver), animal (the horses), and technological (the chariot). Yet this example, as well as many others, remains firmly metaphorical. I refer to digital art works such as Steve Potter’s “Living Rat Brain,” Douglas Easterly and Matt Kenyon’s "Spore 1.1," and Lisa Jevbratt’s web crawlers to show the way that such metaphors yield to ontology in contemporary art, and to argue that such works perform an emerging type of subjectivity that is distributed across human and non-human agents.

Karalyn Kendall-Morwick.
“Thoroughbred in Body and Soul”: Albert Payson Terhune’s “Lad” Stories and Modern Dog Fancy
The early twentieth century saw the rise of modern dog fancy, an unprecedented movement to codify the physical conformation of canine varieties through the establishment of breed standards. Although humans have been engaged in artificial selection and modification of breeds for centuries, dog biologists Raymond and Lorna Coppinger argue that modern dog fancy is marked by a problematic shift from selecting for behavior to selecting for physical appearance in the working breeds. In fact, people who hunt, herd, and work with dogs have tended historically to oppose fanciers’ efforts to secure kennel club recognition for their chosen breed. The selection for physical conformation encouraged by most breed standards, coupled with the pet trade’s demand for dogs better suited to domestic life than your average working sled dog or livestock guardian, tends to dilute or distort the behavioral conformation which predisposes each breed to succeed at a given task. American writer and rough-collie-breeder Albert Payson Terhune, most famous for his collection of “Lad” stories published in 1919, displays a complex attitude toward breed fancy. On the one hand, Terhune’s prose oozes with the fanciers’ Lamarckian rhetoric of function following form. The AKC-registered Lad’s heroic acts and near-supernatural abilities are routinely linked to his physical beauty and thoroughbred nature. Despite never having seen a sheep, for example, Lad heeds his collie instinct and expertly maneuvers a scattered herd into ring formation. On the other hand, Terhune’s most scathing criticisms are leveled at overbred dogs and the owners and breeders who fancy them. This paper explores the tensions between Terhune’s nostalgic love of the rough collie and his desire to justify that breed’s continuation in an increasingly urbanized, industrialized context in which it finds itself quite literally out of a job. 
I argue that Terhune’s stories reflect a widespread uncertainty about the place of working dogs in modern American culture.

Robyn Braun.
WE3: Decoding State Power
WE3, a graphic novel by writer Grant Morrison and artist Frank Quitely, tells the story of three pets – a dog, a cat and a rabbit – nabbed by the US military and forced to serve as cyborg weapons in the top-secret “Weapons Experiment 3.” When “Weapons Experiment 3” is decommissioned, the animals are slated to die. Upon receiving this news, the animals’ keeper leaves their door unlocked and the animals make their escape when they can. The story that unfolds is of the tenacious power of decoding and undermining of state forces by the very underside of those same powers. In their quest for survival in the world, without their keeper, the animal cyborgs are forced to defend themselves against the state’s attempts to capture them, and in so doing the animals destroy every technique of capture deployed by the state, including the public’s perception of the state. On their voyage, a homeless man ultimately rescues and befriends the animals when, having shed their cyborg components, they are cold, hungry and lonely: bare, used-up, left-behind state waste. I will show how WE3 examines and represents the tenacity of the decoding power of those who must be left behind for capitalism to function, and those whose existence must be walled off for our ‘humanity’ to flourish.

Session 6 (E) Whitman A
Decoding the Noosphere: Experiments in Collective Intelligence
Richard Doyle
This panel will experiment on and with notions of collective intelligence.

Frederick Dolan.
Emergent Pan Psychism: The Really Hard Problem
Recent decades have brought a renewed willingness to take seriously the phenomena we stab at with such terms and phrases as “subjectivity,” “consciousness,” and “the character of human existence.” The tone of today’s discussion, however, is very different from that which surrounded these words a half-century ago. Then, “existentialism” referred to issues of individual authenticity and commitment, “consciousness” was closely linked with concerns about action and with notions of social and historical determination, and Jean-Paul Sartre was the master thinker. The context in which existence, subjectivity, and consciousness were discussed was predominantly moral and political. In an important sense, as today’s talk about “subject position” and “the performance of identity” suggests, that discussion is still very much alive, though the terms are different, the context has suitably changed from History-with-a-capital-“H” to matters of justice and liberty here and now, and Michel Foucault has displaced Sartre. At the same time, though, a different kind of inquiry into existence, subjectivity, and consciousness has taken shape, focused more squarely on the nature of these phenomena: on what sort of “thing” consciousness is, for example, on whether it can be explained,  and, if so, what sort of explanation is called for, and on why and how it matters.  My talk will focus on how the very idea of an "explanation" of consciousness is misguided and will point to more fruitful approaches to the meaning of the phenomenon.

Philip Kuberski.
Bateson on Distributed Mind
Gregory Bateson, in his Mind and Nature (1979), argues that a "mind" is neither necessarily interior nor non-material, but exists as and within an ecology. "A mind is an aggregate of Interacting Parts or Components." This can refer, then, not only to the "mind" of Wordsworth healing itself by a return to the Lake District; it could also include, as Bateson suggests, ecosystems such as a seashore or a redwood forest since they are both "self-corrective" (119). In other words, for Bateson, mind or intelligence is not only an aspect or feature of brains but of the wider system that evolves brains: we can scale these as "natural selection," "epigenesis," and "nature" itself. This "mind" is an association of parts, and "the interaction between parts of mind is triggered by difference" (100). Thus the mind is an aggregate of differences and is driven by difference: "...perception operates only upon difference. All receipt of information is necessarily the receipt of news of difference" (29). Batesonian difference seems to reveal a spirit-function in nature and mind. Like spirit, difference is unreified, atopical, eternal, immaterial and yet somehow central or essential. When Emerson tells us in Nature (1836) that "Spirit primarily means wind," he is reminding us that the basis of "metaphysical" terminology is metaphor--and metaphor is at base a rough draft of ecology, a tactical assertion that relationship in its profounder sense has not to do with appearance, but function. The metaphor of self, in other words, is irreducibly double, different in essence--a relationship: it represents the self as if and as is. The hard and durable idea of the self is a literal reading of the metaphor; the fluid and transient view is a metaphorical or equivocal reading--"nothing" but breath or wind.

Richard Doyle.
Declarations of the Noosphere: Towards an Involutionary Speech Act
This talk will treat the noosphere - an emergent concept of Vernadsky, Le Roy, Teilhard and Aurobindo - as a form of address suited to mapping the informational attentional layer of the biosphere towards a thermodynamic practice of sustainability. If the noosphere is a "form of address", a word addressing an ecological ensemble of attention gathering, exchange, recording and recombination (transduction), who or what is being addressed? Diverse traditions and studies of Non Ordinary Consciousness (Harner, Fischer, Strassman) suggest that rhetorical practices of emptiness are both necessary and sufficient to these "involutionary speech acts" (after Austin, 1954: "illocutionary", "perlocutionary"). These practices of the self on the self, in which a radical sense of interconnection with larger scale phenomena such as "noosphere" often occurs, are likely "irreducible" (Wolfram) yet well mapped by simple iterative rules that produce effects of "infinity" (cf. recursion in Lisp, The Little LISPer). "Involution" (Aurobindo) often involves algorithms such as mantra or the simple repetition of breath, in conditions not entirely of the self's choosing. Neither "self reference" nor "reflection" nor a "strange loop", "involution" labels an operation of folding, an always ongoing discovery of the very topology of what has been dubbed "interiority". Such an exploration of the "morphogenomic space" (Lalvani) of subjectivity confronts the involute with evidence that the "inside" world of introceptive (Wolff) space is at least as complex, differentiated and evolutionary as the "outside." As a form of address, I offer the "noosphere" as an invitation to an experiment in collective intelligence through parallel involution. The Noosphere, a concept probably first coined by Vernadsky, names the feedback effect of awareness on living systems.
Vernadsky's context for the definition was essentially a functional one - the noosphere is the collective effect of human attention on the Biosphere, which in turn transformed the lithosphere through the collective processes of living systems. Given what we now know about plant, fungal and bacterial intelligence, it now seems necessary to expand Vernadsky's definition to include all organisms and groups of organisms capable of actively increasing their dissipation of exergy (available energy) through the transfer, expression and alteration of information. Every layer of this system appears to be thermodynamically informed, so concepts from thermodynamics can usefully orient our collective transhuman investigation of this emergent attribute of our planet. In closing, we will consider the sentence "Perhaps the dynamics of human subjectivity have more to learn from protein folding than has been previously considered." Just Say Yes to the Noosphere!

» Online materials available

Andrew Pilsch.
He Called It “Utopia”: Jameson’s Social and Vedic Transhumanism?
Now a widely transmitted meme with diverse meanings and practices, transhumanism often involves the evolution of a collective intelligence. The temporal location of this collective differs from account to account. In some cases, this transhuman collective will manifest the end point of a certain evolutionary trajectory, while in other accounts, the collectivizing forces that humanity begins to capture in their creation of the transhuman are immanent to an already existing system. This concept of an immanent transpersonal force is often linked to the transmission of Vedic ideas (Cf. Goswami) and practices of meditation as originally espoused in the Upanishads. From this perspective of a field of immanent and certainly collective intelligence, I explore the work of Marxist critic Fredric Jameson. Specifically, I am interested in exploring the transhuman valences of three terms in his conceptual vocabulary: "social," "political", and "Utopia". While an entire book could probably be written on Jameson’s concept of Utopia alone, I argue for a specific interpretation of the concept as a form of collective existence within the framework of the social. Sri Aurobindo's conversion from poet and radical to spiritual, transhuman guru during the early part of the 20th century makes him an exceptionally interesting figure for building a bridge between Jameson’s post-Marxist analysis and the main body of transhuman thought. In exploring various examples of Vedic transhumanism, I wish to highlight the importance of the figure of individual, spiritual work of participants in this model of transhuman overcoming. The commitment to revolutionary praxis and social change within Jameson and Aurobindo's theoretical systems extends throughout the other thinkers within this model of transhumanism and offers us a necessary and important context for understanding the personal and collective work of bringing about a transhuman future.

Session 6 (F) Whitman B
Jean Crotti's Portrait sur mesure de Marcel Duchamp:

History, Originality, and the Use of Technologies of Reproduction for the Creation
James McManus; Anne Collins Goodyear
This panel examines the case for redating two drawings by Jean Crotti, created after his Portrait sur mesure de Marcel Duchamp. Although the drawings are inscribed 1915, a recent combination of evidence and technical examination of the works, carried out in conjunction with the exhibition "Inventing Marcel Duchamp: The Dynamics of Portraiture," raises questions about whether those dates can in fact be considered documentation of the works' creation. The problem of the timing of the works' manufacture, the question of the artist's motivations in putting this work into the public eye, and the translation of a work which was originally sculptural into the new media of photography, then graphite and charcoal, also raise fascinating questions about how art has been created and interpreted with respect to particular historical categories in the recent past. The discussion and study of these issues has been facilitated by close collaboration by a team of curators and conservators from the Museum of Modern Art, the Philadelphia Museum of Art, the National Gallery of Art and the National Portrait Gallery, and has incorporated careful technical examination of the work conducted by this group in the Lunder Conservation Center, Smithsonian American Art Museum and the National Portrait Gallery. This is a project in which historical and scientific study of the objects at hand has been critical. Our panel examines the role of emerging technologies in the creation and study of works of art in the recent past, extending back to look at Crotti's construction and dissemination of images of his sculptural Portrait sur mesure de Marcel Duchamp and coming up to the present in our reliance on new technical tools for the physical examination of these works. Our proposed discussion will demonstrate the historical value of a multi-disciplinary approach to the reexamination and reconsideration of works of art. This panel will consist of joint presentations by James W.
McManus and Anne Collins Goodyear, building on a team-based study of the works at hand, and in particular on the physical study of the works by conservators Scott Gerson of the Museum of Modern Art in New York and Scott Homolka of the Philadelphia Museum of Art.

Session 6 (G) Woodruff A
Whitehead+Cosmopolitics III: novel ecologies of practice i
Steven Meyer
This is the third of four panels for the stream "Whitehead+Cosmopolitics" proposed by Steven Meyer and Sha Xin Wei. The abstract is identical for each panel, although the presentations differ. Most generally, this four-panel stream poses the question of the relation between the cosmopolitics envisioned by Isabelle Stengers in her “Cosmopolitical Proposal” (2005)—and back of that, the two-volume Cosmopolitiques (1997, 2003)—and the philosophy of social organism of Alfred North Whitehead. Both Stengers and Whitehead insist that any cosmological inquiry worthy of the name must take science (including technoscience) seriously, and that this means more than merely subjecting it to traditionally social and political as well as philosophical critique. For starters, it means cosmopolitics and it means societies of actual occasions. How to undertake the relevant cosmopolitical and Whiteheadian analysis is, to be sure, a complicated question; and the ten panelists approach it variously, in some instances directly, in others indirectly. The stream is divided between theoretical and practical approaches, although the frame is deliberately porous. As Andy Goffey remarks in an observation equal parts Whiteheadian and Stengersian, the idea is to develop more specific accounts of “the cultivation of thinking practices that evade the easy academic distinction between theory and practice."

Lina Dib.
Cosmopolitics and the Recoding of Memory Machines
As architects of memory, computer scientists paired with neuropsychologists and engineers create wearable sensor-based cameras and furry robot companions. Based on the notion that memory is something we keep in our heads and retrieve at a later date, they design controlled experiments to test for the performance of what they envision as prototypical memory aids. Stengers' “Cosmopolitical Proposal” argues that in order to address the contemporary world, the question is not about sorting things based on existing concepts and theories, but rather about letting things provoke thoughts and contingent formulations of them. How might the production of memory tools operate "in a mode that gives the issue around which they are gathered the power to activate thinking, a thinking that belongs to no one, in which no one is right" (CP 1001)? In this paper, I consider how these prototypical recording machines bring with them the obligation to rethink the gap on which their makers draw between the fantasy of total recall and the fear of complete amnesia. As sites of contention and convergence, how might the design of new recording technologies offer possibilities for recoding experiences of remembering that are not "besieged by dramatic either/or alternatives that slice up our imaginations" (CP 1002)? Rather than fitting wearable cameras and robot companions with well-established neurological or mechanical notions of memory, were we to think with Stengers and follow a cosmopolitical approach, the stage would be set for a productive amnesia or suspension of memory. That is to say that in their encounter with the world, these unfinished technological objects might raise new polyphonic formulations of the very concept of memory, allowing it to burst beyond the private and corporeal bounds we usually attribute to it.

Andrew Goffey.
Existential Catalysers: Cosmopolitics and Axiological Creationism
In their essay Capitalist Sorcery, Philippe Pignarre and Isabelle Stengers borrow Felix Guattari's notion of existential catalysis to characterise the processes of 'autonomous creation' that the cosmopolitical approach to our 'temps des catastrophes' calls for. Inseparably speculative and pragmatic, the notion of existential catalysis provides an excellent example of the kind of conceptual invention that is needed to resist those forms of power that otherwise succeed in inhibiting thinking and benumbing experience. My paper explores and experiments with the notion of existential catalysis as a conceptual creation that draws on and complicates contemporary ecologies of knowledge practice. Returning to discussions at SLSA 2008 panels on Whitehead and Deleuze, the paper examines: the links between the 'universes of alterity' that existential catalysts open up and Whiteheadian value; the theory and practical exploration of auto-catalytic sets in complexity theory; and the autonomous—if not autonomist—creation of practices of resistance. Set against these largely expository themes, the aim of the paper is to develop a more specific account of what, with Stengers, we need to learn to think of as the cultivation of resistance, the cultivation of thinking practices that evade the easy academic distinction between theory and practice. Whilst sceptical of simplistic appropriations of contemporary science within the humanities, the paper argues that the cosmopolitical reading of the ecology of practices, and the axiological creationist's insistence on transversality, can militate against the unproductive, limited and limiting metaphorical transfer of terminology from one domain to the next. In the artificial chemistry of new forms of life, we find an experimental creation of imaginative, covalent bonds between Whiteheadian metaphysics, contemporary technoscience and social practice.

James J. Bono.
Multiplicity, Atomicity, Experience: Whitehead's Buzzing World, Stengers's Space of Hesitation, and the Universalizing Word's Seductive Theology
Lurking in the shadow of the Deleuzean “idiot” populating Stengers’s “space of hesitation”—a space where slowing down is a tactic for resisting consensual constructions of common, universalizing worlds—is the haunting figure of Francis Bacon. Bacon, after all, pithily resisted the benumbing “Idols” that lulled fascinated Aristotelians into leaps of hasty imagination, and thus exclamations of authoritative knowledge, with one witheringly deflating quip: “Truth is the daughter of time, not wit.” Slow down, avant la lettre. Yet, despite Bacon’s seeming resistance to authority and universalizing claims, his own critique conjured up a lost Adamic language—a lost universal language—that he linked inexorably to Adam’s lost divine power and birthright: Adamic dominion over all creatures. In seeking a “Great Instauration” to recover this lost birthright—knowledge as power, knowledge as dominion, the human knower as new Adam and Lord of all animate and inanimate things—the Baconian turn toward experience and empiricism with its embrace of multiplicity proved abortive. Building upon my own work on the emergence of the language of modern science out of a mythos of Divine Word and Adamic dominion, and a recent paper by Bill Egginton, I shall argue that Stengers’s space of hesitation, her salutary praise of slowing down, represents a telling response to the continued seductions of a now “secularized” version—better yet, repetition—of the Judaeo-Christian theology of the Word in which nature is presumed to be coherent, subject to human practices that can in fact produce a “universal neutral key,” and whose rightful outcome is dominion by humans. 
Here I shall link this argument to my analysis at last year’s SLSA conference of why Whitehead’s fundamental acceptance of multiplicity as an irreducible feature of this “buzzing world” leads him to insist upon the atomicity of what he calls “actual occasions.” For it is precisely the temptation to reduce this “buzzing world” of things as events and processes to a “programmed” and purposive unfolding of potentialities already contained in a “universe” that “is shivered into a multitude of disconnected things”—Aristotelian substances—that Whitehead’s atomistic actual occasions and societies resist and refuse. To the seductive Western theology of “World” as product of the “Word”—as a Divine plan or “program” that the will-to-truth desires to possess and control by reconstituting the Adamic language in the abecedarium, for example, of Bacon’s natural and experimental histories or of the genetic code as the Divine Book of Life—Whitehead opposes the concreteness of the multiplicity of things calling unto one another and creatively constituting the new. With Whitehead’s process ontology and analysis of experience, Stengers’s call to “slow down,” to create a “space of hesitation,” and to think a cosmopolitics without the domination implicit in a universal language of science accessible only to humans finds an ally and a home.

Session 6 (H) Woodruff B
TRANimalS: Understanding the Trans- in Zoontology
Lindsay Kelley; Eva Hayward
Animal Studies often encodes and decodes the prefix trans-: trans-species, trans-genics, trans-biology, trans-humanism, trans-marine, transplantations. As with parasitism, commensalism, and mutualism, trans- implies movement across, through, over, to or on the other side of, beyond, outside of, from one place, person, thing, or state to another. Trans- disturbs purification practices; psychical and corporeal experiences and events are blended at multiple material and semiotic levels. Species exist in taxonomic differences (Homo sapiens sapiens are not the same as Octopus vulgaris), but species are also always already constitutive of each other through the spaces and places we cohabit – this of course includes language and other semiotic registers as well as other more familiar kinds of zones of contact. Indeed, species are relationships between species; relationality is worldliness. Trans-codings take place at multiple levels, binding humans to their non-human companions, jellyfish to rabbits, and food to flesh. Central to trans-coding is the very real presence (and conditions) of animals in culture, history, art. We invite papers that examine the rhetorical figuration of trans- in relation to human/animal relationships. We welcome topics related to bioart, transgenic practices, xenotransplantation, becoming-animal, animorphs/zoophilia, food cultures, anthropomorphism, speculative fiction, bestiality, and other kinds of trans-species corporealities.

» TRANimalS flyer!

Prema Prabhakar.
“Do Not Rest in Peace”: The Obsessional Mediumship of Diamanda Galas
In her book, The Shit of God, Diamanda Galas writes, “an actor may stimulate the desired emotion state through a skilled manipulation of external object materials, or he may use the raw materials of his own soul in a process which is the immediate, the direct emotion of the experience itself. The second concern is felt by performers who, not just professional, are Obsessional performers.” Clearly Galas’ performances, ignited by her unearthly voice, multiple microphones and elaborate staging, are meant to be obsessional, an experience that brings Galas’ audience into the feeling state of her voice; Galas’ performances are often so intense that audience members have reported experiencing acute panic attacks, nervous agitation and intense sickness—a reaction that Galas encourages and applauds.

I would argue that Galas’ performances are not only obsessional—working as opportunities for Galas to touch the audience with her soul’s “raw materials”—but that they work as acts of “meta-mediumship”; they navigate through and, eventually, embody sites of historical trauma. Like the traditional medium (definitions of what an act of traditional mediumship entails will be briefly outlined), whose voice and sudden, spastic physical movements indicate communication with and possession by the conjured ghost, Galas’ piercing, wailing, trembling, shrieking and animal-growling four-octave range speaks into and echoes the historical ghosts of AIDS, the Romanian genocide, mental illness and the crucifixion of Christ.

In “Plague Mass”, Galas, shirtless, bloody and wild-haired, uses the language of the Bible’s Leviticus, uttered in her own bass, dirge-like tones, to evoke the devastation of AIDS and the necessity of bearing witness to such devastation. Galas’ voice, and the biblical language her voice utters, not only calls upon, it calls within; it becomes the eye, the center of the Plague Mass. Her voice embodies the AIDS epidemic as historical, biblical, an apocalypse: a historical circumstance where suffering and the pain of the condemned have climaxed and must speak itself through her voice.

Galas’ voice again embodies the history of trauma in “The Sporting Life”. Unlike “The Plague Mass”, where Galas’ voice became the site of a reverberating apocalypse, here Galas is double-voiced: crow (a vengeful, cackling, grunting ‘becoming animal’) and the human companion to crow: the strident, unbending narrator. While Galas’ narrator voice tells the story of revenge (a man is sodomized in the back of the car of the woman he attempted to rape), crow titters, giggles and screeches. Galas’ double voice allows her to tell the story of a particular event and, by using the ambiguous human-animal, crow, to express the obsessional feeling-state of rape trauma; her voice, like the psyche of many (certainly not all) trauma victims, is rent apart.

Finally, in “Shrei X”, a performance meant to touch the experience of madness, confinement and torture, Galas loses much of her language and becomes pure voice, purely obsessional, purely a medium for the emotional state she is performing. She emotes a “trans-language”, where language has turned against her and she has only sound, only voice. She does not have the doubled voice of “The Sporting Life”; she has, effectively, become animal, become crow.

It is important to note that while each of Galas’ performances becomes the medium through which a historical trauma is experienced and evoked, Galas is no mere conduit, and she is not without intent or agenda. Often she is simultaneously able to become the voice of the traumatized and the voice of judgment; in her performances, the booming voice of judgment (Leviticus, the fates, crow) is the voice of the revenged victim.

In attempting to define the terms obsessional and meta-mediumship, this paper will ask such questions as: How is Galas able—if she is able—to wake the dead, these uneasy ghosts, through the vibrant, vibrating vehicle of her voice? How does Galas’ “becoming animal”, her crow-like cackling and whining, help her to locate and embody these historical sites of trauma? How is it possible to be a medium for both the wronged, the victims of history’s violence, and the voice of an unforgiving, damning judge? How, indeed, is it possible to be the voice for a historical experience and event, rather than simply the medium for an individual and their experience?

Katie King.
My Distributed Animality
The literal/metaphoricity or playful immersive double consciousness in the virtual world Second Life has reembodied my understandings of “distributed human being” (as described by Leigh Star 1995: 18-19) and “distributed cognition” (as synthesized by Kate Hayles 1999: 290-1) – two trans ontologies I find myself entangling in writing, teaching, methodological reflection and academic activism. My Second Life avatar, Katie Fenstalker, keeps uncovering, in dreams and other unconscious and proprioceptive knowledges, how much a First Life Katie King is just another RL (“real life”) avatar. For this panel I would like to explore distributed animality and cognitive sensation in Second Life, through two instances that I have recently turned to others in SL to animate, demonstrate, share and teach. First, the interactive reprogramming “training” of scripted dog-objects to interact with other scripted animal-objects, that is to say, training an Aussie to herd sheep in SL. Second, the playful double consciousness of cognitive sensation mobilized by disability activists in SL, who create events in which RL humans inhabit two kinds of SL avatars, one a “blind human” and another a “seeing eye dog,” in order to create new trans knowledges. I am also interested in connecting this project to thinking from my current book on Networked Reenactments, and so would like also to demonstrate why and how theories of play initiated by Gregory Bateson (1954-1984) are taken up in studies of immersion by game developers and theorists such as Katie Salen and Eric Zimmerman (2003), and then can be applied to distributed being experiences in Second Life. Such transdisciplinary immersive practices engage in repurposing cognitive sensation for multiple commercial, academic, and public media.

References:
Bateson, G. (1972 [1954]). “A Theory of Play and Fantasy.” In Steps to an ecology of mind (pp. 177-194). San Francisco: Chandler.
Hayles, N. K. (1999). How we became posthuman. Chicago.
King, K. (2006). “Cycling through Trans Knowledges in Queerish Landscapes.” Paper presented at the plenary panel on “Crossing (Queer) Disciplines,” Global Queeries: Sexualities, Globalities, Postcolonialities Conference, University of Western Ontario, Canada, 13 May.
King, K. (forthcoming). Networked Reenactments: flexible knowledge under globalization. Book manuscript in progress.
Latour, B. (1993 [1991]). We have never been modern (C. Porter, Trans.). Harvard.
Latour, B. (2004). “How to talk about the body? The normative dimension of science studies.” In Body and Society, special issue edited by Madeleine Akrich and Marc Berg, ‘Bodies on Trial.’ Online version from author’s website. Retrieved February 2009.
Salen, K., & Zimmerman, E. (2003). Rules of play: game design fundamentals. MIT.

» interactive web version of talk with pictures and video

Mel Chen.
The Animacy of Toxins
Essentially I am working on the animacy of toxins, with a particular eye toward their racialization. My latest piece in Discourse, as well as a future piece for GLQ on animacy and environmental illness, studies the connection between the 2007 lead panic around children's toys in the U.S. and the morphing human poster child of health/security/sovereignty panic, from a "black inner city child" to a "white middle class child" whose health threat seems to center around the (queer) licking of (colorful) Thomas trains (another trans, transitivity, appears here, with the trains having critically been assembled and painted in China, which becomes the sovereign threat materialized in the paint). At the same time, the lead itself coheres sociopolitical mythologies with biochemical ones, acquiring the racialized animacy of a contagious vector. As a toxin, lead has demonstrated neurological and other effects on bodies. Other toxins (for instance, mercury, which I also discuss in relation to altered sociality with inanimate objects) acquire similar, if shifting, racialized and corporealized profiles. I examine the ways in which certain toxicities come to appear "proper" to certain populations transnationally, despite not being proper to anyone, and transfigure just as they (literally) transform bodies. Considered this way, the notion of property (proper bodies, capital, toys, manufacture, nationhood, race) must give way, at least in part, to transitivity and porosity, and the received typologies upon which narratives of contagion, security, humanness/animality, and health rely transmute in spite of themselves.

Natalie Hansen.
Trans-Species Embodiment: Becoming Human with Horses
Human-horse co-domestications engage affective, social, and biological aspects of both human and equine experiences. Girl-horse love offers an alternative model for becoming-human that is mediated by the fantasy of trans-species embodiment. In this paper, I argue that girlhood enactments of cross-species embodiment constitute a form of identifying with bodily otherness apart from anthropocentric constraints, thus enacting “imaginary alignments” with corporeal others (Silverman 19). Stories that trace the fantasy of transposing girl body and horse body, of becoming a girl-centaur, half girl, half horse, and the empowerment that comes with this transformation, are, I argue, “body narratives,” narratives that “engage with the feelings of embodiment” (Prosser 12). Jay Prosser argues that body narratives are not purely representational, that they “not only represent but allow changes to somatic materiality” (16; my emphasis). I argue that girl-horse transformations, metamorphoses into the girl-centaur, are a way of making sense of the body, a way of bringing imagined and real bodies together. Girl-horse transformation engages girls in cross-species imagining, engaged relationality, social interaction, as a way to be a body, to self-define, to negotiate a livable body ego.

My literary analysis focuses on one of the most culturally familiar figures of girl-horse love, Velvet, the protagonist of Enid Bagnold’s National Velvet. This novel (and the movie starring the young Elizabeth Taylor and Mickey Rooney) narrates the love between a special girl, Velvet Brown, and a special horse, The Pie, both queer in their own ways. The novel begins with the fantasy of being a horse, of having the girl-body transformed into a horse body, into a girl centaur – half horse, half girl. The novel progresses through Velvet’s fantasies into the reality of her riding and competing her own horses, activities that culminate in her racing The Pie in the Grand National steeplechase.
To accomplish this dream, Velvet transforms her body into that of a “little man,” passing as a male jockey.

I argue that girl-horse love can offer an alternative model for the development of self-identity, autonomy, community, and (cross-species) love. Decoupling readings of horse-crazy love from normative models of subjective identity and desire broadens the possibilities of relationality and subjectivity for both girls and horses. The queer potential of girl-horse love also lies in the alternative trajectory it suggests for becoming-human. Normatively, “the becoming-human of the self occurs through a process of recognition which must necessarily abandon and repress desire’s more fluid potentialities” (Colebrook 21); resisting this, the cross-species love expressed by horse-crazy girls offers a trans-species model for becoming-human and for being non/human. Deborah Bright argues that far from being dismissed as immature passion, girl-horse love is powerful, a force that, in resisting being harnessed by conventional social pressures, has the potential to challenge anthropo- and heteronormative relations through the “subversive sisterhood” of horse-crazy girls and the “deviant passion” of cross-species partnerships (22).

Works Cited
Bright, Deborah. "Being and Riding." GLQ 6.3 (2000): GLQ Gallery. Print.
Colebrook, Claire. "How Queer Can You Go? Theory, Normality and Normativity." Queer Non/Human. Ed. Noreen Giffney and Myra Hird. Burlington, VT: Ashgate, 2008. 17-34. Print.
Prosser, Jay. Second Skins: The Body Narratives of Transsexuality. New York: Columbia UP, 1998. Print.
Silverman, Kaja. The Threshold of the Visible World. New York: Routledge, 1996. Print.

Session 7 - Sat 10:45am - 12:15pm

Session 7 (A) Bennett
Roundtable: Does Art Produce Knowledge? Can Science Produce Art?
Laura Otis
Although it is widely believed that people can learn from art, many of us would hesitate to say that art can produce knowledge. Art is often associated with emotion and intuitive realizations; knowledge, with carefully formulated ideas, substantive evidence, and rigorous testing. Nevertheless, if asked how they know what they know, few people would point to scholarly or scientific books. For many of us, literature, music, theater and painting do more than move us: they show us how the world works. Admittedly, scholars’ and artists’ goals are vastly different, but art and science both involve the representation of perceived patterns. Ultimately, human intelligence and creativity drive both fields, so that it is questionable whether knowledge can be linked to one alone. In this roundtable discussion among artists, a scientist, a literary scholar, and a philosopher, we will rethink and challenge traditional concepts of knowledge, working toward a new definition of knowing vitalized by art.

John Johnston; Robert N. McCauley; Steven J. Oscherwitz; Laura Otis; Sidney Perkowitz.

Session 7 (B) Crescent
Decoding Use: Broken and Failed, Localized and Expansive
Kimberly Lacey
This panel investigates adaptations of “decoding use” through gaming, sculpture, literature, and climate change. Our first two panelists discuss machines and how their human users adapt to failed technologies. Speaker #1 explores how Nintendo DS users have to understand their relation to the game differently if the touch-screen is broken. Speaker #2 interrogates the failure of humans and computers, working together and separately, to decode the Kryptos sculpture outside the CIA headquarters. Our next two panelists focus on user-centric failure. Speaker #3 analyzes the tendency in early American literature to use local narratives in order to construct broader, national ones. Speaker #4 reflects on the limitations and possibilities of modeling climate change and the effects of such generalizations and abstractions on the public. Through an investigation of decoding failures and challenges, this panel addresses multiple avenues of "decoding use" and their effects on codes in the public sphere.

Samuel Tobin.
Breaking the Code of a Broken Screen
This paper will examine a range of issues including materiality, evidence, interaction and play through the use, wear and tear of the Nintendo DS video game system. By focusing on marks and indications of use, specifically scratches, indentations, and loops and whorls on the touch-screen, we can begin to explore the aspects of our use of these (and related) devices that are not purely a matter of digital encoding or interaction. This approach also emphasizes the materiality not just of the thing, but also of the relation of the thing and its user. This paper is an attempt to suggest ways in which we might open up new avenues of research, exploration and criticism.

Kelsey Squire.
De/Coding Cultural Memory in Mark Twain and Willa Cather
Many non-native American writers find that tracing their Anglo-American genealogy inhibits the development of a uniquely "American" literature, history, and culture. In this paper I explore how Mark Twain and Willa Cather attempt to circumvent Anglo-genealogy through coding and decoding American cultural memory of ancient peoples and geography.  Through their interpretations of American history Twain and Cather simultaneously engage in acts of coding and decoding: as they decode the landscape they extract meaningful memory that goes beyond genealogy; however, they also code a particular (perhaps more Anglo than native) version of these cultural memories.  Ultimately, cultural memories in Twain and Cather's novels raise ethical questions concerning the construction of a unified, national narrative out of a collection of localized communities and competing narratives.

Kimberly Lacey.
Decoding Kryptos and the Failure of Human/Machine Interaction
Highly intricate algorithms, created to muffle sensitive national security information, can be quickly infiltrated with computer-aided reverse analysis. Much like biological processes such as memory, espionage is an increasingly joint production between computers and humans. However, outside the CIA headquarters in Langley, VA stands a sculpture whose message remains unintelligible to the decoding efforts of the world’s foremost cryptographers, humans and machines alike. Its name, Kryptos, translates from the Greek as “hidden”; the sculpture is divided into four quadrants (K1-K4), all of which have been solved except K4. What is most significant about Kryptos is that neither the most skilled human nor the most skilled computer cryptographers have been able to decode its message. Unfortunately, this presentation will not solve K4’s secret; instead, I will explore the dual failure of human/machine interaction evidenced by the impenetrable Kryptos sculpture.

Jared Grogan.
Garbage, Gospel, and Gold: Decoding User Inputs in Climate Modeling
In the 1970s the climate model and simulation World3 set in motion a series of debates, conceptual conflicts, and rhetorical interpretations that evolved with the changing structure of models (the construction and interaction of equations), the quantity and quality of data, and the methods of aggregation and calculation. Rather than charting a chronological narrative of these debates, I begin a new line of questions that applies Johnston’s "computational assemblage" (a machine and its associated discourse) as a framework to identify how users’ input into models both resembles and differs in form and function over three broad stages of development. This paper then examines contemporary modeling practices, exploring innovative ways that climate models are “going public,” where big data sets are generated, bought, and sold in petabytes while being rhetorically constructed in terms of risk, scarcity, and excess.

Session 7 (C) Doggett
Occultism and Science in 20th-Century Art
Chair: Linda Dalrymple Henderson

Linda Dalrymple Henderson.
'Four-Dimensional Vistas': Claude Bragdon's Synthesis of Theosophy, Ether Physics, and the Fourth Dimension in the 1910s
Through his books A Primer of Higher Space (1913) and Four-Dimensional Vistas (1916) Claude Bragdon was the most important promulgator of the multivalent spatial "fourth dimension" in early 20th-century America. Not only had this concept quickly acquired a variety of non-mathematical associations in the later 19th century, it was regularly discussed in relation to the ether of space in popular literature through the 1910s. Bragdon's Four-Dimensional Vistas is an invaluable time capsule of attitudes in this period. In his plaintive lament, "If [the Relativists] take away the ether, they must give us something in its stead," Bragdon spoke for a generation of artists and writers grounded in the prewar and wartime period. Like Bragdon, these figures saw the central elements of the prewar "meta-reality," the spatial fourth dimension and the ether, displaced in the new space-time world of Einstein in the 1920s and were challenged to navigate a radically changed cultural terrain.

Ashley Schmiedekamp Busby.
Astral Magicians: Surrealism, Astronomy, and the Occult in the 1930s and 1940s
During the early years of Surrealism in the 1920s, occult practices such as mediumship served as a vehicle for exploring the unconscious. However, as the movement developed, a second generation of Surrealist artists, active from the mid-1930s onward, explored a broader range of occult practices. This "occultation of Surrealism," as it has been called, served as a means to unite and strengthen the movement both during and just after the hardships of World War II. To date, scholars have focused primarily on links between Surrealism and alchemy; yet, given the Surrealists' general interest in astronomy and the heavens, the related occult practices of astrology and the tarot were equally central. This paper focuses on the art of Victor Brauner, Remedios Varo, Leonora Carrington, and Kurt Seligmann to explore their specific engagement with astrology and heaven-oriented elements of the tarot.

Arielle Saiber.
The Architecture of the Afterlife: Paul Laffoley's The Divine Comedy Triptych
Contemporary artist Paul Laffoley's The Divine Comedy triptych (1972-1975) exhibits his trademark architectural precision, mind-boggling detail, and immense erudition: qualities that lend themselves to an illustration of something as vast and intricate as Dante's Comedy. The triptych is, as Laffoley himself describes it, a “verbal cathedral.” It reconstructs the epic, encyclopedic content of Dante's masterpiece like no other rendering to date, and it does so through both a literal, visual “translation” of the text and through a carefully woven complex of medieval and modern esoteric symbolism and occult thought. As dizzying and transcendent as Dante's journey itself, the triptych reminds the viewer of the ineffable nature of the universe's scaffolding while giving a structure in/through which to think and even experience its shape and nature. As journalist Ken Johnson wrote of Laffoley's works recently, “If you want to give your mind a good stretch, try a Laffoley” (The Boston Globe, Feb. 11, 2007).

Session 7 (D) Mitchell

Benjamin Robertson.
The Illegible and the Interface
David Carson once solved the problem of boring interview copy by laying it out in Zapf Dingbats, thus following his tagline: "Don't mistake legibility for communication." In the early twenty-first century, Carson's point is critical. Even as writers struggle to produce legible documents, the interfaces through which they do so disappear and thus render the convergence of technology and reading/writing practices unacknowledged and unthought. Thus we have granted to words-on-a-page, left-to-right, top-to-bottom writing an unimpeachable position of legibility. Through an examination of the potential of composition interfaces and the products they can produce, this paper questions inherited notions of what can and cannot be read, or, in the language of this conference, what can be encoded and decoded. As Carson demonstrates through his work (mirrored in contemporary novels by Mark Z. Danielewski and Jonathan Safran Foer), the illegible is merely an act of communication we have yet to learn to read.

Rebecca Perry.
Decoding Process: Science and The Editable Self
To begin to decode the process of editing as a significant, widespread yet nearly invisible activity in science and culture, three realms of editing are considered: editing creatures in video games, editing creatures for laboratory research, and editing scientific images. The process of editing is a new area for investigation and an evolving form of discourse which has been taking place largely out of view, having long been considered a byproduct of the production of final scientific objects for publication.

Digital creatures: I was swimming as hard as I could, wriggling through the churning water, but I couldn’t outmaneuver the large creature with sharp jaws that was gaining on me, closing fast. A few quick scissoring motions later, the tide pool was pink with my blood. I went into editing mode, selected a faster, stronger set of fins and attached a pair of protective spikes. This used all of my available DNA points; I hit enter and returned to the game. Editing is nearly the first experience players have in Spore. Editing tools and practices are integral to the game, and editing has consequences—specific parts encode behaviors, sounds, capabilities, strengths and identities in the game. Editing is distinct from creating, or the origination of new content.

Laboratory creatures: Editing also now includes living laboratory objects, the production of model organisms such as worms, fruit flies or engineered strains of mice, including those with obesity, aging disorders, hair loss, and chemically-induced menopause. Custom-edited lab animals are available as well, in different formats such as frozen embryos or DNA samples. In synthetic biology researchers are using the language and icons of engineering to recharacterize components of a cell as biological “parts,” to be edited, standardized, rearranged, and recombined into “devices” as part of a manufacturing process aimed at producing items like tiny chemical clocks that fluoresce in cycles.
Images: Scientific image editing for publication has come to include a broad spectrum of complex scientific activities. Image creation has become a process of interaction with overlapping technological systems and regimes of visualization, overseen by a variety of actors, from the lab director to the journal editor. Many images are not optically generated, and are entirely dependent on image processing technologies. Landsat images, MRI scans, radar data, infrared images from space, and scanning tunneling microscopy images may look as if they were made with a camera, while their construction is entirely nonphotographic. Objects, processes or states may not have inherently visual characteristics and require translation from a non-visible to a visible state. Editing has become a process of discovery, a way to uncover hidden structures. What are the consequences of editability within scientific practice? Scientific seeing may experience an “interactive turn,” or shift away from a hierarchical model of inscription-based seeing, anchored by stabilized referents in publications, inscriptions displayed in the laboratory, or presentations at scientific conferences. Instead of an emphasis on completed images and inscriptions, we might focus on the process of image excavation, scientific object construction or other processes now considered intermediate and unfinished steps en route to the goal of scientific certainty.

Mark Wolff.
Decoding the Oulipian Text: A Question of Access
Many of the constraints on writing explored by the Oulipo, or Ouvroir de littérature potentielle, involve rigorous procedures based on mathematics that are effectively algorithms for generating texts. Reading an Oulipian text thus involves a process of decoding: in order to reach a fuller understanding of the text, the reader must ascertain the rules the author adopted and the way he or she applied them. The content of the decoded text ostensibly precedes that of the encoded one, but instead of focusing on the messages such decoding elicits, I will consider what it means to have access to these messages.

Session 7 (E) Whitman A
Mathematics and Literature

Paul Halpern; Michael C. LaBossiere.
Beyond the Wall of Perception
In our talk we will show how H.P. Lovecraft, inspired by the scientific and mathematical breakthroughs of his time, developed themes in his later writings involving the decoding of the true reality and the revelation of the horror that lies beyond the walls of our normal perceptions. We will demonstrate how Lovecraft, though influenced strongly by such 19th century fantastic works as Arthur Machen’s The Great God Pan, broke new ground by weaving into his tales of horror 20th century attempts to decode hidden aspects of nature. For example, in “The Dreams in the Witch House,” a mathematics and physics student learns how to perceive dimensions beyond our own, and in “The Shadow Out of Time” a professor unwittingly experiences mental displacement into the past. Other tales such as “Through the Gates of the Silver Key” and “Beyond the Wall of Sleep” feature the use of arcane tomes that reveal hidden truths of the world and show the reader what lies outside the sane bubble in which humans dwell. As we will examine in our talk, Lovecraft was one of the first writers to extract horror from efforts by modern physicists to decode the natural world—revealing its startlingly unpredictable and imperceptible aspects.

Michael Harris.
An automorphic reading of Pynchon's Against the Day
(submitted at the suggestion of Arkady Plotnitsky) Pynchon is commonly said to have a non-linear narrative style. No one seems to have taken seriously the possibility, to be explored in this presentation, that his narrative style might in fact be quadratic. Specifically, it is suggested that both the narrative structure and the imagery of each of his major novels are dominated by one of the conic sections: the hyperbola in Against the Day, the ellipse in Mason & Dixon, and so on. This semi-serious hypothesis is offered as an excuse to explore the role of technical mathematical material in Pynchon's novels. (For a sketch of the argument, see

Greg Kinzer.
Possible Worlds: Trans-world Travel, Haecceity, and Grief in Jacques Roubaud's The Plurality of Worlds of Lewis
This paper addresses the problem of decoding the nature of absence and death in Jacques Roubaud's The Plurality of Worlds of Lewis. An elegy meditating on the untimely death of Roubaud's wife Alix-Cleo, this book of poems takes as a central pivot the peculiar case -- from the standpoint of language as referent to the world -- of meaningful language that has no referent. Each poem addresses a 'you', the ever-present 'not-there-ness' of his beloved wife, that both does and does not point to something in the world. Roubaud, the mathematician and Oulipian writer and poet, refuses the easy consolation of religious or spiritual transformation (the staple of the traditional elegy) and turns instead to David Lewis's modal realism, the thesis that "the world we are part of is but one of a plurality of worlds, and that we who inhabit this world are only a few out of all the inhabitants of all the worlds" (Lewis vii). Themselves a decoding of Lewis's controversial book On the Plurality of Worlds, Roubaud's poems attempt to use the possible existence of many worlds, transworld travel, and ersatz worlds to make sense of the trauma of this unbearable loss. If Alix-Cleo no longer exists in this world, she must exist in other possible worlds. In the process, Roubaud challenges us to ask whether mathematical systems such as Lewis's truly decode the world as we experience it, and whether any such logical structure can ever offer true consolation for the loss of a loved one. Building on Brian Rotman's semiotics of zero, I suggest that the 'you' of these poems functions like the mathematical zero: as both a metasign that points to the absence of certain other signs and a sign that marks the point where a person begins counting, or, in this case, the point where the speaker of these poems begins recording his loss.
Just as zero marks the theoretical limit of a one-to-one correspondence between numbers and things, the 'you' of these poems explores the theoretical limits of language itself. The real heartbreak of The Plurality of Worlds of Lewis is the way in which the words used to designate her not-there-ness -- 'you,' 'Alix-Cleo' -- become unfixed and slippery metasigns, meaningful and yet ultimately undecipherable. References: Lewis, David. On the Plurality of Worlds. Malden, MA: Wiley-Blackwell, 2001.

Session 7 (F) Whitman B
Expression and Scientific Texts

» File available

Allmon Allred.
Strange Entanglements: The Hermeneutics of Natural History
In one of the most important moves for literary theory in the 20th century, Hans-Georg Gadamer "saved" the humanities from collapsing into the natural sciences by uncovering the importance of a constant and irreducible play in our engagement with the past. In doing so, he considerably toned down the radicalism of his own teacher, Martin Heidegger, who regarded his understanding of temporality as a feature of philosophical understanding as such, and who remained interested in parallels with, for example, quantum physics. In this paper, I propose a way of "hijacking" Gadamerian hermeneutics and putting them to the service of a philosophical understanding of the natural sciences (if not to the service of the natural sciences per se). I do this by examining experiments into the phenomenon of strange entanglement, which show that two particles in apparently identical states can nonetheless differ on account of their history. Although I am interested in this analogy as part of a larger project that asks whether the concepts of history and futurity in Heideggerian hermeneutics and deconstruction might make any unique contributions to the philosophy of science (especially physics), in this paper my main purpose is simply to uncover the mechanisms of the analogy and to develop a vocabulary through which a "natural" hermeneutics could be constructed.

Suzanne Black.
Giant Molecules Through the Decades: A Content Analysis of Figures in Biochemistry Textbooks
Recent work in the rhetoric of science has analyzed both the historical evolution of the scientific article (Gross, Harmon & Reidy, 2002) and the importance of visual representations for science (for example, de Chadarevian & Hopwood, 2004; Pauwels, 2006). In this paper, I propose to weave these two strands of analysis together by studying the historical development of visuals in biochemistry textbooks designed for advanced undergraduates and beginning medical students. I will begin with a review of successive editions of two widely used textbooks, Lubert Stryer’s Biochemistry and Albert Lehninger’s Principles of Biochemistry. I will then turn to early textbooks as well as some surveys of biochemistry intended for slightly more and less expert audiences. In studying the textbooks, I will examine the number of visuals, the ratio of visuals to text, types of visuals, uses of color, and visual conventions. Multiple editions of textbooks offer the researcher a way to study changes in science visualization over time, while textbooks bridge the boundary between lay and expert audiences. I focus on biochemistry partly because the field has a rich disciplinary history (Kohler, 2008) that at once precedes molecular biology and has been revolutionized by it, and partly because, as Roald Hoffmann has argued, drawing is central to chemical communication. In addition, images from the new biology have begun to inspire artists, raising the question of the aesthetic value of the scientific images themselves. In performing this analysis of visual content, I hope to answer questions such as the following: (1) To what extent are changes in biochemical illustration driven by scientific advances (better understanding), technological advances (better software), and aesthetic concerns (clearer or more elegant representations)? In the case of textbooks, are pedagogical concerns used to justify certain types or features of illustration? 
(2) How do textbook authors and illustrators discuss illustrations and justify changes in them? (3) When do biochemists begin to rely on visual explanations and visual evidence? Gross, Harmon, and Reidy argue that argumentation from visual evidence becomes central to the sciences in the late nineteenth century. Some of my other work suggests that this shift happens later in medicine. Biochemistry, however, is at once a medical and a chemical field, so it offers a useful test case for their finding. (4) Do textbooks start using visual argument at the same time as journal articles? Works Cited: De Chadarevian, Soraya, and Nick Hopwood, eds. Models: The Third Dimension of Science. Stanford: Stanford UP, 2004. Gross, Alan G., Joseph E. Harmon, and Michael Reidy. Communicating Science: The Scientific Article from the 17th Century to the Present. New York: Oxford UP, 2002. Kohler, Robert E. From Medical Chemistry to Biochemistry: The Making of a Biomedical Discipline. Cambridge: Cambridge UP, 2008. Pauwels, Luc, ed. Visual Cultures of Science: Rethinking Representational Practices in Knowledge Building and Science Communication. Lebanon, NH: Dartmouth College Press, 2006.

Scott Enderle.
Expression, Idea, Encoding: Copyright and Metamathematics
Fundamental to the definition of copyright in the United States is a distinction between idea and expression. This distinction, explicitly laid out in the US Code (17 USC 102), has been prominent in discussions of copyright law since its genesis in eighteenth-century Great Britain. But what is an "expression," and what is an "idea?" Scholars have sometimes questioned this distinction by observing that literary texts are precisely those most likely to undermine the analogous distinction between "form" and "content." My talk problematizes the distinction from another direction. Rather than focusing on exclusively "literary" texts, I attend to the status of all language as code. As anyone who opens a text file in a hex editor can confirm, computers can function as they do because human language consists of discrete symbols, which can be encoded numerically and decoded again. In the early twentieth century, mathematicians applied this insight to logical languages, and the resulting proofs laid the foundation of modern computer science. Among those mathematicians was the logician Kurt Gödel, who used such encodings to create a paradoxically self-reflexive statement akin to the liar's paradox. I argue that by creating a logical sentence that referred to itself, Gödel showed that the distinction between mathematical "ideas" and "expressions" cannot be sustained, at least in the physical world. Furthermore, I argue that this mathematical argument may be applied to literary texts as well, and that its implications for copyright law should be considered more carefully than they have been. Although it would be quixotic to seek a perfect concordance between the logic of mathematics and the pragmatics of the law, I argue that reading copyright law alongside Gödel's proofs yields productive insights into the methods and goals of copyright law as it is, and as it could be.
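The abstract's central observation, that text made of discrete symbols can be encoded numerically and decoded again without loss, can be made concrete with a toy sketch. (This byte-packing scheme is only an illustration of encodability; Gödel's actual numbering used prime factorization of formula symbols.)

```python
# Toy illustration: any text, as a sequence of discrete symbols,
# maps losslessly to a single natural number and back again.
def encode(text: str) -> int:
    """Pack a string's UTF-8 bytes into one integer."""
    return int.from_bytes(text.encode("utf-8"), byteorder="big")

def decode(number: int) -> str:
    """Recover the original string from that integer."""
    length = (number.bit_length() + 7) // 8
    return number.to_bytes(length, byteorder="big").decode("utf-8")

expression = "This sentence is also a number."
idea = encode(expression)          # one (very large) integer
assert decode(idea) == expression  # the round trip is lossless
```

The round trip is what lets a computer (or a logician) treat the same object alternately as an "expression" and as a number, which is the slippage the talk exploits.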

Session 7 (G) Woodruff A
Whitehead+Cosmopolitics IV: novel ecologies of practice ii
Steven Meyer
This is the fourth of four panels for the stream "Whitehead+Cosmopolitics" proposed by Steven Meyer and Sha Xin Wei. The abstract is identical for each panel, although the presentations differ. Most generally, this four-panel stream poses the question of the relation between the cosmopolitics envisioned by Isabelle Stengers in her “Cosmopolitical Proposal” (2005)—and back of that, the two-volume Cosmopolitiques (1997, 2003)—and the philosophy of social organism of Alfred North Whitehead. Both Stengers and Whitehead insist that any cosmological inquiry worthy of the name must take science (including technoscience) seriously, and that this means more than merely subjecting it to traditionally social and political as well as philosophical critique. For starters, it means cosmopolitics and it means societies of actual occasions. How to undertake the relevant cosmopolitical and Whiteheadian analysis is, to be sure, a complicated question; and the ten panelists approach it variously, in some instances directly, in others indirectly. The stream is divided between theoretical and practical approaches, although the frame is deliberately porous. As Andy Goffey remarks in an observation equal parts Whiteheadian and Stengersian, the idea is to develop more specific accounts of “the cultivation of thinking practices that evade the easy academic distinction between theory and practice.”

Steven Meyer.
Animals Like Us: Adding Stengers and Whitehead to the Jamesian Mix
"Humanity Even for Nonhumans" runs the title of a recent New York Times op-ed by Nicholas Kristof (9 April 09). The "idea popularized by [Peter] Singer—that we have ethical obligations that transcend our species—is one whose time appears to have come." Animal rights, in two words. The March 2009 PMLA is framed by remarks on "why animals now?" (by Marianne DeKoven) and "'animal studies' and the humanities" (by Cary Wolfe). Something is certainly in the air, but when we limit nonhumans to animals ("pigs and chickens," "pregnant hogs or egg-laying hens" kept "in tiny pens or cages"), are we really broadening our sights to nonhumans or narrowing our focus to nonhumans like us? To follow out the implications of this question I turn to the Whitehead of Process and Reality ("we find ourselves in a buzzing world, amid a democracy of fellow creatures") and the Stengers of "The Cosmopolitical Proposal." As Whitehead put the crux of the matter some years earlier with regard to “value” ("the word I use for the intrinsic reality of an event"—later he would substitute "actual occasion" for "event" in this context, reserving the latter term for “a nexus of actual occasions” and characterizing an actual occasion as “the limiting type of an event with only one member”): "Value is an element which permeates through and through the poetic view of nature [and by extension the cosmopolitical view]. We have only to transfer to the very texture of realisation in itself that value which we recognize so readily in terms of human life." The question, then, is whether the current "push in Europe and America alike to grant increasing legal protection to animals," with its exclusive focus on creatures so very much like us, gets us any closer to the buzzing world in the midst of which we certainly do find yet also lose ourselves, as in a fog.

T. Hugh Crawford.
Thinking with Trees: Idiots, Assemblages, and Cosmopolitics
Henry David Thoreau built his Walden house at what was probably the peak of deforestation in New England (as the farms were deserted in the latter part of the nineteenth century, the trees made a comeback). He built out of second growth white pine, and has been duly chastised for this comment: "Before I had done, I was more the friend than the foe of the pine tree, though I had cut down some of them, having become better acquainted with it." One way to read his comment is that he anticipates Isabelle Stengers's call to play the idiot—the scientist/philosopher willing to risk idiocy in order to slow down processes and judgments so that real cosmopolitical understanding can emerge: “We know, knowledge there is, but the idiot demands that we slow down, that we don’t consider ourselves authorized to believe we possess the meaning of what we know” (“Cosmopolitical Proposal”). What happens if, following Stengers/Thoreau, we try to think with trees? Such idiocy quickly points us back to Stengers’s intellectual genealogy—Whitehead and Deleuze, who provide a philosophical context for understanding such ecological assemblages. Specifically, we must call on Whitehead’s notion of philosophy of organism (with all its complexity), and we need to recast and complicate Deleuze and Guattari’s opportunistic and somewhat facile rhizomatic/arborescent distinction, turning instead to the much more productive “machinic assemblage.” With this conceptual background—organic, machinic, cosmopolitic—we can turn back to Thoreau and on to others who have become acquainted with trees, e.g., Eric Sloane’s A Reverence for Wood, Christopher Stone’s Should Trees Have Standing?, and Joan Maloof’s Teaching the Trees. Look out Animal Studies, Tree Studies is here!

Session 7 (H) Woodruff B
The Limits of Code: Life, Body, and Image
Ellen Esrock; Rob Mitchell; W.J.T. Mitchell
We propose a panel to take up the question of code in three contexts: life, the body, and the image. Rob Mitchell will address the relationship of living organisms and code. Ellen Esrock will discuss the body as a participant in the codes of meaning and affect that circulate through verbal and visual art. W. J. T. Mitchell will take up the question of the image as a sign that both relies upon and breaches the boundaries of code.

Rob Mitchell.
Coding Life & Vital Codes
Since the 1950s, an understanding of life in terms of "code" has guided research in molecular biology, both in what Richard Doyle has described as its early cryptographic phase (the effort to "crack the code of life") and in its more recent pragmatic phase (mixes and mash-ups of genetic sequences). Yet in our present moment, this understanding of life as code seems to have arrived at a tenuous juncture. On the one hand, synthetic biology, that is, the creation of genetic sequences (and thus, in principle, organisms) from the ground up, seems like the supreme validation of this understanding of life as code. On the other hand, increasing awareness within disease research communities of the extraordinary difficulties involved in trying to "decode" disease has many wondering whether a new paradigm for understanding the relationship between heredity and life is necessary. This confusion over what it might mean for life to be coded provides us, I suggest, with an opportunity for rethinking both "life" and "code" and their interrelation. This presentation seeks to draw together some of these efforts to rethink the relationship between life and code, drawing on, for example, Eugene Thacker's emphasis on the importance of feedback processes to the coding of life; my own work on the relationship between life and "media"; and recent work on the environments and ecologies of digital media.

Ellen Esrock.
Coding the Body of Reader and Viewer
Contemporary theories of embodiment hold that the physical body fundamentally shapes our emotions, thoughts, concepts, and beliefs (phenomena long held to be mental) and serves as the starting point for understanding all human processes and activities. Despite increasing support for such theories, the foundational role of the body is still only beginning to be explored in studies of literature and visual art. While the body as a theme has certainly been prevalent, and psychological studies of reading and spectatorship have become increasingly sophisticated, the body is still treated much as it was in the 1970s: as a site for registering emotion, which is produced by readers and spectators who decipher codes of meaning and affect that circulate throughout various media. Drawing both on the cognitive neurosciences and on phenomenology, I propose that readers and viewers can use their bodies, specifically their somato-sensory-motor systems, to enter into works of literature and art. From this new position, the body can become fully coded and integral with the existing visual and verbal codes; the body can also become a fragment of code that forgets, exceeds, or ignores the circulating codes of affect and feeling.

W.J.T. Mitchell.
Image and Code
Much of the modern theoretical literature on images has been divided into two opposed camps: those who think that the digital revolution and semiotic science have rendered the concept of the analog sign irrelevant; and those who think that the very same revolution has created a world in which “image is everything.” This seems like a propitious moment to move beyond this impasse to re-examine some fundamental questions: 1) are images fully coded, or do they come with a remainder, a “message without a code,” as Roland Barthes put it; 2) has digital photography eroded the traditional account of this medium’s indexical relation to reality? 3) to what extent are images specific to media and their codes, and to what extent do they circulate across media? 4) what is the relation between the codes pertinent to language and to images, respectively? 5) what are the different kinds of codes that play a role in image-making and image-breaking? 6) to what extent is the question of the image/code relation implicated in our other two topics, the body and life? Could it be that bodies, life forms, and images are three names for entities that share many of the same dialectics between the coded and the uncoded?

Session 8 - Sat 1:45pm - 3:15pm

Session 8 (A) Bennett
"Twittered Subjects"
Chair: Mark Hansen
Description: This panel addresses the impact of the contemporary technosciences on our conceptualization of the subject. All three papers draw attention to the constitutive relationality of the subject and give varying, but complementary accounts of what comprises the relational context for contemporary experiences of subjectivity. In all three cases, emphasis is placed on the correlation between microprocesses of embodied subjective experience and dynamic technical structures (ranging from the body itself to social networking technologies and pervasive computing environments). Thus, all three papers assert some claim for the interrelation of contemporary neuroscientific research into the fine structure of cognition and the massive transformation of our contemporary cognitive environment. At stake, ultimately, is the question concerning how our self-understanding correlates with our constitutive plasticity, how our “intimate” experience is coupled with the massive informatization of our world.

Casey Alt.
Response: Sociopathic Subjects?
A response to the three presentations, likely including a Flash widget specially designed to correlate them. The response will draw on the presenter's work designing sociopath software.

Bill Seaman.
The Body as Electrochemical Computer
Building a model for an electrochemical computer is both an exciting and a daunting task. In order to model and ultimately build such a device, one seeks to borrow important operative concepts and processes from the body and re-understand them in the context of a device that is not human in nature. Certainly the task is to learn more about mind/brain/body and language in the process. This procedure will include bridging a series of domains including biology, physics, cognitive science, computer science, electrical engineering, an expanded linguistics, philosophy, psychology, and the arts. Starting from a new premise related to an “Open Order Cybernetics,” we will see that language acquisition and the production of meaning together constitute an open, ongoing process, situated and informed by reciprocal action among others, self, and environment. Thus the delicate intermingling of scientific, philosophical, and linguistic presuppositions surrounding such a project must be carefully examined. This paper will articulate a series of questions related to human experience to inform the generation of the model and to define a driving set of problems.

Tim Lenoir.
Twittered Datastreams: The subject function in ubiquitous computing networks
Early cyberneticians dreamed of immersive data-driven environments where the Realists’ world of stable ontologies populated by static objects and subjects semiotically mediated by indexical representation would give way to a world of data streams, a shift to process, to mapping relationships, patterns, dynamic flows, and interactions among data objects. In place of the static archive (of, for instance, photographs), memory would become dynamic and contextual, and visuality would become distributed, emergent, and not necessarily concentrated in the subject. Much recent scholarship has derided, even exploded, the fantasies of the early cyberneticians. While sympathetic to this body of important work, I will argue that much of what the early cyberneticians imagined is being installed in current massively networked cyberinfrastructures. In elaborating upon this claim I will look at three streams of work: 1) recent developments in massively distributed data-mining and visualization tools which dispense with traditional ontologies in favor of relational fields. These tools transform traditional senses of reading that inscribe a sense of “interiority”; 2) recent work in the cognitive neurosciences documenting the massive amount of unconscious information processing going on and the central role of affect in cognition. Whereas philosophers of mind in the early 1980s dismissed the subject as a superfluous fiction, the subject is now back—but in a new, metacognitive form administering to the creation of automatic non-conscious processes. As the subject is embedded in ubiquitous networked environments, the self is increasingly an interface technology; 3) newly emerging interface technologies for self-management. In ubiquitous computing environments the self is managed through its avatar. 
Interfaces such as Facebook, LinkedIn, YouTube, and Twitter organize and present the self in formats accessible to human-scale temporality, while identity-management services crawl the data streams, performing data-mining, monitoring, and cleaning up personal data-clouds, managing and narrating a reputable online self suitable for exposure to Google.

Mark Hansen.
When is the Subject? Micro-temporal networks and asubjective fluxes
The contemporary neurosciences hold much promise for reconfiguring subjectivity – and our discourses on subjectivity – at a level far more fine-grained than that of representation and consciousness. Enabled by contemporary imaging technologies which have opened up a massive micro-domain of temporal processes for exploration, neuroscientists have focused on the role of temporalization in the generation of mental images and other neural assemblages that can be said to constitute the infrastructure of experience. While disagreements arise concerning how patterns of activity are bound (the “binding problem”), there is an expanding consensus among scientists that binding is the key to understanding the emergence of coherent and seamless experience from discrete cognitive subprocesses. Scientists adhering to this consensus include such prominent figures as Damasio, Edelman, Rodolfo Llinas, Semir Zeki, Walter Freeman, and Francisco Varela. In my paper, I shall explore the significance of this corpus of work on temporal binding for our conceptualization of subjectivity. To do so, I shall follow the lead of philosopher Catherine Malabou, who in a series of books (most notably What Should We Do With Our Brain? and Les Nouveaux Blessés [The New Wounded]) has argued for a rethinking of cognition in terms of the concept of plasticity and in particular of the destructive form of plasticity that, for instance, allows a neurologically damaged patient to keep living, even as she undergoes a total personality change. If, Malabou proposes, we maintain the fragility of identity exposed by the new neurological wound as the principle and basis of our understanding of subjectivity, we are compelled to think the latter not as a substance but as the very process of plasticity in action. 
My paper will engage in a minor critique of Malabou that aims to displace the negative injunction at the core of her position (that we take the neurologically wounded mind as paradigm) in favor of a positive account of the ongoing emergence of subjectivity from the operation of (quasi-)autonomous microtemporal processes as these are bound by various “technologies,” ranging from neural synchronization and sensory convergence, through language, to social assemblages and contemporary technical networks. My paper may include a brief discussion of Richard Powers’s novel about neurological wounding, The Echo Maker (2006).

Session 8 (B) Crescent

Mark Pizzato.
Decoding Cathartic Effects in the Brain
Aristotle's ancient notion of catharsis, as the purifying of fear and sympathy (or other emotional drives) while reading tragic drama and watching theatre, has become perhaps the most influential aesthetic idea in Western culture. It has also influenced psychoanalytic theory and clinical research. How do individual readers or spectators, and the collective audience, absorb certain types of communication from the page and stage, thus altering the corresponding theatrical areas of spectators' brains, improving their conscious awareness or not? This essay uses neuroscience and Lacanian psychoanalysis to explore the possible cathartic effects of drama and theatre upon the brains of readers and spectators. First, I will present a theoretical model of Symbolic (mostly left cortex), Imaginary (mostly right cortex), and Emotional or Real (subcortical) aspects of catharsis. Then I will discuss the initial findings of research with students in my fall 2009 Performance Theory class, using surveys to evaluate their experience of each of these three aspects of catharsis in reading and viewing a Hamlet monolog (script and various movie versions), Arthur Miller's The Crucible (script and UNC-Charlotte stage production), and Suzan-Lori Parks's Topdog/Underdog (script). Eventually, algorithms of each student's reactions will be developed to create a computer model of collective catharsis at specific points in these scripts and performances—as tragedies from different historical contexts.

Steven J. Oscherwitz.
Decoding an Artist's Intent
I think we do decode nature, and I want to agree with Husserl's 1954 masterpiece, The Crisis of European Sciences, that the intellectual decoders, the conceptual tools that the ordinary scientist intentionally used then (and in fact still uses now) to understand, represent, and translate the physical world, are trapped in preconceived notions that reflect flaccid and non-rigorous logical processes, insufficient for the ontologies and epistemologies of the coming 21st and 22nd centuries. Historically, with the development of Greek science, it was the intention of the observer to represent and make visible the inherently invisible workings of nature; in other words, to decode, to make translatable, observable, and understandable both the seen and unseen lawlike fabric of nature. Such translation would allow one to get beyond the world as it first appears. Aristotle's study of the cosmos, rocks, plants, and animals systematically represents and makes visible and observable nature's objects through an empirical philosophy that pictures and translates the natural world into a systematic understanding with specific logical orders; this Aristotelian philosophy would evolve for many centuries. But not until the 20th-century philosophy of Edmund Husserl does the inner image, the very act and fabric of the mindful assembly of intention and its inner presentation of imagery, become the actual object to investigate, decipher, and decode. This inner and rigorously logical fabric of mindful activity, as defined by Husserl, becomes a phenomenological vehicle that kinetically constitutes our thought and our identity with the world. 
Through a presentation of my own digitally created 3-dimensional architectonic compositions, I want to explore and reflect on this Husserlian interpretation of how inner appearances, images, and phantasies present a theater of inner picturing, one that could help compose more logically rigorous and meaningful epistemological and ontological structures for the future. Concentrating on specific chapters of Husserl's Crisis, on Ted E. Klein and William E. Pohl's 1971 translation of Phenomenology and the Foundations of the Sciences, and on John B. Brough's recent 2005 translation of Husserl's Phantasy, Image Consciousness, and Memory (1898-1925) (Springer), I contend that an artist composing and experiencing digital imagery through the CAD software program Rhino 5 can compose more comprehensive and multi-faceted forms of observation, forms that prove highly susceptible to phenomenological interpretation. I then want to make the case that contemporary science can use such a Husserlian reflection, geared to digital form, to better compose observation of the natural continua of the world, developing more logically potent and meaningful forms of decoding of the lawlike fabrics of nature, both inside and outside ourselves.

Kimberly Knight.
Decoding the Viral: A Matrix for Evaluating Viral Video
In Web Video: Making it Great, Getting it Noticed, Jennie Bourne advises the novice video maker on the importance of manageable length, a well-developed narrative, and a receptive community. However, these criteria fall short of explaining some of the most popular videos currently in circulation. In this paper I examine a sampling of “viral” videos and attempt to “decode” their dominance in terms of quantifiable factors, such as length, number of comments, and the presence of an external community. I also delve into certain intangibles such as aesthetics and humor. Questions I hope to address include: Are there certain memes or mythemes held in common by these videos? Does the originating purpose of the video affect the matrix? Are there meaningful differences between videos that are popular in the short-term versus those that sustain their popularity for longer periods?
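The quantifiable side of such an evaluation matrix can be sketched in code. The sketch below is purely illustrative: it combines the factors the paper names (length, comment count, external community), but every weight, threshold, and function name here is an invented assumption, not part of the paper's actual method.

```python
# A toy version of the paper's "matrix" restricted to its quantifiable
# factors (length, comment count, external community). All weights,
# thresholds, and the scoring function itself are invented here purely
# for illustration; they are not from the paper.
import math

def virality_score(length_seconds, num_comments, has_external_community):
    """Combine quantifiable factors into one comparable score in [0, 1]."""
    # Assume shorter videos travel better: full credit up to ~2 minutes,
    # tapering to zero by 10 minutes.
    length_factor = max(0.0, min(1.0, (600 - length_seconds) / 480))
    # Comment counts span orders of magnitude, so score on a log scale,
    # saturating near 100,000 comments.
    comment_factor = min(1.0, math.log10(num_comments + 1) / 5)
    community_factor = 1.0 if has_external_community else 0.0
    return 0.3 * length_factor + 0.5 * comment_factor + 0.2 * community_factor
```

A matrix of this kind makes short-term versus sustained popularity testable by scoring the same video at different points in time; the intangibles the paper also weighs (aesthetics, humor, memes) resist any such formula.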

Eva Struhal.
Painting as “Esperienza”: Art and the New Science in Seventeenth-Century Florence
Recent scholarship on the seventeenth century has become increasingly aware of the important role the complex interchange between art and science plays in Baroque culture. In my paper I propose to trace parallels between science and painting in early modern Florence. In particular, I will compare the techniques of observation and experimentation (“esperienze”) practiced by the Accademia del Cimento, founded in 1657 by a group of Galileo’s students, and the “simple imitation of nature” on which the Florentine painter Lorenzo Lippi (1606-1665) based his art. Lippi envisioned the role of the artist as being that of an acute observer of the physical environment, not the creator of a fantastic world; this project matches the focus on close observation of nature that already marked Galileo’s concept of science. Like Galileo and his followers, Lippi also emphasized sensory, autoptic experience as the means to produce art as a science-like, objective discipline. Erwin Panofsky and Horst Bredekamp have demonstrated the impact of art on early modern science by showing how deeply Galileo’s scientific inventions, indeed his concept of science as a whole, were informed by his training as a painter and his interest in art. Conversely, I would like to demonstrate the reciprocal impact of science on art by examining Lippi’s belief that truth and objectivity were requirements for his art. Lippi’s familiarity with the world of the scienza nuova was established most likely through the Accademia del Disegno, where Galileo’s students Evangelista Torricelli and Vincenzo Viviani taught mathematics and the art of perspective. Given this contact, parallels between Lippi’s art and Galileo’s scientific principles are not accidental: both believed that attentive study of nature leads to incontrovertible results which are able to withstand change, resulting in artistic or scientific principles that are unfading and immortal.

Session 8 (C) Doggett
Thoreau's House: a Nineteenth Century Building Project
Hugh Crawford
A group of Georgia Tech Honors students is participating in a Fall 2009 seminar in which, among other things, they are building a full-scale version of Thoreau's house using only 19th-century tools. They will fell the trees and hew them with borrowed axes. Members of the class will present different perspectives on this project through a series of Pecha Kuchas. Pecha Kucha is a presentation format in which each speaker shows 20 slides, speaking for 20 seconds per slide. Hugh Crawford, instructor of record for the course, will serve as respondent.

Session 8 (D) Mitchell
Decoding Technologies of Mediation

Robert Rosenberger.
Experiencing Television Otherwise: Technological Mediation and the Ethics of the TV Set
Television has received much critical analysis in the forms of evaluations of programming content, and reflections on the kind of subject engendered by the passivity of television viewing. I consider the experience of television not primarily in terms of the content of programming, but in terms of the television’s status as a technology—as an apparatus which mediates a viewer’s experience in a particular manner. Expanding on Don Ihde’s phenomenology of technology, I develop an account of how this technology structures a viewer’s experience, actively occupying conscious attention in certain ways, just as other aspects (such as the remote control in hand) fade into the background of awareness. I suggest that this account has the potential to productively refigure contemporary debates in the philosophy of technology over the morality of television consumption practices. In Albert Borgmann’s influential account of the morality of technology, he identifies television viewing as a paradigmatic example of a large-scale, technologically constituted, morally problematic pattern in contemporary everyday life. He argues that we should be more cognizant of practices that bring together the positive things in our lives so that these practices can be promoted. Or else we will continue to increasingly fall into technologically influenced bad habits such as watching too much TV, rather than, say, spending quality time with our loved ones. However, others such as Ihde and Peter-Paul Verbeek caution that television—and indeed any technology—should not be condemned as essentially negative since other relations and contexts are always possible. (Discussions in the philosophy of technology often become mired in such quandaries about how particular identified patterns of technological relations should be reconciled with anti-foundational and anti-deterministic commitments.) 
I claim that for the specific case of television, but also with implications for these general quandaries, an expanded phenomenological account of the experiential relationship a viewer shares with a television productively draws out the mechanisms involved in the particular ways viewership typically occurs, and reveals what it would mean concretely for this to occur otherwise. The expanded phenomenological account I provide emphasizes several features of television viewing, such as the particular manner in which the TV structures a viewer’s field of experience, and also the particular strength of the bodily-perceptual habits associated with this experiential structuring. If thinkers such as Ihde and Verbeek want to seriously contend that television can be approached differently, any alternatives that are offered must take into account these features of experience. I continue by considering possible stable alternative human-television relations. For example, riffing on Bruno Latour’s famous analysis of speed bumps (which enforce speed limits through physical imposition), I consider the related phenomenon of ripple strips which interrupt a driver’s relation to the car through rumble and noise; an analogous occasional interrupting “reminder” may be possible for helping one to remain more conscious of her or his television viewing. Another of the examples I consider is the possibility of systematically expanding television viewing practices into larger projects occurring in the room, as happens in cases in which the television is watched while one jogs on a treadmill. The objective in examining these alternatives is NOT to advocate any in particular; it is to contrast the characteristics of various possible human-television relations with those of the typical pattern.
Insofar as these alternatives strike us as inconvenient, difficult, impossible, or inconceivable, or do not address the problems associated with the typical pattern, a spotlight is shone on the degree to which our typical relations are entrenched in bodily, perceptual, and material habits and practices. These considerations are especially relevant to our current historical moment, as these habits and practices increasingly spread to further technological configurations, such as video played on desktop and portable computers and on cellular phones.
References
Borgmann, A. (1984). Technology and the Character of Contemporary Life. Chicago: University of Chicago Press.
Ihde, D. (1990). Technology and the Lifeworld. Bloomington: Indiana University Press.
Latour, B. (1999). Pandora’s Hope: Essays on the Reality of Science Studies. Cambridge: Harvard University Press.
Selinger, E. (2006). “Normative Phenomenology: Reflections on Ihde’s Significant Nudging.” In E. Selinger (ed.), Postphenomenology: A Critical Companion to Ihde, pp. 89-107. Albany: SUNY Press.
Verbeek, P.P. (2005). What Things Do: Philosophical Reflections on Technology, Agency, and Design. State College: Penn State University Press.

Katherine Hayles.
Telegraphy and the Place of the Human: The Co-Evolution of Code Books and Human-Machine Cognition
The invention of Morse code, the first globally pervasive binary signaling system, was quickly supplemented by codes used to encrypt the telegram’s plaintext. Although at first codes were used for secrecy, the emphasis soon shifted to economics. By the 1880s, telegraph code books were in general use.  Nearly every industry had its own code book—cotton, iron, law enforcement, even cinema.  Some code books contained codes that spanned multiple languages, so that an English speaker, for example, could construct a telegram using code groups for English phrases, and the recipient could translate the code groups using the French section. The dream of using a binary system to construct a universal language, fully realized with the digital computer, began with telegraphy. Indeed, telegraphy in general played a central role in bringing about globalization, for it was the first technology to enable rapid communication across the world.  Code construction became a focus for social contestations not only between telegraph companies and code book publishers but also, since code construction was regulated by the International Telegraph Union, between different nations, language groups, and corporate interests. In 1879, the International Telegraph Union conference identified German, English, Spanish, French, Italian, Dutch, Portuguese and Latin as the languages which could count as “natural” languages for extra-European telegraph traffic and so be charged at a cheaper rate than enciphered telegrams, which would therefore include all of the rest of the world’s languages as well as artificial words. During the same period the United States required that code words either be pronounceable made-up words or dictionary words.
The reasons involved the dynamics of the humans who encoded and decoded the messages; telegraph operators found that the regular alternation between vowels and consonants in made-up words was easier to encode than multiple-consonant configurations in English or German. The general trajectory of code books moved from natural language words used to encode phrases (for example, “festival” stood for “mammoth torpedoes, 3 case”) to code groups composed of an arbitrary combination of five or ten letters. Telegraph code books thus open a window into the complex relations between human cognition, kinesthetics, and psychology as they interacted with capitalism and international politics.
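The economic logic of these code books can be made concrete with a toy sketch: one code word replaces a whole phrase and is billed as a single word. Only the "festival" entry below comes from the abstract's own example; the other entries and all function names are invented for illustration.

```python
# A toy telegraph code book: one code word stands for a whole commercial
# phrase and is charged as a single word. The "festival" entry is the
# example quoted in the abstract; the other entries are invented.
CODE_BOOK = {
    "festival": "mammoth torpedoes, 3 case",
    "harmony": "ship at once by fastest steamer",        # invented
    "lantern": "market price has fallen, await orders",  # invented
}
ENCODE = {phrase: word for word, phrase in CODE_BOOK.items()}

def encode(phrases):
    """Replace each plaintext phrase with its one-word code."""
    return " ".join(ENCODE[p] for p in phrases)

def decode(telegram):
    """Expand a telegram of code words back into plaintext phrases."""
    return [CODE_BOOK[w] for w in telegram.split()]
```

The later shift to arbitrary five- or ten-letter code groups simply swaps the dictionary-word keys above for strings like "AXBOF", trading operator-friendly pronounceability for a larger code space.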

sam smiley.
Can You Hear Me Now? A commemoration of the Transatlantic Cable of 1858
In 1858, the first successful messages were sent over a transatlantic cable that ran from Valentia, Ireland, to Heart's Content, Newfoundland. After some tests, in August of 1858, Queen Victoria of England and President Buchanan of the United States exchanged congratulations. In September of 1858, the cable stopped working due, among other factors, to the zealous and consistent over-application of 4,000 volts of electricity by Edward Whitehouse, one of the engineers. In 2008, AstroDime Transit Authority, a media arts collective, commemorated this transmission using a serial and relay tin-can telecommunications line from an island in Boston Harbor to the mainland during low tide. Combining history, performance art, text, and video, this presentation commemorates the transatlantic cable of 1858. It also looks at the history and the technology behind this effort and examines how failure can be a positive outcome.
BIO OF ASTRODIME: AstroDime Transit Authority is a research and media arts collective organized around ideas of transportation and communication. Our methodologies include video interviews and documentation, surveys, performances, and research into the history of low-tech telecommunication systems. Members include sam smiley, Bebe Beard, Ali Horeanopoulos, Lisa Lunskaya Gordon, Mary Ann Kearns, and Julia Tenney. We publish a twice-yearly video journal called INtransit, which examines technocultures and practices. More information about AstroDime Transit Authority can be found at
BIO OF sam smiley (presenting member): sam smiley is a media artist and educator. She teaches in the Creative Arts in Learning program at Lesley University and is the assistant director of the Community Arts Masters program. Her media work includes videos, music, performances, and interventions.

Session 8 (E) Whitman A
Round Table on Schechner's Performance Theory
Susan Squier
This panel brings together three scholars working in very different fields to discuss a single, shared text with important implications for science studies: Richard Schechner's Performance Theory, and the chapter "Magnitudes of Performance" in particular. The presentations will explore how Richard Schechner's performance theory resonates in their own work, and how they use, redefine, adapt for their own purposes or quarrel with his notion that performance takes place, and thus can be explored, at an ascending scale of magnitudes that links the biological and the social, the very minute and the very large.

Jameson Bell.
"A Universal or Situated Brain? Engaging the Performance of a Cerebral Event"
By incorporating then-contemporary neuro-scientific theories into performance studies, Richard Schechner argues in Performance Theory (1976, 1988, 2003) that culturally specific signifiers found in micro and macro scale human activities can be interpreted universally through their emotional (affective) signified brain events, or felt ANS activity. After outlining several assumptions that appear to motivate Schechner's very specific description of a 'brain event,' this paper will present my own research on a historically situated 'cerebral happening.' My work on the Medieval and Early Modern ventricular doctrine of brain activity shows both favorable and contradictory theoretical applications of performance theory to brain events and the use of neuroscience to further humanistic research. In addition to Schechner's performance theory, Latour and Daston argue positively that various magnitudes of performance can be favorably applied to scientific objects as well as individuals. Schechner's belief in a universal (cerebral) language proves at best difficult to confirm and at worst contradictory to his theory of frames and objects that transform considerably within defined spatial and temporal rituals. Brain events thus become fashionable and intricately connected to shifting cultural values and epistemic practices.

Charlotte Eubanks.
"Deep Acting: Buddhist Rituals of Sutra Reading and the Neurological Magnitude of Performance"
In his chapter "Magnitudes of Performance," Richard Schechner posits a very provocative idea.  "Maybe," he argues, "the deepest acting goes on at the neurological level." He continues, drawing on scientific literature, to suggest that ritual has the ability to resynchronize both biological and social rhythms.       In this paper, I bring Schechner's ideas into conversation with the classical Buddhist concept of kanno ("stimulus and response"), a desired state of sympathetic resonance between a believer and a divine being. Significantly, kanno is often described in musical terms, likened to the way that a zither, plucked in one corner of a room, will cause a nearby instrument's strings to vibrate. Working from medieval Japanese miracle literature, I explore legends concerning the musical art of sutra recitation as an instance of deep, neurological 'acting.'

Luis Arata.
"Model Performance"
This presentation reflects on two issues that Schechner raises in Performance Theory. One is that performance is an illusion of an illusion, making it more real than ordinary experience. The other is the question of whether a performance generates its own frame or has one imposed from the outside. I argue that modeling links and illuminates both issues. A model (a frame and somewhat of an illusion) is needed to perform. Performance, in turn, leads to remodeling (reshaping the frame) and to enriching the illusions that shape realities. For examples I will draw on Sartre's Nausea (modeling existence as performance), Cortazar's Hopscotch (framing the unknown for performance), and Euclidean geometry (enabling performance through illusions).

Session 8 (F) Whitman B
Market Theory, Financial Engineering, and the 2008 Collapse of the American Economy
Jim Swan

Jim Swan.
Equilibrium, Model, and Shock: American Economics and the Crash of 2008
The foundational concept of neo-classical, orthodox economics is “general equilibrium.” Its purpose is to formalize Adam Smith's concept of an “invisible hand,” the free-market mechanism working behind the scenes to guide the totality of market activity toward an optimal accumulation of social wealth. Hence, markets naturally seek equilibrium and are only harmed by “outside” intervention. Neo-classical market theory is in fact strongly inflected toward hard distinctions between “inside” and “outside.” Outside “shocks,” like the oil shocks of the 1970s, disturb markets but, sooner or later, they again achieve equilibrium. Air pollution is an “externality” of the market process, as is the resulting lung cancer and global warming. As for mathematical models of the market, standard textbook language says that “some things must be treated as given… These are the exogenous variables [i.e., arising outside the model]… The endogenous variables, then, are those that are going to be explained by the model.” For orthodox economists, even economic history is located outside the discipline, taught only in other departments. Working today against orthodox theory is a growing number of economists specializing in international trade, third world development, the environment, and an historicized model of economics. For them, the boundaries between inside and outside are not only permeable, but they point toward interdependent and mutually contingent histories of world economies and their effects. This paper sketches how such histories are the true context for any account of the American economy, from the end of World War II to the 2008 market collapse.

» PowerPoint Presentation & Paper

Sean Miller.
Random Walks and Lévy Flights: The Black-Scholes-Merton Model of Options Pricing as an Epistemic Imaginary
In 1997 Myron Scholes and Robert Merton won the Nobel Prize in Economics for work set out in two related papers, 'The Pricing of Options and Corporate Liabilities' and 'Theory of Rational Option Pricing,' both published in 1973. These papers proposed a new model, now called the Black-Scholes-Merton model, for determining the value of derivatives in a given financial market. Interestingly, the model itself exploits a form of stochastic calculus that Einstein used in 1905 to propose a way of experimentally validating the existence of atoms by measuring their Brownian motion, their 'random walk' through a fluid medium. Thus this notion of the 'random walk' would seem to substantiate a model for both a natural and a social phenomenon. The sociologist Donald MacKenzie has documented the adoption of the Black-Scholes-Merton model by 'orthodox' economics as an integral part of its epistemological culture. He argues that '[o]ption pricing theory seems to have been performative in a strong sense: it did not simply describe a pre-existing world, but helped create a world of which the theory was a truer reflection': a truer reflection, at least, until the stock market crash of 1987, the massive failure of Long-Term Capital Management in 1998, and the recent collapse of the global financial system. In this presentation, the question at hand will be: how does the Black-Scholes-Merton model, as an epistemic imaginary, engage with, shape, and potentially disrupt the world as both imaginative bricolage and objective reality?
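For readers who want the model under discussion in concrete form, the standard closed-form Black-Scholes-Merton price of a European call can be sketched in a few lines of Python. This is the textbook formula under its textbook assumptions (geometric Brownian motion, constant volatility and rate, no dividends); the sample inputs are arbitrary.

```python
# Closed-form Black-Scholes-Merton price of a European call option.
# Textbook assumptions: geometric Brownian motion, constant volatility
# and risk-free rate, no dividends.
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal cumulative distribution, via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bsm_call(S, K, T, r, sigma):
    """Call price: spot S, strike K, maturity T in years,
    risk-free rate r, volatility sigma."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# An at-the-money call on arbitrary illustrative numbers.
price = bsm_call(S=100.0, K=100.0, T=1.0, r=0.05, sigma=0.2)
```

The model's performativity, in MacKenzie's sense, lies precisely in the fact that once traders priced options with formulas like this one, observed market prices moved closer to what the formula predicted.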

Session 8 (G) Woodruff A
Animal Subjects

Jeff Karnicky.
Who’s Hailing Whom? Individuals, Populations, and Multispecies Relationships
In When Species Meet, Donna Haraway describes the “finite, demanding, affective, and cognitive claims” that animals and humans make on one another. This paper will examine how these claims play out in relationships between people, birds, and other animals. In describing her community’s efforts to sterilize, vaccinate, feed, and name a group of feral cats, Haraway writes “I don’t care when I see Steller’s Jays feathers littering [the cats’] hunting grounds” because the “avian population” of jays is not threatened by the cats. This specific example shows a preference for the individualized cats over the undifferentiated population of birds. In this relationship, the individual eats the population. This paper will ask if the embodied specificity of multispecies relationships relies on individual subjects. I will also explore how these relationships might reconfigure the formation of individual human and non-human subjects as discussed by Haraway, Derrida, and the authors of biographies of individual birds such as Rosie: My Rufous Hummingbird and Wesley the Owl.

Marcel O'Gorman.
Animals, Evocative Objects, and other "Things": On Posthumanism and Applied Media Theory
Over the past decade, posthumanist discourse has undergone a conspicuous shift in focus away from the cyborg (Haraway) and toward the question of the animal (Wolfe). This shift, which is documented in the archive of SLS conference papers, marks a movement away from the critical study of technology and towards more phenomenological investigations rooted in environmental concerns. This discursive turn is about to come full circle as “thing studies” assert themselves more strongly in posthumanist theory (Harman). This presentation will examine the shifting orientation in posthumanist theory from the cyborg to the animal to the inorganic “thing,” in order to revive the posthumanist question concerning technology and to foreground a new research methodology that I have called Applied Media Theory. Applied Media Theory (AMT) builds on my work in _E-Crit_ (U of Toronto P, 2006) to demonstrate how media theorists can engage directly in technological production as a component of their critical praxis. Taking a cue from Sherry Turkle and Don Ihde, AMT is rooted in the concept of the “evocative object” or “epistemology engine.” However, unlike Turkle and Ihde, I suggest that media theorists must learn to invent and encode their own technological objects-to-think-with, rather than merely pulling them off the shelf as objects to be decoded. It is only by engaging at the R&D level of technological production that media theory will converge with technoscience in the creation of mindful technologies informed by a critical perspective. I will conclude the presentation by examining AMT projects currently underway in the Critical Media Lab at the University of Waterloo, including “Cycle of Dread” and “OncoGeiger.”

Sarah E. McFarland.
Decoding a Radical Animal Subjectivity
I begin inspired by Levinas’ suggestion that it is by encountering the face of the other that one calls into question one’s own subjectness; that what matters is not the particularity of the other so much as what that other can do: the way that the sheer otherness of the other momentarily shatters my labelizing, my being in the world, my very existence. Unfortunately, Levinas explicitly excludes non-human animals from the categories of beings that have a face, which creates a problem for discussions of animal ethics and non-human subjectivity. However, Derrida confronts this problem when he finds his naked body being gazed upon by a cat and feels a malaise brought on by his inarticulable recognition of the limits of his subjectness. Non-human animals compel us to address the vulnerabilities we share as living, mortal beings, and I attempt here to move the discussion away from vague theoretical abstractions and into real, concrete behaviors.

Session 8 (H) Woodruff B
Math in Theory I
Aden Evens
Though mathematics has been a part of humanistic research for thousands of years, the increasing specialization and balkanization of disciplines has divided math from the humanities, leaving relatively few scholars committed to the intersection of these sets of ideas. These two panels present papers that engage with mathematics as history and philosophy, amplifying a recent surge of attention to this subject matter.

Arielle Saiber.
Math and the Alphabet: Renaissance Theories on the Geometry of Language’s Building Blocks
Mathematician Luca Pacioli wrote in 1498 that “without the two mathematical lines (the curved and the straight)… it is not possible to do anything well” (De divina proportione, published 1509).  In this compendium of Renaissance thought (illustrated by none other than Leonardo da Vinci) on the nature of the golden section, regular and irregular polyhedrons, artificial perspective, and Vitruvian proportions of architecture, Pacioli includes a section on the architectonics of lettering.  Inspired by humanist investigations into the proportions of Roman capital letters engraved in lapidary inscriptions and monuments, Pacioli constructs an alphabet of “divine” capital letters, which he intended to be used not only by stone-cutters, but by punch-cutters, goldsmiths, scribes, and even miniaturists.  Pacioli’s alphabet—along with those of Feliciano, Alberti, Torniello, Dürer, Tory, and numerous others—provides a fascinating window onto a time in which geometry was primarily Euclidean; notations for mathematical functions were far from standardized and mainly written in words; and proportions both divine and human (that is, those of the human body) were sought in the natural world and intentionally reproduced in the man-made.  In this period, the literal shape of language was changing from medieval scripts to the more rapidly-producible chancery, and from the calligraphic to the typographic.  Pacioli’s treatise emblematizes the important role mathematics plays in the physical act of writing, and the importance of mathematics to the very atomic structure of language.

Sha Xin Wei.
Morphogenesis: The Principle of Least Action (Leibniz, Whitehead, Alexander)
Given a set of paths in a space of configurations of a dynamical system, and a way to define a total "action" along any path in that set, the Principle of Least Action states that the dynamical system will evolve along the path that minimizes the action. This principle, or more accurately this family of principles, underpins not only all of classical physics, but also certain moments in process ontology, notably in Leibniz, Whitehead, and Christopher Alexander. Alexander writes: “wholeness, as a structure of symmetries and centers, evolves in time, and will always have a natural dynamic of such a nature that as many as possible of these symmetries (and especially some of the larger ones) are preserved as the system moves forward in time. As the system evolves, it destroys as little as possible.” This phrase, as little as possible, although it refers to symmetries, calls for a non-energetic principle of least action. A non-energetic, non-metric least action principle may seem difficult to conceive, but it is possible, I believe. In this presentation, I explore how the calculus of variations, the least action principle, and the primordial concept of differential inform these philosophies of process that in turn infused Deleuze and Guattari.
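For orientation, the standard energetic form of the principle, the one the paper proposes to move beyond, can be stated compactly: the action is an integral of a Lagrangian over a path, and the physical path makes it stationary, yielding the Euler-Lagrange equation.

```latex
% Action along a path q(t) with Lagrangian L:
S[q] = \int_{t_0}^{t_1} L\bigl(q(t), \dot{q}(t), t\bigr)\, dt
% The dynamical path is the one that makes S stationary,
% \delta S = 0, which gives the Euler-Lagrange equation:
\frac{d}{dt}\frac{\partial L}{\partial \dot{q}} - \frac{\partial L}{\partial q} = 0
```

Alexander's non-energetic variant has no such formula; the point of the comparison is that his "destroys as little as possible" plays the structural role that the stationary action $\delta S = 0$ plays in physics.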

Shankar Raman.
Shakespeare's Merchant and the Algebra of Law
The endeavours of a range of early modern thinkers -- especially Cardano, Viète, and Descartes -- resulted in a properly symbolic mathematics, that is, in forms of abstraction that generalised the concept of number and allowed symbols to be manipulated as if they were numbers (see Klein 1968, Hadden 1994). This fundamental shift is correlated with crucial changes in economics and law: in all three domains, scholars were moved to represent things in their merely potential determinateness -- be they commodities, people, or algebraic unknowns. Shakespeare's The Merchant of Venice renders luminous this nexus. Relating the play's legal and economic manoeuvres to the logic of algebraic manipulations, I argue that its interest in unknown things and unknowability perhaps marks the very constitution of personhood and "privacy" as an indeterminacy that is always only made determinate.

Session 9 - Sat 3:30pm - 5pm

Session 9 (A) Bennett
Artmaking as an Imaginary Solution: Alfred Jarry as an Intellectual Source for Twentieth Century Art
Chair: Peter Mowris
The writer Alfred Jarry (1873-1907) is best known for his notorious play Ubu Roi (1896), but an equally powerful source of interest for artists who were interested in Jarry’s work was (and still is) his notion of ‘pataphysics, or what Jarry called a “science of imaginary solutions.” This approach, though mentioned in Ubu Roi, received full treatment in Jarry’s posthumously published masterpiece of science fiction, Gestes et Opinions du Dr. Faustroll (1911). Scholars of Jarry’s work have painstakingly established how many of his most famous texts use the scientific theories of his day, almost verbatim, as the intellectual sources for his fantastical stories. In this manner of borrowing, Jarry made the boundaries between science, literature, truth, and imagination into inconsequential conventions, or, as Jarry called them, “unexceptional exceptions.” Scholars have barely begun to plot the extent to which artists drew inspiration from Jarry for their diverse approaches to science. The panelists offer focused studies from a variety of contexts that will establish the prevalence of ‘pataphysics as an intellectual source for artmaking in the twentieth century. Overall, these scholars will explore what happens when artists interested in science approach it through alternative channels and are inspired by a writer who portrayed all scientific discourse as a series of imaginary solutions.

Fae Brauer.
Jarry and Picasso: The Pataphysics of the Cubist Pasted Papers
When the word “Merdre!” (shit!) opened Alfred Jarry’s Ubu Roi at the Théâtre de l’Œuvre, Pablo Picasso was seeking an equally explosive art form amongst the Anarchist poets and painters at Els Quatre Gats in Barcelona. Seven years later, when his closest comrades in Paris met Jarry at the Closerie des Lilas, Picasso was introduced to Jarry’s Ubuesque œuvre, including his concealed gelignite. As fervent an admirer as Guillaume Apollinaire, André Salmon, and Max Jacob, every Tuesday Picasso would walk across Paris to the weekly reunions of Paul Fort’s Vers et Prose to hear snippets of Jarry’s plays Ubu Cuckolded and Ubu Bound, as well as such prose as The Supermale and The Passion Considered as an Uphill Bicycle Race. On Jarry’s death in November 1907, Picasso acquired his manuscripts and legendary pistol, using it in not dissimilar ways to define his aesthetic position. Yet, as this paper will reveal, it was not until the 1911 publication of Jarry’s 1898 novel Exploits and Opinions of Dr. Faustroll, Pataphysician that Picasso began to explore Jarry’s philosophy of pataphysics and cultivation of paradox, as manifested in his Cubist pasted papers. Consistent with Jarry’s anti-aesthetic of “l’horriquement beau” and his use of shabby costumes and cardboard heads for his actors, Picasso appended onto his picture plane the peeling paper of billboards and wallpaper, as well as torn strips of newspaper. During this period of neoregulationism, when absinthe was criminalized, alcohol became as much a subversive leitmotif in Picasso’s Cubist iconography as it was in Jarry’s prose. At this time in the French Radical Republic, when fitness became a national imperative, La Vie sportive and L’Education Physique became as much a subject of parody for Picasso as it was for Jarry’s Supermale, obsessed with bicycling faster than trains and maximizing orgasm in order to set new world records.
However, as this paper will demonstrate, it was the farcical games Picasso played with accident, contradiction, inversion and negation that brought him closest to Jarry’s “science of imaginary solutions … extending as far beyond metaphysics as the latter extends beyond physics.”    

Michael R. Taylor.
Pataphysics in Philadelphia: The Strange Case of James E. Brewton
In the early 1960s a small group of Philadelphia artists began to explore the iconoclastic writings and ideas of Alfred Jarry. Their number included Thomas Chimes, Jim McWilliams, and James E. Brewton, all of whom had begun to incorporate references to Jarry’s Ubu Roi and Exploits and Opinions of Doctor Faustroll, Pataphysician in their recent paintings, sculptures, and mixed-media assemblages. Whereas Chimes and McWilliams drew their inspiration from the What is Pataphysics? issue of the Evergreen Review, which appeared in 1960, Brewton learned of the French writer through the Danish artists Asger Jorn and Erik Nyholm, who had explored Jarry’s “science of imaginary solutions” in their own paintings of the 1950s and early 1960s. In his own art works, which were largely inspired by the adventures of Doctor Faustroll, Brewton fused the gestural abstractions of Jorn and Nyholm with the graffiti that he found and admired in urban Philadelphia, to create paintings, prints, and constructions that he called Graffiti Pataphysic. In 1964, Brewton founded the J.E. Brewton Institute of Comparative Vandalism, which was intended as an alternative to the esoteric and intensely private Collège de ‘Pataphysique in Paris, and in the same year published The Pataphysics Times and a “Patacomic” book based upon Jarry’s Le Surmale. These activities brought Brewton to the attention of other Jarry-obsessed artists working in Philadelphia at that time, many of whom were teaching at the Philadelphia College of Art. In 1967, this group organized a Pataphysical exhibition of their art works at the Socrates Perakis Art Gallery in Philadelphia, which attracted hundreds of people to its midnight opening on May 15. Unfortunately, Brewton shot himself three days before the opening night, overshadowing the event and prematurely ending the group's activities. Nonetheless, they deserve to be recognized in the growing literature on Jarry’s impact on the visual arts in the 20th century.

Peter Mowris.
How to Make Water: Playful Science in the Collaborative Artmaking of Max Ernst and Hans Arp, c. 1920
In 1920, Max Ernst and Hans Arp embarked on a Dadaist venture of artmaking that they called FaTaGaGa, an abbreviation for "Fabrication of guaranteed gasometric pictures." The fulsome title makes passing reference to gasometry, a field pioneered by the French scientist Antoine Lavoisier (1743-1794). References in this series to Lavoisier's inventions are rather detailed, which makes one wonder why these two artists were so interested in such an obscure figure. A closer look at the imagery and references in these works suggests that Ernst and Arp were interested in constructing a pataphysical dimension in their collages that stands as a critical reaction to the common usage of thermodynamics in gasometry and in physiological psychology, a branch of psychology that sought to explain mental function by reference to physiological nervous function. The functionality of the chemical world had a deep relation to the supposed rationality of nerve-based mental operation, because it was believed that the nervous system followed the laws of thermodynamics in its function as the basis of thought. The common usage of thermodynamics in both these fields unfolded with the express intention of developing laws for both matter and mind. Ernst and Arp, by contrast, took up Jarry's dictum that any law is a correlation of exceptions. They embraced several overlapping techniques of experimental collaboration borrowed from the context of modern dance as a way to create their own imaginary solutions of artistic practice, calling into question the supposedly objectivist model of moving and thinking in discourses of physiological psychology.

Session 9 (B) Crescent

Spencer Schaffner.
Unintelligible Writing at the Edge of Meaning
This is a multimedia presentation consisting of twenty minutes of video featuring a dozen media objects (video, kinetic images, still images, text, audio, and html). The presentation will focus on how deviant writing is understood in science. In addition, the presentation will show how unintelligible writing is used in contemporary visual and performance art. The artists Zhang Huang, Kunizo Matsumoto, and Hayes Henderson all deploy unintelligible writing in their work, simultaneously referencing linguistic intelligibility and undermining it. Many disciplines that are committed to the decoding of written language (composition/rhetoric, linguistics, English language learning, English, textual studies) take it as axiomatic that written language must be orthographically, grammatically, and visually intelligible in order to make meaning. And yet, these artists situate themselves in a rival tradition which uses written expression (drawing from the asemic writing by Henri Michaux in the 1920s and '30s) to signify a-grammatically. In this multi-media presentation, I explore how such artists make meaning with unintelligible writing forms, contextualizing their work amid several artistic and intellectual genealogies. I also explain why unintelligible writing of this kind should be taken seriously. In particular, I argue that unintelligible writing of this kind demonstrates how linguistic decoding itself can function as a signifier of the apparatus of written language even when decoding such writing is impossible.

Kinga Araya.
Decoding Impossible
As a practicing artist and critical thinker who has been using analogue and digital media to express my artistic ideas concerning personal and cultural displacements, I would like to present a selection of my art works that comment on the formal and conceptual impossibility of “translating” one system into another. Drawing first on the scientific definition of decoding, I will examine theoretical and aesthetic concepts of decoding present, for example, in deconstructive thought. I am very intrigued by the word “decoding,” which seems to promise “the reverse process converting encoded data back into information understandable by a receiver” (Wikipedia). I believe I have been “encoding” and “decoding” myself since I left Poland in 1988 while translating one political and cultural system into another. I wonder what happens in situations where neither the receiver nor the sender understands “the decoded message.” Some of my digital video and audio projects explore this fascinating yet paradoxical new media phenomenon.

Michael Hancock.
“Your Body in Big Letters: Textually Decoding Body Space”
To an extent, text, in the form of the printed word, has always been a site for decoding the body, whether in literary representation or in medical treatises. At the same time, it is also a vehicle that carries the body from a physical form to a representational one. In “The Last Vehicle,” Paul Virilio states that the end of the twentieth century seemed to “herald the next vehicle, the audiovisual one, a final mutation: static vehicle, substitute for the change of physical location” (112). As an example, words entered on a screen transition the self into cyberspace; though the body remains in the same physical place, a mental movement has occurred. And like other vehicles, the same text can take the body to very different conceptual destinations. In my paper, I intend to demonstrate how two similar textual starting points bring the body to two very different, almost binary positions, and what is at stake in this opposition. Human DNA is often referred to, in shorthand, by the letters representing the four bases that compose it: A, G, C, and T. In essence, then, an entire body can be encoded in a long sequence of letters. Previously, I was involved in a project that used operations of deletion and insertion on formal languages to simulate genetic molecules. In theory, we would eventually be able to take a strand of DNA, convert it into its textual form, and then go a step beyond that, converting the string of letters into an algebraic form. Text, then, served as a site of both encoding and decoding, allowing the transition from a physical space into an abstract one. Starting from a similar position, installation artist Camille Utterback uses DNA text to end in a wildly different place. In Drawing From Life, Utterback takes a live video of the viewers/participants and converts their image into a human-shaped collection of the DNA letters. Utterback’s purpose in the installation was to raise questions in the minds of the visitors: “Am I more than my DNA? Does my DNA define me?” Again, we have an encoding and consequent decoding of the body created through a transition into text. The difference between the two projects lies in the underlying assumptions that inform their respective ends: my project assumed that, through DNA text, we could reach a simplified, orderly view of the body expressible in mathematical form. Utterback’s project, in contrast, with its randomly changing letters, attempts to hint “at the vitality and chaos of life itself.” At stake, potentially, is the nature of the body: is it reducible to mathematical understanding, or does it ultimately resist or even transcend such articulations? Using Mary Morgan and Margaret Morrison’s theories on scientific modeling, and theories on the body by Gail Weiss, I intend to demonstrate how these two projects utilize DNA decoding to pursue opposing interpretations of body space and what the nature of the vehicle—the DNA text—implies about the directions they are traveling.
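The deletion and insertion operations on DNA-as-text that the abstract mentions can be sketched in a few lines. The function names and example sequence below are illustrative only, not drawn from the project described:

```python
# Sketch: DNA as a string over the alphabet {A, C, G, T}, with the two
# formal-language operations the abstract names. Names are hypothetical.

def delete(seq: str, pos: int, length: int = 1) -> str:
    """Remove `length` bases starting at index `pos`."""
    return seq[:pos] + seq[pos + length:]

def insert(seq: str, pos: int, fragment: str) -> str:
    """Insert `fragment` before index `pos`."""
    return seq[:pos] + fragment + seq[pos:]

dna = "AGCTTAGC"
mutated = insert(delete(dna, 2, 2), 2, "GG")  # delete "CT", insert "GG"
print(mutated)  # AGGGTAGC
```

Composing such operations turns a strand-as-string into new strings, which is the sense in which text serves as a site of both encoding and decoding.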

Ted Kafala.
Decoding the Digital Arts: Loops and Series in the Narrative and Performance of Computer Media
Interactive video pieces and computer music performances reflect and illustrate the tension between rational and irrational processes, between discrete and continuous data streams, or between coded instructions and the decoded modes of reception that underlie and surround creative work. This paper explores how coded, generative, serial, iterative, and pseudorandom processes in experimental new media (interactive art and music), which are incorporated into performance pieces by artists and programmers, create engaging and unpredictable experiences among audiences and observers. It also contrasts expressive mouse-click or key-press events executed by the “user” that effect diegesis with repetitive loops as formalist devices inherent in the generative digital arts. Sean Cubitt (1998) points out that while traditional video montage attempted to reveal deep structures in the world, computer media create a kind of “Boolean montage” by which systems of representation are renditions of digital code in culturally specified forms. In contrast, Gilles Deleuze and other constructivists argue that complexity in new media does not correspond to expanded cybernetic hardware in the brain, or to any kind of “coded” or algorithmic function; in other words, media cannot be interpreted in completely computational terms. Deleuze (1989) suggests that the cerebral, topological, and probabilistic space of interactive new media is where the rhizome/brain/thought comes alive. Beginning with the concept of a “noematic surface” for coding/decoding digital arts and media, where the creative programming of the artist meets the embodied experience of the observer, I examine how seemingly rational, deterministic algorithms unfold in unexpected, nondeterministic, and random variations of moving images and music in the “performance mode.”
Artists and composers who want to aesthetically and conceptually express alterations between randomness and serially generative patterns in digital art pieces as they are “decoded” to audiences must rely on particular algorithmic methods in program “code”. The seemingly random and limitless variations in the unfolding/decoding of performative and stylistic variations that respond to user/observer input actually rely on particular algorithmic methods, such as pseudorandom number generators, Markov processes with varying outcomes, and other types of both looping (iterative) and discontinuous pattern generation. I draw on Internet art to provide some working examples, and discuss the creative possibilities.
Selected references: Alpine, K. “Making Music with Algorithms.” Computer Music Journal 23:2 (1990), 19-30. Cubitt, S. Digital Aesthetics. London: Sage Publications, 1998. Deleuze, G. The Fold: Leibniz and the Baroque. Minneapolis: University of Minnesota Press, 1993. Deleuze, G. The Logic of Sense. New York: Columbia University Press, 1990. Deleuze, G. Cinema 2: The Time-Image. London: Athlone Press, 1989. Fishwick, P., ed. Aesthetic Computing. Cambridge, MA: MIT Press, 2006. Hansen, M. New Philosophy for New Media. Cambridge, MA: MIT Press, 2006. Marks, L. Touch: Sensuous Theory and Multisensory Media. Minneapolis: University of Minnesota Press, 2002. Rodowick, D. Reading the Figural, or Philosophy after New Media. Durham: Duke University Press, 2001. Taube, H. Notes from the Metalevel. London: Taylor & Francis, 2004.
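One of the algorithmic methods named above, a Markov process driven by a seeded pseudorandom number generator, can be sketched minimally; the transition table and note names below are invented for illustration, not taken from the paper:

```python
import random

# Minimal Markov-chain melody generator: each note's successor is drawn
# pseudorandomly from a transition table. Seeding the generator makes the
# "random" unfolding exactly repeatable from performance to performance.
transitions = {
    "C": ["E", "G", "C"],
    "E": ["G", "C"],
    "G": ["C", "E", "G"],
}

def generate(start: str, length: int, seed: int) -> list:
    rng = random.Random(seed)  # pseudorandom: deterministic given the seed
    melody = [start]
    for _ in range(length - 1):
        melody.append(rng.choice(transitions[melody[-1]]))
    return melody

print(generate("C", 8, seed=42))  # same seed, same "random" melody every run
```

This is the sense in which seemingly limitless variation rests on deterministic code: change the seed and a new sequence unfolds; keep it and the piece replays exactly.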

Session 9 (C) Doggett
Decodings from the Wasatch Front
Michaela Giesenkirchen
The panel will introduce the SLSA community to three interrelated creative projects from the Salt Lake avant-garde: two independent films and one body of performance poetry. All three projects are inspired by the tensions within a deeply coded regional society that fosters both conventional doctrine and radical counter-culture. They explore, respectively, Mormon culture as a network of codes intrinsically akin to science fiction, and the possibility of a posthumanist decoding—and liberation—of sound through poetry. The panel contains only two presentations because both require the showing of movie clips and are designed to be interactive.

Michaela Giesenkirchen; Nancy Rushforth.
“Decoding Mormonism: the Science Fiction of Plan 10 from Outer Space”
The Utah cult movie Plan 10 from Outer Space, a low-budget sci-fi comedy by Trent Harris featuring Karen Black, won recognition at Sundance and the Grand Prize at Raindance in London in 1995. In the quest-for-the-Holy-Grail genre, the movie explores the ritual codes of the Mormon religion as well as technological and imaginative possibilities of decoding them. With the help of a descendant of Joseph Smith, the heroine investigates these codes’ possible origin in an alien world from outer space and their prediction of an imminent apocalypse in the form of an alien invasion. A sympathetic parody of contemporary Mormon culture and its historical heritage, the movie presents us with a postmodern anatomy of religion’s general function to encode human realities, and asks whether all cultural beliefs might be considered a form of science fiction.

Torben Bernhard; Travis Low.
“Sonosopher: The Physics and Metaphysics of Sound”
The sonosopher in the title of this presentation is the post-Beat, neo-Dada performance poet Alex Caldiero, whose work is briefly featured in Plan 10 from Outer Space. Born in Sicily and raised in Brooklyn, Caldiero has for the last thirty years been an institution both on the Wasatch Front and in New York. He has recently become the subject of Sonosopher, a feature-length documentary by student filmmakers Travis Low and Torben Bernhard. Drawing on the film, we will introduce our audience to Caldiero’s profound experimentation with the physical and transcendental qualities of sound at the margins of intelligibility. Philosophical aspects embodied in Caldiero’s work include the relationship between sound and space, sound and synaesthesia, and animate and inanimate sound; encodings of sound in language and decodings of sound in poetry; the life and death of sound in human and non-human bodies; and the afterlife of sound and language in the physical reality of silence. Sonosopher also explores the limits of the technology and formal devices of filmmaking in sounding out a life of synaesthetic performance.

Session 9 (D) Mitchell
Math in Theory II
Aden Evens
Though mathematics has been a part of humanistic research for thousands of years, the increasing specialization and balkanization of disciplines has divided math from the humanities, leaving relatively few scholars committed to the intersection of these sets of ideas. These two panels present papers that engage with mathematics as history and philosophy, amplifying a recent surge of attention to this subject matter.

Peggy Reynolds.
Boundary Disruptions: Agency in a Fractal World
Geometries can be thought of as abstract models of the body/environment interface. For example, the angles, vertices, and circumferences of Euclidean geometry reflect not the way the world is but rather the way humans see, that is, through a curved lens, a triangulated field of vision, and an orthogonal frame of reference. Its idealized forms, however, do not acknowledge their origins in this entangled body/environment interface, but rather project the existence of a static universal order. Fractal geometry, by contrast, more readily admits of its genesis in this entangled interface by making explicit the connection between its articulations and the actions of the one-who-measures (as demonstrated in the example of the length of the coastline of Britain being dependent upon the unit of measure employed). It folds the objective observer, as given by Euclidean geometry, into its self-reflexive, iterative processes, disrupting dualist notions of subject/object, inside/outside, and real/ideal, and refocuses attention on relationship rather than “thingness.” Notions of boundaries and scale begin to take on a relative rather than an absolute quality within this nonlinear, topological register, thus creating new possibilities for the situating of subjectivity and the theorizing of agency.
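The coastline example can be made concrete with a few lines of arithmetic. For an idealized fractal coastline such as the Koch curve, shrinking the measuring unit by a factor of 3 multiplies the measured length by 4/3, so "the length" depends on the ruler. A small sketch, not part of the paper:

```python
# Measured length of the Koch curve as the ruler shrinks.
# At refinement level n the ruler is (1/3)**n of the base segment and each
# segment has been replaced by 4 segments one third as long, so the measured
# length is (4/3)**n: it grows without bound as the ruler shrinks, which is
# Mandelbrot's point about the coastline of Britain.

def koch_length(n: int, base: float = 1.0) -> float:
    ruler = base / 3**n      # unit of measure at refinement level n
    segments = 4**n          # number of ruler-length segments traversed
    return segments * ruler

for n in range(5):
    print(n, round(koch_length(n), 4))
```

Each halving-of-scale step reveals more wiggles to traverse, so no single number is "the" length; only the scaling relation between ruler and measurement is invariant.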

Arkady Plotnitsky.
Categorical Imperatives, from Grothendieck to Badiou: Topos Theory, Politics, and Ethics
Arguably the single most significant aspect of Alain Badiou’s thought, distinguishing it from that of most of his French contemporaries whose work has impacted current theoretical discussions, is his grounding of his philosophy in mathematical ontology—the set-theoretical ontology of his early work and then the ontology defined by Alexandre Grothendieck’s ‘topos theory,’ based on ‘category theory.’ This paper explores the topos-theoretical ontology as the ontology of the irreducibly multiple, and the philosophical, political, and ethical implications of this multiplicity, which are the main concern of Badiou’s work. This ontology, a “subtractive ontology” in Badiou’s terms, may in particular be juxtaposed with Kant’s ontology and with Kant’s ethical thought, as manifest in his concept of the categorical imperative. According to Badiou, the topos-oriented ontology of the multiple makes all ethics irreducibly multiple and, hence, irreducibly political. This political determination of the ethical as irreducibly multiple might be said to be the philosophical “categorical imperative” of Badiou’s topos-theoretical ontology, which compels him to move against Kant’s ethics of the categorical imperative and the tradition that follows it, extending, in particular, to Levinas and his followers. My first aim in this paper, then, is to show how this ontology is specifically shaped by the mathematical ontology (there is indeed no other ontology, according to Badiou) of Grothendieck’s topos theory, and to explore the implications of these connections for Badiou’s thinking and for thinking in general—mathematical, philosophical, political, and ethical.
Secondly, however, I shall also argue that, given that these modalities of thought are not only heterogeneous but also interactive, the ethical-political aspects of the ontology of the multiple in question have in turn important implications for mathematical thought and practice, and this paper will address these implications as well.

» Paper and slides

Brian Rotman.
Mathematics and Gesture
“That bodies speak has been known for a long time.” (Deleuze) More fundamentally, bodies gesture. For some (Merleau-Ponty, Wittgenstein) the difference vanishes: speech is seen as gesture. Certainly, gesture has multiple intimacies with spoken language, not merely in the obvious and vital sense that utterance consists of gestures, but as a mode of signifying and thought-production. As such, gesture precedes (evolutionarily and developmentally), alternates with (emblems), accompanies (gesticulation), gags (Agamben), and is embedded inside (tone) speech. Indeed, any kind of sense-making -- film, architecture, theatre, dance, music -- is likely to exhibit significant liaisons with gesture. Mathematics, though not speech, operates through (written) languages, and it too is ‘spoken’ by the body, gestured into being. Such is the contention of Gilles Châtelet, whose analysis of the mathematization of space locates gesture and its diagrammatic capture as the source of mathematical thought. The founding moves of geometry and arithmetic, together with their (written) symbols, bear out the idea that it is impossible to think mathematical thought in the absence of gesture. What consequences -- ontological, semiotic -- might flow from the fact that mathematics conceals within its ‘purely abstract’, ‘disembodied’ formalisms the dynamic work of the body?

Session 9 (E) Whitman A
Coding Originality

Grant Maxwell.
“Heavy on the Borderline”: Decoding Bob Dylan through the Evolution of World Views
Dylan's late Sixties work partially mediated a transformation of consciousness which culminated the counter-culture’s multivalent trajectory, enacting a rupture in the larger arc of modernity. Applying the theoretical orientation of William James, Bergson, and Whitehead, I will show that Dylan's evolution from the mid-Sixties to the increasingly mature encodings of his late-Sixties records traces a movement from the reaction against the modern privileging of intellect to a more embodied and sustainable ethos, exemplified by Dylan's embrace of Country music and concretized in his migration from Greenwich Village to Woodstock. I will decode the evolution of Dylan’s personal mythography from his richly symbolic 1966 motorcycle crash, which seems to have impelled an ego-death and rebirth into a novel stage of emergent order, presaging and fractally reiterating the psychedelic self-immolation of the counter-culture a few years later in the wake of the Woodstock Festival peak experience, which Dylan impelled, paradoxically, via his absence.

Lindsey Joyce.
Rhythm Scientist or Musical Thief: an Analytical Interview which Questions the Plagiarism of Sound and the Agency of Influence in DJ Culture
In the book Rhythm Science, Paul Miller proposes that the DJ is an artist who remixes past art, culture, and ideals into a new idea, a new concept, and a new vision for the future. The true art is that the DJ manipulates present culture by recycling the cultural output that already exists in the world; he regenerates and recreates culture all at once. But the DJ’s reuse of materials created by other musicians and artists raises ethical questions of theft, plagiarism, and concepts of creativity and originality. This analytical interview with DJ AC Slater seeks to consider, first, how he, as a DJ, defines what a DJ does; second, how he defines “originality” in music and whether reuse or remixing of music negates that concept of originality; and third, whether he impacts culture and whether it is his primary goal to do so.

Session 9 (F) Whitman B
Visual Decodings II

Anat Pollack.
Quantum Populi
This paper covers “Quantum Populi,” an artwork that utilizes scientific and digital technologies in order to simulate living organic systems, thereby creating visual poetic analogies to human and animal social ecologies. A functional habitat includes various robots, equipped with independent yet complementary artificial intelligence activated by the audience. A sculptural artwork that is a scalable simulation of biodiverse systems and social networks in the natural world serves as a metaphor for collective systems: how individuals form themselves into a community, producing non-predictable phenomena like social behavior and intelligent action, and in doing so, develop seemingly closed systems whereby the individual components are only seen in relation to the other components. This artwork exists at the forefront of groundbreaking research in the field of Electronic Arts.

Patrick LeMieux.
Lossy Subjectivity: Self-destructing Codeworks after René Magritte
Belief in the ostensibly fixed, functional, and causal relationship between code and output in digital media productions constructs boundaries or limit points within both digital media criticism and production. Assuming an intrinsic relationship between code and output, a relationship which can be likened to pre-structuralist assumptions about the fundamental connection between signifier and signified, severely limits any possibility of interpretation. Following Michel Foucault’s art criticism, this paper attempts to decouple code and output in order to carve out an interpretive zone in which to better understand the semiotic play at work between different orders of textuality in digital media. In order to further investigate the semiotic play between code and output, this paper follows a trail of scholarship surrounding René Magritte’s now classic 1928-29 painting _La trahison des images_, or _The Treachery of Images_. A method for reading digital media productions emerges by tracing a slowly decomposing relationship between language and images in Scott McCloud’s reductive materialism in _Understanding Comics_ (1993), Henning Pohl’s ostensibly definitive code-space _La trahison des images numériques_ (2009), Douglas R. Hofstadter’s clever calligrammatic sketches in _Gödel, Escher, Bach: an Eternal Golden Braid_ (1979), and Michel Foucault’s five-part procedural analysis in _This is Not a Pipe_ (1973), inspired in part by Guillaume Apollinaire’s calligramme _Fumées_ (1914). The lessons learned from the discourses surrounding Magritte’s painting can be applied to a highly theorized digital media artwork by Joan Heemskerk and Dirk Paesmans (Jodi) in which meaning is produced specifically through the exchange between code and output. This paper picks up where Alan Sondheim, Peter Lunenfeld, John Cayley, McKenzie Wark, Alan Liu, and C. T. Funkhouser each end their discussion of the website.
In these readings Jodi non-coincidentally serves as the culmination of these authors’ theories—the terminal example with which they conclude their writings. Jodi figures as the punch line or period at the end of each particular work. The goal of this essay is to convert that period into an ellipsis and to draw attention to certain assumptions these theorists make regarding the relationship between code and output. Instead of reading narrative causality between code and output, this paper performs a kind of Foucauldian reading which emphasizes the disconnect between these two orders. Rather than reading various windows of a web browser sequentially in order to construct narrative frames, linguistic signs and plastic elements collapse into simultaneous and, more importantly, discontinuous arrays.

Andréia Oliveira.
Simondon’s Contributions to Digital Art: The Potential of the Associative Milieu between Dissimilar Technologies
This paper will analyze the process of technicity within digital art, especially within interactive video installations. It aims at problematizing the decoding processes between analog and digital technologies through the process of technicity. It is based on four of Simondon’s key concepts: the element, the associative milieu, transduction, and information. We will explore how the element inhabits the plane of pre-individuation, and the function of minor elements such as repetitive gestures or actions and their interchangeable nature between various technologies. The process of adaptation between elements and their associative milieu will be considered as an emergent spacetime where neither the elements nor the associative milieu is prefigured prior to the process, and where the result is information, not form, as it is translated from one technology to another through the process of transduction. Keywords: element, associative milieu, transduction, information, digital art.

Session 9 (G) Woodruff A
Symbiogenesis and Evolutionary Code

Victoria N. Alexander.
The Problem of Authorship and Evolution by Natural Selection
Artists and others have long resisted Darwin's revolution on the grounds that natural selection does not explain evolution, a theory of which must include a theory of creativity. In early 20th-century biology, there were still many vocal and powerful dissenters: William Bateson and C. H. Waddington (also a painter and a poet), Richard Goldschmidt, and D'Arcy Thompson, who were heirs to 19th-century teleomechanists and morphologists such as von Baer, Mivart, Owen, Muller, and Geoffroy. Repressed in the 1950s during the hardening of the Modern Synthesis, ideas about evolutionary creativity and progress have bubbled up again. Saltationists have increased in number. Many of those calling themselves selectionists have actually strayed far from the fold insofar as their research shows that saltatory changes occur and the resulting organisms are immediately viable, making natural selection as a force of change superfluous. Now is precisely not the time for students of literature to start looking to Darwin for guidance nor to take up Literary Darwinism. Instead we should be focused on various other mechanisms (e.g. drift, neutrality, repetitive differentiation, orthogenesis, symbiosis), which, we may say, are at work in the process of authorship as well. I will give examples from Nabokov's Lolita and Henry James' A Small Boy and Others. This talk will be based on a paper I co-authored with Stanley Salthe. Keywords: complexity sciences, self-organization, non-Darwinian evolution, teleology

» Presentation slides

Carlos Castellanos; Diane Gromala.
The Symbiogenic Art Experience: A Phenomenological and Aesthetic Approach to the Question of Human-Machine Co-evolution
This paper outlines a research agenda that addresses the question of how contemporary interactive arts practice can evolve new strategies or ways of facilitating the development of subjective experiences that elicit an embodied, felt sense and awareness of our co-evolution with intelligent systems and digital technologies. In addition to providing a broad historical and philosophical context for our inquiry, we describe Protocol, an interactive artwork currently in progress through which we attempt to realize this new form of human-machine symbiosis.

Karl Zuelke.
Creative Becoming and the New Ecological Age
Pierre Teilhard de Chardin, geologist, paleontologist, philosopher, theologian, and Jesuit priest, famously stated in 1931 that, “For the observers of the Future, the greatest event will be the sudden appearance of a collective humane conscience and a human work to make.” Many critics have noted how this statement, when examined in light of Teilhard’s ideas about evolution and the emerging noosphere, seems to be predicting the Internet, that technological breakthrough which has enabled connection and communication on a scale undreamed-of only a generation ago. One critic notes that Jung gave us the collective unconscious, and now Teilhard’s vision has helped us begin to understand the unfurling of the “collective conscious” through which humankind deliberately participates in an ongoing state of universal creative becoming. One result of humankind’s awareness of and participation in the emerging noosphere is a resultant breakdown of the notion of human beings as the atomized, mechanistically interacting components of a social system, so that our enmeshment in a physical, biological, social and spiritual totality becomes more and more apparent. This has ramifications in the 21st century as humankind faces the consequences of having envisioned itself separate from and superior to the geosphere and biosphere out of which the noosphere has arisen and upon which it depends. This paper will pose the philosophy of Pierre Teilhard de Chardin next to humankind’s need for a reevaluation of its relationship to the earthly environment and Gaia, noting how Thomas Berry’s understanding of unfolding story, and Lynn Margulis and Dorian Sagan’s understanding of a second nature of technology and life converge, helping to propel the emerging paradigm through which humankind may propagate new narratives and thus continue participation in its own creation.

Session 9 (H) Woodruff B
Decoding Cognitive Process

Elizabeth Freudenthal.
Neurodiversity Activism and Non-Representational Language
In early 2007, Amanda Baggs posted on YouTube a short film depicting her autistic “native language,” a series of nonverbal somatic and sensory interactions with her environment. Her film attracted hundreds of thousands of viewers, reporters from Wired magazine and CNN, and the ire and enthusiasm of activists from all sides of the various medical and political debates about autism. However, most public discussion of this film—both positive and negative—misses Baggs’ most powerful and radical point: that her experience of language as non-representational, as anti-code, is the foundation of her human rights agenda. Using this film and the writing of other self-representing autism activists, this presentation will contextualize the neurodiversity movement in contemporary human rights discourse. I will show that Baggs' and others' particular disability rights agenda, by contesting the conventional definition of language as symbolic, radically challenges our definitions of “human” and our accompanying notions of human rights.

Elizabeth Donaldson.
Of Mice and Mental Illness: The Schizophrenic Mouse
What is a schizophrenic mouse? How is schizophrenia, this very human illness, the diagnosis of which lies beyond mere biological markers, manufactured or manifested, decoded and recoded, in the mouse? Considering the complex and contested history of the schizophrenia diagnosis, the very existence of a schizophrenic mouse model appears necessarily problematic. Recent developments in transgenic mouse models of schizophrenia, which rely on human genetic material, are ripe for analysis from both disability studies and animal studies perspectives. This project pulls together theories from both fields in order to examine the novel articulation of personhood that these new mouse models embody.

Alexandra Kleeman.
Whole Persons and Organic Holes: Encoding Personhood in Aphasic Autobiographies
Diverse fields of knowledge, from literature to medicine, have dealt with persons as whole entities that come into forms of being that are stable and continuous, yet continually threatened by dissolution. The whole person is described as an autonomous, self-sufficient physical and cognitive unit, differentiated from others and from its surround by embodying its own self-production. At the same time, the coherence granted to this whole person generates narratives of dissolution, damage and destruction: physical events such as death or torture have been viewed as forces of depersonification with the power to erase selves, linking the disappearance of persons and personalities to bodily crisis, pain, and the loss of language (Scarry 1985). It is in light of the presupposed coherence of the whole person that aphasics, individuals who have lost some portion of their language ability as a result of brain damage, have been figured in popular and medical literature as partial persons, persons irreparably damaged and reduced. When speech is viewed as the material of thought, speechlessness may be interpreted as a hole, situated at the precise location of the expressive self. On the other hand, refusing the premise that what we identify as a whole person has ever been capable of perfectly autonomous self-expression opens a space in the self into which other actors, other activity may enter. Much recent work in post-human science studies and actor-network theory has emphasized the work of distributed networks of human and material agents in creating entities that “hang together,” giving an appearance of organic unity despite the heterogeneity of their production (Latour and Woolgar 1986, Latour 2001, Mol 2002). This paper explores the role of autobiographies and memoirs in actively performing the personification of their subjects.
Written accounts and narratives are mobilized by physicians, relations, and patients themselves to address the demand that personhood be reasserted following language loss and brain damage. These accounts give shape to the person that is presented, influence which capacities are perceived by others, and at the same time are capable of describing a reconfigured personhood that does not conform to the established binary of organic whole, or damaged part. In addition, the context of life writing allows for the emergence of alternative temporalities of language production that are unavailable to the aphasic during real-time conversation, enabling utterances to be written down, edited, re-edited, and rearranged—ultimately taking on normatively coherent forms that support their recognition by others as competent speakers. Autobiographies are thus actors that intervene not only in the social sphere as guarantors and symbols of a person’s wholeness, but also within the subject, shaping experience, cognition, and self-figuration. As Donna Haraway writes, “We are not immediately present to ourselves. Self-knowledge requires a semiotic-material technology linking meanings and bodies” (Haraway 1990). Autobiography is one such semiotic-material technology, a technology for encoding bodies, behaviors, and language as indicators of personhood. Authors such as Floyd Skloot and Paul West, writing of and within their own experience with aphasia, encode themselves as affective selves within their narratives, even as they present the act of narrative encoding as a nontransparent task, a task requiring artifice, effort, and the cooperation of an extended material assemblage usually put under erasure in traditional autobiography. But the loss of a transparent, representational relationship between language and speaker, when language is viewed as a material for exploring the way in which meaning is encoded, is no longer sufficient to place personhood in question.
Instead, one “deprived of language as we know it” (West 2008) negotiates a new relationship to language and thought: memoir becomes a process that develops the relationship between personhood and language in radical directions.

Elizabeth Lyman.

Guest Writers Session - Sat 5pm - 7pm

Jim Grimsley, reading from his science fiction story "Wendy," and others

Dance with Soulphonics - Sat 9pm - 11:30pm
Woodruff A & B

Session 10 - Sun 8:45am - 10:15am

Session 10 (A) Bennett
SLSA Creative Writers Read II
Susan Allender-Hagedorn

» Recording

Blake Leland.
Ice Wagon
Since first reading, years ago, C.P. Snow's "Two Cultures" essay, I've been fascinated by the slippery, anthropomorphic notion of Entropy. That fascination manifests itself in my teaching--I've just finished leading a Senior Seminar in some of the various concepts of Entropy for our Science, Technology and Culture majors here at Georgia Tech. And "Ice Wagon" is my most sustained attempt to incorporate a meditation on Entropy into a poem. The poem is not primarily or directly about the Second Law: it is, mostly, about my Grandfather who once made a living driving an ice wagon in Massachusetts. It is also about the problem of writing poetry about the summer (I'm overweight and find heat unpleasant). It is, thus, a poem about heat, cold, work, loss, memory, poetry, summer, ice, and time.

Kevin O'Sullivan.
Radiant in Lyra
Reading two poems, "Climax Forest" and "Radiant in Lyra," each triggered by bodies of scientific knowledge, biological and astronomical, respectively. Originally in two entirely different manuscript settings, they bear no link to each other thematically. One explores an explanatory concept that is debated among forestry experts today; the other weaves astronomy, mythology, psychology, music, and contemporary history into an exploration of the origin and silencing of lyric.

» Read online
» Recording

Wayne Miller.
Reading from The Bog Monster of Booker Creek, with a word about being a Creative Commons author
I will read from two locations in the novel, one the description of waking on an alien world, the other the perhaps familiar experience of being a panelist at a conference (in this case, about Sasquatch) that one doesn't really want to attend. I will also say some things about my experience of releasing two novels on the Internet under a Creative Commons license.

Session 10 (B) Crescent
Stormy Weather: The Material in Contemporary Humanities

Atsuro Misoe.
Singin’ in the Rain: Meteorology during the Age of Nuclear Deterrence
From Harry S. Truman’s metaphorical juxtaposition of the atomic bomb with the “rain of ruin from the air” (August 5, 1945) to the military strategy of the “nuclear umbrella,” cold war discourse often embraced the terminology of “rain.” During the age of nuclear deterrence, the scientific knowledge and representation of rain helped substitute for the phenomenological absence of the nuclear weapon. Indeed, the era corresponded to the birth of modern meteorology, as typified in John von Neumann’s computerized decoding of weather through the ENIAC. The postwar convergence toward the rain is also recognizable in cultural texts such as J. D. Salinger’s The Catcher in the Rye (1951) and Gene Kelly’s Singin’ in the Rain (1952). By historicizing representations of the rain in the context of modern meteorology, I will situate the rain as the semiotic arena where the naturalization of the cold war was, or was not, realized.

Melissa Warak.
“Decoding Die Reihe: Karlheinz Stockhausen, Metaphysics, and the Spatialization of Sound”
The small German music periodical Die Reihe, translated as “the row” or “the series,” explored concepts in new music and composition. Published in German from 1955 to 1962 and translated into English volumes from 1958 to 1966, Die Reihe originated as a forum for composers to discuss post-war serial music. Edited by the German composers Herbert Eimert and Karlheinz Stockhausen, the eight volumes include essays by composers, usually related to a special theme. Despite its technical nature, Die Reihe became a primary vehicle for conveying new ideas on sound and technology to the avant-garde. Contributors included composers John Cage, Pierre Boulez, György Ligeti, Henri Pousseur and Christian Wolff, and American abstract filmmaker John Whitney, among many others. Stockhausen frequently used his interests in metaphysical philosophy—particularly various practices of Buddhism and the Vedantas—as a source for his own compositions, for which one of his most overarching goals was the spatialization of sound. This paper analyzes the metaphysical content of essays such as Cage’s “Indeterminacy” and Stockhausen’s “…How Time Passes…” and “Music in Space” in the context of the compositional objective of creating or fostering spatialized sound.

Cara Ogburn.
Decodexings: Mortality, Materiality and Electronic Literature
“Decodings” derives from the Latin codex: tree trunk, wooden tablet, book, code of laws. This etymology calls into focus the materiality of a book as paper, as tree—as situated within a cycle of mortality. Books’ lives end with their organic disintegration—what we might call the “decodexing” of the book. Digital media have their own kind of life span, tied both to the media on which they are stored and to the “life-span” of the technologies we use to read or execute code; as we lose the ability to “decode” or execute the program that presents the text to us, the text becomes “decodexed.” My paper intervenes in these questions by examining themes of life, death, mortality and materiality in several electronic literature texts. By considering these themes in the texts themselves, I hope to come at questions raised about (digital and print) materiality, mortality and textuality in a new way.

Session 10 (C) Doggett
Contemporary Narratives

Jonathan Goodwin.
Economic States of Being in David Lynch's INLAND EMPIRE
A recent New York Times article referred to Poland during the second Bush administration as the "fifty-first state." In addition to a conservative alignment in politics, Poland has in recent years begun to rival Canada as an outsourcing venue for Hollywood film production. David Lynch's INLAND EMPIRE (2006), shot entirely with digital cameras, uses Poland as a location directly and Polish characters indirectly as the cause of the first of many changes in states of being in the film. The idea that a character's perspective on life can shift suddenly and without apparent cause is frequently externalized through supernaturalism (Twin Peaks) and expressionism (Eraserhead) in Lynch's films. In INLAND EMPIRE, however, the shifts in perspective correspond almost exactly to economic states. There is a constant downward movement through the Greater Los Angeles demimonde, and each transformation is conditioned by the parallel economic pressures on filmmaking itself. The digital video, which was inexpensive enough to allow Lynch to create the film, also provides the abundant footage needed to create the film's precisely edited decodings of economic reality through psychological degeneration. While it has been frequently remarked that Lynch's films are largely unconscious in intent, the change in medium in INLAND EMPIRE permits the unconscious political allegory to be read more clearly than in his previous work.

Srikanth Mallavarapu.
Storytelling and the Decoding of Modernity in Rana Dasgupta’s Tokyo Cancelled
Just as modernity relegates the artisan to the archaic past, the storyteller who narrates fairy stories and folk tales seems out of place in the contemporary world dominated by science and technology. However, Michael Taussig points out in his study of the folklore of plantation workers and miners in South America that these narrative traditions reflect an attempt to understand capitalism and industrialization through the lens of precapitalist frameworks. Taussig suggests that a careful engagement with these narratives leads to an examination of what he refers to as the “phantom objectivity” of the structures that we take for granted in modernity. This paper examines how Delhi-based author Rana Dasgupta uses conventions from folk traditions to map out the contours of contemporary experience in his first novel, Tokyo Cancelled.

Trever Holland.
Hybrid Cats and Dutch Grandfathers: The Myth of Scientific Truth in Linda Hogan's Power
Linda Hogan’s novel Power raises questions regarding the legitimacy of traditional Western epistemologies, philosophies, and knowledge to interrogate the effect these “disciplines” have on individuals, their (dis)connection with the environment, and humans’ shared ecological community. Scientific discourse (as an objective discipline) is cross-examined as a binary structure that continues to divide and alienate individuals from their communities and from their environment. By situating her narrative in a socio/political context, Hogan addresses these issues from a historical standpoint to demonstrate that ecological degradation is closely linked with how individuals construct themselves in opposition to Nature. Hogan links environmental issues with racism and how politics continue to objectify land, animals, and people in part by situating the text in the Everglades of Southern Florida. As a synecdoche of those environmental issues, Hogan writes of the plight of a fictional indigenous community and links the Taiga with the endangered Florida panther, which suffers visible genetic defects as a result of environmental degradation and territory fragmentation. Using Foucault’s theory of knowledge as a means of control, I argue that Hogan’s novel interrogates the authenticity and privileging of colonial education and scientific discourse over alternative conceptualizations of the world. As her characters find places of resistance to imposed bodies of knowledge, Hogan’s novel reiterates traditional Native American philosophies and ideas and engages the reader in questioning the efficacy of traditional Western ideologies in sustaining life both culturally and ecologically. Hogan’s novel is arguably an articulation of the Western construction of binaries and the hierarchical positioning that accompanies this imposed configuration.
By deconstructing this arrangement, Hogan’s novel opens a space for the reader to engage with “nature” and thus dismantle discordant hegemonies to open a dialogue for resistance.

Session 10 (D) Whitman A
Decoding Technologies

Jenell Johnson.
The Rhetorical Work of Literary Epigraphs in Popular Neuroscience
In recent years, the literary market for popular neuroscience has exploded: books by neuroscientists like Joseph LeDoux, Antonio Damasio, V.S. Ramachandran, and Oliver Sacks are perennial bestsellers. Although each researcher-author works (sometimes literally) in different areas of the brain, most of their books share one peculiar feature: the persistent use of literature as epigraphs—often to the point where quotations from writers like Virginia Woolf and T.S. Eliot frame every chapter of the book. These epigraphs are rarely incorporated into the text’s narrative or argument, which makes it tempting to dismiss them as thoughtless additions gleaned from Bartlett’s Familiar Quotations. Instead of approaching the epigraph as “mere” decoration, however, this paper asserts that because of its location and its liminality (neither fully “inside” nor “outside” the text), the epigraph is one of the most powerful rhetorical spaces in any piece of writing. In the case of popular neuroscience, this paper argues that the literary epigraph works as a cipher for the nonspecialist reader; specifically, it directs how neuroscience’s significance to the human condition should be understood. Using works of literature in such a prominent place in the text does more than textually bridge the “two cultures,” a goal claimed by a number of popular neuroscientists (e.g. Ramachandran 2001; Walker 2000); it positions the sciences and humanities as partners in dialectical inquiry.

» talking automata as symbols of dangerous knowledge (first four pages of article)

Kevin Lagrandeur.
Cybernetics and Servitude: Decoding the Artificial Slave
A New York Times article dated May 24, 2009, entitled “The Coming Superbrain,” discusses the dream, or nightmare, of true artificial intelligence. No longer the realm of science fiction, the notion that increasingly interconnected computer and communications networks might spontaneously emerge as self-organizing, self-replicating, and perhaps self-aware appears to be giving Silicon Valley scientists and technology experts conflicting fits of paranoia and joy—depending on their optimism about the controllability of such servant networks. The thrust of my presentation will be that the worries of current AI theorists about AI tractability appear to be essentially the same as those implied by Aristotle over 2000 years ago, when he pondered the possibility of artificial servants in his Politics. The inherent interconnectedness of maker and tool, slave and master compounds the dangers associated with creating artificial servants. For the maker/master often overlooks the danger of the dialectical inversion of the master-slave relationship that may occur precisely because of that interconnectedness. Thus, the older accounts of creating artificial slaves are accounts of modernity in the making—a modernity characterized by the project of extending the self and its powers, in which the vision of the extended self is fundamentally inseparable from the vision of an attenuated self.

Hannah Abelbeck.
Lasers as Home Defense
Despite their omnipresence in retail, entertainment, and telecommunications, lasers are still imagined as the domain of evil scientists and space explorers. Lasers can allow machines to see, and they can potentially aid in both the positive transformation and complete destruction of bodies, evoking both wonder and fright. This paper will focus on lasers as protectors of domestic security, both for the homeland (THEL and ABL programs) and in the home (Laser Shield). The rhetoric surrounding these lasers, I will argue, stems from the (American) control society's obsession with in/security. After all, few are anxious about the use of lasers to engrave sheet metal. Lasers are simultaneously protective when they work for you, and frightening when they are aimed at you. But being comforted by security is only a momentary reprieve in a world necessarily figured as dangerous and threatening. For example, the ritual interaction with a private home security system is largely one of affect, of an anxiety about safety and the promise of placating it, a dynamic that functions apart from any real threat or real solution, “where feeling vulnerable is no longer part of her reality,” as the advertisement for the Laser Shield Home Alarm System explains. Rhetoric about lasers is at its most affective when it is associated with the regulation of movement and the often bodily consequences of motion's detection. This feeling of fear and vulnerability at points of access—at borders, at gates, at doors of corporations and homes—is deeply enmeshed with how the subjects of the control society have come to understand their subjectivities. The threat of being locked out or mistaken for an intruder is an ever-present threat at these thresholds, even for those who have no reason for such a fear.

Session 10 (E) Whitman B
Greg Bear

Ho-Rim Song.
Distributed Subject and Nanovision: Assembled Individual Subject in Superorganism in Blood Music
Claiming that the complexity paradigm, which understands the world as a complex, adaptive, self-organizing system, is emerging in our technoscience culture, my paper analyzes Greg Bear’s nanofiction Blood Music to account for a new type of subjectivity based on this new paradigm. Blood Music shows the invalidity of the traditional concept of the individual as a coherent and self-determining self through a form of superhuman called “noocytes,” a collective intelligent nano-bio machine that transforms humans and other living things into a complex, self-organizing system. In such a superhuman network, the individual subject is contingently re-constituted by the complex processes of the network rather than disappearing. Focusing on this aspect, the paper attempts to articulate the subjectivity of the complexity paradigm.

Olivia Banner.
The Flexible Genome in Epigenetics and Greg Bear's Darwin Novels
In Evolution in Four Dimensions, Eva Jablonka and Marion Lamb advance the epigenetic case for the self-regulation of the genome -- that evolution is not a process of slow, random, blind mutation but is rather self-directed. The genome is, in other words, flexible. Greg Bear's Darwin novels utilize this model of the self-regulating genome and, like Jablonka and Lamb, use it to attack the central dogma. However, Bear's novels manage to reinscribe the very biological essentialism that the central dogma fed, especially through a heteronormative narrative. My paper poses two questions: first, how do these models of flexibility relate to what Emily Martin has criticized as a model of flexible bodies endemic to both immunological and corporate-capital discourse, one that feeds a new form of Social Darwinism? Second, how does Bear's model advance evolution as producing posthumans who are more able-bodied than their human forebears, and what is this model's relation to the heteronormative schema the novels offer?

T. Idema.
Evolution as biopolitics: Ethics of control and change in Greg Bear’s science fiction
Modern western culture has often perceived social and biological changes as a threat—a threat projected on an ‘other’ defined as ‘less-than-human’. While the excesses of eugenics and social Darwinism have been criticized, ‘us vs. them’ thinking, often accompanied by metaphors of war, endures in science and culture. Greg Bear’s science fiction novels Darwin’s Radio (1999) and Darwin’s Children (2003) illustrate how the challenge of coping with differences and changes in human life may be disengaged from a quasi-militaristic dialectic between an ‘us’ (the healthy, sane, rational) and a threatening ‘them’ (a disease, irresponsible scientists, terrorists). In these novels, human evolution is altered through the spontaneous activation of an endogenous retrovirus, ‘SHEVA’, ironically located in a ‘non-coding region’ of the human genome. SHEVA causes humans to metamorphose into another species. At first perceived as a global plague, SHEVA provokes mass panic. An international task force is assembled to control the crisis and to find out how SHEVA operates at the genomic level. However, as the story unfolds, it becomes manifest that SHEVA is too complex to locate, decode, or ‘treat’—and, moreover, that it may not represent a disease at all, but simply an emergent, posthuman stage in evolution. Taking a cue from Bear and the contemporary Deleuzian scholars Rosi Braidotti, Luciana Parisi and Manuel DeLanda, I understand SHEVA as a ‘literary machine’ that sets the narrative in motion and a ‘genomic machine’ that turns human evolution into an immediate biopolitical issue. The novels thus become machinic topologies that map out a range of potential effects (social, economic, medical) and ethical responses. I argue that Bear’s scenario offers a critique of an ethics of control and informs an ethics of transformation that has significance for discussions on biopolitics and genomics.

Session 10 (F) Woodruff A
Animal Representations
Analia Villagra
Animals are making their presence known in the social sciences and the humanities as more than just abstractions or objects; animals and the human-animal relationship are increasingly receiving serious scholarly attention. The non-humans that share so much of our social, political, and creative spaces bring us joy and companionship, but also fear and apprehension as their contested moral, legal, and ideological standing forces us to confront the edges of our own humanity and animality. With a focus on film, the four papers in this interdisciplinary panel explore representations of animals and animal representations that challenge our understanding of humanity, animality, and the human-animal relationship.

Sharon Wilcox Adams.

Aaron Shackelford.
The Western's Rhetoric of Animals in Eastwood's Unforgiven

Casey Riffel.
Of Foxes and Insects: Vladislav Starevich, Animation, and the Problem of Animal Articulation

Analia Villagra.
Zombie sheep and biofears
Modern biotechnology takes the classic dichotomy between the natural and the mechanical and turns it on its head. Crops, seeds, even entire animals are, oddly enough, the most cutting edge of current technology. In this world of altered crops and manipulated genetic material, the man-versus-machine technological fear is replaced by biofear: fear of the cyborg, a hybrid character neither entirely biological nor entirely machine. Idyllic nature, once the obvious counterpoint to rampant fear of computers and technological monstrosities, now becomes the very source of unnatural chimeras. That which was once unquestionably safe, peaceful, and natural has become the unexpected locus of our new biofear. This paper examines the New Zealand film Black Sheep, in which the production of a genetically enhanced breed of sheep also produces, as a byproduct, monstrous zombie sheep who viciously attack their human keepers. In this film, the bucolic countryside and the sea of identically innocent sheep faces become the central point of our most intensely (post)modern fears. Innocence, uniformity, and the countryside are transformed into the terrifyingly unexpected horror of cutting-edge biotechnology gone wrong.

Session 11 - Sun 10:45am - 12:15pm

Session 11 (A) Bennett

Tamara Popowski.
Code-duality: Mending the broken bridge between Nature and Culture?
In their influential paper ‘Code-duality and the semiotics of nature’ (1991), biologists Jesper Hoffmeyer and Claus Emmeche quite famously built on the notion that ‘[s]elf-reference is the fundament on which life evolves’ by claiming further that the central feature that allows for this self-reference is ‘code duality, i.e., the ability of a system to represent itself in two different codes, one digital and one analog’. Indeed, for these two theorists and the subsequent work of others whom they inspired, the emergence of digitalisation is precisely what marks the difference between life and non-life. As such, the question of how life emerged out of a lifeless world is rearticulated as the question of how code-duality itself could arise. Posing this question, however, requires that one also pay close attention to the emergence of digitalisation in the sphere of human life or human culture. For it was Hoffmeyer and Emmeche’s suggestion that one could only understand the emergence of digitalisation, and therefore of life, ‘through inspiration from the much better known interface between the human and the non-human, the cultural and the natural’. In short, for these two theorists and those that followed them, the evolution of natural and cultural systems shares the same essential step – the formation of code-duality – while nevertheless maintaining a difference between each kind of system. This paper will closely examine exactly what is at stake in making the aforementioned claims, beginning with an examination of the distinction between analog and digital codes. My contention, quite briefly, is that this distinction is far more fraught than is usually allowed. More broadly, the field of biosemiotics, of which Hoffmeyer and Emmeche are such noted proponents, has quite admirably attempted to free textuality from the confines of language or culture.
At the same time, my argument here is that biosemiotics maintains a purchase on the very thing it attempts to dismantle, an investment in precisely what it attempts to disavow that goes unacknowledged. If we are to claim, as Hoffmeyer and Emmeche do in their seminal paper, that the ‘broken bridge between nature and culture must be reconstructed’, then it seems as though we must be very attentive to the assumptions and implications of such a statement. What strange attempts at containment do such statements engender, somehow in spite of their attempts at co-implication?

Till Andreas Heilmann.
Digital Decoding and the Techno-Logic of Representation
Codes are generally viewed as being either of a more social kind or of a more technical kind. Accordingly, decoding is understood either as an interpretative act with varying degrees of constraint or as the application of a strict set of rules for transforming expressions. A technical code is therefore often assumed to have only one possible decoding, whereas social codes usually allow for divergent readings. While it is true that technical codes are always explicitly laid down and that the result of a particular act of decoding is determinate and predictable, in digital computing the decoding of information can happen in a variety of ways. In fact, it is precisely this multiplicity of decoding that makes the digital computer such a powerful device. With digital data, there is no single "right" decoding or "true" representation. The presentation outlines the multiplicity of digital decoding and explores its consequences for the techno-logic of digital representation.

Adam Nocek.
The Networked Gene: Transcendental Empiricism as Biophilosophy
Largely because of Gilles Deleuze’s avowed naturalism, which embraces a kind of process ontology, there has been a growing interest in his (and Guattari’s) connection to the biological sciences. To this end, the work of Brian Massumi, Manuel DeLanda, and John Protevi has been invaluable to the field, establishing connections between Deleuze (and Deleuze and Guattari) and biological systems theory, nonlinear dynamics, autopoiesis, neurodynamical systems, and more. Not surprisingly, however, there has also been some backlash to the recent flurry of Deleuzian biocriticism; Mark Hansen suggests, for instance, that despite some superficial similarities, namely, at the level of self-organization, the “cosmic expressionism” of Deleuze-inspired immanent philosophy is largely incompatible with the conservatism and organizational preservation in biological systems. To confront such attacks, I focus on Deleuze’s Difference and Repetition and The Logic of Sense in order to suggest not just a critical connection between Deleuze and the biological sciences, but how this link can help theorize new, networking potentials in biophilosophy. To this end, I review critical developments in evolutionary and developmental biology; this includes considering how recent theories of development (Evo Devo, DST, etc.), moving away from models of the gene as “master molecule,” conceive of development in terms of gene-environment networks in which there is no prefiguration of protein synthesis. This more immanent hypothesis will then be developed in terms of Deleuze’s paradoxical notion of transcendental empiricism. What I suggest is that Deleuze’s revision of Kantian Idealism, a revision closely aligned with William James’ radical empiricism, allows us to think not just the philosophical status of the networked genome, but what is ontologically at stake in a technoscientific world where technology is capable of manipulating the genome in ever more profound ways.

Session 11 (B) Crescent
Nineteenth-Century American Literature and Science

Kristen Case.
Decoding Thoreau’s “Kalendar”: Reading Between Science and Literature
From 1860 to 1862, Thoreau attempted to consolidate his observations of seasonal change in a variety of lists and charts he sometimes referred to as his “Kalendar.” These unpublished materials have important implications not only for the ongoing reassessment of Thoreau’s late work, but also for the reconsideration of the categories of “the literary,” “the human,” and “the natural” that ecocriticism urges, and that the growing evidence of environmental crisis demands. Recently, biologist Richard Primack has used these charts to study the effects of climate change in Concord. But what does the Kalendar offer the contemporary humanities? Located between the “two cultures” of the sciences and the humanities, the Kalendar demands not only a reconsideration of disciplinary boundaries, but also a shift from text to process as the subject of inquiry – a shift that suggests the relevance of science studies as a model.

Anton Borst.
'Corroborating a Rugged Phrenology': Walt Whitman and a Science of Self
Remembered as a mere cultural oddity today, phrenology offered a new key to reading nature to many antebellum Americans. Some were so confident in its power that phrenological examinations were required for job applications. In addition to its role in prison, education, and other reform movements, phrenology had a significant impact on the literature of the American Renaissance, particularly the work of Walt Whitman. It provided Whitman with a way of decoding the multifaceted subjectivity emerging in his poetry, as well as a way of encoding his homosexuality under the phrenological faculty of “adhesiveness.” Furthermore, an understanding of this pseudoscience allows Whitman’s readers today a way of decoding his own poetry, which is pervaded by unfamiliar phrenological concepts and terms. This paper will focus on Whitman’s phrenological chart, which he revised and published three times during his life, tinkering with the qualities of personality most important to him. Through his revisions, Whitman creatively responded to and exploited a discourse often viewed as deterministic. Phrenology for him held out solutions to the philosophical problems faced by him and other Romantic and Transcendentalist writers: the contradictions between fate and free will, the dualism of subject and object, and the seeming cross-purposes of science and religion, materialism and spiritualism. As a poet, Whitman understood phrenology not as a reductive system to explain away the soul, but as a new vocabulary with which to express all aspects of the self. Whitman’s engagement with phrenology provides a vivid example not only of how porous was the boundary between science and literature at this time, but also of a poet infusing a reductive codification of the self with lyric potential.

Sam Schwartz.
Deciphering "Invention": Encoding and Decoding the Techno-Aesthetic in Melville and Twain
The literary critic J. Hillis Miller writes that certain words, when employed in literary narrative, are so dense in meaning that they tend to function, not as simple words whose denotations are straightforwardly decoded, but more like tropes, whose meanings are symbolically loaded, beyond their denotations. Paul de Man, using different terminology, makes a similar observation: at times, the "rhetoric" of language can exceed its "grammar." Indeed, as he points out in _Allegories of Reading_, grammar and rhetoric sometimes exclude each other, which makes the task of decoding literature one in which any specific methodology will not guarantee an accurate deciphering. I contend that any significance that "decoding" is to have for literary analysis must consider the ways in which codes (especially those that distinguish the "literary" from other types of communicative discourse) have the ability to subvert their own deciphering. If the literary is encoded via a specific set of pre-established conventions, the hallmark of the literary is also its use of ambiguity and irony, which make coding and decoding the message or truth of a literary work a much less formulaic task than when it is applied in warfare, communication theory, computer programming, and linguistics. For my talk at SLSA 2009, I wish to explore the trope/word "invention" as it pertains to American literature in the nineteenth century. I will investigate, through the work of Herman Melville and Mark Twain, the ways in which invention, as both a technological and aesthetic concept, is encoded and decoded depending upon the context of its use. I use "invention" as an example of a code that subverts its own deciphering, without losing its rhetorical power.

Session 11 (C) Doggett
Early Modern Literature and Science

William Silverman.
"And wisdom at one entrance quite shut out": Decoding Epistemological Concerns in Paradise Lost
Galileo's telescope rocked intellectual worlds. The once immutable heavens changed, and the empirical approach to nature gained validation. Scientific knowledge began to rely more heavily on the senses. For centuries, many believed that sight held the epistemological key. This posed a problem for the blind John Milton. This essay explores epistemological concerns in Paradise Lost, from Milton’s own belief that blindness shuts out wisdom, to Eve’s apostrophe to Experience, her “best guide” (IX.808). As a poet who could no longer see the changing world, Milton had to create a world of his own, based upon the knowledge he possessed and what the “Heavenly Muse” imparted to him. Through comparisons to the growing scientific discoveries he heard of, and through epic similes and words like “seemed,” Milton truly could “see and tell of things invisible to mortal sight” (III.54-5).

Kevin Carr.
"What strange delusion's this?": Decoding the Effects of Early Modern Stage Technology
In this paper, I discuss the employment of technology on the Renaissance popular stage, in the work of writers such as Shakespeare, Marlowe, and Fletcher. I argue that the use of technology on stage (both mechanics and pyrotechnics) became a means for the audience to decode reality from illusion through a type of Brechtian alienation effect (Verfremdungseffekt). I further argue that the employment of such artistic effects was instrumental to the rise of the scientific traditions of Baconian empiricism and induction that characterized the Scientific Revolution.

Session 11 (D) Whitman A
Medical Code

Susan Allender-Hagedorn.
Virtual Medicine: Lessons from the Future
We live in a world where over 90% of the population gets its “understanding” of science and technology from television and movies. Over the past decades, across the many Star Trek series, we have seen the evolution of the role of the physician (and his/her technology) from the very human, approachable Dr. McCoy to the virtual, non-human “doctor” of Star Trek: Voyager. Do these (and other science fiction representations of medicine) have any influence on the “real” (not “reel”) practice of medicine, in particular the public perception of virtual or telemedicine?

» Biomedical Simulation Laboratory
» Presentation slides

Dolores Steinman; David Steinman.
Medical Imaging in the 21st Century: Encoding Reality, Decoding the Unseen
Imaging/visualization has always been an integral part of understanding the human body, and physicians have used and applied various techniques and technologies to this end. In 21st century medicine, digital medical imaging has allowed for intricate encodings of patient data that require concomitantly sophisticated decodings to make sense of them. As an example from our own practice of elucidating the role of blood flow mechanical forces in cardiovascular disease development and therapy, we use computational fluid dynamics – CFD, which solves the mathematical equations governing fluid flow – to visualize and quantify these otherwise unseeable and unmeasurable forces. These scientific moving images rely on anatomical and functional patient data that have undergone two key encoding/decoding steps: a) patient data are binary-encoded by the medical imaging device, then decoded to produce anatomic or functional visual images; and b) these images are encoded into a mathematical (CFD) model, then decoded into graphic anatomical and physiological information. As we shall discuss, the advantage of this process is that it deciphers the biological code and visualizes the invisible as opposed to merely mirroring reality. However, as with nature itself, these cycles of encoding and decoding are fraught with the possibility of inadvertent or intentional distortions – random or directed mutations – of this unseen reality.

Jennifer Ellis West.
"Viewer Discretion Advised": Decoding the “Reality” of Hospital Birth on the Discovery Health Network
On a network that averages around six hours a day of shows about pregnancy, childbirth, and parenthood, one might expect a range of perspectives on a process as dynamic and varied as the physical birthing of a child. However, even a cursory viewing of the shows available demonstrates that regardless of the setting or the circumstances, the “reality” of birth is presented as inherently and homogenously dangerous and therefore in need of strict medical control. By analyzing the narrative construction of science, medicine, and women's bodies, I will argue that by linking the unquestioned authority of medical professionals with an exaggerated view of the dangers and risks of pregnancy and birth, shows like those on Discovery Health code childbirth as a process women should fear and therefore cede to what anthropologist Brigitte Jordan has called the “authoritative knowledge” of institutionalized medicine.

Session 11 (E) Whitman B
Science in Victorian Culture

Catherine Day.
Material Fact and Imaginative Experiment in Charles Lyell’s Principles of Geology
In the widely-read Principles of Geology, first published in 1830, Sir Charles Lyell acknowledges the difficulties of modern geology, a science that can employ neither deduction nor traditional experimentation. Because of this, Lyell asserts in the first volume of the Principles, geologists are “called upon, in our researches into the state of the earth, as in our endeavors to comprehend the mechanisms of the heavens, to invent means for overcoming the limited range of our vision. We are perpetually required to bring, as far as possible, within the sphere of observation, things to which the eye, unassisted by art, could never obtain access.” The limitations of geological observation, in other words, demand alternative methods of increasing our knowledge of the earth and its history. For Lyell, these methods must be powered by human imagination. Several recent articles on the rhetoric of the Principles have focused on Lyell’s use of narrative and imagination to expand the perspective of his readers, so that they might better comprehend both the multiplicity of present geological processes and the vast span of the earth’s past. There is no doubt that an important rhetorical strength of the Principles is its ability to encourage a kind of macro-level view of the earth and its history. The Principles has a characteristic “largeness,” given the number of topics it treats and the expansiveness of its argument (not to mention the physical size of its three volumes). But focusing on the text’s “big picture” may overlook the significance of those smaller details that make up the bulk of its pages. Looking at the role of thought experiments in the Principles, we see the integral place of the small fact, the material detail, and the local process in Lyell’s argument. In these thought experiments, it is clear that the expansion of the readers’ perspective is not at the expense of their recognizing the “small stuff” of geological change. 
Rather, this perspective is dependent upon Lyell’s ability to depict in vivid detail the slow and local processes of the earth and to connect them to larger laws of science. Each of Lyell’s thought experiments begins with a set of imagined material facts upon which the experiment depends, and it is only after the reader has these facts distinctly in mind that Lyell can use them to test an aspect of his geological theory of uniformity. Focusing on Lyell’s use of thought experiments in the Principles illuminates the relationship between material detail and abstract theory in the work as a whole. The expansiveness of the Principles’s argument—and consequently the persuasiveness of Lyell’s uniformitarianism—is primarily an effect of its ability to organize such a large and diverse group of geological facts under the umbrella of uniformity. This is, as historians of science have noted, the key similarity between the argument of the Principles and that of Charles Darwin’s 1859 The Origin of Species, a text which also employs thought experiments to test the explanatory potential of its theory of natural selection. The effectiveness of Darwin’s argument in The Origin, like that of his mentor Lyell in the Principles, is in its capacity to create clear relationships between small material facts and broad scientific theories through imaginative means.

Amanda Mordavsky Caleb.
“The New Hedonism”: Grant Allen and the Construction of a Decadent Science
In this paper I intend to discuss the intersections of decadence and science in the late nineteenth century, specifically in the works of Grant Allen. Here I aim to reconsider the implicit understanding that decadence and science are at odds with each other at this particular period, instead suggesting that we can see them as coalescing in the works of Allen, specifically in his merger of the figures of the scientist and the decadent. Allen presented a bridge between the sciences and the arts through his success in both disciplines; moreover, Allen re-imagined how the two disciplines might inform and critique each other by highlighting the amorality of both the scientist and the decadent. This union is evident in both his fiction and nonfiction; by looking at a range of his works, including “The New Hedonism,” “Social Anarchy,” _The British Barbarians_, and _Physiological Aesthetics_, we can begin to understand how Allen used literature to critique the failings of science, and science to critique the imprecision of literature. In re-examining Allen’s body of work as a complete collection—and not as two separate collections representing science and fiction—I will suggest that we might re-read the late nineteenth century as a period in which decadence and science shared more similarities than differences.

David Smith.
Uncanny (In)Securities: The Great Lock Controversy of 1851
During the Great Exhibition of 1851, an American lock expert picked England's best-known burglar-proof locks (built by Bramah and Chubb) in demonstrations that The Times dubbed the “Great Lock Controversy.” Although these defeats ultimately had more symbolic than real significance, public reaction to the controversy, which received wide and sustained news coverage, provides a lens through which I examine technology’s role in shaping the terms of acquisitive morality and in creating a space that can quickly become uncanny when the sociopolitical mandate of self-governing individualism gives way to an ethos of aggressive self-interest. News items about the controversy, which I treat as a collective narrative in which participants and observers debated whether or not the American had fairly picked the famous patent-locks, failed to arrive at a conclusive assessment but left intact the ostensibly transparent meaning of security at the Exhibition. In contrast with newspaper accounts, I analyze Richard Henry Horne’s “A Penitent Confession” (Household Words 1851), a short story that reenacts a central feature of the controversy and threatens to subvert the protected space of the Crystal Palace when the otherwise law-abiding middle-class protagonist dreams that he steals the famed Koh-i-noor Diamond from the high-security display case built by Chubb, one of the most popular attractions at the Exhibition. Demonstrating that an encounter with the uncanny provokes a crisis of propriety and, by semantic extension, of property, I argue that the Chubb safe depicted in Horne's story serves as a site of ideological rupture. The spectacle of the security commodity, though central to the manifest political and nationalist aims of the Exhibition, has a latent meaning that only becomes evident in the act of illicit decoding or violation, exposing the political unconscious of the Crystal Palace to public scrutiny.

Session 11 (F) Woodruff A
Encrypted Thinking

Sean Simpson.
A "Diagrammatic" Encoding/Decoding of Exemplary Pictorial Thinking: The Imperial Making and Unmaking of the World at Tikal, Guatemala.
The objectification of a torture victim operates first through certain attributes of pain, then through the translation of those attributes into a "fiction of absolute power." This process is encoded with exemplary, "diagrammatic" immediacy on two Maya reliefs at Tikal. "Eternal victory" is conceptualized through the nonnarrative concentricity of a clothed, standing ruler and an unclothed, suspended captive. The ruler constitutes a scaled fractal iteration, or internal digest (mise-en-abyme), that compresses and designates the corporate body of the imperial Tikal dynasty. This points to the diagrammatic pictorial thinking of the translative, classificatory property-space of the Tikal ruler/dynasty's "four bodies"--which may be effectively decoded through ongoing investigations of direct, diagrammatic ("pictorial") cognition, and through several schemas whose respective signifying activity turns on a nonnarrative, metaleptic circularity of cause and effect: the translative diagram of the logical square/hexagon; the inversional diagram of chiasmus; and Arielle Saiber's translative, analog diagram of chiasmus. 

Lori Emerson.
"A Case for Dirty Hands": From Artist Books and/as Creative Code
This paper explores the ways in which the philosophical underpinnings of a school of programmers and graphic designers, accreting around the work of John Maeda and the Aesthetics + Computation Group at the MIT Media Lab and now the Rhode Island School of Design (RISD), not only neatly echo those of book artists throughout the 20th and early 21st centuries but are in fact drawn from this long-standing history of the art of book-making. Working explicitly against creating aesthetic objects that are seamlessly enmeshed in a slick, surface-level interface, “code-works” (in digital poet John Cayley’s words) created by those in the hacker-driven “demo scene” as much as those working in digital poetry and/or net-art are driven by a belief in what Maeda calls “dirty hands.” Writing for a blog for Harvard’s business school, Maeda declares “In the last few decades, technology has encouraged our fascination with perfection — whether it's six sigma manufacturing, the zero-contaminant clean room, or in its simplest form, ‘2.0.’ Given the new uncertainty in the world however, I can see that it is time to question this approach — of over-technologized, over-leveraged, over-advanced living. The next big thing? Dirty hands.” Given RISD’s long-standing dedication to archiving and creating artists books, it is no coincidence that process-driven programming and the tradition of artists books should be conjoined in the figure of Maeda himself, now President of RISD. In fact, as I argue in this paper, it is the artist book (from those created by Russian Futurist Ilia Zdanevich to those by Johanna Drucker or produced by The Center for Book Arts in New York City) that laid the groundwork for this turn toward self-conscious, self-reflexive coding as it showed us how to hack the book in order to renew the book, to turn it from a transparent carrier of meaning to an object that is meaningful in itself.

Willemijn van der Linden.
‘Theoretical Madhouse’: Representations of Cosmological Knowledge in The Discovery of Heaven (1992) by Harry Mulisch
Harry Mulisch’s The Discovery of Heaven (1992) is one of the most famous Dutch novels: over forty reprints have been issued and it has been translated into many languages. Both national and international critics heralded the book in important newspapers (the New York Times, for instance), and it was even made into a film in 2002 (with Stephen Fry as one of the protagonists). This remarkable reception can be related to the way in which scientific and technological themes are represented. Two angels reflect on the ecological effects and moral dilemmas of scientific developments and technological innovations on earth. Furthermore, the novel (and the movie) narrates the cosmological research of the radio astronomer Max Delius. By making use of advanced telescopes and a variety of concepts and theories, he claims to have gained an understanding of the very beginning of the universe. Moreover, he believes he has discovered heaven. Religious and scientific ideas are closely connected, and the boundaries between science fact and science fiction blur. The purpose of my contribution is to demonstrate that The Discovery of Heaven engages with current discussions on the origin of the cosmos, and the so-called ‘Big Bang Theory’ in particular. Literature, I argue, is not just a reflection of the contemporary perception of science, but must be considered an important factor in the ongoing debate on the meaning of science and technology in our culture. The role of fiction in processes of public image-making should not be underestimated.

SLSA Wrap-up meeting: all welcome - 12:30pm - 1:30pm
Woodruff A