If in all ideology men and their circumstances appear upside-down as in a camera obscura, this phenomenon arises just as much from their historical life-process as the inversion of objects on the retina does from their physical life-process.

Friday, August 28, 2009

New Normatives?

One of the normative moments in Donna Haraway's When Species Meet pertains to the encounter between organisms, humans and baboons for example, or humans and dogs. (We could add encounters between dogs and baboons as well; that just isn't the kind of question humans tend to cogitate on.) She recognizes, and advocates recognizing, cross-species transactionalism. The Other (dog, baboon, etc.) hails us; we can either respond--and perhaps succeed, perhaps fail, but at least gain recognition as something to which one signifies--or be outside the world of communication. We wouldn't just be poor in world (as we might be if we did a very bad job responding to the Other, and they were as egocentric as we are) but we would be without world, as stone.

Haraway's ethics of the encounter seems to me to be something like a Levinasian/Derridean ethics of the singularity of the Other, except mutated by an intractable case of immanence or "concrescence," Haraway's preferred word (intended with full Whiteheadian connotation). With that infection comes a body of knowledges as well, what I lump together under the heading comparative ethology and what others might distill into anthropology, sociology, biologies of all kinds. Haraway's big improvement is to reattach those physical knowledges to the abstraction of deconstructive ethics. No small feat, and one she accomplishes with panache.

Now the question that keeps bugging me is how we can extend the encounter beyond the macrosystems called organisms. Baboons and dogs, after all, are pretty gigantic clusters of physical processes. They are a lot more like us than the vast majority of matter or even the vast majority of life. Their social codes aren't quite intuitive but they are not outside of the structural ken of the megaorganism called humanity. However, the speculative realists have made "massively unavoidable" the issue of all that other stuff that is not improbably similar to us (Derrida uses "massively unavoidable" to describe the animal question in Specters of Marx and it has a ring to it that I like).

The process of hailing and having face is not impossible to hook up with nonhuman animals (For more on this head over to The Inhumanities for a discussion of Matthew Calarco's Zoographies). I sign on with that a priori; I've had dogs all my life and they're not hard to understand. But what about beyond the "higher species," beyond mammals, vertebrates, animalia? I probably wouldn't be raising this as a mere theoretical contention if it weren't for the dismal state of the environment. Philosophy, we are reminded, is a language and a techne, and it addresses the problems of its time. For better or worse we are all going to have to answer to the environmental question.

So how does the ethics of the encounter stack up? How does it encounter, how does it respond to the stuff that lacks a socius? Now that the humanist prejudices have started to fall, liberating animals (at least a bunch of them) to ethics, all the other criteria or lines to be drawn seem artificial and insubstantial. I imagine we could--some have--redraw the borders of who/what is in and who/what is out, but that doesn't seem to hold water (Think of all the lifeforms that water is holding!). So, while we can draw on well-developed naturalcultural eth(n)ologies to approach certain types of organisms making powerful claims on our philosophical moment, I do not see any established body of knowledge or experience to guide us elsewhere.

We do, of course, have plenty of scientific knowledge on trees, rocks, fire, cotton (maybe not Harry Potter; that one might have to wait a minute). But none, or little, of this knowledge elucidates these objects in the mode of hailing us or of an encounter with recognizable normative dimensions. As I see it--and I may be wrong, feel free to correct me--the claims that trees put on us are pretty mild. Trees don't want to be cut down, but if they must be, they would prefer to have a space in which their seedlings can propagate. Speculatively I'd guess they prefer to have a genetic pool which would make their species more capable of weathering the kinds of plagues that befall tree populations. Cotton I would consider similarly in its life as a plant. But fire? Or rocks? Once things are inanimate the encounter ethos (or any of its predecessors) falls on hard or at least very uncertain times. How do these things hail us? I can't tell.

Drawing on Derrida, I see temporality as a partial key to this riddle. Doing the right thing is conditioned by doing something at the right (delayed but right on time) moment. Say "what's up" too soon or too late and it's worse than staying quiet. The general time frame of right action in human interaction lines up pretty well with other macroorganisms. But with trees, or fire, or--wait for it--Harry Potter, maybe the temporality is just way outside what we are used to and what we are comfortable with. For trees I think we need to think in at least fifty-year moments; for certain biochemical processes constituting other organismic scales, longer than that. (Remember how carbon-14 dating works? Among so many other concerns.) For something like Harry Potter this may extend to infinity; hence the insurmountable oddity that thinking HP as equally an object presents to many modes of thinking. Maybe temporality is only one part of a greater criterion that would extend more readily to inanimate beings. Maybe the final hurdle is a secular sub specie aeternitatis. This is the kind of thing Derrideans would pronounce undecidable and which Haraway would point out we are deciding all the time. It seems to me that we still need a better conceptual apparatus for making these tentative decisions.

Oh the inhumanities

There's a hot new blog I heard about called The Inhumanities run by Scu, Craig and, well, myself. We will collectively discuss texts of interest and importance to the animal studies community. Our first is to be Matthew Calarco's Zoographies. If you've read it then you know that it swiftly opens a lot of new ground in the Continental tradition's engagement with "the animal question." By "Continental tradition" I mean the big boys of the 20th C.: Heidegger, Agamben, Levinas, Derrida. If you haven't read it, hop on over to The Inhumanities and you can get both a synoptic and critical reading. It's going to be fun.

Thursday, August 27, 2009

A return to cruelty, but where?

The only claim with any merit I can see in Elisabeth Roudinesco's "dialogue" with Derrida on animals is the distaste for a world expunged of cruelty. I'm not sure how important this really is to her--it comes up late in the interview, seemingly by accident, and is preceded by what I take to be "serious" questions about food supply and the treatment of nonparadigmatic humans--but this lateness may be the sign of importance. Who knows.

ER: "I am always worried that we are moving toward the construction of a sanitized society, without passions, without conflicts, without insults or verbal violence, without any risk of death, without cruelty." (For What Tomorrow, 75)

The claim is that cruelty must be somewhere in the world, that it is something valuable to humans, and that reducing it by fiat in some places relocates it (psychically, probably geographically, and with some concern about an ontological relocation). Unlike the claim for "the necessity for industrial organization in raising and slaughtering animals, which makes it possible to prevent so many humans from starving," (71) or "the necessity for humans to eat meat" (68), the displacement-by-prohibition theory has at least some empirical validity. The questions I would put to this configuration of theory and instance are 1) whether vegetarianism constitutes the kind of institutional prohibition which can be cited on behalf of this theory, 2) whether industrial farming is not itself the greater purveyor of an absent cruelty in our world, and 3) whether the kind of need that attaches to the idea of "cruelty" must, or even can, be met by factory farming/ingestion/harm to animals. It seems to me that the construction of an animal necessary for these processes to go forward under the law first strips the animal of the capacity to be in a relation of cruelty. Hence this economy has multiplied so radically. It's like trying to sate hunger by drinking Kool-Aid.

So: a first step toward a crueler world is seeing animals as more like persons at least to the extent that, as Derrida consistently argues, their suffering matters. There would then be a superabundance of cruelty, more than enough for Roudinesco, more than Sade himself could imagine. Roudinesco diagnoses her problem precisely: "But I prefer not to see it, even though I know that this intolerable thing exists" (71). How will we ever enjoy cruelty if our first task is not to see it? I see cruelty everywhere, it is under my fingernails. This is probably what helps to make vegetarianism so satisfying for me: it allows me (in a psychoanalytic sense of internal policing) to see more cruelty and to enjoy it without the bad conscience of the subject of law.

Oh, Google Books offers For What Tomorrow if you want to see for yourself. My characterization of Roudinesco's absurdities has actually been pretty generous compared to what she says.

Wednesday, August 26, 2009

It never stops...

I'm not sure what to say about this but thought I'd repost it, as it is of importance to everyone and of interest to at least animal or science studies folks...

Permanent Gene Therapy

Tuesday, August 25, 2009

Is Vegism an Amputation?

Sometimes vegists stress that their dietary and other consumption patterns are not restrictive in a negative sense. Vegetarian food tastes good, is easy to make, and is nutritious; I don't think anyone committed to a veg lifestyle misses what they forego. Stressing that vegism is not restrictive is, I think, designed to persuade potential recruits that our lifestyle does not diminish the quality or variety of life (it doesn't!). However, I think that viewing vegism as an "amputation" (I'm borrowing the word from a friend) actually turns the tables on the "vegism is a restriction" debate more effectively than cataloging the specific pleasures of vegism.

An amputation requires that something be lost, yes. One could use the metaphor of disease, a disease that has metastasized throughout the Human but which has its nodes and nodules of highly dangerous tissue. Maybe we can save the Human (and a new kind of humanism), maybe not. The disease metaphor illustrates analogically a situation in which amputation increases life. Not that an analogy is an argument--there are always counter analogies--but it prepares the way for an argument.

After amputating flesh-eating and the like, we have room for a prosthesis. Again, a long tradition of the Human has made persons with prosthetics less than the "full" or "whole" body. Practically if not theoretically I think we are well past the uninterrupted body and into the age of the cyborg. Eyeglasses are a pretty primitive technology and I would be dead without them; don't get me started on my interface with coffee and the coffee machine. Fulfilling the function of a lost organ is only one facet of the prosthesis. By not being identical to the other organ prostheses have other facets of being that are thereby offered to the person, culture, machine, etc, to which they are attached. Marking the prosthesis as one center of being rather than as a marginal case makes other parts of the body-machine reveal themselves as multi-faceted, non-monological, interesting and dexterous.

It is in this light, as amputation and prosthesis, that I view vegism. Something is excluded to be sure, excised or exorcised, but so as to yield a "net gain" in vitality. This is not an ethical argument in the traditional sense and will probably not convince many "persons on the street." However, I can't in good (immoral) conscience promulgate a Christian version of ethics even if it does have more short term benefits.

Wednesday, August 19, 2009

The Sausage Factory

Every time I expose myself to a media outlet these days I hear some variation on this: "Yes, there are problems in X part of healthcare reform, but we're in the sausage making phase and things get ugly." Meaning that Americans are resistant to political change because our political process is as inherently revolting as the one Upton Sinclair describes in The Jungle. The metaphor implies that American representative democracy stands or falls on the same aesthetic criteria undergirding meat consumption. Yes, it is ugly, but the finished product is aesthetically pleasing; that's precisely why we displace and disguise the gross part. Does it also mean that our government and its mouthpieces link themselves with the ethical connotations of meat making? If I say: hey, maybe we shouldn't be making sausages either--where does that leave the governmental process? The flimsy morality of this metaphorical defense strikes me as a sign that the Obama administration has not found a new clearing for the left to establish itself but is still in its intellectual death throes. And reform, if it comes--will it be as laden with pain and remorse as the sausage?

Tuesday, August 18, 2009

Animals in danger in movies

I was watching a movie tonight, Tell No One, of the psychological thriller variety. There was a dog in the movie and I found myself very, very concerned with the fate of this dog though, as it turned out, it played a minor role and no harm came to it. But there was good reason to believe this dog was in danger, and this seized on my mind. At the same time I found myself thinking how odd my anxiety was. My concern could be dismissed as me being a bleeding heart welfarist, or whatever the going denigration is, except that my response was in the context of a genre that above all else aims to elicit tension and concern apropos the human characters. A pretty large number of people were killed and/or abused in this movie; there was good reason for me to be concerned for any and all of the characters, and indeed I was worried about them; but it was the dog that I really hoped didn't get it. Amidst a concerted effort to make my brain worry about people, I found a dog to think about.

I can't quite put my finger on why this happens, but at the least I can say that (for me) dogs puncture some vital veil between image and reality (or whatever we call this world we live in). The humans in movies I always see also as actors--if we are particularly convinced in a given role that they are not 'acting,' we praise them as good actors! And when appropriate I have the same feeling towards well trained and physically skilled animals. But when the dog's role is just to run alongside a human protagonist and sit on the sidewalk, I don't see him as an actor. Though there is certainly some level of pre-discipline required, just as we must be constrained by language to express ourselves, I see this as not "performing" in the same sense. I might lie sometimes, and express powerful emotions at others, but neither of these capacities makes me an actor. Those are par for the course, and when a nonhuman is acting par for the course I do not see it as an image but as a being in a part of the same world from which I watch. The set on which the animal acts is continuous with the earth on which, watching, I sit, whereas the earth on which the narrative of the actors, directors, etc. takes place is not even contiguous with this world. It is a world to itself entire.

I see two sides to this. One, we see the animal as a link between image and self, and a vital one at that in a world increasingly populated by glowing screens, by images of elsewhere brought here. Two, we see this general phenomenon as evidence of a certain techne of viewing that has historical precursors. Strangely enough, I am thinking of Racine; of his contrived plots that correspond with brutal exactitude to the quasi-Aristotelian unities demanded in his day. His plays were praised, in the degree to which they measured up to said unities, as rational and natural, though from another measure they are the most absurd and unnatural things conceivable. But, as Auerbach points out, the yardstick of nature is not in this case in the world at large but already within the theater. That is, if we assume the action on stage is real, how could a play staged in two or three hours possibly represent an action taking much longer than that (more than 24 hours)? This whole apparatus only makes sense if we understand that the audience sees what is on stage as a separate but whole world, rather than an emanation from our world and reducible to it. I have no evidence that my experience of nonhuman animals in movies is widely shared, but if it is I would venture that we also share, in part, this neo-classical divorce from the image as a world that we can watch without sharing its laws. But again, the animal is the limit and signal of this phenomenology. Perhaps even the lever prying us apart.

Sunday, August 16, 2009

Requiem for a chicken

When I moved into my new living situation a couple months ago I was taking a calculated risk that I could prevent either of my two dogs from killing the three free ranging chickens on the premises. For two months that peace held, and sadly last week someone--I strongly suspect one of my dogs, but there is room for reasonable doubt--killed Fraulein Schnitzel, one of the last two surviving chickens (another one disappeared a week before) of the original half dozen or so that other dogs/coyotes/etc. had winnowed down. While it was not unheard of for a dog to kill one of the chickens on the property, it was still particularly sad to see Schnitzel go as she was a remarkable chicken. She had what I can only describe as a crown growing up from her head, and she was uncommonly fond of interacting with humans. She was also very loud so her absence is noticeable.

I don't bring this up as a kind of "he was a good dog" speech. Since it was my dog that probably did the deed, and Schnitzel was the favorite chicken of our property owner, I figured burying her was the least I could do. This was much harder than I had expected. The soil is completely dry. In the first place I dug there was a 2x6 buried a foot down, a nice surprise. In the second hole there was a root about two inches in diameter. It was hot as hell to boot: we're not far from the wildfires in the Santa Cruz mountains making national news. When I was done I was drenched in sweat, my back hurt, and I had two blisters filling with blood and a spot on my thumb where the skin was just floating. Such are the infirmities of the scholar. But I did feel accomplished, and somewhat reconciled to the death of Schnitzel.

Now in a way, there is no need for this complicated economy of reconciliation to the dead, "mourning" as we call it. Nothing can change the facticity of death, nor did I share a very rich emotional world with Schnitzel relative to those I share with some humans, dogs, and cats. Schnitzel's owner eats other chickens so it's not like this was the transgression of a categorical imperative. (She was, for the record, upset but understanding). I didn't do anything bad here (my dog had snuck out unbeknownst to me) and there was no expectation of penance or punishment. Death is a kind of non-event, but its peculiarity as such seems to make it even more of an event. But it doesn't seem to be a kind of event that humans have any monopoly over, either as mourned or mourners. I see a warped vision concerning the anthropology of mourning, where a ritual that exists among humans is studied as such, and in the process taken to pertain especially to humans: that because knowledge of mourning contributes to anthropology, there is an actual anthropos being excavated in the process. (Incidentally, Heidegger makes this same point with regard to scientific research building up its unacknowledged worldview in "The Age of the World Picture." I hope I didn't offend any Heideggerians with my casual characterization of death as anti-event.)

Rather, mourning seems to open up all of those boundaries that construct the nonanimal human. One acts or feels as if one has wronged the dead, or as per Freud imagines that one has caused his or her death, knowing that one is likely not at fault in any socially rigorous sense--and even if one is, the slate has been cleared. Mourning puts the affective dimension of ethics in flux, foregrounding this element at the expense of the rational form of ethics. Hence we read from religious books, or stories, or poems--"Do not go gentle into that good night"--rather than the second Critique. Thus mourning also makes one "mad," temporarily outside of social mores. One might hurt oneself or do something crazy under the excusatory power of grief.

While this can be seen as the token of mourning's exceptionality, its role in erecting 20th c anthropology has been anything but marginal. I would turn anthropological wisdom on its head: the ontological and ethical openness therein is the ground floor, or even the basement, on which a climbing and ultimately teetering "humanity" rests. If notions of the human, of our destiny as a species, are in question, what makes sense isn't an increase in buttressing that makes for a more spectacular disaster down the road but a (re)turn to what structural anthropology could have shown had it conceived of itself as comparative ethology instead.

Monday, August 10, 2009

Coraline

For those who haven't seen it, I strongly recommend Coraline, the stop-motion film based on Neil Gaiman's novel. Ostensibly this is "for kids" but in no way is this a limitation; it's more like Coraline exceeds the division of a "child" aesthetic and an "adult" aesthetic. The device that allows this encompassing movement is its supreme creepiness.

When this movie came out I had to do a lot of driving for work, so I wound up hearing an interview with the maker about three times on NPR. Two things stuck with me: his insistence on doing this movie with stop-motion figures rather than CGI, and his belief that what is most terrifying is age-indiscriminate. His instincts were right, because Coraline is very effective and I don't think too intense for kids. Maybe some kids. Why is it effective?

The materiality of the figures is part of it--a small part, in comparison with the amazing directing and effects--but it is this part that creates the atmosphere of the uncanny (E. T. A. Hoffmann's The Sandman is more than a passingly similar text, especially since Gaiman made his name with his own Sandman series. Here is a short film version). The animation is done superbly, but it has inevitable glitches or discontinuities almost too small to see. In an extremely subtle way, the movie reminds us of our own materiality and mortality. But the division within the movie between the "real" world and a seductive simulacrum in which everyone has buttons sewn over their eyes hinges on the idea of being a doll or being real. Yet we know that the "real" world is also made of dolls. The chain of signification establishes a dualism between real and false (and this is a total division: ontology and ethics are synonymous) but then shows that this same division connects the screen-world to the viewer-world. Since materiality is the means by which the image reveals its construction, that the viewer-world possesses "the most" materiality makes it the most frightening. Things speak, have worlds: and, as in the flickering gaps of the claymation, the mode in which things reveal their other lives is by a minor absence. The imperfect fluidity of the figures does not break down narrative or meaning or affect; one could discount it entirely, or not even perceive it. This is what is terrifying: that the alterity of things can go unnoticed by the modes of perception and meaning most dear to modern humans. We like to think that anomalies will hail our attention and direct our colonization of all possible worlds. But there may be a profound indifference in the world, one that can take or leave our participation. Coraline is a kind of Spinozism for kids and it rightly shows that this is both marvelous and profoundly disturbing for humans, "awful" in the old sense.

Thursday, August 6, 2009

Auerbach and the para-modern

One of the offshoots of the study of the postmodern has been a new life for pre or early modernity as a kindred spirit of our own time. The most obvious example to my eyes is the place of "science" which was mythologized as part of modernity and which has since become somewhat more horizontal to other knowledges, as it was in the emergent period of the early modern. Conversely, the kind of claims religion can make, or the ways in which "the religious" informs judgment, have shifted around the demarcations of modernity. Without over-emphasizing the borders of "modernity," there is some usefulness to it as a periodizing device.

I recap this as an intro to two ways in which narrative today can be compared to early modern mimetics in Auerbach's Mimesis. First, Auerbach characterizes the Medieval period in terms of a strong displacement of high tragedy because of the figural power of Christianity. By "figural" he means a certain relationship between the earthly and the heavenly, the present and eternity, in which all phenomena on each side ultimately correspond; and while the heavenly side is absolutely the more significant, this device gives every earthly happening, no matter how mean or human, connection to the divine. The limit case is Dante's Comedy in which the endowment of the earthly overwhelms and excludes the heavenly; but this is a limit case, and the drama of the Middle Ages generally did not suffer from this potential crisis--in general it enjoyed its license to earthiness with the knowledge that all filth was a sign in the cycle of redemption. Retrospectively, Dante is seen as a bridge into humanism and the general replacement of the figura as an aesthetic device. Examples of an anti-figural writing are Montaigne's essays or Shakespeare--anything in which we can recognize an existential dimension.

I would argue, however, that the postmodern has rediscovered the figura in the Holocaust. The "other side" of the figural is not unrepresentable; rather, it is an inescapable part of any economy of representation. Thus Ranciere, in The Future of the Image, rebuts Adorno (et al.) on the unrepresentability or barbarism of representation of the Holocaust. It is instead a matter of focal distance, of making decisions and staking positions in the economy of "the visible and the sayable" (Ranciere's definition of "the image"). Just as the Medieval Christian aesthetic endorsed all kinds of "creatural" or "kreaturlich" abjection in the name of the figural regime, so today art has taken on a mission in relation to dehumanization under the unannounced but understood sign of the Holocaust as a kind of "beyond."

Second point: Auerbach moves on to discuss the differences between Elizabethan and Greek tragedy. I'll boil the distinction down as it is fairly basic to Western history: Greek tragedy presumes the audience knows the story because it is part of national myth or history; Elizabethan tragedy involves fate as a personal and existential value. The experience of Greek tragedy is thus something that is fundamentally withheld from us except insomuch as we have taken the time to familiarize ourselves with an equivalent knowledge of Greek myth--and in fact, the same is now true for viewing Shakespearean productions. In general, though, our tragedies fall on the model of Shakespeare's--except in rigidified low genres, like horror. While one can point to examples of "good horror movies" with well developed characters, the weight of the output is on a mythic tragic format in which characters are basically ossified and predestined.

Differences: the post-modern viewer has retained an expectation of surprise, a distilled form of irony, which prevents the solemnity of tragedy, the condition for speech, from developing. So let us imagine stripping a horror movie of all the tricks and turns designed to trigger our startle reflexes. What will fill the time? The Greek solution is rhetoric, the well-formed speeches that take their turns but which cannot hold back the falling blades of fate. The current solution is the eloquence of viscerality, which the Greeks banned from the stage and which I doubt could have been rendered in prior ages as hyper-graphically as today, even when a butcher's harvest could have represented, with hardly any mimetic gap, the interior of the human body. I see shreds or sparkles of this neo or post mythic tragedy in certain zombie films, and it is here too that we find the structural conditions most ripe to exclude the surprise.

Wednesday, August 5, 2009

"War, inc."

About ten minutes into "War, Inc." my wife asked me what the movie was supposed to be about. I started rambling about Halliburton, Blackwater (rebranded as "Xe," I believe), privatization, Pinochet, and a bunch of other chestnuts pertaining to how American foreign policy is fucked up. "But what is it about?" she repeated, and I realized the movie was answering to a different "being-about" than her question. She wanted to know the plot as a linear movement in which what has happened helps us anticipate and understand what will happen; I had given a non-linear constellational explanation, and when I tried to formulate the movie in terms of plot it came up pretty short ("John Cusack is trying to kill this guy, but not very diligently, and he is trying to get in Marisa Tomei's pants, though that has nothing to do with the premise?"). At about the halfway mark I thought I understood what was going on in "War, Inc.": it wasn't a movie, it was a holographic bumper sticker retrojecting itself into the form of the cinematic. Take fifteen-odd slogans pertaining to the true, disgusting, and heartbreaking state of American foreign policy under Bush, give them screen life, and string them together. Though the metaphor of "stringing together" gives the false impression that this film is organized as vignettes. Rather, it is like trying to sort through frustrating and similar ideas after too little or too much coffee, in which one gives way to two or three others before it is brought fully to light, so that the continual displacement prevents solid thinking but installs a very effective atmosphere. Communication is difficult.

But then I found that War Inc is more than that--maybe not much more, maybe shyly so, but its confusion accelerates into the rhythm and density that cause a real aesthetic to break out. It moves from capitalizing on the advertisement form (which I consider an extremely weak political tactic today, half a century after Warhol) to embracing the art form. The art form is less efficacious, less popular, it is true, and perhaps the movement from a popular to a difficult discourse is intended to grease the wheels a little. The last twenty minutes or so are hard to describe: they are both rich in the events/diegesis that had been heretofore muffled and even more saturated with an atmosphere that has suddenly proven itself fecund for human life. It put me in the mind of Godard's Les Carabiniers and Natural Born Killers, though it does not equal either of those, or more obviously Dr. Strangelove. It's difficult to say whether the filmmakers wanted the intensity of the surreal or feared it; since they are Americans, I can only assume the answer is both.

Monday, August 3, 2009

To rationalize or not to rationalize

Animal Person has a great post up on the nigh-intentional stupidity of the reigning discourse on farm animals versus pets (I originally typed "poets" instead of "pets," which is sort of where I'm going with this). David Scott, the chairman of the Livestock, Dairy and Poultry Subcommittee of the House Committee on Agriculture, grants some empirical validity to the foundation for animal activism, but then backtracks just as quickly into what seems to me to be sheer anti-logic. My point isn't to castigate him for being, in my view, a dumb man, but to note how this divides the two main responses I get from meat eaters.

It seems to me that people of some sophistication grant everything I say as true and accept that they are "morally" in the wrong in a logical sense but still in the right in a social sense. People of less sophistication seem more willing to argue that I am wrong, or that there has to be some loophole or categorical difference that will cause social and logico-ethical standards to align. Now, this might just be a permutation of the general division of the labor of combat, assigning ironic deference to the ruling class and the task of violence to the underclass--and this is the general explanation I would advance. But at the same time, the categories for dividing these strategies of judgment fall under the heading of Art or aesthetics, so that even if we are dealing with, at root, a sociological explanation, this explanation would itself be split between the dialectical/argumentative and antinomic/ironic methods exhibited by the question. That a sociological explanation needs to borrow from aesthetics certainly does it no discredit--many aestheticians wish someone would find a use for their work. What is discredited, first and last, is the belief that animals or art--pets or poets--can be subsumed under a single method of appreciation. They share a fate.

Sunday, August 2, 2009

"Day of the Dead" x2

"Day of the Dead" is one of the great missed opportunities of zombie art. "Night of the Living Dead" is great, "Dawn of the Dead" takes the aesthetic a step further, and "Day" would have gone all the way in expanding the critique of American culture. For those who haven't seen it--and there are plenty, as "Day" is rightly ignored by those not immersed in zombie stuff--"Day" takes place well after zombies have kicked the shit out of humanity. The film's protagonists are a small group of soldiers and scientists locked in an underground bunker/laboratory where they selectively capture zombies for experimentation. The soldiers are guided by the worst chauvinist impulses, threatening the female scientist and the very project of science throughout. If "Dawn" issued a sneering image of consumerism, "Day" assaulted the union of dromocracy and science as the progenitors of knowledge-power in its simplest, cruelest, and most inhuman form. What's worse, and more brilliant, is that the procedures of capture and scientific torture make the viewer sympathetic to the zombies. In idea if not in execution, Romero's "Day of the Dead" is in the best critical zombie tradition.

So it was with mixed emotions that I spotted a remake of "Day" while perusing the Redbox by our house for "Coraline." One of the things that stands out is that the "Day" remake stars Ving Rhames, one of the principals in the horrible remake of "Dawn" a few years ago. "Dawn" did not need remaking, but I can see room for improvement in "Day." The remake also has a few semi-recognizable actors: Nick Cannon (Wild 'N Out, Drumline), Mena Suvari (American Beauty!), and AnnaLynne McCord (Nip/Tuck and the remake of 90210). If someone wanted to do penance for the remake of "Dawn," this would be the way.

Obviously they chose instead to make a total piece of shit. But let's be objective about it.

The first two-thirds of the remake is the initial zombie outbreak. The version of zombie powers used here (as well as the cinematography, lighting, etc.) lines up with the remake of "Dawn," so it's pretty boring. In fact, it is the same basic "whoa, something bad is happening and we don't know why" story that goes unspoken in the genre. What Romero had accomplished in his zombie trilogy going into "Day" was the ability to make a sci-fi movie while avoiding exposition of what cannot be explained--that is, to tell a story dependent on counterfactuals without being drawn into the cycle of explaining the unexplainable. (Even in "Dawn" he is able to confine the exposition to a montage in the first five minutes.) The remake, however, spends most of its screen time on exactly the exposition whose avoidance was one of the crucial, subtle accomplishments of Romero's third zombie movie.

This is even more outrageous when we consider that the zombie movie is today well recognized--so well recognized that new incarnations have to be tweaked in some way to be viable--and that this film clearly follows on the remake of "Dawn," which would at the very, very minimum provide a starting point for a follow-up. Alas, the will to repeat that creates the remake is also an atavistic will to unnecessary expository parataxis. If the characters and their lives were interesting, that would be one thing--in fact, that would be Romero's recent "Diary of the Dead"--but this movie is shit, so we have two-dimensional horror cutouts. Which is acceptable in the genre, but not when the external circumstances do not offer a substitute for the subjectivity of the characters.

At last our band of survivors find themselves in an underground military bunker/laboratory (!) where some government conglomerate had been developing bioweapons. I missed some of the detail at this point because I realized doing the dishes was a better use of my time. In short, a couple people die, including Nick Cannon, who was the only one holding this thing together, a couple people live, and they blow the hell out of the zombies and drive away. A radio reports that order has been restored, but then a zombie jumps up in front of the camera so we know that more is to come. Where the remake of "Dawn" ended nihilistically with the slaughter of all principals during the credits, contrasting with the parenthetical promise of Romero's final cut (an earlier version had all the characters dying), this remake is relatively hopeful. The zombie plague is an airborne virus, but some people are immune and it seems localized. That movie about Ebola with the monkeys had a worse scenario, and that didn't kill us all. Romero's "Day" was awesomely nihilistic. So, on both occasions the remakes of Romero's films have reversed the tonality of the endings so as to display a total ignorance of the thematic context of the film. Escape is thematically possible in Romero's "Dawn" because it is in the nature of the mall to be reiterated, traveled between, variant but within the great circulation of capital--one must leave the mall, but with the grim awareness that all one can hope for is to find another mall. In contrast, the laboratory and military science repeat, but they repeat exactly. Like empire, the laboratory is endless and undifferentiated. Romero was right to make "Day" a basically depressing movie where revenge is the only pleasure, and the ending is not so much evidence of pessimism as dictated by the material.

What do we make of this final image: the survivors pull out of sight, into the Colorado mountain backdrop, and a zombie leaps up directly in front of the camera to shake its head (menacingly?). Obviously, it contravenes the radio voice-over: the event is not fully contained. But to where will it spread? Some other vaguely named small town in Colorado, from which its inhabitants dream of escaping to an "anywhere" of the mountain sunset or, in the case of Suvari's character, the US military? The marker of spatial specificity here is the zombie itself: it stands in direct relation to the camera. But it too is reduced (or elevated) to an "anywhereness" as a film effect directed at the viewer. Wherever you are, the zombie is addressing you. The zombie in this scene, then, is pulled in two directions: out of the screen, into a somewhat abashed address to a viewer it cannot read but who can examine it (like Romero's scientists), and back into the film, back to the woodsy periphery of the town, back to an origin point that might be Anywhere, USA, but is the belated anchor for the zombie mythos. The zombie might be transported to any screen or imagined in any shadowy field, hospital, or friendly face, but at this moment it stands at a determinate distance from the camera that fixes it for examination. Even the zombie has become a "character" in the sad sense this holds for unimaginative horror movies.