Yesterday I got out to see Avatar at the theater. I have to say, I have not been that immersed in a film environment since I saw Star Wars as a kid. With all due credit to Cameron and his cinematography team (not so much the actors, however), I don't think my response was simply a matter of cool visuals. Or rather, it had to do with the visuals, and with the production team's ingenuity, but not in the way these comprise the productive forces of most other films. What was unusual about my experience with Avatar was that the cultural studies hemisphere of my brain--the skeptical, ideology-critiquing, perpetually discontent half--was at one with the nihilistic, pleasure-seeking half. That union is worth thinking about, both for the viewership of Avatar and for the future of cinema.
The movie quite obviously calls attention to what an 'avatar' is, what it means to inhabit, act through, and be affected through another's body. You don't need a Judith Butler bibliography to figure that out. Nor is it especially coy about linking various layers of avatarism. The perfect form is the one that provides the jumping-off point for the film's narrative: the transfer of a consciousness from one whole body to another. This is then weighed against the avatarism of a human controlling a robot body from within, or controlling a robot body from a distance; or, in the other direction, the philosophical avatarism in which the universal (or life force) takes shape through the particular (or organismic). These are problems worked on within the film, but raised to such a pitch that one almost has to confront the avatarism of film itself. Precisely to the extent that the protagonist effectively inhabits his Na'vi avatar, the viewer does the same. Lapses in affect--body and mind as a unitary being--are the criterion by which the viewer judges cinema. Even the film's failures (like its unoriginal plot, Fern Gully Redux) are intelligible within its formal success at the level of technicity.
I wouldn't recommend this movie on the basis of its narrative elements. Nor would I recommend approaching cinema in general as if it were a medium of pure narrative. Avatar is important because it basically accomplishes a theoretical trajectory set out early in the twentieth century concerning what it means to have a body and the possibility of exchanging that body--that mass of flesh strangely dis/obedient and inescapable--for another. Cinema has gestured toward this exchange for the whole of its existence; so has literature. We are now extremely close. Avatar points to the technological, factical gap that still separates us from 'literally' inhabiting another body, but it also performs the fallacy of that distance. That form of difference belongs to an old and dead imaginary of the body (Mary Shelley never saw cinema). It is not so much 'technology' (some imaginary externality belonging to the gods) that has changed as it is us, by virtue of the actual existence of technicity as something embedded in humanity. Avatarism is the new affect. As tangential evidence, I would also point to the huge effect Cameron had in determining affective modes with Titanic (the young female Werther of its day) and Terminator.
Wednesday, December 23, 2009
Friday, December 11, 2009
Gesamtkunstwerk 2.0
The "work of art" was a hypothesis that was finally disproven around the time of Benjamin. The Gesamtkunstwerk offers something different, however, than an addition of the forces of individual works of art. As Engels would point out, it separates itself qualitatively by its unique level of accumulation. Nietzsche was right to cut down Wagner's resuscitation of a political metaphysic via the Gesamtkunstwerk, but the idea anticipates the form that aesthetic productivity must take in the next generation of capitalism (ie, the world we were born into). What we are close to realizing is the all-encompassing advertisement, in which every object is actively being pitched by an investor (and hence behind every object there stands a capital flow underwriting the venture) and every presentation of objects takes the form of an engaging narrative.
These kinds of ads were previously considered "high concept" but they are becoming the norm; or if the skill of narrative carries too high a price, they can at least attain the anxious ambivalence of insecure and sarcastic youth. On the other front, NBC's 30 Rock has done the best work to ironically incorporate, as in a vacuole, the pleasures and perils of the discourse of advertising. But that model and its opposition are bound to collapse, leaving us with a life that has both the curious twists of fate of an unrevealed novel and the opportunities for buying back tokens of our life in every atomic aggregation.
Thursday, December 10, 2009
Horizon
I was looking at the horizon for a long time and had to question how I would see it without knowledge of the spherical earth. The paradox I see is this: the horizon is the marker for where we cannot see any more, but we can't see 'nothing.' It is the visible non-visible. This is an important phenomenon for the development of the human mind. In the spherical model we tell ourselves we see the atmosphere as it overshoots and wraps around the solid world, the sea disappearing by virtue of its curvature. Yet as it appears in human phenomenology this is a line, not a curve, and looking out to sea or across vast fields it is one of the flattest naturally occurring lines. (Seriously, where else in nature would we find something approximating not only the flatness of the line but its geometric being as an indefinite series of contiguous points?) Where two discrete elements meet we cease to see, and so conceive of their difference in a specific situation of the visible non-visible. Looking past this, however, without a knowledge of outer space, there is a re-circling of the elements as they continue their trajectories past the border of the failure of sight.
Sunday, December 6, 2009
Dictatorship of the Vegetarian
Craig has his excellent post up on animal rights and anthropocentrism at The Inhumanities, in which he makes a distinction between State societies which believe that 1) they are the model of the human and that 2) they should exterminate nonhumanity, in contrast to non-state societies which share 1) but not 2). The conclusion is that anthropocentrism is not the problem per se, but how that fairly common attribute of culture reacts with State organization.
I like how Craig has linked veganism to historical phenomena, ie, the State, providing a target for opposition larger than the boycott but smaller than Christian ontology. This historicization also makes available a number of resources developed for resisting the State.
What comes to my mind, in particular, is the doctrine of the dictatorship of the proletariat. I'd put this in parallel with the ethical meat position that aims to regenerate something like the relation to killing animals we find in non-State societies--which, at the least, simply cannot kill animals at the same scale as the factory farming of State societies. (Cleaning dead bodies in an artisanal rather than Fordist set-up is neither easy nor quick, and constitutes skilled rather than unskilled labor.) If such a re-ritualized valorization of meat eating is what ethical meat holds as its normative social goal, the road to this as other than an exception or fetish (ie, as a social mode rather than a commodity) requires a theory of how one gets from State to non-State relations to animals and meat. If the State is directly attached to the social forms of mass meat eating (the factory farm, the normalization of plentiful, cheap meat), then ethical meat maintains a fantasy of the dictatorship of the vegetarian that would provide the link between the present and a society in which ethical meat recaptures its ritual meaning.
Discourse on ethical consumption recapitulates the Marxist argument that in a communist society labor wouldn't be alienated, or we wouldn't have a totalizing instrumental view of nature, or objects wouldn't confront us as commodities--that in whatever way, something we occasionally access today as an exception invested with fetish value would be normalized and de-fetishized. Following Craig's astute diagnosis of the entanglements of State and meat, we can see that half-way measures like ethical meat don't really believe in themselves except insofar as they believe that vegetarianism is the condition of their reason.
Tuesday, September 22, 2009
Farmville
If you use Facebook you have probably seen ads for Farmville or dabbled in it yourself. If you don't use the F-book you have no idea what I'm talking about.
Farmville is a game you can play for free through Facebook. You get a gridded piece of virtual land. You can plant an increasing array of virtual crops as you progress in experience, as well as acquire livestock and ornaments and plant trees. Crops, animals, and trees give you money. They are simple, risk-free investments. It is absurdly addictive.
The game progresses as a kind of well-graphicked algebra problem in which you, the player, try to figure out which plants--factoring in their time till harvest, their cost, and their payoff--you want to put in the ground. It is the way capitalism would work if it were a single-variable problem. Farmville is only about supply. There is an infinite and continuous demand, so if you grow all soy beans all the time you get the same return every time. Isn't that wonderful? By the same token, there can be no exploitation of crop failures, of speculators, of market manipulation. You can't game the system.
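Since demand is infinite and risk is zero, the whole "algebra problem" collapses into a one-line comparison: coins earned per hour, per plot. A minimal sketch of that calculation follows; the crop names and numbers are hypothetical stand-ins, not actual Farmville values.

```python
# Single-variable crop choice: with infinite demand and no risk,
# the only question is which crop yields the most coins per hour.
# All crop names and numbers below are invented for illustration.
crops = {
    # name: (seed_cost, harvest_payoff, hours_till_harvest)
    "soybeans": (15, 30, 4),
    "eggplant": (25, 88, 48),
    "raspberries": (20, 46, 2),
}

def profit_per_hour(cost, payoff, hours):
    """Net coins per hour for one plot of this crop."""
    return (payoff - cost) / hours

best = max(crops, key=lambda name: profit_per_hour(*crops[name]))
for name, (cost, payoff, hours) in crops.items():
    print(f"{name}: {profit_per_hour(cost, payoff, hours):.2f} coins/hour")
print("plant:", best)
```

That is the entire economy: no prices moving against you, no harvest failing, nothing to hedge. Optimal play is just sorting a list.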
My personal feeling toward Farmville is that it is a harmless addiction that restores logic and order to my largely disordered and anxious existence. (Weightlifting works in a similar way but, as with all empirical processes, is given to moments of independent fluctuation.) I am not sure what to make of it as a critical or reactionary program. As I have shown, it both idealizes capitalism as the best possible world and slaps down the actual working of capitalism as anathema to that ideal. In the same way, one can continually harvest products from virtual trees and animals without destroying them (milking the cows, getting eggs from chickens, finding truffles with the pigs). This is both deeply idealistic and ideological. To borrow from the Beach Boys, yeah, that would be nice--but does imagining it bring us closer or push us farther? And does the addictive pull to return to the screen, to plan one's virtual agriculture around a relationship to the computer, similarly entail a coming together between a carbon- and a silicon-based operating system? Or does it underscore how an organic veneer is necessary to allow the one to "pass" in the world of the other? Whatever the answer to these questions, many, many people are living that experience right now.
Monday, September 21, 2009
In Praise of Particle Board
Particle board is the material of choice for cheap furniture. Venture into a dorm or bachelor pad and you will likely find particle board products. It is the lowest form of wood, lower even than plywood (which at least has substantial ruggedness to recommend its otherwise unattractive appearance). Particle board has only its cheapness, and that it wears a veneer well.
Like many people I was inclined to look down on PB. It displaces real, beautiful wood, and the kind of crap easily made with PB displaces real carpenters. Beholding a wooden stool made with good craftsmanship and good materials is a glory.
I was raised in this cult of quality. Now I am reconsidering whether I judged its foe too harshly. PB is made mostly out of waste product. That's why it is so cheap. Isn't there a quiet dignity in reclaiming what the quest for greatness discards? And isn't there something truthful about a material that really wears its economization on its laminate sleeve? There is no aura to the thing made of PB; it is an undisguised simulacrum. It might imitate a granite counter top or a cherry cabinet (with the help of a more attractive sheath), but once you commit to PB rather than the real thing you have given up the social capital game. The question becomes not: what do others think of this, and of me as a result? but: what do I think about the appearance of this thing? Because if you scratch the surface, it's right there. There is no fetish value there, it's just there there. No symbolic filler, just literal filler. I feel deeply related to this subpar material, recuperated from the ashbin of authenticity, that is worthless inside but effective on the surface.
Wednesday, September 16, 2009
Freud, Animals, and Smell
Where does repression come from? Part of its origin, says Freud, is physiological. As hominids became erect our noses moved away from our genitals and anuses. At the same time, we began to rely more on sight and less on smell; our sense of smell withered in proportion to our strengthening eyes. Our once easy familiarity with our sexual zone became uneasy. What happens first as evolutionary biology repeats as evolutionary psychology: humans are distinguished from animals not just by our erect posture, strong eyes, and weak noses, but by a mental economy of not knowing what we know. Out of a certain defensiveness about (ostensibly) not knowing sex, morality is born.
So says Freud, if you can forgive my rough paraphrase. I think we are, today, generally skeptical of any story of this kind that roots the discoveries of a certain age (ie repression, Oedipus, etc) in a pre-historical proto-humanity. What I see of use in this story is a wedge for a particular critique of Freud. When he talks about the strong sense of smell of primitive or pre-humanity he is talking about animality, a general concept grounding the concept of humanity; but he is also talking about lots of species coeval with homo sapiens. Dogs are the obvious example, especially as they are distinctly "within" the polis, and in Freud's case within his office during sessions. But when we look on dogs and their amazing ability to track by scent, to approach the world through a different lens than humans--and the ineffability of butt-sniffing to the weak-nosed human--we don't see the Id incarnate. The history/prehistory divide set up in Freud's story introduces an unnecessary binarism that would divide ego from id and uncritically attach values based on a division that is at best heuristic (this tendency persists today in research claiming that language-oriented dogs are "smarter" than scent-oriented ones). The "uncanny," the general darkness of the libidinal region, are results of the very vertico-centricity that Freud is criticizing. It seems more the case that the ego as the human form of consciousness does not have one other (sex) but many, and the empirical evidence of this many lies quite simply in the lives of animals. Thus the Id is not frightening, as any dyadic Other must at first appear, but different in a non-competitive way (non-competitive because framed in a wide and generally flat field, rather than the top-bottom orientation of any two-term set).
From here it is not a long distance to an Anti-Oedipal reading of multiplicity in the unconscious, with the advantage that actual animals are irreducibly included, as actual and not symbolic, in the dialectic of human selfhood.
Sunday, September 13, 2009
UC Graduate Student Walkout
As everyone knows, the UC system is in a lot of trouble. Less widely discussed is why, and whose vision of higher education this crisis benefits. Perhaps even less mentioned is what can be and is being done to avoid total capitulation by faculty and students to the Regents. Please read the letter below being circulated by UC grad students and visit the sites:
Faculty Walkout
Graduate Student Walkout
Dear Professor XXX,
I write to express my solidarity with the striking UPTE workers and UC Faculty pushing for a system-wide walkout on the first day of class on 9/24. In advance of this date, I want to let you know of my intention not to cross any picket line. The emergency powers recently seized by the University of California Office of the President—not to mention the Administration's heavy handed budget decisions made under cover of summer vacation and holiday weekends—are unacceptable from any perspective within the UC system. This new thrust of long-standing trends toward privatization makes a farce of the University's stated mission of providing an accessible and quality public education for the youth of California. As educators, students and workers, we all have a stake in fighting for this dream against the prevailing corporate cynicism of the Chancellors and Office of the President.
Along with my fellow graduate students I have witnessed steep cutbacks in TAships, departmental funding decreases, fee hikes and dwindling job prospects. These new cutbacks threaten graduate students, who already have staggeringly high levels of debt, with the prospect of real financial ruin along the path to completing their degree programs. Assisting our professors in instructing undergraduates grows more difficult with each over-crowded classroom and every bloated discussion section that the administrators force upon us. We are asked to take the hit for the financial crisis while those charged with managing the budget reject significant cuts in their own large salaries and, remarkably, refuse public disclosure of the budget itself. For these reasons and many more, I support the actions and demands of the UC Faculty Walkout, which must ensure that the University of California will not be "business as usual" on 9/24. On behalf of a growing contingent of graduate students (http://www.gradstudentstoppage.com/ ), I strongly encourage you to make the decision to walk out and sign the open letter if you have not already done so. That open letter and signatory page is here: http://ucfacultywalkout.com/
I strongly believe that this faculty walkout represents an important exercise in pedagogy: disruption is an essential component of all critical thought and all advances in human knowledge. Towards this end, I welcome the opportunity to discuss ways of including our undergraduates in this day of action. It is of utmost importance that we don’t punish undergraduates who choose to walk out in support of faculty on the first day, so we may want to discuss postponing attendance, permission codes and enrollment until the next scheduled day of class. In the days to come, building solidarity and creatively collaborating on pedagogical resistance will be essential to defending—more than just our individual positions—the very principle of a free and public education against the vicious and failed ideologies of corporatization and privatization. I hope this letter is only the beginning of an ongoing dialogue between us about these issues.
In solidarity,
XX
Feel free to re-post or publicize this effort in any way possible.
Saturday, September 12, 2009
Have you seen "Brazil"!
I just stumbled across Brazil, a kind of playful reworking of the Orwellian nightmare. Somewhere between Blue Velvet and Willy Wonka, it tickles the absurdity drive and anxiety drive equally. What I enjoyed about Brazil--other than its simulations of the visual rhetoric of a control society, the euphemisms of the age of universal terrorism, its self-cinematizing lack of emotional manipulation--was that it did not allow itself to fantasize about an actual "outside" of the system. The total bureaucracy that the protagonist finds himself within is not a Megatronic evil entity one can front, fight, or flee from. It is society itself.
Furthermore, the bureaucracy is not apart from the protagonist. He is not the messianic possessor of truth and light in a world benighted by paper work. Rather, the absolute bureaucracy is a fantasy of the will to artistic resistance. We see him as the figure of light, quite literally, in his fantasy world. By splitting between the protagonist's objective experience with his bosses, mother, desk, and papers, and his subjective myth-fantasy of flight, dueling, salvation, self-knowledge, the film shows that the revolutionary interpretation of narratives about this kind of world is precisely the means by which one fails to cognize that world as itself.
Weber gives a theorist's account of this beautiful monstrosity (if we use Kant's terms the perfect bureaucracy is both beautiful and sublime). Kafka is sainted for it, then Orwell. Brazil gives a generic reading of this fantasy: the image of the perfect bureaucracy arises from the desire for aesthetic resistance. Only as opposed to bureaucracy can the modern artist conceive of himself in Icarian terms (Lowry's fantasy begins as Icarus, flying toward the light above the clouds). Once at a show in Gainesville a pamphlet was distributed that appeared to be an insurance form. The text, however, explained that by typing within that format the author was able to escape detection at work and spend his or her time writing the manifesto that followed. The formalism of bureaucracy, the contentless gray race it engages, is what allows content--dreams, revolutions, individuals, objects--to appear by contrast.
Brazil's plot is above all about a state of terrorism. "Have you ever seen an actual terrorist?" Lowry is asked. He has not, and the only "terrorist" we have seen is a rogue repairman (played by Robert De Niro, even though he appears for maybe 10 minutes out of 140). But we have seen terrorism: explosions in restaurants and shopping malls that affect mostly the upper class, and the (counter-)terrorist raids that leave the tenements of the poor in ever worse repair (not to mention dragging them away for "information retrieval," ie, torture). Terrorism, like the liberation Lowry inchoately longs for, is a milieu, the in-between; it vanishes in the graspable. The mit, not the Sein. Lowry's demise seems to reinforce the nihilistic reading. De Niro's anarchistic repairman, however, is another version of rebellion. He has a specific goal: making stuff work. He operates anarchistically because it is more functional. "Why don't you work for central services?" Lowry asks. "I became a heating engineer for the action, not the paperwork," he replies. De Niro's repairman, however, does not get a macronarrative. There is no end in sight of a time when everything is fixed. That sounds about right.
Furthermore, the bureaucracy is not apart from the protagonist. He is not the messianic possessor of truth and light in a world benighted by paper work. Rather, the absolute bureaucracy is a fantasy of the will to artistic resistance. We see him as the figure of light, quite literally, in his fantasy world. By splitting between the protagonist's objective experience with his bosses, mother, desk, and papers, and his subjective myth-fantasy of flight, dueling, salvation, self-knowledge, the film shows that the revolutionary interpretation of narratives about this kind of world is precisely the means by which one fails to cognize that world as itself.
Friday, September 11, 2009
Virginia Woolf
Today I got The Complete Shorter Fiction of Virginia Woolf in the mail. I'd bought it because I thought it contained "Flush," a novella about Elizabeth Barrett Browning's spaniel. Any story centered on an animal is of interest to me. (You can actually find a nice copy of the full text free online; I just like physical books.) "Flush," however, is not included, though in one of the appendices there are some nice fragments about a dog and monkeys (separate fragments). I also enjoyed a story about a woman named "V.," as it was oddly similar to Pynchon's novel/character of the same name in thematizing cyborgs, dispersed agents, and undeath as a unit. The later, more typical stories were less to my liking because they seemed to have a back-door, almost Heideggerian reinstatement of the human. Woolf is fantastic at disaggregating the flitting about of the subject in the phenomenological field and capturing each moment as both with and without relation to those adjacent. On one hand, then, this dismantles "the subject"--and while "dismantle" is something of a metaphor here, on the path from modernism/Woolf to postmodernism/Pynchon it is quite literal: Pynchon's V. is made mostly of mechanical parts and her death scene is a literal dismantling. This centerless or defiltered flow is also more likely to admit those object-agents traditionally disqualified even from supporting roles, relegated to furniture or mise-en-scene. However, as Woolf makes clear in her early story "Phyllis and Rosamond," the intent is anthropological: "Let each man, I heard it said the other day, write down the details of a day's work; posterity will be as glad of the catalogue as we should be if we had such a record of how the door keeper at the Globe [passed his day]....And as such portraits as we have are invariably of the male sex...it seems worth while to take as model one of those many women who cluster in the shade" (17). The focal point of the phenomenological constellation is a new and better human.
This is not to say that Woolf is only an anthropologist, or even that her contribution to the reformation of literary anthropology is negative, but that her driving continuity is torn between these visions. She is very much at the forefront of modernism. Heidegger's hands, too, know not what the other is doing. Ulysses marks its uncertainty about animals in its mythic-lawmaking structure with sudden reflections on cannibalism. Faulkner's novella "The Bear" in Go Down, Moses is about how a hybrid hunting dog is the only way for humanity to encounter the sublimity of Nature's ferocity. If "modernity" is a term too easily placed in certain narratives of the suppression and enclosure of animals, maybe the shorter periodizations of literature can highlight how modernity folds in on itself; how each progressive revolution undercuts itself with regard to its animals and its self as animal. That would be a big project. It can at least be begun with due specificity in texts like "Flush."
Friday, August 28, 2009
New Normatives?
One of the normative moments in Donna Haraway's When Species Meet pertains to the encounter between organisms, humans and baboons for example, or humans and dogs. (We could add between dogs and baboons as well; that just isn't the kind of question humans tend to cogitate on.) She recognizes, and advocates recognizing, cross-species transactionalism. The Other (dog, baboon, etc.) hails us; we can either respond--and perhaps succeed, perhaps fail, but at least gain recognition as something to which one signifies--or be outside the world of communication. We wouldn't just be poor in world (as we might be if we did a very bad job responding to the Other, and they were as egocentric as we) but we would be without world, as stone.
Haraway's ethics of the encounter seems to me to be something like a Levinasian/Derridean ethics of the singularity of the Other, except mutated by an intractable case of immanence or "concrescence," Haraway's preferred word (intended with full Whiteheadian connotation). With that infection comes a body of knowledges as well, what I lump together under the heading comparative ethology and what others might distill into anthropology, sociology, biologies of all kinds. Haraway's big improvement is to reattach those physical knowledges to the abstraction of deconstructive ethics. No small feat, and one she accomplishes with panache.
Now the question that keeps bugging me is how we can extend the encounter beyond the macrosystems called organisms. Baboons and dogs, after all, are pretty gigantic clusters of physical processes. They are a lot more like us than the vast majority of matter or even the vast majority of life. Their social codes aren't quite intuitive but they are not outside of the structural ken of the megaorganism called humanity. However, the speculative realists have made "massively unavoidable" the issue of all that other stuff that is not improbably similar to us (Derrida uses "massively unavoidable" to describe the animal question in Specters of Marx and it has a ring to it that I like).
The process of hailing and having face is not impossible to hook up with nonhuman animals (For more on this head over to The Inhumanities for a discussion of Matthew Calarco's Zoographies). I sign on with that a priori; I've had dogs all my life and they're not hard to understand. But what about beyond the "higher species," beyond mammals, vertebrates, animalia? I probably wouldn't be raising this as a mere theoretical contention if it weren't for the dismal state of the environment. Philosophy, we are reminded, is a language and a techne, and it addresses the problems of its time. For better or worse we are all going to have to answer to the environmental question.
So how does the ethics of the encounter stack up? How does it encounter, how does it respond to the stuff that lacks a socius? Now that the humanist prejudices have started to fall, liberating animals (at least a bunch of them) to ethics, all the other criteria or lines to be drawn seem artificial and insubstantial. I imagine we could--some have--redraw the borders of who/what is in and who/what is out, but that doesn't seem to hold water (Think of all the lifeforms that water is holding!). So, while we can draw on well developed naturalcultural eth(n)ologies to approach certain types of organisms making powerful claims on our philosophical moment, I do not see any established knowledge experience to guide us elsewhere.
We do, of course, have plenty of scientific knowledge on trees, rocks, fire, cotton (maybe not Harry Potter; that one might have to wait a minute). But none, or little, of this knowledge elucidates these objects in the mode of hailing us or of an encounter with recognizable normative dimensions. As I see it--and I may be wrong, feel free to correct me--the claims that trees put on us are pretty mild. Trees don't want to be cut down, but if they must be, they would prefer to have a space in which their seedlings can propagate. Speculatively I'd guess they prefer to have a genetic pool which would make their species more capable of weathering the kinds of plagues that befall tree populations. Cotton I would consider similarly in its life as a plant. But fire? Or rocks? Once things are inanimate the encounter ethos (or any of its predecessors) falls on hard or at least very uncertain times. How do these things hail us? I can't tell.
Drawing on Derrida, I see temporality as a partial key to this riddle. Doing the right thing is conditioned by doing something at the right (delayed but right on time) moment. Say "what's up" too soon or too late and it's worse than staying quiet. The general time frame of right action in human interaction lines up pretty well with other macroorganisms. But with trees, or fire, or--wait for it--Harry Potter, maybe the temporality is just way outside what we are used to and what we are comfortable with. For trees I think we need to think in at least fifty-year moments; for certain biochemical processes constituting other organismic scales, longer than that. (Remember how carbon-14 dating works? Among so many other concerns.) For something like Harry Potter this may extend to infinity; hence the insurmountable oddity that thinking HP as equally an object presents to many modes of thinking. Maybe temporality is only some part of a greater criterion that would have greater extensibility to inanimate beings. Maybe the final hurdle is a secular sub specie aeternitatis. This is the kind of thing Derrideans would pronounce undecidable and which Haraway would point out we are deciding all the time. It seems to me that we still need a better conceptual apparatus for making these tentative decisions.
Oh the inhumanities
There's a hot new blog I heard about called The Inhumanities run by Scu, Craig and, well, myself. We will collectively discuss texts of interest and importance to the animal studies community. Our first is to be Matthew Calarco's Zoographies. If you've read it then you know that it swiftly opens a lot of new ground in the Continental tradition's engagement with "the animal question." By "Continental tradition" I mean the big boys of the 20th C.: Heidegger, Agamben, Levinas, Derrida. If you haven't read it, hop on over to The Inhumanities and you can get both a synoptic and critical reading. It's going to be fun.
Thursday, August 27, 2009
A return to cruelty, but where?
The only claim with any merit I can see in Elisabeth Roudinesco's "dialogue" with Derrida on animals is the distaste for a world expunged of cruelty. I'm not sure how important this really is to her--it comes up late in the interview, seemingly by accident, and is preceded by what I take to be "serious" questions about food supply and the treatment of nonparadigmatic humans--but this lateness may be the sign of importance. Who knows.
ER: "I am always worried that we are moving toward the construction of a sanitized society, without passions, without conflicts, without insults or verbal violence, without any risk of death, without cruelty." (For What Tomorrow, 75)
The claim is that cruelty must be somewhere in the world, that it is something valuable to humans, and that reducing it by fiat in some places relocates it (psychically, probably geographically, and with some concern about an ontological relocation). Unlike the claim for "the necessity for industrial organization in raising and slaughtering animals, which makes it possible to prevent so many humans from starving" (71), or "the necessity for humans to eat meat" (68), the displacement-by-prohibition theory has at least some empirical validity. The questions I would put to this configuration of theory and instance are: 1) whether vegetarianism constitutes the kind of institutional prohibition which can be cited on behalf of this theory; 2) whether industrial farming is not itself the greater purveyor of an absent cruelty in our world; and 3) whether the kind of need that attaches to the idea of "cruelty" must, or even can, be met by factory farming/ingestion/harm to animals. It seems to me that the construction of an animal necessary for these processes to go forward under the law first strips the animal of the capacity to be in a relation of cruelty. Hence this economy has multiplied so radically. It's like trying to sate hunger by drinking Kool-Aid.
So: a first step toward a crueler world is seeing animals as more like persons at least to the extent that, as Derrida consistently argues, their suffering matters. There would then be a superabundance of cruelty, more than enough for Roudinesco, more than Sade himself could imagine. Roudinesco diagnoses her problem precisely: "But I prefer not to see it, even though I know that this intolerable thing exists" (71). How will we ever enjoy cruelty if our first task is not to see it? I see cruelty everywhere, it is under my fingernails. This is probably what helps to make vegetarianism so satisfying for me: it allows me (in a psychoanalytic sense of internal policing) to see more cruelty and to enjoy it without the bad conscience of the subject of law.
Oh, Google Books offers For What Tomorrow if you want to see for yourself. My characterization of Roudinesco's absurdities has actually been pretty generous compared to what she says.
Wednesday, August 26, 2009
It never stops...
I'm not sure what to say about this but thought I'd repost it, as it is of importance to everyone and of interest to at least animal or science studies folks...
Permanent Gene Therapy
Tuesday, August 25, 2009
Is Vegism an Amputation?
Sometimes vegists stress that their dietary and other consumption patterns are not restrictive in a negative sense. Vegetarian food tastes good, is easy to make, and is nutritious; I don't think anyone committed to a veg lifestyle misses what they forego. Stressing that vegism is not restrictive is, I think, designed to persuade potential recruits that our lifestyle does not diminish the quality or variety of life (it doesn't!). However, I think that viewing vegism as an "amputation" (I'm borrowing the word from a friend) actually turns the tables on the "vegism is a restriction" debate more effectively than cataloging the specific pleasures of vegism.
An amputation requires that something be lost, yes. One could use the metaphor of disease, a disease that has metastasized throughout the Human but which has its nodes and nodules of highly dangerous tissue. Maybe we can save the Human (and a new kind of humanism), maybe not. The disease metaphor illustrates analogically a situation in which amputation increases life. Not that an analogy is an argument--there are always counter analogies--but it prepares the way for an argument.
After amputating flesh-eating and the like, we have room for a prosthesis. Again, a long tradition of the Human has made persons with prosthetics less than the "full" or "whole" body. Practically if not theoretically I think we are well past the uninterrupted body and into the age of the cyborg. Eyeglasses are a pretty primitive technology and I would be dead without them; don't get me started on my interface with coffee and the coffee machine. Fulfilling the function of a lost organ is only one facet of the prosthesis. By not being identical to the other organ prostheses have other facets of being that are thereby offered to the person, culture, machine, etc, to which they are attached. Marking the prosthesis as one center of being rather than as a marginal case makes other parts of the body-machine reveal themselves as multi-faceted, non-monological, interesting and dexterous.
It is in this light, as amputation and prosthesis, that I view vegism. Something is excluded to be sure, excised or exorcised, but so as to yield a "net gain" in vitality. This is not an ethical argument in the traditional sense and will probably not convince many "persons on the street." However, I can't in good (immoral) conscience promulgate a Christian version of ethics even if it does have more short term benefits.
Wednesday, August 19, 2009
The Sausage Factory
Every time I expose myself to a media outlet these days I hear some variation on this: "Yes, there are problems in X part of healthcare reform, but we're in the sausage-making phase and things get ugly." Meaning that Americans are resistant to political change because our political process is as inherently revolting as the one Upton Sinclair describes in The Jungle. The metaphor implies that American representative democracy stands or falls on the same aesthetic criteria undergirding meat consumption. Yes, it is ugly, but the finished product is aesthetically pleasing; that's precisely why we displace and disguise the gross part. Does it also mean that our government and its mouthpieces link themselves with the ethical connotations of meat making? If I say: hey, maybe we shouldn't be making sausages either--where does that leave the governmental process? The flimsy morality of this metaphorical defense strikes me as a sign that the Obama administration has not found a new clearing for the left to establish itself but is still in its intellectual death throes. And reform, if it comes--will it be as laden with pain and remorse as the sausage?
Tuesday, August 18, 2009
Animals in danger in movies
I was watching a movie tonight, Tell No One, of the psychological thriller variety. There was a dog in the movie and I found myself very, very concerned with the fate of this dog though, as it turned out, it played a minor role and no harm came to it. But there was good reason to believe this dog was in danger, and this seized my mind. At the same time I found myself thinking how odd my anxiety was. My concern could be dismissed as me being a bleeding-heart welfarist, or whatever the going denigration is, except that my response was in the context of a genre that above all else aims to elicit tension and concern apropos the human characters. A pretty large number of people were killed and/or abused in this movie; there was good reason for me to be concerned for any and all of the characters, and indeed I was worried about them; but it was the dog that I really hoped didn't get it. Amidst a concerted effort to make my brain worry about people, I found a dog to think about.
I can't quite put my finger on why this happens, but at the least I can say that (for me) dogs puncture some vital veil between image and reality (or whatever we call this world we live in). The humans in movies I always see also as actors--if we are particularly convinced in a given role that they are not 'acting,' we praise them as good actors! And when appropriate I have the same feeling towards well-trained and physically skilled animals. But when the dog's role is just to run alongside a human protagonist and sit on the sidewalk, I don't see him as an actor. Though there is certainly some level of pre-discipline required, just as we must be constrained by language to express ourselves, I see this as not "performing" in the same sense. I might lie sometimes, and express powerful emotions at others, but neither of these capacities makes me an actor. Those are par for the course, and when a nonhuman is acting par for the course I do not see it as an image but as a being in a part of the same world from which I watch. The set on which the animal acts is continuous with the earth on which, watching, I sit, whereas the earth on which the narrative of the actors, directors, and so on takes place is not even contiguous with this world. It is a world to itself entire.
I see two sides to this. One, we see the animal as a link between image and self, and a vital one at that in a world increasingly populated by glowing screens, by images of elsewhere brought here. Two, we see this general phenomenon as evidence of a certain techne of viewing that has historical precursors. Strangely enough, I am thinking of Racine; of his contrived plots that correspond with brutal exactitude to the quasi-Aristotelian unities demanded in his day. His plays were praised, to the degree that they measured up to said unities, as rational and natural, though from another measure they are the most absurd and unnatural things conceivable. But, as Auerbach points out, the yardstick of nature is not in this case in the world at large but already within the theater. That is, if we assume the action on stage is real, how could a play staged in two or three hours possibly be taking much longer (more than 24 hours) than that? This whole apparatus only makes sense if we understand that the audience sees what is on stage as a separate but whole world, rather than an emanation from our world and reducible to it. I have no evidence that my experience of nonhuman animals in movies is widely shared, but if it is I would venture that we also share, in part, this neo-classical divorce from the image as a world that we can watch without sharing its laws. But again, the animal is the limit and signal of this phenomenology. Perhaps even the lever prying us apart.
Sunday, August 16, 2009
Requiem for a chicken
When I moved into my new living situation a couple months ago I was taking a calculated risk that I could prevent either of my two dogs from killing the three free-ranging chickens on the premises. For two months that peace held. Then, sadly, last week someone--I strongly suspect one of my dogs, but there is room for reasonable doubt--killed Fraulein Schnitzel, one of the last two surviving chickens (another had disappeared a week before) of the original half dozen or so that other dogs, coyotes, etc. had winnowed down. While it was not unheard of for a dog to kill one of the chickens on the property, it was still particularly sad to see Schnitzel go, as she was a remarkable chicken. She had what I can only describe as a crown growing up from her head, and she was uncommonly fond of interacting with humans. She was also very loud, so her absence is noticeable.
I don't bring this up as a kind of "he was a good dog" speech. Since it was my dog that probably did the deed, and Schnitzel was the favorite chicken of our property owner, I figured burying her was the least I could do. This was much harder than I had expected. The soil is completely dry. In the first place I dug there was a 2x6 buried a foot down--a nice surprise. In the second hole there was a root about two inches in diameter. It was hot as hell to boot: we're not far from the wildfires in the Santa Cruz mountains making national news. When I was done I was drenched in sweat, my back hurt, and I had two blisters filling with blood and a spot on my thumb where the skin was just floating. Such are the infirmities of the scholar. But I did feel accomplished, and somewhat reconciled to the death of Schnitzel.
Now in a way, there is no need for this complicated economy of reconciliation to the dead, "mourning" as we call it. Nothing can change the facticity of death, nor did I share a very rich emotional world with Schnitzel relative to the ones I share with some humans, dogs, and cats. Schnitzel's owner eats other chickens, so it's not like this was a transgression of the categorical imperative. (She was, for the record, upset but understanding.) I didn't do anything bad here (my dog had snuck out unbeknownst to me) and there was no expectation of penance or punishment. Death is a kind of non-event, but its peculiarity as such seems to make it all the more an event. But it doesn't seem to be a kind of event that humans have any monopoly over, either as the mourned or as mourners. I see a warped vision in the anthropology of mourning, where a ritual that exists among humans is studied as such, and in the process taken to pertain especially to humans: because knowledge of mourning contributes to anthropology, it is assumed that an actual anthropos is being excavated in the process. (Incidentally, Heidegger makes this same point with regard to scientific research building up its unacknowledged worldview in "The Age of the World Picture." I hope I didn't offend any Heideggerians with my casual characterization of death as anti-event.)
Rather, mourning seems to open up all of those boundaries that construct the nonanimal human. One acts or feels as if one has wronged the dead, or, as per Freud, imagines that one has caused his or her death, knowing that one is likely not at fault in any socially rigorous sense--and even if one is, the slate has been cleared. Mourning puts the affective dimension of ethics in flux, while foregrounding this element at the expense of the rational form of ethics. Hence we read from religious books, or stories, or poems--"Do not go gentle into that good night"--rather than the second Critique. Thus mourning also makes one "mad," temporarily outside of social mores. One might hurt oneself or do something crazy under the excusatory power of grief.
While this can be seen as the token of mourning's exceptionality, its role in erecting 20th-century anthropology has been anything but marginal. I would turn anthropological wisdom on its head: the ontological and ethical openness therein is the ground floor, or even the basement, on which a climbing and ultimately teetering "humanity" rests. If notions of the human, of our destiny as a species, are in question, what makes sense isn't more buttressing, which only makes for a more spectacular disaster down the road, but a (re)turn to what structural anthropology could have shown had it conceived of itself as comparative ethology instead.
Monday, August 10, 2009
Coraline
For those who haven't seen it, I strongly recommend Coraline, the stop-motion film based on Neil Gaiman's novel. Ostensibly it is "for kids," but this is in no way a limitation; it's more like Coraline exceeds the division between a "child" aesthetic and an "adult" aesthetic. The device that allows this encompassing movement is its supreme creepiness.
When this movie came out I had to do a lot of driving for work, so I wound up hearing an interview with the director, Henry Selick, about three times on NPR. Two things stuck with me: his insistence on doing this movie with stop-motion figures rather than CGI, and his belief that what is most terrifying is age-indiscriminate. His instincts were right, because Coraline is very effective and, I think, not too intense for kids. Maybe some kids. Why is it effective?
The materiality of the figures is part of it--a small part, in comparison with the amazing directing and effects--but it is this part that creates the atmosphere of the uncanny (E. T. A. Hoffmann's The Sandman is more than a passingly similar text, especially since Gaiman made his name with his own Sandman series; here is a short film version). The animation is done superbly, but it has inevitable glitches or discontinuities almost too small to see. In an extremely subtle way, the movie reminds us of our own materiality and mortality. But the division within the movie between the "real" world and a seductive simulacrum in which everyone has buttons sewn over their eyes hinges on the idea of being a doll or being real. Yet we know that the "real" world is also made of dolls. The chain of signification establishes a dualism between real and false (and this is a total division: ontology and ethics are synonymous) but then shows that this same division connects the screen-world to the viewer-world. Since materiality is the means by which the image reveals its construction, the fact that the viewer-world possesses "the most" materiality makes it the most frightening. Things speak, have worlds: and, as in the flickering gaps of the stop-motion animation, the mode in which things reveal their other lives is by a minor absence. The imperfect fluidity of the figures does not break down narrative or meaning or affect; one could discount it entirely, or not even perceive it. This is what is terrifying: that the alterity of things can go unnoticed by the modes of perception and meaning most dear to modern humans. We like to think that anomalies will hail our attention and direct our colonization of all possible worlds. But there may be a profound indifference in the world, one that can take or leave our participation. Coraline is a kind of Spinozism for kids, and it rightly shows that this is both marvelous and profoundly disturbing for humans, "awful" in the old sense.
Thursday, August 6, 2009
Auerbach and the para-modern
One of the offshoots of the study of the postmodern has been a new life for pre- or early modernity as a kindred spirit of our own time. The most obvious example to my eyes is the place of "science," which was mythologized as part of modernity and which has since become somewhat more horizontal to other knowledges, as it was in the emergent period of the early modern. Conversely, the kinds of claims religion can make, or the ways in which "the religious" informs judgment, have shifted around the demarcations of modernity. Without over-emphasizing the borders of "modernity," there is some usefulness to it as a periodizing device.
I recap this as an intro to two ways in which narrative today can be compared to early modern mimetics in Auerbach's Mimesis. First, Auerbach characterizes the Medieval period in terms of a strong displacement of high tragedy by the figural power of Christianity. By "figural" he means a certain relationship between the earthly and the heavenly, the present and eternity, in which all phenomena on each side ultimately correspond; and while the heavenly side is absolutely the more significant, this device gives every earthly happening, no matter how mean or human, a connection to the divine. The limit case is Dante's Comedy, in which the endowment of the earthly overwhelms and excludes the heavenly; but this is a limit case, and the drama of the Middle Ages generally did not suffer from this potential crisis--in general it enjoyed its license to earthiness with the knowledge that all filth was a sign in the cycle of redemption. Retrospectively, Dante is seen as a bridge into humanism and the general replacement of the figura as an aesthetic device. Examples of anti-figural writing are Montaigne's essays or Shakespeare--anything in which we can recognize an existential dimension.
I would argue, however, that the postmodern has rediscovered the figura in the Holocaust. The "other side" of the figural is not unrepresentable; rather, it is an inescapable part of any economy of representation. Thus Rancière, in The Future of the Image, rebuts Adorno (et al.) on the unrepresentability, or the barbarism of representation, of the Holocaust. It is instead a matter of focal distance, of making decisions and staking positions in the economy of "the visible and the sayable" (Rancière's definition of "the image"). Just as the Medieval Christian aesthetic endorsed all kinds of "creatural" or "kreatürlich" abjection in the name of the figural regime, so today art has taken on a mission in relation to dehumanization under the unannounced but understood sign of the Holocaust as a kind of "beyond."
Second point: Auerbach moves on to discuss the differences between Elizabethan and Greek tragedy. I'll boil the distinction down, as it is fairly basic to Western history: Greek tragedy presumes the audience knows the story because it is part of national myth or history; Elizabethan tragedy involves fate as a personal and existential value. The experience of Greek tragedy is thus something that is fundamentally withheld from us except insofar as we have taken the time to familiarize ourselves with an equivalent knowledge of Greek myth--and in fact, the same is now true for viewing Shakespearean productions. In general, though, our tragedies follow the model of Shakespeare's--except in rigidified low genres, like horror. While one can point to examples of "good horror movies" with well-developed characters, the weight of the output is on a mythic tragic format in which characters are basically ossified and predestined.
Differences: the post-modern viewer has retained an expectation of surprise, a distilled form of irony, which prevents the solemnity of tragedy, the condition for speech, from developing. So let us imagine stripping a horror movie of all the tricks and turns designed to trigger our startle reflexes. What will fill the time? The Greek solution is rhetoric, the well-formed speeches that take their turns but which cannot hold back the falling blades of fate. The current solution is the eloquence of viscerality, which the Greeks banned from the stage and which I doubt could have been rendered in prior ages as hyper-graphically as today, even when a butcher's harvest could have represented, with hardly any mimetic gap, the interior of the human body. I see shreds or sparkles of this neo- or post-mythic tragedy in certain zombie films, and it is here too that we find the structural conditions most ripe to exclude the surprise.
Wednesday, August 5, 2009
"War, inc."
About ten minutes into "War, Inc." my wife asked me what the movie was supposed to be about. I started rambling about Halliburton, Blackwater (rebranded as "Xe," I believe), privatization, Pinochet, and a bunch of other chestnuts pertaining to how American foreign policy is fucked up. "But what is it about?" she repeated, and I realized the movie was answering to a different "being-about" than her question. She wanted to know the plot as a linear movement in which what has happened helps us anticipate and understand what will happen; I had given a non-linear, constellational explanation, and when I tried to formulate the movie in terms of plot it came up pretty short ("John Cusack is trying to kill this guy, but not very diligently, and he is trying to get in Marisa Tomei's pants, though that has nothing to do with the premise?"). At about the halfway mark I thought I understood what was going on in "War, Inc.": it wasn't a movie, it was a holographic bumper sticker retrojecting itself into the form of the cinematic. Take fifteen-odd slogans pertaining to the true, disgusting, and heartbreaking state of American foreign policy under Bush, give them screen life, and string them together--though the metaphor of "stringing together" gives the false impression that this film is organized as vignettes. Rather, it is like trying to sort through frustrating and similar ideas after too little or too much coffee, where one idea gives way to two or three others before it is brought fully to light, so that the continual displacement prevents solid thinking but installs a very effective atmosphere. Communication is difficult.
But then I found that "War, Inc." is more than that--maybe not much more, maybe shyly so, but its confusion accelerates into the rhythm and density that cause a real aesthetic to break out. It moves from capitalizing on the advertisement form (which I consider an extremely weak political tactic today, half a century after Warhol) to embracing the art form. The art form is less efficacious, less popular, it is true, and perhaps the movement from a popular to a difficult discourse is intended to grease the wheels a little. The last twenty minutes or so are hard to describe: they are both rich in the events/diegesis that had heretofore been muffled and even more saturated with an atmosphere that has suddenly proven itself fecund for human life. It put me in mind of Godard's Les Carabiniers and Natural Born Killers, though it does not equal either of those, or, more obviously, Dr. Strangelove. It's difficult to say whether the filmmakers wanted the intensity of the surreal or feared it; since they are Americans, I can only assume the answer is both.
Monday, August 3, 2009
To rationalize or not to rationalize
Animal Person has a great post up on the nigh-intentional stupidity of the reigning discourse on farm animals versus pets (I originally typed "poets" instead of "pets," which is sort of where I'm going with this). David Scott, the chairman of the Livestock, Dairy and Poultry Subcommittee of the House Committee on Agriculture, grants some empirical validity to the foundation for animal activism, but then backtracks just as quickly into what seems to me to be sheer anti-logic. My point isn't to castigate him for being, in my view, a dumb man, but to note how this divides the two main responses I get from meat eaters.
It seems to me that people of some sophistication grant everything I say as true and accept that they are "morally" in the wrong in a logical sense but still in the right in a social sense. People of less sophistication seem more willing to argue that I am wrong, or that there has to be some loophole or categorical difference that will cause social and logico-ethical standards to align. Now, this might just be a permutation of the general division of the labor of combat, assigning ironic deference to the ruling class and the task of violence to the underclass--and this is the general explanation I would advance. But at the same time, the categories for dividing these strategies of judgment fall under the heading of Art or aesthetics, so that even if we are dealing with, at root, a sociological explanation, this explanation would itself be split between the dialectical/argumentative and antinomic/ironic methods exhibited by the question. That a sociological explanation needs to borrow from aesthetics certainly does it no discredit--many aestheticians wish someone would find a use for their work. What is discredited, first and last, is the belief that animals or art--pets or poets--can be subsumed under a single method of appreciation. They share a fate.
Sunday, August 2, 2009
"Day of the Dead" x2
"Day of the Dead" is one of the great missed opportunities of zombie art. "Night of the Living Dead" is great, "Dawn of the Dead" takes the aesthetic a step further, and "Day" would have gone all the way in expanding the critique of American culture. For those who haven't seen it--and there are plenty, as "Day" is rightly ignored by those not immersed in zombie stuff--"Day" takes place well after zombies have kicked the shit out of humanity. The film's protagonists are a small group of soldiers and scientists locked in an underground bunker/laboratory where they selectively capture zombies for experimentation. The soldiers are guided by the worst chauvinist impulses, threatening the female scientist and the very project of science throughout. If "Dawn" issued a sneering image of consumerism, "Day" assaulted the union of dromocracy and science as the progenitors of knowledge-power in its simplest, cruelest, and most inhuman form. What's worse, and more brilliant, is that the procedures of capture and scientific torture makes the viewer sympathetic to the zombies. In idea if not in execution, Romero's "Day of the Dead" is in the best critical zombie tradition.
So it was with mixed emotion that I spotted a remake of "Day" while perusing the Redbox by our house for "Coraline." One of the things that stands out is that the "Day" remake stars Ving Rhames, one of the principals in the horrible remake of "Dawn" a few years ago. "Dawn" did not need remaking, but I can see room for improvement in "Day." The remake also has a few semi-recognizable actors: Nick Cannon (Wild 'N Out, Drumline), Mena Suvari (American Beauty!), and AnnaLynne McCord (Nip/Tuck and the remake of 90210). If someone wanted to do penance for the remake of "Dawn," this would be the way.
Obviously they chose instead to make a total piece of shit. But let's be objective about it.
The first two-thirds of the remake cover the initial zombie outbreak. The version of zombie powers used here (as well as the cinematography, lighting, etc.) lines up with the remake of "Dawn," so it's pretty boring. In fact, it is the same basic "whoa, something bad is happening and we don't know why" story that goes unspoken in the genre. What Romero had accomplished in his zombie trilogy going into "Day" was the ability to make a sci-fi movie while avoiding exposition of what cannot be explained--that is, to tell a story dependent on counterfactuals without being drawn into the cycle of explaining the unexplainable. (Even in "Dawn" he is able to give an expository montage in the first five minutes.) The remake, however, spends most of its screen time on exactly the exposition that was one of the crucial, subtle accomplishments of Romero's third zombie movie to avoid.
This is even more outrageous when we consider that the zombie movie is today well recognized--so well recognized that new incarnations have to be tweaked in some way to be viable--and that this clearly follows on the remake of "Dawn," which would at the very, very minimum provide a starting point for a follow-up. Alas, the will to repeat that creates the remake is also an atavistic will to unnecessary expository parataxis. If the characters and their lives were interesting, that would be one thing--in fact, that would be Romero's recent "Diary of the Dead"--but this movie is shit, so we have two-dimensional horror cutouts. Which is acceptable in the genre, but not when the external circumstances do not offer a substitute for the subjectivity of the characters.
At last our band of survivors find themselves in an underground military bunker/laboratory (!) where some government conglomerate had been developing bioweapons. I missed some of the detail at this point because I realized doing the dishes was a better use of my time. In short, a couple people die, including Nick Cannon, who was the only one holding this thing together; a couple people live, and they blow the hell out of the zombies and drive away. A radio reports that order has been restored, but then a zombie jumps up in front of the camera so we know that more is to come. Where the remake of "Dawn" ended nihilistically with the slaughter of all principals during the credits, contrasting with the parenthetical promise of Romero's final cut (an earlier version had all characters dying), this remake is relatively hopeful. The zombie plague is an airborne virus, but some people are immune and it seems localized. That movie about Ebola with the monkeys had a worse scenario, and that didn't kill us all. Romero's "Day" was awesomely nihilistic. So, on both occasions the remakes of Romero's films have reversed the tonality of the endings so as to display a total ignorance of the thematic context of the film. Escape is thematically possible in Romero's "Dawn" because it is in the nature of the mall to be reiterated, traveled between, variant but within the great circulation of capital--one must leave the mall, but with the grim awareness that all one can hope for is to find another mall. In contrast, the laboratory and military science repeats, but it repeats exactly. Like empire, the laboratory is endless and undifferentiated. Romero was right to have "Day" be a basically depressing movie where revenge is the only pleasure, and the ending is not so much evidence of pessimism as dictated by the material.
What do we make of this final image: the survivors pull out of sight, into the Colorado mountain backdrop, and a zombie leaps up directly in front of the camera to shake its head (menacingly?). Obviously, it contravenes the radio voice-over: the event is not fully contained. But to where will it spread? Some other vaguely named small town in Colorado, from which its inhabitants dream of escaping to an "anywhere" of the mountain sunset or, in the case of Suvari's character, the US military? The marker of spatial specificity here is the zombie itself: it stands in direct relation to the camera. But it too is reduced (or elevated) to an "anywhereness" as a film effect directed to the viewer. Wherever you are, the zombie is addressing you. The zombie in this scene, then, is pulled in two directions: out of the screen, into a somewhat abashed address to a viewer it cannot read but who can examine it (like Romero's scientists), and back into the film, back to the woodsy periphery of the town, back to an origin point that might be Anywhere USA but is the belated anchor for the zombie mythos. The zombie might be transported to any screen or imagined in any shadowy field, hospital, or friendly face, but at this moment it stands at a determinate distance from the camera that fixes it for examination. Even the zombie has become a "character" in the sad sense this holds for unimaginative horror movies.
Friday, July 31, 2009
More on animals and horror
I've posted before a little on the connection between horror movies and animals. In the most general sense, my argument is that horror cinema is a necessary outlet or byproduct of social sanction for the outrageous violence of factory farming. After watching "Marley and Me," I have another piece of this puzzle to add.
The genre of sentimental boy-dog books and movies has been around for a while and to an extent is self-evident. Think "Old Yeller": it's sad as hell, about "boy" stuff, and so provides a way for young males to negotiate emotions that they are going to be expected to generally disavow as "men." "Marley and Me" is not directed at boys in particular, but I think it is probably intended as a family film. Really, it's about being a young-to-middle-aged professional, but because of the PG rating it presents that arc through the discursive possibilities of a much younger audience. The end is sad because it is inevitable and (for me) refers to pets that have already died and my living dogs who will someday die. As pedagogical, it also presents adulthood and the end of childhood as part of the inevitability of generational cycling. All this humanization through the life of a dog.
The correlative process of learning emotional restraint is, as Noël Carroll argues, ingrained in horror cinema as a ritual for teenage males. (There's nothing particularly "male" about the process he describes; it's just an empirical observation that teenage boys are the biggest fans of horror.) Adolescent males watch horror movies to practice confronting fear and mastering it; watching movies in a group then displays this mastery and/or buttresses it through communal mockery.
There's certainly a critique to be made of the repression wrought by the Old Yeller process, but I think it is also important that the existence of such documents serves to maintain that border as fragile. The memory of tears welling up is a useful reminder that despite outward appearances one retains the capacity to be moved deeply by the lives of others. A predominance of the horror mindset--seeing the mastery of horror as the mastery of affect--gives a comfort that is not so much false as dangerous. (This links up with my criticism of "Blindness" as well: its subject matter has the potential to be sad or disgusting, and it opts for the latter.)
In sum, cinema has two genres for teaching the control of public emotion (fear) and domestic emotion (love, grief) that are organized by an unspoken connection of the animal body: as object of absolute love and absolute violation.
Thursday, July 30, 2009
Michael Jackson memoriam
I saw K-Punk's essay on Michael Jackson and decided I would go ahead with the one that had been forming in my head as I walked through endless supermarket corridors the last couple weeks.
I am basically too young to know Jackson's efflorescence. I wasn't alive, or at least not aware, when he was doing his best work. I do remember when the video for "Black or White" came out because it was touted as a semi-event (I was too young to be critical, much less cynical) and was in a way self-fulfilling--the mass mobilization of cultural capital is something, even if it is promoting a lost object. In the case of MJ this was all the more so. His reclusion was something like that of another masterpiece in black and white, Citizen Kane, and the fact of him stepping away from his lugubrious throne was enough to catch attention. But more than that, the video for B&W was pushed as cutting-edge digital manipulation (the Wikipedia page says that it was previously used only in films such as Terminator). MJ might not have been at the forefront of "music" with Dangerous, but he was still at the crest of some other wave. It was not clear then whether his video was breaking ground in sheer expenditure ("the expense of spirit in a waste of shame") or in technological innovation, and this ambiguity haunts all aspects of the Jackson legacy.
The part of this haunting that interests me here is how his death has given him and his music new life, a social presence that I believe was being held back by the fact of his personal vitality. I mean "vitality" as the simple fact of living. Life was a negative value, a predicate that diminished him. The same could be said, I suppose, of any pinnacle celebrity (Elvis is the obvious example, Hitler the other), and really of any of us. But by "life" in this context I don't mean all the messy details that drag us down and sully us with their swarming demands, or the King's beer belly, garish costumes illuminated by historical hindsight, and other indignities of aging in the limelight; I mean "life" as a factical condition. In the way that life would diminish a ghost.
In the last few weeks Jackson's music has become omnipresent on the radio. A weight has been lifted: the child molestation charges and the generally off-putting weirdness of late Jackson have been paid in blood-gold, and the preferred parts of his corpus can be separated from the offal. If he challenged convention by becoming a cyborg, we can now disassemble him, like the deathless-dead body in Pynchon's V., without disgust or sentiment at the abjection of the human body. The mourning is festal, a wake: thank god, the airwaves breathe, we can stop qualifying our love of MJ and his music. It's a shame he is dead, but he is so much more alive now. He hasn't been this sonically omnipresent in decades. If anything, he's younger than ever.
The turning point for this rejuvenation is clearly his death. What else could have exonerated him from his history? What Jackson had put forward was a vision of deathlessness--not just in his own facial mask, the Neverland Ranch, media-circus rumors, the myth of the frozen king, and all the other ways he seemed to ascend from corporality to a digital heaven--but of cultural capital as unexpendable. (I am using "expenditure" here in the sense of a discharge of wealth that is not recouped dialectically, as it is in an investment.) No matter how much money Jackson wasted on personal fantasies--and he is well known to have been massively in debt--he had attained an unimpeachable place in cultural and especially musical history that could always turn its own mythic expenditure to profit. This is the point at which restricted economy touches general economy in a schematic sense; MJ gave this bloodless formula very real historical dimensions. The initial, modernist question--can a person of vast wealth forestall death indefinitely à la Howard Hughes--has passed its zenith and has reformulated itself: can such a person die? With the archival, financial, and media technologies that allowed "Michael Jackson" to exist, the answer is no.
Tuesday, July 28, 2009
"Blindness"
The premise in the movie "Blindness" (based on the novel by José Saramago, which I haven't read) is great. People start going blind without cause or cure. Julianne Moore's character doesn't. The result is a zombie movie without disembowelment--all the strengths and weaknesses of the human soul, and the societies it can sustain, are drawn to the surface. It's pretty ugly for a while, but not in the way that I enjoy. Nor is this movie well made by almost any measure I would use. The pacing is uneven and sluggish, the emotions seem forced, the crimes are hideous, the connections stereotypical or unnatural. Plus blind people boycotted it for being derogatory.
It's very similar to "Changeling," actually, another movie I didn't like, because it insists on the significance of representing the worst crimes as an ameliorative for the instincts and conditions that breed them. Yes, rape is wrong; yes, greed that kills is wrong. The problem is that in bringing our blood to a boil over such high crimes, this film--and so many like it--misses the iceberg that sinks the ship. The question is how to represent systems (or networks, if one prefers) over the established shorthand of rape-patriarchy, avarice-capitalism (Slumdog Millionaire is another example).
This was precisely why I had been excited by the premise of "Blindness": what I am calling for is a kind of blindness. The reason actual blind people protested the movie was that it represents blindness as totally debilitating (and to some extent morally corrupting). It's not. There is a particular irony at the beginning of the film in that one of the characters wears sunglasses all the time; I assumed she was blind and the filmmakers were showing how capable blind people can be. Not so. She was a prostitute. Sight and shame are never disarticulated in this film, and so it misses the big picture.
Sunday, July 26, 2009
Hybrids right and left
Two fabulous stories in the news of late: first, that Senator Brownback (R-KS) has introduced a bill to ban "part-human, part-animal creatures, which are created in laboratories, and blur the line between species." (I linked to the Huffington Post blippet because it links to the other relevant data.) Second, the recent concern about military robots running amok feasting on the flesh of the living. (Levi first brought this to my attention, but I soon heard about it on NPR's "Wait Wait... Don't Tell Me!"--which last week made reference to Sen. Brownback's 'Mermaid bill' as well.)
Now, like most people, my gut reactions are that robots driven to consume flesh are bad, and that bills banning human-animal hybrids are silly. But on both of these issues there are points to be made contrary to one's political intuition. The EATR robot awakens Terminator-scenario nightmares, but on the other hand it is "green" in its energy source. Sure, it feeds on organic matter, but where do you think petroleum comes from?
As for the latter issue, it does not seem far-fetched to me--much less far-fetched, in fact, than the apocalyptic EATR scenario--that biotech corporations would invent human-animal hybrids that exist only for profit (both monetary and knowledge-power) and that have no legal existence or protection. Donna Haraway brought some attention to this in her essay on the oncomouse (the mouse designed to grow cancer). The purpose of the oncomouse is to model human bodily conditions through the body of a nonhuman. It would be much more effective for R&D to just have a version of the human body that is nonhuman.
At the same time, there is the more theoretically serious issue that Brownback is reifying "human" and "animal" as an actual opposition and denying the dignity (and pervasiveness) of hybridity. In either case, laughing at Brownback for trying to ban centaurs and mermaids is stupid and politically ignorant. Laughing at Brownback is probably not a bad idea, generally speaking, but that does not take him out of the Senate.
Thursday, July 23, 2009
"Milk"
An important part of the setting of Melville's "Benito Cereno" is that it takes place in the 1790s, while Melville is writing in the latter half of the nineteenth century. I point this out because, from the modern reader's perspective it is easy to overlook, and from the scholar's perspective it is critical to understanding the refractory gaze that Melville means to cast on the institution of slavery in the U.S. It is a kind of condensation of everything that Marxist literary critics, especially of the Jamesonian bent, would insist on in methodology. Always historicize. Critique of the present proceeds through the representations of the past. History doesn't repeat itself but it does rhyme--wait, that's Twain--I mean, there is a dialectical process that conserves macrostructures and repeats superstructural cycles. "Milk" has that eerie "Benito Cereno" quality of the present existing as more fully itself in a representation of the past.
At a gut level this is simply depressing for me, most of all because I have just moved to California and must bear the albatross of Prop 8. (On the other hand, I moved from Virginia, where an equally shameful if less unexpected law was also passed recently.) I visited San Francisco the day before I watched "Milk." If some legal battles for gay rights have been won in the intervening years, the sense of a political struggle has faded in contrast to Harvey Milk's clear-sightedness. Linking this with a broader historical narrative, it is the sense of political struggle in general that has faded--compare with the incremental erosion of abortion rights.
But what I'm more interested in than a jeremiad on the decline of political thinking is what "Benito Cereno" might mean for its methodological inheritors--not so much Jameson and the critical crowd as artists and art objects. The premise of "Milk" as somewhat authentic must be that it has political effectivity. (Of course, it might be politically meaningless, bread and circuses--but let's just suspend and disavow that interpretation for the time being.) Like "Benito Cereno," we are taken into an uncertain political milieu (70s San Francisco) through an interlocutor whose allegiances are uncertain. Harvey Milk is the man in the three-piece suit when we meet him, and when he becomes the bearded hippie all in denim he is no less knowable by his sartorial signifiers, just as Melville's Babo is knowable because he is small and Black.
But Milk becomes unknowable, even/especially to those around him, when he sees himself as above all a political actor. This is more than his behavior being unpredictable: it is, rather, completely predictable because it is principled, just as the behavior of various establishment figures around him is predictable because it follows certain other principles--essentially, the principle of non-politicization. He is unknowable in the sense that his subjectivity seems to take on its properly philosophical predicates of disconnection from objectivity, while remaining in tune with the world of objects. His subjectivity is not hypothetically removed from the world yet secretly available through the port of culture; it is a really inaccessible Cartesian subjectivity. This is not proposed as a vindication or reformation of Descartes but as an example of where an eerie dualism seems, against all reason, to work.
Melville cultivates such an ambiance of eerie dualism in "Cereno" as the narrator and reader are taken on a tour of a fantastic representation of racial power dynamics. In "Milk" the eerie dualism comes in the form of Milk dictating memoirs not long before his death, a death which is announced at the beginning of the film. As an aside that will go unexplored, this leads on to a reading of "Bartleby"'s ghostly attributes in the political terms I have laid out here. The point I really want to make, aside from a de rigueur and empty endorsement of political "thinking," is that if "subjectivity," as philosophy has developed it, takes on this meaning through politics, then the outward signifiers of subjectivity that scientists and linguists have sought to discover in animals are a dead end. Assume instead that the political significance of animals and their lack of (scientific) "subjectivity" is the condition of subjectivity as we actually find it.
Tuesday, July 21, 2009
Capital Gains
Since moving to the Santa Cruz/San Jose area of CA we have been looking for a starter home to buy. Our thinking was that this area took a beating in the collapse of the real estate bubble; therefore, there would be many houses available at approximately reasonable prices.
In part this is true. There are many houses available at the bottom of the market that would have been priced 20-30% higher two years ago. However, they tend to fall into two categories: houses that are genuinely good deals, and houses that are crumbling and filthy. In the former case, investors buy these houses on terms I cannot compete with--of the four(ish) houses we have been seriously interested in, our efforts were stymied by people buying with cash. In the latter category, the houses need a general contractor to take them on, invest in them, and resell them at a profit to move them. I don't begrudge the GCs for profiting in this way, but it means that the current low price tag is illusory.
This is only anecdotal evidence, but it is exactly the cycle one would predict from capitalism's ability to re-entrench itself through periodic failures. People buying 200-300k houses with cash in the first week on the market are not small families; they are speculators. Moreover, while these bank-owned properties are listed in the 200-300k range, they usually sell for 50-100k more. The market is adjusting itself as liberal economics would predict. The end result is that prices will eventually return to (around) their cyclical median, except that more material capital will be in the hands of those who are already the wealthiest, and more of the least wealthy will be reduced to selling their wage labor for diminishing returns.
I can't complain: my position is better than many, many others'.
Saturday, July 18, 2009
Law of Ruins
Graham Harman has posted an interesting excerpt on Roman statues. This reminds me of Albert Speer's law of ruins for Third Reich architecture. From White Noise (257-8):
he knew that Hitler would be in favor of anything that might astonish posterity. He did a drawing of a Reich structure that was to be built of special materials, allowing it to crumble romantically--a drawing of fallen walls, half columns furled in wisteria. The ruin is built into the creation, I said, which shows a certain nostalgia behind the power principle, or a tendency to organize the longings of future generations.
It is also grounds for extending being-toward-death to the architecture of (at least) a certain time and place.
the one who reads
I was going to write a somewhat lengthy review of two books I read recently on affect, but I became distracted by the tangential question of why I would do that. The main reason is that I often notice book reviews from Scu and Craig in the blogroll and pop over to see whether those books could be instrumental for my immediate purposes, might someday swell a bibliography, or should be avoided. It is also comforting to me that someone out there is reading all these books, which I am sure another someone put a lot of time and thought and emotion into composing. This is a nice fiction because it fosters the possibility that someone is also reading me and my writing. In a Kantian (or Nietzschean) way, I try to imagine that no one is reading this but I do it because this is what I would will to be doing (and in fact it is what I am doing!). But still, I am not insensitive to the fantasy of the one who reads, and I think it is evidence of this that I take the time to write in a public forum (a blog) where I might facilitate this fantasy for others.
Why the one who reads? I am positing this figure as a metamorphosis of a couple of Lacanian figures, especially "the one who knows." More than having an analyst know what is wrong with me (or the world), I simply want to be seen by the analyst-figure. And while having a "reader" out there, someone who sees into the text and draws out its essence, is close to the internalized sur/sous-veillance of the father, I again think the fantasy of the one who reads shows a milder, smaller claim of desire. Rather than modern oversight that is invasive, forceful, and costly in terms of labor-time, the fantasy of the one who reads is a postmodern (or something) fantasy of glancing knowledge. Rather than fantasizing about the drastic relocation and examination of the Clinic, this desire is just "to be seen," to be briefly, casually admitted into the doctor's space, have him strike a few glancing sparks of illumination from my personal surface, and be returned to circulation. (To borrow further from the discourse of the pomo, we could say the modern power systems exerted force on deviant persons as if they were master narratives needing to be ripped to shreds; in contrast, once everyone must be regulated in direct relation to biopolitics, the amount of attention given to individuals must be more cursory.) We don't have time to treat every philosopher, critic, theorist, artist as if they were important (meaning, brutally dissecting their arguments, wrangling with every ambiguity)--most of us don't even have the time to keep up with all the books we would like to read, let alone all the books we should be reading.
This isn't meant as a criticism of those who provide book reviews. I sincerely value the opinions of those critics (and bloggers) I read and take their evaluations to heart, and hope to contribute something back to that sphere of information. Even if no one reads it.
Friday, July 17, 2009
"Defiance"
I just watched "Defiance," the World War II pic about a group of Jewish refugees who hide and restart their lives in the forests of, I think, Poland (or Russia? I missed some context). The point being that it is a story in which Jews are agents of resistance and self-formation rather than the people of endless passive suffering. (The humor in the film, limited given its subject matter, comes from the irrepressible love of intellectual bickering that characterizes robust Jewish society).
What I found interesting is an implicit counterpoint to the either/or of Schmittian friend/foe politics. As an attacked, minoritarian, under-supplied, non-state people, this group exists in a tripartite politics: friend, foe, and non-combatant. Of course, part of the dramatic and philosophical tension of the film lies in the question of whether there can be non-coms; but from an extremely pragmatic point of view for a dispossessed group, the category non-combatant is expeditious for avoiding unnecessary and costly battles.
This also puts what counts as a "friend" in a different light. For Schmitt, a friend might be a non-com who permitted state violence against a third party. For the politics of the dispossessed, a friend is one who provides material support: an active rather than passive position.
If such a political paradigm belongs properly to those politically dispossessed, it can at least be a resource for condemning the either/or grandstanding of American foreign policy. We are short on friends and foes right now; the sort of varying relations that can be created with non-combatants is our biggest concern.
Mythical races
Today I was screwing around on Facebook and took a "what kind of dog are you?" quiz. I was pretty disappointed to get Welsh Corgi, but then my wife, trying to best me, got the same. Whatever breed I would have picked for her, my first pick would have been "not the same as me," because our attitudes appear to both of us to be pretty dissimilar. Apparently not.
Unrelatedly, if that is a legitimate adverb, I was thinking about Lord of the Rings while doing the dishes and what nonhuman race I would be. And then I was struck by how the idea of "race" is so powerfully represented as a real category in LotR. Different "races" share a cognitive spectrum, like all humans do, but have exaggeratedly different phenotypes and, probably because Middle Earth has pre-modern technologies, are essentially monocultures. (I don't know if elves, humans, dwarves, hobbits, etc. can interbreed--I am applying my D&D knowledge to surmise it is possible but for sociohistorical reasons infrequent).
Now, asking what breed of dog I am most like is a fairly innocent question. True, it reifies "breeds" that are historically constructed and which contain individuals of widely varying personalities and temperaments, but since there is almost certainly going to be a gap between the subject's self-image and his/her projected breed, I think the experiment tends to challenge those borders as much as it uses them.
Asking what mythic race one would be strikes me as a little less innocent. Not only do I think race is a more vicious myth than breed, I would go so far as to say that racialist discourse has done more harm to animals than breed. Middle Earth presents race as it appears in the racialist imaginary. But the question might then be a way to move around racialist thinking: to imagine oneself in formally different positions vis-a-vis anthronormativity without the sticky history of stereotypes bogging down the imagination. I don't know. If anyone reading this is nerdy enough to talk Tolkien racialism I'd love some feedback.
Wednesday, July 15, 2009
Animals as socializing capital
Therapy animals, usually dogs, are one of the great newly discovered resources of the affective economy. I see good and bad in this. The good news is that animals are being more frequently encountered within human social space, normalized as the kind of entity that "belongs" there and can make claims about that space. The bad, or the problematic, is how this happens. I am not even really thinking here of the imposition of class categories and hierarchies onto dogs--the "good" golden retrievers and labs, the "bad" rotts and pits--or between dogs and other species. I am thinking, right now, about how scientific studies promulgate a certain image of therapy animals that accomplishes their introduction into social space but in a way that prevents them from making claims against it.
For example, a piece by CBS on how children below level in reading catch up better by reading to dogs. On one hand we have descriptions like this:
Ross, an Irish setter owned by Barbara Murgo, sits quietly and patiently as kids read to him.
And Ross seems to look forward to it.
I do not think it is incorrect to ascribe patience, self-control, and anticipation to Ross, and I know my dogs love hanging out with kids. But what if he didn't like it, or didn't feel like it one day? Is that an interpretive option?
The thinking is that the kids are not threatened by the dog and so will read to their ability without fear of judgment. The problem for these kids seems to be not so much a reading deficit as a power relation. They don't want to be objects of evaluation; when they feel like that, they underperform to escape attention; the adults monitoring them induce this feeling.
What the kids seem to get out of the experience is an escape from the limitations of power relations.
One youngster told Turner, "He sits and, when I show him the book, he looks at it."
That's not just adorable, that's transaction. It might be a fiction of the kid's making that they are both interested in the book, but it is at least a fiction of a shared world--and I would say, a fiction closer to the truth of the dog than that proposed by the adults.
The way the adults see this seems to hearken back to the structural issues that discouraged kids from reading in the first place. The dogs/kids are reinserted under surveillance, telos, accountability. The space in which they can interact is limited and, after each session, dis-integrated. The dog is only allowed to enter human space in its most passive mode. The positive contributions, the multiplication of narratives--and there are tons of stories to be had with dogs--are "screened" out. "They just have to be willing to lie so still for so long," concludes the writer with a sympathetic wink. I do not doubt that these dogs are largely willing to do something boring for the benefit of others, but it seems cruel--to them and the kids--and stupid to limit their social representation to that.
Monday, July 13, 2009
Criticism of capitalism in "Confessions of a Shopaholic" and "The International"
I guess it has become manifest that this is largely a film criticism blog from the perspective of someone who views the world in terms of animals and anti-capitalism. It is probably equally apparent that I watch crappy mainstream movies with relish.
Recently I saw "Confessions of a Shopaholic" and "The International." For those who miss commercials, the former is about a young woman in New York who buys too much fashion stuff, and the latter is about a cop trying to bring an international bank to justice for murdering people who interfere with its aspiration to control the production of war debt. (Baudrillard's essay on debt that I referred to in my last post is equally apropos here.)
In some ways these movies are highly critical of capitalist processes. "Confessions" is about the debt cycle and overconsumption at the personal level, "The International" at the political or transnational level. In both, these processes are destructive and promote harmful behavior toward others. Predictably, though, the forms of closure available to these texts as narratives are inscribed or prepared within the capitalist order. Both end, essentially, with an affirmation of the individual as that which can step away from and resist the systemic. This is the prima facie argument to be made against these films.
However, I think this kind of reading gives too much over to narrative structure--it basically agrees that beginning, middle, and end are where the (capitalist) plot says they are. If we imagine these films in an "eternal return of the same" scenario, or as repeating in the sense of Difference and Repetition (which spends no small effort defining "repetition" as a technical term, away from simply "doing it again"), the loci called beginning, middle, and end are open to redistribution. The "middle" is where the critique comes out in these films, and the task of critical viewership is to relocate this to an end (with intentional reference to the philosophical "end," or purposively grounded state).
In "Confessions" this comes when the main character communes with animate mannequins who congratulate her on her new found discipline not to shop. Once one has broken from the capitalist imaginary, and specifically its ontological divisions, all kinds of things are able to speak and celebrate. There are more "subjects," not more divisions in kind between subjects. In "The International" this comes when an old cynical bank exec, formerly a Party hardliner with the Stasi, tells Clive Owen's character that there can be no justice within a fundamentally unjust system. Justice against persons is not politically relevant if it does not challenge the systemic nature of what elicits powerful criminals. That's what I say! If one stopped the film here, the climax would be a powerful, expansive, and completely accurate (from my standpoint) statement on what does and does not constitute politics. How could there be a better climax?
The only way out is for the film to interpolate itself into the space where the (critical) viewer should reassume his/her position in the world. The film takes on the role of action that, if arrested, it might have inspired in the viewer.