Excuse Me?

Looking over the much-blogged-about list of the “ten most dangerous books” compiled by a panel of noted conservatives, I have to say that my ability to imagine or appreciate what other people are thinking deserts me a bit.

I readily understand the intellectual and political histories that put Marx, Friedan, Hitler, and Mao on the list, though putting Friedan there kind of seems like a Sesame Street exercise: “one of these things is not like the others…”. But the underlying logic of selection seems so variable. Some books seem to have been deemed dangerous because they inspired murderous or destructive social action. I suppose that’s why Dewey is there as well, though the proportionality problem seems even worse in that case. Other books, though, seem to be there simply because they were wrong or factually flawed but widely viewed by many as credible or accurate in the context of ongoing social or cultural debates–Mead or Kinsey or Ehrlich. I’m all for being harsh in retrospect about the credulous readings of such works by their devotees, but “harmful”? Harmful like Mein Kampf harmful?

Then there are the weird choices: can somebody direct me to the intellectual debates or writings that would explain why Comte is on the list? High up on the list of the top ten. I think I missed a chapter in the intellectual history of American conservatism that would help explain why he’s a bugbear of such note.

As long as we’re at it, can somebody help me understand why John Stuart Mill’s On Liberty almost cracked the top ten?

And they picked the wrong Foucault if you really want to take a crank’s-eye view of him as “harmful” in this vein. Kind of feels like somebody said, “We need something by that French guy Foucault” and they picked the first title they came across.

Posted in Books, Politics | 17 Comments

Book Notes: Jon Wiener, Historians in Trouble

I’m going to return to my aspiration to make regular entries about books I’m reading, partly to help me to work my way through the growing stack of things I’ve put on my “to be read” pile. These aren’t reviews: they’re just questions and reactions of varying length about particular books I’ve read.

With Historians in Trouble, one of the things that struck me is that most of the trouble Wiener describes doesn’t seem to me to have much to do with the discipline of history per se. For example, Elizabeth Fox-Genovese’s troubles, discussed in Chapter One, strike me as being highly generic to academia as a whole, possibly even to professional life more broadly. The allegation is that she abused her subordinates and had them perform duties that they should not have had to perform, and that she treated graduate students as subordinates. It’s not quite dog bites man, but I don’t think it’s altogether that uncommon or unusual a story in academia, either, more’s the pity. But the fact that Fox-Genovese is a historian doesn’t seem to me to have anything to do with it. The same goes for the case of Stephan Thernstrom.

Some of the other cases he deals with (Michael Bellesiles, David Abraham, Edward Pearson) definitely do involve the specific craft of historical scholarship. This is one of the basic problems I had with the overall book: Wiener understands the major difference between these controversies in right-left terms, and in terms of the degree of public spectacle they involved, but there’s a real apples-and-oranges problem at the heart of the book. Before we get to questions of political polarization, there’s just a basic difference between what Fox-Genovese was alleged to have done and what Bellesiles was alleged to have done.

Fox-Genovese’s trespasses were about professional and social relations within academic institutions, about power; Bellesiles’ trespasses were about craftwork, about professional standards for publication. It seems to me obvious that these two kinds of accusations are going to play out differently both because they involve different kinds of evidence (to discuss Fox-Genovese requires talking about testimonial evidence; to discuss Bellesiles involves talking about archives and documentary evidence) and because the harms involved are understood differently.

In the end, there’s probably more professional consensus among historians, regardless of political leanings, that what Bellesiles did is unacceptable. (I think Wiener strains mightily here to make it sound like historians on the left are largely sympathetic to Bellesiles and beleaguered by right-wing critics outside the academy). That’s because the nature of the evidence and the nature of the trespass are things which permit greater consensus. Fox-Genovese’s case is uncomfortable in different ways, perhaps not least because powerful senior scholars on both the right and the left who teach in research universities have probably committed some of the same sins over time: to critique what she did almost requires signing on to a larger critique of power relations within academic life that cannot be easily contained to the usual identity-politics terms that many on the left would like to remain within.

Wiener’s right that the right-left divide within the public sphere weighs heavily on a lot of the cases he cites. Certainly there is to me and others a nasty double standard in the differing public readings of Bellesiles and John Lott, who seem to me to have engaged in somewhat similar behavior. But in a lot of cases, it seems to me Wiener is not very interested in the historical roots and contemporary practice of the professional ethos of historians in specific or academics in general. If you want to understand the cases of Stephen Ambrose or Doris Kearns Goodwin, for example, I don’t think it’s very relevant to talk about right and left (Wiener agrees with this), but neither are the usual discourses about plagiarism much help. The problem with Ambrose and Goodwin is the problem with an entire model of productivity in research universities, where famous senior scholars produce books with the paid help of a number of research assistants. This model is less common among historians than it is among legal scholars or in other fields, and appropriately so: the writing of scholarly history, it seems to me, requires that the individual author not only craft all the prose but also know the archives, the sources, the architecture of evidence. I don’t mean that historians should never use assistants, but they clearly should do so in very circumscribed ways. That’s one story I think has to come into focus in these cases. Another that Wiener is more alert to is the highly institutionalized manner that all public figures in contemporary American life use to absolve themselves of wrong-doing: confess mistakes but never intent to do wrong, retreat temporarily from public life, accept ceremonial punishment. Those techniques are not available to those who are not already public figures, who do not already have some kind of power–but they’re also not available to those who do not possess some marketable talent. Goodwin was back on the air pretty quickly because she is articulate and affable, not just because she is powerful and has many powerful friends.

I’m just not convinced that these cases all belong together, and if they do, I don’t think their association tells us much about the public sphere or right-left antagonism: the only threads weaving in and out here seem to me to be more subtle kinds of collisions between the ethos of academic professionalism and the marketplace of public culture, about the fading gentility of scholarly life, the ferocity of careerist professionalism, the commodification of academic reputation. I can’t say that I found Wiener particularly attentive to those questions.

Posted in Academia, Books | 2 Comments

Graduation

Yet another graduation yesterday. A very nice one, actually: beautiful weather and some unusually good speeches. Jonathan Franzen was brief but both funny and slyly profound. Our president also gave an unusually pointed (for him) speech about the dangers of “moral absolutism”.

I can’t be the only faculty person who finds graduation both moving and depressing all at once. I actually like going every year, and hate to miss it. It’s often very enjoyable to meet the parents of students you’ve taught. But it’s also a strange sensation, as you hear about what the new graduates are planning on doing, and often also hear about what older graduates are doing now (as many of them tend to show up at this time of year). It’s a bit like the vertigo you get when you’re sitting still but watching visual images of rapid movement, like at an IMAX theater. In the end, at a place this size, you don’t really teach that many people from the beginning of your career to the end. I’d estimate that in each graduating class here, there are about fifty students that I know moderately well, and perhaps ten of those I’d hear from in the future about what they’re doing and have some sense of engagement with their future. That’s really not too many people. If teaching well requires some sense of “outcomes”, then it’s a pretty small basis for evaluating that.

On the other side of things, though, I’m still very clear about the positive outcome of a number of my undergraduate courses and the teaching that went into them. I still think back often to a course on the methodological problems involved in studying the history of “subject peoples”, taught by Ann Wightman. (The particularly curious thing about that course is that there were a number of other future Africanist historians in it that year, and I think we all remember it as a particularly formative experience.) There are courses that had a huge impact on me that don’t feed as directly into my professional life, even a few from high school, especially a summer session I did on the East Coast.

But that sense that lives are flowing around you and past you, that you’re a rock in the middle of a stream, slowly getting worn down by the passing of years, is both pleasing and melancholic. Graduation somehow crystallizes that sense for me every year.

Posted in Academia | 1 Comment

Pedantry and Critical Thought

John Bruce has a long series of posts detailing his critique of academic quality control. His main complaint, as I understand it, is that in some disciplines (perhaps most or all: he’s focused on English but there’s no restriction of his critique to English) he believes that a significant number of professors are factually ignorant about their subjects of expertise. This, he feels, degrades the value of undergraduate education. I’ve criticized him for not detailing exactly what the harm suffered is, but to be fair, he’s had a lot to say about that in the past: he views many students trained by elite universities as shamefully ignorant in factual terms and therefore crippled in their ability to do good work after graduation.

In this series of posts, he attributes the source of the problem to academic hiring practices, and cites a number of other bloggers critical of academia (including yours truly) to buttress his characterizations of academic hiring. I’ve made some critical comments over at his blog, but I’ll echo and amplify a few of them here.

The general problem, as I see it, is that John Bruce is building his own Frankenstein monster out of various anecdotal spare parts which do not really fit together very well. The lowest-common-denominator critique of academic hiring, tenuring, graduate training and so on that you can find at my blog, at Invisible Adjunct’s defunct blog, at Erin O’Connor’s site, and many other blogs and publications, is that academia is a closed shop in various problematic ways. That it has an overly narrow or parochial set of selection criteria, that it punishes idiosyncrasy or originality, that it allows people to misuse confidentiality to promote their own views or practices at the expense of healthy intellectual diversity. That academia is too much of a monoculture.

John Bruce’s representations of academic hiring are fairly accurate. Typically a department is granted a “line”. The members of the department craft a description of the ideal candidate: usually this describes a required field of expertise and suggests some desired secondary fields. The college administration often approves the language. The job is advertised. Applications come in. In most cases, an application consists of the candidate’s c.v. and reference letters. Some candidates may also send in a publication and a sample syllabus. Usually a department of more than three people will designate a committee to make a first pass through the applications with the intent of eliminating those that are totally unsuitable and flagging those that are especially interesting or promising.

Some of the applications that are more or less eliminated at this stage, honestly, anybody would eliminate. Really. Academia’s no different from any other employer in that respect. Anybody who has done any kind of job search in any profession knows that you get some applications from people who are totally unsuited to the position by any standard. Kind of like the first round of American Idol. What if the committee were unprofessional and tried to eliminate honestly competitive applications? The norm in most cases is to keep all the files together, even those deemed uncompetitive. People can and do check up on their colleagues to be sure everything is going fairly.

It’s possible to bias the selection process more subtly, and people do. But I don’t know that academia is all that different from any other workplace in that regard. John Bruce thinks industrial workplaces are different because market mechanisms are self-correcting. Maybe in the grand scheme of things, but there are plenty of capitalist workplaces that look more like “The Office” than an efficient entrepreneurial powerhouse, where cronyism is the rule of the day. (And of course, the same remedy is available with universities that it is with businesses: if they’re dysfunctional enough in your judgement, don’t give them your money. Shop around for quality.)

Still, I’d agree that academic hiring does tend to reinforce some of the worst aspects of scholarly culture: its insularity, its timidity, its drift towards safe mediocrity, its monoculture. Plurality of viewpoints and methodologies, unconventional ideas, and so on, do tend to be punished, often without any obvious over-the-top manipulation or unprofessionalism–more by insinuation, or by the simple fact that group decision processes tend to revert to a mean. The incentive structure of academic life, in hiring and elsewhere, is often completely screwed-up.

That’s the lowest-common-denominator consensus, and John Bruce appears to be drawing on that to forge his own complaint. The problem is that the main thrust of his accusation is the one thing that virtually no one else complains about. Left, right or none of the above, the one thing that almost no one says about academics is that they’re insufficiently erudite or knowledgeable. In fact, I’d say that’s the one thing academic hiring processes tend to be brutally efficient about detecting, for the most part. I was involved in a search some years ago, not in my own department, where one of the candidates turned out in my judgement and the judgement of many others to simply not know some basic factual information, the kind John Bruce is concerned about. That was an instant disqualifier, and I think it would be in many departments. The same goes for other kinds of vetting processes, by and large. Whatever is wrong with academia, I don’t think it’s that most academics don’t know their fields of expertise reasonably well.

The reforms that John Bruce suggests, particularly a national comprehensive multiple choice exam administered to all Ph.D. candidates, seem to me to kill the patient outright (or, to return to my earlier metaphor, to put a defective brain inside the Frankenstein monster). What is it that parents have a right to expect in return for their money? What do students have a right to expect? What are the outcomes desired, and what skills do professors need to have to deliver those desired outcomes? Bruce’s suggestions would change the incentive structure of graduate education and academic hiring in major ways. The consequence of those changes, in my view, would be to replace what is now understood as scholarship with something that looks more like pedantry. To make professors people with tremendous amounts of rote learning and no ability to think. Or to teach and communicate. If anything, the more common anecdotal complaint heard outside of academia is not that professors don’t know their material, but that they have no ability to teach it (or no interest in doing so).

My father used to tell me about the professors who made a difference to him as a first-generation college student at a Catholic university. The golden thread that connected them, in his view, was not what they knew but how they thought. He didn’t come away from their classes with a sack full of discrete facts–one of the classes that mattered most to him was a course that he literally made no use of later in life in terms of the subject matter. He came away having encountered a person who could think clearly, think critically, think skeptically, and who could show his students how to do the same. My father learned persuasion. He learned communication. He learned what knowledge was, and how to come to know new things when it was necessary or even pleasurable to do so. Of course my father knew also that thinking well involved knowing things, being clear about the facts, and saying no more or no less than what the facts allowed you to say. He learned that also from these few professors he valued.

The ones he didn’t value were the pedants: the people who knew a great deal factually but who had no idea why they ought to know it. Who could say nothing about why their knowledge mattered, and could only demand that their students repeat the same rote processes that the professor himself had undergone to acquire knowledge.

I share my father’s views. You have to know your shit but knowing your shit isn’t the thing that makes you a valuable professor. It’s the easiest part of this career. It isn’t the thing that a well-designed process of hiring and retention needs to be looking for first and foremost, the thing that separates the excellent from the adequate. It isn’t the primary good that higher education ought to aspire to deliver to students. It comes with the territory, but a fetishistic emphasis on factual information to the exclusion of critical thought, persuasive communication, and the ability to explain why knowledge matters, is a surefire way to reduce the value of higher education still further.

Posted in Academia | 12 Comments

Welcoming New Arrivals to the Sekrit Clubhouse

Some time ago, a friend and colleague of mine used to call me a lot just to talk about our field, my friend’s job situation, and other stuff. My friend finally got in a good job situation and we talked less often. I think partially this is because I got busy and I’m kind of bad about keeping up with people in general, which is a continual source of anxiety-causing regret for me. I’m not kidding about the easily distracted thing.

But it was also because my friend always wanted to diss everyone else in our field and diss most of the books and articles written by people in our field, in generally pretty personalized terms. I kind of played along with that while my friend was still in job limbo but I got tired of it once my friend was in a more secure situation.

It’s an odd thing. I can be pretty harsh about academia as an institution overall. I can be pretty critical about some of the problems I see in my own fields of speciality. I can certainly dish out abuse on some books: one of the first professional book reviews I wrote pretty thoroughly brutalized a work that I thought then and still think was pretty awful. I was certainly over-the-top mean in my assessment of Ward Churchill’s scholarship.

Mostly, though, I’d rather not. Even if I’m going to be critical of a particular scholar, a particular book, a particular field, I’d still always like to think sympathetically about why that scholar does what he or she does, why a particular book has the problems that it does, why some field of study or discipline has a blindspot or two. I mean, it’s hard to write a scholarly monograph or complete a serious research project, even when the results are not hugely significant. Some of the harsher critics of academia out there are acting like most of what academics produce is just totally worthless. Really, that’s too harsh. It might be fair to say most of it is sort of middling or mediocre, but that’s not an invitation to act like particular scholarly individuals stole your lunch money. The lack of proportionality in some criticisms–including those of the friend to whom I don’t speak that much anymore–depresses me.

I don’t think that sympathy came naturally to me: it’s more like a professional commitment I picked up along the way. I learned it partially from my graduate advisor, for whom it is a profound way of life, especially in his pedagogy. As far as he’s concerned, almost every work of scholarship has something in it that is useful or smart, and when he criticizes, he has a great knack for trying to criticize something in its own terms rather than break it on the wheel of his personal tastes and make it be more like himself. I’ve never seen him say, as many of us do, “You should have written the book I would have written”.

All of this is a prelude to explaining why a recent Crooked Timber thread about complexity and social networks made me feel really uncomfortable. Some really intense things got said, especially in the comments, that seem to me disproportionate to the subject matter. The substance of the discussion is that physicists are moving into the area of social networks with a relative lack of knowledge of work done by social scientists on this topic. That seems a fair observation. If you want the really detailed, insanely smart version of that observation, read Cosma Shalizi’s essay on the subject. (As long as I’m mentioning it, anybody who thinks scholarly blogs aren’t particularly scholarly should read that essay: to me, the scholarly goods delivered in that entry outweigh any three or four average journal articles.)

Part of the reason it’s fair is not just that physicists are moving into this area of study but that they’re doing so without a kind of performative (and substantive) humility. Ok, fine. But at least some of the people bothered by this movement at Crooked Timber, both contributors and commentators, are so hostile to the mere fact of movement that I’m not sure sufficient humility would do anything to ease the passage of new players into work on social networks.

It makes me feel uneasy because I’ve tentatively been poking at these subjects (complexity, social networks, emergence) with a long stick and an acute sense of my own limitations. Reading some of that CT thread, particularly Daniel Davies’ comments, it almost seems to me that anybody who wants to say anything about these subjects, or apply them to work of their own, has to emerge like Aphrodite from the waves, a dazzlingly erudite, full-blown sociological researcher whose command of anything and everything within the prescribed canon stretches all the way down to the roots and all the way up to the tops of the scholarly tree.

Subjects, theories and methodologies may be old hat in one area of inquiry and excitingly new in another. Africanists have been thinking about problems of epistemology in history for two generations that other areas of humanistic thought are only just starting to consider, but I find it quite annoying when Africanists get upset about that fact. Sure, they could cite us more often, but I think it’s better to be pleased at the company than jealous about the table scraps. I wrote about commodification and consumerism in my own narrow field before almost anyone else did; now there are many new studies, and not that many of them cite me. I notice that, mind you: we all do. But it seems to me that it would be churlish to complain about it. In all honesty, no fake humility here, I’m very pleased to see that shift happen. Even in purely egotistical terms, it feels good to see it happen: it means I got lucky and saw it coming ahead of time.

Maybe the ability to be relatively relaxed about such movements is a function of career security. A mid-career, tenured scholar with little driving ambition to move up the perceived status ladder can afford to just be playful and generous. Those who have a more serious need to accumulate reputation capital may need to hoard it more aggressively.

It seems to me that we learn best about fields which are new to us by trying to practice them, not by going into monkish seclusion and acquiring a comprehensive command of a full canon before we even dip our toes into a new hot tub. Complexity and emergence are old hat to some people, but they’re new to me, and judging from the reaction when I talk about them to colleagues in my own discipline, new to most historians. It’s best when we first try to practice in a new area that we acknowledge our limitations, I agree. It is annoying when someone tackles a topic that has been deeply explored by a generation of scholars as if those scholars never existed. Stephen Wolfram got smacked for just such a sin, and appropriately so, whatever the virtues of his work in other respects might be. That humility isn’t just a performance: it’s a recognition of the collective, cumulative nature of scholarship. It needs to be met by a generosity on the other side of things, however. Part of how we learn by doing is the gentle emendations of others inside and outside our own disciplines once our work circulates or is published. Guiding people to well-formed bodies of knowledge is one thing; punishing them for having failed to find those bodies of knowledge, for failing to be us, is another. The one seems to me the best possibility of a scholarly community; the other the worst realization of it.

Posted in Academia | 2 Comments

Nerd Hermeneutics, or Do Not Make Out My Ticket for Middle-Earth

Ok, one last Star Wars post (just one tiny wafer, sir…) and then I’ll return to more serious issues. (Sorry to those of you who read this blog looking for something other than a cadet branch of Star Wars geekery…)

In the comments on my last post, Dan talks about “nerd hermeneutics”, about the ways in which a fiction like Star Wars invites a particular kind of reader to make it into more than it is, to fill in its gaps, invent coherencies, see themes that are only barely there. I think the key attribute “in the text” that invites this labor is that Star Wars, like Middle-Earth, self-presents as a “total world” text, whose referentiality largely turns inward.

While nerd hermeneutics can be serious, even pompous, as it goes about its business, it’s ultimately a form of interpretative gymnastics, a system-inventing game. The skills and passion it requires are also not really that different from non-nerd hermeneutics: the difference is less in the intellectual substance of the work than in the sociology and importance of the two practices.

The biggest mistake that some non-nerd hermeneuts make in looking on with curled lip is to assume that the work of nerd hermeneutics is about wish fulfillment, about fashioning universes in which we would prefer to live. There’s some of that going on, to be sure, and I mentioned it in an earlier post. Jedis, wizards, nobility, superheroes are attractive figures to adolescent geeks who imagine themselves as possessing inner talents and merits that are scorned or marginalized in the wider culture. I don’t know if I’ve mentioned the one issue of Scott McCloud’s astonishing comic Zot in this context before, but during his “Earth Stories” arc, in issue #31, the story of the character Ronnie is so eerily on target that it unnerved me and evidently others: McCloud comments that it has led some readers to accuse him of “spying on them”. Ronnie’s an alienated and very serious, almost humorless, geek who dreams of writing comics. He’s had a day full of teenage melodrama and at the end of it he slides into a chair and imagines a sort of Cyclops-and-Phoenix against the whole world apocalypse scene while listening to bombastic symphonic music. The imaginary world Ronnie slips into magnifies and ennobles his adolescent angst.

But a substantial amount of nerd hermeneutics is not about wish fulfillment, quite the contrary. Like David Brin, I find that the more hermeneutical work I do on Star Wars, the less I want to live in its universe at any time. In the end, I don’t like the Jedi at all, much less wish I were one. The way I “read” them by filling in the interpretative cracks (sometimes yawning gaps) in Lucas’ vision? An ascetic order of militant monks, celibates, clueless about the real world, smug, hierarchical (and worse yet, hereditary), ripping young children away from their families. If being Jedi were something that anyone with sufficient will, desire, and training could be, that would be one thing, but instead it seems to be something like being a mutant: you have to be born with the right supply of midichlorians.

The same certainly goes for Middle-Earth. In Middle-Earth, being morally right is largely about living within the contours of the role that the gods have decreed for you, accepting your place in life, bearing the burdens designated for you. It’s a marvelous “total world” to do hermeneutical work on, and a rousing story, but I would find it a horrible place to actually live.

This is the thing about total-world fictions: the pleasure of the text lies in filling out the world in its own terms. I would be profoundly offended if the upcoming film version of C.S. Lewis’s Narnia books soft-pedaled or discarded the overt Christian content of those books. Not because I strongly identify with that content particularly (though The Last Battle is the most appealing presentation of death and apocalypse I’ve ever seen from a Christian thinker), but because it’s part of the total-world rules that Lewis set out. It’s the necessary foundation of his fiction, and from that foundation, a nerd hermeneutician could ask many interesting things. Say, for example, how the Calormenes got into that world, and if they’re Sons of Adam and Daughters of Eve, why they weren’t equally threatening to the White Witch. Or for that matter, where all the nasty evil creatures in the Witch’s army come from, given that she’s the only evil creature in Narnia at the Creation. And so on.

That I would defend those foundations as integral to the fictions built upon them has nothing to do with my imaginative desires, with envisioning a world in which I would like to live. I’m hard-pressed to think of a total-world fiction that I like in which I would actually prefer to live, in fact. Superhero comics? No way: entire cities, families, neighborhoods get wasted by psychotics, aliens, disasters on a regular basis, and human institutions are more or less beholden to an unelected elite of superpowered people. Hierarchical medieval or science-fantasy worlds? No, for a lot of reasons. I’m actually hard-pressed to think of a total-world fiction where the protagonists are fully human characters and the abiding themes are about the achievement of self, the exploration and achievement of individual freedom, the reform or transformation of the world, the building of better societies, the cultivation of pleasure, the sacrifice of oneself for the good of others, or any of the other things that mobilize secular modern humans for good (and ill). Earthsea, maybe.

Posted in Popular Culture | 15 Comments

Spies Like Us

I’m enjoying the new group blog Savage Minds. I particularly admire that they’ve been able to keep the blog fairly focused on their own discipline, something that I think we historians at Cliopatria struggle with from time to time.

Oneman’s entry on anthropology and counter-insurgency was especially interesting to me. Most historians of modern Africa do work that is extremely close to anthropology in methodological terms, involving many of the same ethical questions, though (thankfully) we’re one step removed from the internal turmoil that such issues raise within anthropology. (The AAA is a much more interesting annual meeting than the AHA because there’s always a much bigger chance of a major intellectual fist-fight breaking out at some point during the conference.)

The problem is that I think Oneman’s reservations (and some of the other comments in the thread that follows) about whether anthropologists should assist the U.S. military in various ways are shot through with a lot of the contradictions that trail behind ethnographic work in a postcolonial world. Some of these contradictions are acknowledged and some aren’t. Some of my own reflections are as follows:

1) Oneman mentions that scholars doing ethnographic research in much of the developing world are already perceived as spies or employees of governmental or international agencies by most of the people they speak with. Most Africanists I know, including myself, would confirm that observation. People I have spoken with often assume that I represent a governmental agency: I’ve been asked to forcibly direct the police to return a wayward son to his mother, to carry pleas to particular governmental ministers or international agencies, to arrange loans, to carry messages to the U.S. government.

This has interesting implications, however. Oneman notes that it’s better to be able to say honestly that you’re not a spy when someone asks if you are. I suppose, but it’s not that easy. Many people you meet think of you as a spy in an extremely general and vague way, and as far as they’re concerned, whether you literally work for a specific American intelligence or military agency is neither here nor there.

I actually think they’re right: functionally the ethnographer is a spy, even if they’re just an anthropologist from Harvard looking to finish a monograph. They’re a stranger who watches, records, asks questions, produces knowledge. Spy melodramas aside, that’s what a decent amount of “humint” produced for various governmental agencies also is: observation, recording, questioning, knowledge production.

If you’re talking to an informant who specifically wants to know whether you’re CIA or not, and has a specific idea of what that means, you’re possibly already in real danger. Moreover, no informant to whom that specific question matters is going to believe anything an American or European researcher says anyway about whether or not they’re supplying information to American military interests.

2) In part, this is because scholarly ethnography is intended to be public knowledge, something that anyone can make use of in any way that they like, constrained only by the responsibilities and burdens of the public sphere. No matter what you do, the people you work with are going to blend their understanding of your actual purposes with their perceptions of American, international and local interventions into their lives, so this is really the more important place where your own understanding of your work might be in tension with giving dedicated assistance to American policymakers of various kinds.

But if you do your job well, and provide a responsible ethnographic account of a particular place or people or theme, what’s to prevent American military planners or intelligence agencies from making use of it to serve their own operations and interests better? Nothing, if you take seriously the obligation to produce publicly disseminated knowledge. So this is a good reason not to do private contractual work for military intelligence, because that would remove your research from public circulation. But the very fact that we produce public knowledge means that anyone whose primary motivation is to prevent the US (or any other institutional or governmental interest) from optimizing its own strategies would have to keep private and unpublished any information that would help them do so, thereby violating their obligations as a researcher.

Do your job right as a researcher, and there’s every reason to suppose that you are in fact helping military planners or policymakers, if they bother to pay attention to what you say. If that’s your hang-up, you’re already in the wrong business, unless you assume your work is of negligible importance as a whole.

3) Or if you assume that ethnographic research done well and responsibly will by its very nature return results of little use to military planners or policymakers not because of anything the researcher does, but because modern human beings are by their very nature intractable to military planning or most policy interventions. In which case, why not consult with military officials if they ask for a consultation? I’m not talking about being on retainer to produce classified monographs–but if I were a specialist on the Middle East and a US officer called me up and asked if I’d brief some group of military officials, well, why the hell not? If someone called me and asked me how to invade Zimbabwe, I’d tell them as an expert that it can’t be done long before I put on my Mr. Ethics cap and start complaining about why it would also be a Bad Thing for other reasons. What most of us who have done this kind of research have to say about people in general and the societies and places we study in specific is precisely the reason that many of us were skeptical about the Iraq occupation from the outset. Before I have anything to say about the United States as a moral entity or about warfare as an ethical problem, or anything particularly political, I have some things to say as a much more detached student of humanity. I have no problem saying those things, relatively dispassionately, to whichever audience might ask to hear them.

4) It rings a bit hollow to me to hear from some (Oneman doesn’t say this, I should note) that the problem with advising US military interests is that anthropology should be free of all entanglements, or free of any institution which imposes power or authority on the people and communities that anthropologists study. More than a few anthropologists consult with or work for development agencies of various kinds, both governmental and NGOs, and all such agencies exert some kind of direct governmentality over the targets of their efforts. Sometimes quite considerably so. Yes, this has led some ethnographers and similar researchers to advocate withdrawing from any direct relation to the work of development as well. But…

5) The whole conversation ultimately strikes me as demonstrating some of the general disarray afflicting anthropology’s conception of its relations with power–global, national, local. It’s not that much of a surprise that these sorts of discussions rapidly tend to lead to various and sundry self-absorbed musings about the need to shuck off anthropology’s past complicity with colonialism. What often ends up emerging from these bouts of anxious reflexivity is an unsustainably contradictory position: a feeling that anthropology must somehow do emancipatory work or pay dividends in the lives of those who are studied while also abjuring any connection to the actual circuits of power that travel through the world as it is. It sometimes sounds like a high-toned version of various wrestlings with the Prime Directive on Star Trek (small surprise: Star Trek is clearly an ethnographic imaginary at its roots). Everything becomes a minuet, a choreographing of precise gestures and manners, a politics which dare not speak its name in some publics but which can freely declare its affection among others.

Otherwise somewhere in that conversation ethnographers would have to pause and at least seriously treat the possibility that US military power might under some circumstances do the work of emancipation, or that development agencies might do the same. The possibility at least might have to be entertained that the agency and will of some of the people we study might regard the forms of power that ethnographers or academics see with the greatest disdain as the lesser of many evils. In my field of specialty, I think it’s fair to say that many scholars would first and foremost want a complete disconnect between themselves and the US military, with the US government a close second–but for many years, until the late 1990s, quite a few of them were untroubled by the proposition that some postcolonial African governments might make use of their research or even consult them directly. In a place like Zimbabwe, at least in its urban communities, the hierarchy of emancipation for many people runs in the opposite direction. The US is a remote presence; the postcolonial state and ruling party an immediate one. At the very least, some of the assumptions about the failures of US policy in Iraq would have to be moderated against a wider sense of the operations of local power upon the people of Iraq before the invasion.

Some of the assumptions that whatever the US does there always and inevitably tends to the worst are also unsustainable in this light, and that’s part of the problem: it is possible to imagine a military policy there which, even if it did not rest on a sound and well-founded understanding of Iraqi society, might be considerably more “enlightened” in a great many ways, much more ethnographically savvy. Stephen Budiansky has an article in this month’s Atlantic about the resurgence of an obscure Marine Corps interrogation manual that came out of experiences in the Pacific War, a document that in some ways can only be described as humanistic. It rejects torture and coercion and advocates practices that can really only be described as ethnographic: entering into a mutualistic conversation with prisoners, understanding them in sympathetic terms, learning the deep foundations of their thought as human beings–and the advice given makes clear that these are not cynical tricks but in fact practices which aim ultimately at a form of persuasion, to convince prisoners of a larger or different set of interests which they should possess, albeit under circumstances which are profoundly unfree in other respects. The distance between this and the most exquisitely sensitive and professionally correct ethnographic research strikes me as being wafer-thin rather than vast, and that observation should not be cause for self-flagellation but instead exploratory musing. The assumption of some that ethnographic knowledge in the context of US military policy or power always heightens the capacity for oppression seems to me to be in and of itself based on ignorance. Ignorance (or at least incuriosity) about the specific roots and causes of Abu Ghraib and things like it, and the social and cultural character of the US military and government itself.

Contemporary anthropology or ethnography can dream of a skeptical distance from all power, but if so, it best not also dream of being a handmaiden to human emancipation save through the proposition that the truth shall make you free. If that’s the service we can provide, then it’s on tap for everyone, from Donald Rumsfeld down to a peasant in Mtoko. As I’ve suggested, I think that if the US military took some of what we already knew in 2001 seriously, the entirety of the war in Iraq would have been thoroughly reconceptualized or never undertaken in the first place.

If some more direct engagement with the forms and structures of power is advocated, then it cannot be advocated without an actively declared politics, which renders moot most of the responses that Oneman offers to Montgomery McFate–it’s no longer about the nature of public knowledge or the scruples of anthropology as a professional practice, only about a politics that defines the US military as the worst of all evils. Which, if it isn’t clear already, I think is a flatly wrong-headed politics on several levels, including as a specific politics of ethnographic knowledge.

Posted in Academia, Africa | Comments Off on Spies Like Us

Rampant Geekery: Star Wars Thoughts [SPOILERS]

Inspired by Gary Farber’s interesting comparison of what is supposed to be the full script and the actual theatrical version of “Sith”, I thought I’d list some of the things that occurred to me about the film and the overall Star Wars narrative as it is left at the end of the film.

1. Much as I suggested in an earlier post, the Jedi had an internal crisis which paralleled that of the Republic. They had lost their way, even their balance, just as the Republic had become arteriosclerotic by the time of “The Phantom Menace”. Palpatine’s overall plan brilliantly capitalized on their complacency and detachment, their inability to understand the world around them. Nor do Obi-Wan and Yoda actually seem to have learned much in their exile by the time “A New Hope” rolls around. They both tell Luke he’s going to have to kill his own father and both of them seem horrified by Luke’s strong feelings for his friends. Understandably since from their perspective, Anakin’s attachment to Padme appears to be the cause of his fall. But watching “Sith”, I don’t think that’s it. Overlooking the awfulness of Lucas’ actual staging of the romance, you might argue that only Obi-Wan’s personal friendship with Anakin and Padme’s love for him keep him from falling much earlier. It’s not his attachment to people that is his weakness: it’s his narcissism, which arguably the Jedi helped to feed with all the talk about him being “The Chosen One”. If I were going to go back and rewrite the EU novels, most of which stink pretty bad anyway and could use a rewrite, I’d make the post-ROTJ story of the New Jedi Order be about a Jedi Order that rejected asceticism and understood that the Force isn’t just simply divided into a dark and a light side. I know that’s where the EU books eventually got to, but in a kind of haphazard way. The prophecy of the Chosen One appears now to be accurate, but what the Jedi don’t grasp (even as they speculate that the prophecy has been misinterpreted) is that they’re the target of the prophecy, not the Sith, that the Jedi are the ones “out of balance”.

One other note on this: I’m struck at how casually the Jedi kill their enemies when they could just disable them instead. (For example, Yoda’s decapitation of the two clone troopers on Kashyyyk.) Whatever else they are, they’re not especially reverent about life, and even less so sentience, since they have zero compunction about droids even when said droids clearly are sentient. Their reluctance to kill helpless enemies is clearly a martial code first and a nominal allegiance to some kind of justice system second, not a belief about the sanctity of life. I have no problem with this: it’s what makes the Jedi attractive in many ways, true Zen warriors–but it does mean that a certain amount of their rhetoric rings hollow.

2. Gary mentions that he’s not entirely sure why Palpatine stages his own kidnapping. Partially it appears that there’s a brewing political situation where he needs to be confirmed in his Supreme Chancellorhood once again, so this is another in a long series of Reichstag Fires for him. Clearly it’s also very much about getting Anakin to kill Dooku and hopefully removing Obi-Wan at the same time: Palpatine wants Anakin at his side by the time he brings down the curtain on the Separatists, evidently recognizing that the Separatists can’t keep it up forever and the time is fast coming when he’ll have to declare himself Emperor. I did like the brief look of panic and surprise that Dooku gives Palpatine when he tells Anakin to kill Dooku: evidently that was not in the scenario that the two Sith worked out beforehand. One wonders what Palpatine told Dooku: whether Dooku had any idea that Anakin is targeted for recruitment, whether Dooku really knows the overall plan, and so on. The “Clone Wars” cartoons help with this a bit, and the overall history of the Sith gives us plenty of room to recognize that Dooku and Palpatine are playing the usual, “Let’s work together while we try to kill each other” game.

3. I still can’t forgive “midichlorians”, but Lucas did at least try to recover the fumble of Anakin’s virgin birth with that extremely interesting conversation between Palpatine and Anakin at the Star Wars-universe version of a Cirque du Soleil performance. Not only does Palpatine hint that his plan to bring down the Jedi and take over the Republic has been in the works for a very, very long time, he gives us the briefest glimpse of his own training.

4. Other things about all three prequels now make a bit more sense. Everyone made fun of the stupidity of having a single ship control all the droids on Naboo in “Phantom Menace” but now it’s clear why Palpatine wanted it that way: he wanted an off switch so that when the time came he could shut down all the Separatist armies in one easy move.

More on the question of Palpatine’s training and other things. I saw someone suggest that the film really should have ended with Vader taking off to chase Captain Antilles and Princess Leia, to recover the plans, with an ellipsis of the years in between. Actually I like it the way the actual film does it much better, and no doubt the caretakers of the Star Wars universe do too, since it opens to them a whole series of stories for books, animated films and comic books that are set in the time in between “Sith” and “A New Hope”. I’d actually read or watch those if they’re done well, because there are some obvious and rather interesting stories that reflect on the overall narrative.

1. The formation of the Rebellion. The contours are there (and further outlined in the full script Gary read), but I like an especially delicious irony that I can see. Palpatine uses a “wag the dog” scenario with the Separatists in order to gain control of the Republic, but the emotions spawned by the war against the Separatists actually open the way for the Rebellion, and the tactics used by the Separatists also teach the Rebellion a great deal. Palpatine actually sows the seeds of his own future defeat through the very machinations that gain him power.

2. Jedi-in-hiding. The reversal of the Jedi Temple beacon means that it’s actually possible that some other Jedi survive the massacre in hiding. It’s reasonable to speculate that one of the major jobs that Darth Vader will shoulder in the first ten years or so of the Emperor’s reign is hunting down surviving Jedi. Any Jedi in hiding, including Obi-Wan, would obviously have to struggle against a complicated burden. They don’t dare demonstrate or use their powers, but they would be living in societies where compassion would almost demand that they do so. I can definitely see a story involving Obi-Wan, the Hutts and the danger of exposure…

3. Yoda and Dagobah. It’s left open how Yoda finds Dagobah and how long he’s been there when Luke arrives in ESB. I assume he chooses it because of the profusion of lifeforms. There might be a story in Yoda’s flight to the planet, particularly because the Emperor and Darth Vader know full well that Yoda is still alive and a threat. That story might also explain why there’s a place very strong in the Dark Side on Dagobah…

4. Palpatine’s backstory and his future ambitions. I like Obi-Wan’s deleted line about a plot hundreds of years old in the script Gary quotes: it’s nice to see that he at least realizes at the very end the enormity of the plan Palpatine has been carrying out. Is it his plan, or is he the inheritor of it from his own master? Moreover, once the whole plan succeeds, now what? The motivation for just having two Sith and no more is presumably gone, since that was about the need to keep hidden from the Jedi and avoid division on a grand scale. You’ve just subverted the Republic and destroyed the Jedi: what you gonna do now, Palpatine? Go to Disneyland? What drives a conspirator once his conspiracy succeeds at the grandest scale?

5. How early does Darth Vader begin to hatch his own plots to overthrow the Emperor, as long as we’re at it? Anakin doesn’t appear to have the cunning of Dooku or the raw intensity of Maul, so it would be interesting to see how he goes about learning his trade as a Sith in the early years of the Emperor’s reign.

—–

One thing that does occur to me is that it’s a little hard to believe that Darth Vader is so blasé when it turns out Princess Leia has fled to Tatooine at the beginning of “A New Hope”. Considering that he’s also receiving constant briefings about the Death Star plans when he returns to the Death Star, briefings that include a description of one of the droids involved and a description of a pair of farmers who appear to have been harboring the droids, it’s a bit weird that he doesn’t put two and two together quicker than he does. Even after Obi-Wan shows up, he seems to still think it’s just about the Death Star and Princess Leia. Presumably though this is what allows him to grasp right after the end of ANH that he has a son.

Posted in Popular Culture | 13 Comments

It Doesn’t Suck

“Sith” isn’t the best time I’ve ever had at the movies, but I enjoyed myself quite a lot. If that’s the right word for it: as many early reviews and a minimal knowledge of the timeline of the Star Wars universe might suggest, it’s actually a pretty dark film in places. I saw someone saying that they’d take any child to see it who had also been comfortable with Return of the King. No way. The violence in ROTK is too intense in many places for a very young child, but this is something completely different. It’s not just specific scary bits, but an entire story about how the good guys lose.

On the Star Wars charts, I’d actually rate it right under “Empire” and “New Hope” in terms of the overall satisfactions it delivers. As many early reviews have noted, it’s got some absolutely cringe-worthy dialogue in the middle, and the romantic material is toxic. I was actually wincing in pain at how bad it is in places. That’s the “Ewok” of this film, the element that really drags it down, but the romance doesn’t get as much screen time as those little hairballs did. The plot also creaks and groans in a few parts in terms of the effort necessary to get all the players into the proper locations for the grand finale, but the pay-off is big. Some of it really feels right in terms of those images you had in your head way back in 1977 when you tried to imagine the backstory of these characters.

There’s still a few geeky loose ends that I’m puzzling over, but nothing that the creative labors of continuity freaks couldn’t sort out reasonably easily. I’ll leave those for a week or two. But if you’re at all interested in this film, I’d say yes, yes, it’s worth seeing, and worth seeing reasonably soon. I don’t think it will disappoint anyone who likes “Star Wars”, though it won’t redeem “Star Wars” for those who have never liked it.

Posted in Popular Culture | 15 Comments

Tote that Barge, Lift that Bale

The wave of cultural criticism about how “Star Wars” changed American popular culture is now flowing at full tide, if not quite up to full tsunami intensity yet. There is not much new to say amid that flood.

There’s been tremendous attention to the changes in production models, the business of cultural production, visual aesthetics and so on. There’s also been a fair bit of writing about the underlying generational shift involved, but here I think there’s a bit more to be said. Along with a shift in the generational identity of the mass audience came a dramatic shift in overall assumptions about the nature and purposes of popular culture.

I’m really struck looking at American popular culture in the 1950s and 1960s, both films and television, at how much of it was quite programmatically doing some kind of “work”: the work of defining and reinforcing models of family and domesticity on one hand (sitcoms of various kinds in particular) and the work of defining national identity on the other (the Western, most acutely). It’s not that this work was being done in any simplistically instrumental or monolithic fashion. Westerns were a capacious genre, making all sorts of diverse statements, but even spaghetti Westerns or anti-Western Westerns were making statements about the nature of American identity and history. It seems to me that it was something of a given at that point that popular culture had a job to do, and even subversive work took that as its starting place, as the thing which it rebelled against.

“Jaws” and “Star Wars”, on the other hand, seemed to flatly lack that sense of carrying a burden besides entertaining their audiences. This is not to say that they are not both absolutely filled to the gills with potently meaningful social content: Jaws has a lot to say about suburban family life and masculinity; Star Wars has an iconography of heroism and villainy that has kept cultural critics humming ever since 1977. Neither film, nor much of the popular culture which has descended from them, carried a sense of being on a general, shared mission, a common understanding of the purpose of cultural work. Even when popular filmmakers or television producers have proclaimed a social agenda (say, the professed desire for a strong female protagonist that appeared in the making of Aliens) it generally seems a personal or individual aesthetic rather than a systemic imperative.

There’s a lot that lies behind that (the disintegration of the studio system, for one), but I think this is one of the things that sometimes befuddles Baby Boomer cultural critics about the post-1977 moment. Both left and right, they expect popular culture to be doing certain kinds of explicit work, to have a function which one might either defend or assault. When that functionalist sensibility is not there, many of them look for it all the same–hence the cultural right’s constant assertion of popular culture’s “liberal” agenda, and the cultural left’s perpetual assumption of a consciously instrumental and persistent use of representation in popular culture to do the work of political and social domination.

I think the cultural property which brings this out most clearly for me is not Star Wars but Lord of the Rings. Here’s a book which ascended to its popularity privately, through word-of-mouth and intimate discovery, mostly among audiences born after 1960. When it finally became a series of successful films (after various lesser animated versions), some observers were left puzzled, reading the tea leaves of the zeitgeist for an explanation. What “work” were the books and films doing? The work of supporting the “war on terror”? Identity work, in the casting of heroes as Nordic and pure and villains as dark, black, racially Other? The work of moral absolutism? Of aestheticizing violence?

What a lot of this sort of cultural criticism missed was that the main story of the ascendancy of LOTR was rooted in a more interior kind of sociology. One of the major satisfactions of a great deal of fantasy literature since 1960 has been compensatory: a chance for readers carrying a sense of intimate persecution by or exclusion from the cultural mainstream as constructed in the 1950s and 1960s to embed themselves in narratives where true worth and value had social meaning. Medieval fantasies–and for that matter, fantastical science fiction like Star Wars–frequently trafficked (and still traffic) in imaginary worlds where hierarchy and inner merit have some correspondence. Fans are slans; much of the work we read avidly and dreamed constantly allowed us to imagine universes where our intelligence, our insight, our moral character, our will, our skill, made us knights or wizards, Jedi or Deryni, superheroes. If those worlds were ones where our doppelgangers were persecuted or had to fight to redeem the true social order, so much the better: it made the correspondence deeper and richer still.

I think this aspect of the shift is also reasonably well understood, especially among critics who focus on science-fiction and fantasy. But I do feel somehow that at the general level of our society, the connection still has not been made, the shoe has still not dropped. We went from a Baby Boomer popular culture that shared a common sense of its function to a popular culture devoted to the interior identities of closet meritocrats, to a rising generation of men and women who were less interested in films, books and TV shows that did “nation-work”, “family-work” or “gender-work” and much more interested in popular culture that was about the cultivation and protection of the self, about what was perceived as highly personal “imagination-work”.

The curious and interesting part of this to me is that much of American popular culture today is still recognizably my culture, the culture I grew up with, cherished privately, was sometimes ashamed of (and still am sometimes ashamed of: I get very uneasy when my non-geek neighbors or colleagues happen to see my study full of action figures, comic books, SF and so on), but it is also now everyone’s popular culture, the whole world’s popular culture. The triumph of superheroes, fantasy, and so on, isn’t really the triumph of the private worlds of those of us who consumed all those things avidly. My mom has seen and liked Star Wars but there’s still a big gap between the ways in which she is entertained by the film and the epiphany I had in a theater in 1977, the intense shock at seeing my interior, private, vaguely shameful imaginary spaces suddenly realized on a movie screen, and the disorienting sense that those fantasies were well-liked by most everyone. Much of what has happened since still does various kinds of interior “work” for me, but I don’t think it’s doing that work for anyone but me and all the other people in my tribe. For everyone else, it’s just fun and entertaining and perhaps sometimes a little odd, and for those older Americans who expect their popular culture to be doing other kinds of heavy lifting, perhaps also perpetually disappointing and lightweight. They’re wrong, but I’m not surprised they feel that way: the heavy lifting being done is done on landscapes inaccessible to them.

Posted in Popular Culture | 10 Comments