The Evitable Future of the Digital

I don’t think anyone will be surprised that I agree to a large extent with Virginia Heffernan that education needs to prepare contemporary children for the world of work and citizenship as it is and will be rather than as it has been, and that this primarily involves new engagements with digital media as tools and publishing platforms.

There’s an interesting paradox embedded in Heffernan’s essay that applies to educators, though. She runs through a long list of careers and activities that rely upon skills in digital media or with information and communications technology that already exist, and uses them as a signpost for the unknown careers of the future that will require students trained in today’s cultural and knowledge-producing environments.

The paradox is that somehow we got to this point without our education system having that orientation. That’s a lot of content, work and invention without the training that Heffernan suggests we’ll need for tomorrow’s world. So the trick for educators is not arguing about what we’ll need to operate at all, but about what kinds of improvement and range a “new culture of learning” could achieve, or what kinds of still-unseen practices we might engender. And that is indeed a tricky business: most of the people who try to envision the practices and careers that might come into being succumb quickly to goofy utopianism.

We can start smaller. I think the term “digital native” is basically nonsense. Young adults are not intrinsically and universally gifted users of digital media and online communication simply because they were born in the right generation. They are more accustomed to certain kinds of practices than many older people, sure, but that’s not to say that there isn’t a lot left to learn, lots of untapped possibilities. Moreover, the distribution of skill with digital media and online communication is uneven even in young people. I see a very wide range of know-how and comfort with new media in our population of highly selected students and elsewhere. So educators can argue that their immediate job is to ensure an even distribution of experience with new media practices and a richer exploration of interpretative and expressive work in those media.

Of course, to do so, educators themselves would have to have widely distributed skills and be practiced in those richer possibilities. This is not my sense of the current norms in higher education in the humanities and social sciences, nor do I necessarily see incoming faculty as being markedly closer to that goal, only that there are tendencies in that direction.

But the silver lining here is that what will most improve or sharpen practices of new media creation and interpretation is not technical skill with hardware and software nor is it being the most brave-new-worldish professor on the block. What would most dramatically improve or transform existing digital practices of cultural interpretation and information literacy would be the extrapolation and extension of many of the existing and long-standing strengths of humanistic inquiry. Note I do not say, “Just keep doing what you’re doing.” New media environments are new, and the jobs and practices which extend from them are also novel. Sometimes in little ways, sometimes in very big ways. But intellectuals have followed culture and ideas into new spaces and modes of expression before and accepted that in that journey, much of their own practice would have to change. This is just a bigger and more dizzying expedition. We need to be able to envision something like the transition between the spread of print culture into coffeehouses and public spaces in the 17th and 18th Centuries and the disciplined improvement and wider distribution of print communications and print-based knowledge production in the 19th Century through dictionaries, encyclopedias, public education, and the like.

The key thing, however, is that academics don’t have very long to figure out how they’re going to describe the ways in which their skilled guidance will significantly improve existing practices and professions involving information, knowledge, and representation. If all we can do is complain querulously that nothing has really changed, that everyone should stop fiddling with this new-fangled shit, that young people these days are so clueless, rather than demonstrate what better ideas and more ethical approaches will look like, then we really are going to be in trouble. Higher education (and K-12, for that matter) is going to have to really show what value-added work looks like in a 21st Century world, what better cultures and ways of reading and understanding cultures might be. Pure rejection, unless it seems truly aware of what it’s rejecting, isn’t good enough. But neither are blank checks written to supposedly inevitable futures in which everyone is required to be a digital native, as if merely deciding to be digital sufficiently explains what the average skilled, educated digital practitioner of the future will be. If we don’t have any sense of what it is that we lack, given how much has already changed, we can’t make a convincing case for why or what we’ll need to teach.


Seven Days in the World of Books on Fire

I said it on Twitter but I’ll say it here. The relief for a stupid book review in which someone says something that is not only evaluatively stupid but actually empirically wrong is to say so. It’s not a 65,000 pound libel judgment. I’m sorry, but Sarah Thornton has committed an act of violence against the academy which granted her degrees and against the literate world of her practice. Lynn Barber committed exaggeration, misstatement and a nasty seasoning of prevarication on top of it in her negative review of Thornton’s book. Barber’s reward should be humiliation, intense disagreement, and having to admit the truth. If we had a court that compelled that, rather than expected it of anyone purporting to be an intellectual, I might be down for that.

Barring that, the real upshot of this should be that any writer who publishes in the UK should think again. And anyone reviewing or commenting or footnoting or otherwise using writers who publish in the UK should think once more beyond that.


Culture Fears

I’m completely in agreement with Claire Potter, writing at her blog Tenured Radical, that mocking Governor Rick Perry for his college grades and using them to explain Perry’s policies on education is a bad idea on several levels. As Potter points out, for one, it personalizes and psychologizes an argument that is more powerful if it’s about politics and not just about Perry. Perry is not engaged in an obsessive, solitary attack on public education, after all. If you’re a professor, attacking him on this point also breaches professionalism. As Potter also notes, faculty should know that a mixed transcript can mean many things, and should have enough decorum to refrain from mocking anyone in public over their grades unless that person is making untrue claims about their earlier work. Weak grades for an 18-to-22-year-old can mean any number of things, and tell you very little about the capabilities and character of a public figure twenty or thirty years later.

The deeper issue is something I keep coming back to on this blog. Precisely because I so deeply oppose what both the leaders and the rank-and-file of the Tea Party and their allies are trying to accomplish, I think it’s absolutely crucial to spend time trying to understand their motivations. That means inquiry rather than dashing off some bargain-basement invective about racism or standard-issue rhetoric about how they’re being manipulated by the steering committee of global capitalism.

I have two areas that I personally focus on in that inquiry. In this post, I’ll focus on the Tea Party as culture warriors retaliating for a long series of perceived intrusions and trespasses against them. The thing is that kneejerk mockery of Perry as an ignoramus based on his college grades shows that this perception has some basis in reality.

I’ll refrain from once again going through my “Gramscian repurposing of institutions once legalistic reform hit the wall in the early 1970s was a tactical mistake” argument in full. In short, the last forty years have provided deep and shallow reinforcements of the proposition that educated Americans who have some ability to prosper in the service economy that they played some role in bringing about have a lot of contempt for anyone left in their wake.

I know there are completely legitimate policy questions as well as moral ones trailing in the wake of issues like gun rights or public forms of religious practice. It’s troubling, still, viewed from a historical distance, how quick a lot of people were to rush in and entangle those policy questions with cultural ones. Honestly, as soon as gun rights questions come up, if I’m in a crowd where everyone is the “right kind of person”, it’s not long before the issue isn’t the right to own a gun but pick-up trucks with gun racks and nude female silhouettes on the mud flaps. And I grant equally that there’s a legitimate politics where that truck is an issue, too, but the mixing of the two has been fairly incendiary.

The consequence is a situation where, when Palin, Bachmann, Huckabee or Perry says something laughably wrong about either the past or the present, there is nothing that we the eggheads can say that doesn’t get written into that record, one more bad grade in a class that we do not control and that the students refuse to accept as valid. If you’re a teacher, you know it’s a hopeless situation from the moment that dynamic takes hold. In this case, in no small measure because the moment you set yourself up as the teacher to your peers on all things, rather than on the small and narrow range of things you really know better than they do, you’ve lost yourself as much as anyone you’d hoped to teach.

I don’t know better than anyone how to minister to all children or how to deal with all criminals or how to save all the dying or fix all the economies or have all the right governmental processes or have soldiers kill only those who ought to be killed. The olive branch to offer anyone who wants to be my sibling and citizen and peer is not, “You’re always wrong and stupid and you get a C”. It’s also not, “The whole of the law is do as you will: tear down my country and I’ll still always offer to negotiate with you and compromise with you”. Drawing a line in the sand isn’t about what we know: it’s about what’s right and wrong.


More on Going to Graduate School

Notes for further revisions and additions to my old essay about whether graduate school is a good idea. Thanks to Paul Musgrave for helping me to think through some of these points, some of which involve the academic job market and others of which involve graduate school itself.

1) When I first wrote my essay and posted it on my old hand-written HTML blog in 2003, I was newly tenured and very inspired by conversations at the Invisible Adjunct’s blog. By that point, I’d only advised a relative handful of Swarthmore students about their plans to go into academia (looking back, it seems as if most of the folks who chose to go then have had good outcomes from that choice).

Now I’ve advised a lot more students and I’ve developed a deep frustration over my inability to reconcile two imperatives that govern my advice. Once I’ve delivered the basic warnings contained in my old essay and some of the cautions about how troubled academia is as a career, I move on to talking to students about more specific questions about what fields they might study, where they might apply, and so on. Sometimes I flat out tell a student that it’s a bad idea for them to go to graduate school, at least at this point in their lives. I’m usually comfortable if not happy about doing that. If I think there are no red flags and it’s possible that the student might find graduate school relatively congenial and have a decent shot at winning a good prize in the tournament, then we get down to brass tacks.

Here I hit the contradiction. Students who already have a commanding, mature sense of their academic interests and inclinations often are already thinking past or against disciplinary boundaries. Some of my smartest students with the best skills as writers and speakers are often practicing intellectuals in the best sense, combining ideas and methodologies from multiple disciplines. They’ve lived up to the ideal of a liberal arts education. I want them to continue being that way.

But now I have to advise them about what programs to apply to. Many of them are dissatisfied with disciplinary exclusivity. I hear from students who want to do ethnographic research but are also interested in studying the past, students who are interested in integrating theory from economics with qualitative sociology, students who want to do political philosophy but not as the disciplines of philosophy or political science practice it, students who want to combine film or media theory with some kind of media production but who don’t want to do an MFA or go to art school. You name it, there are interesting, well-reasoned combinations that I hear proposed, many of them founded in inspirational coursework students have done while at Swarthmore.

So I have to decide: am I going to carry water for narrower, more constrained, more territorial practices of disciplinarity that are governed from and set by the elite R1 institutions that have the Ph.D programs and cultural capital that my students are aspiring to? To be responsible, I should do so: I need to tell my students that they’ve got to make some constrained choices, give up some of their interesting ideas, conform. I should do that for pragmatic reasons, that this is what the next few years are going to be like (bowing the knee to the most controlling or authoritarian presences in their graduate program). I should do so maybe even because there’s a legitimate case to be made for getting a strong command of one disciplinary tradition under your belt before you mess around with several. (I’m less convinced of this now than I have been in the past, but it’s at least a legitimate discussion.) At the same time, I almost want to tell the students with the most creative and confident vision of their intellectual practice to just not go to graduate school, or to do graduate work in the one freaky program out there that would welcome their ideas even if it means they’ll be completely unemployable on the other end.

I know that some people will object and say that even the most odd-duck graduate program can find a place for its students. But honestly, I have been on the other side in way too many grant competitions, job searches, panel selections and so on. In a tournament economy with hundreds of highly qualified competitors, just one thing that irks one judge or evaluator is enough to knock you out of the race. If it’s a consistent thing, e.g., something that rubs up against an orthodox way of defining a field or discipline, it’ll knock you out of most races. It’s nearly impossible to convince most historians to hire someone whose degree is in anthropology and vice-versa. When the unlikely happens and someone from one discipline manages to infiltrate the redoubt of another, they’d better be happily oblivious to sniping and negativity, because I guarantee you they’ll have a constant background buzz trailing them wherever they go.

So as an advisor, do I carry water for a way of organizing the administrative and intellectual work of academic institutions? It’s the responsible thing for me to do, and offers much lower risks. But that’s the first step on the path to a lifetime of taking few risks in a career that offers protections that are intended to incentivize risk.

2) I’ve written before about how difficult it is to come up with intentional practices that help undergraduates acquire cultural capital, which I think is more important by far in social mobility than the content of the curriculum. For first-generation college students or students who have little familiarity with the hidden codes and assumptions of an elite liberal-arts institution, making it all transparent is absolutely critical. A student who loves literary criticism with a transforming passion but who has no idea how tenure works, where money comes from in a university, how scholars actually publish, what the big picture of disciplinarity is like, which famous literary critic is actually a notorious asshole, and so on, is heading for trouble at the exits if they decide to go to graduate school. This was such a classic pattern in the conversations at the Invisible Adjunct: people who wandered into the discussion still full of passion for some period of history, for some theoretical approach to social analysis, for poetry and fiction, or for the general idea of being an intellectual and so full of confusion and alienation about how little their graduate work seemed to resemble their romantic conception of what it should have been like. (Those heartfelt expressions have since been appropriated and bowdlerized in one strand of conservative ressentiment: check out Michael Bérubé’s latest at Crooked Timber for a particularly eye-rolling example.)

I feel this every time I meet a student who tells me he wants to go study with a famous scholar whose work the student has found inspirational and I gently have to tell the student that the scholar is dead, retired, doesn’t have graduate students, or is widely known as a monster to almost everyone in the field. Ok, sometimes you just don’t know these things. Sometimes I don’t know it any more either, because if you don’t train graduate students, you miss out on some of the cutting-edge gossip in various fields. But like as not, that first statement is going to be followed by other ones that tell me that no one has made anything transparent to the student until this very moment. Where I have to explain, for example, that a doctorate in political science is generally not seen as a great first step for a person whose main career objective is to run for elected office. That no one does simultaneous doctorates in microbiology, cultural anthropology and computer science except for characters in comic books. That you should expect to receive a stipend and a tuition waiver if you’re admitted to a doctoral program, and that if you don’t, this is a sign that they don’t really want you. That you don’t need to do a terminal MA first in one program as preparation for doctoral study. That there are no merit grants which fund more than a teeny tiny proportion of graduate work unless you fit some rare demographic blessed by an eccentric philanthropist, like being the child of an Orthodox Jew and a Quaker who would like to study medical anthropology in Patagonia.

And so on. All these little rules, ways of being, figures of speech. Most of them not at all defensible or rational, just the markers of a particular social habitus, of hierarchy. I can tell you which future graduate students generally already have the keys to the kingdom before they even start: the children of academics. Just about everyone else is likely to lack some crucial bit of insider knowledge that is important to flourishing. What makes this especially difficult is that so much of academic work both in graduate school and afterwards is inexplicit. I’m sure there are programs which are exceptions, but most of us were not trained to write (or interpret) peer reviews, letters of recommendation, grant applications, dossiers, paper presentations, and so on. You figure it out by watching others, but if you have the bad luck to happen on the wrong template or guess wrong, at the very least you’re heading for humiliation, at the worst for self-immolation.

So I struggle here too. Academic institutions endorse faculty diversity, but the conversation about diversity usually boils down to fixed identitarian formulas, to improving the percentage of recognized groups, not to diversifying the kinds of experience (and passions) that professionals can bring to intellectual work. I feel intuitively that the generation of faculty just ahead of me, people from their late 50s to 70s, is more diverse in this sense if not racially so. I know considerably more first-generation scholars whose passionate connection to intellectual work got them into academia in that generation than in any younger cohort. The question is whether I should encourage someone who I think hasn’t been exposed to all the insider rules and codes to go on to graduate work. There’s no way I can make up for all that in one conversation or even several. The best I can do is tell someone bluntly that they’re going to be at a disadvantage and that they’ve got to do their best to break the code every chance they get. At the very least, you owe it to applicants to tell them about this problem.

3) I mentioned this above, but let me mention it again. With rare exceptions, no Ph.D. program that is primarily or exclusively aimed at an academic career is worth pursuing if the applicant is not given a tuition waiver upon admission. Probably it’s not worth it if you don’t get some kind of stipend or support. (I’ll add by way of disclosure that I was not funded for my first year, but got funded by my second, though I had a waiver from the beginning. In retrospect, I should have gone with the offer where I was funded from the beginning, which might have been a better place for me to be in other ways.) Do NOT go into debt for a Ph.D. program that doesn’t have other well-paying career outcomes beyond academia. It is very easy to justify going into debt out of hope or even desperation, but this is some crushing stuff to overcome later on. People who tell me that it’s worth it to them because they love what they’re going to study so much, well, seriously: for the $80,000 in credit card debt you would otherwise take on, you could buy a lot of books, pay for broadband, and live in a decent apartment in a city where there are lots of free events with intellectual heft to them, and maybe even find a decent job. That’s a better option both for consummating your love of intellectual work and for developing a career and life, really. A graduate program aimed at an academic career should admit you with no tuition obligation and support you with a stipend because in the end you’re going to save them money by being a cheap teacher.

4) One thing I’ve heard over the years (most recently from several people replying to my last post) is that graduate work has a way of pulling you out of your existing peer network and making your life feel very deferred or de-synchronized. Certainly one thing that I absolutely tell potential applicants is that by seeking an academic career, they need to give up on the idea of living in a particular part of the country that they prefer. There are many wonderful places where you could choose to live as a lawyer, doctor, psychologist, accountant, information technologist, etcetera, that you simply can’t choose as a professor because there are no universities or colleges in those places, or maybe just one. Your pre-academic friends, on the other hand, may be making all sorts of choices like that, not just about where to live but which will-o’-the-wisp to chase. This is ok if one of your reasons for choosing an academic career is stability and predictability. But I talk to some students interested in graduate school whose self-image is that they are risk-takers, that they like change and dynamism, that they like the idea of being a professor but only if it’s being a professor IN BERKELEY or NEW YORK. This is going to cause trouble sooner or later. Relationships, life aspirations, a wide or diverse emotional range, are all structurally harder to work out if you’re a person chasing a tenure-track academic post via completion of a doctorate. This is something an applicant has got to understand in advance.


Weighing the Market

Maybe it’s time for me to update my old essay on graduate school.

In my old essay, I emphasized that the institutional culture of academia makes it very difficult to evaluate whether an academic career is a good idea or not once you’ve started your doctoral work. I pointed out that most doctoral programs are devoted to socialization into academic norms, not the deepening or enrichment of a liberal-arts style undergraduate education. I think this is still an important thing to understand: if you’re thinking about graduate school in an academic subject, you have to know that you can’t just experiment with it and get a clear understanding of whether you like it or not.

I consciously chose to put aside the question of the job market in academia when I wrote that essay. Markets are volatile, and my advice could be rendered moot fairly quickly. At this point, though, I think several things about the academic job market are likely to stay relatively fixed for a while.

First, the working conditions in academia have grown steadily worse for the last decade and are likely to worsen further. Leave aside the question of blame for another conversation. Faculty, alt-ac, and administrative jobs have less job security, pay worse, and have fewer benefits than they used to. Positions in public universities are also increasingly subject to aggressive interference by elected officials seeking either to protect cronies or score points by bashing the professoriate, constantly belittling the professional competency and work ethic of faculty. Academia is an extreme example of a tournament economy where only a few people can hold the desirable jobs, and where getting to hold those jobs involves very significant amounts of luck. Anyone considering an academic career has to think soberly about this point.

Second, the opportunity costs involved in pursuing an academic career are still considerable and disproportionate. In a perverse way, that situation might have improved slightly as the employment picture of the whole economy worsens. If there are no other opportunities, then there’s less loss in six years or more spent gaining a doctorate that has no other uses besides an academic career.

I need to break this point down a bit further:

a) There are fields of doctoral study where a Ph.D is an important entry-point credential for careers other than academia. Economics is the classic example, engineering another. In those fields, graduate study doesn’t have the same opportunity costs, though each has its own job market blues to consider.

b) There are fields of doctoral study where a Ph.D actually will degrade your employability outside of academia. Not because employers will think less of you, but because your Ph.D overqualifies you for a position you would be glad to accept, or imposes a financial burden on an employer because of a mandatory payscale-to-qualification rule. This is where I really have to take issue with the increasingly large group of consultants and advisors who offer their services to ABDs and Ph.Ds seeking employment outside of academia. I appreciate the optimism of advice to “think outside the box” of the tenure-track job, but in disciplines that do not have established career paths where the doctorate is a nearly mandatory credential, almost all the things you could do with a Ph.D you could do without a Ph.D. If the Ph.D has any value in a less precisely defined professional setting, you often won’t know that until you’ve worked for a while in that profession, so it’s best to wait until that value becomes very clear before committing to the degree.

The problem with Cheryl Reed and Dawn Formo advising people to look to a whole range of other professional possibilities that make good use of the tools and skills acquired in pursuing a doctorate is that they assume that there are such skills and that the people they’re counseling have a clear sense of the range of the applications of those skills. This is where I come back to my older essay: in many disciplines, most of what you learn in graduate school is how to be an academic. If you learn other things (textual interpretation, archival research, quantitative skills), that’s often an autodidactic deepening of things you already knew how to do at the end of your undergraduate career. If you learn presentation and writing skills, either through teaching or preparing work, that’s often more in spite of a doctoral program than because of it.

If the job market for doctorates is going to be more flexible in general, rather than in a few fields, the doctorate itself needs a radical redesign. It needs to take less time and be far less focused on the sociocultural reproduction of the academy’s artisanal norms and fetishes. Advising people that they can find other uses for a doctorate prior to that kind of redesign can never be more than trying to help people make the best of a bad lot, salvaging what they can out of the wreckage of a near-decade (or more) of effort. Which, I have to say, has a certain kind of recursive intensity to it: one of the better things you can do to salvage your lost opportunity is advise other people about how to salvage theirs, but anyone in that line of work is just as dependent on people continuing to believe a doctorate is the right thing for them to do as the rest of academia is.


Some Weeds

I got into an unedifying dispute some years ago about the term “Eurocentric”. Some conservative cultural critics seem to think that any mention of the term marks you off as a crazed member of Sendero Luminoso or some such. I pointed out that the word can and often does have a fairly neutral, technical meaning. Sure, you don’t call something Eurocentric as a compliment (maybe some conservative cultural critics do?) but it is a useful way to label arguments that see Western Europe as playing a central or exclusive role in some important historical or intellectual development.

Or perhaps even just ways of lightly taking Europe or the West as a universal subject. A couple of folks in my Twitter feed were pointing out recently how annoying it is when a journalist or travel writer talks about how some non-Western place is “in the 13th Century” or is “unchanged for the last millennium”, because the 13th Century the reporter has in mind is almost certainly a generically European one, not the local past.

What makes this kind of thing aggravating is that it’s unnecessary. If someone wants to make a serious argument that the West really is unique, that post-1450 world history revolves around a “European miracle”, that some kind of universalism is necessary and has to reference the Enlightenment: that’s all perfectly legitimate fuel for an intellectually respectable argument.

If, on the other hand, someone wants to be a serious, committed racist, that’s not at all legitimate, but it’s very likely an intentional practice, and whatever they have to say about the barbarism or backwardness of non-Western societies is thus equally deliberate.

Casually Eurocentric turns of phrase, or ways of framing an analysis that take Europe as the universal human subject without really needing or meaning to? I don’t feel inclined to drop a ton of polemical bricks every time I come across this sort of thing, or act as if every instance is one more bar in an epistemological cage. Sometimes this kind of construction is just careless, and more importantly, completely unnecessary to the ideas or expression in question.

Case in point, Richard Mabey’s new book Weeds: In Defense of Nature’s Most Unloved Plants. It’s a very interesting, engaging book that intelligently synthesizes a range of different ideas and discussions of weeds, from literary representations to agricultural science. I recommend it without hesitation.

And yet, Mabey has a habit throughout the book of talking about weeds in an implicitly global or universal tone while much of the content of his discussion of weeds comes from British history and culture, or occasionally more broadly European history and culture. He’s quite aware that in some sense, European weeds are global or universal weeds now, even mentioning Alfred Crosby’s Ecological Imperialism to drive home this point. It’s not just that plants from Eurasia have disseminated around the world but that the European way of imagining plants as weeds has done so.

Mabey is terrifically interesting in his reflections on the concept of weeds, pointing out how shifting and contingent it really is. But all the more reason that he ought to be at least slightly aware of the possibility that European ideas about weeds as well as the plants themselves were certainly not universal in the past and may not be universal today.

I understand why an author like Mabey reaches out for what’s close at hand: the plants in his garden, the plants in his country, the plants in his native literature. That’s fine. I’m not asking that an author that’s trying to address the overall concept of weeds go off and read 18th Century Chinese-language documents on agriculture and weeds, or do research on African agricultural history, or anything of the sort. I think it’s fine to gesture outward from what you know best and what you can most easily find to bigger and more general stories.

It’s just that it costs very little and potentially gains a great deal to leave room to wonder about bigger questions: is this how all human societies used to see weeds before modern globalization? Did the concept of a weed even exist in some pre-1750 societies? Are there weeds whose histories of travel and dissemination are strikingly different from those of the weeds which came from Europe (or were the result of European replantings like Japanese knotweed)? Are there places today where agriculturalists see or talk about weeds in a really different way from the U.S. or Europe? All of these kinds of questions are easy extensions of what Mabey is already interested in, but there’s something about the way he assumes universality at certain moments that prevents them from ever rising to the fore. Without any deliberate effort, England and Europe become the world, the normal, the universal referent, and there isn’t any particular reason why they have to be. Provincializing Europe is in some ways a small kind of project and one that shouldn’t require a lot of flash and fire.


Out, Out Damned Spot

Is there anything more grating than an interpretation whose language slips and innocently anoints its analysis with the status of a fact?

I’m sure I noticed this pattern in the letters to the editor in this week’s New York Times Book Review because they were complaining about Laura Kipnis’ review of Maggie Nelson’s The Art of Cruelty.

Kipnis’ review started off with a wonderfully bracing slap to that most tedious kind of middlebrow NPR-listening muddled complaint against mass culture: “Well-meaning laments about violence in the media usually leave me wanting to bash someone upside the head with a tire iron. To begin with, the reformist spirit is invariably aimed down the rungs of cultural idioms, at cartoons, slasher films, pornography, rap music and video games, while the carnage and bloodletting in Shakespeare, Goya and the Bible get a pass.” Kipnis continues, “Low-culture violence coarsens us, high-culture violence edifies us. And the lower the cultural form, or the ticket price, or — let’s just say it — the presumed education level of the typical viewer, the more depictions of violence are suspected of inducing mindless emulation in their audiences, who will soon re-enact the mayhem like morally challenged monkeys, unlike the viewers of, say, ‘Titus Andronicus,’ about whose moral intelligence society is confident.”

If I could fit that on a tattoo, I’d get it put on my arm, just to save time the next time I want to say roughly the same thing, which my friends and colleagues can tell you is about once a day.

It’s just about as predictable that after saying it, you can expect some kind of rebuke from purveyors of the conventional wisdom, often one that speaks past rather than to the original critic.

When I’ve been on panels about media-effects arguments, I’ve always been a bit amused at the gentle chaos that articulating a critique like Kipnis’ tends to sow among researchers or audience members who follow the standard line. They’re ready for dramatic self-righteousness if by some chance an executive or producer from the culture industry should happen to show up and disagree, but not for zooming off in a more perpendicular direction, such as a more academic dismantling of the methodology or conclusions of long-standing media-effects work, or Kipnis’ point about how much criticism of violence in mass media is rather open in its pimping for high-culture snobbery.

As an illustration of what that gentle chaos can lead to, Josephine Hendin’s response to Kipnis is a prime example of the aforementioned rhetorical transposition of an act of interpretation into a statement of fact. Moreover, because Hendin is talking about violence, art and popular culture, she does a pretty fair job in two paragraphs of demonstrating why there was a scholarly revolt against limiting the subject of literary study to high-culture works.

Hendin complains that Kipnis “does not clearly distinguish” between valuable artistic uses of violence and “shock value”. I’m sorry, were literary critics the people who were supposed to be especially skilled at close reading? Because as a starting observation, this leaves me a bit confused. Kipnis starts off her book review rather clear on this point: she thinks this distinction is bollocks. So perhaps Hendin meant to say, “I don’t agree with Kipnis: I’m going to argue that there is a distinction”. See, speaking of distinction, I think there’s one between saying, “I don’t agree with you” and “you didn’t make my argument and made your own instead, so I think you’re being unclear”.

The rest of the letter has the same problem: interpretations are converted by some invisible table into empirical data. I understand, it’s a two-paragraph letter, and not a monograph. But it’s not that hard to find monographs by literary critics that make the same rhetorical slip for hundreds of pages, refusing to characterize or imagine a claim as an interpretation and instead stating it as something which is. “Much of pop culture is about endemic desensitization to anything but the action of violence”. Much? Well, what have you got in mind? Tomb Raider and Andy Warhol, really? Not what I’d call major foundation stones of contemporary popular culture, but that’s how these arguments usually work: highbrow critics and audiences reach out desperately for the one or two pop culture texts or properties that they have some paratextual familiarity with, maybe from a panel four years ago at the MLA or from their teenage child’s unrefined cultural consumption.

“Does not clearly distinguish” is of a rhetorical piece with some of my least favorite repeated phrases in undergraduate papers. For example, the venerable favorite: that the author of a text “forgot” to make an important point in that work. For some reason, my students think this is a gentler, fuzzier way to say that the author is wrong on some important point, while also hoping that it will keep me from noticing that they don’t really have a fully worked-out understanding of what is wrong with the author’s argument. What I point out to my students is that this is both a more condescending characterization than simply saying that they disagree with the text (I’d rather be argued with than have it insinuated that I didn’t do my work properly) and that it calls attention to, rather than disguises, a lack of command over the issues.

I agree that direct and declarative language is a good thing, whatever the length of an analysis. But it’s important to use language that always recalls what interpretation really is, and what it’s not. One of the requirements of that language is self-awareness. By all means generalize, but know that it’s you that’s doing it.


Do I Really Look Like a Guy With a Plan?

I can’t really let this whole mess go, much as I want to turn my attention to a couple of other issues and ideas that are on my radar screen. Much as I did before the Iraq war started, I’m having trouble sleeping. Even more than the political leadership, it’s the punditocracy that’s on my mind.

I’m simultaneously fascinated, vindicated and irritated by the near-total inability of expert commenters, political journalists, talk show hosts and other purveyors of the conventional wisdom to grasp what’s going on here. This is not just a case of blind men feeling their piece of the elephant: they’re not even groping the right animal.

Most of the talk, like on WHYY’s Radio Times this morning, is about what “the public” or “the American people” think on one hand, and on the other it’s the usual sports-style color commentary on what Washington insiders are thinking, doing and planning. At least in the second hour, Stuart Diamond and Julian Zelizer got pretty close to breaking out of that discourse to the real point. On other shows in the 24/7 news cycle across various media, I’m hearing a few experts and pundits make fragmentary moves to shift the conversation, often pulling back to more familiar tropes.

I suspect that a lot of Washington insiders, including older, established Republicans, are further down the road to recognizing the new shape of things, but it must be hard for them too to grasp what’s happening.

In a nutshell, what’s going on is something that hasn’t happened in American politics for 50 years: an ideologically coherent social movement with clear political aspirations has taken shape out of murkier antecedents and disparate tributaries, and at least for the moment it has a very tight hold on the political officials that it has elected. The movement is not interested in the spoils system, and its representatives can’t be quickly seduced into playing the usual games. And the movement’s primary objective is to demolish existing governmental and civic institutions. They’ve grown tired of waiting for government to be small enough to drown in a bathtub, so they’re setting out with battleaxes and dynamite instead.

Social movements that aren’t just setting out to secure legal protection and resources for their constituency, but are instead driven to pursue profound sociopolitical transformations, are unfamiliar enough. What makes this moment even more difficult to grasp in terms of the conventional wisdom of pundits is that this isn’t a movement that speaks a language of inclusion, hope, reform, innovation or progress. It speaks instead about restoration of power to those who once held it, about the tearing down of existing structures, about undoing what’s been done. This movement is at war with its social and institutional enemies: it has nothing to offer them except to inflict upon them the marginalization that the members of the movement imagine they themselves have suffered.

Even the left, whatever that might be, is having a hard time getting its head around the situation, because for decades it has been accustomed to thinking of organizations on the right as fringes or cults that need to be monitored or controlled, or watched for their infiltration of legitimate politics. It’s very true that the Tea Party and its cognate organizations are not by any means a majority of the electorate, but the point is that they’re a very coherent plurality that can win majorities in enough districts and localities to block votes and prevent business as usual, and that preventing business as usual is a political objective in its own right for them, not just a means to some other end.

On Radio Times, Stuart Diamond, who specializes in the management of negotiation, began to grasp the nettle when he recognized that you can’t find a compromise with a group that’s not seeking a compromise. Everybody who is still talking in those terms about deals and compromises really doesn’t get what’s happening. Even if there’s a compromise or agreement by Tuesday, it’s not going to have any long-term meaning. There is a sufficiently large political bloc inside the political system and a sufficiently coherent social movement outside of it who are unafraid of economic chaos, welcome the federal government’s inability to meet its obligations, and hope that the President is stuck with a major national crisis that he can’t fix because that’s what they want. Michele Bachmann isn’t ignorant about what might happen next week if there is no deal: the voters whose endorsement she seeks are hoping for the worst-case scenario.

Of course, the other reason that the punditocracy doesn’t know how to talk about a real social movement is that it isn’t the kind of thing that lends itself to political management or policy formation: it takes them into the unfamiliar discursive spaces of history, anthropology, culture, political theory, which don’t lend themselves well to punditry and don’t produce smugly self-contained recommendations and conclusions.


Environmental Studies Capstone: An Early Sketch

Next spring, I’m going to be the faculty coordinator for the Environmental Studies capstone course for seniors completing a minor in the program. The field as a whole is not an area where I’m deeply knowledgeable or do a great deal of research of my own: my primary point of connection until now has been through a course I teach on the environmental history of Africa. What I want to do is focus the capstone on something I know a bit more about, namely questions about the intersections between the public sphere, online communication, everyday life and popular consciousness, expertise and academic authority, and political activism and policy formation. The driving question behind our work will be: why have debates and conflicts over climate change science and policy taken the shape that they have in the United States over the past fifteen years? What drives controversy and contestation over this issue? And depending on the answers, what do the students in the course want to do about their conclusions, if anything?

So I’m imagining the course as a very open-ended, problem-based, student-driven investigation of the big questions in the first half, and an equally open-ended brainstorming and workshopping of strategies and solutions in the second half, almost as in a lab-based approach. What I have in mind right now is that in the first half the students will investigate what I would call “big narratives” about the underlying causes of political and social debate about climate change, and build up what I’ve imagined as a series of flow charts built around each of these big narratives. Each week in the first part of the course, I want all of the students to participate in a scavenger hunt looking for what they consider to be influential, successful or intriguing examples of a particular narrative: books, online discussions, organizations, political campaigns, advertisements, and so on.

At this point, I’ve got four such narratives in mind, and I’m interested in getting feedback on whether these are sufficient, or whether there’s a better way to characterize them, keeping in mind that I’m trying to come up with highly generalized starting points.

1. Conspiracy. That is, the legitimate findings and recommendations of climate change science are being deliberately sabotaged by powerful interests who stand to lose either money or political influence (or both) if those findings and recommendations are broadly accepted. Examples: Oreskes and Conway, Merchants of Doubt; Hoggan, Climate Cover-Up; Michaels, Doubt Is Their Product.

2. Climate change science or its accompanying policy orthodoxy is actually wrong in some or all respects. Opposition is rational and legitimate. Flaws might be epistemological, empirical, or involve the policy recommendations that have followed on the scientific findings. Examples: Bjorn Lomborg, Cool It; Roger Pielke, The Climate Fix; Michaels and Balling, Climate of Extremes; Spencer, The Great Global Warming Blunder.

3. Climate change scientists and their political supporters have erred tactically, rhetorically or organizationally in disseminating their findings or mobilizing public support, or must otherwise pursue new kinds of political tactics or work with different structures. Examples: Shellenberger and Nordhaus, Break Through; Kirkman, Skeptical Environmentalism; Paul Gilding, The Great Disruption; Heinberg, ed., The Post-Carbon Reader; Craven, What’s the Worst That Could Happen?; Elizabeth Kolbert, Field Notes From a Catastrophe. (This narrative probably requires a significant week-long detour into more abstract discussions about activism, political organizing, policy formation and so on.)

4. The climate change debate is the product of complicated, deep-seated conflicts, transformations and habits of mind. It is a synecdoche for much larger struggles over culture and values, social antagonisms, economic transitions; is strongly skewed by new relations between expertise, authority, information and democratic publics; is shaped by complex-systems interactions involving nature, economies and society that no one controls or can predict; or follows from powerful cognitive habits and patterns governing the formation of opinion, the assessment of risk, and so on. (This narrative probably requires a significant detour into questions about popular relationships to scientific expertise, red-state/blue-state cultural conflicts, the nature of the public sphere, complex adaptive systems theory, and discourse analysis.) Examples: Hulme, Why We Disagree About Climate Change; Randy Olson, Don’t Be Such a Scientist; Donella Meadows, Thinking in Systems; Gardner, The Science of Fear; Burton, On Being Certain; Latour, We Have Never Been Modern; Mitchell, Rule of Experts; Agrawal, Environmentality.

————

Obviously, 1 and 2 are in some sense the simplest narratives for us to work with (and this might be why they are strong catalysts for political organizing on either side of the conflict). Narrative 4 is a bit overwhelming in that I’m collecting virtually every complex systems-level interaction I can think of under that heading, but I’d like to consider those all alongside each other and poke around for ways to integrate or connect some of those approaches.

Should I subdivide these (keeping in mind I have a finite number of weeks, that I want to present a big overview early and have the students make their own selections or decisions about where to dig deeper)? Am I missing another “big narrative” that warrants early investigation?


An Analogy

I mentioned this analogy in my Twitter feed and was asked to explain it in a bit more detail: that Obama’s Presidency is increasingly resembling James Buchanan’s Presidency. Buchanan was the 15th President, holding office just before Abraham Lincoln and the outbreak of the Civil War.

Historians of the U.S., especially specialists on antebellum history, are welcome to complicate, reject or deepen the comparison. But the rough outline that I see is that Buchanan was regarded by his political colleagues as intelligent, articulate and erudite in matters of law and political procedure. That is, he wasn’t an inert dud or incompetent like some other antebellum or late 19th Century Presidents.

He’s nevertheless commonly regarded as one of the worst Presidents in American history because of the way he chose to deal with the deepening crisis over slavery, states’ rights and secession. He entered office determined to broker a lasting compromise between the two sides, positioning himself as an uncommitted, neutral figure who could be a trustworthy arbiter. That stance ended up infuriating almost everyone involved in the conflict.

The basic error was that Buchanan approached American politics in procedural or legal terms at a moment when the reigning political conflicts in American life were no longer in any sense shaped or resolved by procedural or legal processes. He waited passively for legal decisions to determine his course of action, and when the Dred Scott decision dropped in his lap, he regarded that as the end of the matter. Open conflict in Kansas baffled him, and again he turned to a safely procedural answer (advocating that Kansas enter the Union as a slave state).

His worst moment in these terms was when he reacted to secession by characterizing it as illegal while maintaining that doing anything about secession would also be illegal. That’s pretty much the definition of clueless, of a basic incapacity to grasp the nature of the situation.

One point that’s important about the discussions shooting back and forth between various bloggers about liberals, neoliberals and the left is that the comparable error in the present moment is assuming that the political conflicts inside the Beltway are still being driven by rules, procedures or processes, either those that structure American governmental authority or those that supposedly drive individual behavior and calculation. There are a lot of differences between 1855 and today, but a key similarity is that the drivers of political conflict are not originating from within the long-standing rule-based norms of American political process, nor are they mappable to some kind of rational, game-theoretic or utility-seeking calculation by powerful individuals.

As in 1855, there are moments of potent intersection between the complex social and cultural formations driving large-scale political conflict and the formal political system (remember, after all, that it was Lincoln’s election that was the final catalyst for secession). But any elected official who really wants to lead at this moment needs to stop paying attention to what’s going on inside the Beltway and start paying attention to what’s going on outside of it. Any meaningful action that involves an engagement with the grievances of Tea Party activists has to be aimed at trumping or bypassing the established rules of the game. In those circumstances, compromise for the sake of compromise, justified in the name of necessity or helplessness, doesn’t resolve anything. It just kicks an increasingly explosive can down the road a bit. Like Buchanan did, which I think justifies his reputation as the wrong leader at the wrong moment.

Just to twist the knife a bit: a leader who hopes to restore respect for procedure also would have to do a more consistent job of it than Obama has: he’s done very little to rein in extrajudicial or arbitrary uses of security and military power, very little to call back extreme assertions of executive supremacy by the Bush Administration. It’s the worst of all worlds, really: President Obama has done a good deal to actually reproduce the illiberal, anti-procedural initiatives of his predecessor in security matters and on questions of transparency while hamstringing his own ability to speak to and act meaningfully at larger sociopolitical scales where the nation is most agonizingly at odds with itself.
