Remember when people used blogs mostly just for shout-outs to other bloggers? Ok, they’re often still used for that purpose, but it seems to me that Twitter serves that function far more efficiently. Also, with my own blogorrhea, I’ve always been more likely to drone on about something on my mind than to link to work by others.
But two pieces that I read this week have really resonated with me. The first was Bethany Nowviskie’s “It Starts on Day One”, at the Chronicle of Higher Education‘s ProfHacker column. Nowviskie argues that graduate programs in the humanities should completely wipe out all of their existing methodology courses (she uses the metaphor of a comet hitting the dinosaurs).
I’d agree with her first complaint against such courses, which is that they often teach methods that aren’t really in use any longer, or that are inflected with an unthoughtful ethos of wariness or hostility towards digital infrastructure. I worry a bit more about the second argument she advances, which is that many such courses are “a crash course in academic jargon and en-vogue theories”. I’ve previously voiced my own sympathy for the “more hacking, less yacking” vision of some digital humanists, but it’s important not to kill the small mammals along with the dinosaurs, not to let an insurgent energy overwhelm some of the pedagogical wisdom that’s come out of existing practice. In this case, what that might mean is that we shouldn’t forget that making and problematizing are not binary states. Methods classes that are so entirely about doing or practicing that they never stop to be troubled about the purposes and aspirations of doing very quickly become mechanical and arid. “How” should never become the mortal enemy of “why”, “so what”, or “who says so?”
Nowviskie rightfully says that a graduate curriculum must include consistent, persistent attention to the “uninterrogated policies and procedures that cover and shape the humanities in the modern college and university”. That’s very much my own feeling, and a driving force behind my continued blogging. But it’s crucially important not to turn many of the critical commitments of digital humanists into the one uninterrogated idea in that process. For example, if we are going to teach graduate students in a new methodology course how to work with new platforms and publication forms that reconfigure intellectual property or create open access, we can’t step over the question of whether they should. Whenever you’re dealing with a whether kind of discussion, it’s important not to close all the escape hatches. That’s where methods classes have to come back to theory, to problematizing, and without any stopwatch ticking that says, “Hey, we only have five minutes for gnawing on our own entrails, then we have to get back to learning PHP.” This isn’t just an important pedagogical and ethical obligation: it’s also the currency of the humanities. Methods that are cut-and-dried, just about making, just about doing, just about following the recipe, are by their nature somewhat orthogonal to the spirit of humanistic inquiry.
This leads me to the second piece I really liked this past week, at Ian Bogost’s blog. Now, look, to some extent this essay is just Bogost being Bogost: whether in tweets, blogs, or books, you get the clear sense that he exemplifies the quip about not wanting to be part of any club that would have him as a member. The voice that I’ve built up on this blog over the years is so sedately reasonable that I can’t really write in this space in a more expressive way any longer, as I think I once could, but if I could, I’d probably write very nearly what Bogost says in this entry. Bogost says to humanists that if there’s a crisis in the humanities, they’ve got no one to blame but themselves.
To quote at length, he writes:
“We are insufferable. We do not want change. We do not want centrality. We do not want to speak to nor interact with the world. We mistake the tiny pastures of private ideals with the megalopolis of real lives. We spin from our mouths retrograde dreams of the second coming of the nineteenth century whilst simultaneously dismissing out of our sphincters the far more earnest ambitions of the public at large—religion, economy, family, craft, science.”
Digital culture, he adds, is good for the humanities for the simple reason that “computing has revealed a world full of things: hairdressers, recipes, pornographers, typefaces, Bible studies, scandals, magnetic disks, rugby players, dereferenced pointers, cardboard void fill, pro-lifers, snowstorms”.
Where the evenhanded compulsion of my public voice kicks in, in the wake of his complaint, is simply to say that the things scholarly humanists care about, they care about earnestly, passionately, sincerely, and that much of how they care about what they care about would be easier to appreciate if those passions were better sized to their subjects. Bogost is complaining in part about something that Bruce Robbins observed some time ago about the political posture of many cultural studies scholars: that they simultaneously assume that the stakes of scholarly work are so very high that the least form of error (political, interpretative, or empirical) is devastating in its possible impact, and that scholars and intellectuals are peripheral, unimportant, and marginalized (and must somehow figure out how not to be). The consequence of that dual construction is that the simple pleasures of humanistic writing and teaching get washed out, and so too do the simple possibilities of talking with publics about culture and ideas in a conversation that could satisfy everyone involved.
Scholarly humanists, taken as an abstract whole, are now so anxious about so many things (their prestige, their authority, their exclusivity, the stability of their subject) that they strain the patience of anyone or any group more serene in its sense of place within the university or the culture. And that anxiety often leads to lashing-out in all directions: at enemies both powerful and weak, at baffled witnesses and sympathetic friends, even to purification rituals within the ranks. I don’t think it has to be that way at all. Bogost thinks the answer is a purge. I think the answer is both as difficult and as simple as a more relaxed, humble, and curious approach to being humanists, one that scales down the claims we make and the stakes we impose.
Is it worth mentioning that Ian Bogost’s post was published in January 2010? I’m not sure what this means — that linking back to stuff brings it up to the present? that we don’t notice dates on posts that are linked to today? that the ideas in that post are still fresh and vital? that nothing has changed in the two years since he wrote?
The piece was reprinted in Matt Gold’s edited collection, Debates in the Digital Humanities (just released by U of Minnesota Press), so perhaps that is why it’s re-circulating. But all the responses I’ve seen to the post give me the impression that their authors think it was written just now. Did you notice the date? Does it change your thoughts?
I just happened to follow a digital trail to it this week, I think via Twitter, maybe because of some other provocation that Bogost was engaged in. I didn’t see it back in 2010, and it pleased me on reading it now. In a way, that’s another odd thing about blogging: we’re always stuck with a sense of having to respond right now to what’s said right now, even though it’s an asynchronous mode of publication. It seems to me there should be more going back and forth in and out of the archives of what we write online, particularly when the sentiment or argument has continuing relevance (as this surely does).
Every few months there’s a new surge of readers for that post. It’s interesting. This time round, I’m not sure what it was. It could have been this new post.
Eesh … my intro course was in essence a course sampling the methods of different historiographical schools. My complaint was that it was too present-minded (as indeed are all grad history courses), although, looking back, they didn’t do so bad a job as all that. I’d rather have a good historiography course than an instantly outdated course on “digital humanities.” I generally like the idea that there be more on the craft of professing as a grad student, but then, courses on craft tend to be heavy on filler and never as good as actually crafting. And why on earth learn spreadsheets as a grad student so as to be a good department chair, when you’re not going to be a department chair for a decade at best, and maybe never? Seems a waste of time; better to swot it up at the last minute. Frankly, since most grad students will never be professors anyway, training them on how to be professors might simply add to the cruelty of the current process.
As for marginality: how about “stick to your last”? Much as I hate the displacement of the humanities, in practice a PhD adds nothing to your contributions as a public intellectual/citizen/blabbermouth. Specialized knowledge about Provençal verse, or even American history, doesn’t make your contributions any more acute than the random journalist’s or blogger’s. It’s just the usual half-baked skein of fact and opinion, gussied up with a few villanelles. Who bloody cares if it’s “marginal”? Marginal to what? It’s good work for its own sake, academics are actually capable of doing it well, so why not do it without guilt or regret?
I should say, I realize you’ve been having a long discussion about the digital humanities, which I haven’t really been following closely. I am, from hasty skimming, doubtful that it’s a Holy Grail, but that’s more a skepticism than a rigid hostility. So take the last comment as not particularly hostile, and as coming from someone who’s not really up to speed on the details of the field.