Definitions of “Liberal Arts”: 1

Carl Edgar Blake II, Iowa pig farmer:

“I can build a motorcycle, I can fly a model airplane, I can throw somebody out of a bar, I can wrestle a pig and I can program a computer…”

Posted in Defining "Liberal Arts", Good Quote, Bad Quote | 2 Comments

The Moral of Pierre

I heard just a small bit of a story on NPR this morning about “crunch time” in family life, where working parents feel the pressure of getting their kids fed with a decent meal, finished with homework, and to sleep at a reasonable hour, and how exercise and play both tend to fall out of the picture many days.

The segment I heard featured a woman who talked about how she cried when she saw the NPR solicitation for the story on Facebook and another mother who talked about how she didn’t think this was what family life was all about. And then the experts came on and said, “Everybody knows what they’re supposed to do” (in terms of making sure kids get enough exercise and eat well to avoid obesity) and concluded that what we really need to do is figure out why so few people do what they know they’re supposed to do.

This is a fairly established line of expert reasoning in national discourse about issues that have been coded or marked as “public health” crises. Using a fairly narrow range of methodologies drawn from social science, particularly economics and social psychology, the experts verify first whether existing forms of public education have been sufficient to establish baseline awareness of a public health problem that turns on behavior. Sometimes they read the evidence and conclude that the education needs to be in a different form or in a different location, or that more money needs to be spent on it. Usually, if that recommendation is taken, the additional education involves experts from the recommender’s own community of peers.

Sometimes (as in this case) the experts conclude that there is sufficient awareness, just not sufficient compliance. People aren’t doing what they’re supposed to be doing with the near-ubiquity that they ought to do it: not wearing helmets or seat belts, not quitting smoking, not taking a recommended pharmaceutical, not getting enough exercise, not minimizing their consumption of some kind of mass media, not following dietary recommendations, and so on.

Rarely if ever does the community of experts pause at this moment to inventory their own histories of error and exaggeration, or ask what the nature of their relationship is to the publics they advise and the resources they demand for the advising and studying of those publics. That alone might provide something of a testable hypothesis: that sometimes publics stall and defer on doing the things they ought to do because at least some of them are old enough to remember other things that they were told they ought to do that later on turned out to be not so important, or actively the wrong thing to do. Or that some of the advice turns out to be improvident or unrealistic in unnoticed or unacknowledged ways. Or that the experts are being impatient: on some issues, it turns out that people will change, if you just quietly keep working on the problem and don’t insist on changing your focus and approach every three seconds.

But the urgent rhetoric of many public health campaigns is a clue to a deeper problem. The rhetoric almost always calls back to a form of technocratic common sense. Ask an expert or government official: why should we wear motorcycle helmets? Lose weight? Smoke less? Use gun safes? Wear sunscreen? The answer, often, is “so you won’t die earlier than you should, be injured far more severely than you would be otherwise, or injure others more than you might.”

All of which is often demonstrably true about the issues that have risen to the level of a general national concern or discourse. Though often the campaign to reduce death and injury stops short of an undiscussed political threshold where the competing good of individual freedom has a mostly unmeasured and unacknowledged weight in the conversation. You could reduce death and injury even more, for example, if you outlawed motorcycles and similar vehicles altogether.

The fact that freedom or autonomy sneaks into the discussion via the backdoor is a clue to the real weakness of a lot of well-meaning public advice. There is something still more important missing from the discussions started in this fashion. Namely, why should we care? Why do the experts care? What does it matter if some larger number of people die earlier than they would have on average, or are injured more often and in a worse manner?

The fallback answer embedded in most public policy is, “Because it costs us more money”, either directly in terms of treatment and other costs or indirectly in terms of lost or reduced productivity. That assertion, of course, sometimes kicks off the kind of further research that justifies the reputation of economics as the dismal science. In the terms of standard economics, it’s never self-evident that more injury or earlier death actually does cost more money. If I have to spend X amount of time washing my hands, exercising, cleaning all surfaces in my house and so on, and Y amount of money on vitamins and flu shots and Purell, maybe it turns out that the Z cost to my productivity of three days of flu is lower and I’m perfectly rational for taking no special measures to avoid flu. (Throw in for good measure that maybe being too aggressive at avoiding flu makes my immune system more vulnerable because it’s untrained.) (Also throw in for good measure the financial losses of a medical profession that has fewer flu patients to treat.)
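To make that back-of-the-envelope comparison concrete, here is a minimal sketch of the X/Y/Z arithmetic with purely invented numbers (none of the costs or probabilities below come from any study; they are placeholders for illustration only):

```python
# Toy cost comparison for flu precautions. Every figure here is a made-up
# placeholder for the X, Y, and Z in the paragraph above, not real data.

hours_of_precautions_per_year = 60      # X: time spent handwashing, wiping surfaces, etc.
value_of_an_hour = 30.0                 # what an hour of my time is worth, in dollars
supplies_cost = 80.0                    # Y: vitamins, flu shots, Purell

cost_of_prevention = hours_of_precautions_per_year * value_of_an_hour + supplies_cost

days_lost_to_flu = 3                    # Z: the three days of flu I might catch anyway
value_of_a_sick_day = 240.0             # lost productivity per sick day, in dollars
chance_of_flu_without_precautions = 0.4 # hypothetical probability of catching it

expected_cost_of_doing_nothing = (days_lost_to_flu * value_of_a_sick_day
                                  * chance_of_flu_without_precautions)

print(f"Cost of prevention:              ${cost_of_prevention:.2f}")
print(f"Expected cost of no precautions: ${expected_cost_of_doing_nothing:.2f}")
# With these invented numbers, taking no special measures is the "rational" choice.
```

The only point of the sketch is that the “rational” answer flips entirely depending on which numbers you plug in, which is part of why the cost argument is such a shaky foundation.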

Or maybe the math works out and yes, I should try to live a little longer and be injured somewhat less…in order to avoid costing society some slightly higher amount for my care or some fraction of lost productivity. And here we have arrived deep in the belly of the neoliberal whale, just in time to watch the experts and technocrats hand out machetes to us, the swallowed. If you want an explanation of the meanness of 21st Century American public discourse, and of the fractures in the body politic, this will do as a starting place. “Get that guy to wear his helmet, because otherwise he’s going to cost you money.” “Get that woman to lose weight, because otherwise she’s going to cost you money.” “Hassle that couple because their kid plays too many video games and might slightly underperform in school and not make the contribution to net productivity that we are expecting of him.”

We are offered a thousand reasons to complain of other people’s behavior (and to excoriate and loathe our own) on the grounds that it will cost us too much. That we should talk about what is good and bad, right and wrong, mostly in terms of the selfish consequences, or at best, in terms of the kind of closeted idea of a collective interest that neoliberalism dare not directly speak of–sort of the nation, sort of the economy, sort of the community, but really none of those directly or clearly.

What the experts generally rarely say is, “Because we care for one another, want the best possible lives for one another, and would not be deprived of each other’s company one moment sooner than we must”. Why does your mom tell you to wear a helmet and stop smoking and lose some weight? Ok, sometimes because of the ordinary psychodrama of family life and its little struggles for power, but sometimes, often times, simply because your mom or your dad or your kid or your friend loves you. Because they value you.

This humane sensibility drops from public policy and technocratic expertise because, for one, we’ve become profoundly unpracticed in its use.

For another reason, because it’s harder to just keep hammering at some change in an inflexible and unreflective way. When I was in seventh grade, I once screwed up my courage to tell my intelligent, sensitive, very queer, 50-something chainsmoking English teacher that he should stop smoking. He winced, teared up a bit, thanked me for caring, and said, “But darling boy, I think it would hurt me worse at my age to try and stop”. Which at seventh grade I was not prepared to understand, but now I can. When we care about others, we also know that there are reasons why they ride motorcycles without helmets or serve chicken nuggets three times a week, reasons that are profoundly built into their specific humanity or are at the least not really worth the harm and cost of the persistent harassment that might push a change in habit.

Which is another reason the technocrat avoids this mode of argument. Because to see people in this way is to be seen. If it’s about the empirical evidence and the abstract costs of acting or not acting, the expert can stay invisible and outside. But when we sit down to persuade through love or affection, we are naked and vulnerable ourselves. Our bodies and habits are as seen as those we are looking upon. The worst of all worlds is the person who borrows the grandiose certainty and intensity of public health and imports its rhetoric into more intimate kinds of observing and commenting upon others. There is no surer recipe for a flame war between “mommy blogs”, for example, than one blog attacking another’s vision of parenting in this kind of olympian voice, where the critic’s own family life is off the table and beyond the gaze.

But ultimately, if I had to put money on which kind of discursive approach most powerfully gets at the issues and changes that matter, I’d put my money on caring for one another. “Stop costing me money” in a society that also protects the autonomy of individual choice is a perverse and counterproductive angle of approach: it makes me want to do more of whatever that is up until I’m not allowed to any longer. It is, ultimately, the voice of the Boss, and at least for now, we can still say, most of the time, that the experts and the government and the human resource specialists and the doctors are not the Boss of Me. Small wonder that many policy wonks and technocratic experts flirt so relentlessly with prohibition and restriction as the big stick behind the soft talk.

Maybe the greatest reason that a neoliberal society doesn’t choose the route of caring and cherishing is the further obligations we might incur down that road. We might have to become far more subtle and careful about our entitled and dismissive readings of the ethical content of everyday life–and we might eventually have to do more than ask people not to do something.

Posted in Politics | 11 Comments

Late Afternoon of the (Academic) Elites

I like Michael Bérubé’s essay about the “crisis in the humanities” at the Chronicle of Higher Education but I’ve written quite a lot about the main issues in the essay lately and I want to give it a bit of a rest.

I did find myself drawn to an exchange way down in the comments between Bérubé and the pseudonymous commenter I_have_been_Catholic. The commenter acts like a tendentious jerk in many ways and is, as Bérubé points out, a bit creepy too when he/she says “we know each other”. But I think Bérubé sticks with the exchange as long as he does because the anger and dissatisfaction deserve some kind of answer and there is in some way no answer that can be given.

Christopher Hayes’ Twilight of the Elites is a good, short, smart analysis of the accelerating feedback loops that are undercutting virtually every profession, institution, and practice in the United States where some form of meritocratic selection is an important part of its daily operation and its forms of self-justification. As Hayes observes, the ideology of meritocracy is increasingly just a cover story for the self-reproduction of a shrinking elite, and in his view, this is the inevitable structural outcome of meritocratic ideals: their processes are always corroded and corrupted over time. Hayes knows that it is hard, perhaps impossible, to do without the proposition that talent, ability or skill is unevenly distributed and should be unevenly rewarded, but he argues that for now, what’s needed is an egalitarian push that breaks open current systems of meritocratic selection, renews, reorders and flattens the distribution of wealth and privilege, and so on. It’s a cycle, as he sees it, and the life of a democracy will always involve this kind of cycle of renewal, corruption, radical reform and renewal again.

So let’s take the one small corner of American life that’s at stake in the exchange between Bérubé and his commenter: the training, hiring, and continuing employment of faculty in higher education.

The broad contours of the situation at present would probably be agreed to by almost everyone:

1) There are many universities in the United States, and thus a significant number of jobs for people whose primary work is teaching courses, and a smaller but still significant number of jobs for people whose primary work is conducting research at a university.

2) Teaching in higher education should require some kind of advanced training or study in the field(s) to be taught, as should research.

3) The qualification most commonly used is the Ph.D; in other fields, it might be some other graduate degree. In a few cases, it might be experience in some other professional field with or without relevant graduate training. Few critics suggest dispensing entirely with some kind of graduate training as the primary method for qualifying someone to teach or research in higher education. (I might, actually, but that’s a blog for another day.)

4) The terms of employment in higher education in the United States are increasingly bad, in some ways consistent with the general ways that professional and managerial work have become less rewarding and stable, in other ways specific to higher education.

5) There has been a steady reduction in the number of highly desirable teaching and research positions where the salaries, benefits, and terms of work are very good, most specifically including continuous tenure, but some of these highly desirable positions remain. At the same time, there have been, until the last few years, steadily increasing numbers of trained candidates seeking these positions across all fields, and in some fields, these increases have continued.

6) Some fields of specialized training have few, if any, other options for employment. Other fields offer a wider range of alternatives to academic work as plausible or even common outcomes.

Ok? Now we come to the point where opinions and feelings diverge with increasing intensity as we take up the question, “So what to do about it, what’s fair and unfair?”

Very very many job-seekers
Many jobs
Very few good jobs

Bérubé’s critic asserts that in this situation, the people who hold the very good jobs primarily look out for each other, and privilege the students trained by the institutions that have the very good jobs. He suggests that if all marks distinguishing where candidates were trained, who they were trained by, and so on were stripped from the files, and the selection of candidates were done by some kind of national, disinterested group, the outcomes of hiring in academia would be far more legitimately meritocratic.

Bérubé replies that this is impossible, both because it would be nightmarishly complex and bureaucratic and (as I read his increasingly and, I think, legitimately annoyed comments) because you cannot possibly strip the identifying characteristics of candidates for academic jobs; the commenter is revealing that he doesn’t really know how searches actually work. He points out that the writing sample is a crucial part of the evaluation of academic candidates. One of the things I think he’s pointing out is that when you are assessing the question, “Will this candidate be a good teacher in this field? A good researcher in this field?” and you’re answering that question partly by reading their scholarship and listening to them present their work or teach a sample class, you will know right away some things about the candidate’s training and pedigree because you know something about the field and the discipline. If I were hiring a historian of modern Africa and could know nothing in advance about where the person was trained and who they were trained by, but I could see their scholarly work and hear them present, I’d almost certainly have a good guess about the information that had been concealed from view. If I didn’t, that would be a sign that I don’t know my own field and, indeed, that I’m not a very skilled historian overall, since this is precisely the kind of reading out of information from documents that I’m trained to do.

If the process were made objective by taking it away from me entirely, to some national star chamber, it would have to be taken away from everyone in my field and maybe my discipline. Which would leave us in the peculiar situation of having a process of hiring that is so fearful of selection bias that it asks everyone but the people who know and practice a field to decide who is the most meritorious candidate in that field.

But this is one common response to watching meritocracies break down, become insular and elitist: to try and make them objective and dispassionate, to take rhetoric and human judgment out of the loop of their operations. What this response believes is that there is true merit and that it is accurately discernible to very fine degrees. You can’t get away from the sense that Bérubé’s interlocutor thinks he/she and others are more deserving of meritocratic selection than others who have been selected and that the only explanation for the actual results is corruption and bias. This kind of critic believes in meritocratic distinction and hierarchy, that there are a very few who are much better at a task than anyone else, and a pyramidal distribution down from that, just that the pyramid needs re-sorting using a better mechanism of selection.

The other way to read the situation at present is that the number of people who could teach and research at approximately the same level of ability and quality is very, very large and that the selection processes are unfair largely because they can only match a small number of equally qualified people to the jobs they deserve. That observation leads to a different kind of solution: you could either argue, “We need more good jobs, then” or you could argue, “Let’s dispense with all the meritocratic flummery and gibberish then and just admit this is a tournament system where the people with the best jobs mostly got them through dumb luck, like winning a lottery”. (You can argue both at once, if you like.) You could even go one step further, as one of my colleagues has suggested about admissions to Swarthmore, and say, “Look, let’s just randomly select people based on some baseline set of objective qualifying metrics, let’s stop screwing around pretending that we are making fine-grained meritocratic distinctions that have any validity.” This is ultimately the position that says, “Everybody (including people who haven’t gotten the good jobs) needs to stop talking the language of merit and stop complaining that there are people with merit who haven’t been rewarded for it and needs to start talking more about something like equality instead.”
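For what it’s worth, the qualify-then-lottery idea my colleague floated is simple enough to sketch in a few lines; everything below (the thresholds, the seat count, the applicant data) is invented purely for illustration and reflects no actual admissions process:

```python
import random

# Hypothetical applicants as (name, GPA, test score). All data is invented.
applicants = [
    ("A", 3.9, 1450), ("B", 3.4, 1500), ("C", 3.8, 1300),
    ("D", 3.6, 1420), ("E", 3.95, 1510), ("F", 3.2, 1480),
]

# Baseline qualifying metrics: everyone above these lines is treated as equally admissible.
MIN_GPA, MIN_SCORE = 3.5, 1400
SEATS = 2

qualified = [a for a in applicants if a[1] >= MIN_GPA and a[2] >= MIN_SCORE]

# No fine-grained ranking of the qualified pool: just draw lots for the available seats.
admitted = random.sample(qualified, k=min(SEATS, len(qualified)))
print(admitted)
```

The whole point is what the sketch leaves out: above the baseline, no further distinctions get made at all.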

——-

I end up wanting to borrow elements of all of these responses. Thinking that you’re the winner of a lottery is an uncomfortable way to think because it saddles you with an inescapable survivor’s guilt. Why me? Why not that guy or that guy? On the other hand, very few of us would give up a winning Powerball ticket even if we felt bad for all the people who didn’t win Powerball–we might just say, “But do good things with the money, try to live a deserving life”. If you’ve witnessed a lot of job searches, handed out grants or been involved in any selection process, you know that decisions often come down to very nearly random or minuscule distinctions between people who are very much equal in all the major ways that count, and that some of those distinctions can turn on the quirks of the individuals participating, on the structure of the selection process, on unconscious systematic biases, on disciplinary cultures, and so on, as Michèle Lamont’s How Professors Think documents rather well. (Bérubé’s critic cites the book, as well he/she might.)

But Lamont’s book also points out how hard people involved in these selections often work to try and guard against some of those inclinations and how carefully people work to try and link “merit” to some set of ideals and requirements that aren’t just about entrenched privilege or the self-protection of an elite. If you’ve been involved in giving grants, hiring faculty, admitting students and so on, you’ve probably seen that there is sometimes striking consensus across a very broad range of temperaments, training, and relative access to privilege about who is at the top and bottom of a selection pool. And like Bérubé, I would say that that top and bottom are often about the substantive individual qualities and abilities of the candidates, not about their pedigree. When institutions do help their candidates, it’s often because, well, they did something better in teaching them. I helped judge an important fellowship for doctoral candidates for a number of years (it’s one of the competitions Lamont studied) and there was one research university whose candidates were always very strong, so much so that we typically had to cap the numbers of them who could get the award to be sure to spread the opportunity around to more institutions. That wasn’t because we were all wearing old school ties and shaking the secret handshake, it was because this institution had made a specific commitment of resources and time to teach its graduate students how to write applications for this grant and provided them with a lot of quality feedback on drafts. That’s privilege in some sense, but it’s also outcomes–what teaching is supposed to be about. If you’d blinded us to the institutional names, we’d have still picked these folks out because they had people who put time into making their applications “objectively” better, and other institutions didn’t even when they could have. A job candidate whose advisor takes time to read their dossier, critique their job talk, and provide advice about the places they’re applying to has an advantage that would show up whether or not you knew where the candidate was from or who the advisor was.

Merit is an ideology and I’d agree with Hayes that it is structurally accelerating towards a point of collapse and illegitimacy in American society. But honestly, yeah, there are people who do the job better and people who do it worse. Sometimes that’s well-predicted by how people present as candidates for the job or grant and sometimes it’s not. Sometimes people do a job really well for a while and then not so well for a while and then really well again, and maybe that’s something that it would be nice if the entire society said, “Ok, that’s fine, that’s human, let’s all chill the fuck out and stop judging all the time or as strongly as we have been.” Sometimes I’m confident saying, “That person or applicant or grant just is absolutely weaker than that other person or applicant or grant” and sometimes I can’t see much of a distinction and hate being forced to invent one. Much of the time, I don’t like people who are too invested in their own merit–whether they’re people who see themselves as having been appropriately rewarded for their talents or people who are angry and resentful about having been excluded from the rewards they believe they are due. But I’d also acknowledge that arguing for humility and generosity all around is a lot easier when you have one of those few good jobs and your middle-age angst and doubt doesn’t include wondering how to pay for health care or make your mortgage payment.

The best I can do is grope towards trying to do it better than we do. I don’t think there’s a magic alternative system that makes everything fair and just and accurate. I don’t even think that’s really the humane society that most of us would rather live in–I think most of us would just rather that there were lots of pretty good jobs for all the pretty good people and that we stopped spending as much time as we do sorting out the best of the best and imagining that we are matching them to their entitled rewards or raging when they are not.

Posted in Academia | 12 Comments

Particularism as a Big Idea

One of the interesting points about Jared Diamond’s books that has come up recently at Savage Minds is that cultural anthropologists don’t write “big books” much any longer, that the disciplinary vision of cultural and social anthropology is now so anti-universalist, anti-teleological, so devoted to the particular character of specific places and times, that a sweeping analysis of large-scale themes or generalized theory seems out of bounds. (David Graeber’s Debt was mentioned as an exception.) Cultural history exhibits something of the same tendency towards the microhistorical and particular, as does a good deal of humanistic scholarship in general.

This alone seems enough to inflame one set of critics who seem to regard it as both heretical and superficial to refuse to pursue generalized, sweeping conclusions and universally valid principles that arise out of empirical data. So this, in fact, seems to me the “big book” that we need an anthropologist or historian to write, aimed at the same audiences that read Diamond, Pinker, E.O. Wilson, Haidt and other sociobiologists, evolutionary psychologists, neurobiologists and “big history” writers who offer strong universalizing or generalizing accounts of all cultures and societies across space and time. What we need is someone who can write a big book about why the most interesting things to say about human cultures are particular, local and contingent.

That book would have to avoid falling into the trap of being the straw man that Pinker in particular loves to hit over the head. It would need to start by saying that of course there are transhistorical, universal truths about human biology and minds and the physical constraints of environment and evolution. “Nature” matters, it’s real, it’s important. And equally of course, there are institutions which have persistent force across time and space either because human beings carry those institutions with them and reproduce them in new settings, or because there really are functional, practical problems which arise repeatedly in human societies.

A preference for local, situated, grounded studies does not require a blanket rejection of the biological, material or functional dimensions of human history and experience. What I think the “big book” could say is two major things:

1) that many forms of generalizing social science make far stronger claims than they are factually and empirically entitled to make, and that this problem gets much worse when the generalization is meant to describe not just all existing societies but all of human history.

2) that much generalizing or universalizing social science uses a description of foundational or initial conditions of social and cultural life as if it were also a description of particular, detailed experience and thereby misses both what is interesting and important about the detailed variations between different places and times–which includes the fact that there should be details in the first place. Essentially, that strongly generalized accounts of all human history are making a big deal out of the most obvious and least interesting aspects of human existence.

The first point is simpler, but should command far more respect among scholars and intellectuals who describe themselves as scientists and empiricists than it seems to. I’m going to focus on it for the remainder of this essay and take up the second point another day.

Let me use the example of “stages” of world history, which comes up prominently in Diamond’s new book, primarily as an assertion that there are “traditional” societies that reflect an original or early stage of human history and “modern societies”, with everything presumably arranged neatly in between them. (Diamond is not much interested in his new book in the in-between, and actually has never really been interested in it–Guns, Germs and Steel more or less argues that the early migration and development of human societies across the planet has determined all later histories in a directly symmetrical fashion.)

Most contemporary anthropologists and historians react negatively when they come across an account that tries to arrange human societies along a single spectrum of evolutionary change. To some extent, that reaction is conditioned by the historical use of such characterizations to justify Western racism and colonialism. But even accounts of evolutionary stages of human history that scrupulously avoid those associations are factually troubled.

What’s the issue? Let’s take a point that crops up in Diamond, in Napoleon Chagnon’s work and in a number of other sociobiological and evolutionary-psychology accounts of human variation.

If someone says, “Many human societies practice some form of warfare” or “organized violence is common in most human societies”, that’s fine. The anthropologist or historian who pushes back on that simple generalization is just being a tendentious jerk. Sure, it raises the question of what “warfare” is, but the generalization is so gentle that there’s plenty of space to work out what “many” and “warfare” mean.

Step up a notch: “All human societies practice some form of warfare”. This kind of generalization is easy to break, and it is frustrating when someone making a generalization of this kind digs in their heels to defend it. It’s really only defensible as an icebreaker in a conversation about the phenomenon in question. It can only hold as an airtight assertion if “warfare” is defined so generally that it includes everything from World War II to a football game.

Refine it a step using an evolutionary schema: “All human societies once practiced some form of warfare, but warfare grew into a more rarified, restricted and directed phenomenon as states grew in scale and organizational sophistication.” This sounds like it’s being more careful than the “all human societies practice” generalization but in fact it is even easier to break, because it rests on a linear account of the history of the state (and then a linear account of warfare’s relationship to that history). This is simply not true: human political institutions across time and space have all sorts of variations and really haven’t moved progressively towards a single form or norm until the exceptionally recent past. Even now there are some striking variations at a global scale–and it’s equally clear now that Fukuyama’s End of History assertion that liberal democracy is the final stage of human political evolution is just plain wrong. Beyond the present moment lies the unknown as far as political structures and practices go.

You can break the general assertion not just by citing endless examples of political structures that don’t fit neatly between “traditional” and “modern” societies or endless examples of “warfare” with non-linear relationships to changing political structure over time. You can also break it at the end that Diamond and Chagnon focus on, in the assertion that “traditional societies” in recent history are unchanged survivals, a window into the distant past. There’s increasing evidence, for example, that there have been a succession of large-scale polities in the Amazonian rainforest and the eastern Andes over a very long period of time that simply happened to be absent or weak at the time that Europeans first pushed into these areas. Assuming that small-scale societies of various kinds in the same region where such a history unfolded were unchanging, pristine and unrelated to other societies is at the very least unsupported by any direct evidence. More to the point, such an assumption actively overlooks evidence in many cases in the modern world that “pristine” societies of this type live where they live because they were trying to get away from larger or more centralized polities, that there is a dynamic relationship between them. Which surely includes ideas and practices of violence and warfare.

This is where the use of evolution as the organizing idea of such accounts is so aggravating. Not because it’s “scientific” but because it’s not. Evolutionary biologists know better than to describe speciation as progress towards an end or a goal, to assume that natural selection must always produce more complex or sophisticated organisms over time, or to suppose that evolutionary processes should ever be represented by a single line of descent. Go ahead, show an evolutionary biologist a single line that goes from Devonian tetrapods to Homo sapiens with every ‘transitional’ animal in between neatly marked as one more interval on the way to us and get ready for a big eyeroll and an exasperated sigh.

Sure, there’s a successive relationship over time between forms of political organization in human history, but if you were going to chart it, you’d have something that looked hugely bushy, with all sorts of groupings, thousands of radial and convergent movements at all scales of time. And if you tried to place “warfare” in relationship to that complexity it would get even messier and more intricate.

Anything that arranges human history as a matter of “stages” progressing neatly towards the modern is just factually wrong before we ever get to the troubled instrumental and ideological history of such schema. Yes, that includes most versions of dialectical materialism: the dogged attempts of some Marxist historians and anthropologists in the 1970s and 1980s to get everything before 1500 into some kind of clear dialectical schema long since crashed into either an assertion that there’s only been one general world-systemic polity ever in human history (the “5,000 year-old world system”) or that lots of variant premodern histories collapsed into a single capitalist world-system after 1500.

When scholars who see politics or culture or warfare or many other phenomena in granular and variable terms rise to object to strong generalizing or universalizing accounts, their first motive is an empirical one: it just isn’t like that. Human political structures didn’t ALL go from “simple tribes” to “early states” to “feudalism” to “absolutist centralization” to “nation-states” to “modern global society”. They didn’t even go that way in Western Europe, really. Certain kinds of structures or practices appeared early in human history, sure, and then recurred because they radiated out from some originating site of practice or because of parallel genesis in relationship to common material and sociobiological dimensions of human life. Other common structures and practices appeared later, sometimes because new technological or economic practices allow for new scales or forms of political life and structure. But there is a huge amount of variation that is poorly described by a linear relation. There are movements between large and small, hierarchical and flat, organized and anarchic, imperial and national, etc., which are not linear at all but cyclical or amorphous.

That’s the “big idea” that people with their eye on variation and particularism could try to sell more aggressively: that the stronger your generalizations and universalisms about human culture and societies are, the more likely they are to be just plain wrong, factually and empirically wrong, and that the only way to dodge that wrongness while still sustaining those generalizations is to cherry-pick your examples and characterize anyone who calls you on it as a pedant or ideologue.

Posted in Academia, Books, Generalist's Work, Oh Not Again He's Going to Tell Us It's a Complex System | 10 Comments

Getting to Wrong

About a month ago, I started writing an entry about Gawker Media as a model for the “new journalism”. When I started, I mostly meant it as a compliment. I was thinking about Deadspin’s Manti Te’o exposé (by a different Timothy Burke), about Gawker’s long-running Unemployment Stories series and about some of the longer-form essays that Gawker runs as well, about io9’s Daily Explainer and other bits of reportage they do.

My complimentary view was that at its best, this “new journalism” combines commentary, reportage and lucid interpretation in a way that’s largely unavailable in the self-important culture of what’s left of print journalism. That the best “new journalists” found across a range of sites and blogs are beginning to really define a new expressive and ethical set of norms that don’t just keep journalism alive but overcome some of the ponderous establishment status quo of late 20th Century print journalism. That some of the best of the new wave of work is in every sense better than the best of print and television journalism: more readable, more visceral, more diverse, and covering a vastly wider range of places, people and experiences.

But the last few days have reminded me of what the weaknesses of the new digital journalism are. Namely, that you can get a story really wrong and not ever feel any need to apologize for it. In fact, you can just go ahead and keep at it. Hamilton Nolan–whose Unemployment Stories are a really important document of America’s new economic realities–can also just recirculate other people’s news with just enough twisting and stripping off of context to make the story misleading or just plain wrong. And then never say anything when called on it, just hide behind the interface. (Which is sort of what I thought Nolan’s problem with Bill Keller was.) For example, suggesting that an extracurricular program at Duke University is a full department or major taken for credit. This is like finding out that there’s a group of students who meet one night every month in a campus cafe to listen to each other’s poetry and then ranting that they’re all paying tuition to get a poetry degree. A day later, Nolan gets one small aspect of an already slanted Wall Street Journal article about the new White House data on higher education right, but misses everything else in the WSJ article and in the larger story of the new White House initiative.

I get it, Gawker’s a “gossip site”. This is the usual defense of the shortcomings of digital journalism: that it’s not meant to be serious, that it has no ambitions, that it’s just ephemeral, that you have to privilege getting people’s attention more than getting it right. That all you need to do is confirm existing stereotypes and give your readers information that comforts their prejudices. Basically, Fox News only with a more generous view of anal sex and a less positive view of gun ownership. But in that case, you wonder why Gawker Media, or Slate, or Salon, or the Atlantic bloggers or anyone else out there bothers as much as they do with the writing that really does strive to be high-value reportage or original commentary. Or for that matter why any of these digital writers have the gall to complain when the mainstream media gets its facts wrong or grinds its axes.

It’s important not to turn this complaint into nostalgia for print journalism, either. Just passing along a tidbit of information created by someone else, sometimes with a bit of spurious twisting and recrafting to make it sound original while also grinding some axe, is a very well established part of traditional media practice. Most entertainment journalism for the last fifty years has consisted of the barely-artful repackaging of press releases from agents and studios. Much political journalism in the same time period has been built around reporters acting as the mouthpiece of a “confidential source”, generally repeating verbatim whatever ‘news’ they want to see on the front page. Digital journalism is different because of the volume of its repackaged information and because of the shitty wages it pays to the people who write it, not because in ye olden days there were Great Ethical Men who walked the earth and today it’s just a bunch of scumbags.

But the thing is: the pacing and the interface and the technology actually allow for correction, for updates, for getting the story right in a way that was never possible. The New York Times made a fetish of its corrections in part because its production cycle made them all a big deal, and in part because you couldn’t actually change the original item. Today? If you get it wrong (or miss something interesting) the first time and your commenters call you on it, it is an easy thing to say as much. Of course, do that enough, and people might begin to wonder: why not get it right the first time?

And once you ask that, well, yes indeed: why not?

Posted in Blogging, Digital Humanities, Information Technology and Information Literacy | 1 Comment

Don’t Compromise. Improvise.

Non-profit isn’t a status, it’s an ideal. The chief problem with “corporatization” in academia is not a greater emphasis on financial matters or the increased influence of companies on research. A non-profit organization needs to think just as much, and just as creatively, about financial sustainability as a corporation does. The influence of private corporations on academic research, and the corruption that follows, is a big issue, but it’s a different issue than “corporatization”.

The real problem is that the increasingly constricted imagination of late 20th/early 21st Century corporate management (which has not been entirely healthy even for companies) has all the wrong answers for the financial difficulties of a non-profit organization. Particularly higher education. As Aaron Bady notes, this is the heart of the problem with Emory President James Wagner’s citation of the “3/5 compromise” in the early American republic as a guidepost for consensus in academia. The ham-fisted stupidity of holding up one of the most painful episodes in American history as a model is inflammatory enough, but if you can leave aside the racial provocation for a moment, the analogy is a warning about where a corporate way of imagining organizational life leads at this moment in business history. Namely, that when an organization is financially challenged by changes in its economic environment, the first and last answer is to get rid of employees or reduce their cost by shaving compensation and benefits.

Higher education is labor intensive, and making it sustainable in the long run has involved and will continue to involve thinking intelligently about how to deal with the costs of labor. That’s just as important for a non-profit or public institution. The question is how you go about it. When you read back over the last year to see why Wagner is talking about “compromise” at Emory, you see that he has been going about it all wrong. He is doing it the way that American companies have done it for the last twenty years, which is the same way that a lion goes about figuring out which gazelle to chase: find the most vulnerable members of the herd. The injured, the elderly, the young, the isolated. Then cut whole divisions or workgroups once they’ve been identified by their inability to protect themselves in internal politics and explain the cuts as automagically generating profit. (No company gets a boost in the stock price for reinvesting its profits in R&D or in developing its workforce.)

As Bady points out, Wagner is not alone in this respect, whether we’re talking about Mark Yudof’s clumsy mimicking of corporate language about revenue and profit in assessing the humanities or countless research university administrations pushing the adjunctification of most of their teaching and then pretending to be surprised by the disaffected and scattered students produced by that shift. American business leadership now sees growth only in contraction, and that’s the managerial style they are passing on to every public, civic or non-profit organization they touch upon, even the family.

The idea of a non-profit is inextricably linked to the idea of the public and to a commitment to a vision. The manager of a non-profit isn’t maximizing the returns to owners. He or she is the guardian of a trust, the custodian of a mission. If the trust is threatened because it has too many obligations, is doing too many things, is spending beyond its means, the route to sustainability can’t be travelled just by finding out who brings in the least money and tossing them overboard. The entire enterprise might depend upon its least remunerative components, rest on something more ineffable than a balance sheet.

——————-

Now for their part, faculty across American higher education have not necessarily done the best job at providing a systematic alternative way to think about financial sustainability. In at least some cases, the response to budgetary problems has been either to reject the existence of the problem because some of the people bearing news of it are seen as untrustworthy and manipulative, or to argue that the answer is a restoration of the social and political contracts of yesteryear, principally to return to greater public funding (both direct and indirect) of higher education. Tenure-track faculty have in many cases been at the least passive accomplices of adjunctification and at the worst have very nearly demanded it as a way to shift the labor burden of teaching. Equally, many tenure-track faculty have blandly looked on as tuition costs have increased while income inequality has grown, and at rich institutions, they grew accustomed to the expansion permitted by endowment incomes in the 1990s without questioning too much the preconditions of this growth. (Students at such institutions have been even less interested in long-term institutional sustainability, generally preferring to advance their own momentary causes and interests and blithely dismiss the consequences–but that at least is understandable for a variety of reasons.) When challenged, many tenured faculty tend to reflexively echo Benjamin Ginsberg and blame administrative growth and overreach for financial (and other) weakness in the contemporary academy.

However valid it is to raise growth on the administrative side as an issue, the fact is that faculty really need to tend to some issues that fall squarely in their own domains. The first of which is simply this: if being a nonprofit is a trust, a philosophy, an ideal, then we have to be able to imagine how we carry on with our mission without the assumption of growth and expansion. The assumption that you manage sustainability by slashing at “unprofitable” divisions or people is the dominant managerial logic of the last twenty years in American business, but the logic that endless growth is the only possible future of all institutions, the only possible language of progress is a deeper affliction. If faculty want to be custodians of the deeper mission of academia, we have to be able to imagine change and dynamism without yoking that vision to growth. And there is only one way to do that: by substitution. What we do now we have to imagine we will not be doing tomorrow: our disciplines, our methods, our interests, our practices, our publications all have to be, in the deepest sense, provisional. Not just in the relentlessly whiggish vision of some sciences, not always better or more true. The default assumption now as each new generation comes into academia is that we change only when there is a strong argument for change. It should be the opposite: every department and discipline and subject should assume its own transformation whenever there is a moment of natural transition, everything should be temporary until proven otherwise. Every commitment we make should be tenuous and fragile not because there is a profit-hungry manager looking to claw back some money but because that is the only way to keep what we do as scholars and teachers lively and responsive to the world around us without the assumption of endless growth. Assuming growth is how we end up with the kind of financial obligations that really do create crises–and thus opportunities for managerialism.

When we do end up on the wrong side of financial sustainability, we also have to imagine ways out of that situation that are philosophically and ethically satisfying but not unreal or abstract. If the faculty answer to limited resources is platitudes that soothe our sensibilities rather than corporate ones, or, worse yet, a less efficient and more insidiously brutal search for the weakest members of the herd, we shouldn’t be surprised when more direct or fashionable alternatives seize the reins. The faculty answer to crises of sustainability should never be to find the least fashionable or most vulnerable division or discipline, but we still have to have a way of imagining not just how to change without growing but, in some cases, a principled way to do less or have less.

A good test of whether faculty at many institutions can come up with a better way to imagine change–and maybe contraction–will come out of shifts in enrollment patterns, particularly in the drift of students at many institutions away from the humanities. The wrong answer is to let enrollment drive a perfectly symmetrical change in resource distribution, to treat student interest as our “internal market” and our disciplines as products. The equally wrong answer is to close those markets off with requirements, to let disciplines and divisions engage in entrepreneurial schemes that route traffic, capture bodies, ratify existing commitments through cartels. That’s really what James Wagner is offering: that the people who are presently occupying the seats reach a “3/5 compromise” with administrators at the expense of students, at the expense of programs shoved to the margins, and worst of all, at the expense of a dynamic intellectual future. In many cases, I think the answer to the problem of shifting enrollments is as conceptually simple (if work-intensive) as re-envisioning what we teach and study, in travelling more than halfway to the lived world of our students and our society, and learning to let go of the course, the content, the curriculum that we’ve tricked ourselves into seeing as necessary and permanent. It’s not just indispensable men who lie in the grave. Look at a curriculum from twenty years ago and you’ll see a bunch of courses that were thought to be perpetual and necessary, that were ratified by requirements and tenure, that were called forth by an endless slog of committee meetings and the harrumphing of professional associations.

Our answer shouldn’t be to cut. It shouldn’t be “compromises” that preserve our privileges at the cost of students, adjuncts, or the sense of higher mission that makes our enterprise beyond and outside of profit. Our answer? It should be to dance.

Posted in Academia, Swarthmore | 3 Comments

The Longue Duree of the Galactic Empire

In response to this symposium at Wired about the Battle of Hoth, my thoughts:

“The overly episodic focus of military historians and policy experts rather typically leads them to ignore the deeper structural considerations shaping this period in the history of galactic society. The Battle of Hoth is in fact an epiphenomenal afterthought notable largely for the waste of lives and resources on both sides, rather than any kind of turning point in the conflict. In the longue duree, what is more striking by far is the escalating failure of bureaucratic centralization under the late Imperial government, which was in turn little more than an extension of a similar structural contradiction in the late Republic period. Paying too much attention to ideological superstructures like ‘The Force’ conceals the degree to which galactic governance in either period had become a form of tributary extraction from separate polities whose cultures and languages were poorly integrated into the dominant elite culture. The Empire’s racial preference for humans with pink skin and a selected set of other privileged subaltern cultures was simply a ratification of the tendencies towards speciescentric elitism in the Republic, and the tendency to rely upon technological violence and coercion to keep systems in line merely a variation on the use of highly trained paramilitary “Jedi” to intimidate rebellious or dissenting local elites in the late Republic.

Battles like Hoth were a constant feature of the late Republic and Imperial periods alike, but have received less attention from scholars due to the lack of participation by charismatic leaders whose long-term importance was negligible, like Darth Vader and Leia Organa. In many ways, the destruction of Imperial facilities by poorly armed indigenes in the Endor system is more indicative of the ways in which galactic governance was fragmenting and failing in the late Imperial period. Treating the Rebellion as a privileged mode of dissent in an era when many other systems and social classes were in other ways ‘slipping through the fingers’ of the Coruscant metropole is itself granting too much credit to a ragtag band of avidly self-promoting malcontents.”

Posted in Sheer Raw Geekery | 5 Comments

The Dissertation Might Not Be Broken, But It Needs a Chiropractor

One more thing that Menand mentioned in passing in his talk at Swarthmore was that the median time to completion of a Ph.D in the humanities is over nine years. Even if the job market in academia were wonderful that would be a very hard pill to swallow.

There are a lot of reasons why time to completion is stretching out as long as it is. One, of course, is the job market itself. Considering that a full-time doctoral student is almost certainly either teaching (for her own institution or adjuncting elsewhere) or is being supported by a partner or family, if the market is especially terrible it might seem to make more sense to wait it out one more year and keep reworking the dissertation. Or it might be that the need to work is keeping a doctoral student from devoting the time necessary to finish the dissertation, or keeping them from having the money for necessary travel to another archive or fieldsite.

Another reason might have to do with the generally weak or diffuse nature of graduate pedagogy. A student who needs devoted attention from a mentor–or who needs to be told to finish up or quit–may well never get what they need, and just go on struggling alone for years and years.

But attention is understandably centering on two things: graduate study before the dissertation and the dissertation itself. A few courageous programs are tackling one or both of these problems, most notably Stanford.

I think something even bolder might be called for: first, no more than a single year of coursework and study, culminating in a proposal for a program of research. Accept only the students who are already well prepared in their discipline and thus accept that they are in fact ready to go to work. If that standard suggests there would be fewer students who met the criteria for admission, then that’s a bonus: it cuts down on the number of students doing doctorates and reinvigorates the separate M.A. as a concept: that’s for students who need further disciplinary preparation or who have changed the nature of their interests in between receiving a B.A. and preparing to undertake doctoral study.

You shouldn’t need more than a year–a year that would be directed at sharpening your possible research interests–because the process of scholarly research itself involves learning the scholarship of every relevant specialized literature and acquiring the necessary methodological skills along the way.

Second, the research needed for a doctorate should not be a completed book-length manuscript, at least not in the humanities. A humanities scholar should prepare the following: an essay-length commentary on the disciplinary literature that addresses the research problem they’re working on; two article-length essays on the research subject; an executive summary of the overall area of research interest directed at a broad interdisciplinary audience; a plan for continuing research and inquiry on the subject; and a plan for making archives, notes and other materials connected to the study available digitally via some common depository supported by a consortium of academic institutions. In the same time, the doctoral candidate should prepare a syllabus for a thematic course in their area of interest and teach it. All of this should be complete by the end of the fourth year of study, with no more than a single additional year of extension possible.

This has to be done all at once, and every tenure and promotion committee in the country at every institution with even modest ambitions should adjust accordingly. The idea here is to move the expectation of a book deeper into a professor’s career–or to rethink that ambition entirely.

The idea of research as a basic part of the apprenticeship of a scholar isn’t broken, but the dissertation as we have known it is. Anthony Grafton says at the Chronicle, “The dissertation makes intellectual sense only as a historian’s quest to work out the problem that matters most to him or her, an intellectual adventure whose limits no one can predict…There’s no way to know in advance how long that will take. Cut down the ambition and scale, and much of the power of the exercise is lost.” Much as I love Grafton’s work and his frequent attention to the state of the academic profession, this really feels like an extravagant view that’s out of touch with the reality of the actual market and the actual jobs available to the dissertating. Once it’s put like that, it’s not clear to me that we should ever have thought that way.

More importantly, it’s not clear why a process that can take as long as it needs has to run to its conclusion before we award a doctorate and admit someone into the profession. Why not keep working out the problem, keep travelling on the adventure, make the exercise a life-long one? All we should need to see is evidence that the journey is well begun and the scholar has steady feet upon the path. Not the least of the good things that might follow from such a change is that new doctorate-holders might feel like there is more than one path available to them after they’ve finished, and the cost of setting foot on any of them, whatever might come of it, will be far smaller in human terms as well.

Posted in Academia | 13 Comments

More on Menand

Almost back to feeling normal, so I thought I’d return to my somewhat fever-delirious notes on the Menand talk last week at Swarthmore and see what I could pull out of them.

Menand’s talk, following some of his recent writing, was broken into three sections. The first was a quantitatively-oriented summary of the current trends in higher education in general and in the humanities in specific. The second was a review of the history of the humanities in academia in the last 75 years or so. The third was a meditation on possible solutions to the problems described in the first two parts.

Though he was appropriately cautious about the language of “crisis”, pointing out that the humanities in particular has been by its own lights perpetually in crisis, the numbers he laid out suggested that there is a real crisis at the moment and that it is gathering momentum. In particular, he focused on dismal enrollment trends in the humanities at major research universities (including history, which as usual is a borderlands discipline that pops in and out of focus in these kinds of conversations), and on the degree to which students in the US have long since come to prefer pre-professional degrees in Accounting, Nursing, and so on over any of the liberal arts (including the sciences or the hard social sciences). Interestingly, he argued that small undergraduate colleges like Swarthmore are one of the few islands of relative calm in the storm, that enrollment trends for the humanities at most such colleges are only mildly negative and the support of most administrations is strong. Menand noted many other negative trends in alignment with enrollment, such as the near-total vanishing of grants and support for research in the humanities.

In the second part, he gave what I found to be a curiously reactive and Kuhnian account of the transformation of humanistic scholarship since 1950 that concluded that we’re in a moment of atheoretical ennui, that there are no big ideas or theories. (Notably he made no reference at all to digital humanities, “distant reading”, text mining or anything else along these lines.) Still, he offered this history as a hopeful one, showing the resilience and relevance of humanistic thought, and observing that each successive move, while not progressing towards a greater cumulative knowledge that was more “true” or “accurate” in the whiggish sense, generatively opened up the intellectual and social spaces of humanistic practice. There were some really appealing ideas in this account–one I liked was the argument that the “public intellectual” is a red herring, that the problem with much humanistic thought is not that it communicates poorly but simply that many people (particularly in other academic disciplines) disagree with it and will continue to do so. This he took to be a source of strength and mission rather than a problem.

The third part is where I felt a bit let down. Menand’s writing makes clear that he doesn’t think formal interdisciplinarity is an answer to the problems of disciplinarity, because interdisciplinarity IS disciplinarity: it ratifies the disciplines. In his writing, he also doesn’t think disciplinarity is a problem; he thinks it is the consequence of professionalization, and that professionalization is a necessary part of the value of academic institutions. In the talk, he moved off of this line somewhat, in a fuzzy way. What I heard in there somewhere, maybe because I’m predisposed to hear it, was that there needs to be more conscious generalism, less over-specialization in the humanities.

Menand also said that the humanities need to basically get into everybody else’s shit more, that a more generalist sensibility doesn’t just let you help students see how the humanities connect to the world, but also lets you get involved in discussions about neurobiology and economics with more confidence.

So how do we get there? Menand said, “Well, you can’t rearrange departments or practices as they exist, that’s too hard, so you’ll just have to wait for us to train a new generation of scholars who have slightly different practices and outlooks”. Not only does that align very poorly with the immediacy of the existential threat he laid out in the beginning, it seems very nearly synonymous with saying, “Yep, we’re screwed.” It just doesn’t seem that hard to me to create some space for curricular and intellectual movement, to loosen the constraints, within existing practices.

Menand also said that he felt he couldn’t possibly advise undergraduates about any other career besides an academic one, because he doesn’t know anything about other careers. This also seems really wrong in the context of his urging that humanists speak to and about anything that involves the human subject and human practices. How could we possibly be comfortable engaging in that range of argument and yet say that we have nothing to say to students about the lives they might live unless they want to be professors like us? It may well be that I cannot tell a student specifically about current tangible considerations around employment in the museum industry (to use Menand’s example) without having worked in museums myself. But I can surely talk to students about the idea and institutional history of the museum, about ways to imagine exhibition, about new media forms and practices that might transform museums and exhibition, and so on. Still, I thought he also ended up making an unintentional argument that if we want more flexibility and range in humanistic thought we may also need to look for some humanists who come from completely different backgrounds or training rather than just from slightly reformed ‘traditional’ graduate education.

Of the ideas he put forth in the last part of his talk, the one that I found the most useful might be the simplest to pull off (at least in the spirit of how I heard it): that one strategy that might help the humanities is simply to readdress what they teach, to redirect the focus of a course so that it speaks back to or anchors itself in concepts, subjects or disciplines outside the ‘traditional’ remit of humanistic academia.

There are of course humanists who’ve been doing this kind of thing as a steady part of their practice for their whole careers. I do think there is probably a way to go about it that is particularly generative and useful for our students and that doesn’t rub up against our colleagues outside the humanities so abrasively. When I teach my class on the history of international development institutions and the intellectual history of development, for example, I’m certainly speaking back to the way that “development” is conventionally imagined in the discipline of economics. But I’m also trying to let that way of thinking live and breathe inside my class, so that one outcome of my course is that a student might choose to prefer that way of thinking and working with development. Often I think when humanists set off to talk about science or other forms of practice in the world, they forestall or foreclose the possibility of an escape from or challenge to the humanistic imagination; they define critique as a form of negation or rejection rather than a productive enrichment or complication. (Yes, I know, it happens far more in reverse, but that’s a problem for a different day.)

Because I can readily see how we might offer more courses like this, I’m not sure that things are quite as gloomy or difficult as Menand imagines they are–given the way he tells the history, it was almost the resigned account of a person who imagines himself the last survivor of a vanishing paradigm, Jor-El waving good-bye to the infant Superman as he rockets from Krypton, rather than as the reform-minded guardian of a grand tradition. I think we’re in the middle of a ferment full of new ideas and practices (as well as the enduring strength of many old ones). The trick will be to see the possibilities of this moment more fully, in a more joyous and permissive mood.

Posted in Academia, Digital Humanities, Generalist's Work, Swarthmore | 2 Comments

On Diamond (Not Again!)

I don’t really mean to get drawn into recurrent arguments about Jared Diamond’s work, because my actual feelings about the actual books are rather mixed and indifferent. Guns, Germs and Steel reads well; it’s a useful teaching book for fueling a discussion about the merits and limits of materialism and environmental determinism, and it can provoke a very interesting conversation about moral responsibility, global inequality and the post-1450 expansion of Europe (almost in spite of itself). I appreciate that Diamond thinks his argument in GGS is strongly anti-racist, I appreciate why others think it has the opposite effect, and I think that neither is entirely correct. Even so, among synthesizing works, I think there are better choices for most of Diamond’s signature arguments.

I appreciate that Collapse is, in a way that I find awkward and roundabout, trying to think about the question of determinism. I appreciate that his current book is working hard not to get drawn into sentimentality about hunter-gatherers, that Diamond believes himself to be steering a middle course between ethnocentric arrogance and romanticism about ‘noble savages’. I appreciate that Diamond thinks The World Until Yesterday is deeply appreciative of “traditional societies” and so is baffled to be accused of hating on them.

I also appreciate that his audiences are looking not just for a clear writer who seems knowledgeable about many issues, but for “big theories” of human history and culture that do not require a Ph.D’s worth of knowledge and training to understand or articulate.

The problem is that there are a lot of problems with Diamond’s work: his command over the literatures he’s synthesizing, the selectivity of his synthesis, and the uncharitable and at times incomprehensible framing he gives to any potential objections (when he can be bothered to acknowledge that such a thing could exist). Scholars who try to point out these things politely get ignored or acknowledged in passing, as in Razib Khan’s update to his post at Gene Expression. I’ve been in a number of discussions over the years with people who like Diamond’s books who then say, “But yeah, he gets a lot of things wrong” or “yeah, his theory is really overexaggerated and simplistic” as if that’s not even worth talking about and as if you’re a hater for even wanting to talk about it. Small wonder that some scholars, particularly anthropologists, lose their shit when there’s a new Diamond book out there. Sometimes you lose your shit when people insist that they don’t really want to talk about all the people (including you) who are not losing their shit. Why doesn’t Khan want to talk about Alex Golub’s careful, detailed response to Diamond’s book? Why isn’t Golub the “typical anthropologist” for Khan, but some folks working for an NGO are? Probably because that would take a longer, smarter entry.

I agree with Khan that sometimes the shit-losing leads people to say things that are just as problematic–to sneer at Diamond’s readers, to condemn anybody who tries to have a “big theory” about human history and culture, or to go too big in characterizing what’s wrong with his work. But have some sympathy here, because Diamond and a few others in his intellectual neck of the woods, like Steven Pinker, specialize in cherrypicking big fields of scholarly work for a few friendly citations and then acting as if what they’ve found is the entirety of those fields. Diamond and Pinker also seem unable to resist setting up straw man versions of legitimate criticisms (and then a few of their critics can’t seem to resist falling into that characterization).

In an earlier comment, I mentioned at least a few areas where there seems to me to be a genuine debate with a range of legitimate positions that require respect, if not agreement, in terms of Diamond’s latest (as well as Pinker’s latest book, which has some overlap):

1. Maybe New Guinea isn’t representative of all modern “traditional societies”, let alone hunter-gatherers in all of human history. Maybe there is considerably more variety in terms of violence and many other attributes than Diamond lets on. Maybe he’s not even paying attention to the full range of anthropological or historical writing about New Guinea. Maybe Diamond isn’t even living up to his own stated interest in the variations between such societies.

2. Maybe modern hunter-gathering societies are not actually pristine, unchanging survivals of an earlier era of human history, but instead the dynamic consequence of large and small-scale migrations of agriculturalists and, even more recently, industrial workers. At least in some cases, that might be why hunter-gatherers inhabit remote or marginal environments: not because of preference, but as a response to the sometimes-violent movement of other human societies into territories that they used to inhabit. That means taking whatever it is that they have been doing in the 20th Century (violence or otherwise) as evidence of what they’ve always done is a mistake.

3. Maybe defining violence or war in a rigorous, consistent, measurable and fully comparative way is much harder than Diamond or Pinker think it is.

4. Maybe between what Diamond calls a “traditional society” and modern “WEIRD” societies (Western, educated, industrialized, rich and democratic) there are lots of other models. Maybe “between” is the wrong term altogether, since it implies that there’s a straight developmental line between “traditional society” and modernity, an old teleological chestnut that most anthropologists and historians would desperately like to get away from. I haven’t read very far into the book yet, but Diamond doesn’t seem to have any idea, for example, that there have been numerous societies in human history in which many connected communities shared culture and language at high levels of population density and economic complexity and yet never had a “state” in the usual sense. What are those? Also: maybe Diamond frequently confuses “traditional” and “premodern”. Much of the time when he says, “Well, we modern WEIRD people do X, ‘traditional societies’ do Y”, the “Y” in question would apply equally to large premodern states and empires.

Or to summarize: maybe Diamond is pushing way, way too hard for a clean distinction between two broadly drawn “types”–“traditional society” and “modern society”–and is distorting, misquoting, truncating or overlooking much of what he read (hard to tell what he read, since there are no footnotes) to make the distinction come out right.

This is not nit-picking, this is not complaining about a spelling error or some mildly errant footnote on p. 79. This is not pedantry. This is important. The more airtight you want to make your universalisms, the more that you tend to spring leaks–and the more leaks you spring, the faster your boat sinks. A “big theory” that’s advanced with generosity and gentleness, which grants its own provisional character, is a sturdier way to inspire discussion and create understanding. As Golub points out, that is not what Diamond is doing, so much so that his description of other ways of thinking is very nearly incoherent.

Good, simple, accessible synthesis does not require a lack of generosity towards the scholarship that makes it possible. And a good synthesis should always be as much a guide to the possibilities of interpretation around a complex subject as it is a defense of a singular interpretation.

Posted in Academia, Books, Cleaning Out the Augean Stables, Generalist's Work | 4 Comments