Over the last decade, I’ve found my institutional work as a faculty member squeezed into a kind of pressure gradient. On one side, our administration has been requesting or requiring more and more data, reporting and procedures that are either needed to document some form of adherence to the standards of external institutions or that are wanted in order to further professionalize and standardize our operations. On the other side, I have colleagues who either ignore such requests (both specific ones and the entire issue of administrative process) to the maximum extent possible or who reject them entirely on grounds that I find either ill-informed or breathtakingly sweeping.

That pressurized space forms from wanting to be helpful while also wanting to take governance seriously. I think stewardship doesn’t conform well to a hierarchical structure, but it also should come with some sense of responsibility to the reality of institutions and their relationship to the wider world. The strongest critics of administrative power that I see among faculty, both here at Swarthmore and in the wider world of public discourse by academics, don’t seem very discriminating in how they pick apart and engage various dictates or initiatives, and more importantly, rarely seem to have a self-critical perspective on faculty life and faculty practices. At the same time, there’s a lot going on in academia that comes to faculty through administrative structures and projects, and quite a lot of that activity is ill-advised or troubling in its potential consequences.

A good example of this confined space perennially forms for me around assessment, which I’ve written about before. Sympathy for my colleagues charged with administrative responsibilities around assessment means I should take what they ask me to produce seriously, both because there are consequences for the institution if faculty fail to do so in the specified manner and because I value them and even value the concepts embedded in assessment.

On the most basic human level, I agree that the unexamined life is not worth living. I agree that professional practices which are not subject to constant examination and re-evaluation have a tendency to drift towards sloppiness and smug self-regard. I acknowledge that given the high costs of a college education, potential students and their families are entitled to the best information we can provide about what our standards are and how we achieve them. I think our various publics are entitled to similar information. It’s not good enough to say, “Trust us, we’re great”. That’s not even healthy if we’re just talking to ourselves.

So yes, we need something that might as well be called “assessment”. There is some reason to think that faculty (or any other group of professionals) cannot necessarily be trusted to engage in that kind of self-examination without some form of institutional support and attention to doing so. And what we need is not just introspective but also expressive: we have to be able to share it, show it, talk about it.

On the other hand, throughout my career, I’ve noticed that a lot of faculty do that kind of reflection and adjustment without being monitored, measured, poked or prodded. Professionalization is a powerful psychological and intellectual force through the life cycle of anyone who has passed through it, for good and ill. The most powerfully useful forms of professional assessment or evaluation that I can think of are naturally embedded in the workflow of professional life. Atul Gawande’s checklists were a great idea because they could be inserted into existing processes of preparation and procedure, because they are compatible with the existing values of professionals. A surgeon might grouse at the implication that they needed to be reminded about which leg to cut off in an amputation but that same surgeon would agree that it’s absolutely essential to get that right.

So assessment that exists outside of what faculty already do anyway to evaluate student learning during a course (and between courses) often feels superfluous, like busywork. It’s worse than that, however. Not only do many assessment regimes add procedures like baroque adornments and barnacles, they attach to the wrong objects and measure the wrong things. The amazing thing about Gawande’s checklists is that they spread because of evidence of their very large effect size. But the proponents of strong assessment regimes, whether that’s agencies like Middle States or it’s Arne Duncan’s troubled bureaucratic regime at the U.S. Department of Education, habitually ignore evidence about assessment that suggests that it is mostly measuring the wrong things at the wrong time in the wrong ways.

The evidence suggests, especially for liberal arts curricula, that you don’t measure learning course by course and you don’t measure it ten minutes after the end of each semester’s work. Instead you ought to be measuring it over the range of a student’s time at a college or university, and measuring it well afterwards. You ought to be measuring it by the totality of the guidance and teaching a faculty member provides to individual students, and by moments as granular as a single class assignment. And you shouldn’t be chunking learning down into a series of discrete outcomes that are chosen largely because they’re the most measurable, but through the assemblage of a series of complex narratives and reflections, through conversations and commentaries.

In a given semester, what assessment am I doing whether I am asked to do it or not? In any given semester, I’m always trying some new ways to teach a familiar subject, and I’m always trying to teach some new subjects in some familiar ways. I am asking myself in the moment of teaching, in the hours after it, at the end of a semester and at the beginning of the next: did that work? What did I hope would work about it? What are the signs of its working: in the faces of students, in the things they say then and there in the class, in the writing and assignments they do afterwards, in the things they say during office hours, in the evaluations they provide me. What are the signs of success or failure? I adjust sometimes in the moment: I see something bombing. I see it succeeding! I hold tight in the moment: I don’t know yet. I hold tight in the months that follow: I don’t know yet. I look for new signs. I try it again in another class. I try something else. I talk with other faculty. I write about it on my blog. I read what other academics say in online discussion. I read scholarship on pedagogy.

I assess, I assess, I assess, in all those moments. I improve, I think. But also I evolve, which is sometimes neither improvement nor decline, simply change. I change as my students change, as my world changes, as my colleagues change. I improvise as the music changes. I assess.

Why is that not enough for the agencies, for the federal bureaucrats, for the skeptical world? There are two reasons. The first is that we have learned not to trust the humanity of professionals when they assure us, “Don’t worry, I’m on it.” For good reasons sometimes. Because professionals say that right up to the moment that their manifest unprofessionalism is laid screamingly bare in some awful rupture or failure. But also because we are in a great war between knowing that most of the time people have what my colleagues Barry Schwartz and Ken Sharpe call “practical wisdom” and knowing that some of the time they also have an innocent kind of cognitive blindness about their work and life. Without any intent to deceive, I can nevertheless think confidently that all is well, that I am teaching just as I should, that I am always above average and getting better all the time, and be quite wrong. I might not know that I’m not seeing or serving some group of students as they deserve. I might not know that a technique that I think delivers great education only appears to because I design tests or assignments that evaluate only whether students do what I want them to do, not whether they’ve learned or become more generally capable. I might not know that my subject doesn’t make any sense any longer to most students. Any number of things.

So that’s the part that I’ll concede to the assessors: it’s not enough for me to be thoughtful, to be practically wise, to work hard to sharpen my professionalism. We need something outside ourselves: an observer, a coach, a reader, an archive, a checklist.

I will not concede, however, that their total lack of interest in this vital but unmeasurable, unnumbered information is acceptable. This should be the first thing they want: our stories, our experiences, our aspirations, our conversation. A transcript of the lived experience of teaching. Here is the second reason the assessors think that what we think about our teaching is not wanted or needed: they believe that all rhetoric is a lie, all stories are told only to conceal, all narrative is a disguise. They think that the work of interpretation is the work of making smoke from fog, of making lies from untruths. The reason they think that is that stories belong at least somewhat to the teller, because narratives inscribe the authority of the author. They don’t want to know how I assess the act of teaching as I perform it because they want a product, not a process. They want data that belongs to them, not information that creates a relationship between the interpreter and the interpreted. They want to scrub evidence clean, to make an antiseptic knowledge. They want bricks and mortar and to be left alone to build as they will with it.


I get tired of the overly casual use of “neoliberal” as a descriptive epithet. Here however I will use it. This is what neoliberalism does to rework institutions and societies into its preferred environment. This is neoliberalism’s enclosure, its fencing off of commons, its redrawing of the lines. The first thing that gets done with data that has had its narrative and experiential contaminants scrubbed clean is that the data is fed back into the experience of the laborers who first produced it. This was done even before we lived in an algorithmically-mediated world, and has only intensified since.

The data is fed back in to tell us what our procedures actually are, what our standards have always been. (Among those procedures will always be the production of the next generation of antiseptic data for future feedback loops.) It becomes the whip hand: next year you must be .05% better at the following objectives. If you have objectives not in the data, they must be abandoned. If you have indeterminacies in what you think “better” is, that’s inadmissible: rarely is this looping even subject to something like a Bayesian fuzziness. This is not some exaggerated dystopic nightmare at the end of an alarmist slippery slope: what I’m describing already happened to higher education in the United Kingdom, largely accomplishing nothing besides sustaining a class of transfer-seeking technocratic parasites who have settled into the veins of British universities.

It’s not just faculty who end up caught in the loop, and like frogs boiling slowly to death, we often don’t see it happening as it happens. We just did our annual fire drill here in my building, and this year the count that we did of the evacuees seemed more precise and drawn-out than last year, and this year we had a mini-lecture about the different scenarios and locations for emergency assembly and it occurred to me: this is so we can report that we did .05% better than last year.

We always have to improve just a little, just as everything has to be “growth-based”, a little bigger next year than last year. It’s never good enough to maintain ground, to defend a center, to sustain a tradition, to keep a body healthy happy and well. Nor is it ever good enough to be different next year. Not a bit bigger, not a bit better, but different. New. Strange. We are neither to be new nor are we to maintain. We are to incrementally approach a preset vision of a slightly better but never perfect world. We are never to change or become different, only to be disrupted. Never to commune or collaborate, always to be architected and built.


So here I am in the gradient again, bowed down by the push on all sides. I find it so hard when I talk to faculty and they believe that their teaching is already wholly and infinitely sufficient. Or that it’s nobody’s business but their own how they teach, what they teach, and what comes of their teaching. Or that the results of their teaching are so sublime, ineffable and phenomenologically intricate that they can say nothing of outcomes or consequences. All these things get said, at Swarthmore and in the wider world of academia. An unexamined life.

Surely we can examine and share, express and create. Surely we can provide evidence and intent. Assess and be assessed in those ways. Surely we don’t have to bury that underneath fathoms of tacit knowledge and inexpressible wisdom. We can have our checklists, our artifacts.

But surely too we can expect from administrations that want to be partners that we will not cooperate in building the Great Machine out of the bones of our humane work. That we’re not interested in being .05% better next year, but instead in wild improvisations and foundational maintenance, in becoming strange to ourselves and familiar once again, in a month, a moment or a lifetime. Surely that’s what it means to educate and become educated in an uncertain world: not .05% more measured comprehension of the impact of the Atlantic slave trade on Sao Tome, but how a semester of historical study of the Atlantic slave trade might help a poet forty years hence to write poems, might sharpen an analytic mind, might complicate what was simple or simplify what was complex. Might inform a diplomat ten years from now, might shape a conservative’s certainty that liberals have no answer while voting in next year’s Presidential race. Might inspire a semester abroad, might be an analogy for an experience already had. I can talk about what I do to build ramps to all those possibilities and even to the unknown unknowns in a classroom. I can talk about how I think it’s working and why I think it’s working. But don’t do anything that will lead to me or my successors having to forgo all of that in favor of .05% improvements onward into the dreary night of an incremental future.


Oath for Experts Revisited

I was just reminded by Maarja Krustein of a concept I was messing around with a while back: getting people together to draft a new “oath for experts”. I had great ambitions for this idea a few years back, about trying to renovate our sense of how an expert ought to act, to describe a shared professional ethic for experts that would help us explain what our value still might be in a crowdsourced, neoliberal moment. The Hippocratic Oath is at least one of the reasons why many people still trust the professionalism of doctors (and are so pointedly scandalized when it is unambiguously violated).

We live in a moment where many people increasingly believe either that they can get “good enough” expertise from crowdsourced knowledge online or that experts are all for sale to the highest bidder, or will narrowly conform their expertise to fit the needs of a particular ideology or belief system.

I think in both cases these assumptions are still more untrue than true. Genuine experts, people who have spent a lifetime studying particular issues or questions, still know a great deal of value that cannot be generated by crowdsourced systems–in fact, most crowdsourcing consists of locating and highlighting such expertise rather than spontaneously generating a comparable form of knowledge in response to any query. I still think a great many experts, academic and otherwise, remain committed to providing a fair, judicious accounting of what they know even when that knowledge is discomforting to their own political or economic interests.

Mind you, crowdsourcing and other forms of networked knowledge are nevertheless immensely valuable, and sometimes a major improvement over the slow, expensive or fragile delivery of authoritative knowledge that experts in the past could provide. Constructing accessible sources of everyday reference in the pre-Internet world was a difficult, laborious process.

It’s also undoubtedly true that there are experts who sell their services in a crass way, without much regard for the craft of research or study, to whoever is willing to pay. But this is why something like an oath is necessary, and why I think everyone who depends upon being viewed as a legitimate expert has a practical reason to join a large-scale professional alliance designed to reinvigorate the legitimacy of expertise. This is why professionalization happened during the 20th century, as groups of experts who shared a common training and craft tried to delegitimate unscrupulous, predatory or dangerous forms of pseudo-expertise and insist on rigorous forms of licensing. I don’t think you can ever create a licensing system for something as broad as expertise, but I do think you could expect a common ethic.

The last time I tried to put forward one plank of a plausible oath, I made the mistake of picking an example that created more heat than light. I might end up doing that again, perhaps by underestimating just how many meal tickets this proposed oath might cancel. But let’s try a few items that I personally would be glad to pledge, in the simplest and most direct form that I can think of:

1) An expert should continuously disclose all organizations, groups and companies to whom they have provided direct advice or counsel, regardless of whether the provision of this advice was compensated or freely given. All experts should maintain a permanent, public transcript of such disclosures.

2) An expert should publicly disclose all income received from providing expert advice to clients other than their main employer. All experts should insist that their main employer (university, think tank, political action committee, research institute) disclose its major sources of funding as well. The public should always know whether an expert is paid significantly by an organization, committee, company or group that directly benefits from that person’s expert knowledge.

3) Any expert providing testimony at a criminal or civil trial should do so for free. No expert should be provided compensation directly or indirectly for providing expert testimony. Any expert who serves as a paid consultant for a plaintiff or a defendant should not provide expert testimony at a trial involving that client.

4) All experts should disclose findings, information or knowledge that contradicts or challenges their own previous conclusions or interpretations when that information becomes known to them in the course of their own research or analysis. Much as newspapers are expected to publish corrections, experts should be prepared to do the same.


Putting Out Fire With Gasoline

I appreciate what Sady Doyle is trying to do in this essay on humor, culture and politics. Primarily the essay is addressed to artists and performers (and their audiences) who object to what they perceive as “politically correct” censoriousness. (One notable recent example is the comedians who’ve suggested that they won’t play college campuses because activists attempt to micromanage what they can and can’t say.)

Doyle uses the Glen Ridge rape case, particularly the relationship between an infamous lyric in a Beastie Boys song and the actions of one of the rapists, to offer an olive branch to artists and performers. I’m compressing a long and careful development of the argument of the piece, but fundamentally the analysis goes like this: activists know that the artists are “good people”, but if so, when you find out that the content of your expressive work is in the heads of “bad people” or is associated with “bad actions”, you should want to avoid that content in the future. Doyle couches this almost as a secular concern for the souls of artists and performers: “it must be one of the worst feelings in the world” to discover that something you sang or joked or wrote or painted has been cited by or admired by a person who associates that cultural work with their own commission of evil.

The essay is very careful in the early going to avoid simplistic claims about causality. The content of expressive culture doesn’t cause bad actions to happen, Doyle initially acknowledges. The lyric didn’t cause the rape, it just informed it, gave it substance, suggested its horrific specificities. But by the end of the essay, that’s no longer the case: bad culture not only causes harm to the feelings or subjectivities of some who encounter it, but we’re back to the content of culture causing people to have explicit thoughts, thoughts that have tangible ideological intent to discriminate or harm. (A “man who believes all black people are criminals is going to shoot an unarmed black man”.) I think here Doyle demonstrates what has become a characteristic view of a lot of current identity-based activism: that discrimination, oppression and racism originate from the hidden interiority of individuals, that “bad action” is located in “bad thinking” and “bad personhood”, that bad thinking has a kind of explicit propositional character, and that its propositional content is a concentrated, distilled form of everyday language and representation. By the end of the essay, Doyle isn’t worrying about whether that shooter has a song lyric playing in his head when he shoots, but whether the song lyric got him to shoot when he wouldn’t have otherwise done so.

So by this point, the olive branch is this: if you don’t want to be the person who causes someone to do evil, then listen to us when we tell you that what you just said or performed or visualized is going to cause someone to do evil. Because, Doyle says, we know (we think we know) that you aren’t evil. It is almost a doppelganger of the debate on guns: comedians and others are portrayed as if they believe jokes don’t hurt people, people hurt people; Doyle is offering them the chance to think that it’s just jokes, jokes or art or culture as a technology that is separable from the personhood of its maker. You, she argues, can know that “some people are flammable” and you, she argues, can “be careful about where the spark lands”.


Put in this fashion, this is another round in a venerable debate about the responsibility of artists for the consequences of their art. A thought which, I have to confess, first fills me with a certain degree of professorial and middle-aged weariness. It is not that I want citations galore, but I do wish we could get some degree of acknowledgement that this is an ongoing conversation where many good points and difficult experiences have already been had. This does not mean it is impossible to come to new understandings, to move ahead, and every generation also has to undertake its own encounter with fundamental human problems. But just knowing that you are not the first to think these things tends to moderate the degree to which you speak as a missionary might speak to a heathen, as if you’re delivering a message that up to this point has never been heard. That’s especially important if you mean to offer an olive branch. We don’t hate you as a person, we just hate your jokes! is an easier message for a comedian to take, I am guessing, if it is offered as the latest modest turn of a familiar dilemma.

But this point opens up into another landscape of difficulty for this kind of argument. First, Doyle’s approach strikes me as a fairly typical example of the way that current activism has amended a postmodern approach to interpretation and hermeneutics, I think in some ways without knowing that something’s been left out. Roland Barthes announced the “death of the Author”, which, to simplify somewhat, meant in his thinking and in much other postmodernist or poststructuralist theory that to understand what a text meant had little to nothing to do with discovering what the producer of that text thought that it meant. For all sorts of reasons: the producer was no longer understood to be a masterful individual agent in control of their own consciousness and intention; power and culture and institutions and history all radiated through the Author like light shining through a prism and thus spoke within whatever the Author produced. But also: the audience, the reader, the viewer, determined what the text meant, and determined that within the circumstances of a single moment of interpretation. It could mean one thing today and another thing tomorrow even to the same person, it could mean one thing before it was used or cited or deployed and another thing after it was used, it could mean two things at once or ten things, it could mean nothing fixed or determinate at all. The text could be paired with another text and change meaning; it could mean something different in a library or a bookstore or read aloud on a tape; it could mean one thing if it was held by a preacher and thrown into a fire and another thing if it was read lovingly to a child in front of a fireplace. I caricature a bit: postmodern approaches to interpretation did not hold, as they are often accused of holding, that texts meant everything or nothing, that signifiers floated utterly free.
But it was important in this style to say that meaning was a very large, messy and protean space even for the most seemingly banal or straightforward texts, and that context mattered as much as text, that saying that a certain work always meant something no matter where it was or who was reading it, was a kind of folly.

The postmodern emphasis on language preceding and shaping thought and thought shaping action is intact in this new activist stance, but not the indeterminacy and multiplicity of meaning. And the Author has been brought forth from his grave, but not entirely to a new life. Doyle, like many, argues that the meaning of culture is often quite determinate, and it should be determined not by an act of discerning interpretation but in relationship to a set of social subjects. That is, meaning still resides in that sense with the audience and with usage and context, but only some audiences and some contexts. Only two audiences have authority to make meaning, in this view: the people who use expressive culture deliberately as a weapon and the people who are wounded by that weapon. The Author is being forgiven here: the Author does not wound. The Author is only the blacksmith who makes the sword on an anvil. Whether the sword is wielded by the righteous or the wicked, or left above the mantelpiece, is not the Author’s will–unless he deliberately peddles it to the wicked.

Anyone else who claims, however, to see the sword as a spit for grilling meat, or as a fashion accessory, or as a demonstration of metallurgical skill, or as a symbol of aristocratic nostalgia, or as a visual stimulus for writing fantasy novels, or as one of a class of crafted objects, etc., is being ruled out of bounds. Those other meanings and interpretations are unavailable if there is someone somewhere who has been wounded.

Let me try to make the problem more concrete and responsive to Doyle’s argument. Doyle focuses on the documented presence of a lyric about sexual assault with a baseball bat in the thinking of a young man who sexually assaulted a woman with a baseball bat. The first problem with that focus is, “What do we do about the presence of that lyric in the minds of so many who never did anything of the sort?” This point needs to be made carefully, because lurking behind it is the callowness and stupidity of slogans like “All Lives Matter”.

This is a genuine mystery if the argument is made that words and texts and performances do have (or can have) a singular meaning and do reliably serve as the predicate of bad thinking, bad personhood, and bad action. When media critics predict, as they have for decades, that the representation of violence in media will create violent people and violent action in some sort of rough tandem (the more of the first, then the more of the latter) and that doesn’t happen (it didn’t happen), that should mean that the initial assertion that the representation of violence has a fixed meaning and a fixed relationship to self-fashioning is just plain wrong. What it means is that if there’s more violence represented and less violent action that many people consuming that violent media are interpreting it and understanding it in ways that don’t actually incline them mimetically towards what they’ve seen, towards enactment. It means, well, that lots of things are happening when that media is consumed, and not just lots of things across the whole society, but lots of things in every single person.

When I’ve gotten into debates over the years with violence-in-media activists, one of the responses I often hear is, “Well, we’re not concerned with what well-educated, economically comfortable people in stable homes think when they watch violent media, we’re concerned about it as a contributing factor to violence in impoverished, marginalized and unstable homes”. At which point, my response is that “violent media” is being used as a substitute and alibi for poverty, inequality and injustice. It’s being made to stand in for the whole because the whole is perceived as too big and too difficult to attack. If that move were really about a strategic subdivision of a complex problem into small and manageable ones, it might be ok, though even there the whole point of thinking strategically is to prioritize, and violent media’s negligible and difficult-to-demonstrate contribution to violent action should be a low priority even in that context. But the problem is that small and manageable tasks should require small and manageable contributions of labor. Trying to cleanse the culture of violent video games or shows–or to get comedians to stop telling offensive jokes–is not a small or manageable task. So what happens is that the strategy swallows the whole; the small task comes to stand in for the entirety of the problem. Violent media become the way that one set of critics talk about poverty, and so they stop naming poverty for what it is. The enormity of structure disappears from view and becomes equivalent to the manageable choice of what to watch or play that night, or how to film a particular scene. In making a big problem open to our agency as individuals, we flatter ourselves too much. It’s as much an entrepreneurial or self-promoting move as it is a practical one.

Let me raise one last thought to trouble Doyle’s point. People who do evil sometimes leave in their wake considerable evidence about what they were watching, what they were listening to, what they liked and identified with in culture. The story is often told, for example, that Richard Nixon watched “Patton” and was profoundly influenced by it in his decision to illegally invade Cambodia. This story is often compared with the fact that the same film supposedly played a key role in getting the Israeli and Egyptian delegations to agree to the Camp David accords. Same film, seen in very different ways by different individuals and in different contexts. Score one for postmodernism, or maybe just old-fashioned critical analysis. It’s fair to say, “If someone tells you they were hurt by a joke, you should listen”. If there’s a fire where your sparks fell, pay attention. But it’s equally fair to say, “If someone else grabs the spark and builds a warming campfire with it, or cooks a meal over it, or makes a light from it, take note of that.” And equally fair to note that lightning starts fires too–and strikes in ways that no one expects.

After all, in the wake of some of the evil things that people have done, the archives of culture they leave behind often contain texts and songs and performances and images that none of us would intuitively see as a predicate of that evil. Much as I find clowns scary, I would not say that John Wayne Gacy’s obsession with clowns can be predictably “read out” from the art of clowning, nor that clowns ought to take their makeup off as a result. Murderers, rapists, bigots: populate the rogues’ gallery as you will, and you will find that what they viewed and heard and read are often not at all obviously tied to their actions. If you understand social evil as originating from bad thought and bad language and bad culture, and you keep finding that the inventory of social evil’s cultural world is brimming over with much more than you expected, you either have to decide that your understanding of the relationship between representation and action is too simplistic or that there is far more that artists and writers and comedians should have to be responsible for not painting or writing or saying. I think that’s the prospect that makes Patton Oswalt angry and other comedians afraid.

But there’s another mirroring complexity worth respecting: that the inventory of people who have fought for social justice–or who have suffered social injustice–is often also more capacious and contradictory than you’d expect if you think there’s a close relationship between social action and cultural consumption. That people suffering oppression sometimes see meaning and possibility even in texts that are very literally dedicated to that oppression, that the richness and indeterminacy of meaning flows in many directions.

The unpredictability of meaning, in so many different ways, suggests that our first and last response to it should be humility, should be a kind of principled uncertainty about what we think a joke will mean, can mean, has meant. Which is an uncertainty that should afflict comedian and critic alike. You might indeed be showering sparks on flammable people, or even calling down the lightning in an open field. But equally what looks like a spark might be a light in the darkness, or a warming memory of a distant flame. We should not manage that uncertainty by requiring everyone to perform and listen while covered in fire-retardant foam.

Posted in Politics, Popular Culture | 5 Comments

Is There a Desert or a Garden Underneath the Kudzu of Nuance?

I like this essay by Kieran Healy a lot, even though I am probably the kind of person who habitually calls for nuance. What this helps me to understand is what I am doing when I make that nearly instinctive move. I suppose in part I am doing what E.P. Thompson did in writing against theory as abstraction: believing that the important things to understand about human life are always descriptive, always in the details, always in what is (or was) lived, real, and tangible. There are days when I would find more persuasion, both as scholar and person, in the truths found in a novel or a deep work of narrative journalism than in social theory. But it is stupid to act as if one can be a microhistorian in a naive and unstructured fashion: there’s tons of theory in there somewhere, from the selection of the stories that we find worth our time to what we choose to represent them as saying. I do not read about human beings and then insist that the only thing I can do is just read to you what I read. I describe, I compress, I abstract. That’s what Kieran is arguing that theory is, and what the demand for “nuance” prevents us from doing in a conscious and creative way.

I suppose I lately have a theory of theory, which is that it is usually a prelude to doing something to human beings wherein the abstractions that make theory ‘good to think’ will become round holes through which real human square pegs are to be pounded. But this is in some sense no better (or worse) than any other abstraction–to really stick to my preferences, I should take every theory (and its application or lack thereof) on its particulars.

I also think that there is something of a puzzle that Kieran works around in the piece, most clearly in his discussion of aesthetics. (Hopefully this is not an objection about the need for nuance by some other name.) But it is this: on what grounds should we prefer a given body of theory if not for its descriptive power? Because that’s what causes the kudzu of nuance to grow so fast and thoroughly: academics read each other’s work evaluatively, even antagonistically. What are we to value between theories if not their descriptive accuracy? (If that’s what we are to value, it will fertilize the kudzu, because that’s what leads to ‘your theory ignores…’ and ‘your theory is missing…’.) We could value the usefulness of theory: the number of circumstances to which it can apply. Or the ease of use of theory: its memorability, its simplicity, its familiarity. Or the generativity of theory, tested by the number of people who actually do use it, the amount of work that is catalyzed by it.

The problem with all or any of those is that I don’t know that it leaves me with much when I don’t like a theory. Rational choice/homo economicus fits all of these: it is universal in scope, it’s relatively easy to remember and apply as a way to read many, many episodes and phenomena, and it has been hugely generative. I don’t like it because, for one, I think it isn’t true. Why do I think that? Because I don’t think it fits the actual detailed evidence of actual human life in any actually existing human society. Or the actual evidence of how human cognition operates. But I also don’t like it because of what is done in the name of such theory. That would always have to be a post-facto kind of judgment, though, if I were prohibited from a complaint about the mismatch between a theory and the reality of human life, or it would have to be ad hominem: do I dislike or mistrust the politics of the theorists?

I think this is why we so often fall back into the kudzu of nuance, because if we clear away the overgrowth, we will face one another naked and undisguised. We’d either have to say, “I find your theory (and perhaps you) aesthetically unpleasing or annoying” or “I don’t like the politics of your theory (and perhaps you) and so to war we will go”. The kudzu of nuance may be ugly and confusing, but it at least lets us continue to talk at and past one another without arriving at a moment of stark incommensurability.

Posted in Academia, Generalist's Work, Oh Not Again He's Going to Tell Us It's a Complex System | 1 Comment

Don’t Panic! Leave That to the Experts

In many massively multiplayer online games (MMOGs), players who are heavily invested in the game (sometimes just in terms of time, but occasionally both time and money) often group together in organized collaborations, usually called guilds.

Guilds pool resources and tightly schedule and organize the activities of the members. This is typically a huge advantage in MMOGs, where many players either work together only temporarily with strangers, play completely by themselves, or belong to guilds that only offer weak or fitful organization. Many MMOGs tune the gameplay so that the most difficult challenges require this level of elite coordination. The rewards for overcoming these challenges typically have an accumulative effect, allowing the elites to overcome still more difficult challenges and to easily defeat other players in direct combat or competition. The virtual goods and powers obtained through elite coordination visually distinguish the members of these guilds when their characters are seen within the public spaces of the gameworld.

As in any status hierarchy, these advantages are only meaningful if the vast majority of participants do not and cannot obtain the same rewards. So the elite guilds in some sense have a very strong incentive to keep everyone else around. A gameworld abandoned by everyone but the elite stops being fun even for them. This is especially acute when the collaboration within a heavily invested group of elite players extends to keeping their advantage over others through pooling insider knowledge about the game systems, or even to protecting knowledge about a bug or flaw in the game systems which can be potentially exploited by everyone.

To give an example, one of the “virtual world” MMOGs that I spent considerable time studying a few years ago was called Star Wars Galaxies. It was a notable turning point in the history of game design in many ways, most of them not particularly happy, but it did give players a very significant amount of control over the gameworld and had a vigorous design infrastructure in particular for allowing players to compete with each other within a virtual economy. Players could produce a wide variety of items for other players, and the very best of these in terms of the power and utility were rare, difficult to make and worth a good deal of money, especially very early in the history of the game. In order to produce the best items, a player had to spend an immense amount of time making inferior items and incrementally increasing their skills.

But early in the game there was a bug. If you knew about it, you could gain a huge amount of incremental skill increase in a very compressed amount of time. So almost immediately after the game went live, there were a small number of players who could make the very best items that conferred enormous power on the owners of those items, literally weeks before it was even possible for anyone to have gained that level of skill. Naturally the wealth they accumulated was equally disproportionate, and that advantage remained permanent, because the developers chose not to strip away that benefit after fixing the bug. By the time everyone else caught up, the early exploiters–who had shared the secret with each other but not everyone else–were essentially a permanent class of plutocrats.

It keeps happening in such games. There’s almost no point to being a new player in games like DayZ or Ark, for example, unless you’re playing on a small server with a group of trusted friends. Even if there were no hacks or exploits, the established players have such enormous advantages that any new player will find again and again that whatever time they invest in gathering resources and making weapons and shelters will be stolen by elite groups of established players. But the established players have a problem too: they need a large group of victims to invest in the game. That’s where the easiest source of wealth is for them: much better to have a hundred newbies labor for two days and to steal what they’ve made in five minutes than it is to directly compete with an equally elite and invested group of rivals. So they need to talk up how fun the game is, to establish it as a phenomenon, maybe even sometimes to show selective mercy, to offer newbies a kind of protection-racket breathing space, to treat them like an exhaustible resource. (Not for nothing do players sometimes speak of “farming” another player as well as some aspect of the gameworld.)

Why is this on my mind today? Well, for one, I’ve been working off and on over the summer on trying to write about virtual worlds. But for another, I can’t help but think about the analogies I see between these experiences and the stock market.


In the middle of a sharp downturn like this one, there are expert investors who come on the radio, the television, the Internet. “Don’t panic,” they say. “Don’t sell. You’re in it for the long haul! That’s what the experts do.”

These appearances also offer many earnest attempts to explain the underlying reasons for the downturn. “It’s China!” “It’s the emerging markets!” “It’s the price of oil dropping!” “It’s the Fed raising rates!” Some of this frantic offering of explanation seems to me to have the same reassuring intent as “Don’t panic”. It is an attempt to rationalize the change, to relate it to something real in the world. In some cases, this offers the investor (small or large) an opportunity to calculate their own risk. “Ah, it’s China. Well, China’s government will find a way to fix it, I would guess.” “Oh, it’s the emerging markets! I always thought those were fishy, I think I’ll reduce my exposure.”

In some cases, I think these explanations are a form of pressure–even blackmail–directed against governments. “Don’t raise the rates, Fed, we like that easy money–so if you do, you’ll ‘shake investor confidence’ even more, and you wouldn’t want that, would you?” We saw that back in 2008, after all: it is the logic of “too big to fail”. Do this, don’t do that, or we’ll pundit the shit out of the investment economy and create a real panic.

Scattered amid the explanations are also some earnest attempts to argue that there is no explanation, to treat the market as a naturalistic object whose behavior is beyond human agency and not well understood by human science. “We’re still not sure about dark matter, and we’re still not sure why the stock market did that.” This too is a kind of reassurance, and often is followed by the reminder not to panic. “It’ll go up again, it just does that, don’t worry.” I think there’s something to that: the 21st Century market is a cybernetic mass brain that thinks in strange ways and reacts at speeds that we have never lived with before.

What I darkly fear is what I think might be said but never is. After all, the experts say, “Don’t panic, don’t sell, you’re in it for the long haul”, but some of them panicked, or at least their high-frequency trading computers did. Sure, maybe someone else’s Skynet is buying it all, but this wouldn’t happen if it were just Mom and Pop investors getting nervous about China. And I think to myself, “This is like a guild that’s discovered a bug.” They need everyone to stay in so that they can farm them some more. They need to herd the cattle down the soothing Temple Grandin-style chutes. That some of the explanation is neither “There is a rational thing that is causing this all” nor “This is something so complex that it just does things now and again that no one understands.” That instead some of it is, “We have trouble here in River City”.

The problem is that in 1987 or in 2001 the expert could also say, “If you’re afraid, then after the next rally, move your money into a safe harbor, stay out of the market.” There is no staying out any longer. That’s the other thing that’s changed because of income inequality, because of the way the elite guilds have changed the game. Nothing’s really safe as an investment. Nothing’s really safe as a life or a career. Our institutions (and even our government, especially when it pays pensions) are part of the asset class now. If you just earn a salary and work hard, your income and prospects have gotten steadily worse in the last three decades: the investment economy isn’t just a nice hedge against the worst now, it’s the only way to stay in the middle class.

This is what elite guilds in games would do if they could: require you to play the game.

Too bad if you don’t like spending two days training a velociraptor and building a shelter in Ark only to find that when you logged off for dinner, a couple of elite hackers took your dinosaur, destroyed your shelter and locked your naked body in a cage. (You think I’m kidding, but I’m not.)

Posted in Politics | 4 Comments

We Are Not Who We Will Become

One of the things about the reaction to Alison Bechdel’s Fun Home by a small subset of incoming Duke undergraduates that is important to grasp is that I think it’s a deliberate–and possibly even coordinated–re-deployment of activism about the content of college education that’s previously come from a “left” direction, right down to the way that the students articulate how reading Fun Home would harm their identities and how they ought to have the right to choose a college education that would never compel them to experience either content or instruction that contradicts the identities that they have chosen for themselves.

There is much more embedded inside of that set of moves than just distaste for a single book or the expression of a single ideology about sexual identity, and it is a good example of why many of us worry about political tactics even when we are sympathetic to the particular concerns, feelings or aspirations of people employing those tactics. Because tactics are mobile: they’re not copyrighted or trademarked.

But it’s not just tactics that’s the issue. It’s also philosophical substance. The Christian students at Duke and left or radical students elsewhere are sometimes proposing something basically similar about themselves, and about the relationship between their sense of self and liberal arts education. They’re proposing that identity is a product of agency (whether through struggle or chosen freely) and that the content of a liberal arts education may destabilize, challenge or unsettle that choice.

I think they’re complicatedly wrong about the former assertion: not only are we not necessarily a product of our own conscious self-making, I’m not even sure that we should hold that out as an aspiration for ourselves. Some aspect of our becoming should be a mystery (and will be whether it should be). They’re not wrong about the latter: the content of a good education may in fact destabilize, challenge or unsettle what we are in ways that neither faculty nor students can anticipate. I wouldn’t even care to guarantee that in the short term this shifting or unsettling will have positive outcomes for individuals or communities. But I would still say that it ought to be done.

What unites this particular set of complaints against liberal arts education is a kind of resurgent functionalism, a belief that specific content creates specific outcomes. That classical literature creates Western domination, that Fun Home creates sexual desire and lesbianism. That “problematic” texts create predictable problematic outcomes, that knowledge has a relationship to power over people and power within people that can be known in advance of acquiring that knowledge.

The Duke Christian students may even be right in some sense, if in ignorance of what is actually inside of Fun Home. It is not that there is one panel of oral sex that they should fear, but the fact that lesbians (and a closeted gay man) are present as intimately knowable, familiar human beings. That is a danger if you require them to be unfamiliar and inhuman to sustain your own sense of self. But that might be equally the real fear of some students and activists on the left: that texts that they believe to be doing nothing but the work of oppression nevertheless contain multitudes, just as oppressors do. That to pursue liberal arts education is to live a life without guarantees, to love, or at least make peace with, our own uncertainty.

Posted in Academia, Politics, Swarthmore | 10 Comments

Joke’s On You

Here’s my contribution to the DONALD TRUMP HOW IS THIS POSSIBLE sweepstakes:

Donald Trump is polling well for the same reason Bernie Sanders is polling well.

Sort of.

They’re not at all the same in the sociology of their attraction, nor in the content of their discourse about politics and within politics. Trump’s base and Sanders’ base have no overlap at all. The specifics of what they’re saying and how they’re acting are a product of the particular subculture of their party and their constituencies. It’s perfectly correct to say that Sanders’ enthusiasts are mostly progressives fed up with the Democratic Party in general and that Trump’s reception has been fueled by ceaseless moves to a right-wing fringe, that in both cases there is a history of political sentiment and action within each party which explains what’s going on.

The thing that makes them similar, however, is that they are also the latest spiralling out of a general disaffection with the formal political systems of liberal democracy. It is not limited to the United States, for all that commenters abroad are adopting a superior air in their commentary on the buffoonery of Trump. Jeremy Corbyn might be the Labour Party leader soon for similar reasons. Silvio Berlusconi’s longevity in Italian politics despite Trump-ish behavior has something to do with the same restiveness.

People who are fundamentally inside the world of the political classes–long-time civil servants, policy-making experts, mainstream pundits, elected officials, educated elites generally–are having a hard time fully grasping the big-picture story here. We read each election cycle on its own terms, prompted by horserace journalism.

But not only are publics in most liberal democracies dismayed by the incapacity of their elected officials to do much with the sprawling, recumbent states that they theoretically command, not only are they restive about the downward spiral of their economic and social lives and the predation of the global plutocracy, they’re also tired of the screaming inauthenticity of the entire wretched system. That’s what the low approval ratings mean, first and foremost.

The old saw that insanity is doing the same thing over and over again and expecting different results applies primarily to something that’s already demonstrably failed. Folly, in contrast, is doing the same thing over and over again and ignoring every sign of its imminent failure because it worked the last time. We drove across the bridge once again this morning, so who cares if it trembled and groaned? The power plant didn’t blow up today, even though all the red lights on the console are blinking, so fire it up tomorrow just like always.

The campaign consultants keep saying, “The old forms of message discipline and voter mobilization will work eventually, just ignore the sideshow.” The pundits keep laughing or crying or getting angry with Trump (and a few with Sanders) for taking time away from serious candidates and serious issues. What I think none of them get is that the bridge is trembling and groaning. What those polled in Iowa are saying about Trump and Sanders is less about affection for the specifics of their platforms, just as the people who might vote for Corbyn are probably in some cases not all that interested in the specifics of his political views. What they recognize in all of them is that they’re real people. That what you hear from them if you go to a speech is who they really are, what they really think, how they really feel. They’re not what their handlers have told them to be, they’re not the product of some laboratory.

Trump may be an insane, clownish vulgarian with horrific and brutal views of most issues, but he is at least really an insane vulgarian. With at least most of the rest of the Republicans, it’s never very clear what they actually are. Do they really hate science or education? Really want to drown government in a bathtub? Really believe ten-year-olds should be compelled to carry a rape pregnancy to term? Who knows? They’re all just doing what they think the primary electorate will respond to. They’re awkwardly slouching out onto a vaudeville stage and asking desperately of the bored and disaffected audience, “What is it that you want to see? Do you want juggling? Burlesque? Stand-up? A guitar solo? I can try to do that.” Trump is just walking out and being himself at a party. Like him or hate him, you recognize at least that he is what he really is. Sanders, Corbyn, and so on as well.

What most of us are not seeing when we look at our leaders is people. As fewer and fewer of us are part of the elite, as downward mobility latches on to the majority of the liberal democratic publics across the world, fewer people are inside the systems that produce and maintain political elites. What we see is more like what Roddy Piper’s character in They Live saw: manipulative aliens.

This is not to say that real, unperformative humanity should give anybody hope. The system will eventually find a way to knock such people out of the running. Or people will decide sooner or later what anyone hosting a party with Donald Trump attending would eventually decide: that he’s an asshole who needs to be booted to the curb before you lose all your friends. If by some insurgent chance someone like Sanders not only got the nomination but won, he’d find that the system as a whole is unbeatable no matter how genuine his convictions might be.

At least as long as it is a system. Because that’s what the groans and trembles in the bridge really mean. Trump is less in that sense a comment on the specific madness of the current Republican Party and more a set of rivets explosively popping out in the bridge supports. Anybody who wants to keep crossing the river had better start thinking about building a better bridge.

Posted in Politics | 3 Comments

In Medias Res

Ta-Nehisi Coates tweets (approvingly, I think) that historians are “not the most hopeful bunch”.

I’ve said as much myself. Among the many problems with David Armitage and Jo Guldi’s The History Manifesto is the authors’ belief that historians once had a seat at the table of power and then lost it (in their view because we started being more like humanists and less like social scientists). Historians have played a crucial role in the making of nations and national identity since the end of the 19th Century, but we’ve never been especially welcome in smoke-filled rooms and think-tank boardrooms where policy wonks have plied their craft.

There are lots of reasons why it’s hard for historians to join those conversations in a way that doesn’t complicate or derail the assembly line. Our sense of the relationship between the passage of time and social or political action is slower, longer, more intricate. It’s hard to say with a straight face that if only you make this regulation or announce this initiative that something’s going to change right away. We know how rare it is for intention to match outcome. We’ve seen it all before. We know that when things change for the better, it’s often due most to people who are also not at the table of people earnestly proposing and implementing solutions. And so on.

Which might suggest that if you have students who want to change the world, directing them to the study of history is just going to be an endless parade of deflation and disappointment. Like almost all historians I know, I think that’s not true. There’s the obvious, frequently made point that while history may not provide a ready-made solution, it does provide a much richer, more complicated understanding of where we are and how we got to this point. Trying to act without a historical understanding is like trying to be a doctor who never does diagnosis. Maybe every once in a while you’ve got a patient in the emergency room where you don’t need to know what happened because it’s obvious, and all you need to do is act–staunch the bleeding, bandage the wound, amputate, restart the heart. Usually though you really need to know how it happened, and what it is that happened, if you want to do anything at all to help.

I’m going to suggest there’s another reason to study history if you want to do something to change the world, and it’s something that applies especially to the rising generation of activists. The specific content of historical study offers a diagnosis of the present, and it also often offers a sense of the alternate possibilities, the turns and contingencies that could shape the future. But cultivating a historical sensibility is also an important warning that any time you act, you’re joining a story that’s already in progress.

This is a warning that falls from the lips of older people with distressing ease, because even if we don’t study history, we’ve lived it. We know just from experience what’s come immediately before the present. That knowledge sometimes blinds us, both to the ways the present might be genuinely new and to the degree to which the third (or more) time is the charm: even if events unfold once again as they have in the past, that repetition is sometimes enough to carry weight of its own.

So be wary about the injunction to think about precedent, but still think about it, and in particular think about it if you want to fight to make a change in the world. Because it’s crucial to know whether other people have fought for that change before, and especially to know whether they’re fighting even now. And it’s equally crucial not to take the absence of apparent victory as a sign of their failure or insufficiency, as a justification for the next generation to just grab the steering wheel.

I’ve talked before at this blog about reading grant applications, for example, from recent undergraduates hoping to pursue a project in another country. Again and again, I’ve seen many of these applicants, especially those seeking to go to African countries, act as if they are the first person to ever think of tackling a given problem or issue in that country. As if there’s no one there who has ever done it, and as if there’s no one here who has ever gone there to do it. You could write this off as simply ugly Americanism, but it’s only a more specific example of a generally weak devotion to thinking historically, to placing one’s own story, one’s own aspirations, within a story already in motion.

In almost every cause or struggle, in almost every community and institution, there are people who have been trying to do what you think should be done. They’ve almost certainly learned some important things in the process, and very likely have more at stake in those struggles than you do if you’re a newcomer, a traveller, a visitor. Thinking historically is the key to remembering to look for those predecessors before you start, and it’s a key to remembering to take them seriously rather than just look them up as a kind of pro forma courtesy before you get back to doing your own thing.

Almost nothing genuinely begins with your own life. Rupture and newness are a very small (if important) part of human experience. Yes, being mindful that you’re just the latest chapter in an ongoing story is humbling and a bit inhibiting, and another reason for historians to not be “the most hopeful bunch”. But it is better to live in conscious humility than blithe confidence, at least if you genuinely think that progress is possible. There is no need to steal Sisyphus’ boulder just so you can start fresh from the bottom of the hill.

Posted in Academia, Politics | 2 Comments

Performing the Role

The short summary of the way that UIUC’s administrative and board leadership (and some of their closest faculty supporters) handled their reaction to Steven Salaita is that they screwed up and that serious professional consequences are completely appropriate.

And not just that they screwed up in “handling the fallout”, as if this is a question merely of public relations tactics. They screwed up substantively, philosophically, in terms of fundamentals. The archive of emails now available for critical examination documents that error and how pervasive and systematic it was. Chris Kennedy’s interventions in particular are almost textbook examples of what academic freedom as an ideal is meant to prevent: a prejudicial, ideologically-derived attempt to target particular individual scholars using ad hoc standards that are not (and should not be) imposed on the rest of the faculty.

Until Steven Salaita himself says that he’s satisfied with whatever settlement UIUC offers, whether that is rehiring him or some other compensation, I would urge other academics to continue refusing to do service for UIUC as an institution. I know that imposes a burden on the many great faculty at UIUC by isolating them but I think it’s important to keep the pressure on. UIUC has more work to do in any event than settling with Salaita. And it’s not just UIUC that has these problems.

I do have two modest reservations about some of the responses to the email releases by academic critics. The first is that I don’t know that we should exult overly much about the release of the emails. UIUC’s leadership is ultimately responsible for creating the circumstances in which the release had to be sought through legal means, and thus is ultimately to blame for whatever larger consequences this might have. But the use of legal mechanisms to probe into the professional communications of faculty and staff at public universities has already been abused for political ends in the last decade and I fear this is only going to recommend that tactic further. We shouldn’t be too blithe about telling colleagues at public universities that they’ll just have to meet in person more, use the phone more, stick to their personal accounts more, and so on. That creates yet another kind of large-scale structural inequity for public institutions in a landscape increasingly full of such inequities. The acceleration of many work processes through electronic communication is a mixed blessing, but I personally have no longing at all for laboriously printing out recommendation letters, grant applications, dossiers, and many other kinds of professional labors that I handle at least partly through email. I also find it very valuable to get quick takes on institutional questions from colleagues via email and yes, sometimes to exchange cathartic observations about the week’s business with trusted colleagues.

The second reservation is more complicated, and has to do with the hostile commentary being directed at Phyllis Wise’s faculty confidants and to some extent Wise herself. I’m struggling to figure out how to express this feeling, because there are a lot of inchoate things bundled inside it. The place to start might be this: I think some of my colleagues across the country are potentially contributing to the creation of the distanced, professionalized, managerial administrations that they say they despise, and they’re doing it in part through half-voiced expectations about what an ideal administrator might be like.

Occasionally folks in my social media feeds articulate a belief in faculty governance that has a sort of unexamined wash of nostalgia in it. That we had it all in the good old days and lost it, either to some kind of ‘stab in the back’ or through our own inattention or mistakes. (‘Stab in the back’ narratives generally worry me no matter what the circumstances, because they usually inform a politics that’s one part ressentiment and one part scapegoating.) Sometimes the same folks believe that if only faculty were in charge of everything (whether that’s “once again” or “for the first time”) the university would be working again as it ought to.

Now when I push a bit on that sentiment, it’s usually not hard to get the same critics to concede that there are a host of specialized professional jobs that have to be done in contemporary universities which can’t be done by just any Ph.D.-holding person who walks in the door. So the conversation refocuses. Who’s the problem, in this view? Basically the upper leadership hierarchy, especially at large corporatized universities that have added numerous vice-presidential positions to their administrations in the last decade. These are the administrators that faculty critics believe are either managing portfolios that no one needs managed or exercising forms of leadership that faculty are capable of providing on their own through their traditional structures of governance.

I agree completely that many institutions, especially large universities, have created administrative positions that are redundant or unnecessary. I’m not sure I agree with the idea that administrative leadership per se is largely unnecessary, nor do I think even many critical faculty really believe that–and it shows in some of the contradictory edges around the critical response to the Salaita affair.

First, you don’t have to go very far into the discussions and debates on social media about UIUC to find that faculty who believe in the sufficiency of faculty leadership don’t actually trust many other faculty to participate in governance or leadership. Most notably, there’s an undercurrent of debate about why many STEM faculty at UIUC either endorsed the administrative leadership or were indifferent to the issue–and one common explanation is that STEM faculty are already in thrall to the corporatist university or have actively connived in its making. Which means suddenly that the putatively capable-of-self-governance faculty have been pared down to “just the humanists and social scientists, and maybe not even all of the folks in the latter group”. Which is sort of like saying that you believe in democracy as long as it’s just the people who share your politics who get to vote. Additionally, there’s a lot of contempt directed at the faculty who were exchanging emails with Wise, who are seen as collusive. But any self-governing faculty is going to have people whose genuinely held views of institutional policy resemble the positions now commonly taken by administrative leaders. If Nicolas Burbules had no vice-chancellor to seek favor from, it’s possible that he (or someone like him) would still think the same way and drive deliberation in that direction. Certainly there will be Cary Nelsons on every faculty, aggressively expressing their views in every forum and meeting and doing in governance what Internet trolls often do in online discussions: driving the conversation towards more extreme or narcissistic terms.

Ultimately I think that the people who believe we can do it all on our own know that sooner or later we would all be desperate to delegate some of the responsibility for institutional leadership to appointed individuals, to not have to sit in shared deliberative session and endure an endless plague of Nelsons trying to cat-herd us towards whatever precipice they favor. In a sense, I think every faculty member who has held any sort of administrative responsibility is familiar with exactly how this works: colleagues who believe they should have a say in everything also want someone else to handle all the tedium of acting on all the contradictory imperatives that emerge out of deliberative process.

Moreover, most of us turn out to want at least some of the sausage-making involved in the life of an academic institution to happen with some kind of confidentiality. Even the most radical demands for transparency (and I’m usually one of those inclined to such) balk at doing everything out in the open. Tenure cases are only one part of a larger landscape of necessary judgment and assessment of the professionalism and practice of other professionals in a university. That’s what believing in self-governance means! Professionals often assert that only they can judge other professionals, that this is a prerogative of their training. Ok, but if that means, “And by the way, everybody who has the necessary minimal qualifications to be a professional is definitionally ok in our eyes for life, and everything we’re presently doing is exactly what we should go on doing forever”, then that’s doing it wrong. Even if we banished the spectre of neoliberal austerity, we’d still need to ask, “Are we doing what we should be doing? Are there things we should stop doing?” We’d still need to think about whether there are changes worth pursuing–say, the academic equivalent of Atul Gawande’s “checklist” reform in hospitals. At least the initial stage of many of those conversations is not something I want to be broadcasting to the largest possible audience in the most indiscriminate way. That too is something that I think we turn to “administration” or something like it to accomplish.

I think here is also where Wise’s critics occasionally end up with some strangely unreal implicit expectations of administrative decorum, a vision of leadership performativity that implicitly envisions administrators as more distant, more isolated, less human than the rest of us. For one, I almost feel as if people are expecting Wise to have had discretionary agency where I’m not sure she did or could–where I don’t know that any of us, faculty or administration, do. I think it’s reasonable to have expected Wise to tell Kennedy, for example, that his desired intervention into the Salaita case was unwise and unwelcome and that she would not do it. I don’t think it’s reasonable to expect, as I feel I’ve seen people expect, that she should have excoriated him or confronted him. I think we somehow expect that administrative leaders should be unfailingly polite, deferential, patient, and solicitous when we’re the ones talking with them and bold, confrontational, and aggressive when they’re talking to anyone else. We seem to expect administrative leaders to escape structural traps that we cannot imagine a way to escape from. There’s a lot of Catch-22 going on here.

We as faculty all have confidants, people we can talk to who help us work through our choices and our feelings. I would guess that most of us turn to people who are going to make us feel better, support us, reassure us. Ideally we should also have friends or trusted colleagues who will be honest with us, who will tell us when we’re making mistakes, but there are days when I suspect even the most iron-willed and psychologically robust person is not looking for that.

And that’s just when we’re rank-and-file people. Imagine anyone in the role that Wise plays, anyone at all. Pick someone with your exact convictions. Pick yourself. Are we really expecting that the person in that role ought to listen judiciously, patiently and indiscriminately to every single person on their faculty with perfect equity and equanimity? We seem to desire leaders who are able to say bluntly what we ourselves cannot or would not say and to mobilize institutional power with executive force in ways that we cannot, and also desire leaders whose job it is to serve as a kind of infinitely passive psychic dumping ground, to receive every grievance and grudge within the institution without blinking. To decide what we know we can’t decide and to have never decided any such thing and to disavow any intent to make such decisions. To me that’s another kind of managerialism: the administrator as something other than fully human, needing to perform a professionalism that removes rather than connects them.

Posted in Academia | 1 Comment

Yes, We Have “No Irish Need Apply”

Just came across news, via @seth_denbo on Twitter, of the publication of Rebecca Fried’s excellent article “No Irish Need Deny: Evidence for the Historicity of NINA Restrictions in Advertisements and Signs”, Journal of Social History (2015).

First, the background to this article. Fried’s essay is a refutation of a 2002 article by the historian Richard Jensen that claimed that “No Irish Need Apply” signs were rare to nonexistent in 19th Century America, that Irish-American collective memory of such signs (and the employment discrimination they documented) was largely an invented tradition tied to more recent ideological and intersubjective needs, and that the Know-Nothings were not really nativists who advocated employment (and other) discrimination against Irish (or other) immigrants.

Fried is a high school student at Sidwell Friends. And her essay is just as comprehensive a refutation of Jensen’s original as you could ever hope to see. History may be subject to a much wider range of interpretation than physics, but sometimes claims about the past can be as subject to indisputable falsification.

So my thoughts on Fried’s article.

2) This does really raise questions, yet again, about peer review. 2002 and 2015 are different kinds of research environments, I concede. Checking Jensen’s arguments then would have required much more work of a peer reviewer than it would more recently, but I feel as if someone should have been able to buck the contrarian force of Jensen’s essay and poke around a bit to see if the starkness of his arguments held up against the evidence.

3) Whether as a peer reviewer or scholar in the field, I think two conceptual red flags in Jensen’s essay would have made me wary on first encounter. The first is the relative instrumentalism of his reading of popular memory, subjectivity and identity politics. I feel as if most of the discipline has long since moved past relatively crude cries of “invented tradition” as a rebuke to more contemporary politics or expressions of identity to an assumption that if communities “remember” something about themselves, those beliefs are not arbitrary or based on nothing more than the exigencies of the recent past.

4) The second red flag, and the one that Fried targets very precisely and with great presence of mind in her exchanges with Jensen, is his understanding of what constitutes evidence of presence and the intensity of his claims about commonality. In the Long Island Wins column linked to above, Jensen is quoted as defending himself against Fried by moving the goalposts a bit from “there is no evidence of ‘No Irish Need Apply'” to “The signs were more rare than later Irish-Americans believed they were”. The second claim is the more typical sort of qualified scholarly interpretation that most academic historians offer–easy to modify on further evidence, and even possible to concede in the face of further research. But when you stake yourself on “there was nothing or almost nothing of this kind”, that’s a claim that is only going to hold up if you’ve looked at almost everything.

I often tell students who are preparing grant proposals to never ever claim that there is “no scholarship” on a particular subject, or that there are “no attempts” to address a particular policy issue in a particular community or country. They’re almost certainly wrong when they claim it, and at this point in time, it takes only a casual attempt by an evaluator to prove that they’re wrong.

But it’s not just that Jensen is making what amounts to an extraordinary claim of absence; it’s also that his understanding of what presence would mean or not mean, and the crudity of his attempt to quantify presence, are at issue. There may be many sentiments in circulation in a given cultural moment that leave few formal textual or material signs for historians to find later on. Perhaps I’m more sensitive to this methodological point because my primary field is modern Africa, where the relative absence of how Africans thought, felt and practiced from colonial archives is so much of a given that everyone in that field knows not to overread what is in the archive and not to overread what is not in the archive. But I can only excuse Jensen so far on this point, given how many Americanists are subtle and sensitive in their readings of archives. Meaning, even if Jensen had been right that “No Irish Need Apply” signs (in ads, in doors, or wherever) were very rare, a later collective memory that they were common might simply have been a transposition of things commonly said or even done into something more compressed and concrete. Histories of racism and discrimination are often histories of “things not seen”.

But of course as Fried demonstrates comprehensively, that’s not the case here: the signage and the sentiment were in fact common at a particular moment in American history. Jensen’s rear-guard defense that an Irish immigrant male might only see such a sentiment once or twice a year isn’t just wrong, it really raises questions about his understanding of what an argument about “commonality” in any field of history should entail. As Fried beautifully says in her response, “The surprise is that there are so many surviving examples of ephemeral postings rather than so few”. She understands what he doesn’t: that what you find in an archive, any archive, is only a subset of what was once seen and read and said, a sample. A comparison might be to how you do population surveys of organisms in a particular area. You sample from smaller areas and multiply up. If even a small number of ads with “No Irish Need Apply” were in newspapers in a particular decade, the normal assumption for a historian would be that the sentiment was found in many other contexts, some of which leave no archival trace. To argue otherwise–that the sentiment was unique to particular newspapers in highly particular contexts–is also an extraordinary argument requiring very careful attention to the history of print culture, to the history of popular expression, to the history of cultural circulation, and so on.
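The extrapolation step in that survey analogy is simple enough to sketch. Purely as an illustration of the logic (every number here is invented, not drawn from Fried or Jensen):

```python
# Illustrative sketch of the plot-sampling analogy: estimate a total
# population from counts in a few sampled plots. All figures invented.
total_area = 10_000          # square meters in the whole survey area
plot_area = 100              # square meters per sampled plot
plot_counts = [3, 5, 2, 4]   # organisms counted in each sampled plot

# Mean density per square meter across the sampled plots
density = sum(plot_counts) / (len(plot_counts) * plot_area)

# Scale the sampled density up to the whole area
estimated_total = density * total_area
print(round(estimated_total))  # 350
```

The historian's inference about ephemeral postings runs the same way: the handful of surviving ads are the sampled plots, and the population being estimated is everything once posted, said, and read that left no archival trace.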

Short version: commonality arguments are hard and need to be approached with care. They’re much harder when they’re made as arguments about rarity or absence.

5) I think this whole exchange is on one hand tremendously encouraging as a case of how historical scholarship really can have a progressive tendency, to get closer to the truth over time–and it’s encouraging that our structures of participation in scholarship remain porous enough that a confident and intelligent 9th grader can participate in the achievement of that progress as an equal.

On the other hand, it shows why we all have to think really carefully about professional standards if we want to maintain any status at all for scholarly expertise in a crowdsourced world. I’ve said before that contemporary scholars sometimes pine for the world before the Internet because they felt safe that any mistakes they made in their scholarship would have limited impact. If your work was only read by the fifty or so specialists in your own field, and over a period of twenty or thirty years was slowly modified, altered or overturned, that was a stately and respectable sort of process and it limited the harm (if also the benefit) of any bolder or more striking claims you might make. But Jensen’s 2002 article has been cited and used heavily by online sources, most persistently in online debates, but also at sites like History Myths Debunked.

For all the negativity directed at academia in contemporary public debate, some surveys still show that the public at large trusts and admires professors. That’s an important asset in our lives and we have a serious collective interest in preserving it. This is the flip side of academic freedom: it really does require some kind of responsibility, much as that requirement has been subject to abuse by unscrupulous administrations in the last two years or so. We do need to think about how our work circulates and how it invites use, and we do need to be consistently better than “the crowd” when we are making strong claims based on research that we supposedly used our professional craft to pursue. It’s good that our craft is sufficiently transparent and transferable that an exceptional and intelligent young person can use it better than a professional of long standing. That happens in science, in mathematics, and other disciplines. It’s maybe not so good that for more than ten years, Jensen’s original claims were cited confidently as the last word of an authenticated expert by people who relied on that expertise.

Posted in Academia, Oath for Experts, Production of History | 14 Comments