Production of History – Easily Distracted
https://blogs.swarthmore.edu/burke
Culture, Politics, Academia and Other Shiny Objects

When In Truth Did We Win Anything?
https://blogs.swarthmore.edu/burke/blog/2017/08/14/when-in-truth-did-we-win-anything/
Mon, 14 Aug 2017 19:35:55 +0000

Progress is dead.

In the same sense that Nietzsche spoke of the death of God, only to be habitually misunderstood by the same kinds of people that misunderstand Einstein saying that God does not play dice with the universe. The question Nietzsche had was how it might be possible to retain some consistent vision of values or ethics in the absence of a belief in God as the unquestioned authority over such values. The whole point was to find some deeper, more robust way to sustain those values.

So what “progress” is it that has died? The kind that people–primarily white, educated and liberal people–told themselves had already been accomplished and would inevitably continue to be accomplished. Progress as slightly smug self-congratulation is dead. Progress as the accomplished work of an earlier generation of almost mythical heroes is dead. Progress as irreversible is dead.

The aspirational content of progress is not dead, any more than ethics and morality died with “God”. We just have to find a deeper way to work for those aspirations, and never assume that they are final, finished work if they appear, however briefly, to be an animating part of our public institutions and civic lives.

So what does this mean as a revision of the more smug style of telling the history of the modern world? It does not mean that we must tell the opposite history: that the last two centuries have been a never-ending catastrophe of anti-progress, that nothing has ever changed, that a nightmare that began in 1492 has continued uninterrupted and undifferentiated ever since. That is the same kind of nihilism that Nietzsche was desperate to avoid as the concept of God lost its status as the secure guarantor of moral claims.

We have no grounds for complaining about the failures of our present if we did not somewhere develop an understanding of what a better world would be like. That understanding has risen out of experience and experiment, out of actions taken and institutions remade. It has been and remains real. If we tell ourselves that nothing has ever changed, we are also telling ourselves, whether we mean to or not, that nothing ever can change.

The weariness that is settling over most of us–even people who long have been bowed under by the weary awareness that the promise of progress has never been fulfilled–is because we now know that anything that does change can be changed back again. Slavery was abolished, but it can be resurrected. In corners and shadows in our world, it has been. One form or another of legal racism has been edited out of the laws, but it either marches on regardless of the law or the law falls into the hands of people who would perpetuate racism. One group of people arises who reject injustice, but another group finds their way to injustice and they baptize themselves in its foul pools. There are no procedures or rules or systems that prevent the renewal of social evil. There is no philosophy or belief which is self-proving and secure against its half-hearted adoption by insincere and doubtful adherents.

Trying to figure out what in the human past is so thoroughly past that it will never come again is a fool’s errand. Trying to think of the past as an atavism that erupts somehow into a present full of progress is equally foolish. We don’t carry a terrible past inside of us like a parasite. We make new futures of terror and beauty from what we have been, but also from what we are. There’s always a new way to be terrible. The torch-bearers of Charlottesville are not mocking ghosts who can only briefly haunt the living. They are terrible children, familiar fathers, the man next door, the face behind the counter or the voice on the phone. New and urgent, but also known burdens, the rock that we sisyphi push up the hill and that veers to crush some of us–always the same some–reliably and repeatedly on its way back down.

Progress is not a machine programmed to arrive at a predestined utopia. It is not an arc that bends towards justice like the rain falling to the force of gravity. It is a twisting road we must walk in a never-ending maze of twisting roads. We walk it because we ought to, not because we’ve been given assurances of getting to the other side.

On Confederate Counterfactuals
https://blogs.swarthmore.edu/burke/blog/2017/07/24/on-confederate-counterfactuals/
Mon, 24 Jul 2017 18:32:11 +0000

For some years, I’ve taught a course on counterfactual history. Unlike many scholarly historians, I find counterfactual history useful for a variety of reasons.

For one, I accept the argument that a number of its proponents have made that all arguments about historical causality are at least implicitly counterfactual, and that those claims can often be made more effectively if the counterfactuals are explored more explicitly. For the same reason, I think all claims about the contingent nature of historical events and about human agency in history require at least some acknowledgement of counterfactual possibilities.

I also think there are some humanists who’ve done an important job of asking about the emotional and philosophical meanings of certainty in history, about why we’re sure that certain key events or long-term narratives are inevitable or necessary. Often our need to see certain things as highly deterministic derives less from evidence or analysis and more from a sense of our contemporary politics or values, a sense that to acknowledge certain contingencies or uncertainties in the past is to make something in the present more fragile than we wish it to be.

I also just think counterfactuals are interesting and enjoyable and that is sufficient justification for pursuing them. I’m glad to turn E.H. Carr’s famous denunciation of the counterfactual as a “parlour game” on its head and see that as an endorsement. Counterfactuals and historical fiction both challenge the limits of historical scholarship and force historians to recognize that there are other ways of knowing, imagining and making use of the past that may require other practices of imagination and interpretation than the traditional approach favored by historians since the late 19th Century.

That said, the striking thing about actual counterfactual writing is not its imaginative character but instead how cramped and fetishized much of it is. A vast percentage of it, both by fiction writers and by scholars who’ve taken a stab at it, concerns a small handful of famous battles, a small handful of famous white male leaders, and a smattering of familiar and very Eurocentric events. Niall Ferguson, in his introduction to the anthology Virtual History, seems to think that this narrowness of focus is one of the things that recommends counterfactuals as a scholarly exercise. (He makes a fairly tortured argument that counterfactual writing is a salutary poke in the eye to Marxist-inflected social history and must concentrate on a small subset of historical actors where we have explicit evidence that they consciously contemplated several courses of action before undertaking one of them.)

One reason I think it’s worth pushing counterfactuals more generally is to ask what counterfactuals written outside of that cramped space might look like, and why we might be reluctant in some cases to undertake them. If I try to write a counterfactual analysis of the “scramble for Africa” of the late 19th Century, I immediately confront some pretty serious conceptual, political and intellectual challenges. If I confine my counterfactual to Bismarck or Cecil Rhodes or Joseph Chamberlain or David Livingstone, I’m just reproducing the old Eurocentric narratives that claim that the conquest of sub-Saharan Africa was just a kind of epiphenomenal side-effect of European history decided upon by famous male leaders. If I try to write a counterfactual where African agency produces a different substantive overall outcome, I’m in danger of “blaming the victims”, of imagining that Africans could have stopped colonialism if they’d only done something other than what they did. (And if I try to do that, I’m also up against serious limits to plausibility and accuracy, since there really doesn’t seem to have been an overall possibility of a different outcome from collective or sustained action by Africans, just variations in local outcomes.) If I argue that colonialism was completely deterministic and inevitable and no counterfactuals are possible, I put in jeopardy a whole series of nested assumptions about the moral responsibility of imperial leaders and European nations. But these all seem like valuable conversations to have, and if asking about counterfactuals as a possibility helps push them forward, good.

The other way to think about the cramped space that most counterfactuals live in is to ask why they’re so uncreatively confined to a narrow range of conjectures about what-might-have-been. So let’s take one of the two stock counterfactuals, namely, “What if the South had won the Civil War?”, which the producers of Game of Thrones have announced will be the basis of their next series. Not surprisingly, and to my mind completely justifiably, this has produced a lot of dismayed chatter on social media.

Partly it’s because Benioff and Weiss don’t by their own admission have much knowledge about this extremely crowded field of counterfactual writing. “I read a book by Shelby Foote” does not inspire confidence. If nothing else, I’d tell them to hire some researchers stat so that they don’t end up being sued by one of about thirty authors for pretty much rehashing an existing might-have-been story. Maybe they should even option one or more of those stories: Bring the Jubilee might work pretty well. (But please god, not that awful goddamn Harry Turtledove book.)

The deeper problem is that for a subject that receives this much attention, the range of counterfactuals is narrowly confined to essentially nostalgic takes on the antebellum South, to the point of being a kind of odd side branch of Lost Cause thinking. There are exceptions, but not many. They’re also generally obsessed with battlefield analysis, Gettysburg in particular, and Pickett’s Charge even more particularly.

If you really thought about it, here are some other counterfactuals about the Civil War that are at least as plausible as the more typical, “The South wins and either becomes a racist nightmare dystopia that dominates the North or it becomes a genteel civilization that eventually slowly emancipates the slaves and makes racial peace”.

1. The North imposes a genuinely tough and unforgiving form of military occupation and sees Reconstruction through more thoroughly until it’s finished, resulting in an America with more racial justice and with a South that is fully reintegrated into the Union, more along the lines of post-1950 Germany or Japan. Nobody writes that one up, but it’s not completely without plausibility, nor is it without appeal. (Counterfactual fiction has a somewhat understandable aversion to writing about outcomes that were far better than the real world because of the loss of dramatic potential, but there are good examples of engaging stories that follow that path.)

2. A US where slave revolts became widespread after Harper’s Ferry (or at some earlier moment), leading to an overall collapse of public order in some slave states and subsequent federal intervention, eventually leading to emancipation without a Civil War.

3. A US where the South secedes and the North decides to let them secede but also overturns Dred Scott, encourages fugitive slaves, and closes the border to the South and prevents westward movement. The South becomes an impoverished shithole banana republic and in the early 20th Century begs for readmission to the Union.

4. A South that is permitted to secede that then wages war on Cuba, Puerto Rico, Central America and Mexico to try and secure more territory for slavery and eventually loses in a series of border conflicts, including the re-annexation of most of Texas.

5. A South which successfully sues for favorable peace after Gettysburg only to fall to a socialist revolution in the early 20th Century due to an alliance between slaves, freedmen, small landholders and industrial laborers against the old plantation class.

See, the thing you discover is that whether you’re doing fiction or you’re trying to make a careful counterfactual argument that is somewhat scholarly in nature, almost all “The South and the Civil War” counterfactuals are captive to the Lost Cause and are deeply solicitous of Southern white manhood–of the need to compliment the honor and dignity of Confederate soldiers, to affirm the legitimacy of the Confederate cause, to treat the Civil War as a noble conflict between brothers, and so on. But there are so many other stories that could be told–or conjectures that could be made. (And have been made, at least by some scholars of Reconstruction.)

So if Benioff and Weiss keep going with this, I really urge them to leave Shelby Foote behind. If they really must do this, try something else that’s really provocative for a change. I think a series where an independent South is a horrific failed state or a series where Reconstruction is genuinely harsh to good ends also would get people talking, and for once, the provocations would be aimed in a different direction than they habitually are.

Enrollment Management: The Stoic’s Version
https://blogs.swarthmore.edu/burke/blog/2016/09/15/enrollment-management-the-stoics-version/
Thu, 15 Sep 2016 17:24:18 +0000

I have had a few interesting conversations with colleagues online about recent news of falling enrollments in college history courses nationwide, conversations which broadly echo similar discussions among faculty in other disciplines about the same phenomenon in their classes.

Speaking generally, two things tend to strike me about these recurrent discussions. The first is that many faculty make extremely confident assertions about the underlying causes of shifting enrollments that are (at best) based on intuitions, and moreover, these causal theories tend to be bleakly monocausal. Meaning that many faculty fixate on a single factor that they believe is principally responsible for a decline and dig in hard.

The second is that the vast majority of these causal assertions are focused on something well beyond the power of individual history professors or even departments of history (or associations of historians!) to remedy.

Just to review a range of some of the theories I’ve encountered over the last two years of discussion, including recently:

a) It’s a result of parental and social pressure for utility and direct application to viable careers.
b) It’s a result of admitting too many students who are interested in STEM disciplines. (Which is sometimes just relocating the agency of point #a.)
c) It’s a result of badly designed general education requirements that give students too much latitude and don’t compel them to take more history or humanities.
d) It’s a result of too many AP classes in high school, which gives students the idea that they’ve done all the history they might need.
e) It’s a result of bad or malicious advising by colleagues in other departments or in administration who are telling students to take other subjects.

At best, if these are offered as explanations which are meant to catalyze direct opposition to this hypothesized cause, they lead professors far away from their own courses, their own pedagogy, their own department, their own scholarship, all of which are vastly easier to directly affect and change. At worst, these are forms of resignation and helplessness, of going gentle into that good night.

It might not be completely useless to engage in public argument about why history actually is useful in professional life or in the everyday lives of citizens. Or to argue against the notion that we measure subjects in higher education according to their immediate vocational payoffs. All faculty at liberal-arts institutions should be contributing to making that kind of case to the widest possible publics. However, argument in the general public sphere about these thoughts is less immediately productive in engaging enrollments than similar arguments made to actual students already matriculating at the home institutions of historians. Those students are knowable and are available for immediate consultation and dialogue. What they think about history or other humanities may not be what a far more abstract public thinks. They may be seeking very particular kinds of imagined utility which a historian could offer, or simply need to have some ideas about how to narrate the application of historical inquiry to other spheres and activities.

Complaining about requirements, about advising, or about AP classes is similarly distracting. Changing general-education requirements is a particularly dangerous answer to an enrollment problem for a variety of reasons. Compelling students to take a course they not only do not want to take but actively oppose taking is very likely to contribute to even greater alienation from the subject matter and the discipline overall, unless the subject matter and the pedagogy are of such overwhelming value that they singlehandedly reverse the initial negative perception. Moreover, there’s a game-theoretic problem with using requirements as an instrumental answer to enrollment shifts, which is that in a faculty organized around departments, this leads to every department with declining enrollments demanding new requirements specifically tailored to enrollment capture, which in turn forces departments which are the beneficiaries of stronger enrollment trends to weaponize their own participation in curricular governance and defend against a structure of requirements that takes students away from them. Like it or not–and I think we ought to like it–student agency is an important part of most of higher education, and indispensable in liberal-arts curricula especially. The only coherent alternative to a curriculum predicated on student choice is either an intellectually coherent and philosophically particular approach like that of St. John’s College or a core curriculum that is not departmentally based but is instead designed and taught outside of a departmental framework. Asking for new requirements is a way to avoid self-examination.

That’s generally the problem I have with these kinds of explanations. They take us away from what we can meaningfully implement through our own labor, but also they allow us to defer introspection and self-examination. If current students find the traditional sequencing of many college history majors uncompelling, whether because they have taken AP courses or because they don’t find the typical geographic and temporal structures useful, there is nothing about that sequence which is sacred or necessary. History is not chemistry: one does not have to learn to use Avogadro’s number and basic laboratory techniques in order to progress further in the subject. Maybe thematic courses taught across broad ranges of time and space are more appealing. Maybe courses that connect understanding history to contemporary life or issues in explicit ways are more appealing. Maybe courses that emphasize research methods and digital technologies are more appealing. Maybe none of the above. But those should be the only things that historians in higher education are concerned with when they worry about enrollments: what are we doing that’s not working for our actually-existing students? Could we or should we do other things? If we refuse to do other things because we believe that what we have been doing is necessary, what is it that we have been doing that’s necessary, and why is it important to defend regardless?

Historians should be (but generally aren’t) especially good at thinking in this way because of our own methodological know-how and epistemological leanings. If it turns out that what we are inclined to treat as natural and necessary in our current curricular structures and offerings is in fact mutable and contingent simply by comparison with past historical curricula, then when is it exactly that we became convinced of the necessity of those practices? And what was the cause of our certainty? If it turns out that what we defend as principle is in fact just a defense of the immediate self-interest of presently-laboring historians, then our discipline should itself help us gain some necessary distance and perspective about our interests.

Especially if it turns out that our perception of our interests is in fact harming our actual self-interest in remaining a viable part of a liberal-arts education. Perhaps the first, best way historians could demonstrate the usefulness of our modes of inquiry is by using them to understand our present circumstances better and imagine our possible futures more clearly. Even if we want to insist that lower enrollments should not by themselves resolve questions about the allocation of resources within academia (a position I agree with), we might find that there are new ways to articulate and explain that view which are more persuasive in the present rather than simply invoked as an invented tradition.

“Dates Back Millennia”
https://blogs.swarthmore.edu/burke/blog/2016/01/13/dates-back-millennia/
Wed, 13 Jan 2016 16:13:31 +0000

You know, I have less of an unqualified hatred for the “dates back millennia” line than I used to. I’m thinking this as I see my feed fill up with friends and colleagues complaining about Obama’s use of it in his speech to talk about the Middle East. To some extent, historians overreact to its use by politicians for two separate reasons.

The first is that of course it’s factually wrong and not at all innocently so. Which is to say that this line of explanation, whether offered as a quick throw-away or as a substantive claim, looks away from the history of the 20th Century and the very decisive role played by European colonialism and post-WWII American intervention in structuring many supposedly “ancient hatreds”. In the case of Israel-Palestine, that is particularly convenient for the United States (and for Zionists), because the precise way in which the state of Israel came into being, and the ways in which the current states of the Middle East were brought into the geopolitics of the Cold War, are the major and direct causal underpinnings of contemporary conflicts. It runs from mature responsibility and from genuine analytic understanding all at once.

The second reason for the reaction is that invoking “ancient hatreds” is not only a misdirection of attention; it also naturalizes conflicts in the bodies and minds of the combatants. It’s a kind of shrug: what can one do? But it also turns more to psychology than history as the toolset for thinking through current politics, which is at best futile and at worst creepy.

So why do I qualify my dislike? First I think among historians we all recognize that there’s a strong turn to the modern and contemporary among our students and our publics, a presentism that most of us criticize. But I think in moments like this, we contribute some to that presentism. We should leave a door open for times before the 20th Century to matter as causal progenitors of our own times and problems. Sure, that argument has to be made carefully (shouldn’t all historical arguments be thus?) but I actually think all of the past is weighing on the present, sometimes quite substantially so. “Ancient hatreds” isn’t quite the right way to put it, but there are aspects of conflict in the Middle East which do genuinely derive structure or energy from both the Ottoman period (early and late) and from times before that.

It’s also that I think we end up getting angry at politicians who are trying to kick over the traces of their own government’s recent historical culpability but in so doing forget that there are many other actors who also believe and are motivated by the supposed antiquity of their actions. On some level, if they do think so, we ought to at least listen carefully and not quickly school-marm them about why the experts hold that they’re wrong. Authenticity is a strange twilight realm. If people believe that they are upholding something ancient, that has a way of becoming true enough in some sense even if they’re wrong about the history between them and that past moment and wrong about what the ancient history really was. It might be easier simply to focus on the culpability of some states and actors for the current situation and leave aside compulsively correcting their history in some cases.

But finally, as long as we’re talking culpability, the one problem with always, invariably locating conflict and hatred as having their most relevant origins in Western colonialism and in the decisions made during post-WWII decolonization is that we risk having our own version of a distraction from uncomfortable truth. As I noted, maybe sometimes there really is something older at play. There’s a really great book that the historian Paul Nugent wrote about the Ghana-Togo borderlands in West Africa which argues that, counter to the common trope that the Berlin Conference simply and arbitrarily created random and incoherent borders, the border there was both reflective of older 19th Century histories and fashioned in considerable part by the communities in the borderlands themselves. More uncomfortably, maybe sometimes there’s something far more recent and contingent at play–maybe sometimes in current global conflicts even our preferred causal story is an “ancient conflict” of little real empirical relevance to combatants, who are instead being put into motion by the political and cultural histories of the last twenty years or even the last ten.

All Saints Day
https://blogs.swarthmore.edu/burke/blog/2015/11/09/all-saints-day/
Mon, 09 Nov 2015 22:35:33 +0000

Commenting on the debate over Halloween costumes seems freshly risky this week, but the subject has been on my mind since I read this New York Times article on the subject on October 30.

My first thought would be that calls for the resignation of the masters of Silliman College at Yale are dangerously disproportionate to the email that they wrote in response to polite guidance from the Yale administration. I’ll come back to why that disproportionate response worries me so much later in this essay.

And yet I don’t entirely agree with the way that Erika Christakis chose to come at the issue. I wish everyone could back up a step so that the entire discussion is not about free expression vs. censorship or about safe spaces vs. stereotype threats. Once the discussion has locked into those terms, then the “free speech” advocates are stupidly complicit in defending people who show up at parties in blackface or are otherwise costumed or having themed parties with deliberately offensive stereotypes. Once the discussion has locked into those terms, people who want to say that such stereotypes have a real, powerful history of instrumental use in systems of racial domination are forced to understand that advocacy as censorship–and are also unable to leave space open to hear people like Erika and Nicholas Christakis as making any other kind of point.

The real issues we should be talking about are:

1) The concepts of appropriation and ownership. This is where moves are being made that are at least potentially reactionary and may in fact lead to the cultural and social confinement or restriction of everyone, including people of color, women, LGBTQ people, and so on. In some forms, the argument against appropriation is closely aligned with dangerous kinds of ethnocentrism and ultra-nationalism, with ideas about purity and exclusivity. It can serve as the platform for an attack on the sort of cosmopolitan and pluralistic society that many activists are demanding the right to live within. Appropriation in the wrong institutional hands is a two-edged sword: it might instruct an “appropriator” to stop wearing, using or enacting something that is “not of their culture”, but it might also require someone to wear, use and enact their own “proper culture”.

When I have had students read Frederick Lugard’s The Dual Mandate in British Tropical Africa, which was basically the operator’s manual for British colonial rule in the early 20th Century, one of the uncomfortable realizations many of them come to is that Lugard’s description of the idea of indirect rule sometimes comes close to some forms of more contemporary “politically correct” multiculturalism. Strong concepts of appropriation have often been allied with strong enforcement of stereotypes and boundaries. “Our culture is these customs, these clothing, this food, this social formation, this everyday practice: keep off” has often been quickly reconfigured by dominant powers to be “Fine: then if you want to claim membership in that culture, please constantly demonstrate those customs, clothing, food, social formations and everyday practices–and if you don’t, you’re not allowed to claim membership”.

And then further, “And please don’t demonstrate other customs, clothing, food, social formations and everyday practices: those are for other cultures. Stick to where you belong.” I recall a friend of mine early in our careers who was told on several occasions during her job searches that since she was of South Asian descent, she’d be expected to formally mentor students from South Asia as well as Asian-Americans, neither of which she particularly identified with. I can think of many friends and colleagues who have identified powerfully with a particular group or community but who do not dress as or practice some of what’s commonly associated with that group.

What’s being called appropriation in some of the current activist discourses is how culture works. It’s the engine of cultural history, it’s the driver of human creativity. No culture is a natural, bounded, intrinsic and unchanging thing. A strong prohibition against appropriation is death to every ideal of human community except for a rigidly purified and exclusionary vision of identity and membership.

Even a weak prohibition against appropriation risks constant misapplication and misunderstanding by people who are trying to systematically apply the concept as polite dogma. To see one example of that, look to the New York Times article, which describes at one point a University of Washington advice video that counsels people to avoid wearing a karate costume unless they’re part of the real culture of karate. But karate as an institutional culture of art and sport is already thoroughly appropriated from its origins in Okinawa, and it was in turn an appropriation of sorts from Chinese martial arts–and no martial arts form in the world today is anything even remotely like its antecedents in practice, form or purpose. Trying to forbid karate costuming to anyone but a truly authentic “owner” of the costume is a tragic misunderstanding of the history of the thing being regulated. It’s also a gesture that almost certainly forbids the wearing of any costume that has a referent that is not wholly imaginary. If a karate outfit is appropriation for anyone but a genuine Okinawan with a black belt, then so also are the costumes of firefighters, police, soldiers, nurses, doctors, astronauts and so on. Even imaginary characters are usually appropriations of some kind or another, drawn out of history and memory.

It is precisely these kinds of discourses about appropriation that are used by reactionaries to protest Idris Elba being cast as Heimdall, or to assert that a tradition of a particular character or cultural type being white or male or straight means it must always be so. It might be possible to configure a critique so that appropriation from below is always ok and appropriation from above is never ok, but that kind of categorical distinction itself rests on the illusion of power being rigid, binary and fixed rather than fluid, performative and situational.

What I think many activists mean to forbid is not appropriation but disrespect, not borrowing but hostile mockery. The use of costumes as weapons, as tools of discrimination. But it’s important to say precisely that and no more, and not let the word appropriation stand in for a much more specific situational critique of specific acts of harmful expression and representation. “Appropriation” is being used essentially to anticipate, to draw a comprehensive line proactively in order to avoid having to sort out with painful specificity which costumes and parties are offensive and which are not after the fact of their expression.

2) But this leads to my second point: “appropriation” is being used for the convenience of custodial authority, for the use of institutions, for the empowerment of a kind of kindly quasi-parental control over communities.

Institutions–like college administrations and particularly the legal advisors they employ–don’t like situational judgments, they don’t like critiques that apply with strong force in some situations and don’t apply at all in others. So they often seek to rework demands for change into rules and guidelines that can be applied evenly to all subjects at all times. That’s one reason why appropriation as a concept at least has the potential to force people to perform the identities they claim according to a pre-existing sketch in the hands of institutional power.

Custodial authority in this respect and many others is a danger for other reasons. Here I can’t do much more than echo Fredrik deBoer’s warning against “University Inc.”: the custodial university quickly becomes the neoliberal corporate university. On some campuses, student activists are incidentally or accidentally strengthening the capacity and reach of custodial power over faculty, staff and students alike. Among other consequences, this change in academic institutions often puts faculty from underrepresented groups at much more intense risk: student activists are sometimes accidentally undercutting one of their most cherished objectives.

Even when the people in the crosshairs do not have that vulnerability, they have the basic vulnerability that all working professionals have in the disastrous political economy of early 21st Century America. In the Christakises’ case and many others, I feel as if simplistic ideas of asymmetrical power and “punching up” are being used to overlook the potentially disastrous consequences of introducing greater precariousness into the lives of middle-aged professionals. Sometimes the consequences of failed leadership are sufficient cause to warrant making an individual’s life precarious, and sometimes the asymmetry of power is enough that one can sleep easy about the consequences–say, with the resignation of the University of Missouri’s president, who I think we can say will in fact land on his feet. But often not. What’s being said to the Christakises in those videos is serious business, and I don’t know that those saying it seem to realize it is, even though many of them clearly feel with legitimate passion that what was said by Erika Christakis is also serious business that makes them feel unsafe in a place where they prize a sense of security. It’s a cliche, but here something of “two wrongs don’t make a right” is important.

This is also a concern about the future of academic institutions themselves. This is the other problem with some of these protests. I feel badly for everyone today in that everything they write on social media, every protest they attend, every response they give, has some chance of being seized upon by commenters all over the world. Nobody was looking at my college life with that kind of attention. But for anyone who aspires to political action, even action as intimate and simple as seeking personal safety and happiness, they have got to pay attention to the infrastructure surrounding that action, and to the consequences that will flow from it. Bit by bit, protests that seem to assert that yes, the university is indeed a world completely apart from the social and cultural realities around it, add fuel to the fires being set by reactionary politicians all around the United States. Bit by bit, protests whose rhetoric is meant to be strictly local but is turned national or global end up looking tone-deaf or disproportionate. This could be a learning experience: liberal arts learning is supposed to increase the capacities of students to speak, think, write and act in the world around them. But for it to be a learning experience, in some cases students (and faculty) will have to treat seriously the question of how a particular claim will sound or mean outside of the local context. And they will need to think very carefully about matching critical demands to visions of proportionality that sound reasonable to more than just the group at hand.

3) This leads in turn to my third point. What is going on with struggles over Halloween costumes and much else besides within college and university culture has implications for the futures of liberal arts-educated students. And they are not the implications that are commonly drawn either by “free speech advocates” or by defenders of current campus activism.

“Free speech”, broadly speaking, is not what is at risk in most campus disputes. Occasionally it is to some extent: that’s how I interpret the seriously misconceived protests at Wesleyan recently against the student newspaper. Even in the case of Wesleyan, however, the initial impulse to inhibit or constrict what can be said gave way to something more managerial and neoliberal, this time not from administration but from student leadership itself. The student assembly proposed cutting the funding of the paper in the name of a drive for efficiency, having it “compete” for positions against others with an inbuilt incentive-based reward for incorporating diversity.

What I think that move suggests is that some of the drive for cultural transformation, with its constant turn towards custodial forms of managerial and institutional power, may be in danger of turning away from an ideal of creating safety and security for all towards an ideal of governance over others. That the struggles now underway have at least some danger of congealing into an intramural struggle for elite power in the political economy to come. On one side, the future economic elites: the students from selective institutions feeding into the finance industry and Silicon Valley. On the other side, the future cultural managers and bureaucrats: the students from selective institutions feeding into consultancies, non-profits, risk management administration, human resources, into the civic middlemen of a future capitalism.

Where that danger becomes clearest is precisely in the talk of guidance and guidelines, suggestions and “soft rules”. Not so much in the talk itself, but in who the talk is aimed at. Free speech advocacy tends to see every guideline from an institution as a law, and to turn to a libertarian vocabulary to contest it. The issue is less the making of law and more the incipient character of class hierarchy in the political economy to come.

One of the things that I heard coming from a substantial wave of student activism here several years ago was that they held themselves to be already knowledgeable about all the things that they felt a good citizen and ethical person should know. It was the other students, the absent students, the students who don’t study such subjects, who worried them. And some of the activists had, in a way, a touching faith in the power of our faculty’s teaching to remake the great unwashed of the student body. If only they took the right classes, they’d do the right thinking. As one Swarthmore student in spring 2013 said in the group I was in, “I can’t believe there are students here who graduate without having heard the word intersectionality.”

This moment worried me, even though it is important as always to remember: this was a young person, and I said things under similar circumstances that I would be deeply embarrassed to hear quoted directly back to me. It worried me because I hear that same concern a lot across the entire space of cultural activism, both on and off campus.

It worries me first because that student and many similar activists are wrong when they assume that what they don’t like in the culture is a result of the absence of the ideas and knowledge that they hold dear. Far more students here have been in a course where concepts like “intersectionality” come up than this student thought. All political ideologies in the contemporary American public sphere, from the most radical to the most reactionary, have a troubling tendency to assume that agreement with their views is the natural state of the mass of people except for a thin sliver of genuinely bad actors, and therefore where a lack of agreement or acceptance holds, it must be because the requisite knowledge has been kept from the masses. This is a really dangerous proposition, because it blinds any political actor to the possibility that many people have heard what you have to say and don’t agree for actual reasons–reasons that you’ll have to reckon with eventually.

It worries me second because I think some activists may be subconsciously thinking that if they can sufficiently command custodial or institutional power, they will not have to reckon with such disagreement. Not only does that mistake custodial power as permanently and inevitably friendly to their own interests, it is where the temptation to use class power against other social groups will enter in, has already entered in.

This is what worries me most. The thing that I wish that student had recognized is that some of the people that he wishes knew the word intersectionality already know the reality of it. They might not have the vocabulary he does, but they have the phenomenology right enough. Perhaps more right than the student did.

I worry, as in the case of Halloween costumes and much else, that at least some cultural activists are setting themselves up as future commissioners of culture over other social classes and their worlds, that this is as much about admonishing people “out there” for their failure to use the right terms, for their outre mannerisms and affect, for their expressive noncompliance. That this is all about kids who will become upper middle-class (or rich) through access to education judging and regulating kids who will not have that status or education, no matter where the educated kids started in life. That making blanket policies about Halloween costumes and much else might become a building block of class differentiation, part of a system of middle-class moral paternalism.

That’s what an earlier generation of cultural activism left me doing as a young graduate who wanted to be an “ally”: piously correcting people outside of my immediate social universe whenever life put me into close contact with them. Often when it was the most innocent and well-intended on my part, it gave the greatest offense, as when I once started talking about the importance of working-class unionism with my non-union working-class cousins that I was meeting for the first time at my paternal grandfather’s house.

At least in some cases, the entire infrastructure of current cultural activism is disabling the need for careful listening, for patience, for humility, at the moments where it is needed most, particularly within the ethical commitments that many activists themselves treasure and articulate. That’s why guidelines and rules and custodial dictates and finger-wagging about general concepts like appropriation are a problem: they take what is profoundly situational and circumstantial and turn it systematic. They interrupt rather than intensify attention. They make a spectrum of expressive practice into a right-wrong binary.

We need to tell someone thinking of wearing blackface to a party to absolutely stop right there and think again. We need to tell someone planning a fraternity party with a “gang theme” to cut that shit out or else. Neither of those moments is meaningful expression or harmless fun, and there needs to be no room for them. But we also need to not give ourselves permission to piously tell the kid in the karate uniform that they’re appropriating someone’s culture, or to inform the guy in the cowboy uniform that cowboys were nothing but agents of genocidal conquest.

We need to not self-nominate as authorities over culture, especially the speech and cultural activity of people whom we arrogantly judge don’t know as much about it as we do. We need to be in culture, in circulation, even acting through appropriation and imitation, a part of the crowd and not above it. We are all dancers, not choreographers; our only choreographer is the endless, ceaseless and sometimes abrasive motion of human thought and expression in a never-simple world.

Yes, We Have “No Irish Need Apply”
https://blogs.swarthmore.edu/burke/blog/2015/07/29/yes-we-have-no-irish-need-apply/
Wed, 29 Jul 2015 15:48:46 +0000

Just came across news of the publication of Rebecca Fried’s excellent article “No Irish Need Deny: Evidence for the Historicity of NINA Restrictions in Advertisements and Signs”, Journal of Social History, 10:1093, 2015, from @seth_denbo on Twitter.

First, the background to this article. Fried’s essay is a refutation of a 2002 article by the historian Richard Jensen that claimed that “No Irish Need Apply” signs were rare to nonexistent in 19th Century America, that Irish-American collective memory of such signs (and the employment discrimination they documented) was largely an invented tradition tied to more recent ideological and intersubjective needs, and that the Know-Nothings were not really nativists who advocated employment (and other) discrimination against Irish (or other) immigrants.

Fried is a high school student at Sidwell Friends. And her essay is just as comprehensive a refutation of Jensen’s original as you could ever hope to see. History may be subject to a much wider range of interpretation than physics, but sometimes claims about the past can be as subject to indisputable falsification.

So my thoughts on Fried’s article.

1) Dear Rebecca Fried: PLEASE APPLY TO SWARTHMORE.

2) This does really raise questions, yet again, about peer review. 2002 and 2015 are different kinds of research environments, I concede. Checking Jensen’s arguments then would have required much more work of a peer reviewer than more recently, but I feel as if someone should have been able to buck the contrarian force of Jensen’s essay and poke around a bit to see if the starkness of his arguments held up against the evidence.

3) Whether as a peer reviewer or scholar in the field, I think two conceptual red flags in Jensen’s essay would have made me wary on first encounter. The first is the relative instrumentalism of his reading of popular memory, subjectivity and identity politics. I feel as if most of the discipline has long since moved past relatively crude cries of “invented tradition” as a rebuke to more contemporary politics or expressions of identity to an assumption that if communities “remember” something about themselves, those beliefs are not arbitrary or based on nothing more than the exigencies of the recent past.

4) The second red flag, and the one that Fried targets very precisely and with great presence of mind in her exchanges with Jensen, is his understanding of what constitutes evidence of presence and the intensity of his claims about commonality. In the Long Island Wins column linked to above, Jensen is quoted as defending himself against Fried by moving the goalposts a bit from “there is no evidence of ‘No Irish Need Apply'” to “The signs were more rare than later Irish-Americans believed they were”. The second claim is the more typical sort of qualified scholarly interpretation that most academic historians offer–easy to modify on further evidence, and even possible to concede in the face of further research. But when you stake yourself on “there was nothing or almost nothing of this kind”, that’s a claim that is only going to hold up if you’ve looked at almost everything.

I often tell students who are preparing grant proposals to never ever claim that there is “no scholarship” on a particular subject, or that there are “no attempts” to address a particular policy issue in a particular community or country. They’re almost certainly wrong when they claim it, and at this point in time, it takes only a casual attempt by an evaluator to prove that they’re wrong.

But it’s not just that Jensen is making what amounts to an extraordinary claim of absence; it is also that his understanding of what presence would mean or not mean, and the crudity of his attempt to quantify presence, are issues in their own right. There may be many sentiments in circulation in a given cultural moment that leave few formal textual or material signs for historians to find later on. Perhaps I’m more sensitive to this methodological point because my primary field is modern Africa, where the relative absence of how Africans thought, felt and practiced from colonial archives is so much of a given that everyone in that field knows to not overread what is in the archive and not overread what is not in the archive. But I can only excuse Jensen so far on this point, given how many Americanists are subtle and sensitive in their readings of archives. Meaning that even if Jensen had been right that “No Irish Need Apply” signs (in ads, in doors, or wherever) were very rare, a later collective memory that they were common might simply have been a transposition of things commonly said or even done into something more compressed and concrete. Histories of racism and discrimination are often histories of “things not seen”.

But of course as Fried demonstrates comprehensively, that’s not the case here: the signage and the sentiment were in fact common at a particular moment in American history. Jensen’s rear-guard defense that an Irish immigrant male might only see such a sentiment once or twice a year isn’t just wrong, it really raises questions about his understanding of what an argument about “commonality” in any field of history should entail. As Fried beautifully says in her response, “The surprise is that there are so many surviving examples of ephemeral postings rather than so few”. She understands what he doesn’t: that what you find in an archive, any archive, is only a subset of what was once seen and read and said, a sample. A comparison might be to how you do population surveys of organisms in a particular area. You sample from smaller areas and multiply up. If even a small number of ads with “No Irish Need Apply” were in newspapers in a particular decade, the normal assumption for a historian would be that the sentiment was found in many other contexts, some of which leave no archival trace. To argue otherwise–that the sentiment was unique to particular newspapers in highly particular contexts–is also an extraordinary argument requiring very careful attention to the history of print culture, to the history of popular expression, to the history of cultural circulation, and so on.
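(To make the sampling logic concrete, here is a minimal back-of-the-envelope sketch in Python. Every number in it is an invented placeholder rather than a figure from Fried’s or Jensen’s research; the only point is the shape of the inference.)

surveyed_runs = 12    # hypothetical: newspaper runs a researcher actually searched
total_runs = 400      # hypothetical: comparable runs that once existed, most now lost or unsearched
ads_found = 30        # hypothetical: "No Irish Need Apply" ads turned up in the searched runs

density = ads_found / surveyed_runs   # observed ads per searched run
estimate = density * total_runs       # naive scale-up to the whole print record

print(f"{density:.1f} ads per searched run; roughly {estimate:.0f} implied across all runs")

A nonzero density in the small surviving, searchable slice of the record implies a much larger presence in the record as a whole; that is the ecologist’s move, and it is the assumption a historian would ordinarily make here.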

Short version: commonality arguments are hard and need to be approached with care. They’re much harder when they’re made as arguments about rarity or absence.

5) I think this whole exchange is on one hand tremendously encouraging as a case of how historical scholarship really can have a progressive tendency, to get closer to the truth over time–and it’s encouraging that our structures of participation in scholarship remain porous enough that a confident and intelligent 9th grader can participate in the achievement of that progress as an equal.

On the other hand, it shows why we all have to think really carefully about professional standards if we want to maintain any status at all for scholarly expertise in a crowdsourced world. I’ve said before that contemporary scholars sometimes pine for the world before the Internet because they felt safe that any mistakes they made in their scholarship would have limited impact. If your work was only read by the fifty or so specialists in your own field, and over a period of twenty or thirty years was slowly modified, altered or overturned, that was a stately and respectable sort of process and it limited the harm (if also the benefit) of any bolder or more striking claims you might make. But Jensen’s 2002 article has been cited and used heavily by online sources, most persistently in debates at Snopes.com, but also at sites like History Myths Debunked.

For all the negativity directed at academia in contemporary public debate, some surveys still show that the public at large trusts and admires professors. That’s an important asset in our lives and we have serious collective interest in preserving it. This is the flip side of academic freedom: it really does require some kind of responsibility, much as that requirement has been subject to abuse by unscrupulous administrations in the last two years or so. We do need to think about how our work circulates and how it invites use, and we do need to be consistently better than “the crowd” when we are making strong claims based on research that we supposedly used our professional craft to pursue. It’s good that our craft is sufficiently transparent and transferrable that an exceptional and intelligent young person can use it better than a professional of long standing. That happens in science, in mathematics, and other disciplines. It’s maybe not so good that for more than ten years, Jensen’s original claims were cited confidently as the last word of an authenticated expert by people who relied on that expertise.

The Listicle as Course Design
https://blogs.swarthmore.edu/burke/blog/2014/08/11/the-listicle-as-course-design/
Mon, 11 Aug 2014 18:51:58 +0000

I’ve been convinced for a while that one of the best defenses of small classes and face-to-face pedagogy within a liberal arts education would be to make the process of that kind of teaching and coursework more visible to anyone who would like to witness it.

Lots of faculty have experimented with publishing or circulating the work produced by class members, and many have also shared syllabi, notes and other material prepared by the professor. Offering the same kind of detailed look at the day-to-day teaching of a course isn’t very common and that’s because it’s very hard to do. You can’t just videotape each class session: being filmed would have a negative impact on most students in a small 8-15 person course, and video doesn’t offer a good feel for being there anyway. It’s not a compressed experience and so it doesn’t translate well to a compressed medium.

I have been trying to think about ways to leverage participation by crowds to enliven or enrich the classroom experience of a small group of students meeting face-to-face and thus also give observers a stake in the week-by-week work of the course that goes beyond the passive consumption of final products or syllabi.

In that spirit, here’s an idea I’m messing around with for a future course. Basically, it’s the unholy combination of a Buzzfeed listicle and the hard, sustained work of a semester-long course. The goal here would be to smoothly intertwine an outside “audience” and an inside group of students and have each inform the other. Outsiders still wouldn’t be watching the actual discussions voyeuristically, but I imagine that they might well take a week-to-week interest in what the class members decided and in the rationale laid out in their notes.

——————–

History 90: The Best Works of History

Students in this course will be working together over the course of the semester to critically appraise and select the best written and filmed works that analyze, represent or recount the past. This will take place within a bracket tournament structure of the kind best known for its use in the NCAA’s “March Madness”.

The initial seeding and selection of the works to be read by class members will be open to public observers as well as to enrolled members of the class. The professor will use polls and other means for allowing outside participants to help shape the brackets. One side of the bracket will be works by scholars employed by academic institutions; the other side will be works by independent scholars, writers, and film-makers who do not work in academia.

The first four weeks of the class will be spent reading and discussing the nature of excellence in historical research and representation: not just what “the historian’s craft” entails, but even whether it is possible or wise to build hierarchies that rely on concepts of quality or distinctiveness. Class members will decide through discussion what they think are some of the attributes of excellent analytic or representational work focused on the past. Are histories best when they mobilize struggles in the present, when they reveal the construction of structures that still shape injustice or inequality? When they document forms of progress or achievement? When they teach lessons about common or universal challenges to human life? When they amuse, enlighten or surprise? When they are creatively and rhetorically distinctive? When they are thoroughly and exhaustively researched?

At the end of this introductory period, students will craft a statement that explains the class’ shared criteria, and this statement will be published to a course weblog, where observers can comment on it. Students will then be divided into two groups for each side of the bracket. Each group will read or view several works each week on their side of the overall bracket. During class time, the two groups will meet to discuss their views about which work in each small bracket should go forward in the competition and why, taking notes which will eventually be published in some form to the course weblog. Students will also have to write a number of position papers that critically appraise one of the books or films in the coming week and that examine some of the historiography or critical literature surrounding that work.

The final class meeting will bring the two groups together as they attempt to decide which work should win the overall title. In preparation, all students will write an essay discussing the relationship between scholarly history written within the academy and the production of historical knowledge and representation outside of it.
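To make the mechanics concrete, here is a minimal sketch of how the bracket itself might be tracked and advanced from poll results week by week. This is purely illustrative: the language (Python), the helper function, and all the titles and vote counts are placeholder assumptions of mine, not anything specified in the course description above.

```python
# Hypothetical sketch of the course's bracket structure: pair up works,
# advance whichever work in each pair gets more votes in the weekly poll.
# All titles and tallies are invented placeholders.
from itertools import zip_longest

def run_round(works, votes):
    """Pair adjacent entries and advance the one with more votes.

    Ties (and byes for an unpaired entry) go to the first work in the pair.
    """
    winners = []
    for a, b in zip_longest(works[::2], works[1::2]):
        if b is None:
            winners.append(a)  # unpaired entry advances on a bye
        else:
            winners.append(a if votes.get(a, 0) >= votes.get(b, 0) else b)
    return winners

# One side of the bracket: works by academic historians (placeholders).
academic_side = ["Work A", "Work B", "Work C", "Work D"]
weekly_poll = {"Work A": 12, "Work B": 7, "Work C": 9, "Work D": 15}

semifinalists = run_round(academic_side, weekly_poll)
print(semifinalists)  # ['Work A', 'Work D']
```

The point is only that the seeding and week-by-week advancement are simple enough to publish on the course weblog alongside the groups’ discussion notes, so that outside observers can follow and second-guess the bracket as it narrows.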

On The Invisible Bridge
https://blogs.swarthmore.edu/burke/blog/2014/08/07/on-the-invisible-bridge/
Thu, 07 Aug 2014 18:02:10 +0000

I’ve been following some of the discussion about Rick Perlstein’s new book on the 1970s.

I agree with many scholars that the basic problem with online endnotes is the persistent danger of the main text and the sourcing becoming disconnected over time unless there’s a heavily institutionalized plan for any necessary migration of the citations. At this point, there’s really nothing of the sort, so I think both publishers and authors would be well-advised to just stick with putting the notes in the printed text.

I’m guessing that Rick Perlstein might be wishing he’d done just that at this point. It’s not clear that it would have protected him from the basically spurious claim that his new book plagiarizes an earlier book by Craig Shirley, but from the current state of the back-and-forth between the two authors and their various defenders and lawyers, it may be that Shirley jumped to the conclusion that Perlstein had paraphrased him without proper attribution because the numerous attributions were in the online endnotes. It’s more likely, though, that Shirley objected because Perlstein looked at the same things that Shirley did and came to very different conclusions. Following the discussion online and looking at the evidence, I really don’t see anything that I would call plagiarism and not much that I would even call careless.

I’m just starting the book for its actual content, but I’m sympathetic to David Weigel’s suggestion that Perlstein is being targeted because Reagan is a more sacred figure for contemporary cultural conservatives than Goldwater or Nixon. Most of them abjure Nixon as a RINO, if they remember him at all, and Goldwater is at this point as relevant to many of them as Calvin Coolidge. Many current conservatives, however, have a strongly vested interest in not remembering Reagan in his actual context, where he presents some real puzzles in terms of our contemporary moment.

For me, though, the persistent argument I like most in Perlstein’s previous two books applies with more force to progressives than to conservatives. I suspect his new book will continue the general thrust of his analysis in this respect. I think Perlstein shows (and means to show) that postwar American conservatism has surprisingly extensive and complex social roots and that at least some of its social roots have a kind of genuine “from below” legitimacy. This might account for why his previous two books initially received appreciative readings from conservatives, in fact.

In his book on Goldwater, Perlstein documents, among other things, that one of Goldwater’s enduring sources of support was from small business owners, especially away from the major coastal cities. I read Perlstein as being genuinely surprised not only that there was a sort of coherent social dimension to this vein of support but that the antipathy of this group towards the federal government had some legitimacy to it, primarily because as federal authority expanded after the war, small businesses got hit with a wave of regulatory expectations that had a serious economic impact on them.

In general in his books, Perlstein does a great job of careful investigatory attention to the social origins of conservative sentiment and ideology and then couples that investigation to a critical appraisal of how political elites and party leaders reworked or mobilized those sentiments. The layered account he gives of the rise of postwar conservatism explains a great deal about how we got to the point we’re at today. While he’s not at all sympathetic to either the content or consequences of conservatism as he describes it (then and now), what I think his account comprehensively rebukes is the kind of progressive response to right-wing political power that falls back on tropes like “astroturfing” or that otherwise assumes that conservatism is the automated, inorganic response of a dying demographic to the loss of social power, that there is nothing real to it or that its reality is simple and self-interested.

I remarked briefly on Twitter that I think most of Perlstein’s progressive fans miss the implications of his work in this respect (and he replied that this needed more than 140 characters for him to make sense of my point). In a way, I’d see Perlstein’s work as a modern companion to the richer kinds of histories of “whiteness” that Nell Irvin Painter, David Roediger and Noel Ignatiev have written, none of which encourage us to see whiteness as a subjectivity or social formation that was defined solely by instrumental self-interest or that was constructed entirely “from above” with conscious design.

The first implication of an analysis like Perlstein’s for actual participation in contemporary politics would be to peel apart the sources of historical and social energy within your opposition and look carefully at where there are real and imagined grievances that you can actually appreciate, address or be in conversation with. Communitarians have one axis of sympathy they can try to traverse; liberals and libertarians another.

The second is to never assume the charge of astroturfing does much of anything to advance a meaningful politics or to explain why things actually happen in elections, in governance, in popular consciousness: that is the move of a largely intra-elite war of position that gains inches at best, not yards. Focusing on astroturfing, even when it is undoubtedly happening and has significance for controlling dominant “framing” narratives that influence politics, is mostly an alibi for not doing the much harder work of understanding what’s happening in the larger lived experience of communities and regions. The astroturfing charge is ultimately a sort of degeneration of an older left belief in ideology, a belief that coherent formations of thought and belief crafted by self-conscious elites then structured consciousness and directed political action outside the elite. Thus you get folks like Thomas Frank thinking that losing Kansas is largely a matter of dastardly hegemons cunningly and deliberately blinding people to their authentic self-interest, rather than a slower organic history in which people connected some existing religious, cultural, and social convictions to an increasing disenchantment with the role of the state in their everyday lives, a connection that they have held to with some degree of deliberate agency.

Third, stop assuming that postwar conservatism’s content is wholly protean or arbitrary. “Big government” in this sense may be in all sorts of ways a really messed-up construction that obscures the degree to which mostly-conservative voting districts are actually the enthusiastic recipients of all sorts of public money, but it’s not a random or senseless trope at its origin point, either, at least not as I read the history that Perlstein so ably distills. Which doesn’t mean that the social reality of its derivation is positive, either, since at least one of the aspects of “government” that had become an issue by the early 1970s in Perlstein’s view, as per Nixonland, was its interventions into the political economy of race and racial discrimination.

Fourth, restore some contingency to the story. Perlstein is very good on this in particular when he’s talking about political elites, politicians and party leaders: the ways in which the fusion of popular and party agendas happened were full of false starts, unpredictable gambits, and improvisations.

All of which, to me, implies that progressives today habitually underestimate the historicity, rootedness and local authenticity of what they regard as conservatism, and therefore mostly end up stuck with intra-elite theaters of struggle and debate within familiar institutions and communities, all the while misperceiving those as more than they really are. I’ll be curious to see whether this part of what I see in Perlstein’s history changes as we move into his “invisible bridge”.

Historians Don’t Have to Live in the Past
https://blogs.swarthmore.edu/burke/blog/2013/07/24/historians-dont-have-to-live-in-the-past/
Wed, 24 Jul 2013 14:42:13 +0000

In what way is the American Historical Association’s notion of a six-year embargo on digital open-access distribution of dissertations even remotely sustainable in the current publishing and media environment surrounding academia?

On one side, you have disciplinary associations like the Modern Language Association and the American Anthropological Association, which have somewhat similar traditions of tying assessment and promotion to the publication of a monograph and which are, to varying degrees, embracing open-access publishing and digital dissemination and trying to work out new practices and standards.

On the other side, you have disciplines that have no particular obsession with the idea of the published monograph as the standard.

Whether or not the published monograph is or ever was a good standard for judging the worth of a historian’s scholarship, how long does the AHA think that historians can stand alone in academia as a special case? “Oh, we don’t do open-access or digital distribution until we’ve got a real book in hand and are fully tenured, those few of us remaining who are in tenure-track positions, because that’s a fundamental part of history’s particular disciplinary structure.”

Um, why?

“Because history dissertations take a long time to write and thus need protection?” Right, unlike anthropology or literary criticism or other fields in the humanities. FAIL.

“Because many publishers won’t publish an open-access dissertation?” Right, so this assumes: a) that the dissertation will be so little revised that the two texts would be essentially identical, and b) that the magic fairy-dust of a book nevertheless makes it the real benchmark of a properly tenurable person. E.g., “Oh noes, we couldn’t decide if someone’s scholarship was tenurable from a dissertation that is nearly identical to a book”. Here’s where the real fail comes in, because it reveals how much the disciplinary association is accepting the clotted, antiquated attachment of a small handful of tenured historians to their established practices even when those practices have had any semblance of reason or accommodation to reality stripped from them.

Let’s suppose that university presses do stop publishing essentially unrevised dissertations. I can’t blame them: they need to publish manuscripts that have some hope of course adoption and wider readership, sold at a reasonable price (call this option #A), or they need to price up library editions high enough that the remaining handful of “buy ’em all” libraries will make up for the loss of libraries that buy in a more discretionary fashion (option #B).

You can understand why the publishers who are largely following option #B would not want to publish monographs that were marginally revised versions of open-access dissertations, because even the richest libraries might well decide that buying a $150 physical copy is unnecessary. But by the same token, again, why should a tenure and promotion process value the physical copy over the digital one if they’re the same? Because the physical copy has been peer-reviewed? Meaning, if two scholars who do not work for the same institution as the candidate have reviewed the manuscript and deemed it publishable, that alone makes a candidate tenurable? Why not just send out the URL of a digital copy to three or four reviewers for the tenure and promotions process to get the same result? Or rely more heavily upon the careful, sophisticated reading of the manuscript (in whatever form) by the faculty of the tenuring department and institution?

What the AHA’s embargo embarrassingly underscores is the extent to which many tenured faculty have long since outsourced the critical evaluation of their junior colleagues’ scholarship to those two or three anonymous peer reviewers of a manuscript, essentially creating small closed-shop pools of specialists who authenticated each other with little risk of interruption or intervention from specialists in other fields within history.

Thirty years ago, when university presses would publish most dissertations, you could plausibly argue that the dissertation which persistently failed review and was not published by anyone had some sort of issue. Today you can’t assume the same. Maybe we never should have given over the work of sensitive, careful engagement with the entire range of work in the discipline as embodied in our own departments, but whether that was ever a good idea, it isn’t now and can’t be kept going regardless.

Suppose we’re talking about option #A instead, the publishers who are being more selective and only doing a print run of manuscripts with potential for course adoptions or wider readership. Suppose you use that as the gold standard for tenurability?

That’s not the way that graduate students are being trained, not the way that their dissertations are being shaped, advised and evaluated. So you would be expecting, with no real guidance and few sources of mentorship, that junior faculty would have the clock ticking from their first day of work on adapting their dissertations toward wider readability and usefulness. That’s a dramatic migration of the goalposts in an already sadistic process. You could of course change the way that dissertations are advised and evaluated and therefore change the basic nature of disciplinary scholarship, which might be a good thing in many ways.

But this would also accelerate the gap between the elite institutions and every other university and college in even more dramatic fashion: writing scholarship that had market value would qualify you for an elite tenure-track position, while writing scholarship that made an important if highly specialized contribution to knowledge in a particular field of historical study would qualify you for more casualized positions or tenure-track employment in underfunded institutions that would in every other respect be unable and unwilling to value highly specialized scholarship. (E.g., they would have libraries that could not acquire such materials, curricula where courses based on more specialized fields and questions could not be offered, and little ability to train graduate students in fields requiring the research skills necessary for such inquiry.) In terms of the resources and needs of institutions of higher learning, it arguably ought to be the reverse: the richest research universities should be the institutions which most strongly support and privilege the most specialized fields and therefore use tenure and promotion standards which are indifferent to whether or not a scholar’s work has been published in physical form.

Yes, it’s not easy to move individual departments, disciplines or entire institutions towards these kinds of resolutions. But it is not the job of a professional association to advocate for clumsy Rube Goldberg attempts to defend the status quo of thirty years ago. If individual faculty or whole departments want to stick their heads in the sand, let that be on them. An organization that aspires to speak for an entire discipline’s future has to do better than that. The AHA’s position should be as follows:

1) Open-access, digitally distributed dissertations and revised manuscripts should be regarded as a perfectly suitable standard by which to judge the scholarly abilities of a job candidate and a candidate for tenure in the discipline of history. A hiring or tenuring committee of historians is expected to do the work of sensitive and critical reading and assessment of such manuscripts instead of relying largely on the judgment of outside specialists. The peer assessment of outside specialists should be added to such evaluation as a normal part of the tenure and promotion process within any university or college.

2) The ability of a historian to reach wider audiences and larger markets through publication should not become the de facto criterion for hiring and tenure unless the department and institution in question comprehensively embraces an expectation that all its faculty in all its disciplines should move in the course of their career towards more public, generalized and accessible modes of producing and disseminating knowledge. If so, that institution should also adopt a far wider and more imaginative vision of what constitutes engagement and accessibility than simply the physical publication of a manuscript.

There Are More Things on Heaven and Earth Than Dreamt of in Your Critique
https://blogs.swarthmore.edu/burke/blog/2013/07/11/there-are-more-things-on-heaven-and-earth-than-dreamt-of-in-your-critique/
Thu, 11 Jul 2013 19:11:40 +0000

Just back from some research work that took up my energy for writing and thinking, I spent some time catching up on blogs and social media. I followed one link out from a Facebook friend to Paul Mullins’ excellent Archaeology and Material Culture blog, which often has content that I bookmark and mean to respond to but never quite get around to tackling. The linked essay was actually an older one, on “ruin porn”. This caught my eye because I like ruin photography quite a bit, as well as the general practice of “urban exploration”.

Mullins has a typically careful, considered, densely hyperlinked appreciation of the topic that exemplifies the best kinds of curation that digital culture has to offer. But as I followed some of his links to the critics of “ruin porn”, even the more subtle and careful critiques like John Patrick Leary’s response to ruin photography centered on Detroit, I found myself thinking about some tendencies in humanistic writing that I think first took shape in the 1980s but continue to be a limitation of some humanistic intellectuals both inside and outside of the academy.

I’m aware that some of what I’m about to say commits some of the sins I’m identifying, partly because I want to abstract some of these observed problems away from any single one of the links that Mullins offers rather than turn this into “my blog vs. that blog” (especially when some of the linked blog entries in Mullins’ essay are two or three years old: the presentism of digital culture sometimes means that people are deeply puzzled when you start a debate about entries that the authors wrote years ago). None of the critiques that Mullins includes in his overview perfectly exemplifies the issues I’m about to describe, and in many cases they simply reminded me of overall frustrations I have with large strains and tendencies in work that I want to like more than I do, including issues I sometimes see in the writing of my students.

So here are six tendencies that I have a problem with:

1) The complaint against omission and the demand for an impossible culture. Reading through Mullins’ links, some of the critics of ruin photography complain that many ruin images leave out people, excise the history of how a building or place came to be ruined, or omit the political economy of abandonment and neglect. I see some similar, if less experienced and articulate, arguments in some student writing: asked to critique, many students opt to accuse a scholar or writer of leaving out, neglecting, forgetting, rather than to directly disagree with or argue against a claim or analysis. At the least, this often amounts to a weak, evasive or highly selective form of intertextuality: it lets the critic pose one text that they happen to know against the text they’re critiquing, without having to have anything like a systematic knowledge of entire genres, tropes or bodies of scholarly thought or without having to know whether or how the two texts being contrasted might plausibly actually be expected to be in relation to one another. (In the case of my students, I sometimes see them holding one author accountable for having neglected or forgotten another work which was produced at a much later date.) So looking at ruin photography in isolation without asking what histories of visual culture it draws from or is talking to, or whether the critic has the same expectation of ruin photography of other sites and places or even of non-ruin landscape and architecture photography (which often also leaves out people and political economies and processes), is a problem.

Two photographs that leave out people, histories and political economies, for example, and are almost entirely “aestheticized”:

[Photographs: “spray” and “was”]

It’s a bigger problem when this strategy demands impossible culture–or it holds all culture accountable for not being a single idealized type of work which is (surprise) frequently either the kind of work that the critic does or the kind of work that the critic professionally identifies with. This is when criticism starts to look like the worst Christmas list from the most overprivileged child, an endless list of unprioritized and urgent demands. There is no photography, no performance, no representational work, that can include totality, and even the desire that they should is a terrible misfire.

This has always struck me at the least as the kind of thing that a dullard senior professor does when called upon to act as a discussant or critic, to compile a long list of omissions from the work on offer without any reading of the meaning of those omissions and the possibilities of their inclusion. Taken seriously, this kind of demand actually encourages the production of expressive and interpretative work that looks like a horribly literalist editorial cartoon with labels on all its images, less an analysis and more a catalogue. What does a photograph of a ruin look like if it includes all the people, all the processes, all the political economies, all the histories, that went into making the ruin? It looks like an archive or an exhibition, and an endless and imaginary one at that.

One of the wellsprings of this kind of wish for an impossible culture, I think, is the way that a sort of backdoor empiricism infiltrated humanistic practice in the academy via historicism. Much as the ‘hard’ social sciences have come to rest on an almost parodistically exaggerated distortion of the positivism that they (wrongly) attribute to the natural sciences, some humanistic work trying to do useful sociopolitical work slowly but surely came to take on board the dullest kind of empiricism of the most literalist historian or sociologist, to the point that this mode of criticism can appreciate or admire no text for what it is or does, only complain of what it is not.

2) The assertion of ownership. The critique of ruin photography and urban exploration often comes down to this: that its practitioners are carpet-baggers, cosmopolitan passers-by, tourists and short-termers, inauthentic. At best, these formulations are true but banal. Places, communities and people give up different stories at different time scales as well as in response to different styles of knowing. There are things you can’t know about a place after a day, a week, a year, a decade, a life–and things you can’t know about a place if your decade there was from age 10 to 20 or from age 60 to 70. But this proposition is always reversible. The longer you’re in a place, the less able you are to see other things. Banality dulls the ability to see beauty and horror. Watching a building crumble slowly over the years, knowing intimately the processes of its abandonment, may make it hard to see how startling or interesting the results of that history might be to fresh eyes. At its worst, playing the authenticity game puts a very deadly weapon in the hands of repressive actors. It’s always something that can be turned back on the critic: there will always be an experience or subjectivity beyond the critic’s own boundary, a point at which they will also have to borrow from, rework, or rely upon accounts of experience or embodiment that the critic doesn’t have in themselves and didn’t live in, or they will have to distract mightily from the presumption of their own assumed authority. (Say, the arrogance of anyone speaking for what “Detroiters” in general think of Detroit’s history and landscape: I don’t recall there being a plebiscite or poll that documented what most of ‘them’ think, and I don’t think that Detroiters as a whole think the same things or have lived the same lives in relationship to their landscapes.) Coupling the right to represent to a privileged subject position is a bad move, and it’s easy to fall into that from the simpler and often useful assertion that a particular representation has assumed its own privileged relationship to the truth of what it shows. But in fact ruin photography often makes its outsider status quite clear, and it claims to see something in ruins which locality and rootedness do not see.

3) Starting with accusation rather than curiosity. I think this tendency is especially deadly to humanistic study and writing, a point that is also made by the recent Harvard College report on the humanities. As the report puts it, “among the ways we sometimes alienate students from the Humanities is the impression they get that some ideas are unspeakable in our classrooms.” Or in expressive culture and critical interpretation at large. What is striking about some of the critiques of ruin photography is that they do not start with the question, “So why are these images being produced? Where are they being produced? What do their creators say about them? Who views them? Who likes them?” in a time when all of these questions are more richly answerable on a vast sociocultural scale than ever before. And I do not mean to suggest they be answered in the dullest or most literally sociological ways. Instead, ask them as if the answers might be a surprise, because when asked in that spirit, they often are. We’re not blank slates: what we see or know forms out of what we already see and know, and we’re not discovering an already-made ontological reality that only waits for us to ping it with the right gadget. Curiosity is a spirit, an attitude, a starting posture. It’s a way of tuning your instrument before the performance, or seeking ongoing inspiration for inquiry. Critique that follows something like curiosity, something like an ethnographic understanding of the thing we want to criticize, is more powerful both because it is truly earned and because it is more precisely targeted. When you have to fling around tropes like “hipster” to hit your target, the humanities that results is about one step above a Buzzfeed listicle.

4) The lost opportunities of anti-curation. Again, in a time when it’s possible to bring together large bodies of text and representation and work the resulting aggregations of “medium data” as a way to think with and invent new possibilities of seeing, it seems depressing to see some kinds of humanistic critique involved in the disembedding of expressive culture, in the assembly of cherrypicked galleries of grotesques, in being stuck in the flow of digital attention without ever making strategic decisions to go beyond or below the flow of the picture or meme that floods past our doors. That flood is a marvelous thing and there are incredibly fruitful ways of knowing and interpreting that are growing out of it. But we ought to be able to take whatever detritus washes up in front of us and then dive deeper into the wreck or look out at all the flotsam and jetsam around it. Sure, suddenly it seems that here are all these photographs of Detroit in ruins. But go wider and suddenly there are all these photographs of ruins in general, not just post-industrial North American cities. Or even wider and suddenly there are all these photographs period. Suddenly the critic’s perception of a boundary around one trope or expressive moment requires a much stronger defense. Go deeper and suddenly a fascination with ruins seems either more historically interesting as an affect of modernity or much more banal as a consequence of the movement of people and capital. None of the widening and deepening that curatorial practice entails forbids the critic to criticize, but it does place important burdens and challenges upon critique.

5) The mistrust of beauty and pleasure. It’s really striking in some criticisms of ruin photography to see the critics resist or deflect the appeal of the images themselves, often via the term “ruin porn”. Like “food porn”, that can be a lightly ironic way for people who actually produce and appreciate the images to label their own desire, but some critics of the form seem to use it much more seriously as an indictment of aestheticization itself, that any image or performance or text which produces desire and pleasure is prurient and unsavory. This is again a kind of backdoor positivism sneaking into the picture, as if a more real and less aestheticized image would be both always possible and inevitably preferable. It’s not like that: street photography, for example, with its typical emphasis on naturalism and embodiment and documentary realism, is another aesthetic, whether it stages itself among decline or in the midst of wealth. The argument that we should have a critical or systematic preference for an aesthetic is the hardest one in the world to make in the humanities, and that difficult labor is something that polemicists typically bypass whether they’re self-declared conservatives insisting on the “Western tradition” or progressives complaining about hipsters aestheticizing ruined buildings in Detroit. It’s not that “this ruin is beautiful” is a sufficient justification for an image in its own right for anyone but the producer of the image but that it’s a possible justification. It’s not just positivism that sneaks into the picture here but productivism, the proposition that culture has work to do, and that the work of producing culture should always somehow justify the investment of time and resources not just of the artist but of the viewers.

6) Which I think leads to my last complaint: that this kind of critique doesn’t look to recuperate, reimagine, reinterpret but to forbid. In some sense there should be no image, no expressive work, no text, no performance whose existence we regret to the point of wishing it had never happened. This is where there is often a disjuncture between humanistic work on the past (which accepts the inevitability of the texts and performances of interest and is therefore often capable of interpreting them in fresh and novel ways rather than just wishing they had never been) and work in the present, which much more often attempts to instruct or set boundaries around the creation of culture and interpretation in the near-term future.

All of this, by the way, isn’t important just because of how it affects acceptance of work by scholars and public intellectuals, or how it affects the institutional status of the humanities. Much of this is also a good explanation for why contemporary progressive intellectuals struggle so hard to make headway in the politics of culture, as these tendencies both hobble any dialogue between critics and practicing artists, performers and producers of expressive culture and inhibit humanistic thinkers from producing their own persuasive cultural artifacts outside of the institutional networks that provide secure guarantees of value and praise for their work. If a person read nothing but a diet of the strongest and most dogmatic critiques of ruin photography and was then handed a camera and told to go take a picture of a ruin, that person would have to have an extraordinary bulwark between their creative impulses and their critical training to ever press the shutter button. Or, more likely, they would find themselves refusing to take the picture, knowing in advance the inadequacy of the gesture.
