Precautions and Paralysis

In response to a prompt recently, I had to try and do a bit of forecasting about higher education.

I’ve spent too much time teaching about the history of futurism and prediction to find that a comfortable invitation. No matter how sophisticated you think you are as a forecaster, you’re mostly limited to extrapolation from existing trends. Visible trends can either be dealt with incrementally as they continue along their course of probable development. Or they can’t be, and their projected endpoint is so dire or dangerous that it calls for dramatic action now. Yet it’s exceptionally difficult to get anyone to act dramatically now based on a forecast where only the worst-case scenario justifies that kind of action, and for good reason. Not merely because such forecasts are not infrequently wrong, but because acting correctly means understanding not just what is likely to happen but why it is happening. Not to mention that the underlying causes of future developments have to be something that the forecasting actors can meaningfully affect or adapt to.

Occasionally an imaginative person can see some novel or unexpected development lying in wait, but that also doesn’t do much good, because it usually requires that person’s distinctive cast of mind to appreciate the forecast. If it’s hard to trust the collective wisdom of methodologically precise trendspotters, it’s harder still to trust some outlying eccentric. Even if you did choose to put all your money down on a single number and wait for the wheel to spin, being the first to adapt to some forthcoming reality might not do you any good. Spending fifty years in the fallout shelter eating beans out of a can might mean that you’re one of the few to live through the apocalypse, but by that time, your shelter is probably obsolete and you’ve traded fifty years of the good life for ten more years of beans in the darkness.

All of that said, it still seems important to at least try to think about the future. The cautionary example that I think is most pertinent for academics is newspaper and magazine journalism. Fifteen years ago, some of the developments that have cast the future of print journalism as we have known it into doubt were already quite visible. But few people in the industry took those developments seriously as a threat, even if they were otherwise interested in online media and digital culture.

Would it have made any difference if print journalists in 1995 had sat down for an industry-wide summit, accurately forecast what online media would look like in 2010, understood the implications for their own business model, and had tried to plan accordingly? What could they have done that they did not do?

This question takes on even sharper edges when you consider what exactly did the most damage to the business model of print journalism: not the movement of content to online venues, but the movement of advertising (classified ads in particular when it comes to newspapers). Then muddy the waters even more and ask how much of what has happened is about a shift in generational attitudes, reading practices, and conceptions of information.

If a perfectly accurate forecast of every major development between 1995 and 2010 had been delivered to that summit, there are some actions which might have shifted the future onto a more favorable track. But some of those would have required that the people sitting around the table then sacrifice their own involvement in print journalism in favor of other writers, editors and executives with different skills and perspectives, since one of the shortcomings of print journalism has been the reliance of many print journalists on closed-shop, old-media conceptions of what makes for good and bad content. Generally there are not many takers when you ask people to save the long-term future of their industry by sacrificing their own immediate future, even if twenty years down the road they’ll be unemployed anyway. Maybe some professionals in the industry could have made the leap to a new paradigm, if they were absolutely convinced that it was coming.

Could print journalists and publishers have found a way to steal a march on Google, Facebook, and most especially Craigslist in 1995 if they knew they were on the horizon, and in a way that would have saved their revenue streams? In advance of the technology and social practices that have made all of those companies successful? Despite the dot-com crash that was still to come?

Certainly journalists could have done something to avoid the self-inflicted wounds that have cost them so much collective credibility and so hampered their claims to be indispensable guardians of the public interest. No Jayson Blair or Stephen Glass, perhaps. The line of willing dupes and accomplices of the Bush Administration’s manipulations of evidence and information after 9/11 would have been much shorter. But considering that so many Americans were in that line as well, maybe that would simply have made journalists targets of a different kind of populist ire, with no change in their overall reputation.

So suppose some equally on-target warnings were dropped in front of academics today? That our revenue sources would change or dry up, that our status in the wider society would transform or diminish, that the way we worked or thought was out of touch in some novel way that would have novel consequences? What would we do? What could we do?

Posted in Academia | 7 Comments

Looking Backward

I’ve been fiddling with the syllabus for my Image of Africa class, which I am to teach this fall for the first time in a while.

No course in my repertoire has changed as much in my underlying assumptions about its purposes and rationale even while the materials I’ve assigned have been somewhat consistent from iteration to iteration. The first time I taught it, fresh out of graduate school, I came into the subject material with a somewhat doctrinaire understanding of the instrumental role of representations of Africa in the domination of African societies. Even then, I was rethinking that assumption, and teaching the class helped to spur that rethinking. Now, sixteen years later, my own perspective on the subject matter has flip-flopped to presumptive skepticism: I’m unsure of how and when representation is a necessary, let alone sufficient, condition of inequality, domination, or power, though I’m totally willing to credit that representation can cause or shape social action to distinctive ends. This feels like a wide-open set of questions to me now.

I’ve decided to structure the class from a series of contemporary images or tropes backwards into their historical development. Normally I’m uneasy about history courses which start from the present and move back in time, as this can have the effect of squashing all contingency, of making the present inevitable. In this case, I think it works well, because one of the central puzzles of studying these tropes is to understand how we recognize them and reproduce them even when we don’t know their historical referents any longer. This is history as the detective’s art, except that when we finally do get back to the scene of the crime and we know it’s Colonel Mustard and the noose that did the deed, it’s not necessarily clear what the crime actually was or whether it really matters any longer that it was committed.

I’m going to start the class where I’ve started it before, with a slide show of images of missionaries and pith-helmeted explorers in the cookpots of African (or generically black) cannibals. I’ve got a bunch of new images I’ve found in various places. This is a great example of the basic idea of the course: as you trace back, you see how the image disseminated outward from more specifically colonial and African referents into the whole of popular culture and eventually became a generalized trope.

I’m also going to look in the first session at a more puzzling example (which I’ve also used before in the class): Boss Nass from The Phantom Menace, whom a number of critics claimed to recognize as an African chief.

This recognition was part of a general critique of the use of stereotypes in the film. When you look carefully at how Nass was spotted as such, it comes down to several elements. One, the general “blackness” and minstrelsy of the Gungans like Jar Jar Binks (compared to the Charlie Chan Asianness of the Neimoidians). Two, the pidgin that they speak in. Three, the “African” look of Nass’ clothing. But think for a minute about how complex an assemblage that really is: minstrelsy is largely drawn out of American cultural history; evocations of pidgins as colonial language reference a wide range of historical experiences; and the clothing that viewers saw as “African” is a much more contemporaneous image. This doesn’t mean Nass isn’t a stereotype. Once the resemblance is pointed out, I see some callback to many images of avuncular African chiefs in mid-20th Century films like Africa Screams. (Right after this clip starts, for example.)

But the history that’s caught up in this one image is, in terms of process, incredibly complex and intricate. Surely George Lucas, whatever his childhood-violating sins might be, didn’t have all this history consciously or even unconsciously in mind. Even more to the point, if the referents caught up in a contemporaneous image are this intricate, what, if anything, is it actually doing with or to its audience?

—————

Right now, here are the modules I’m planning to do in the class over 14 weeks, with some sketches of material that we’ll look at, most of it in excerpts or short selections. In some cases, I intend to end with a major scholarly work on the trope, so that we don’t get that work as a controlling authority that dominates the initial encounter with the material but instead as a “further reading” that expands the history. Some of the films we certainly won’t see in their entirety, because I’m using many of them simply as typical genre representatives rather than as unique works which originated or powerfully shaped a genre.

Ideas and suggestions, especially mentions of brief or powerful scenes, images or materials that really fit these themes, are extremely welcome.

Introduction
Cannibal cookpots and Boss Nass
Binyavanga Wainaina, “How to Write About Africa”

Safaris, great white hunters (2 weeks)
Contemporary wildlife & nature programming
The Ghost and the Darkness
Hatari
The Naked Prey
Africa, Texas Style
Ernest Hemingway, “The Short Happy Life of Francis Macomber”
Theodore Roosevelt, African Game Trails
H. Rider Haggard, King Solomon’s Mines
Frederick Selous, A Hunter’s Wanderings in Africa
Edward Steinhart, Black Poachers, White Hunters

Africa as Hobbesian nightmare; war, genocide and atrocity (2 weeks)
District 9
Far Cry 2
Hotel Rwanda
Press coverage and other writings on Rwanda, Sierra Leone, Darfur, Congo
Tears of the Sun
The Wild Geese
Colonial documents on African violence and warfare
King Leopold’s Ghost
Heart of Darkness
Press coverage of the Anglo-Asante War (1873-74) and the Anglo-Zulu War (1879)

Africa as diasporic heritage and lost homeland (3 weeks)
Heritage tours in Ghana (Ebron, Holsey)
Oyotunji African Village
Henry Louis Gates, Wonders of the African World
Kevin Gaines, African-Americans in Ghana
Shaft in Africa
George Lamming, The Pleasures of Exile
Langston Hughes, The Big Sea
Ibrahim Sundiata, Brothers and Strangers
Pagan Kennedy, Black Livingstone
Edward Wilmot Blyden, selected work
James Campbell, Middle Passages

Africa as natural history museum exhibit (1 week)
“African Voices”, Smithsonian Museum of Natural History
“Hall of African Cultures” controversy; Arnoldi, “Reflections” essay
Trophy heads and body parts controversies, 1990s-2000s; Skotnes, “Civilized Off the Face of the Earth”
Art/Artifact exhibit
Robert Gordon, Picturing Bushmen
Coombes, Reinventing Africa

Africa as tribal, as icon of the primitive (1 week)
Avatar
“I Am African” ad campaign
Going Tribal, Discovery Channel, “Return to Africa”
Dover African Tribal Designs
eBay search: African + Tribal
The Gods Must Be Crazy
Sheena, Queen of the Jungle
Ace Ventura: When Nature Calls
Disneyland, the Jungle Cruise
Africa Screams
Darkest Africa

The witch doctor (1 week)
Diablo III, Witch Doctor
Dingaka
Ross Bagdasarian, “Witch Doctor”
White Witch Doctor
White doctor books: selection
Herge, Tintin au Congo
Witchcraft ordinances, colonial era
Peter Geschiere, The Modernity of Witchcraft
“Saving Africa’s Witch Children”

African rulers and power (2 weeks)
The Last King of Scotland
Congo (brief clips)
Mobutu imagery (1970s-1980s) and When We Were Kings
Press coverage of Idi Amin and Jean-Bedel Bokassa
Coming to America
Black Panther comics
Kwame Nkrumah iconography (1960s-1970s)
Sanders of the River
Shaka Zulu miniseries
Thomas Mofolo, Chaka
E.A. Ritter, Shaka Zulu
James Stuart archives
Carolyn Hamilton, Terrific Majesty

Final week
Ruth Mayer, Artificial Africas

Posted in Academia, Africa, Production of History | 22 Comments

A Funny Thing Happened on the Way to the Forum

A short while before I was supposed to get on a plane last week to fly to Chicago and participate in an event at Northwestern University, something I’d really been looking forward to, I started to have the sinking sense that the sudden, steadily worsening pain in my lower right back that I’d hoped was just a back spasm was in fact a bit more than that. With some trepidation, I set off for the ER at the local small hospital, hoping I’d be checked out and told that it was just a spasm, handed a painkiller and waved onto my 6 a.m. flight the following morning.

It didn’t quite work out like that. It ended up being the first time I’d ever been admitted to a hospital. I learned some interesting new things. Among them: if an overweight man in his forties walks in at 9 p.m. with severe localized pain in the lower right abdomen, he gets moved fast through the system. I actually waited to put in my name until a teenager who thought her wrist was broken (it turned out to be sprained) got her name in, figuring I wasn’t as urgent, but as soon as the desk nurse saw my description of what I’d come for, they whisked me into the back. It didn’t turn out to be what they expected, which was a kidney stone or appendicitis. Instead, it was mild acute diverticulitis, an inflammation of the intestines.

Don’t worry, everything’s cool now. Two days of IV antibiotics, a longer course of oral antibiotics and some advice about dietary changes (farewell to thee, o sesame seed…though I already forgot about that once, as I looked down in horror on a sesame seed roll that was already 3/4 on its way to my intestine).

Anyway, one of the things I kept thinking about during the experience was a familiar theme for me, which is the ongoing problem of the professions. Academia and medicine, it seems to me, share some similar problems. Academia’s issues I see from the professional’s side, medicine’s problems from the perspective of the clientele. The first perspective tends to put me in the position of an apologist, the second as accuser. Maybe between the two some kind of insight is possible, though when I add it all up, I’m left with the sense that many modern professions are simultaneously indispensable, a high-water mark of social progress, and hopelessly screwed up in ways that can’t really be fixed by outsiders or insiders.

If lying on a hospital bed at three in the morning waiting for some kind of solution to wend its way into your lower intestine so you can be run through a scanner while an IV bag is dumping fluids into your flesh via a needle in your hand had an upside, it was the charming doctor in the ER who turned out to be very interested in African history. (She asked: I didn’t volunteer.) It wasn’t just that this gave me a welcome distraction. It was more that this made me feel like a person rather than a slab of meat or a naughty child brought in for punishment and a stern talking-to. Other doctors have made me feel otherwise in the past, and this is one reason that I’m getting ready, with some resignation, to hunt for another primary care physician.

I know my own psychology well enough to know what kind of relationship I expect to have with a doctor, to know my own pattern of expectations and my own tendency to just avoid or evade professionals who violate those expectations. I want someone who treats me as something of a social peer while also being a professional who has skills and competencies very different than mine. Frankly, I want my pediatrician, who knew me really well but was also someone I trusted and allowed to cajole or criticize me as a teenager.

I can see this from the other side as well. Not all professors can be all things to all students. As I finish up my grading for this semester, I’m very conscious of the fact that there are students who don’t flourish under my laissez-faire policy of treating everyone in my classes as a presumptive grown-up, capable of deciding for themselves whether it’s worth investing time or effort in my class. Some students need a drill sergeant or a surrogate parent or a big brother or a boss. I can’t do that. In the same sense, I wouldn’t expect every doctor to be malleable enough to be the person that I need while also being the person that some other patient needs.

The problem in part is that it’s hard to figure out where the professional and personal character of doctors or professors ends and their institutional systems begin. Should I tell my primary care physician that I don’t like the peremptory on-the-clock office visits? That I don’t know whether to trust her scheduling me for tests or her prescription of medicines when she doesn’t really bother to explain to me what she’s thinking with either, or present me the alternatives? What if she tells me that’s just the way the system works at this point, that no one is going to give me anything other than that? I could sound like a student complaining about being in an introductory lecture-based course with four hundred other students at a state university where the professor is reading in a monotone from presentations prepared fifteen years ago and my only direct contact is with a bored teaching assistant. That’s the way the railroad runs. You can switch to a small liberal-arts college, but you have to get in and you have to have the money. Or maybe you’re an even touchier case: there are students who do spectacularly well only under very specific institutional regimes even at small liberal-arts colleges, or who can only connect with some very specific kind of pedagogy, and who knows where that is out there in the world?

As a client, I don’t even know any longer what a reasonable expectation about my medical care might be, or how fussy and particular a patient I really am. I don’t know whether my medical future primarily ought to be imagined as a case of enduring what I have to endure and avoiding the worst-case scenarios or whether the right set of professionals could deal with the puzzle that is me in some way that I can’t myself deal with. That’s what both therapeutic and educational professionals promise, after all: they will do something for you which you cannot by definition do for yourself. That’s what all the hubbub of assessment and outcomes-tracking is about: are the professions adding value? And are they adding value commensurate with the tolls (financial and otherwise) they impose on their clientele and their societies?

Posted in Academia, Domestic Life | 7 Comments

Africans and the Slave Trade

It’s been a very busy couple of weeks, as the last half of April so often is. Usually that leaves me with a mind like a blown-out tire for the week where everything calms down, and this year has been no exception. I’ve patched up the old cerebellum a bit now and I’m ready to resume blogging.

One of the discussions that happened while I was snowed under with work involved Henry Louis Gates Jr.’s New York Times op-ed about slavery and reparations. Gates argued that because Africans themselves were the principal slavers who fueled the Atlantic slave trade, the question of reparations is a permanently vexed one.

Africanist historians have been around this particular bend many times before, not just about the overall issue of African participation in the slave trade, but specifically about Gates’ interventions into that discussion. After his Wonders of the African World television series, there was a well-attended panel at the African Studies Association meeting that year which pilloried Gates for his many perceived slights to Africa and Africans.

The reaction this time among scholars has been a bit more muted (so far), perhaps because of the favorable attention to Gates among scholars in Black Studies following the events that led to the “beer summit”. Maybe it’s also because the argument for reparations has become more muted anyway in recent years, and because the fact of African participation in the slave trade is so firmly established for Africanists that it’s hard to muster much enthusiasm for a public debate about it.

That said, I do have a few things to add to the discussion as it has developed across listservs and blogs.

First, that we shouldn’t underestimate the extent to which the basic facts of the Atlantic slave trade in West and Central Africa between 1550 and 1850 are not at all known to the American public. In my survey course on West Africa in the era of the slave trade this semester, I’ve definitely had some students for whom the issue of African participation was a novel and upsetting revelation.

Second, some of the conventional strategies that both scholars and public intellectuals use to argue that we should just move along, nothing to see here, don’t entirely hold water. A couple of prominent examples:

a) “Of course we know that some corrupt African kings or leaders sold their own people. There was bad leadership then and there’s bad leadership now that preys on community; this is just more reason to put our trust in community rather than leaders.” This is a very reassuring political angle on the issue that flatters a lot of contemporary progressive and radical politics. Unfortunately it really doesn’t describe the totality of African participation in the slave trade. There are certainly examples of hierarchical, centralized states in West and Central Africa where rulers or court elites controlled the slave trade and expanded slave raiding largely out of self-interest. Dahomey is the most frequently cited example, and Kongo would be another.

The problem is that there are also a number of examples of organized slave raiding and trading that originated from social institutions that were more integrated into communities and less a case of a hierarchy above and outside of the everyday life of towns and villages. In some cases, they resembled merchant companies, in other cases they were built up out of age-grades, religious or spiritual societies or other social networks. On an even less-organized basis, it was not necessarily that uncommon for members of extended kin networks to sell more vulnerable or marginal members of their own families as the power and reach of the Atlantic slave trade grew in the late 1600s and 1700s. Or for members of one village to raid a neighboring village without any command from a king or paramount ruler of some kind.

That might invite an opposite distortion, of portraying West and Central Africa at the height of the slave trade as caught up in a Hobbesian war of all against all, and that largely wasn’t the case, either. (There were a few places where the social order broke down almost completely, as in the civil war that engulfed the Yoruba kingdom of Oyo.) Slave raiding and slave trading were socially organized, and they were extremely heterogeneous in their distribution. Some societies or communities didn’t engage in it at all or actively tried to disengage from or escape the Atlantic world, some societies largely engaged in defensive raiding, and others invested heavily in the Atlantic trade. Sometimes those variations had a lot to do with location, sometimes a lot to do with accidents, sometimes a lot to do with the choices and preferences of European buyers and the shifting politics of national and mercantile competition among Europeans, and sometimes it had to do with the choices that Africans themselves made, rulers and ruled alike.

All of which amounts to a typical scholarly gambit: “It was more complicated than that”. But in this case, the complications ought to defeat any simple attempt to isolate African participation to a convenient group of mustache-twirling villains just as they also defeat Gates’ somewhat bizarre notion that there was a unitary “Africa” which participated in and can be blamed for its part in the Atlantic slave trade.

b) Which raises common response #2: things were different back then. There was no “African people” and hence slave traders weren’t “selling their own people”. The meanings and implications of slavery within African societies were very different from slavery in the wider Atlantic world. In societies defined by the difference between kin and strangers, there wasn’t a concept of individual freedom for West and Central Africans to invoke–or betray. The moral, social and political framing of violence, embodiment, identity and so on were not the same as we imagine them today.

All of which strikes me as an extension of a crucially important point about early modern history in general. Namely, that it was not a mere prologue to the world of the 19th and 20th Centuries, and that we should be scrupulous about not reading the modern back into it. If you don’t study early modern history in its own terms, you more or less completely eliminate any element of contingency from modernity. This is why it’s such an important field of study in history departments, but also why it’s often hard to get students to understand how central it really is. Because to approach it correctly, you have to confound expectations that you’re simply tracing modernity to its roots or its infancy.

You don’t want to confound those expectations completely, of course, because there are important causal connections between the world of 1500 or 1650 and the world of 1750 or 1850. But this is perhaps the single most important area where the first task is to make the familiar strange before you allow people to go back to finding what they expected to find.

This, unfortunately, has some unsettling implications for the Atlantic slave trade. It means most importantly that we can’t just argue that African participants were operating within unfamiliar social contexts, that their subjectivities and identities were not what we expect them to be, that slavery meant something different to them in their world, and not perform something of the same kind of defamiliarization exercise on European and American actors involved in the early modern slave trade as well.

In many of the responses to Gates, there is an attempt to hold steady the moral and political culpability of European and American actors while arguing for the alien character of African societies in the same time period. Before the mid-1700s, I think that’s a hard balancing act to pull off. The entirety of the Atlantic world in the 1500s and 1600s is different in fundamental ways: violence, freedom, suffering, personhood and much else didn’t mean what they meant later. It’s not that sailors and captains and financiers running the slave ships and the slaving business were innocent, but that the terms under which we would convene a court of transhistorical judgment are vexed no matter who is in the dock.

After 1750, I think the moral, social and political underpinnings of the Atlantic slave trade increasingly tilt towards our own frameworks and outlook, but that also goes as much for West and Central African participants as it does for European and American ones. As concepts of freedom are born out of dialectical encounter with slavery, as resistance to slavery as a phenomenon grows, as the legal and political institutions we associate with the Enlightenment come into being, the context and meaning of slavery changes, but that potentially stretches well into Atlantic Africa as much as anywhere else. If you start to hold traders and bankers and sailors and overseers responsible because they had other choices, because there was a possibility for opposition, you have to start imagining that African participants also have responsibility.

——–

Now what you do with that imagination is a different question entirely, including whether or not you think some kind of reparations, however structured, are necessary or possible. Because at least one other complex dimension of the Atlantic slave trade is that the wealth it created accumulated very differently (or failed to accumulate) in West and Central African societies from how it accumulated in Europe and the Americas. And that, as far as the consequences of the trade, is perhaps the single most important issue of all. In that sense, African participation and Euro-American participation are completely different in their nature.

Posted in Africa, Production of History | 2 Comments

Literacy Quizzes (Again)

There is probably no point whatsoever to a critique of the Intercollegiate Studies Institute’s civics literacy quiz (via 11d), but hope springs eternal, I guess.

First off, my generic criticisms of these kinds of historical or social-science literacy quizzes. Usually, they’re served up to the public with the implication that younger people are more ignorant about this information than they used to be in the past, but rarely if ever is there enough long-term data to even hint at whether that’s true or not. It’s possible that Americans or other national publics knew even less than they do today. Given that this is possible, it’s also natural to question whether or not we actually need to know any of this information, what the consequences of ignorance might be. Which tends to be a point that many quiz-givers of this kind gloss over, resting as they do on a generalized presumption that knowledge of history, politics, civics and so on must somehow be essential.

Reading ISI’s 2008 quiz, I have a sense that they do have a vision of the outcomes, namely, that nestled among classic basic questions about American government are some “zingers” from which you can argue that some current policies exist only because a majority of the public is ignorant of the truth. For example, the question that correctly states that the phrase “a wall of separation between church and state” is found in Jefferson’s writings, not in the Constitution or other official documents. It’s pretty easy to see the jump from that question to a beloved contemporary conservative argument that church-state separation was not actually part of the Founders’ intention and that all they specified was that the federal government establish no official religion. (Another question on the quiz.) But surely then if we’re going to ask for literacy on this particular issue so that Americans can have an informed opinion, we’d want quiz questions about the history of Supreme Court interpretations of the establishment clause (especially the Fourteenth Amendment and incorporation), debates among the Founders themselves about the establishment clause, the connection to the free exercise clause, and so on. Or if you wanted a blandly neutral question, how about just a question about the language of the establishment clause? Even if you style yourself an “originalist” or “textualist”, you should have the honesty to concede that for much of the history of our courts and our society, Americans have favored other ways of interpreting our founding documents and ideas.

There are a few other questions that strike me as having this kind of double intent: technically true, but where the implication is misleading. There’s a question about the content of Martin Luther King’s “I Have a Dream” speech that’s clearly intended as a debunking “gotcha”: that King didn’t advocate any specific policy initiatives in the speech. Technically true, but considering that King himself made it very clear that he wasn’t just wishing abstractly that we could magically become racial brothers and instead argued that Americans needed to forcefully act to legally and politically secure racial equality, it’s at the least not a very good question, hardly the main civics lesson out of the civil rights movement as a whole.

Some of the economics questions on the quiz are also tendentious in this fashion. Say, that international trade most often leads to gains in a nation’s productivity. It’s not flatly untrue, but as a simple statement of fact it leaves a lot of crucial caveats and debates off to the side. It’s not in the same class as a question like, “What are the three branches of American government?”

If that’s the style of questioning, then there probably also need to be questions in a similar vein that figure prominently in arguments from the leftward side of the aisle. (I suspect the Susan B. Anthony and Roe v. Wade questions are prophylactically intended as such.) Or, you know, maybe just avoid this kind of question altogether in favor of strictly neutral ones.

It’s also interesting to look at which questions are most frequently answered correctly and incorrectly in the 2008 results. I think what I draw from some of those is not “Oh noes think of the children!!!!!” but “Lots of folks were wrong, but it’s kind of a trivial ‘gotcha’ question where I don’t think it matters much that the majority doesn’t get it right”.

For example, almost 3/4 of the respondents misidentify a phrase from the Gettysburg Address (I suspect most of those getting it wrong think it’s in the Preamble to the Constitution). It’s not unimportant that Lincoln said it when he said it: Garry Wills wrote a whole book arguing that the phrase and the speech were a bold realization of an implicit, unkept promise embedded inside the Constitution. But neither are contemporary Americans wrong to misremember the phrase as lying within the Declaration or the Constitution: that’s sort of Wills’ point, that Lincoln discovered the idea to be always already present.

Another: over 80% got the subject of the Lincoln-Douglas debates wrong. Again I suspect many identified that the debates were about slavery, rather than whether slavery would be extended to new states. And again, I guess I wonder at the significance of getting that wrong, because the moral question of slavery was an undercurrent to those debates, to the Civil War, and to its aftermath. Unless you’re one of those who want to maintain that it was all about Northern aggression and states’ rights. Which is, to say the least, a contentious interpretation, and hardly one that makes a person who holds to it “literate” in American history or civics. Quite the opposite.

This is precisely the kind of thing that pedagogy which favors interpretations, arguments, the complexity of things has tried, rightly, to move beyond: a sense that a concrete fact always outweighs a messier interpretative truth, or that the baseline fact is a necessary precondition of the interpretative questions. There’s nothing wrong with having a precise knowledge of the content of the Lincoln-Douglas debates, but I’m more concerned first that a student of American history and civics get a solid understanding of the historical role of slavery in the establishment and development of American government and society.

It all comes down to: what do people need to know, urgently, in order to participate meaningfully in American society without any presupposition of what they’ll do as participants? Then, secondarily, what ought they to know, what might affect their participation if they knew it? Then down the line somewhere, what would enrich or complicate their participation, and give them a more intricate sense of how the past is both different from and similar to the present, of the messy highways and byways by which certain common views and traditions have established themselves? I think very little of ISI’s quiz falls into the first or even second category unless you have a very strong fixed notion not just about the content of American civics but what the specific proper practice of it ought to be.

Posted in Academia, Cleaning Out the Augean Stables, Politics, Production of History | 5 Comments

The Work of Cultural Capital

This entry about the Ramey study on family time at 11d got me thinking. Laura, citing David Brooks and Tara Parker-Pope, observes that a shift towards parents spending much more time with their children doesn’t seem to have any downside.

I think so too. I’ve pointed out before that this is one positive way to think about the end of a world where children roamed freely on their own adventures through suburban wildernesses, that maybe we’re transitioning to a desirable middle-class world where families adventure together, where the world of children and adults is less culturally and socially separate than it once was.

On the other hand, I keep thinking that there’s more to it than the emotional satisfaction that some parents of my generation have found in the company of their children, and not just the conventional issue of whether we’re smothering our kids with too much control or attention.

I wonder if part of what’s happening with middle-class to upper middle-class families and time is also conditioned by the rising difficulty of reproducing social class in the United States.

I’m going to be somewhat simplistic here just to try and get the point across. Crudely speaking, you could argue that in the 1950s the middle distribution of income was not just far larger and the ends of the spectrum drawn in closer towards that middle, but that the American middle-class imagined that it had hit upon a fairly stable formula for its own reproduction. Namely, a relatively minimalist range of signpost practices defining middle-class respectability that could be passed on to the next generation along with expanded access to a high-quality education system that included professional training at its culmination. Put the two together and you had a system for social mobility that could be imagined both as egalitarian and meritocratic, accessible to many, securely reproducible, but not a guaranteed and accumulating legacy.

The cultural signposts were defined and then demolished within the span of a single generation: Jell-O, Levittown, Leave It to Beaver, the Brady Bunch went from being idealizations to hateful conformities to ironic ridiculousness fairly quickly from 1965 to 1985. Income equity went roughly the same way, at the same pace, and higher education, while still a passport, controlled entry to an increasingly murky and complex world of economic and social advancement.

That 1950s middle-class could split the world of children and adults as radically as it did for two reasons. First, because a working patriarch could actually hope to accumulate enough in his own life to leave an inheritance for children that would help ensure the reproduction of the status that he’d achieved, both indirectly through education and directly through property and money passed to the next generation. Second, because women in the home and social institutions like school could do the work of middle-class cultural reproduction in a relatively minimalist fashion, on a kind of assembly line. You didn’t have to worry too much about a child’s interior experience of schooling and childhood if the outer signs of respectability were successfully monitored and secured.

So what I wonder a bit is whether the insecurity of middle-class life and the uncertainties of reproducing it in the next generation are producing a much more intense focus on generating a flexible, responsive kind of cultural capital in the children of professionals. Jell-O, church attendance, and the pinewood derby for Cub Scouts don’t secure anything any longer. Nor in any simple sense does education by and of itself. So families draw together in part to cultivate the self, to create exposure to a wide range of stimulating experiences which are nevertheless selected for their potential for cultural capital creation. Music lessons, language lessons, access to computer and digital tools, constructivist toys and games, travel selected for enrichment potential rather than “empty” leisure, parentally-accompanied museum visits and so on. Schools do many of these activities as well, but many professional parents increasingly distrust the capacity of schools to properly enrich their children unless the school is somehow distinctively individualized in its approach to enrichment. Because, in part, the cultural capital that creates some sense of distinction in a new entrant to middle-class life is that which is intensely individualized.

This is an issue that Michele Lamont touches on in How Professors Think, but it’s a point that extends across most of the professions. The job candidate or aspiring professional or competitor for funding who stands out is often the person who appears the most individually distinctive while also locking down all the visible or apparent baseline benchmarks of credentialing and competency. That’s the person who gets tagged as having “quality of mind”. That’s what applicants to selective colleges try to accomplish as well, to assure admissions officers that they have excelled at all the standard expectations and that they are unique and special individuals. The unique and special part often comes straight from the kinds of cultural capital that a particular household has worked to cultivate in all members of the family, and that work involves drawing closer together, sharing experiences while also controlling or directing them with some vaguely productivist, self-improving ethos in mind.

Posted in Academia, Domestic Life | 6 Comments

Four Reasons Why The World Is Better Because of the Internet

Cash Gordon
Chatroulette in Ben Folds concert (via 11d)
Hark a Vagrant
Wikileaks
———

One of the things that drives me nuts about the stalwart defenders of old media and their closed-shop underpinnings is an unwillingness to concede that online media have conclusively demonstrated just how stale the air was in the old-media room, how much they excluded a huge range of imaginative and distinctive creators, just how narrow and socially particular the tastes of editors and publishers were, and just how many things were kept from publics that they should have seen or known.

Say you enjoy comic strips. Imagine that there’s no Internet. We’d have the same bad, dull, legacy-infested newspaper comic strips that we have today (and no Comics Curmudgeon to mock them). Maybe one or two of the people who’ve created webcomics since the 1990s would be in the papers, but most wouldn’t be. Maybe a few of the people who’ve created webcomics would be doing other kinds of sequential art or humor, but most wouldn’t be. Now pick the best 30 webcomics and stack them against the average newspaper comics page. It isn’t even a close comparison: the webcomics are not only vastly better than the newspaper strips, but in some cases, they’re creative in ways that don’t even have ready comparison to the best strips of the past. The digital world isn’t just an improvement, it’s a vast expansion of the creative space of this one genre.

Look at what happened with the Tories’ Cash Gordon website, in part because they put some people in charge who didn’t know what they were doing. That kind of crowd subversion strikes me as so much smarter and more robust than anything in the Situationist playbook of thirty years ago. Look at Ben Folds’ repurposing of an already clever reuse of Chatroulette: the speed and nimbleness and sheer inventiveness of it is like nothing I can remember from when I was a kid, except maybe improv theater of some kind–but here we all get to see it, reuse it, and see it again.

The Wikileaks video of an Iraq War shooting that’s being linked to and discussed around the world has its analogies in past leaks and disclosures, but these are less and less something that a small group of experts and editors make closed-room decisions about. It’s up to all of us to decide whether and how some information matters, and up to all of us to capture and circulate video and text and evidence, to make it harder to hide and conceal. That’s all to the good, because if there’s anything we’ve learned about post-Woodward & Bernstein investigative reporting, especially on the matter of Iraq, it’s that most editors and producers and journalists with lots of inside-the-Beltway connections can’t be trusted to make critical decisions for the rest of us about what we need to know and see. They had their chance to serve the sacred civic role that they so often attribute to themselves, and they blew it.

Concede how much some kind of shake-up in old-media hierarchies was desperately needed, and I’m happy to concede that the online world is also infested with trolls, that it often uglifies rather than beautifies public life, that some online obsessions and performances are trivial or stupid, that for every wonderful breath of unexpected creativity there are ten creators who would have done better to keep the world from seeing their work, and that the diffusion of expertise in favor of crowdsourcing can sometimes be really problematic. But the starting point is that however you add up that balance sheet, you’ve got to acknowledge that pre-digital media vastly underutilized the human potential for imagination, and that our possibilities have become so much richer and more varied.

Posted in Information Technology and Information Literacy, Popular Culture | Comments Off on Four Reasons Why The World Is Better Because of the Internet

What To Do When Unfogged Is Down

Regarding the Great Greenwald vs. Kerr vs. Farrell Crooked Timber dustup of April 1st, ok, I’ll bite. Rich Puchalsky snarks at me enough in the comments, after all. Besides, what the hell, Unfogged is down anyway, so I might as well.

I also don’t know which post of mine Rich is remembering, but I have to say he’s got a point about some lessons I’ve learned in seven years of blogging and other online writing before that. There’s really very little to be said for trying to carry on a conversation (online or otherwise) with people who have nothing but an instrumental view of conversation as a means to their own anti-pluralistic or illiberal ends, who concern-troll every debate in the hopes of getting someone to take the bait. There are a set of writers who work hard every day trying to create a framework where the only right answers can be some kind of dogma, who will never for one passing second acknowledge the legitimacy of evidence which contradicts their own pet doctrines, who are never even momentarily in any danger of being persuaded by any countervailing viewpoint. For these writers, all online discussion is a colossally elaborate manipulation. I spent too much time in developing this blog arguing for an indiscriminate openness to conversation. Pursuing conversation with the comprehensively dishonest is a fool’s errand, and I’ve sometimes been just such a fool.

That said, I still believe the following:

1) I’m unconvinced that alternative approaches to those same actors are any more effective at checking or limiting their influence. Mockery feels good, and maybe strengthens group affiliation among like-minded readers, but the main game for the worst participants in the public sphere is attention: it’s to grab the eyeballs. If they’re invulnerable to persuasion, they also tend to be invulnerable to satire, or feed off and benefit from either of those responses in different ways. Someone giving them enough respect to try and persuade them is legitimating; someone satirizing them is evidence of the inauthenticity and snobbery of the satirist. Cf. Sarah Palin. Uncompromising, brutal invective also doesn’t seem to me to do much besides arguably nurturing group loyalty among those who agree with the invective. There’s some point at which if you’re looking to effectively fight back against destructive or malicious actors, anything in the blogosphere is beside the point, and everyone who is talking or commenting there is wasting their time, no matter what they’re doing. I haven’t lost any of my irritation with people who anoint themselves with an activist halo simply because they write lots of invective in comments threads.

On the other other hand, invective and satire are much more interesting and entertaining to read than benumbed consensus-seeking. In that sense alone, they’re often worth the price of admission.

2) It’s possible, indeed likely, that someone who approaches most or all online conversation as combat will misperceive many possible conversations that could develop into something else, and misperceive many people who could be productive, generative participants in a conversation that contains healthy differences within it. Again, deep in that Crooked Timber thread, Rich Puchalsky makes the legitimate point that liberals (including me) have had a bad tendency to triangulate against the left and imply thus that only liberals are interested in pluralism. This isn’t the case, but I do think it’s the case that the more we perceive online discourse as battle (whatever the perspective from which we do so), the more likely it is that we become kids with hammers who see everything as nails. At some point, if some kind of pluralism is what we ultimately aim for, we have to try and practice it when we can. I think this takes being really sure that the person you’re talking to (or at) is purely malicious or worthless in their aims before you label them as such. On the other hand, there’s nothing wrong with strong disagreements, strongly worded, about matters of principle, even with people that you think are potentially reasonable or persuadable. I don’t think anyone should hesitate to strongly criticize Orin Kerr, for example, but I think Kerr isn’t a bad discursive actor with whom no productive conversation can be had.

3) The question of how to live with people who recognize no commonalities or shared obligations with you, who deliberately construct for themselves a world of practice and belief which is impermeable to any contradiction or challenge, who see everything that their social enemies say as permanently and perpetually inauthentic, isn’t resolved at all by any decision about how to approach online conversations. There are numerous bloggers and pundits who support Tea Party rhetoric in a way that strikes me as wholly instrumental, who use Tea Party adherents as social tools and their package of tropes and beliefs as blunt objects designed to hammer the frame of public debate into a congenial shape, to make some ideas unthinkable and some initiatives infeasible. That said, there’s still a real sociality out there beyond those pundits and think-tank scribes and hacks. You can treat Michelle Malkin or Erick Erickson as village idiots without a twinge of conscience, but sizeable social groups, whatever their beliefs or actions, however wrong they seem, are a different kettle of fish in a great many respects: in their causality, in their habitus, in their consequences. And a different kind of problem in the question of how to live with the existence of a social group who isn’t interested in living with your own existence, of what to do about that.

4) Intellectuals often seek out difficult social, political, cultural or aesthetic problems that have no easy or right answer, and relish conversations which stay in that zone. Yes, that’s partly about treating discourse as a game, partly an approach that intellectuals prefer because it pleases them and suits their cast of mind. Difficult problems discussed in complex terms: fun! Again, maybe frequently to the point of misperceiving existing conversations when there’s only one participant playing by those rules.

But this is also an empirical assertion, that the class of problems in the human world which fit this description vastly outnumbers the class of problems to which there are ready, simple, no-fail answers. This is about desiring, as a sign of human progress, that contemporary societies get to the tough problems and acknowledge them as such, which in an ideal world ought to involve vastly higher distributions of humility and openness. But again, if the players on the other side are acting in completely bad faith, this is a kind of unilateral disarmament. Messiness and ambiguity lose to crudity and Manichean simplifications every time.

Posted in Blogging, Politics | 1 Comment

Evaluation Across the Disciplines

One perspective that I’ve occasionally heard from colleagues that makes me grit my teeth, that I have very little patience for, is that it is so difficult to evaluate the quality of work in other disciplines or judge their comparative worth in a discussion about resources that it’s best not to even try. When someone with that perspective ends up in a process where judgment of other disciplines is required, they tend to approach it strictly as an exercise in horse-trading, and when they end up in a conversation about the stewardship of finite resources, they tend to argue for cuts (or gains) to be distributed evenly across the whole curriculum, with each discipline making its own autonomous decisions about what that actually entails. In practical terms, this is a bad way to run the business of a college or university. In intellectual terms, you can only hope that a person articulating this view doesn’t actually mean it, given how arbitrary the distinctions between many disciplines actually are.

I don’t encounter an explicit version of this argument very often. The last time I ran across it was at a planning meeting about three years ago. It pops up implicitly more often, sometimes in ways that strike me as unconscious or reactive, a retreat from the wearisome difficulties of thinking and deliberating far from familiar territory.

I think that retreat is understandable. Evaluating claims or proposals outside of your own specializations and disciplinary training is harder work. Moreover, not everyone carries out that work in the most responsible or fair-minded fashion. It’s inevitable that some of your judgment of other disciplines is tied up in your preference for your own, because your cast of mind has a lot to do with why you do the work that you do and not some other kind of work. The trick is to keep yourself from going down the rabbit hole too far, from becoming a gatekeeper or having a monomaniacal vision of what makes for legitimate scholarship.

Here are some basic best-practice principles I try to keep in mind in these kinds of deliberations.

1. Know your limits. Yes, I said that it’s bad to give up on judging work outside your own experience. That said, somewhere in the curriculum, there’s strong scholarship and teaching being done that you are simply incapable of judging on its intellectual merits. For me, it’s mathematics. I can only assess a research proposal from a mathematician on information external to the proposal itself (the quality of mind and accomplishments of the applicant, for example). At some point, you will have to rely on the sage judgments of other people that you trust. (Which, of course, means that every planning and assessment body needs disciplinary diversity.) It’s a bit easier in planning discussions: you may not know how to think directly about what it is that another department does, but you can often see how it fits into and relates to the rest of a curriculum. Still, even there, part of staying involved in the total picture is working to have an informed opinion about specific allocations of resources by departments. It would be very hard for me to have any meaningful thoughts on which fields a mathematics department needs most or which are most intellectually compelling.

2. Be slow to ask about applications. This is a tougher rule to follow, but it’s especially important in evaluating proposals or research. Outsiders to a discipline tend to have vulgar conceptions of how to apply another discipline’s scholarly research to the real world, overlooking all the complex intervening processes between intellectual work and its applications or uses. The person who is quick to ask “Well, what good is this? What can you do with it?” of someone from another disciplinary tradition is often a person who would never ask that question in a simple or crude fashion on home ground. Simplistic demands for real-world applicability are also a lousy way to drive planning decisions, for much the same reason. What that gets you is a mix of dullards, habitual overpromisers and snake-oil salesmen.

3. Familiarity can breed contempt. You’d think that the worst behavior is directed at disciplines which are the farthest from one’s own, but in fact, it’s often the disciplines which are most proximate which are the biggest targets of aggressive or unfair behavior. When you’re looking at a proposal or a claim for resources and you understand the subject matter, the methodology, the importance of the research or teaching, and it just isn’t the way that you would do it in your own disciplinary tradition, it’s very tempting to punish someone for not having had the good judgment to be in your own discipline. This is especially a problem for scholars operating in new disciplines or who identify as interdisciplinary, as they tend to be held accountable to the standards of rival disciplines who see that work as a threat to their own viability. Ideally, this is a tendency that everyone who wants to be a good actor in institutional decision-making will be aware of in themselves and guard against. Realistically, this is probably the area where the wary eye of institutional leadership is most needed. Siblings fight more than strangers, and need parental supervision from time to time.

4. Innovative in its own context can be tired in someone else’s. Favor the local context of a proposal or idea first, or run the risk of penalizing people who try to reach out. It took a while for this point to really sink into my own practice. Part of becoming literate in other people’s disciplinary landscape is learning about their risk-to-reward ratio. If your discipline is centrally built around qualitative fieldwork, it’s almost impossible not to feel impatient at a person in a discipline with little or no fieldwork tradition taking what look like baby steps in that direction. If you’re a social psychologist, a move towards the consideration of psychology in economics may not look very exciting. But as a general principle, I think you want to reward risk-takers and innovators across the curriculum, and innovation is primarily something that happens against the backdrop of specific disciplines, not the big stage of the entire institution.

5. Bloodlust for the wounded is dangerous. At the end of every meeting for assessing proposals I’ve ever been in, there comes a cathartic moment where one or two proposals that managed to irritate the hell out of almost everyone get briefly torn to shreds. That’s a good moment, on the whole: it confirms that there are strong shared values and strong common principles. What’s equally important about that moment is that in getting it off your chest, you can leave that opinion in the room. What happens in Vegas stays in Vegas. Everyone in an academic life will write bad proposals: proposals which are too early, or too late. Proposals that never could make sense, and proposals that could make sense, just not coming from their author. Proposals that will make sense the next time they are written. Sometimes even a really bad proposal needs someone to imagine it as it might be so that it doesn’t permanently attach itself to the reputation of an individual or a discipline. The same thing goes when you’re evaluating the place of disciplines and their claim to resources. Sometimes you have to imagine a discipline as it might someday be or as it is elsewhere, not as it concretely is embodied in the research and teaching of particular individuals. A discipline sometimes needs an advocate who can explain its value better than its actual practitioners can. It’s easy to look for the wounded antelope and finish it off, but it’s often not good stewardship of the long-term interests of an academic institution.

6. Generosity is a required institutional discipline. Believe me, I can be as mean as the next sonovabitch in my opinion of scholarship or disciplines or people when I’m in a conversation with trusted friends. Nobody’s obliged to be a genuine pollyanna. But if you’re in a planning meeting or doling out resources, that has to be put aside as much as possible.

Posted in Academia | 3 Comments

I Tribulated and All I Got Was This Dumb T-Shirt

Forgive me a brief moment of irreligious kidding. (I’m tempted to not kid and talk about ongoing revelations about the Catholic Church, but I doubt I could stay studiously cool on that topic.)

But something that’s occurred to me in recent weeks, thinking about the Rapture and the Left Behind books (which I teach in my History of the Future course) and so on.

What if the Rapture already happened, and only about three or four people disappeared, so few having met the standard? And thus this is the Tribulation already. I get that preterist Christians have thought this way for a while. In fact, given the complexity of Christian eschatology, I’m sure this is an old-hat proposition in some schools and not particularly funny. (I also have a sinking feeling that it’s probably a standard schtick in somebody’s comedy act already.) But it certainly is an amusing proposition up against the particular constituency of American religious conservatives who are certain that the Rapture is imminent in the near-term future and that they will be among those called in it.

Posted in Miscellany | 1 Comment