Not Even Wrong

I missed this story when it first appeared, but apparently Rush Limbaugh has been saying that Barack Obama’s father was actually an Arab from “an Arab part of Africa”. Look, why bother with real places at all, if you’re comfortable saying this sort of thing in public with millions of people listening? Just say that Obama’s father was a Calormene from Tashbaan and his mother was a Ferengi who ran a bar for Denebian slime devils.

But this does show you something about the persistence of culture. There is a kind of thermodynamics to narratives and rumors that achieve a certain degree of initial circulation: they can be created, but they’re almost impossible to destroy. What I think some village idiot on Limbaugh’s staff (or some other deranged partisan workshop) pulled out of the cultural substrate of the last two hundred years is a kind of mutant offspring of the “Hamitic myth” plus a hazy fragment or two of the history of Swahili society in East Africa.

The “Hamitic myth” was a proto-imperial view held by some European travellers and observers that African societies, particularly in East and Southern Africa, could be distinguished by whether they were original to Africa or composed of alien and more ‘evolved’ outsiders who came from the Middle East. The basic historical picture drawn by the myth was wrong, and its imagined racial hierarchy even more so. However, East African societies over the last two millennia were shaped a great deal by successive migrations of different linguistic and cultural groups and forms through the region. Some of those migrations passed through from west to east, or northwest to southeast, some from north to south, and some back in the other direction. Particularly at the northern end of the region, there were some connections to the historical world of the Middle East and the Indian Ocean. The Luo people (Obama’s father was Luo) speak a Nilotic language, which connects them in diffuse ways with other people spread through East and Northeast Africa. Also, the Swahili coast of East Africa (which is not where Luo-speakers come from), has long been shaped by cultural and economic interaction with the societies of western India, the Red Sea and the Persian Gulf.

Saying that this in any way connects Obama to the Middle East through descent is like saying that my heritage is partly French because the name Burke in Ireland came originally from Norman invaders, or that my heritage is Spanish because de Burgos or de Burca was a Hispano-Norman surname. You can’t slap contemporary national and ethnic labels down on histories where those labels make no sense at all. Actually, it’s not even that close. It’s like saying that because my great-grandfather came from Ireland, and Ireland was sometimes invaded by Vikings, and Vikings were connected to Germany, and the Mongols defeated the Teutonic Knights in 1241, and probably there was some interbreeding involved in all those connections, I am probably at least a bit Mongol.

Many white Americans like to imagine a loose, affectionate connection to one or several ethnic or national “original” identities connected to one or more immigrant ancestors. That often gets looser and more imaginary the further people get from the historical moment of immigration. So the Ireland that a group of Irish-American families I grew up with could imagine was built from Darby O’Gill and the Little People, lots of Chieftains albums, and occasional boozy donations to the IRA made in pubs or when someone passed a hat at a party. That’s ok: I honestly don’t think people mistook that for reality, or made strong claims based on the more or less harmless and romantic images involved. (Though obviously that trickle of money to the IRA had some real meaning in the world, if it actually ever got to them.) This is the same kind of historical register that encouraged people to account themselves 1/16th Cherokee, or after Roots, that engendered new rememberings of descent from stolen royalty in Africa for some African-Americans. This is often a good, creative use of history, a way to try and locate oneself and one’s family in time and space. In American society, it’s a subtle counterbalance to the relentless pressure to constantly reinvent oneself, to fit into changing places, changing needs.

But this kind of memory can become a rotten, decomposing foundation for self-understanding when it starts to believe too much in the fixity and stability of the past it has set its eye upon: when any “European-American” forgets (or never knew) how recent the making of “Europe” is, and how much movement of people, goods, ideas and culture in, out and through the borders of what we now think of as Europe has been involved over the long haul. The same goes for any European (or other) nationality that one wishes to claim as heritage. The pleasant (or even grim) heritage you may think about as your roots is almost certainly a far more recent and shallow thing than you know. But precisely because many Americans regard heritage in a somewhat romanticized, somewhat imaginary way, as a set of selections off an à la carte memory menu, this kind of gibberish coming from someone like Limbaugh may sound plausible in some sort of way.

Posted in Africa, Production of History | 4 Comments

Planned Contraction or Chaotic Retreat?

Even leaving aside the economic news of the last year, I’ve become convinced that all but perhaps four or five American universities with extraordinary wealth have come to the end of a long period of bountiful growth. I’ve muttered the occasional Cassandra-like warning for more than a decade now that we were collectively heading for an unseen cliff, and I think now the precipice is visible ahead.

Let me lay out the basics as I see them.

1) Tuition is a smaller part of the revenue base for many of the wealthier institutions than outsiders commonly imagine. Other income, most importantly endowment income, pays for a significant proportion of the per-student cost of selective higher education. Moreover, at many selective institutions, tuition is effectively charged on a sliding scale: basically, universities and colleges ask an entering student how big their parents’ wallet is, and set their fee from the answer.
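To make the sliding-scale point concrete, here is a minimal toy sketch in Python of how a need-based net price might work. All the numbers and the formula are invented for illustration; real need-analysis methodologies are far more elaborate and vary by institution.

```python
# Toy sketch of sliding-scale tuition: the fee actually charged is the
# sticker price capped by an ability-to-pay estimate derived from the
# "how big is your parents' wallet" answer. All numbers are hypothetical.

STICKER_PRICE = 50_000  # invented full annual charge

def expected_family_contribution(family_income: float) -> float:
    """Crude stand-in for a need-analysis formula: families contribute
    a fixed share of income above a protected floor."""
    protected_income = 40_000  # income shielded from the calculation
    return max(0.0, (family_income - protected_income) * 0.25)

def net_price(family_income: float) -> float:
    """Net price: never more than sticker, never less than zero."""
    return min(STICKER_PRICE, expected_family_contribution(family_income))

if __name__ == "__main__":
    for income in (30_000, 80_000, 150_000, 400_000):
        print(f"family income {income:>7,}: charged {net_price(income):>9,.0f}")
```

The cap at the top is the point made below: once few families earn enough to hit the sticker price, raising that price stops producing much new revenue.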

Still, few institutions could survive if they minimized or eliminated tuition. Steady increases in tuition above the rate of inflation in the 1970s and 1980s allowed many institutions to dramatically improve what were often shabby facilities, to improve what were then poor working conditions for faculty, and to offer a much fuller range of services. Since then, those increases have underwritten further growth along the same lines, though there are also important areas where most higher education has been strongly exposed to cost increases that were also well above inflation, such as libraries and information technology.

This era is over. Tuition cannot keep rising at this rate, at least at institutions which have or aspire to need-blind admissions. First, family incomes are not keeping pace with tuition increases. Rising income inequality means there is a steadily smaller group of families that can be expected to pay the full cost of higher education. Unless institutions were willing to slide out their fee scale even further, with a premium charge for the richest 1% of families that was double or triple the current highest charge, the base charge cannot keep going up above inflation at all. (I suppose you could argue that legacy admissions and capital campaigns and the like are a premium pricing tier of a sort, but the more overt that equivalence becomes, the more corrosive the likely consequences.)

This is leaving aside external political pressure over tuition increases, which I think is likely to increase regardless of who wins the Presidency this November.

2) Endowment income. Let’s just say I think it’s unwise to expect that any but the unusually well-invested endowments will see anything like the rate of return that many have seen over the last two decades. I worry that many will be hard pressed to hold on to the position they’re in right now if the overall investment environment gets any worse. This income pays for a lot of what colleges and universities have right now, but it’s not going to pay for another decade of growth comparable in scale and magnitude to the two decades previous.

3) Fund-raising. I just don’t think there’s the money out there for capital campaigns like the ones that most institutions have run in the last two decades. The people who made those campaigns work are tapped out, or they want to move on to other (arguably more deserving or needy) philanthropic targets. The current economic climate is the opposite of the irrationally exuberant eras of money accumulation that occasionally filled fund-raising targets. A lot of smaller annual donations are going to get harder for many professional families to make. Colleges and universities will still be able to make up some ground through fund-raising, but I don’t think it will compare to past efforts, let alone make up the difference in missing revenue from other sources.

4) Anything else. A fortunate few institutions may have unusual sources of revenue to tap (IPOs, intellectual property rights, etc.) but that’s not most of us.

==========

So, the party’s over, I think. However, I’m not hearing a lot of preparation for what higher education will look like if steady growth is over. Planning for minimal growth or even contraction in some cases might just require budgetary prudence and restraint, but I do think there’s a different mindset involved. It’s not just about questioning every area of current and future expenditure.

Here are some of the shifts in thinking needed.

1) Knowledge does not expand automatically like yeast dough. Progress is not adding disciplines, subjects, new areas of knowledge. Progress is knowing what we know better: making better use of what we know, communicating what we know more effectively, and not treating all forms and types of knowledge as if they were the same or required the same kinds of resources. Progress is discriminating between older disciplines or ways of knowing that need active stewardship and older disciplines or ways of knowing that can be replaced or transformed by new programs and projects. Progress is recognizing where some areas of the average curriculum would benefit from amalgamation and a movement towards generalism.

2) The provision of a full-service infrastructure by most residential universities and colleges needs a hard, skeptical reconsideration on an ongoing basis. Moreover, I do not think the presumption of such a reconsideration is necessarily towards the elimination of frills and creature comforts. It’s possible instead that there are some professional and administrative services that many universities carry out through their own staff that could be better and perhaps more minimally provided by external vendors, for example.

3) Where higher education is exposed to cost increases which are potentially within its control, universities and colleges need to band together comprehensively as buyers and dictate terms to their own advantage. There’s not much that can be done about energy costs or insurance: any big employer is exposed to those at the same baseline, and beyond that only to the extent of its own consumption. On the other hand, libraries and information services are areas of unique exposure. I’ve written a lot about these issues: there are reasons for academia to completely rethink its production and consumption of publication and knowledge that have nothing to do with cost. But it’s also insane to be exposed to escalating costs when we potentially have such massive collective leverage over the sellers.

4) All colleges and universities need to look hard at different areas of their curriculum in terms of the expenses involved, and ask whether some kinds of pedagogy or some subjects are worth the resources that they seemingly demand. At a lot of selective institutions, there is a political covenant that instructs both administration and faculty to act as if all areas of the curriculum are notionally equivalent in their right to resources, and as if the actuality of resource distribution can be safely ignored. (When you add professional schools at large research universities into the mix, this picture gets more complicated. Nobody acts as if the Wharton School at the University of Pennsylvania is notionally equivalent in its right to resources, and nobody forgets that Wharton has a very large resource pool of its very own. ‘Every tub on its own bottom’, as the saying goes.) In this resource-usage sense, the hostility that sometimes gets directed at the humanities is a bit crazy: most (though not all) of the humanities are a very cheap date. About the only time they get disproportionately costly is when the faculty-to-student ratio in any given humanistic discipline is markedly lower than the average at a given institution. In an environment where growth is not happening, no institution can afford to say that the relative cost-to-benefit of any given curricular enterprise is off the table.

5) My impression is that many colleges and universities have put some of the bounty of the past two decades into the maintenance of political amity within their own faculty and administration through paying for duplicate or multiple approaches to curricular and institutional decisions. Meaning, rather than choose to commit to one philosophy or approach, we often choose instead to fund competing or contradictory approaches in order to avoid the political turmoil involved in telling one faction or group, “No”. Maybe some of that is still worth doing, but in some cases, it simply may not be a sustainable approach.

6) Colleges and universities that have chosen to underwrite growth in some areas by steadily degrading the working conditions of adjunct and non-tenured instructional staff, while simultaneously relying on that staff to generate more and more of the teaching, are going to have to flatly stop doing things that way. One reason: in a far more difficult economic climate, I really think that prospective students will be far less indulgent of the degradation of educational quality that follows from that approach.
Another reason: some institutions have already gone as far as they possibly can go along this track, short of converting their entire faculty to short-term, badly-paid, chronically abused adjuncts. (That’s not far off from the picture at a few large universities.) Instructional staff are going to need far better working conditions, and that is going to have to come out of some other area of expenditure, whether it’s the salary budget for tenure-track faculty and administration, or from some other institutional budget.

7) I think the most important but subtle thing that has to happen is just that every stakeholder in academia is going to have to develop new mental habits, to stop assuming or believing that growth is the default. At least at selective institutions, I find that in everyday conversation about curricular questions, administrative choices, and so on, the assumption of growth or plenitude is deeply ingrained.

Posted in Academia | 16 Comments

Lipstick on a Financial Collapse

Last week, I meant to bring up a column by Megan McArdle. I don’t often make someone else’s blog writing my point of departure, but this post got under my skin enough that I keep coming back to it mentally. McArdle argued that political barbs directed at last week’s scandal over corruption in the Minerals Management Service within the Interior Department were functionally equivalent to the flap over Obama’s “lipstick on a pig” comment.

If I parse the entry carefully enough, McArdle seems to acknowledge that the first is “substantive” and the second not, but then seems to think that raising a political objection to the first is the same as the second: banal, business-as-usual, and unworthy of further discussion or reporting.

That’s an epic fail on the Sesame Street “one of these things is not like the other” exercise. That’s how we got into much of the mess we’re in right this very moment, on so many levels. This is the consequence of the sense that all government is equally and indifferently corrupt and non-functional, that none of what states do for good or ill turns on the competency or morality of any particular political leadership but is instead a generic and invariant result of the nature of bureaucratic states; that a report of corruption or administrative incompetence is like a report of the weather or the tides, an expected and everyday event with its own cycles.

This is another of the many things that observing Zimbabwean politics from the mid-1980s until now taught me. I’ve just written that social and political change below the surface of formal diplomatic events and agreements matters most. But Zimbabwe also taught me that the bad decisions of bad men and women can by themselves make all the difference in the world. Nothing was inevitable about Zimbabwe, and most of what it has become is squarely the responsibility of the top political and social elite that gained power over the 1980s. What happened was not “all states fail”. It was “some people failed, and dragged everyone else down to hell with them.”

Corruption is a pervasive political and economic reality across the world. It’s a basic part of the modern order, whether the place we’re thinking about is a small ex-industrial city in New England or a postcolonial African nation. But it has its tipping points, where it slides from being an entropic force that we struggle constantly against to being a dark tidal wave that roars over our heads and drowns us all. Those lines are crossed by the wrong people at the wrong time doing things more poorly, more venally, more destructively than others. And perhaps more, those lines are crossed by the indulgent and indifferent attitude that says, “Oh, the public interest was sold out for some momentary cheap thrills in an administration endlessly indulgent of such behavior? Dog bites man, you know, shocked there is gambling going on here and all that.” Corruption becomes omnipresent, pervasive, irresistible when it is ok to say and believe that we all do it, that everyone does it, that it’s a non-issue, that it’s as unimportant as a bit of meaningless hot air over trivial words.

There are some dark waves threatening our collective heads today. If they come crashing down, it will be in no small measure because bad people made bad decisions with their own immediate political and personal interests at heart, while some onlookers serenaded them with sweet fiddle music all the while.

Posted in Politics | 14 Comments

Chickens Not Counted

I don’t see any reason for enthusiasm about the signing of a Zimbabwean power-sharing agreement. Whether it will be at all meaningful not only remains to be seen, but depends very much on changes far below the visible surface of events.

When I was in graduate school, the students working on doctorates in history and anthropology who had an interest in Africa met once a year with a group of doctoral candidates studying African affairs who were working in a public policy institution affiliated with my university. We presented our work in progress to each other. It was an awkward meeting for both sides.

The problem for me then (and now) is that the policy scholars treated at least most issues and problems as amenable to policy. Not just a custom-designed, situational policy, in fact, but a policy recipe or formula that could be applied to a broad range of supposedly similar situations. For some problems, that was fine, either because the issue at hand was already circumscribed by the formalities of the interstate system or lay well within the formal infrastructure of the global economy. In other cases, while local histories and practices might affect how a policy would be received or implemented, the formula seemed sound enough to me.

But I remember one presentation very well, in which the student (with very evident support from one of the senior professors in his program) laid out a ten-step procedure for negotiating an end to civil conflicts in Africa. This was in the 1980s, mind you, so some of the most uncontrollable and violent insurgencies in contemporary Africa had not yet become prominent. This student was mostly addressing the situation in Angola and Mozambique, where external actors were playing a big role in fueling civil conflict. Still, the basic idea seemed wrong to me, because wars of that kind (maybe any kind) strike me as unique, and because the end of civil conflict rarely relies substantially on a well-designed accord or agreement. If such an agreement takes hold, it is because there are more complex political, economic or social transformations underway in response to war: one class of elites is losing its grip on resources or its ability to coordinate social action. Two social groups have decided they have more to gain through some kind of accord than through all-out war. A new generation of political leaders has mobilized a new political base, or has acquired a new vision through education or novel forms of socialization. Maybe a charismatic leader helps to sway elite opinion or coordinate mass action. In any event, none of that is created by an agreement: the agreement, if meaningful, is a visible sign of more complex transformations.

Do I see that in Zimbabwe right now? Not especially. Part of the problem is that the people I need to be able to see are invisible to me and even to most observers closer to the scene. Mugabe is not the issue. The issue is the upper ranks of various security services. The issue is the upper elite of the ruling party, and for that matter, the mid-level apparatchiks. What have they decided, if they’re conscious of having decided anything?

If they do not help to make the words in the agreement real, the agreement will not be anything but a potential trap for the opposition. By themselves, the opposition cannot make power-sharing or political reform a reality. Without the security apparatus ceding some power, without institutional elites conceding to or joining in reform, this is just one more case of ZANU-PF’s long-running tactic of neutralizing opposition by momentarily appearing to bring it inside the fold. All of the most adroit authoritarian regimes in postcolonial Africa have practiced this technique. (Mobutu, for example, was a master of it.)

The real signs of movement or change will appear in the next three months, if the agreement is meaningful rather than a brief intermission in a long, depressing drama. The official media will be telling. Not whether it praises the power-sharing deal: that would just be more of the same. What would be really telling is opening up the state-run newspaper and television station to a diversity of views, or to good reportage. Or privatizing the official media outright. Watch fiscal policy: if reform is really taking hold, the central bank will start acting like one. Watch how the police and military move and act: if they’re going to allow a political transformation, they’ll start their own processes of reform and begin to stay out of Zimbabwe’s public affairs. And so on.

If the only visible sign of the accord is that Morgan Tsvangirai has a new title and ostensibly speaks for the government on television from time to time, then nothing really happened and the agreement is just another of the many ten-point plans that mediators and policy experts (Africans or Westerners) have helped to draft as fig leaves to cover their own impotence.

Posted in Africa | 4 Comments

I’m Totally Past Representationalist Work Now

I’ve suggested sometimes that liberal arts professors should have to take a course from one of their colleagues every three or four years. Mostly I like that idea because it is a way to build in a commitment to generalism, to widening our horizons.

But I’m auditing my second course in studio arts now, and the other thing that taking a class well outside your existing knowledge and competencies reminds you of is how tough it is to sense that you’re a below-average student who simply doesn’t quite get it, and how tough it is to keep coming back and sticking with it. It’s especially tough when you can see that there are students who have an intuitive or highly developed sense for something you understand only dimly. It’s all the more difficult when you’re an old dog trying to learn new tricks, I’m sure, but I don’t think anybody likes that feeling of fumbling around in the dark. I think it’s especially frustrating when what’s involved is technical, when there are really some preliminary procedures that need to be done right in order to get to the layer where you are more in control of the results and more able to defend your choices as real choices instead of as mistakes that you’ve decided to relabel as purposeful.

I worry sometimes that here at Swarthmore and many similar institutions, we teach too much to the students who already get it, particularly when they get it the way that we ourselves get it. So the humbling reminder of what it’s like to be on the other side seems to me to be useful for refining our teaching as well as our knowledge.

Posted in Academia | 7 Comments

One of Ours to Hospital, One of Theirs to Morgue?

The usual fratricidal conversations between Democrats, liberals, and so on are now in full swing, as one faction argues that the right answer to Republican mudslinging is to answer every dirty, trivializing, nonsense charge with an equally dirty, trivializing response. The other faction argues that it’s got to be about ideas, that people have got to ignore the nonsense and punch through to what really matters, that Democrats can’t descend to that level or otherwise what’s the difference?

I know I’m usually seen as one of the latter people, but I really think I come at this from a perpendicular angle to that debate.

First, since I’m on a Nixonland jag lately, let me just point out that Perlstein gives an excellent working description of how Nixon throughout his career would bait the Democrats into “dirty” responses to him, and how well he used that to mobilize ressentiment. There are times when throwing mud back really doesn’t work rhetorically, or allows an opponent to play the martyr. Yes, some of that is about the way that the media amplifies one comment rather than the other. It’s like when some kid called you a filthy name in grade school, and then you responded in kind just as the teacher happened to come into earshot. You’re the one who gets into trouble. You can complain about the uneven amplification of slimy or dirty political remarks, but for the moment, the unevenness is a fact like the sky being blue. You just have to judge situationally when it is that you’re going to pick up street cred in the schoolyard by cussing right back and when it is that the teacher is going to hear you and not the other kid.

Second, these are not mutually exclusive options: you can advance a strong argument about substance and principles while dishing out some low blows with brass knuckles on as warranted.

Third, the real problem is that many Democratic politicians AND some of their most ardent supporters who persistently argue for hitting back below the belt are just really bad at that kind of rhetorical game. I have to laugh sometimes when I read some of the self-professed liberal and radical tough-guys making concrete suggestions about how to imitate Karl Rove.

Sometimes it’s some kind of tin-eared contention that more stridently dogmatic and ideological liberal rhetoric would arouse the populace and it would be general strikes and flowers all the way to November.

Other times it is something like arguing that when your candidate gets called “uppity” you should call the guy who said it a “racist”. What do you think he was trying to do with that comment in the first place? Calling him a racist is like putting a maraschino cherry on top of what he said. It confirms what he said, it advertises it. You’d be better off calling him a stupid doodyhead.

So maybe I’ll sound just as tin-eared, but here’s an example of where I think there is an appropriate below-the-belt response. When the right-wing talking point for the day is that Obama voted for sex education for kindergarten students, and it turns out that what he advocated was a very simple, non-explicit program to teach kids how to avoid unwanted contact from strangers and adults, more or less the same program used by the notoriously radical Cub Scouts, you don’t respond by trying to refute the original charge, you don’t go on the defensive. You respond by saying, “The Republican Party is trying to protect child molesters and make it easier for pedophiles to abuse your children”. You cite some examples, like the former mayor of Spokane, Jim West. You connect that to other Republican sexual scandals (there are so many to choose from) to suggest that the Republicans harbor all sorts of secretive perverts and then try to shield them from the consequences.

Yes, that’s a distraction from what really matters in this or any other election. Yes, it’s a nonsense debate and the retaliatory charge is as wildly distorted as the original attack. But if you believe in fighting fire with fire, that’s what’s involved. It isn’t just about the will to retaliate, it’s about the skill to play that game the way it’s played, and having an ear for what works.

Yes, I don’t do that kind of thing because I don’t think that’s what intellectuals or scholars should do. It’s not our job, and most of us would be bad at it anyway. It’s a violation of our version of the Hippocratic Oath. But I’m fine with other folks returning fire in that fashion if that’s the way this or any other political contest is going to be waged. Just do it right, and hope that someday we can have an armistice and get back to talking about what matters.

Posted in Politics | 18 Comments

From the Gut

If you like, you can read my long intellectualized response to the struggle over intellectuals and culture below.

I also have a much more visceral, personal response to the kind of anti-intellectual populism that’s been more visibly present in American life over the last decade. This is the “gut reaction” that Robert Zimmerman talks about in the comments on my other thread. Intellectually, I understand why educated professionals have come to be regarded as a snobbish power elite. (Margaret Soltan sketches out quite a few of the reasons at her blog today.)

Personally, emotionally, there’s something that doesn’t add up, about how I got to where I am today, about the person I was and became.

By the time I was in third grade, back in the early 1970s, I had been a voracious reader for some time. I particularly liked natural history, biology, and science on one hand and fantasy and science fiction on the other. My sister was similarly voracious: the joke in our household was that rounding us up for dinner was a real chore because we were likely to be so deep into reading that we didn’t hear anyone calling.

I was completely innocent about what this habit meant in the larger world around me. So in third grade, if we had a lesson about hermit crabs and I happened to know the scientific name of several different species, the details of their life cycle, the ecology of intertidal zones, that was all good, as far as I could tell.

In fourth grade, that got my face shoved into the dirt or into fences, it got me kicked and spat upon, it got a ring of girls chanting “you’re a scientific martian” at me during recess. Rinse and repeat for the next four grades or so, welcome to the geek subculture. By high school, things changed a bit, the social hierarchies spread out and complicated somewhat, the recognition of where life was tending started to settle in. But in the years where I came home and cried most days after getting hit or bullied, I looked around at the world for some clues about my situation. I don’t think I had to look very hard for evidence that in mainstream American culture kids–and adults–who knew too much were mocked, marginalized, represented as effete, useless and weak.

Yes, this was also the high-water mark of the authority of technocrats: see my other entry for more subtle histories. But the air around me felt poisoned. That’s what so much fantasy and science fiction, especially for children, fed upon, in fact, and why I grew even more attracted to it over time: its protagonists experienced grievance and marginality and usually had special compensatory powers and experiences conferred upon them as a result. But it wasn’t just speculative fiction. When I got to junior high, I started reading more conventional literary work, particularly short stories, and that felt just as much like an induction into a kind of secret garden, a hidden and despised fellowship of readers (adult and adolescent) who enjoyed literature or found history thrilling rather than disparaging them to fit into the polite company of their peers.

Now since the 1970s, this sensibility, these themes, have moved from being a marginal literature to being the central engine of much American popular culture. How many stories and films have we seen about the marginalized smart and sensitive child who becomes a key player in the workings of destiny because they’re smart and sensitive? Hermione Granger and Harry Potter are only the latest in a long line of characters fitting this mold. The geek has become a heroic economic and cultural figure in American society, and a major driver of both cultural production and cultural consumption.

And yet here we are: not only listening to mockery and lashings of eggheads and intellectuals, but watching eggheads and intellectuals sometimes apologize for being that way about as abjectly as I ever pleaded not to be hit one more time.

Time opens up perspectives. It’s possible to realize, many years later, that what you intended to say in all innocence deeply wounded someone else because of their insecurities, their own baggage. One older male relative of mine who had done military service was once talking to some of the kids about it, and about how the enlisted weren’t treated all that well. I was about ten, I think, and I commented that it seemed like soldiers had always been treated badly by generals, all the way back to Sargon the Great. It was like I’d slapped him, and of course, in some sense I had. But it was all I had to bring to the conversation: I read a lot, I knew a lot about military history. I was trying to give a gift, but from his perspective, it was just a smelly little turd. He was telling me what he knew from life; I was saying what I knew from books, in a ten-year-old’s way. I honestly had no sense that I was thinking I was better than him, or that my books told me more than his life told him.

You know, I want to say: it was ok. It was ok that I knew that. It was a helpful thing to know that. It helped me to understand what he was saying. If he had been a more open kind of person himself, he could have done something good with that comment, used it himself. Rolled the ball of the conversation along rather than shut it down. Still, there’s a fundamental asymmetry. I could take what he said and add it to my knowledge, make use of it. He couldn’t take what I said unless he followed me into formal knowledge, or trusted me so much that what I said was in the books was as good as truth. (Not wise if you’re talking to a ten-year-old.)

Who is most sinned against in that kind of moment?

It’s hard for me to pin the Scarlet Letter E for Egghead to my chest and beg apology for knowing things, or reading literature, or liking the heirloom tomatoes I grow in my backyard, or any of the things that compose my professional and personal being. It’s hard for me to see myself as some growling, powerful elite who daily intrudes upon the private lives of a humble family of church-goers in the heartland and forces them to watch pornography while I turn their children into Marxoislamicist transsexuals.

It’s not as if getting your face pushed into fences ever quite comes to an end, either. I was at a party a few years back where one guy, upon hearing I was a professor, immediately wanted to make sure I knew how to throw a football and put me through my paces. Yeah, I found that basically gentle and amusing, but it’s not as if I then got a chance to find out what he thought of Foucault, if you know what I mean.

So emotionally, I just can’t quite get my head around the idea that somewhere along the way, I magically became the swan and now it’s other people who suffer uglyducklinghood.

Posted in Academia, Domestic Life | 19 Comments

Horn of Africa Redux

Matthew Yglesias has a polite “I told you so” up regarding the current situation in Somalia and Ethiopia.

I’ll have one of what he’s having, bartender.

Posted in Africa | Comments Off on Horn of Africa Redux

The Why of Culture War

One of the arguments I understand Rick Perlstein to be making in Nixonland is that American political life has been increasingly shaped by a public culture war since the 1960s because that was the distinctive political response crafted by Nixon (and to a lesser extent Reagan) to the social turmoil of the 1960s. Culture war, in this view, is another way to describe Nixon’s gathering of motley and contradictory “Orthogonian” grievances together under one banner, united only by their feelings of ressentiment at their perceived loss of social and cultural capital during the 1960s.

As an answer to the question posed by Laura at 11D, “Why culture war?”, this is a good historical beginning. It only gets us so far, though. One of the odd things about Nixonland is that Perlstein doesn’t attend to the social underpinnings of the political history he is recounting as he did in his previous book on Barry Goldwater. If I follow his argument correctly, perhaps that’s because Perlstein believes that Nixonland politics didn’t mobilize any coherent social groups, just a patchwork assembly of all social fractions who perceived themselves to be scorned or excluded from institutional, local or national life, who felt intruded upon by people and interests that they didn’t regard as legitimately possessing a right to intrude.

I think there are a few other things to add if we want to answer, “Why culture war?”. Perlstein deals in passing with the counterculture and the New Left as active agents in this history, often jabbing politely but unmistakably at their hubris, as well as highlighting the extent to which their social class (either already held as part of their upbringing, or aspired to as a product of their education) was a provocation to the social identity of police, industrial workers, and so on.

One outgrowth of left-identified cultural and identity politics that germinated in the late 1960s and 1970s was a loosely Gramscian assumption about both the means and ends of political struggle: that the transformation of institutional and cultural life was the precondition of a successful transformation of political and social life, and that radicalized, emancipatory institutions would be the sign that such a success was underway. Contemporary conservative critics tend to vastly overstate the strength and distribution of this perspective during the 1980s among American intellectuals, artists, academics, and so on, but I think it’s fair to say that a very loose, undertheorized version of this critique had a lot of influence at that point.

So this begins to explain the situation of culture war, and responds to the following complaint by Rotwang at TPM Cafe:

The oligarchy’s message to the masses is the populism of fools — mobilizing support on the basis of hatred of the meritocratic elite. The real elite includes the Supreme Court, the Congress (in the 90s), corporate leaders, the Administration — overwhelmingly Republican. The fake elite are journalists, television anchors, Hollywood stars, professors, and rock musicians. A genuine populist cannot be a right-winger, since the real elite is itself right-wing.

Rotwang is baffled: why attack the fake elite? It is the real elite that does you harm, that rules over you. The first thing we need to remember is that this is not entirely true, that the ressentiment described by Perlstein is also a response to actual intrusions. The real elite’s economic and political power is remote and obscure within the practice of everyday life. What Rotwang calls the fake elite are far more present and visible, and when they intrude or disrespect what people take to be valuable and precious in their own habitus, the threat they seem to pose is far more immediate and visceral. When some in the fake elite saw those intrusions as instrumental, deliberate, programmatic, as politics by other means, they incidentally poured a massive amount of fuel on that ressentiment and put a blowtorch to it.

It doesn’t matter that whatever people take to be a settled way of life or worldview is not sui generis, but derived also from a history of institutional and political life. What matters is, as Gramsci himself noted, what is taken to be truth or common sense, and what is taken to be an unnatural challenge to that truth. In response to Thatcherism, Stuart Hall began to think some time ago about the fact that such common sense may have some sense to it, that it is not just arbitrary or a side effect of some perfectly composed hegemonic program. Take it a step further, towards Edmund Burke: various forms of situated common sense are a reasoned, organic consequence of the slow accumulation and layering of historical sediment, and that applies to everyone, whether they are a radical performance artist living in Soho or a small-town evangelical who runs a hardware store in Kansas. Direct the action of “politics” at the violent excavation of the foundation under anyone’s feet, and you should scarcely expect them to ignore you in favor of distant if awesomely powerful forces that intervene and circumscribe everything else about their lives. You, the “fake elite”, are right there with a spade and pickaxe.

Another discussion last week, this one at Obsidian Wings, touched on another dimension of the problem. Dr. Ngo worries about the anti-intellectualism of Republican populism, about the hostility to competence and expertise. This is an old theme for me at this site as well. But I think Dr. Ngo overlooks an important historical underpinning of that anti-intellectualism, which is that at least some of it is a completely reasonable response to the real actions of intellectuals and experts within post-1945 technocracies, and as such, isn’t just an attitude limited to the United States. In fact, I’d suggest that this is one of the key reasons why liberalism and the modern bureaucratic state have suffered from a persistent malaise almost everywhere since the 1980s, why they inspire so little loyalty or dedication from most national populations, and why many intellectuals on the right and the left fret so persistently about trying to imagine a way out of the belly of the whale and yet resignedly accept that no such way can be clearly described or imagined.

In the 1950s, the high modernist future was entrusted to experts, scientists, and technocrats, who were understood as acting alongside and outside of the normal circulations of social life. Watch all sorts of basically benign presentations of science and expert authority from 1950s pop culture. The expert was an otherworldly portal from which a new future would relentlessly flow: flying cars! no more disease! poverty vanquished! amazing new cities that YOU will live in! atomic trash disposal! racial integration handled without protests and unpleasantness! religious faith made obsolete! None of this was really up for debate, none of it was a decision: it was a teleological chart, an inevitable consequence of expert knowledge. Even when, whether schlockily or seriously, expertise was represented as “mad science”, the answer to it was usually good science. Got Dr. Doom? Get Reed Richards.

What produced widespread alienation and distrust of technocratic solutions? First, that many of those which were implemented went badly awry or wasted enormous resources to no good end. Second, that at least some expertise was uncloaked over time as being nothing more than the Wizard of Oz in his booth with his levers, that strong claims to resources and social power were being made by humbugs of various kinds. Third, the challenge to the technocrats embodied by Jane Jacobs’ work gained a lot of traction. Jacobs can almost be seen as the “good Nixon”, if you follow Perlstein: someone who gathered up the resentments produced by people whose organically functional lives had been badly intruded upon by technocratic policy and suggested that good design, good policy, good stewardship could arise out of the way people organically lived rather than in contradiction to the world as it was.

Much as I share Rotwang’s and Dr. Ngo’s frustrations that the “fake elite” is so persistently targeted, that education is seen as a liability, that experts and intellectuals have become the dog that you kick and abuse while still relying upon him to guard your house, it is not as if culture war in this sense comes from nowhere, or has no underlying sense to it. Much of it is an entirely understandable and justified response to history as Americans (and indeed the world) have lived it since 1945.

There’s another related thing that occurs to me about culture war, resentment and Nixonland politics. I do think we can say a bit more about the underlying social architecture of those politics in one key respect. It’s true that many different people under many different circumstances respond to a call-out to Orthogonian resentment. If you’re a hunter, you get tired of being shat on by some New York celebrity, even if that person doesn’t have one one-millionth of the political and economic impact on your life (or even your ability to hunt) that the people making fiscal policy in Washington have. If you’re a comic-book nerd, you get tired of people making fun of comic-books. Hit the right notes, and almost any of us can be called out because we feel marginalized, silenced, scorned and yet suddenly there is One of Us at the center of public attention and we identify with them. There I am! There is that Famous Person! They are mistreating that Famous Person just the way that I am mistreated. That’s generic Orthogonian sentiment, all around us all the time.

At the same time, though, we shouldn’t forget the social identity of the original Orthogonian, Richard Nixon himself, because I think that’s a key to a lot of culture-war politics now. Who fights most ardently in the culture wars? Everyone involved in the public waging of culture wars invokes a silent majority, a Them, some masses, the people, the ordinary folk, who allegedly feel trespassed upon and are about to rise up. Only the rising up part never really happens outside of a shitload of trolling in long message threads, or is only one part of very complicated composite decisions that people make in the voting booth. Folks still send their kids to the colleges that are allegedly swarming with leftists, they still watch the TV dominated by bias, they go see the movies that have such filthy images in them, they listen to the music by those bad bad people. The culture is more consumed, more popular, more omnipresent for all that it is also supposedly so alienating and made with such conspiratorial intent.

So again, who are the actual culture warriors? To a very significant extent, this is not a war of elites versus masses, it is an intramural struggle between closely related social fractions whose professional and lived worlds overlap and rub up against one another. Professors versus middle-rank white collar professionals. Commercial illustrators versus chic gallery artists and their patrons. This group of thirty-something think-tank fellows versus that group of thirty-something think-tank fellows. Students from Harvard Law looking for Department of Justice positions versus students from Regent University looking for Department of Justice positions. Mothers who take time off from competitive career tracks to rear young children versus mothers who use nannies and round-the-clock day care. The fiercest and nastiest social and cultural struggles in any society are often between people who are closely proximate and thus are struggling for the same objective rather than between people who are very distant from each other.

This is another reason why there is so much verbiage and culture-war hot air directed at what Rotwang calls the “fake elite”. Many of the educated elite who are the foot soldiers of the contemporary culture wars aren’t in the money elite, and aren’t going to be in the money elite, whether they’re professors or bloggers or civic activists or church leaders or community organizers. This is not to say that they’re poor, but that the economic and institutional world they operate within has at best a thin overlap with the world of CEOs and investment bankers and senior law partners and property developers and old-money wealth and so on. On the other hand, there may be considerable cultural capital which resides more with the “fake elite”, capital that the money elite either respects or resents.

This past summer, when I happened by chance to be at a political fund-raiser held at a very wealthy person’s house (I was there as volunteer kitchen staff rather than as a donor), I was musing about how, on one hand, I will never earn in my entire lifetime of work more than a small proportion of the value of the home I was in, and yet on the other hand, I was culturally and socially competent to evaluate the architecture and art on display, and to participate in conversation.

The difference between my capital and the capital of the people who owned that home as far as the wider society is concerned is that my capital may paradoxically be harder to dream of acquiring. I can imagine that the wealth required to own that home could be mine through pure serendipity. I could win the lottery or the World Series of Poker. I could invent something. I could start a small business that catches on and makes me rich. A Hollywood director could decide that I was just the right person for a movie he was going to make. I could put a little film onto YouTube, have it go viral, and suddenly get a big-money contract to make a film. I could write a novel and it could become the next Harry Potter sensation. Fate could catapult me from where I am to that money elite: all that you need as the price of entry to that elite is the money. Or at least so it often seems to Americans. So it isn’t just that whatever social and political power they have is in some sense cloaked or remote, it is that we all can imagine, regardless of our situation, that some miracle could intervene and we could be there too.

But you can’t imagine a lucky chance that makes you part of the professional and cultural elite. You can’t get lucky and become a doctor if you’re a certified EMT. You’ll have to do the time. You can’t get lucky and become a tenured professor at Harvard if you’re a well-loved, inspiring mid-career professor at a community college. You’ll have to write four books and schmooze endlessly at conferences and claw your way into prominence. You can’t get lucky and become a lawyer when you have an autodidact’s knowledge of the legal system. You can be the best teacher in a K-12 school and be paid less than the weakest hack who went and got a master’s degree.

So in the grand scheme of things in American society, to aspire to the professional elite means doing all the things required to gain the certification or license to be part of it. Otherwise you are left being someone whose professional skill and commitment you personally feel are better than those of the people who have the certification and who gain higher rewards simply for having it. Serendipity is a negative thing: it is the jobs you will be denied because the system is too chummy or requires connections you don’t have, the white-collar job where the person who has the right parents or the right identity or the social fluency gets rewarded while you stay in a dead-end lower managerial job. The professional elite is ruled by fate and inheritance, by conformity to institutional structure, not the dream of suddenly breaking out into boundless autonomy or being blessed by happy chance.

At least some culture war in American society since 1970 is fought within those cramped terrains, across and along minute differences in the status hierarchies of professional and managerial worlds. That was Richard Nixon. A lawyer, just the kind of lawyer who doesn’t have preapproved, smoothed access to cultural and social capital. That’s the core of the Orthogonian impulse that Perlstein describes.

========

Here’s a final thought that is much on my mind lately. So what if there is culture war in America, whether at the widest scale of American society, or in this narrower, intramural sense? A lot of us complain of it as if it detracts from a real politics, from decisions and issues that we’re meant to be tackling. Some of us are indifferent to struggles over values because we aspire to be indifferent to values. Not without values, but believing that everyone is free to have their own personal values and ethics and that contention over divergent values is best cordoned off into private and civic realms. The public and political, we hold, is for other kinds of decisions and debates, and the more effort that goes into culture war, the less space we have for what really matters.

Maybe that’s true, but it’s an interested perspective. It is, in its own way, a culture war argument. What else might be bad about culture war? I suppose for some, there is a sense that it detracts from national or social cohesion in a time where we perceive that to be all the more necessary. Perhaps. I don’t think there is a lot of evidence that national cohesion is actually a necessary precondition of waging war or responding to economic crisis or dealing with social upheaval.

So why all the fretting? Following the above analysis, it could just be that people who stand atop certain narrowly composed professional and cultural hierarchies are fretting about the possibility of losing position, that it’s a self-interested concern. I think there’s a bit of that driving the discussion. But for me, it’s also a sense that of all the kinds of social struggle you could have within a national or regional community, culture war of this kind is the most likely to run away from all the participants, to take on a life of its own. Rotwang writes that American anti-intellectualism leaves an exemption for the neurosurgeon, the engineer, the indispensable expert. If there is a historically evident danger to culture war, it is that it is hard to keep it nothing more than a Punch-and-Judy pantomime, hard to keep it confined to a narrowly intramural struggle within specific professional or social hierarchies. There are pathways out of Nixonland that go into very dark, dangerous places that no one wants to traverse.

Posted in Academia, Blogging, Politics | 6 Comments

Spore Needs More (and Less)

Will Wright’s The Sims is the best-selling digital game of all time, and probably one of the games most disliked by people who play a lot of digital games.

Wright’s new game Spore seems to be producing a very similarly divided reaction, though time will tell whether it achieves anything like the commercial popularity of The Sims. At least some of the negative reaction to Spore from gamers has nothing to do with the design and everything to do with the digital rights management that Electronic Arts has chosen to impose upon the product. I have to agree that Spore has bad DRM, the kind that unilaterally violates established marketplace practice. Spore can only be installed three times before you have to seek special permission for additional installations. Some computer users can run through three installations in six months if they need to reformat their operating system, upgrade their machines several times, or experience a bug with the software itself. (Or in this case, with EA’s terrible Download Manager, which has caused quite a few problems for users.)
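The scheme being objected to here reduces to a counter with a hard cap on an activation server. Here is a toy sketch of that logic in Python, purely hypothetical and not EA’s actual implementation; every name in it is invented for illustration.

```python
# Toy illustration of the install-activation cap described above: three
# activations per license key, after which the user must petition the
# vendor. Hypothetical sketch; this is not EA's actual DRM code.

class ActivationServer:
    MAX_ACTIVATIONS = 3  # the limit the post objects to

    def __init__(self) -> None:
        self._activations: dict[str, int] = {}  # license key -> uses

    def activate(self, license_key: str) -> bool:
        used = self._activations.get(license_key, 0)
        if used >= self.MAX_ACTIVATIONS:
            # An OS reformat, a hardware upgrade, or a buggy installer
            # each burn an activation; past three, you're stuck asking.
            return False
        self._activations[license_key] = used + 1
        return True

server = ActivationServer()
key = "SPORE-EXAMPLE-KEY"
print([server.activate(key) for _ in range(4)])  # [True, True, True, False]
```

The point of the sketch is how little the cap has to do with piracy as such: the counter cannot tell a reformatted hard drive from a fourth household computer, which is exactly why it breaks established marketplace practice.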

What of Spore itself, however? What a great many gamers have said about the product is that it is a great software toy with design tools that approach genius, but a weak or indifferent game. I’m no different in my assessment. In fact, I think that, all in all, it’s a worse game than The Sims in terms of its underlying mechanics. Even as a toy, it has some issues.

Spore is divided into five stages, each with a different underlying game structure. These are short and superficial games: as many reviewers have noted, they are in effect minigames you have to play to unlock five different kinds of editors that allow you to customize your creatures.

Wright is not the first person to dream of creating a kind of “total game” that moves between fundamentally different game-mechanical structures. A lot of blue-sky design discussions I’ve read over the years have touched on similar ambitions: say, a game where you begin as an individual soldier in a first-person shooter, graduate to being a squad commander in an RTS, and then become a general playing a strategic turn-based game. It sounds thrilling until you think about what that actually means in design terms: that for one product, you’ll need three full games, each of them a compelling experience, and you’ll need to somehow make all of them interconnect and accumulate. Otherwise, why bundle them together, if what you did in the first game doesn’t affect how the second game is played, and both in turn don’t structure the gameplay of the last?
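For what it’s worth, here is a minimal sketch of the structural requirement just described: heterogeneous phases that all read from and write to one accumulating state object, so that earlier play shapes later play. Everything in it is invented for illustration; it is not how Spore is actually implemented.

```python
# Minimal sketch of a "total game" skeleton: each phase can be a wholly
# different kind of game, but all phases consume and extend a single
# accumulating state, so early choices still matter late. Hypothetical
# illustration only, not Spore's actual architecture.

from dataclasses import dataclass, field

@dataclass
class CreatureState:
    traits: dict = field(default_factory=dict)   # anatomy chosen so far
    history: list = field(default_factory=list)  # outcomes of each phase

class Phase:
    """One self-contained game (the shooter, the RTS, the strategy layer)."""
    def play(self, state: CreatureState) -> None:
        raise NotImplementedError

class CellPhase(Phase):
    def play(self, state: CreatureState) -> None:
        state.traits["flagella"] = 2  # stand-in for the player's choices
        state.history.append("cell: evolved flagella")

class TribalPhase(Phase):
    def play(self, state: CreatureState) -> None:
        # The design point: later phases derive mechanics from earlier
        # anatomy, instead of reducing it to one crude behavioral summary.
        mobility = state.traits.get("flagella", 0) + state.traits.get("legs", 0)
        state.history.append(f"tribal: mobility bonus {mobility}")

state = CreatureState()
for phase in (CellPhase(), TribalPhase()):
    phase.play(state)
print(state.history)  # earlier play visibly structures later play
```

The hard part, of course, is not the plumbing but making each phase a compelling game in its own right; the sketch only shows the accumulation that makes bundling them worthwhile.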

That’s my (and many others’) problem with Spore as a game. My seven-year-old and I both really enjoyed the first two stages of the game: in the first you control a cell and add evolutionary features to it, and in the second you control your creature on land and shape its evolution. The two phases interlock quite a bit, and you feel a strong emotional connection to the creature you design as you shape it. There are some really clever visual and design touches in the second phase in particular.

Then you move into the Tribal and Civilization stages, and here, for me, the game simply falls flat on its face even as a toy or design tool. The attributes you gave your creature in the second phase almost don’t matter at all in the Tribal phase, and really don’t matter in the Civilization phase. The only impact that earlier play makes on these phases is a special ability, and that’s determined not by what makes your creature visually or substantively unique, but by a fairly crude summary of your overall behavior in the previous phase. The editor tools for the Tribal phase aren’t particularly interesting, and they do not customize well to many of the Creature designs. You can design a completely fantastic Creature that makes use of Tribal and Civilizational elements, but you can’t really fit these elements to an underlying Creature: e.g., you may have access to a particular mask in the Tribal phase, but the mask is generic rather than something which conforms to the anatomy and physiology of the creature you’ve made. Yes, I know this is demanding rather a lot from Wright’s design, but the animations in the Creature phase are so astonishingly varied, and seem to come so organically from the anatomy of different Creatures, that the Tribal and Civilizational designs are a sharp disappointment.

So what you have here is an evolutionary, accumulative game where, past the point of sentience, everything that has come before has little to no accumulative weight. I tend to come down somewhere in a vague middle on nature vs. nurture discussions about human practices and history: our technological and social history as a species is shaped in many ways both subtle and gross by the fact that we’re primates who walk upright, have large heads, four fingers and an opposable thumb on two hands, bilateral symmetry, strong visual acuity, and so on. I really do assume that a four-legged alien that manipulated objects with tentacles and had 360-degree vision from eyeballs on stalks would make artifacts and have social systems that reflected its biology in distinctive ways, even if technology itself would tend to cancel out or override some of its prior evolutionary history. It just doesn’t feel that way in Spore, even in some clever, superficial but visually enjoyable way.

This gives Spore a sharply disjunctive feeling. You’re drawn deeply into managing the fate and design of your creature in the early game. Then suddenly most of that feels irrelevant and you’re intensely conscious of the game mechanics, which in the Tribal and Civilizational phases are shallow, simplistic and rather boring. I really wish they’d left these phases out entirely, and simply given you an editor at the end of your Creature’s evolution that let you design some technologies that you thought looked appropriate for its use.

I haven’t played the final Space phase enough yet to decide whether the game (or toy) becomes very exciting once again at that point. Certainly the more that the Space phase exposes you to the astonishing creativity of other players using the design tool, the more there is some kind of satisfaction that arises from Spore. Curiously, though, it is not satisfaction with the genius of Wright and his colleagues, but with the genius of other players. I’m sure that’s something Wright was aiming for, as he has in the past: to make Spore first and foremost an authorial platform, a tool for creative expression. Parts of Spore succeed wildly at that ambition. Parts of it, even in those terms, fail, and they fail because they don’t interlock with the rest of the expressive power of Spore-the-toy, Spore-the-tool.

Posted in Games and Gaming | 2 Comments