Jump. Jump Now! (Or Later.) (Or Climb Down Slowly.)

Elite colleges and universities chased each other to a budgetary precipice. Did peer institutions build a new building of some kind? You need one! Did they dramatically expand services to the student body? Do it too! Redesign the dining hall so that it was Chez Panisse East? Eliminate loans? Ratings and rankings drove a bit of this race (and without any need for the more overt kinds of Clemson-style trickery), but some of it also came from the knowledge that highly selective admissions are a kind of self-solving achievement: the first guarantee of an excellent educational experience for students is being surrounded by other excellent students.

Now it's going to be interesting to see whether the same institutions try to stay tightly packed when they jump off that cliff. A lot of them are already well past a soft-landing scenario unless the market has a sharp turnaround in the coming year. It doesn't really matter how big the endowment was three or four years ago, because however big it was, harvesting the returns fueled the operating budget. Even if you increase the percentage of endowment spending, you're not likely to make up the difference between the budget you had from endowment income four years ago and the income you'll have next year. (Run it with round numbers: a $1 billion endowment paying out 5 percent yields $50 million a year; if the endowment falls by a third, even a boosted 6 percent payout yields only $40 million.)

This spring, there's a range of short-term adaptations out there. Swarthmore and many other institutions have instituted a salary freeze; Brandeis (whose budgetary disaster is greater than almost anyone else's) is suspending contributions to retirement accounts. Some institutions are doing what they can to improve things on the revenue side, but there's not much to be done there except admit more students.

If things stay roughly the same next year, most institutions will have little choice but to undertake a big adjustment in some major part of their budget. I still stand by the kinds of strategies I advocated when times were flush, at least at smaller institutions: draw faculty together around generalist practices, and slowly erode overspecialized and excessively sequential curricular designs in favor of a loosely constituted core. Yes, this means eliminating a few faculty lines, but the idea is to do that very slowly.

Slow may not be an option any longer. Some temporary adaptations might be, and by that I don't mean hiring lots of adjuncts to sustain an overextended curricular design. Meaning that a lot of institutions might: a) suspend hiring in vacant lines for the next two years and b) if a department with suspended lines complains that it can't support its program of study as designed, consider a temporary change in that program of study to make it sustainable with fewer resources. Similar short-term adaptations on the salary and benefits side might be to freeze hiring in non-faculty positions, suspend sabbaticals for a year, and suspend course releases for administrative service. A lot of the other things that some will suggest aren't big enough. (I've read suggestions at a number of institutions that travel funds be cut temporarily, which at most places I suspect barely add up to one or two salaried positions, if that.)

The real crunch points will come from one of two major decisions: either cutting or amending need-blind admissions on one hand, or eliminating current positions on the other. I was really surprised to see Reed College taking the leap and committing, at least in the short term, to eliminating need-blind admissions. Of the two big jumps, that seems to me the harder one politically for most selective private colleges and universities. Need-blind is very nearly gospel at most of the institutions that practice it. There are little ways to hedge it, like admitting nothing but full-pay students off the waiting list. But Reed, to give them credit, didn't try to wrap their decision in weasel qualifiers. They did it, they're not happy, but that's what they chose to do.

In a way, that's the easier jump, though. The students you lose when you give up need-blind are only potential individuals to you, not that different in emotional terms from the applicants you think might be admits but ultimately reject. What you give up with need-blind is a larger idea about higher education, meritocracy and social justice, but in various ways, most selective institutions already have deeply contradictory ideas about how those things mesh together.

Actively eliminating positions is in emotional and political terms a completely different matter.

First, the long-running, low-level rhetorical sniping between administrators and faculty at most institutions is inevitably going to erupt into open warfare if there is a decision to cut positions. Faculty point out, justifiably I think, that the core mission of a university or college is instruction, and that everything else is support. On the other hand, the premise of a residential college full of 18-22-year-olds also centers on the idea that learning happens outside of the classroom, that the value of the education goes beyond formal learning. Faculty will point out that the growth in administrative ranks, especially at large universities, has well outpaced growth in faculty positions, particularly in tenured positions. Administrators will point out that higher education has become necessarily more complex to administer in a variety of ways, and that faculty have a host of hidden compensations to fall back upon.

Second, while everyone would like to imagine that positions might be eliminated by some rational, fair-minded selection criteria, it doesn't usually go down like that in the real world. The best-case scenario probably remains looking for retirements or vacant positions to shut down, but there's no guarantee that those will come in sufficient numbers at the right time. In fact, given the hit to retirement accounts, the pace of retirements is likely to slow just when you might wish them to come in profusion. If the problem on the administrative side is disproportionate growth in the last two decades, I suppose you might look at the most recently created positions and seek consolidations in them. That's going to be a difficult way to go if only because it makes all those positions into a constituency that will defend itself collectively. In fact, it's hard to think of a scenario involving actively occupied positions (faculty or staff) at any institution that won't pit various constituencies against each other in some pretty vicious ways.

Given all that, Reed's decision makes more and more sense. You can flip back to need-blind pretty easily as soon as you can afford it again. Reducing the size of staff and faculty is probably as long-term a project as increasing it was, especially if it's to be done well, with philosophical and institutional coherence, rather than by allowing larger power blocs to gut high-performing but exposed rivals in the academic equivalent of Thunderdome.

Posted in Academia, Swarthmore | 13 Comments

Gordon Brown and Omar Bongo

About the only thing I can say about Omar Bongo being dead is, “I hope it hurt a bit”. You can’t even say, “Thank goodness that’s over”, because his death won’t change much if anything about the way that the Gabonese state operates.

The fight against corruption in Africa is ebbing, says the New York Times. Not surprising, since much of that fight, as the Times sees it, has involved a scattering of government officials in isolated departments who were installed in office in part to comply with the vogue for "good governance" in development circles. No matter how dedicated such authorities have been, changing the culture of governmental power and state-society relations is beyond their grasp. The flow of money and power into postcolonial African states from institutional donors, international organizations, outside states seeking influence or resources and companies seeking contracts for extractive enterprise doesn't really depend on compliance with anti-corruption dictates. Any African elite dependent on that flow knows that the development industry sticks with a particular idea about conditionality for no more than a decade. There are careers to be made off of new policy formations, and past failures are perpetually recuperable as tomorrow's new and improved approaches.

There is little genuine pressure from below or above within most African societies to change the practice of governance. From below because everyday life is influenced by the state either in the weakest of ways or through dramatic if arbitrary violence. From above because few postcolonial African societies have an elite whose interests stand at a distance from the state: anything that might check or inhibit the state’s ability to extract resources from the global system directly impinges on elites themselves. Look at Gabon. It’s hard to imagine where a serious attempt to transform the social order might come from, save a thin sliver of educated elites with an interest in a more conventionally liberal kind of state, the kind of people who are easy to imprison, intimidate, compromise or exile. Bongo and his associates have had an enhanced ability to buy off any social unrest with oil revenues, but the basic distance between the everyday life of rural and urban people and the attenuated operations of a storefront sovereignty is the same in Gabon as it is across much of the continent.

But it’s also hard to take corruption-fighting ministries too seriously in Africa because as with so many things, what the West imposes on Africa it does not impose upon itself. The bailouts of the last six months are only one example. Corruption is a structural part of the modern state everywhere, not just in Africa. There are big differences in scale (in several respects) between sucking off tax revenues to pay for moat-cleaning in the UK and the Bongo family’s extravagances, but also some strong resemblances.

Everywhere the liberal idea of the state is at least in malaise, if not active crisis. Its problems are old, and so is the conversation about those problems. Is the tendency of modern political classes to become more and more self-aggrandizing a cyclical one that is interrupted and corrected by strong legal and constitutional safeguards, checked and balanced? Or have political elites since 1975, even in relatively liberal and democratic states, become more and more protected from social and political restraints? I tend to think that it’s more the latter than the former despite some notable exceptions and complications. It’s hard to believe that anybody now could have the kind of credulous faith in the nation-state as an administrative or managerial institution that was sometimes expressed earlier in the 20th Century. (Even in dystopian terms: one of the brilliant touches of Gilliam’s film Brazil was that dystopia, too, should be imagined as corrupt and inefficient rather than the perfect machine of Orwell’s envisioning.)

You get a sense of how professional and managerial elites anywhere, not just in Gabon, struggle with their relationship to the political classes by watching the punditry coming out of the UK in the last week. Look at the fatuous tone of the Economist, for example. Much tut-tutting about the horrible misjudgment of British parliamentarians, and wishing for the stables to be swept clean of all the muck. But also fretting about how it would be a bad thing to hold an election where the expense accounts were the main issue, much concern that a freakshow side-tent might become the center ring. That's not the Economist alone: you can find echoes of that double gesture in punditry left and right.

In fact, that's a realistic response in some sense, because there are no parties untainted by the scandal that are not otherwise tainted by their ideology. Besides, given how systematic the abuses of the current political class are, why should anyone suppose that fringe parties would not quickly find ways to spend public funds on their own follies?

The taint also runs deeper than elected officials and bureaucratic elites, whether we're talking about the UK, the US, Gabon or anywhere else you care to name. If the fury that people feel is curiously unlikely to be more than fuel for conversational righteousness over a pint or two, it's because most folk know that few of us are more than a few degrees separated from practices and behavior that some pitiless observer might name as corrupt, and few of us are more than a few degrees distant from some kind of largesse distributed by the state or by equally powerful civic institutions. I remember a conversation in a decaying post-industrial small city in New England a few years back. One woman I was talking with had just retired from nursing. She bitterly complained about African-American "welfare cheats" but then ten minutes later talked about how she got a doctor to falsely attest that she was looking for work so that she could claim unemployment benefits, which she felt was her right. It's always the other guy, but it's hard to think of a way to stop the other guy that doesn't create a new bridge for bureaucrat-trolls to hide under and demand some price from those who want to clip-clop across it.

But how could there be a better or more powerful basis for voting in the UK than the expense-account scandal? Those who counsel that an election should be fought over “real issues” are missing the point. None of the real issues matter if you’re voting for a political class that cares little for delivering anything meaningful on those issues. (If you promise “change”, that had better not stop with “not appointing flaming incompetents to positions of authority”.) How else can political classes be made to feel the murmur of a threat to their position if there are no consequences for systematic misrule? How can we recognize the distributed costs of corruption to our human possibilities if not by making corruption the center of public attention?

Posted in Africa, Politics | 5 Comments

DIY Discoveries

I've spent a good portion of the last week catching up on household tasks of various kinds. Whether it's something I'm increasingly comfortable with, like gardening, or something where I'm still finding my way, like working with power tools on various projects, this kind of work has become really satisfying for me.

I'll probably pick up Shop Class as Soulcraft as a result, given that it's getting good reviews. Though Francis Fukuyama's review this Sunday didn't do the book any favors, given the aggressiveness of his ideological packaging of its argument.

I’m completely persuaded that it’s a good thing in general for middle-class folks to learn as much as possible about the machines, objects and processes that surround them, and to learn through direct hands-on work with all of them. If that’s what the book argues, I’m with it. But Fukuyama argues that the shutting down of shop classes in high schools is a result of a campaign by experts to denigrate this kind of work in favor of “knowledge work” or “symbolic analysis” due to changes in the global economy. Fukuyama (and maybe Crawford, author of the book? I’ll wait and see on that one) is smarter than this: I think he knows very well that most of that argument about the future of skills and labor isn’t aimed at plumbers, carpenters, electricians and so on. It’s about the disappearance of unskilled factory labor from much of the United States.

I also think Fukuyama knows full well that European and American middle-classes, especially intellectuals, have long fretted about becoming too distant from manual labor, from practical knowledge, from direct experience. Complaints about the alienated, enfeebled culture of white-collar work go back a wee bit farther than “Dilbert”. I’m perfectly ok with the wheel turning once again: I’ve argued myself that a great liberal arts course would interweave studying the history of cars and traffic, the public policy of transportation, and a hands-on disassembly and reconstruction of an actual automobile. But let’s get a bit savvy here and ask why a sentiment that seemingly rejects middle-class “knowledge work” and white-collar labor is so recurrently popular with middle-class white-collar workers. And why most of the people who enthuse about the message don’t quit their jobs as symbolic analysts but just build stuff in their garage or tinker with farming and so on. Fukuyama is generous enough to acknowledge this point but I’m not sure he grasps it fully. Learning how to do it yourself is just another dimension of bourgeois culture, right alongside being a foodie or going to the theater. Which is fine by me: nothing shameful about middle-class life, as far as I’m concerned. Still, that takes down the rhetorical heat by about ten notches.

————–

So, miscellaneous things I've found or wondered about this week:

1) Precision is, as in all things, my bugbear. One reason I’ve always preferred cooking to baking is that you can usually fix a dish that’s gone wrong, but if you’re off by much with baking, you’re screwed. This turns out to be even more true when you’re making something like a bookshelf or hanging a screen door yourself. I missed by maybe an eighth of an inch on one of the spade-bit drill holes I had to make for the door lock installation on the screen door and was barely able to adapt the whole thing with some careful filing. If I had missed by much more, I think I would have had to just take the whole door down and call it a loss. That’s one key problem with DIY stuff in general: learning by doing can get expensive.

2) Everyone likes to complain about bad instructions. What I was struck with in the case of the screen door was that the instructions were difficult for me to understand, but they weren't bad. First, I had the classic problem of a left-handed person: all the illustrations were of right-handed tool use. Sometimes I find I try to reverse what I'm seeing in my mind and then I end up 'tricking' myself back into a double reversal. A more complicated installation like this is also difficult to write instructions for, both because the instructions have to be flexible (this door could be hinged on either side, and had variable widths to accommodate different doorframes) and because various parts have names that are non-intuitive. The instructions have to build a vocabulary for you, a temporary jargon, because they can't just say "that thing" and "that other thing".

3) I finally got a fairly cheap chipper/shredder (and it’s not a very good model: you get what you pay for, I guess) and found to my surprise that a huge pile of debris I’ve been building up for three years reduced to about three or four wheelbarrows of finely shredded mulch. That’s leaving aside the logs and sticks bigger than 2 inches, which I need to chainsaw down and then break into firewood. Makes me think again about the amount of material that must be needed to make really huge amounts of mulch. Not that there’s any shortage of woody waste in the mid-Atlantic area but still.

4) Has anyone come up with a cheap disposable mask that doesn’t fog up safety goggles? So far I’ve tried a couple of different brands, some of which promise that their foam seals prevent fogging, but I still end up close to blind after ten minutes of work when I’m wearing a mask and goggles.

5) My table saw scares the crap out of me, but it does seem to be the answer for making dado grooves on the bookshelves I’m making. I’m still not as precise as I want to be, but I’m getting there.

6) For the third time in my life, I’ve planted mint in a garden without thinking very carefully about how to control its spread. I think mint should come with a biohazard sticker on it.

7) I'd really despaired about getting my lawn into shape through my own seeding, with no chemical use. But this spring I looked around and realized that it had really grown in after three years of seeding. I guess grass is just a long-term project if you're not going to go with heavy treatments.

8) Before I started trying to do some household maintenance and repair myself, I always thought that the major cost of hiring people to do that work was labor. Same for making pine or ash bookshelves. But it's increasingly clear to me that materials are most of the cost of retail furniture unless it's very high-end crafted work. Same, I think, for a goodly amount of household work: the cost is in the equipment you need to do it and the materials it requires.

Posted in Domestic Life | 16 Comments

Ponies Can Be Allocated Directly

Greg Mills and Jeffrey Herbst argue that it’s time to resume aid to Zimbabwe to help out the unity government.

The problem here is that while almost everyone would like to help Morgan Tsvangirai and his allies "soft-land" the Zimbabwean crisis, a lot of observers are also perfectly aware that Mugabe and his closest supporters may be using Tsvangirai and the MDC for precisely this purpose, as a way to get development money flowing back into the hands of the ruling elite. This isn't the first time that the powers-that-be have neutralized potential opposition figures by bringing them into the government and giving them a taste of largesse, nor the first time that they've done just enough to try to perform fake compliance with some minimum conditionalities for aid.

I was especially struck by this paragraph from Herbst and Mills:

To consolidate progress, donors should end their ambivalence about the unity government and begin to support Mr. Tsvangirai’s aims. Development assistance can be allocated directly. Replenishing the hospitals and re-equipping schools are measurable and defined projects. More generally, Western governments and nongovernmental organizations should become more publicly enthusiastic about the unity government, especially because they haven’t been able to offer a better option.

“Development assistance can be allocated directly.” Not to be a wet blanket, but how? Unless, of course, the government (still effectively dominated by ZANU-PF and Mugabe) gives permission for development assistance to be allocated directly. Which, particularly in the case of hospitals and schools, it is unlikely to grant, since that would involve surrendering some measure of control over state institutions. This is like saying, “Freedom of the press can be practiced by distributing publications freely”. Sure! If the government which suppresses freedom of the press allows that to happen.

No outside institution has a plausible plan of action for producing better governance in North Korea, either, but I don’t see why that should produce higher levels of enthusiasm for the inevitable.

Tsvangirai and his allies are in a terrible spot. Whatever can be done to help them should be done. But if there was ever a time for ironclad conditionality, this is the time. The interests behind ZANU-PF power will not share any authority that matters unless they have no other choice.

Posted in Africa | 3 Comments

What’s Distinctive About Africanist Historiography?

Swarthmore has an elaborate system of Honors seminars. The basic premise is that participating third- and fourth-year students take four small, intensely focused double-credit seminars: three in a major subject, one in a minor subject. At the end of their time at Swarthmore, they take written and oral examinations in these subjects designed and given by experts in the field who are not faculty at Swarthmore.

I’ve always found it a very difficult challenge to design a syllabus for these seminars. I’d like to ensure that I’m not just teaching a proto-graduate seminar primarily aimed at students who will be going on to work on a Ph.D in history or anthropology: I want there to be some intellectual value in the seminar beyond knowledge of a canonical literature in my specialization. But I’ve had a hard time deciding what the appropriate framing of my specialization is to accomplish that purpose.

What I’ve settled on for the last decade and a half is a seminar focused on the history of colonial Africa, beginning roughly with the 1870s and concluding with contemporary Africa. Since I do not have a prerequisite course for the seminar, I get students with widely varying levels of prior knowledge of African history, and I have to teach the seminar with no presumptions on that score.

This is a basic pedagogical dilemma for Africanists in most institutions, with most kinds of courses. The central concept of the Honors program is that courses are being taught at an advanced, challenging level, so I don’t want to spend a lot of time just laying out a bare-bones sequential history of modern Africa. But this tends to lead to students who have some interesting, sophisticated things to say about the contradictions of indirect rule or the role of gender in colonial society but who are somewhat uncomfortable about the difference between Togo and Botswana.

It’s hard to redesign these syllabi because external examiners are often dealing with two years’ worth of students, and need to have a stable syllabus that applies to both groups equally. I’m in a “gap year” now, though, so my chance for a big overhaul has arrived. I’ve tended in the past to rely on a few big overview texts that I think have strong, interesting arguments and then to throw in a collection of books and articles that I find challenging or interesting, worth debating or discussing, with a relatively minimal organization. So, for example, two weeks on the social history of colonial Africa with a changing selection of required and extended readings.

I've been considering a classic strategy for redesign, which is either to go smaller, to the history of southern Africa, or to go bigger, to the history of the British Empire, with the hope of resolving the main focal point of the course discussions. One major axis of discussion has tended to be, "What's empirically distinctive about the history of modern Africa?"; the other has been, "How can we use African history to talk about the character, causes and consequences of modern imperialism or even of modernity in general?" The problem with the former discussion is that it is implicitly comparative in two ways: to premodern Africa and to other modern societies. The problem with the latter discussion is that it requires attention to theoretical and empirical debates about imperialism and modernity that aren't limited to African examples.

The flaw with the "going smaller" approach is, first, that it potentially buys into the area-studies parochialism of African Studies. I don't know that I want to solve the problem of comparison by abolishing comparison and taking a region of Africa as historically self-referencing. Second, there's a practical problem. I've enjoyed inviting friends and colleagues who work on other regions of Africa to be examiners. Southern Africa locks me into a much smaller group of people (many of whom I like very much), which can pose difficulties simply in terms of availability. On the other hand, however, I think I could get students to a point at the end of the semester where they were not only literate in high-level historiographical and analytical debates in this subfield, but very comfortable with concrete questions about who did what to whom at what date in which location.

The flaw with the "going bigger" approach is that it will accentuate that sense of vagueness about specifics, save for a command of the history of empire as it developed in metropolitan Britain itself. E.g., I think I could get students in an Honors seminar to a point where they were very comfortable telling me about the impact of Gordon's campaign in Sudan on British politics and on the later development of British imperialism, but at the cost of knowing little or nothing about the Mahdi or Sudanese society. Which is the classic trade-off between imperial history and area-studies approaches to the colonial history of a particular region or place.

——

I’m close to settling back where I started, with a seminar that’s focused on colonial Africa as a whole. But I’d like to sharpen up the way the course is organized and see if I can’t work harder to give students in the course a comfort level with concrete knowledge of specific places, times and events.

In preparation for this redesign, I've been asking myself: what's intellectually distinctive about African history as a field of scholarly knowledge? What questions has it posed in particularly interesting or compelling forms compared to the wider discipline of history? I really want to focus on one big theme of this kind rather than trying to throw everything but the kitchen sink into the mix, which has been more of my approach in the past. I've come up with several possibilities. I'm thinking here of ways to organize the existing body of scholarly publication around debated or contentious propositions, not arguments which reflect my own sympathies or views.

1) The historiography of Africa is methodologically and/or epistemologically distinctive. Africanists have to think through problems of archival interpretation in creative ways, have to think about the status of oral narrative in new ways, have to grapple with debates about nomothetic and idiographic knowledge in a unique way, have distinctive issues with the validity of comparative or universal history, and have to struggle with the "constructedness" of their field of knowledge in special ways.

2) The particular character of colonialism, globalizing capitalism or modern institutions in African history raises a distinctive range of questions for historians and anthropologists which has some comparative significance for understanding colonialism, globalizing capitalism or modernity in general.

3) The marginal or failed position of many African societies within contemporary global systems is a special challenge for many comparative or universal frameworks and requires historical investigation into the roots or causes of this marginality, and thus into possible remedies for these problems. (Or illuminates the extent to which all modernity is an incipient failure or in a state of unresolvable crisis, in some more pessimistic or critical frameworks.)

4) African societies (or some subset of African societies) have some distinctive material, cultural or philosophical character over their longue durée; studying the colonial era is just a way to focus an exploration of the particular character of African societies as they experienced new pressures from external forces and institutions.

Posted in Academia, Africa, Swarthmore | 6 Comments

The Pursuit of Happiness

This is a good summary of the current state of work on the experience of happiness, and its implications for homo economicus and public policy based on assumptions that people are rational utility-maximizers. Makes an interesting companion to this article on the Harvard Study of Adult Development. (via 11d).

What this work shows is that we're lousy at predicting the consequences of possible future events for our state of mind. Our reported experience of happiness tends to return to a middling point after events that we expect to make us permanently happier or sadder (such as achieving major career goals or suffering a huge disability or tragedy). I agree this is a significant rebuke to the idea that rational actors will generally select actions which increase their happiness: if happiness in the long term returns to a default setting regardless of the consequences of our actions, then a study which concludes that one choice objectively has more utility than another is missing the point.

I agree that some of this research has interesting policy implications. If it turns out that chronic pain or constant environmental irritants are more inimical to happiness than permanent disabilities or losses, it's possible to imagine a plausible adjustment to law and policy which could accommodate that finding.

I’m not sure the people debating the policy implications of happiness research are entirely getting the point, however. What this work also implies is that our younger selves are poor custodians of the interests of our older selves. Among other things, if you took this really seriously, it would mean that the entire idea of contract is profoundly flawed. How could I possibly make a binding commitment now on behalf of my older self, given that I have no ability to predict what will make my older self happy or unhappy? On some level, we already recognize this to be true. That’s why we have divorce as well as marriage, and allow contracts to be affected by the changed circumstances of the parties involved. This is the stuff of middle-age crisis: that we didn’t know what we were getting into, that we didn’t understand what life would be like. But if you were going to formalize the most extreme implications of this research, you’d need to see the self both as exceptionally discontinuous (that my younger self knew nothing of my older self’s needs, and therefore should have no determinate role in my older self’s condition) and as exceptionally continuous (that it doesn’t matter that much what my younger self chooses or what harm is done to me by others, because my happiness will return to a default state anyway).

Even worse, it's very possible that knowing what makes people happy doesn't help us to produce more happiness, either as individuals or as a society. The Atlantic article points out that the person who understands the findings of the Harvard study best hasn't been able to apply those findings to his own life, except perhaps that he understands his own failings better than most people do. So much of what we do in public policy is aimed at the production of more happiness, more satisfaction. If it turns out that knowing why we are happy (or not) doesn't affect whether we are happy or not, a lot of dominoes may fall. Or maybe part of being happy is believing that we can be good custodians of our personal and social happiness?

Posted in Domestic Life, Miscellany | 3 Comments

Never Happened

Oh, well, if the truth is going to get our soldiers into trouble, by all means, let’s airbrush it out of existence. Why stop with photographs? Let’s shred all remaining records from the last eight years. We clearly need a national security-oriented amendment to the Constitution as well so that the government can use prior restraint to suppress any reporting that the Pentagon thinks could cause trouble for whatever war we’re in at the moment. Sure, people in countries we’re occupying might know a few things about how we’ve carried out the occupation, but who are they going to tell? Al-Qaida? We all know that terrorists don’t believe anything they read unless it appears in the U.S. mass media.

Posted in Politics | 5 Comments

Figgleton v. Ditchens

So recently there was a good bit of blogging reaction to the public disagreement between four of the most tendentious intellectuals on planet Earth: Stanley Fish, Terry Eagleton, Richard Dawkins and Christopher Hitchens. All four of them are prone to making and then furiously humping straw men while avoiding introspection about their own previous work and thinking. In this particular case, the issue was the muscular public atheism of Dawkins and Hitchens, which Eagleton and now Fish have criticized as ignoring the genuine virtues of religious thought.

I completely agree with the criticism that Eagleton and Fish aren't talking about religious practice or religious institutions which actually exist in the world, but instead about a secular person's ideal spirituality, primarily concerned with the limits to knowledge, the importance of mystery, the meaning and phenomenology of human life, and so on, and that both of them rig the game so that there can be no legitimate challenges to religion. Many actually-existing religions have very strong truth claims that are expansive in scope rather than the kinds of tentative, humble embrace of the unknowability of human existence that Eagleton and Fish see as the essence of religion. Dawkins and Hitchens, on the other hand, irritate me not just because they lack even the slightest trace of introspection about their own past errors and exaggerations, but because on the subject of religion and atheism, they have such truncated tunnel-vision arguments.

A pox on the whole discussion as these four construct it. This isn’t exactly a new debate. Finding the shrillest or most tendentious formulation of long-standing arguments on these issues is not much of an accomplishment.

As with many similarly well-worn discussions, I’d just as soon review the available lines of argument about why secular or atheistic thinkers perhaps should have an interest in religion or spirituality which goes beyond being resolutely hostile, which takes religion to be an interesting subject to investigate with an open mind (rather than just finding new ways to arrive at familiar criticisms). Any of these lines of argument has its own shortcomings, and none of them seem to me to prevent strong criticisms of some or all religions, but all of them seem to me to provide some intellectual texture and complexity lacking from recent “muscular atheism” of the Dawkins-Hitchens type. It’s not that they don’t consider some of these lines of argument, but that they simply see them as speedbumps on the road to the crusade.

Here’s what I come up with when I make a list.

1. Religion is adaptive, instinctive, or inevitable (in human consciousness or in social experience), and therefore arguing against it is largely beside the point. I know that Dawkins has entertained versions of this argument, as have other evolutionary psychologists who have a critical perspective on religion. There’s a familiar dodge in this kind of argument about the evolutionary roots of a contemporary behavior of which the arguer disapproves: that the behavior was once adaptive and is now maladaptive. But this claim is often asserted rather than studied or demonstrated, usually with striking disregard for what “adaptive” means in evolutionary biology, as well as weak arguments about why the new norms are preferable. In the context of contemporary global society, in what respect is strong religious faith maladaptive? The most secular populations in the contemporary world have the lowest birth rates. Where’s the evidence that the reproductive success of religious populations is threatened by their religious belief or practices? These uses of evolutionary argument have never really escaped the intellectual failings of social Darwinism, in that they’re used to make moral or social claims about what human beings should be instead of what they are while ignoring actual evolutionary science. In any event, this kind of argument should really be a much bigger impediment for Dawkins-style atheism than it appears to be.

2. Religion is sociohistorically embedded. You could argue that regardless of one’s personal opinions of religious belief or practice, that religions and spirituality are as deeply embedded in human social organization as state sovereignty, law, kinship structures and so on. You might be able to make a philosophical argument against a specific religion or religion in general, but it would be irresponsible to allow that opposition to blind you to your intellectual responsibility to explore the complex history of religious practice and sentiment or to unrealistically assume that this history can be simply dispensed with because of the cogency of a philosophical argument. I suppose you could go from this line of argument to suggest that a passionately anti-religious person needs to understand that their political project is a profoundly revolutionary one, no different in scope than an anarchist who wants to eliminate the nation-state. And as with any revolutionary project, the scope raises a moral problem about the costs of pursuing it and a practical problem about the plausibility of pursuing it.

3. Religion is functional. This approach is where a decent number of secular intellectuals who have studied religion tend to alight, conceding that whatever the philosophical problems of religion, it serves some kind of useful long-term or short-term functions for its adherents and as such, makes some kind of sense. This argument has all the problems that functionalism has when applied to any practice, but it's still a pretty serious challenge to the strongly anti-religious, in part because the range of possible functions is so broad: psychological comfort, social networking or mobilization, a territorially expansive form of political connection that doesn't rely on kinship, enforcement of moral norms, you name it. The anti-religious might argue that these functions can be better served by other institutions or belief systems, but it's up to them to demonstrate that. Or they can argue that these functions are themselves bad, but that's a much harder thing to do in many cases than knocking some specific bit of theology from a given religion.

4. Local religious practices and experiences and large-scale religious institutions are different. E.g., this is the conventional “I’m not against religion, just against organized religion” argument, an observation that an anti-religious critic who reasons about all religion from the actions or beliefs of a large-scale formal religious institution is missing an important distinction. This is the reverse of what the commenters at Crooked Timber noted about Eagleton and Fish, which is that they construct an idealized philosophical account of spirituality that ignores the concrete institutional reality of religion.

5. The private or local habitus of religious life is different from the ideological life of religion. Similar to #4, an observation that the experience of spirituality may have little or nothing to do with formal ideologies or philosophies put forth by religious organizations, and that a critical view of the latter should not be projected easily onto the former.

6. Religious ideology is a superficial gloss on top of bad social action; the bad action is not caused by religious ideology. So, for example, if an anti-religious critic were to ascribe the cause of the Crusades to the existence of religious faith or religious organizations, they might arguably be missing deeper or more powerful underlying social, economic and political causes of the Crusades. This is a fairly familiar kind of debate between historians whether we're talking about religion or not, about whether or when cultural, intellectual or social conflicts visible at the "surface" of events are actually causes of those events or not. I think at the least you could suggest that long lists of bad events attributed to religious faith or organizations are intellectually lazy, that almost any given event is a lot messier when you poke into it. For example, just saying that the Catholic Church suppressed Galileo's findings and ergo, that religion suppressed scientific truth and human progress is pretty much greasy kid's stuff as far as understanding that specific history, which also involved Italian court politics, the economic and social transformation of Renaissance Italy, debates within Western European Catholicism about many subjects, and a good deal else.

7. Religious thought and experience is a subclass of philosophical exploration of questions about the meaning of human life. This is where Fish and Eagleton are coming from, and while they make the argument in manipulative fashion, there’s certainly a more interesting version of it available which acknowledges that the norm of religious life may not involve philosophical exploration but that religion is at least one example of a broader class of such explorations, and that the broader class involves something valuable and important that cannot be provided by most scientific thought.

More? I’m fairly unsympathetic to some of these lines of argument, but I at least know that a lot of ink has flowed under all of these bridges.

Posted in Politics | 15 Comments

It’s a Trap

So. This Star Trek film? It’s pretty goddamn excellent. It’s sort of like the second time that a good play gets performed and the new casting is better than the old casting, the new staging is better than the old staging, the dumb lines and unworkable scenes have been edited out, the dramatic narrative shifted in good ways. It’s still recognizable as the old play, but way better in many respects.

If you haven’t seen it yet, but you’re planning to see it based on the trailer, I’m guessing you already recognize the excellence of casting Simon Pegg as Scotty. You probably are already impressed at the look of the film. Well, your presentiments are spot-on, only it’s even better. The cast nails the essence of the characters without ever sliding into a Rich-Little-style impersonation, except maybe Karl Urban as Dr. McCoy, but his McCoy is so right that it’s never an issue. The Kirk-Spock relationship is freshened up (K/S fans have some new wrinkles to work with: more in a minute) but even better, all the supporting characters are given great defining bits. Chekov in this version, for example, is so much better as a character than he ever was in The Old Show.

When the credits roll at the end, I think just about everyone will be clamoring for a sequel.

———-

So what’s the problem? Well, there isn’t one, if we’re just talking about it as an entertaining film or even as an attractive, sustainable version of Star Trek. No bashing the film for being fun and watchable here.

No, what I’m wondering is whether geeking out about the plot is part of the fun or not. Spoilers follow from here on.

J.J. Abrams makes stuff that you're meant to geek out about. He does geek service and feeds off of geek service. So in that sense, you'd think the welcome mat would be out for thinking about the plot, the setting and all that. But in this case, I'm worried not so much that the plot holes are large as that thinking too much about them might actually deflate some of the pleasure of the film.

There are the traditional Trek gripes to have about the film, maybe preserved as homage as much as anything else. (The film has tons and tons of great little Easter eggs: McCoy calls out for "Nurse Chapel" at one moment, there's a tribble in a cage on Scotty's desk, and so on.)

Trek has never ever had a “total mythos” that makes any sense. Starfleet makes no sense as an organization, the Federation makes no sense as a culture, the future that Trek shows us is plainly a cardboard cutout for the fun characters and schticks to perform in front of. This film keeps with that tradition. I could just barely find a way to rationalize a Romulan mining ship from 130 years in the future being armed to the teeth with photon torpedoes that can wipe out Klingon and Federation ships with ease, and for that ship to be 50 or 75 times bigger than ships in the past. It’s one thing to be a Somali pirate overhauling a supertanker, but if a supertanker with 100 well-armed crewmen and deck guns was coming to crash into an East African port and a little pirate speedboat was given the mission to stop it, that would be another matter, and maybe comparable to the situation that the Enterprise is in. But on the other hand, it’s pretty hard to come up with an explanation for why any single Starfleet captain has all the passwords to permanently turn off every single defense that the Terran solar system possesses and why (as usual) there isn’t a constant hum of interstellar traffic (military and otherwise) around both Earth and Vulcan. A lot of Trek films treat Earth and Vulcan more like they’re the most isolated, underdeveloped and undefended locales in the entire galaxy.

Like I said, this kind of plotting is a Trek tradition. With the exception of DS9, no Trek show has ever paused long enough to work up any consistent representation of its overall setting or tried to make a single world or culture really make sense. It's not the point of Trek, something which the original writers' bible made clear by expressly declaring that the show would not ever return to Earth as a setting and that questions about the cultural, religious or social nature of the Federation would not be welcome. Still, at least Abrams saved some Trek traditions for the next film (or two), such as the fact that every admiral in Starfleet is secretly a power-mad authoritarian, budding lunatic or screaming incompetent.

Geeking out about all that is just the usual thing. The real issue is the choice to “sideboot” the franchise with a complicated time travel story. Time travel is NOT one of Trek’s better traditions, though it’s occasionally spawned a good episode. Russell Arben Fox has a nice entry about the way the film plays with time, causality and history in the context of Star Trek. I was thinking about some of the same issues as I watched, and in many ways, this approach was just as troubled as I suspected it might be.

One of the few clumsy bits of exposition in the film involves the central storytelling conceit: Spock pauses and comes pretty close to breaking the fourth wall to deliver the word about what to expect from Abrams’ version from here on out: that nothing is predictable about what will follow, that anything could happen, that the characters have been cut away from their prior histories.

If you look at the way the film handles the reboot with a deeply geeky eye, that declaration doesn’t really follow. The only person whose history is changed by Nero’s original timejump is Kirk. Instead of being a conventionally ambitious scion of a military family, he’s a rebel without a cause. This leads him to a bar on a night when Starfleet cadets are in town (it’s not entirely clear why they’re in Iowa: to visit a ship construction site, maybe?) which leads to a barfight which leads to him meeting Captain Christopher Pike which leads to Kirk meeting Leonard McCoy aboard a recruitment ship and enlisting in Starfleet Academy.

I’m heading into deep geekery now, so follow only if you dare. There are a lot of small but important changes which now follow on this shift. McCoy and Kirk are fast friends from the very beginnings of their career and serve together from the first moment onward, which was not true before. Presumably because of this, Kirk does not become close friends with Gary Mitchell. There’s no mention of two of his known girlfriends at this point in his life (Ruth and Carol Marcus), presumably because Rebel Kirk is even more of a devil-may-care horndog than Ambitious Kirk, and also because Gary Mitchell isn’t pimping for him.

Sulu, Chekov and Scotty still have character histories that could well be consistent with what they were before, given that we knew nothing official about them. But Spock and Uhura, on the other hand? There’s never been even the slightest hint in the old continuity that they were romantically involved. This is a nice touch, but it means that Spock’s personal history is also different at a moment in time when there’s no reason for it to be. (As is Uhura’s.) So even the rules of the premise, so loudly and mechanically declared three or four times during the film, are being broken.

By film's end, everybody's history is different. They've all come together at an earlier date in their lives than they did before. Kirk is captain of the Enterprise even earlier than he had been. Christopher Pike is not horribly disfigured and, as far as we know, has never visited the planet Talos IV. Spock and other Enterprise stalwarts don't serve with Pike. We may have glimpsed Number One briefly, but none of the cast members appear to serve with her. Gary Mitchell is not part of Kirk's circle or one of his officers. Kirk takes command as an unrepentant rebel and hotshot with no real service record prior to becoming captain. As far as we know, the experiences that he is known to have had prior to being captain in the old continuity have not happened: he hasn't seen a massacre of colonists by Kodos the Executioner or encountered a vampiric space cloud which kills his captain. Spock now has a completely different backstory: his mother is dead, his planet gone. Scotty's been given technological knowledge that comes from his future (actually, have we ever seen that kind of transporting in any version of the show?). When the Doomsday Machine shows up, does Kirk even know Decker? Presumably he's not been the butt of Finnegan's jokes: it's hard to imagine Rebel Barfight Kirk putting up with that kind of crap.

You get the idea. This sideboot of Trek is a bit like DC's Crisis on Infinite Earths series, which promised a simplifying reboot of DC's superhero comics and ended up a bleeding narrative wound, a storytelling Rube Goldberg machine. Precisely because it insists that these are the characters you already know, in the universe that they were originally situated within, but now changed by a single intervention in their timeline, it means that every story that happened to them before is now an open question. If Pike doesn't go to Talos IV, who will? If Kirk and company don't go to the edge of the galaxy and have a crew member endowed with incredible psychic powers, will anyone? Who will discover the Guardian of Forever now? Hard to see this Kirk as having the gravitas to fall in love with Edith Keeler if it's this crew that finds the Guardian. Presumably there is no all-Vulcan starship to go inside a giant space amoeba and die. There's no Vulcan to go to for pon farr, in fact, which is doubtless going to lead to horny, violent Vulcans wandering around the galaxy in confusion like a bunch of salmon confronted with new dam construction.

Then there's the biggest headache of all: Old Spock. I was kind of stunned that they didn't find a peremptory way to get rid of him at the end of the film: send him back to his future, drop him ambiguously into a black hole, or just outright kill him. At the least, send him off to exile. But no, he's right in plain sight, hanging out with the surviving Vulcans, without even the fig leaf of a secret identity. (Kind of hard to do when you're hanging out with a telepath who is your father.) So on Nova Vulcan, there's a genius who knows how to make black holes, the Genesis wave, transwarp drives, the USS Defiant, and so on. He knows the locations and useful secrets of many planets and cultures which the Federation has yet to explore. He knows about the Borg. He knows about the Dominion. He's trying to save the heritage and culture of the Vulcan people, so he can't afford to be sanguine about galactic-level threats that he's aware of. Besides, once the word goes out (and can it possibly be kept a secret?) that there is one person who is the key to total power somewhere on Nova Vulcan, every galactic nutcase and conqueror will be hunting for him.

And so on. With a clean reboot, none of those questions come up, but the good storytelling directions of this movie as an origin story would still be in force. So why do it? Well, there's the obvious reason of trying to keep the old continuity viable as an intellectual property (the same reason DC Comics didn't just start every single comic over at #1 with Crisis). But I almost wonder, given that Abrams likes to see geeks laboring in the narrative saltmines, if the purpose wasn't precisely to give Trek fans all those unbelievably nerdy questions to fret and debate about, to launch a thousand convention panels and fanfics. I'm just not sure that this is much fun, compared to trying to learn Klingon or trying to figure out the backstory to green Orion women.

Posted in Popular Culture | 27 Comments

The Laptop in the Classroom

Our Lady of Scathing Online Schoolmarmery forgive me, but I don't think I will be banning laptops in my classrooms in the near future.

The case against classroom laptops is that they encourage students to divert their attention from class, either to other tasks like email or to total goof-off activities like watching videos or porn. This is viewed as a problem not just for the distracted student but for any students able to see the offending laptop use.

For the most part, I’ve benefited from laptop users in discussions and lectures. Students who have superb search skills have introduced useful material or questions into discussion. In a few cases, I’ve had students find pertinent archival video in response to the drift of the conversation which I’ve then put up on the classroom projector.

I am sure there are students in my classes who have multitasked during a lecture or discussion. I'll be honest with you: I've done the same on my laptop when I've been in the audience during conferences or lectures, usually checking email. I've done that in response to being bored, but I've also done it as a kind of thoughtful doodling while feeling quite engaged and interested in what the speaker is saying and taking copious notes. So it doesn't worry or offend me that a student might be doing the same. If it's because they're bored, that's an issue with my presentation. (Though I'm not going to take responsibility for getting universal engagement: you can't get blood from a stone, and some students are stones.) If the audience is still being thoughtful, taking good notes, and retaining information while multitasking, why should I care?

If a student using a laptop is not paying attention at all, that’s a problem. I think the people who blame the technology may be forgetting that this is an old part of the art of being a student. Equipped with nothing but pen and paper, students have doodled, snuck in magazines, drowsed, written letters, daydreamed behind sunglasses and spent time surveilling other students in preference to watching the professor. The most outrageous example of obvious disengagement that I’ve ever seen in my own classes came last year in a room with about a quarter of the students using laptops. It was a student who brought crossword puzzles to class discussion and dutifully completed them with a bored look on her face.

I didn't make a fuss about that behavior, so I'm unlikely to make a fuss about laptops, either. I'm not a student's mommy and I'm not a student's nanny. If they want to waste four expensive years, I'm not going to shake a reproving finger at them or humiliate them impersonally in the style of The Paper Chase's Professor Kingsfield. (I completely approve of those professors who want to do that, mind you. It's just not my style.)

About the only thing that strikes me as distinctive about laptops is that a student viewing movies or images would be a unique annoyance to other students around them. If I thought that was happening a good deal, I’d be more inclined to consider a ban, or to take action against the offending student. (Swarthmore students and alums reading this post: am I right in thinking this is fairly uncommon behavior? Or have you been in my classes or other classes here fuming in annoyance over some guy watching YouTube and wishing the professor would do something about it?)

I know that my institution’s classrooms are not at all typical of the wider world of academia. Distracting laptops in lectures delivered to three or four hundred students in large universities or a night course at a community college where some students are trying to get professional retraining after working a full day are a different matter than laptops in a twenty-person discussion course at an elite college. I suspect in some institutions that the misuse of laptops is more common on a per capita basis.

At least some of the time, however, I worry that anti-laptop sentiment at other institutions is a red herring meant to distract from the real culprit: a pedagogy built around the droning delivery of static lectures (or PowerPoint slides) to huge audiences of understandably disengaged students. You could ban every conceivable distraction and order students strapped into their seats but that alone is not going to compel engagement or learning if the professor doesn’t take on the burden of keeping students engaged. The devil laptop is sometimes like the demon rum: an alibi for sins commenced long before the hated object made its appearance.

Posted in Academia, Information Technology and Information Literacy | 35 Comments