Mikey Doesn’t Like It. Neither Do I.

A basic point: if the Transportation Security Administration is permitted by the executive branch to “mythbust” on its web pages, it should be instructed to respond fully and accurately to public demonstrations that the putatively busted myths are in fact not busted at all.

Companies can get away with peddling public-relations crap on their front pages, though I think they too can and increasingly do pay a price for that kind of behavior. A government agency shouldn’t be allowed to.

The TSA’s “mythbusting” on children and the no-fly list has two ugly weasel-wording escape hatches built into it. Any problems, it implies, are the fault of the airlines, so blame them. And the no-fly list isn’t the same as the watch list, which they know full well.

There have been numerous reported episodes over the last seven years of children and babies being patted down, questioned, or otherwise subjected to strenuous screening because they have the same name as someone on the watch list. The case reported by the Times is just the latest.

I’m in the camp of people who think most of what the TSA does is meaningless “security theater” that can and has slowly eroded our autonomy as human beings. Leave that larger debate aside for a moment. A different kind of point arises in this case: if a government agency is going to engage the public through its web site, and questions about the accuracy or truthfulness of what it says to the public arise, it should never be permissible to refuse to respond directly to questions. If the New York Times is running a front-page story about a specific case that demonstrates that what the agency says is a weasel-worded evasion at best, there should be a standing directive that comes straight from the office of the Presidency that an agency is required to respond in specific terms to the specific question.

Transparency is not just about giving up documents or materials with glum reluctance when forced to do so, or piling vague weasel words on top of existing weasel words. It’s an active ethos that should be as high a priority at all times as the specific business of any agency. That’s especially the case when it’s an agency which has been granted increasingly strong capacities to preemptively suspect and intrusively interfere with its own citizens and the world in general.

Posted in Politics | Comments Off on Mikey Doesn’t Like It. Neither Do I.

The Gathering Twilight, Part the Second

Gary Jones at Muck and Mystery takes on a piece about Jaron Lanier’s new book that caught my eye as well.

My negative reaction to Lanier’s views wasn’t quite as strong as Gary’s was, but I had some similar feelings. I think what Lanier says about the economics of cultural production is basically true, in some respect. As Gary notes, cultural monopolies of various kinds were profitable for much of the 19th and 20th centuries, at least for the folks who were at the top of various productive hierarchies. The entrepreneurial revolution which began with post-Gutenberg literature moved steadily through other cultural media and forms, and a lot of people earned both spectacular wealth and ordinary livelihoods as that wave spread.

Lanier’s quite right that this economy is seriously threatened now. It relied upon scarcity and it relied upon a closed shop in order to produce for those at the top of those systems of production and those further down the pyramid. Modern cultural production has always had its doubles, however, who did not realize value primarily through the sale of cultural products and who were seriously impeded by the way that copyright and control of dissemination interfered with their own work. Academia is one of those doubles: the circulation of academic knowledge was and remains impeded by many profit-seeking publishers, when all that most academics ever wanted to do was accrue reputation capital through dissemination.

Still, I also think there’s something to Lanier’s point that there are aesthetic as well as livelihood costs to an open-source paradigm for producing culture. I tend to think a bit about digital games that have significant systems dedicated to user-produced content. You come across maps or skins or quests that are both brilliant and something that the main designers would never have thought to make. And you come across a huge number of totally shit disasters that make you sorry you ever heard of video games. That’s both because creativity isn’t evenly distributed and because not very many people have both the time and the resources to do something that’s basically a big gift to the world, which will produce no value for themselves. Even in the best cases, however, user-created content has a diffuse sort of feel to it, without the coherence and tightness that a work which has the full force of a controlling author backed by a publishing institution can sometimes create.

It’s a different kind of culture, and that’s not always a good thing. Lanier may also have a point about the economic circumstances that will confront culture-makers in the mid-21st Century. Gary is hopeful that we’ll soon know who the real creators among us are, that we’ll be freed from the kind of middlemen who don’t greenlight the right movies, pass on the best manuscripts, sneer at the art which inspires the world, judgments which are only corrected through the serendipitous interventions of other middlemen at some later date. But I do wonder if the real creators are going to be rewarded commensurately with their talent, should an open-source culture be better at finding and recognizing them. One thing you could say for the high-water mark of the old culture industry from about 1920 to 1990 or so is that we often held our best authors and poets and musicians and filmmakers in high esteem and paid them well beyond a living wage.

If the economics of cultural production are changing, however, I think that both Lanier and folks like Lawrence Lessig on the other side of the issue are sometimes trapped in a debate about copyright and digitization that misses some important fundamentals that have nothing to do with any of that.

There are two important factors at play in the consumption of culture that don’t tend to enter into the sound and fury around intellectual property. The first is time, specifically leisure time. If the 1950s-1990s were a high-water mark for the commodification of culture in the United States, it’s partly because they were also a high-water mark for the sequestration of leisure time from labor time. For the last three decades, working Americans have seen that leisure time slowly clawed back for the sake of work or for the sake of a productivist temperament even outside of work, towards a belief that the things we do should somehow always be generating value, towards a classically bourgeois construction of virtuous leisure. This is what a lot of folks writing about childhood have been commenting on lately, that middle-class children have been increasingly yoked to the proposition that what they do when they are not in school should still somehow be productive of skills and talents which will have value later in life.

The consumption of popular culture can sometimes get a piece of that action, but the less leisure time we have, the less we can watch or play or read.

When the vector of decreasing leisure time crosses the other major vector of the past fifty years heading in the other direction, namely the increasingly affordable availability of vast catalogs of cultural works, you’re heading for a kind of trouble which has nothing to do with intellectual property rights. We can buy or otherwise obtain the rights to view or read or listen to almost all commercial television programming, almost all films, virtually all video games, and so on now. Old comic books that I once would have had to prowl to find are reprinted as trade paperbacks. I can search online catalogs and find used books that would have taken me a lifetime to track down. And what drove the boom time sales in many media forms in the 1990s was precisely this availability and affordability: consumers took the opportunity to build libraries of material that previously would have been possible only for a wealthy and eccentric collector.

The thing is, once you’ve got it, you don’t want much more. The production of new material in any medium, whether it’s by old-style publishers or new media creators, can’t possibly keep pace, can’t possibly provision us with novel experiences which demand that we continue to buy or rent or consume culture at the same pace. There aren’t enough new people coming into the system who want to build libraries themselves, and in any event, some of them inherit the libraries of their parents or siblings or friends. Most of us don’t want a new device or technology for viewing or reading or consuming culture, either. What we have is good enough, and many people have gone through at least one or two cycles of replacing already acquired works, enough to know that they don’t want to do it again and don’t have the resources to do it again anyway.

More importantly, the affordances of time which drove the desire for more cultural product are steadily vanishing in the economy of the moment. Those who have jobs are more and more compelled to give as much as they can to them, those who have families feel more and more obliged to overparent their children. Productivism again reigns as a supreme bourgeois virtue. Time spent just listening or reading or viewing, if you can’t recuperate it as time getting educated or improved in some tangible way, is shameful time, not a shared triumph of the middle-class milieu. Those who don’t have jobs or whose livelihood hangs by a thread are hardly in a mood or a situation to snap up those bargain-priced DVDs.

So that’s what makes the situation of cultural producers darker than it once was. It’s a pretty fundamental thing: too much product, not enough buyers. I love a world where lots of poets and singers and journalists and video game designers make a solid living. And yes, Lanier’s right that the world of the future will be less like that than the world of the past, which I think is a sad thing. If that world is constricting, though, best to stop blaming it on kids downloading or the online mob. Look instead to making the circumstances of economic and social life more favorable once again to time spent on poetry and song and news, and don’t expect the good old days of rampaging purchases of back catalogs to come again any more than you’d expect a played-out mine to suddenly magically replenish its ore.

Posted in Games and Gaming, Popular Culture | Comments Off on The Gathering Twilight, Part the Second

Reading the Not-Yet

I really like John Holbo’s point about teaching Descartes’ actual writings as an introduction to “modern philosophy” in this Crooked Timber post.

There’s a general pedagogical point here about intellectual history. When we teach canonical texts that are commonly held to be the origin or starting point of a new intellectual, political, scientific or moral tradition, we frequently confuse undergraduates, because texts which originate novelty are not aware of all the practices and claims that will be built upon them. It’s as if I told my students, “We’re going to study this house with all its walls and windows and furniture and paint and I want you to think about all that and see it while we study the hole being dug for its foundation.”

The CT thread has an interesting discussion about whether you really want to read a text like Descartes’ Meditations in its own historical context (which some suggest is a move that favors history over philosophy), but everyone agrees that there’s a pedagogical misfire of some kind involved in trying to read a text as the origin point of a standardized school of thought that had yet to exist. Part of the problem with just switching and looking for a later text which performs that standardization is that those tend to be much duller works for discussion and interpretation. If all you’re looking for is a standardized description of a philosophical or theoretical approach, better to just assign a Wikipedia or Stanford Encyclopedia entry if they’re halfway decent.

But then that brings back the problem of how to read and teach the interesting texts which are interesting in part because they’re not-yet what later authors will interpret them to be, while anticipating and teaching about what later authors will invent around and out of those originary works. What can be especially unsettling about this for undergraduates is discovering that what later authors will see as founded in an older text is often not really there in any strong or distinctive form, that the later authors are responding as much to a lineage of successive interpretations as they are to the foundation.

Posted in Academia, Production of History | 3 Comments

The Gathering Twilight, Part the First

In a review of Elena Gorokhova’s memoir of childhood in the Soviet Union, there’s a quote of her youthful realization about Communism:

“The rules are simple…They lie to us, they know we know they’re lying, but they keep lying anyway, and we keep pretending to believe them.”

Here in the gathering twilight of 21st Century America, the situation is hardly much different, with one exception: we seem to want the lies, we compete to outdo the power elite with our own tall tales, we luxuriate in the drowning filth of our fabulistic excesses.

Frank Rich made the case a few weeks ago that Tiger Woods was the emblematic man of our moment not because of his sexual escapades but because of the total disconnect between the popularity of the iconography of his squeaky-clean professionalism and his actual life.

Rich suggests that there is a growing consensus on both left and right that virtually no public figure’s iconography is trustworthy, and that basing your political and social choices on those narratives is a fool’s errand.

That, I have to admit, seems true enough. I can remember thinking that John Edwards seemed like a decent enough candidate in his first run for the Presidency, but I can’t even begin to remember why I thought that: some vague impression of his electability, a few catch-phrases here and there that mimicked positions I could charitably imagine having a resemblance to what I’d like to see happen, and yes, some sense that he seemed like a capable, decent leader. In retrospect, obvious bullshit, all of it. I can remember telling a few friends in 2000 that Bush seemed to have some interest in governing towards the middle, because of a few little rhetorical flourishes, and I thought that again when he gestured in that direction right after 9/11. Again, bullshit.

I gave money to Joe Sestak in his race for Congress here in my district, and while virtually anybody was an improvement over the previous incumbent, Sestak’s actual voting record wasn’t anything like what I’d heard him talking about doing when I went to a fund-raiser and his lack of interest in setting up a real constituent-relations operation was palpable. We were just a way station on the road to something else, just a little resume-builder.

I didn’t buy the same bill of goods in voting for Obama, so I’m not as intensely disappointed by him as those that did. That said, he doesn’t even seem to be governing up to my more modest expectations, settling for “not aggressively bad like the last guys”.

If I’m setting out to buy a dishwasher or a video game, I feel pretty good that crowdsourcing is going to help me find a decent product, that the flow of information online will give me a peek at the actual experiences of users. I feel like I’m pretty experienced at spotting obvious shills, in part because they typically describe products or services in phony language or improbably complimentary terms. I get burned now and again, but not very often.

Politicians and public life, not so much, none of it, because almost all of us are engaged in one way or another in adorning the lies and tall tales of the political elite, in pushing a line or selling a product.

Just about every blogger I read and respect, and I include myself, has a politics that is an a la carte assemblage of positions and favored projects strung together loosely by attitude and affect. Most of the people I like are too smart and wary to be active, aggressive shills for any particular candidate, but there’s still a lot of qualified nods for some leaders and lip-curling disdain for others, based largely on whether they’re telling the lies that we like or the lies that we hate, whether they match up at some moment with some random item on our personal checklists of things-we-like. As Rich suggests, even people that like to imagine themselves as tough-minded independents and skeptics tend to invest in politics the way that audiences invest in the narrative of a contestant on Top Chef or The Amazing Race.

And then beyond that conversation is a vast domain of other readers and writers busy spinning and confabulating in a far less guarded way, a heaving ocean of shillery.

We lie to us, we know we’re lying, we know we know we’re lying, but we keep on lying anyway, and we keep on pretending to believe ourselves.

And yet there are also these moments where real understanding seems possible, where online discourse breaks through to expose our mutual authenticities, where YouTube shows us moments of genuine political life, where a real person is suddenly there speaking about hard choices. Times when the bedrock on which beliefs and politics really rest is exposed. I don’t think all my political desires, all my personal checklist, is just a collection of advertising slogans, and I don’t even think that’s true of many of the people I most disdain. Some of what I believe is a product of my self-interest, as it is for all of us, and some of it is a product of what I honestly know about the world and about what makes for best practices at this moment in human history.

For our own velvet revolution, for at least a possibility of moving the ball forward past this stagnant, curdled moment in American life, I think what we’ll all have to do is take the risk of authenticity, to develop a grown-up taste for the rough edges and honest imperfections of lives as they are lived. In our politicians, in our public figures, in ourselves. To stop carrying water for liars or telling simplified fabulisms because we think that will serve some end that we deem necessary. To drop our deflector shields. Living and speaking within a world of acknowledged ambiguity, uncertainty, and imperfection is an end in and of itself.

Otherwise, 21st Century American life is going to amount to just us, the online comments threads, and those wonderful people out there in the dark…a long slow fading as we dreamily revisit over and over again our old glories, waiting endlessly for our close-up.

Posted in Politics | 9 Comments

Dragon Age

I finished a full run through Bioware’s Dragon Age. I’ve always liked Bioware’s approach to RPG design, and this is definitely their best to date. Not because of the setting, which is at times painfully generic or derivative. (The dwarven city is a direct visual quote of Ironforge, for example.) Not because of the storyline, though there are some very nice smaller stories and quests.

It’s the characterization that shines, which has always been the best part of Bioware’s games. The big step forward in Dragon Age is that they’ve finally gone beyond making the “dark path” through character development simply be a matter of being a selfish, snarky asshole. That was a particular problem with Mass Effect: Shepard could either be a basically noble, decent sort or he/she was a greedy jerk who made bitchy remarks to everybody.

In Dragon Age, the characterization choices you get resolve out much more satisfyingly. First, because the branch points in the narrative are often genuinely ambivalent or agonizing, with no simple “right” choice for someone who wants to play a good or noble hero. Second, because the range of characterizations includes pragmatism, bitterness, alienation, unselfish dedication, desire, etcetera. The reactions of the NPCs are also excellently complex, and the interactions between them are really satisfying. Morrigan in particular is a really impressive, intricate female character who continually surprised me.

It’s not a revolutionary game, but it does move a certain genre or mode of interactivity forward towards much more satisfying storytelling.

Posted in Games and Gaming | Comments Off on Dragon Age

Bring Out the Dead?

BloggEd, a really excellent blog by a family who are all involved in higher education one way or the other, has been talking about William Pannapacker’s Chronicle of Higher Education column that advises potential doctoral students in the humanities to avoid going unless they’re independently wealthy, have a partner that can support them, are exceptionally well-connected, or need the credential for some specific professional objective.

Pannapacker (writing as Thomas Benton) has been offering this advice for a while, and as someone who is known for saying that the short answer to the question “Should I go to graduate school?” is “no”, I’m generally in agreement with his skepticism about the prospects of aspiring academics.

There are valid counterarguments. One of the responses that BloggEd got was that every year, there are brilliant students who beat the odds and land a great tenure track position. I have some former students who fit that description, and in quite a few of their cases, I put aside my skepticism and enthusiastically suggested they pursue a Ph.D in history or anthropology. The problem with this thought is that every year, there are brilliant students who don’t beat the odds, and it’s not because they’re not brilliant. Or they win a consolation prize, which is one of the myriad really crap jobs in academia, whether that’s a bushel of adjuncting gigs or a poorly compensated, benefits-minimal position with a 5/5 load in a college or university whose medium-term survival is shaky.

Pannapacker suggests that when we egg on a student even knowing how grim the prospects are, we should have some responsibility for the consequences. Hence his view that the better way to show that responsibility is to severely constrict the numbers of students pursuing humanities Ph.Ds in the first place. Other critics worried about the situation tend to come at it from the other side and argue that something has to be done to increase the number of tenure-track positions or otherwise address exploitative labor practices in higher education.

I frankly don’t see much hope on either score, and it’s one reason why I’m happy not to be directly training graduate students. Constriction isn’t going to happen as long as doctoral students provide cheap teaching labor and as long as the only productivity metric used by administrations is high numbers of students trained.

Better employment conditions across academia aren’t going to happen unless there’s a serious reconstruction of the internal economies of most academic institutions. Magical thinking isn’t going to get us anywhere: you’ve got to find the money. That means doing less of some major activities, doing things differently, some change in the basics as they stand. It probably also means smoothing out the distribution of “better employment”, getting rid of 1/2 teaching loads at R1 institutions so that there’s less of a need to turn to adjuncts and teaching assistants, for example. But at most universities, that’s not really where the money issues lie.

The other problem is that the customers pretty much accept whatever higher education as a whole does. Unless the paying public starts demanding better value for their money, there’s little incentive to change. Students and their families presently tolerate institutions that are poorly attentive to teaching either because they don’t see any affordable alternatives for the credentials they need or because it takes four years before a student may fully grasp how they’re being short-changed.

So what is plausibly within reach of a professor who wants to try and at least chip away at these issues? Let’s stick to the humanities, and I’ll sing some songs that I’ve warbled here before. Whether they’re teaching graduate or undergraduate students, humanists have to provide much more resonant explanations of their value to students and to society as a whole.

It’s ok if some of that value comes from activities as ineffable as exploring the meaning of human existence or reading great novels. That’s not enough by itself, but value doesn’t have to be relentlessly reduced to practical utility.

But some of what we do also has to be practical, and that shouldn’t be difficult to offer to students at any level. Knowing history, for example, should give any professional some insight into how human institutions react to change, how to work with social structures, and so on. Studying the humanities should make someone more articulate as a writer and speaker. Studying culture should give future culture-makers ideas and visions, as well as techniques.

What won’t do is the proposition that the value of the humanities is defined by the standards of its disciplines, so that analytic philosophy or scholarly history or critical theory recognized as excellent by and within disciplines is tautologically deemed of value because it meets disciplinary standards. By now it should be clear that what the disciplines value is not self-evidently valuable to anyone else, even to colleagues in other disciplines. The only way it could have remained so is for expertise to retain a kind of unquestioned authority protected by elites in power. Humanists have spent forty years trying to tear that down, and digital media have successfully completed that project.

What also won’t do is pretending that the humanities have a research project which is on par with that of the sciences. Humanists have research that matters, and write works that are important. But it’s not a mirror image of what the sciences are doing: there should be less humanistic research, more carefully composed, and it should almost always have a wider and less specialized audience in mind.

It’s within the reach of any humanist to make better arguments for the larger value of humanistic knowledge to society, to argue that a society that doesn’t take an interest in philosophy, in history, in culture, in language, is coarse and brutal and will wither in the long run, lacking any sense of why any activity is done or ought to be done. It’s within the reach of any humanist to try harder to make their knowledge have a more supple practical usefulness to a wider range of students and readers rather than those who already conform most to the mindset of academic specialists.

The bigger structural issues are beyond most of us, and at the moment, I don’t see any comprehensively great ideas out there about which direction higher education as a whole ought to go. So change what we can change and hope that does some diffuse good for the larger problems.

Posted in Academia | 7 Comments

The Bestest Native of All

Some readers will remember that I defended District 9 against some of the criticisms levelled against it. I don’t know that I’d do the same for Avatar, at least not its plot. My position on the film otherwise is pretty close to the standard line that the visuals are game-changing to some extent, that this might be the 3D equivalent of The Jazz Singer: a mediocre film that transforms the technology of cinematic representation.

The plot, though, is close to a direct retread of Dances With Wolves, one of my least favorite movies. In certain respects, Avatar‘s version is even more annoying. Not only does the white man show up and get to out-native the natives after he undergoes the requisite rites of passage, he turns into a low-rent Paul Muad’Dib and becomes their prophetic leader.

So why do I think this is different from District 9‘s version of the going-native trope? A lot of it comes down to Wikus as a character and to the staging of that plot. Wikus doesn’t want to go native, he is really never particularly noble in the way he lives out that narrative, and his pivotal role in the events of the film is driven by a series of accidents.

The thing of it is, in the early modern world, there were a lot of episodes of cultural contact that were rather like Wikus’ situation. There were in fact Europeans who “went native” after being captured or shipwrecked or lost. What’s interesting about them is that many acquired a lot of knowledge about and empathy for the societies they found themselves within–but many of them were also perfectly willing to sell out their new companions when they came back into contact with their European countrymen. I think District 9 makes it clear that this is pretty much how Wikus sees things right up to the end of the story, perhaps even after; it’s just that he’s never given the opportunity to do so because his own body has become a desired resource.

So this is why I think it’d be a mistake to just take the going-native trope off the board entirely, to insist that it’s always racist in some fashion. It may always be about race, but that’s another matter. To be honest, virtually all depictions of aliens in speculative fiction are about race in some manner. Walling that entire terrain off as inevitably offensive is a classic kind of white-liberal politesse, about making sure that nothing is ever racist by never doing imaginative or creative work with race at all.

The version Avatar serves up, though, if it’s not racist in the Simon Legree sense, is pretty noxious: it turns the natives into moral cardboard cutouts and the European-gone-native into a demonstration of the paragon virtues of Western universalism, so accomplished that it can inhabit any subject position, any history, any culture, and outdo it in every respect, be the Native 2.0. The protagonist of Avatar is never really lonely, never truly confused, and his choice in the end is made simple: he’s giving up nothing that matters, nothing at all. His new people doubt him and hate him for a moment, but it’s nothing like what usually happens to a T.E. Lawrence or an Emma McCune, allowed so far in to a local world on the strength of romantic longing for the Other, but always suspected of ulterior motives and manipulative intent.

==============

If I can be permitted to sneak in a second, far more geeky objection, I wish that Avatar took the alienness of its setting more seriously, and I honestly think that a bit of thinking through the background story could have made the stock plotline less noxious and more dramatically fraught.

Here’s the issue: why would a planet where all life is symbiotically connected have what looks like evolutionary competition? Why have predators and prey at all? Why should the Na’vi make weapons and hunt? Shouldn’t they just plug in their tentacles at the nearest tree and have some protein on the hoof sent their way for compliant consumption? Why, for that matter, are there different groups of Na’vi who ordinarily do not cooperate (hence the need for Jake Sully to lead them into an alliance)?

The only way I can think to make this all work is that life on Pandora evolved by natural selection but at some point, some kind of symbiotic organism inside the biosphere achieved a networked sentience and then began to infiltrate or colonize all organisms on Pandora. If the film had introduced this as an idea, it might have lent a bit more tension to Jake Sully’s situation. How can he be sure that the Na’vi aren’t really just zombies under the control of a creepy planetary-scale parasite? If he takes the final step and leaves his escape hatch behind, how can he be sure that he’s still himself? It’s kind of like the dilemma that people taking psychoactive drugs have to face before beginning treatment: will I still be me? How will I know if I’m not? This might also have made the situation of the corporate people a bit more sympathetic, if the biology of Pandora had been at times more viscerally invasive rather than invariably beautiful and awe-inspiring. It’s a bit easier to bomb a swarm of parasitic worms infiltrating the bodies of other organisms than it is to gun down night elves and gossamer jellyfish.

Posted in Popular Culture | 5 Comments

Disposed to Propose

I’ve done a fair bit of judging proposals for grants over the years, and a recent experience doing so pushed me to finally assemble some notes and thoughts I’ve been collecting. These are specific to undergraduates: graduate and faculty proposals are a different kettle of fish. These points apply not just to proposals for grants and fellowships, though. They even cover certain kinds of writing for the classroom, but more pointedly a good deal of activism and community learning undertaken by undergraduates at institutions like Swarthmore. Primarily what I’m going to talk about here is a sort of rhetorical posture that causes problems in a lot of proposals and plans.

I’m giving this advice both pragmatically and idealistically. Pragmatically because I assume that many undergraduates seeking grants or proposing action would like to win a grant or see their proposals enacted, and the problems I’m going to talk about often keep that from happening. Idealistically because I think some of the shifts I’m suggesting are desirable whether or not they result in a greater success rate for applicants.

————-

The basic problem I’ve seen over the last five years or so, witnessed very intensely during my recent experience of judging, is a Promethean posture commonly adopted by undergraduates who are proposing to study social problems, formulate public policy, or work with communities, especially communities that are commonly understood to have special burdens in terms of social problems or to need some kind of policy intervention.

I feel some responsibility for the intense tone of hubris, sometimes verging on messianism, that often cripples proposals of this kind.

First, because I think that this kind of rhetoric is initially taught to smart, ambitious undergraduates through the ordeal of applying to selective universities (and all the preparatory work those students do during their high school years). We goad 18-year-olds into narrating their lives as bursting with accomplishment, as already being fully realized, and often particularly reward those who describe those accomplishments in terms of service. I could do with a few more admitted students who are smart and well-read but who haven’t already performed an innovative new procedure in prosthetic surgery on lepers wounded by land mines while also boosting agricultural yields in surrounding communities and using microfinance to help encourage fair-trade production of organic wines.

Second, I think sometimes in the social sciences in particular, we solicit and reward student writing with strong arguments, which sometimes includes an expansive, assertive vision of action or policy, or at least a strong claim about the authority of social theory. It’s a bit as if a medical school curriculum included a series of courses which asked students to write–purely abstractly–aggressive plans of surgical intervention without mentioning the Hippocratic Oath. We often don’t get around to directly advising students about how to move from this kind of writing (which has its value) to intelligently laying out a humbler, more tentative approach to problems and policies as they exist in the world outside the university.

————-

So, some specific advice for undergraduates seeking fellowships and grants that have some element of social action or who are involved in community projects or drafting plans for social action.

1. Keep it manageable. Work with one place or a small group of people. Study or work with a small, tractable portion of an issue. In my recent experience, I was staggered–and occasionally amused–by the immensity of the ambitions in some proposals.

2. Along with modesty in subject focus, some modesty towards the site of proposed work or activism would be helpful. This problem is often most exaggerated among students with the strongest political or social ideologies. (Not just, though quite often, on the left: I’ve seen this issue crop up among openly religious conservatives as well.) A surprising number of proposers set themselves up as bringing fire from the gods. It just comes off badly when a 22-year-old asks for resources to go to a place with which they have at best a short-term acquaintance in order to lecture the local people about self-actualization, economic development, political transformation, what have you. Many newly minted B.A.s have something of distinctive value to bring to the table, but they shouldn’t oversell themselves or their ideas.

3. Respect the existing intractability and historical development of real-world social problems. This is partly a matter of attitude, but it’s also about doing some intellectual homework about why social problems exist and about what people in various places are already trying to do about them. The social problem a proposal is hoping to study or address doesn’t exist simply because the person making the proposal has yet to make their triumphant arrival on the scene. Proposals which casually sweep away practical obstacles as if they didn’t exist and ignore the realities of history and context leave a very bad impression.

4. Don’t reinvent the wheel. If it just takes me a few minutes on Google to find out that the supposedly novel organization an undergraduate proposes to create in some struggling community is more or less a carbon copy of an existing community organization that’s already there, then the need for yet another organization has to be at least discussed in the proposal itself. I’m not sure what’s worse when I read something like this: when I think the proposer doesn’t know that they’re just duplicating existing efforts or when I think that they do know it. It’s better to work with an existing organization if you’re a newcomer, whatever its shortcomings might be–if nothing else, working with a badly flawed existing group is a great way to come away a lot wiser about what can and cannot be done. (See #3.)

5. First do no harm. Works for doctors, and it should apply equally to aspiring development experts, policy wonks, demographers, and so on. There isn’t any way to get around the fact that a tremendous amount of social policy amounts to human experimentation (really, in some sense, any meaningful action we take in the world, even as individuals, can have that dimension). But an undergraduate or recent graduate who is proposing to do something where there is real potential for serious and direct harm to the health, welfare and well-being of real individuals leaves most evaluators feeling very uncomfortable unless that proposal is being carried out under strong supervision. Again, this is where I often feel acutely worried about what we’re teaching in the social sciences, given that we seem to be egging on students who think nothing of proposing aggressive and very direct interventions in communities and personal lives.

6. Be curious. If you’re going somewhere to learn something, then be open to what you might learn. I concede that a well-written proposal needs to come off as confident and well-prepared. But a proposal that’s written as if the proposer has already had the experiences they’re asking to be funded to have is unconvincing. There’s a way to sound both tentative and diligent at the same time.

Posted in Academia, Africa, Politics | 2 Comments

The Problem of Organizations

This is an old idea, particularly in the branch of sociological thought that descends from Weber, but it really seems to me that the political problem of the 21st Century is not a problem of markets or capitalism, not of the state, not of ideologies or religions, but of institutions and organizations. Loosely speaking, what doesn’t work about government as a whole is also what doesn’t work about a local religious charity. What doesn’t work about financial capitalism is what doesn’t work about the Chamber of Commerce in a small town.

Loosely speaking, I stress again. The consequences of the collapse of the global financial system are different than when a local veterinary practice stops being meticulous about controlling infection in their clinical work. The inability of a vitally important government bureaucracy to deliver useful services where they’re needed most means something different than when a new restaurant loses money and fails because the front-of-the-house staff are poorly trained and indifferently motivated.

It’s not just consequences that differ. Modern institutions succeed and fail because of their organizational cultures, and those are shaped by their own singular internal histories as an organization and by the larger history of the activity to which they are devoted, as well as by the contingent actions of individuals working within an organization. This is where an overly functionalist account of organizations always comes to grief. But it seems to me that there is also something about modern institutions that makes their problems comparable: similar relationships to state, society and market, but also to individuals, similar devices and technologies of organization.

This is my roundabout way of approaching recent stories about institutional failure within higher education. Take for example this New York Times story about the Stevens Institute, a New Jersey university staggering from charges of financial mismanagement levelled against its president. As the Times notes, it’s not the first time we’ve seen this in the last decade: university president, often at a lower-tier or less-selective institution, backed by servile trustees, gorges on personal luxuries and extravagances, usually claiming that they were necessary for effective fund-raising. Often accompanied by defensive and petty retaliations visited upon critics within the organization and in the wider community, and various other kinds of tinpot managerialism.

The larger sociocultural story involved here, beyond just universities, is the rise of the new executive plutocracy across a range of institutions, because similar episodes of executives-gone-wild have cropped up in many businesses and in non-profit community organizations over the last decade. The larger institutional issue is the failure of mechanisms designed to safeguard the long-term mission of such institutions. Trustees, boards of management, auditors, ratings firms, accreditors, all turn out to be possible to suborn when there’s sufficient will, when there’s a lack of transparent operations, and when the institution in question is one which no one really cares about save those who work with or rely upon it.

But institutional failure can happen from below as well, in the ordinary work or activities of an organization. Margaret Soltan wrote recently about the case of the UC Davis employee charged with reporting on sexual violence who inflated crime numbers for years as well as misused funds. As Soltan points out, this was more than just an individual misdeed. But I’d also say it’s more than just a problem with academia or with a particular belief system. This is a danger every single time you create an office, position, division or what have you within an organization that has a specialized responsibility and no sunset clause. That’s a hammer that’s almost bound to see everything as a nail even if the people charged with that responsibility are saints.

The subtle problem that organizations and institutions pose to contemporary life is that people who live inside an institutional culture often are so sensitive to the nuances of the way things work, the limits and possibilities of change within the institution, that they let problems and failures slide or pass. No one wants to be that guy, the one who rants about everything. And that’s what often happens to someone inside an institution who blows the whistle on a bad practice or a growing issue, because that often ends with that person in a kind of internal exile, and in that circumstance, a loss of a sense of proportion is all but inevitable. Everything will come to look suspect or corrupt.

Institutions often resist external monitoring for the same reason. Institutional actors really do know things about how and why things work inside their worlds that outsiders don’t know. Precisely because a lot of that knowledge is about culture, about the unspoken patterns of everyday life, it can’t just be made transparent by providing documents or access. A monitoring group which gains an intimate knowledge of that interior culture of their target of scrutiny becomes a part of that culture in the process, loses perspective. A monitoring group which maintains a steely, formal distance from that kind of knowledge tends to constantly screw up processes which are working well, to create formalities and record-keeping requirements which become a burden without providing a service, and to be presumptively hostile to the organization they monitor even when it’s not warranted.

And of course outside monitors are themselves institutions, and just as prone to developing arteriosclerotic rot in their own procedures and internal culture. No institution wants to be monitored by another institution which may become just as fallible or broken.

Quis custodiet ipsos custodes and all that. It reminds me of Dr. Seuss’s image of a world full of people holding the tail of the person in front of them.

There are ways to split the difference, but most of them are time-consuming and resource-consuming, the equivalent of breaking the alarm glass when there’s a fire. Outside monitors can use forceful legal implements like depositions to produce narratives about how an institution works (though this is more often past tense: how it worked, before it failed). Consultancies with specialist knowledge offer themselves (for a fee) as honest auditors who will sensitively tell an institution how to fix itself–but there’s only so far you can go before the customer stops payment.

The political and social problem of making institutions renewable and self-repairing without handing them a perpetual license to seek transfers, to be always “too important to fail”, is the real problem of the 21st Century. It applies across market and state, civil society and private life. I think there’s a part of the problem that is in some broad sense technical, that can benefit from information technologies, from new forms and designs of social networks, from new possible relationships between publics and institutions. But it would be a mistake to think primarily in those terms, to make these out to be questions of design. Thomas Malaby’s recent book on Second Life is a smart, meditative example of why trying to create culture through superior design usually falls short of its own ambitions, and indeed, those ambitions are often part of the problem.

Institutions work best through and are safeguarded most by strong cultures of professionalism, loyalty, and honor. Institutions are most at risk from parasitic infiltrations which adroitly use professionalism and loyalty as shields and weapons, which act like cancer cells, turning healthy structures into diseased ones. Problems of cultural maintenance and cultural creation are the hardest of all, because they can only be worked upon through incremental action within culture, with a humble sense of the immediate horizons of plausible transformations.

But the fate of institutions can’t just be left to them alone, because even the least of them has some kind of consequential social and economic power. If we need to think about how to live better within our institutions, we also need to think about how to act more wisely towards the institutions of others, to concern ourselves with their workings and when necessary, find smarter and more humane ways to intervene in their affairs and even to shut them down. The tools we have aren’t up to it, and the habits we have even less so.

Posted in Academia, Politics | 3 Comments

Fantasy Bests

It’s a New Year, so I’m going to get back in gear on this blog, which I’ve had to leave a bit moribund for a while as I concentrated on some other things and did some travelling. Many entries to come.
———————

I kept meaning to put my list of the six best fantasy novels into the comments thread at Crooked Timber but time got the better of me and before I knew it what was up at CT instead was a bizarrely contentious comments thread on Scott McLemee’s totally legitimate critique of Cornel West’s latest book.

So, much later, here’s my list of the best six, make that TEN, fantasy novels.

But first a word on this sort of exercise as well. Some might pooh-pooh the idea of such a list as always hobbled by the mixing of apples and oranges, or by the impossibility of clearly defining the field from which a list is selected. The thread at Crooked Timber had a lot of that kind of discussion. But it also showed why the exercise is a good one, partly because it brings out into the open the range of assumptions that audiences make about a particular kind of culture. It’s also interesting to see how passionately felt these kinds of judgments can be, both about individual works that one puts (or does not put) on a list, and about what the principles of constructing such a list ought to be. For myself, when I make a list like this, I try to balance representing the diversity of a field with a nod to canonical works which I agree have great historical importance in shaping that field. Plus I like to throw in a few idiosyncratic judgments about work that I think is underrepresented or overlooked.

So here’s my list:

John Crowley, Little, Big. In the CT thread, there was a pretty sharp split between people who simply don’t like this book and those who love it. I can actually see both sides. It’s a very atmospheric work: you’re either drawn into the mood it creates or you’re not.

Barry Hughart, Bridge of Birds. There isn’t a lot of fantasy out there that works with non-Western themes, stock narratives, and so on. Some of the few books that try to do so come off pretty badly because they’re built on a crudely Western perspective on non-Western folk cultures or mythologies. But I really like Hughart’s work with a fantasy China in his hard-to-find series.

Ursula Le Guin, The Farthest Shore. Earthsea seems another series that divides a lot of genre readers. For me, it was an important counterpoint to Tolkien when I first discovered it: quieter, more contemplative, intelligent in its thinking about magic. It’s such a commonplace in fantasy works that magic has a price or a cost, but rarely is that worked out as more than a slogan, given that readers are almost always meant to covet magic and identify with sensitive wielders of its power within a given setting.

Mervyn Peake, Titus Groan. Another book to savor for mood rather than plot, but I think that’s often what defines fantasy best, as a setting and feeling. Plot-driven fantasy frequently struggles to be anything besides “innocent farm boy discovers he is secretly a prince, gets a magic sword and a wise mentor, meets girl, loses girl, defeats enemy, wins kingdom, gets girl.” I first read Titus Groan while living in a homely but pleasant bedsit in London while doing my dissertation research: it pretty much defined for me that sense of a fantasy work that generated a sense of being adrift in a world whose everyday rules and sensations were different from my own.

K.J. Bishop, The Etched City. Yet another book that’s about mood rather than story. (In fact, the plot misses a lot of opportunities for smart closure and clever connections.) I regard this book as my favorite example of the kind of fantasy that Mieville, Vandermeer, or Alan Campbell have written, of grim quasi-Victorian imaginary cities full of dark satanic mills of one sort or another.

Lloyd Alexander, The High King. Despite my slam above at the boy-becomes-king narrative, this is a really terrific example of that baseline story. What makes it work so well even now is partly the persuasive underlying morality of the story, that its protagonist is faced with such difficult choices and genuinely earns his kingship rather than receiving it through some innate nobility.

Roger Zelazny, Lord of Light. Right, I know, it’s also “science fiction”. A good book for pushing genre definitions in that respect, but it’s also just a great book, period. When I first read it as a teenager, I do remember getting a bit tripped up on the temporal framing of the story until I’d read it through twice, and even now that seems a bit rough to me. It has some of Zelazny’s typical schtick, but it’s in its most appealing and interesting form here.

T.H. White, The Once and Future King. Long a favorite, but I do sometimes wonder why when I re-read it. It has long stretches that are emotionally distant. The Lancelot-Guinevere material suffers some from White’s own remote and austerely tormented masculinity, his inability to really imagine Guinevere (or any other woman) in an even vaguely sympathetic way. When I was young, the material after Arthur’s childhood didn’t always work for me. But now at least some of it does: the regrets, the inability to break habits, the confinement of commitments made and codes adopted. The moral force of the first book is also still so very powerful, and the little asides about medieval life are also a kind of ground-floor realism about that backdrop that the routine sword-and-sorcery works in the genre still decline to take up.

Marion Zimmer Bradley, The Mists of Avalon. Best read alongside White, but it’s also a smart critique of the entire genre, and opened the way for a lot of other inversions and deconstructions.

JRR Tolkien, The Lord of the Rings. Gotta have it, even if its many imitations are an affliction on fantasy as a whole.
——-

What’s not on my top ten, and why.

George R.R. Martin. Partly because the series is unfinished (I suspect it will remain so) and partly because I think the pleasures of A Song of Ice and Fire are partly a matter of counterprogramming against a wretched brood of Tolkienish imitators.

John Bellairs, The Face in the Frost. A bit too slight to make the top ten, but I do love this book.

Philip Pullman, His Dark Materials. I like The Amber Spyglass better than most people do, but I’d agree the series falls down a bit in a number of ways in the third volume.

Samuel Delany, Neveryon. I tried to teach this book once in a course on historical memory. Unfortunately, it takes about 600 pages (and two books in the series) for the point to sink in, so it didn’t work very well. I think I’d include it as part of any master course in fantasy–it’s a great work of literary criticism disguised as a work of literature, really.

J.K. Rowling, Harry Potter. I really do like these books as a whole, but I don’t think of them as top ten material.

Guy Gavriel Kay. Again, almost. I just don’t think anything Kay has written quite cracks this list–yet. But I feel as if some future work might. Kay raises the same question for me that Susanna Clarke (Jonathan Strange & Mr Norrell) does: namely, what does making a work of fiction into a work of fantasy permit that writing a historical novel does not? I’m not always clear with Kay or Clarke what writing in a speculative mode accomplishes.

Clive Barker, Imajica. The CT thread brought this up, and I was almost tempted to include it, as I remember it making a big impression on me when I read it. But there’s something about the book that doesn’t quite cross the threshold, though I’m hard-pressed to say why.

Robert Holdstock, Mythago Wood. Too diagrammatic, which is sort of the point of the book, I know. Again, I’d include it in any master-class on fantasy, for sure.

Neil Gaiman. You may commence throwing things at me, but I think he’s a pleasant but unexceptional fantasy writer who is also the writer of a very good comic-book series. None of his fantasy novels have wowed me, though none of them have bugged or annoyed me, either.

Madeleine L’Engle, A Wrinkle in Time and A Wind in the Door. Still very good books, but re-reading them, I found them a bit preachy and very prone to tell rather than show when it comes to declaring things beautiful and wonderful and horrible.

Jorge Luis Borges. If I were to classify him as fantasy? Oh yes, we have a winner. I guess when all is said and done, I still think of fantasy as genre, which is not the same as fiction with elements of the fantastic. That list is a different list populated with Swift, Shelley, Borges and others. But I know full well that this is also a bad view in many respects, using genre as confinement, as a kind of fannish self-hatred, and so on. It sets up a wretched situation where the fan has to argue that their favorite works are “really” literature, or deserve favorable comparison with “mainstream” work. But genre is real, or at least the real product of histories of readership and circulation, and can’t just be abolished like that. I do think it would be profitable to ask which of the ten above I’d put into the same weight class as Swift or Borges and think they’d emerge creditably. On the flip side, ask me when the last time I read Swift for pleasure, and he might not come out so well. (Borges does pretty well in either context, on the other hand.)

Posted in Books, The Mixed-Up Bookshelves | 13 Comments