Oblivion!

I’ve got to do a better job of breaking up the “Grandpa Burke Tells the Rest of the World What To Do, Dammit” posts with more fun stuff. I mean, these days, the best blogs I’m reading are several great comics-oriented blogs rather than the Very Serious Academic blogs.

So, if you’re not playing Oblivion, and you like computer or video games, you should be playing it. In fact, if you’ve never played a computer or video game, you might want to think about giving Oblivion a whirl, if you have a PC that meets the technical requirements. There’s some stuff involved that is counter-intuitively part of the deep “grammar” of computer role-playing games, but the game also delivers a lot of immersive pleasures that I think even a non-gamer could appreciate.

If you’re one of those who haven’t played this kind of game, or any game, the basic gimmick is that you create a character to represent you in this world and jump right into the game. You start in prison, meet the Emperor of a vast kingdom who happens to need to use the secret passage in your cell in order to escape a gang of magical assassins, and get plunged into the world from there.

The difference between Oblivion and the vast majority of other games of its kind is the openness of the gameworld and the often beautiful naturalism of its visual design. In a way, its closest cousin is not another fantasy role-playing game but Grand Theft Auto, in that you can go anywhere and do almost anything within the geographical confines of the setting. It differs from Grand Theft Auto in that you can play a highly moral character if you want; in fact, that’s sort of the default “moral setting”. But everywhere you go, you get the sense that life is happening all around you. Computer-controlled characters move around, conduct business and conversations of their own, come and find you if they have business with you, and so on. There is a main storyline for you to follow, but you can also ignore it and become part of a huge number of smaller peripheral stories, which often present you with some interesting choices about which side to take.

My daughter Emma and I have been playing quite a bit of it together. It fascinates her like no other game she has seen so far. She’s very strict about forbidding her character to steal or do anything dubious, and is obsessively protective of her character’s horse. She is also adamantly collecting a huge stockpile of rat-meat taken from the giant rats her character has destroyed, despite the meat being useless.

She refuses (so far) to advance the main storyline, as it involves chasing down some demons in the rather scary alternative dimension of Oblivion. I think that’s good–I went in there last night with my own character after she was asleep, and the images involved are a bit strong for her. Occasionally I have to lead her away from other things. I had to maneuver her out of the room when her character started a mini-quest that turns out to involve a gang of female thieves who lure unfaithful men out to a remote farmhouse by promising them a liaison, only to rob them when they show up. I was sort of impressed by the writing of this storyline, which could easily have had that typical cheesy geek-male prurience but didn’t. Otherwise it’s turning out to be a great game for me to play with her collaboratively.

Posted in Games and Gaming | 11 Comments

Your Assignment: Turd in the Punchbowl

I had a postdoctoral fellowship at the Center for Historical Analysis at Rutgers that was one of the more satisfying professional experiences I have ever had. The topic for the year was the history of consumption and commodities, and a lot of the fellows, both local and from outside, were really compelling scholars who gave me a lot of insights into my own research project.

It started with a really terrifying experience, however. The first seminar meeting was dedicated to a paper presented by a visiting speaker, an accomplished historian who had previously done excellent work on the history of religion in England. The speaker was presenting some work on the history of consumerism and masculinity in 20th Century England that focused heavily on toiletries for men. Since one of the major topics I was writing about was the production and consumption of similar products in southern Africa, the Center’s convener for the year, Victoria de Grazia, asked me to be the discussant. This made me nervous: it would be my first service as a discussant since graduate school, and I would be kicking the whole thing off. What made me more nervous, as the event approached, was that the paper was, in my opinion, pretty bad. It had a really worrisome methodological twist, in that it was based on access to company records that only this historian had been or would be permitted to see, but that wouldn’t have been an issue but for the argument and tone of the paper. Basically the paper gave off some weird psychoanalytic vibes, as its main argument seemed to be that in the course of the 20th Century, the natural earthiness of male odor had been disciplined by overly prim and controlling female sensibilities acting through the agency of toiletry manufacturers. I could imagine a version of that argument at least being taken seriously, but in this iteration of the paper, it was in all honesty pretty goofy.

So I fretted for a week. What should I say? Be pleasant and neutrally encouraging, the way discussants or academic book reviewers often are? “This paper makes an important contribution to the field…” and all that? Or should I be straightforward, if polite, about how troubled the paper was? I didn’t think then and don’t think now that the job of the discussant is to pedantically correct the presenter about relative trivia, or even to tell the presenter about the paper they should have written. It’s to figure out what is discussable in a given paper. But you still have to choose whether to discuss the paper from the premise that it sets the conversation off in a strong way or whether the discussion is about what’s wrong with the paper.

I ended up coming in with a pretty hard-hitting commentary which went over well, though the presenter was obviously very unhappy with me. The point for me was to try and use the commentary as a springboard to bigger issues: what was good evidence in the context of studying the history of consumerism and commodities? What can we find in corporate holdings, and what can we do to encourage them to be shared with historians? What represents a sound interpretation about the personal experience of consumer desire in the past? Why do felt needs and material practices change? But I did make it clear that the scholar presenting the paper was barking up the wrong tree on each and every one of those questions.

I’m thinking about this memory this week in the context of sparring a bit with Mark Bauerlein at the Valve. Partly I was just irritated with Bauerlein for filing a kind of entry that at this point is long past its expiration date, the “look at the stupid titles on papers at conference sessions I didn’t attend”. That’s the blog equivalent of phoning it in.

In the discussion, however, Bauerlein made an interesting point about scholars in the field of composition who try to address the problem of how racial identity affects writing, and implicitly, about whether that effect is something that should be reduced through efforts to standardize analytic writing. Bauerlein observed, “The proper handling of racial differences, including hypothesized differences in writing or arguing, takes years of study in history, sociology, demographics, and psychology…I spent three years in archives documenting race relations from 100 years ago, when racial differences were an intense focus in public and private life, and I wouldn’t trust myself to manage that question in the classroom or at a conference.”

I observed in reply at the Valve that the standard Bauerlein sets here ought to apply not just to assertions about the impact of racial identity on student writing, but to any question in composition, and even to any question involving the interpretation of texts or culture. Here I want to go in a different direction, namely: if Bauerlein thinks this, he has every reason to attend a conference session of this type and make that criticism in a friendly and constructive manner to the presenters. A lot of academic life has a tendency towards the “mannered and tendentious”, towards closed and self-confirming forms of reasoning and evidence. To sing an old song of mine, it’s not just “liberal” disciplines that have that problem. Economics is as self-confirming and hermetically sealed a discipline as you’ll ever see, and its closed circles are typically anything but liberal or left-wing.

In practice, this is part of what interoperability in academic life should be about: the willingness to go to presentations or to read articles where you have fundamental disagreements with the starting premises or underlying assumptions and to push back critically on what is being said or written.

You have to be fair, and you have to talk about what is actually said in the paper or written in a scholarly work in the context of the discipline at hand. It’s both bad manners and pointless to go to an economics presentation and hold the paper accountable for simply being in the discipline of economics. I once got involved in a discussion of an article with a colleague in economics whom I like quite a bit where I started going off about the entire concept of SES as a metric. That was not a useful contribution on my part. On the other hand, a case where I thought I got it right was a faculty lecture by a colleague in religion on myth and historical narrative in an area of Southeast Asia. I pushed pretty hard on the way he criticized certain modern narratives as inauthentic and non-indigenous while ignoring the ways in which the narrative he took to be “baseline indigenous” could easily be interpreted as a story told by conquerors about their violent suppression of an earlier autochthonous group in the region. In the context of this presentation, that was an unwelcome reading: it disrupted the entire premise of the lecture, and came from outside the framework that the speaker was working within. But I think it was a useful reading that pushed back on the lecture in a way that the speaker recognized as legitimate.

It’s hard to find the time in any college or university to even attend lectures and presentations within your own core areas of competency. I’ve missed several this semester that I felt a serious obligation and desire to attend. In some ways, however, we should feel an even greater drive to go to presentations where our presence is unexpected and press hard on those presentations from unexpected directions. If Bauerlein is doubtful about whether a scholar of composition can successfully integrate evidence about racial sociology, psychological identity, cognition, and the history of racialized reading and writing into a presentation, he owes it to the people making such presentations to push back on their work in those terms, to raise the questions that their immediate colleagues would refuse to ask or would not think to ask. It’s not just him: I think this should be a requirement of professional scholarly life, to “cross-train” both for your own benefit and for the benefit of others. We shouldn’t just tolerate this sort of unfamiliar intervention, but actively wish for it, as long as it is professional, generous in intent, and accepting of the basic legitimacy of our own disciplinary practices.

Posted in Academia | 4 Comments

Kenyon’s Confession

I have been a bit surprised at how surprised some observers are about Jennifer Delahunty Britz’ op-ed piece in the New York Times regarding the role of gender in the admissions process at many selective private colleges and universities.

Britz revealed that Kenyon is forced to turn away a disproportionate number of qualified female applicants in order to achieve an approximate balance between male and female students in its admitted class, because more qualified women apply than men. Some of the angry responses, such as Katha Pollitt’s in The Nation, attack Kenyon directly, as if its policy were an offensive aberration that needs to be opposed. Unfortunately for anyone who objects to this approach, it’s actually pretty common.

In fact, this approach is part of a wide range of attributes that the logic of selective admissions favors in similarly unbalanced fashion. Better to be from Alaska than New York City. Better to be a first-generation college student–or a legacy. Better to be a person of color, unless you’re of East Asian descent. Better to have a highly anomalous talent or background than be a valedictorian and student body president. And so on. Anybody who reads Jacques Steinberg’s The Gatekeepers, on a single year’s admissions process at Wesleyan, will get a fairly good sense of the complicated, sometimes almost absurdly intricate, ambitions for a range of identities, experiences and aptitudes that go into composing an ideal class at a selective institution. (Pollitt especially should read it, since she frets about whether Wesleyan does what Kenyon does. Answer: yeah, probably.) Of course, this is also just highly selective private institutions we’re talking about here: major public institutions are another matter, as are less selective private ones.

There’s a hubris involved in the whole process, a kind of social engineering that is sometimes bizarrely fine-grained when you get down to the readings that particular admissions officers offer of particular dossiers. None of these schools are admitting true wild cards: there is a pretty narrow respectable range to the “diversity” they seek. On the other hand, all of them could probably fill an entire class with nothing but highly accomplished white men and women from upper middle-class backgrounds whose main declared educational ambitions would be to major in economics or biology. They don’t, primarily because they feel that this would ultimately harm the appeal of their educational program to future applicants and negatively affect the overall health and vigor of the institution. Hence the disproportionate desire to admit students from South Dakota, Native American students, students who’ve spent their spare time in high school fighting Guinea worm in Nigeria or breeding champion pigs in Nebraska–and men, when the applicant pool is strongly tilted towards women.

If you don’t like this approach when it comes to gender, then arguably you don’t like it when it comes to race, ethnicity, geographical origin, and even accomplishment when accomplishment is not directly connected to probable academic success. If you think this is valid on everything but gender (as Pollitt seems to), I’d like to hear how you see the difference. I think you could make a good argument for simply randomizing the admissions process at most selective universities and colleges (e.g., set a high minimum threshold of admissions criteria and admit at random from everyone who clears it), but that would be a big change–I have a colleague here who has seriously advocated that shift. It would probably leave you with a student body that was as much as 60% women, and with other demographic shifts as well. You’d also lose the ability to try and ensure a balance of academic interests and plans among your admitted students, with some likely consequences. There might be other ways to make the admissions process more difficult to game and less engaged in micromanaging a range of perceived attractive attributes. For one, I wish all the selective colleges and universities would drop the personal essay in favor of a more rigorous essay asking for analytic or intellectual responses to an ambitious or challenging prompt.
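To make the arithmetic behind that 60% figure concrete, here is a minimal simulation sketch in Python of the threshold-plus-lottery idea. Every number in it is an invented assumption (pool size, gender split, score distribution, threshold, class size), not real admissions data; the point is just that a lottery over everyone who clears the bar reproduces whatever tilt exists in the qualified pool.

    import random

    # Sketch of "set a high minimum threshold, then admit at random."
    # All figures below are hypothetical, invented for illustration.
    random.seed(1)

    POOL_SIZE = 4000       # hypothetical applicant pool
    SHARE_WOMEN = 0.60     # pool tilted towards women, as described above
    THRESHOLD = 1300       # hypothetical minimum qualification score
    CLASS_SIZE = 450       # hypothetical admitted class

    # Qualification scores are drawn identically for everyone, so any
    # skew among admits comes from the makeup of the pool itself.
    pool = [("F" if random.random() < SHARE_WOMEN else "M",
             random.gauss(1250, 150)) for _ in range(POOL_SIZE)]

    qualified = [a for a in pool if a[1] >= THRESHOLD]
    admitted = random.sample(qualified, CLASS_SIZE)

    women = sum(1 for gender, _ in admitted if gender == "F")
    print(f"Women among admits: {women / CLASS_SIZE:.0%}")
    # Prints roughly 60%: the lottery mirrors the pool's tilt.

That is all a lottery does: it passes the composition of the qualified pool straight through to the admitted class, which is why the gender balancing now done by hand would disappear under it.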

But if you see any legitimacy to weighting in favor of students from Pitcairn Island and students who are world-class kazoo players, then the only way to single out the pursuit of a fifty-fifty gender ratio for objection is to argue that this objective alone is unimportant in trying to engineer heterogeneity in an admitted class, that a 60-40 ratio is no different from a 50-50 ratio in its effects on the culture and life of a selective college campus. That’s possible, but I do think there’s probably a tipping point where the gender ratio really would begin to affect the pervasive feel or character of a college or university. It’s also possible that those who object to this approach might discover in their objection that they object to “affirmative action” in college admissions across the board whenever the goal of such action is to engineer diversity or pluralism in an admitted class. I’d at least like to see some of the people with the strongest reactions, like Pollitt, show some awareness of the nature of the minefield they’re careening into.

Posted in Academia | 29 Comments

School Ties

Margaret Soltan has been doing a fantastic job lately of tracking stories about the poisoned, hopelessly corrupt relationship between higher education and athletics.

Here we are in the middle of March Madness again. This year, as every year, brings to light the endless, fervid creative energies invested by some college athletics programs in evading both the spirit and letter of rules and restrictions. “Prep schools” designed to free youthful athletic prospects from any semblance of education. “Bunny courses” hand-picked by administrators to ensure star athletes never have to meet any serious academic standards. More importantly, some of the stories that Soltan has been tracking have suggested that even the usual alibi, that such programs are major sources of operating funds for public and private universities, is false in many cases, that many major athletic programs cost far more than they take in, or that their financial benefits to the academic programs are minor at best. Not only are many institutions selling their soul, they’re not even getting a good price for it.

It’s fairly odd, in a way, that so much fervor gets put into the debate over whether the professoriate is too liberal, or whether academic institutions are stagnant, and yet this topic gets glossed over by the usual suspects hammering on universities and colleges for their shortcomings. In scope and significance, the relationship between athletics and academia gives us much more to worry about.

Athletics are an important part of any four-year residential program of education. The standard justifications for an athletics program at a school like Swarthmore are perfectly valid. You learn things through athletic competition that you don’t learn in the classroom. When we eliminated football here, one of the anguished objections was that Swarthmore students were already too over-intellectualized, too captured by a narrow conception of scholasticism. I don’t know that I agree with that, exactly, but I understand something of the underlying complaint. Athletics are potentially a good cure for insularity in many respects. However, whether you’re Swarthmore or UCLA, all of these functions are served just as well by a strong program of intramural and small-stakes intercollegiate competition. None of them are dependent upon building a strongly competitive athletics program, or having powerhouse teams.

The other function of an athletics program in many academic institutions is the real reason they are defended so strongly by so many. They give alumni a way to hang on to the institutions, a sense of ongoing connection, a powerful and legitimate fiction of extensible institutional and communal identity. There is very little else that alumni can view from a distance that allows this kind of pleasant affinity, this sense of belonging to an ongoing tradition. I’ve found it hard to fit in a continuing engagement with our alums this semester through a year-long program of reading recent novels and memoirs of Africa, which is entirely my own fault due to overcommitment (and yes, distraction: once again, I’m not joking about the title of this blog). Most colleges and universities don’t have anything even that tangible to offer as a connection to faculty (and many faculty would scorn it if it were asked of them). At many major public and private universities, I’d guess that coursework isn’t very high on the list of things that alumni remember fondly anyway. What else might you root for at your alma mater, or watch proudly from a distance?

It’s not just alumni. In many parts of the United States, the college football and basketball teams are the only local source of entertaining, high-level competitive athletics. They’re the civic glue that holds communities and regions together, gives them a sense of collective pride and identity. So many communities desperately need what college athletics provides to them: they couldn’t afford it otherwise on their own. While what goes on behind the closed doors of classrooms and offices within the university may be more distantly appreciated for the benefits it provides to the rising generation, much of the educational dividend flows elsewhere as graduates head out and away. The payroll, of course, is the other direct civic benefit, but even that goes to a smaller area than what a major athletics program with competitive teams can provide to a large region.

Against that, what is the harm of highly competitive college athletics to academic institutions? I think the conventional point that it harms the athletes themselves is underappreciated by the devotees of most college sports, in part because they (like the athletes themselves) tend to have tunnel vision for the vanishingly small number of college stars who succeed wildly in professional careers. Even here, you could wonder at the opportunity cost: a star baseball player can head straight into a minor league career if they’re competitive at the age of 18, without even needing to pretend to be a college student. What that star does at the age of 35 or 40, at retirement, is up to them. Why not the same for all major professional sports? Why futz around with college?

For those athletes who are not going to be professional successes, or whose professional status is marginal, college might in fact make sense. It only makes sense, however, if it prepares the borderline athlete for something other than athletics, gives them meaningful credentials and training. Right now, many competitive college athletics programs don’t do that for their students: they train their students instead in evading education and academic challenge. So what you get, from high school onward, is a much larger group of men, many of them African-American, who are exploited as cannon fodder, used up and thrown away, with nothing to show for it.

That may be the least of it, though. More deeply, I think that any institution which exempts some fraction of its clients or its members from the standards it otherwise upholds is eating through its own foundations. When cheating and mocking the rules become normalized, when the abuse of power is a secret in plain sight, when the values an institution claims to cherish are routinely rubbished in practice, you’re just counting down until the whole thing collapses.

These are old complaints, and they don’t seem to convince those who value college athletics in their current form, partly because those complaints try to shoehorn the legitimate civic value of those programs back into a conventional academic frame. I think the only way out is to spin off those programs, to take them out of universities altogether. What major universities should do, I think, is continue to invest money in their athletics programs as a form of contributory investment in their communities and regions, but decouple those programs entirely from the academic institution. If a student wants to play, that’s his or her choice. Let him or her earn an outright salary as an athlete, no different than a student holding down a job in any other context. If he or she doesn’t make the cut into the big leagues at a later date, then he or she can go back to college at the age of 22 or 24 or 28. If he or she can’t balance work and school, then the athlete should choose, just as any other student chooses. Impose no more academic restrictions or requirements on college-sponsored athletics teams. Make college-sponsored athletic teams into the official minor leagues of professional football and basketball, and invite the NBA and the NFL to join universities in investing in the teams and refining forms of standardized uptake from these minor leagues.

So you’d have “The Michigan Wolverines”, who would still compete at the university stadium (offered to the team as part of the university’s subsidy), retain ties of affection to UM, and be subsidized substantially by the university. They’d play other teams in their minor league division that were also formerly university teams, to retain traditional rivalries. The team could compete freely for athletes without any need to admit them to the university, paying free-market salaries to players. Just as now, some athletes could potentially opt to leap straight to the big leagues, and just as now, there would be significant reasons for the NBA and NFL to discourage that in all but a handful of cases.

The only thing that might seem lost in this scheme is that sense that students in residential programs benefit from an athletic experience, but the big-name programs don’t service that need anyway. Even here at Swarthmore, competitive rugby, lacrosse, soccer, field hockey, and ultimate frisbee were in my view far better at speaking to that need than football, and similarly so, I think, at most colleges and universities. None of that needs to end: every university with a residential program would still need to support competitive sports which have no major professional outcome as well as strong intramural programs. Supporting such programs, however, wouldn’t require coaches who demand $3 million annual salaries, and it wouldn’t entail the ongoing subversion of the curriculum: everything about them could be in scale with the everyday goals and structure of a university.

Posted in Academia | 8 Comments

Battlestar Galactica Season Ender

Avert your eyes if you haven’t seen it yet!

Reading over some of the fan sites, especially the official forum, it’s amazing to see the stark division of opinion on the episode, with about half loving it and half really, really hating it. There’s also a lot of discussion of the motivation for making such a radical move, with some pointing to the falling ratings of the second season episodes.

I grant that some of the second season was weaker than the first season. Even episodes that I thought were dealing with dramatically and situationally important issues (such as the growth of a black market) were developed in a weak fashion, or in a way that smacked a little too much of normal episodic television where characters have major dramatic crises that then seem to disappear in later episodes, or are developed inconsistently in various ways. That goes also for some of the ways that the second season tried to complicate the overall dramatic situation. Rather than a progressively greater reveal of the “Cylon plan”, we got some contradictory information that begins to suggest that the writers have no idea what the “Cylon plan” actually is, always a bad sign on a show that depends on suggesting deep designs underneath the apparent exterior of events. Or, to mention the black market again, we got shows that suggested one thing about the evolving situation of the fleet (scarcity growing, desperation growing) that was contradicted by the casualness with which the central characters consumed various luxuries. It would be perfectly consistent with the show’s mood to suggest that the military and political leadership are elites and behaving as such, but there wasn’t always a consistent tone as far as that went.

But I thought the season finale was spectacular, both for its daring and for showing again some pretty deep thinking about the dramatic setting and the dramatic situation of the characters. Some of the fans are screaming that Admiral Adama wouldn’t have allowed Baltar’s election, or wouldn’t have allowed the deterioration of the Battlestars in orbit around New Caprica. I think his standing back makes perfect sense, because that’s the lesson this character learned in earlier episodes: that there are limits to his ability to make events come out the way he wants, and that the human race has to collectively earn its own salvation or future.

Now as for how they’re going to get out of this situation, and where it’s going to go next, I have a couple of speculations. I’d love to see the writers follow up on the episode just before the 2-part finale which introduced a division of opinion within the Cylon culture and tie that to a greater reveal of the “Cylon plan”. My thought has always been that what the humans of the 12 Colonies might discover is that they’re not human at all, but simply an older generation of synthetics created by the “Gods of Kobol”, that the Gods of Kobol are the original synthetics created by the original human race, and that the original human race are the folks who now live on Earth. This would really complicate the Frankenstein narrative that the show is always poised on the edge of by unsettling anyone’s claim to “original” humanity. It would be more about everyone’s contentious claims to self-determination.

But even if this intuition or suggestion is way off-base, I can see one major way for the writers to resolve the situation on New Caprica within 3-5 episodes, which is what they’ve promised, without forcing the Adamas to come up with some improbable kind of military miracle. Namely, the outbreak of a Cylon civil war. What if the Cylons who’ve occupied New Caprica are the enemies of the “war heroes” faction that wants peace with humanity, and their occupation of New Caprica pushes the debate within Cylon society past the breaking point? Then we could see a really interesting scenario where a joint Cylon-Battlestar fleet saves the New Capricans but has to flee from a stronger majority force of Cylons who have won out at home. Then you could have all sorts of layers of danger and treachery in the new fleet–Cylons infiltrating Cylons infiltrating humans infiltrating Cylons. The need to know when to trust and when to betray, when to fight and when to negotiate. Both sides withholding vital information from one another. Baltar could actually be over on the Cylon fleet without becoming a sort of Calicos-style villain sitting up in a chair plotting humanity’s destruction. And so on.

Anyway, for my money, it was a really gutsy, fascinating, unexpected episode. I’m not quite clear on why the people who hate the episode with a passion like the show at all: I suspect what they’re looking for in the program is a grittier “Space: Above and Beyond”, which just seems to sell the program’s potential short.

Posted in Popular Culture | 4 Comments

The Parasite Within

I’ve been telling critics of the war in Iraq for three years that they have to take the neoconservative argument about American foreign policy seriously when it’s made by serious people like Paul Berman, Paul Wolfowitz and a smattering of others in and out of the current administration. Seriously in several respects.

It should be taken seriously as one of the most important causal roots of post-9/11 policy, meaning that yes, for the second time in the last forty years, the United States is involved in a major war conceived of by intellectuals to service an abstract conception of global history and political causality.

A good deal of what neoconservatives have had to say about existing international institutions has a lot of validity, and indeed what they have said echoes some critiques on the left. The United Nations has been and remains largely captive to corrupt statist elites and bureaucratic inefficacy, many treaties are sham performances rather than binding commitments, many international institutions exist to reproduce themselves and their own interests rather than serve as vehicles of transformative power and so on. If we ever get beyond the fiasco of the Iraqi conflict (something I increasingly doubt) we shouldn’t just return to multilateralist business as usual, but work out instead some different configurations of international institutions and the assumptions that undergird their activities.

Equally, what the neoconservatives had to say about the contradictions and inconsistencies in a lot of existing postures taken by their critics was legitimately potent. I went to a meeting here at Swarthmore a year before the Bush Administration took office where speakers condemned the suffering inflicted on innocents by continuing sanctions against Iraq; by December 2001 some of the same people were calling for the extension and tightening of the sanctions regime as a preferable alternative to war. A lot of the material emerging now on Iraq before and after the war has made it clear that the sanctions really did have a grievous impact on Iraqi civilians and relatively little impact on Hussein: if you’re upset by civilian deaths in the war, it’s pretty hard to see how you could not be upset by the civilian costs of sanctions. Not that this contradiction is new: people who supported sanctions against South Africa opposed them against Poland and vice-versa, often on the flimsiest of grounds. I still remember Ronald Reagan saying in a press conference that the reason why sanctions weren’t appropriate in South Africa was that the conflict there was “a tribal thing”.

More deeply, I still think that some of the neoconservatives scored a legitimate point about the patterning of Western responses, both on the left and among conservative realists, to illiberalism abroad. It isn’t just that each side excused its friends and excoriated its enemies according to friend-or-foe signals set in the Cold War. There was a powerful intellectual moment in African studies, whose influence is still very marked in the field, where critical depictions of European colonialism and apartheid essentially complained of the illiberal character of those regimes while at the same time exempting postcolonial nationalists from the same critique on the grounds that their achievement of sovereignty was the key thing to cherish and protect. Sovereignty and liberalism bear a kind of distant causal relationship to one another, but the former is no guarantee at all of the latter. If the problem with colonialism or apartheid (or Israeli occupation of the West Bank, or various US-client dictators like Somoza or Marcos) was the violation of rights-bearing humanity, then a transfer of power should have had few implications for a critique of such violations: the critique should simply continue with full force after such a transfer. If the problem with colonialism was simply a violation of sovereignty, then at least some of the conventional content of anticolonial and antiapartheid sentiment in African studies and political critique aimed at other areas of the world from the 1960s to the 1990s was misplaced. The more austerely intellectual forms of neoconservatism legitimately called attention to the mismatch between what many intellectuals in the West had to say about global injustice between the 1970s and 1990s and their reflexive idolatry at the shrine of sovereignty.

This being said, the number of truly disciplined, committed, intellectually authentic neoconservatives both inside and outside of the Bush Administration has always been in question. The danger with responding seriously and respectfully to the neoconservative critique as many “liberal hawks” did is that many of the people preaching the neocon line on Iraq, Afghanistan, the “war on terror” and much else were purely expedient and instrumental in doing so, exploiting the sincerity of such liberal hawks in order to advance a much darker kind of policy objective, an old-style paleoconservative form of uber-nationalist realism in an unusually brutalist, frankly stupid, and grossly triumphalist form.

I’ve gone around and around on this issue over the years in many different conversations, and yes, I think there’s no alternative but to admit that liberal hawks and folks like them, including myself, got played in some ways. Just as I think the authentic neocons inside and around the Bush Administration got played. Just as the deep strains of Wilsonian ambition in American culture got played.

What’s happening now, if you read the emergent structures of argument within the blogging world pretty widely, is that the realist parasite within neoconservatism has pretty much burst through the chest of its host and is grinning with sharp alien teeth at onlookers. Start tallying it up, and you’ll see a lot of wingnuts overtly discarding any pretence of being constrained by the ideals of “freedom” in their views of what the US should do in Iraq. Bit by bit, what is being advanced instead is the proposition that it’s time to stop playing by the rules, to give as good as we get, to abandon restraint. There’s always been a low throbbing drumbeat of that sentiment out on the right-wing fringes, rising often defensively as revelations from Abu Ghraib or Gitmo came forward, but now it’s becoming the overt and standard line among Bush loyalists. It has its popular doppelganger: increasingly one hears in vox populi coverage in the media the old Vietnam War trope that the politicians aren’t allowing the soldiers to win the war–a war which, if you actually swallowed the neoconservative line, was always about a political rather than military objective.

The shift can be heard even within the Administration, where there is more and more talk of democracy and less and less talk of freedom. How can there be talk of freedom, when even the most loyal US clients in Iraq, such as the Kurdish political elite, are being given a free pass to lock up dissenters and create a one-party state? Democracy, in this context, is an old realist code-word: it means the US is looking for a way to install a safely dominant figurehead like Mubarak or Musharraf, possibly with some sham pretence that the leader is elected or that there is a legislature that matters. We’ve done none of the work of building institutions that make either liberalism OR democracy meaningful. You could say that’s because the neoconservatives like Wolfowitz were naive, gullible, or plain stupid in their understanding of how democratic freedoms come into being, or you could say it’s because the realists like Rumsfeld and Cheney never had any intention to engage in institution-building. It ends up at the same place, with the US beating a retreat after building some kind of political fig leaf and from there incurring a costly legacy of subsidizing and propping up an unpopular regime against what is likely to be continuous pressure from many sides.

It’s one thing to be a realist in the context of an unmistakably realist conflict. Sure, there were neoconservative forerunners in the Reagan Administration who were probably just loopy and deluded enough (Jeane Kirkpatrick, say) to think that Jonas Savimbi or any number of other US proxies were liberal democratic revolutionaries. Mostly, though, proxy wars were fought under Reagan as they had been fought since Truman, with the single goal of pushing back Eastern Bloc intrusions and maintaining existing hegemonic spheres of influence.

Those policies were frequently stupidly unnecessary and destructive of long-term US interests, but with Iraq, the problem is far, far worse. First because the stakes are vastly higher than they were in Angola or El Salvador or Grenada or the Horn of Africa or even Vietnam. Second because about the worst possible combination of policy frameworks for advancing any coherent objectives is a genuinely idealistic neoconservative mask over a brutalist face. That combination leaves behind it broken and bitter local elites who actually trusted in the idealism and put their lives and futures at stake on its behalf; it serves as a license to brutalism everywhere; and it feeds the ideological credibility of radical Islamism. It achieves nothing except the waste of blood and treasure, putting a regime into place whose long-term prospects amount to a return to Hussein’s version of Iraq. Already new mass graves are busily being filled in the soil of Iraq: some with the victims of the anti-American insurgency, some by the secret militias of those that the US counts as allies or at least relies upon to prop up its occupation. Already the new Iraqi state is being divided up as a set of corrupt fiefdoms, having been tutored in the new era of corruption by various US-approved contractors.

The yearning for something better and freer in Iraq is real. The possibilities that its elections have revealed are authentic. It’s all being lost precisely because underneath what I take to be genuine passion for and authentic commitment to neoconservative ambitions, a realist beast has always lurked. Now that the intellectual shallowness and credulousness of the neoconservative understanding of historical causality has been inalterably revealed, there are really only two choices left for those who supported and still support the war. Either demand that the Bush Administration finally get serious about the promotion of freedom, which entails closing Gitmo, firing Rumsfeld, and a massive host of other policy shifts in that consistent direction, or stop pretending to idealism. And if you’re a genuine realist, what the hell are you doing supporting this war in the first place? Any realist who is serious about that worldview, whether liberal or conservative, knew this war was a dog from the outset and said so.

Posted in Politics | 26 Comments

Dungeons & Dragons Online

Boy, isn’t it always the way: you get to spring break with a long list of things to do and then you get sick. So while I’m trying to get over this sore throat, I’ve been looking into the current computer and video games scene. Excuse the geekout here for a bit, if you’re one of my ungeeky readers. I’d blog this at Terra Nova, but it seems a bit short of a TN-worthy entry.

The marketplace for massively-multiplayer online games continues to confuse me. I can’t help but note that more than a few of the developers who speak in public about their industry just can’t seem to get their heads around a few important baseline facts about the runaway success of World of Warcraft. For folks who don’t follow this stuff, the basic thing you need to understand is that World of Warcraft is more successful by a massive margin than anything before it. In a context where most observers thought that the market for such games had almost hit an upper bound barring some dramatically new type of product, World of Warcraft (WoW) came in and simply delivered the same kind of product at a much higher level of quality and expanded the market by five or six times.

Observers, gamers and developers are still debating what that means. The developers at rival companies tend to favor the proposition that WoW’s success is largely a function of the big money its developers spent in making it, an argument that is sometimes made with a bit of sour grapes. Some developers and many gamers tend to look at its actual design and try to figure out why it works as well as it does (I personally give a lot of credit to WoW’s consistent aesthetics).

The money argument is a sound one, except that those making it often don’t pay close attention to what WoW’s developer, Blizzard, spent money on. They spent it on three things: relatively smooth technical functionality, art design, and the creation of huge amounts of content, all in service to an extremely clear overall conceptual and business plan that included some serious thought about how to manage later changes to the game design. The lesson I draw from that is not that a successful game of this type has to have a particular kind of design, but that to be successful, you need to spend the time to minimize bugs and technical problems and to build a ton of content before you get your product out on the market.

Hurrying to launch, even if you’re bleeding money, is basically a way to lose whatever investment you’ve made so far. We’ve seen this again and again in this marketplace: products with commercial potential pushed out the door before they were ready, and thus never achieving anything close to their potential success.

Now comes Dungeons and Dragons Online (DDO), from a company that already made the mistake of rushing to market once, with their product Asheron’s Call 2. DDO is really a pretty different design paradigm, and I suspect people are going to draw the wrong conclusion from that difference if the game underperforms in the market (as I think it’s going to). The design idea of DDO is to get players to savor each adventure for itself, to concentrate on delivering a different content model that gets away from repetitively accumulative play of the kind that crops up in World of Warcraft, to capture (as much as possible) the experience of playing “pen-and-paper” games in the pre-computer age.

Not a bad idea. Except that from the outset, it’s hard to see why players would pay a monthly subscription fee for that. DDO doesn’t have a “virtual world” to speak of. The need for interacting with other players outside of the adventures themselves is minimal. There’s little economy, little sociality, nothing much persistent to the world. There are already computer games where you can play online for free with other people in very comparable ways: Diablo or Neverwinter Nights, just to mention two. For a monthly subscription fee, you need to deliver something more.

Especially if content is king in the design you’ve got to offer. Especially if the content you have was exhaustively experienced by many of your potential players during beta testing and you don’t do anything to scramble or randomize that content after the beta test ends. Even DDO players who just want to experience the content and don’t care about levelling are finding that most of the other players simply rush through dungeons and quests because they know where everything is and how every twist and turn of the plotlines goes. So even new players who might enjoy for a month or two the content which is available to them are finding themselves shackled to other players who’ve done it all before and can’t help but know how the story turns out. (In part because the game pretty much requires playing in groups.)

If the main satisfaction you deliver is the collective experience of narratives, you’d better try to maintain some sense of surprise or novelty. You’d better have a technical innovation in mind that will let you deliver content differently as well as more quickly if that’s the centerpiece of your game’s market appeal. Or at least you ought to hold back some surprises, traps, twists, in the content you’ve designed and put that in after the beta test ends, before the game goes live. Otherwise, why bother? What is the point of chasing a market niche without technical and design insights that allow you to inhabit that niche comfortably?

I wonder a bit if DDO’s developers aren’t asking themselves the same thing right now. They walked into a cul-de-sac and I’m not sure if their eyes were open or shut when they did it. Regardless, they’re facing a six-story brick wall right now and unless they suddenly develop the ability to defy gravity I think that’s about as far as they’re going to go.

Posted in Games and Gaming | 5 Comments

Lines, Grids, Legibility: A Follow-Up on Summers

Richard Posner and Gary Becker have been arguing in the wake of the Summers resignation that what universities need is far stronger structures of centralized leadership and control, with a relative removal of faculty from governance.

I find myself tempted by this kind of rhetoric. There have been times where I would have preferred centralized control or at least a greater weighting on centralized authority in various decisions at Swarthmore. That has something to do with a perception on my part that I would have agreed with what I took to be the preferences of various leadership figures. That, of course, is the first simple problem with centralized leadership in any institution. It’s fine as long as you’re happy with the leaders, not so fine when you’re not. If a more decentralized model tends towards collective outcomes that don’t suit you, at least you can usually opt out or evade those outcomes in your own autonomous domains. Not so with strong centralization.

There are plenty of examples of how this can be more destructive than the amoeba-like movement of more decentralized and faculty-directed institutions. Case Western is going through something like this now: a leader with centralizing impulses ignored all sorts of concrete danger signals about the financial implications of his decisions, and now problems that could probably have been managed relatively easily have grown to serious proportions. It’s an especially good example in that I find the concept behind the SAGES courses that Edward Hundert advocated to be fairly attractive. It’s seductive to think that you could just cut to the chase and implement such a curricular design by fiat, but it doesn’t work that way.

There are other reasons to worry about the insistent claim that greater centralization produces greater efficiencies and more rapid decision cycles. One is the heedlessness with which the corporate analogy is wielded to justify this move. It’s not that I blanch in terror at the idea of the “corporatization” of the university; in fact, I quite like some of what said corporatization entails (responding to student demands more effectively, a greater concern for applied or practical outcomes in research, and so on). It’s that I don’t think there’s much reason to think that your average corporation provides a good organizational example of effective centralized decision-making, efficient governance, or quick-fire but judicious leadership. All modern institutions have some issues in this respect, however they’re structured. Frustrating as I find academia at times, I don’t think being transformed into Dilbert with a pointy-haired boss would improve my lot much.

Perhaps because I’ve been working through a lot of what James Scott calls “high modernist” polemics about planning and organizational design this semester with my History of the Future course, I can’t help but feel my flesh crawl when folks start casually talking about the need to centralize and plan, to make the disorderly precincts of faculty departments legible to administrative power, to lay out higher education on the grid. It’s not just that this is often the prelude to the kind of authoritarianism that Scott critiques very well, it’s also that such schemes usually backfire in spectacular ways, often destroying the institutions that they came to rescue.

There are more concrete nitty-gritty problems to consider as well. Posner advocates getting faculty out of administration and governance altogether, making them employees and nothing else. The thing of it is, most universities and colleges would grind to a halt tomorrow if faculty weren’t doing administrative and governance work of various kinds. It’s what we do as employees. Sure, not everyone does such work well or consistently: some faculty shirk it, some screw it up. But there’s a lot that needs to be done day in and day out–structuring the curriculum, tracking the students, checking their programs, administering grants, supervising non-faculty employees, and so on. One of the major sources of growth in the salary budgets of universities in the last fifty years is on the administrative side. If you pull faculty out of such work altogether, you’d have to expand administrative budgets three-fold just to handle the increase in work. Some governance you can’t disaggregate from the labor of faculty in any straightforward way.

I do agree with the diagnosis that Posner lays out, to a significant degree: faculties are hard to lead and won’t pick up the slack of leadership themselves. To some extent, they’re poorly trained and personally disinclined to think about institutional interests first and foremost, and tend, as Posner says, to be “smug and superannuated” to increasing degrees in relation to the wealth and prestige of their institutions.

My answer is not greater centralization, nor is it resignation. This is exactly the reason I consistently push for the erosion of disciplines and departments. It’s not just that I find the behaviors they produce frustrating and occasionally even anti-intellectual. It’s that what I want to see in faculties, in some form (there are many ways to get to this goal) is greater interoperability. That’s not a word that flows off the tongue easily, but in this case, it’s just the thing I have in mind. The more that faculty are transparent to each other, dependent upon one another, the more that their expertise is mobile to sites and areas of changing interest, the faster their institutions can respond to new challenges, both intellectual and fiscal. This isn’t so much changing the way faculty formally participate in governance as it is a re-engineering of their institutional cultures of practice, a structured lowering of the transaction costs that presently make universities so sluggish in the face of change, that produce so many nooks and crannies for feudal turf wars.

There are simple ways to accomplish this. One thing I’ve suggested here recently is that faculty should teach courses in departments other than their own. Not as a cross-listing, but within the core curricular offerings of another department. Right now my entire FTE is “owned” by the History Department; in another system, maybe 1/15th of my teaching in a three-year cycle would be “owned” by another department, and so too for all my colleagues (though only after tenure: being evaluated for tenure by two departments is a recipe for nastiness). There are far more complex ways to engineer this shift as well. Even the simple changes are likely to be difficult to accomplish. Pushing for them doesn’t have the red-meat tooth-and-claw satisfactions of calling for the barricades to be mowed down and Haussmann-style avenues punched through the disorder, but I feel a lot better about the outcome of interoperable changes than I do about laying out a legible grid for a Sun King to preside over.

Posted in Academia | 6 Comments

Same As It Ever Was

The Kenyan government has launched what amounts to a brazen assault on press freedom, sending hooded police into the offices of a major newspaper and burning an entire print run. The minister of Internal Security was ready with a depressingly typical bit of postcolonial African repressive threat-in-plain-sight bravado: “If you rattle a snake, you must be prepared to be bitten by it.” Translation: stop covering corruption in the government, or else.

For all you can’t-happen-here types, you might want to listen carefully to the way that the Kenyan government has justified its actions. First, in the name of national security, claiming that the paper was plotting to stir up ethnic unrest in its coverage of government corruption. Guess what? They can’t say more because to go into the details would compromise their ability to gather information. That sounds familiar. No oversight, no transparency, no accountability, and a pervasive logic of “national security” are structurally bad things: you can’t shake your finger scoldingly at African governments but nod approvingly at the same sort of policy and rhetoric cocktail when it’s “good guys”.

Second, there’s a clear reference to the now-orthodox historical narrative about genocide in Rwanda that names radio broadcasts as an important cause of the killings. This is a good example of why details and principles need to work together. It’s important to maintain the principle that even dangerous speech (whether it’s ethnic mobilization in postcolonial Africa or David Irving denying the Holocaust) has to be protected–but also to hold people who want to argue otherwise responsible to the details. The radio broadcasts in Rwanda came directly and demonstrably from a cabal of actors within pre-genocide government, and were highly coordinated. The Kenyan government is going after speech in civil society and charging in the vaguest possible terms that this speech has malicious intent. Even if you think speech in some contexts ought to be restricted (and I don’t), the standard of proof has to be extraordinarily high. If you’re going to talk about radio in Rwanda, and suggest that perhaps that shows that in some circumstances, speech is too dangerous to be permitted (and yes, quite a few academics and policy experts have done just that) you ought to know you’re offering a loaded rhetorical weapon to a broad range of scoundrels.

Posted in Africa | 7 Comments

Full of Dis-passionate Intensity?

During Monday night’s program on Open Source, which I really enjoyed participating in, there was a brief moment where the studio I was in dropped its connection, just as I was about to say something mildly critical about Swarthmore. (Everything else that got said made liberal-arts colleges come out looking like the positive alternative to big research universities, which is an impression I’m quite happy to give.)

What I was thinking about is the extent to which, even in a teaching-centered institution that promotes a pretty healthy degree of connection among the faculty, we mostly teach courses that narrowly service departmental curricula deriving from a state-of-the-art sense of what a given discipline entails. Broader, connective, integrative courses, along with material that doesn’t belong to a conventional discipline, often fall out of view.

This has been on my mind a lot this academic year for various reasons. Early this year, a colleague and I spoke to our Alumni Council about this problem: the faculty don’t ask ourselves often enough what it is that 18-22 year olds who are not going to be academics themselves really need to learn or would benefit from knowing, preferring instead to ask, “What’s the proper sequence of courses for this discipline?” or “What’s in the scholarly literature on this topic?”, as if the benefits of the discipline or the literature were self-evident. We review interdisciplinary minors here every five years, and sometimes do external reviews of departments, but we don’t really expect departments and disciplines to provide an ongoing, renewable, and contestable sense of their relevance to the students, the college, and the curriculum.

These are old complaints for me in the context of this blog, I know. I think the newer context to which they are becoming relevant is an increasing sense among the student body and recent alumni that Swarthmore has a degree of intensity that is unwholesome or counterproductive.

It’s hard to know what to make of that sentiment when I encounter it. Over time, I’ve felt more and more remote from student experience, for some of the same reasons that Rebekah Nathan discusses in My Freshman Year, her account of a year she spent living as a student at her own institution. It’s a natural thing. Professors have a very bad tendency, as they get older, to subscribe to the local declension narrative and talk about how things aren’t as good as they were in the old days, and so on. Sometimes I really don’t want to know more about the students, either: aspects of their institutional experience belong to them, and them alone.

There is some evidence burbling up here and there, in ways I can’t ignore, that what was mostly an amusing schtick about the college (e.g., the catchphrase “Anywhere else it would have been an ‘A’”) is slowly transforming into something less light or neutrally quirky, that it has costs we might want to lessen or redirect.

Keep in mind that students also tend to think they know more about what’s going on than they do, or to see things in ways that appear to anyone else to be pretty self-absorbed and maybe even melodramatic. Not just here: everywhere. The smaller the community, perhaps, the easier it is for waves of collective melodrama to pass through everyone. Students can also misperceive or exaggerate trends, or assume themselves to be typical or representative in their experience when they’re not.

All these small colleges have their own personalities, and much of what is publicly understood about those personalities is something of an illusion. Still, the “brand image” is also a bit of an attractor. Swarthmore is said to be intense, serious and intellectual, and so it draws some 18-year-olds who are, or imagine themselves to be, intense, serious or intellectual. Particularly for those who think that they are that way but find that they really are not, the feedback loop produced by the match of institutional image and self-image can be a bit daunting, depressing, alienating. Of course, even for those who are authentically intense, part of that authentic personality tends to be a certain kind of dour, serious, self-important take on things.

To some extent, my response to that is a classic old-fogeyish, “Welcome to life!” Our expectations about experiences rarely match the experience itself. Coming from California to go to college in Connecticut, I thought the entire East Coast was all bricks-and-ivy and people in tweed coats smoking pipes, that it was all very sophisticated and European and intellectual. And I thought that’s what I wanted when I was still a surly adolescent. I was wrong about what the East Coast was, and in pretty short order I also found I was wrong about what I wanted. It would be silly to hold a place responsible for not being what I foolishly thought it would be, or be surprised that what I thought I wanted at 18 is not what I wanted at 25 or 35 or 40.

The things that make the biggest difference in the life of an undergraduate are often the things that the best possible planning cannot account for or capture. You can’t know who will be on your hall or who will be your roommate. I haven’t talked to my roommates in two decades, but I married a woman who lived on my freshman hall, and I’m still married to her. One thing with no effect on me, the other with a life-altering impact, both unpredictable. The professors and classes that mattered: I didn’t know who or what they would be. The things I thought I was interested in that I turned out not to be interested in. The things that interested me then and don’t interest me now.

You can overplan this part of your life when you’re looking at schools, and thus misattribute both later satisfaction and later bitterness to the choice you ended up making. The things that matter about the choice are the size and organization of a college or university, the types of programs it supports, the structure of its curriculum, that kind of thing. There are major rough choices to make along those dimensions, but past a certain point, flip a coin and stay loose about how things unfold. Here I sound a lot like my colleague Barry Schwartz, who offers some pretty valuable practical insight into choices and how we ought to think about making them.

Still, I do worry about the concern over intensity and the sense of dissatisfaction that I hear more and more of here among students and recent alums, because I think it does reflect a tendency of the faculty here, and perhaps faculty almost everywhere, to go about the business of liberal education with a kind of grimness, without explaining in any sustained and potentially debatable or contestable manner why we think particular courses, disciplines, and so on are important. I tell my students that the first (but not only) question of a liberal education always should be “so what?”, and I expect them to rise to that challenge in their papers and in their discussions. I’d like to tell my colleagues the same, only I think I’d get a pretty sizeable number of blank stares or some irritable circle-the-wagons scoldings in reply. Or I’d get answers to “so what?” that are primarily intended for the consumption of other academics rather than students or wider publics.

If some students here (and perhaps at other institutions of our type) feel beaten down or frustrated by intensity, maybe it’s because it seems to have no evident purpose save itself, because it feels like ritual self-injury, because the real answer to “so what?” is simply and dully, “because”. We don’t take the time for better answers and assume they will trickle down magically somehow. Or we don’t have those better answers, and so we dodge the question.

As an undergraduate at a similar kind of place in the 1980s, I took a ton of classes. I think I actually had, if I’d claimed my APs (I never bothered), the second-largest number of credits ever for a four-year undergraduate at my institution. That was intense sometimes, but I enjoyed the intensity, because it was just about the pleasure of knowing. If a class wasn’t working out or wasn’t interesting to me, I dropped it. I started one class with a nice man who had a very serious drinking problem, and I quit the course just because I could see it wasn’t going to return much to me. It didn’t matter: I wasn’t doing it for a major or a pre-professional track or anything else. I took classes in my two majors just to see what they were like, on topics that I had no prior or fixed reason to care about. I didn’t care that much about grades: I took an upper-level biology class on animal behavior and did pretty poorly in it grade-wise, but got a lot out of it. I took two years of Spanish and one year of Latin because I thought it was important to try to learn languages, even though I was terrible at languages then and still am today. I was having fun, is the key thing. Yes, an egghead’s kind of fun, but fun nevertheless.

I think it’s still a bit of a secret to the students here, past and present, that some of the students who thrive most or get the most out of this place (and others like it) are those who try not to care too much about it all. I’ve had a couple of “B” students over the years who I think are better, more capable all-around intellects and people than some of my most conventionally strong “A” students. The “A” students tend to be more like me or other faculty: they know how to navigate the game as we define it, but often not how or when to defy, ignore or circumvent it. There are exceptions: there is also a kind of occasional “A” student here who is almost terrifying in their sense of self-possession–I can think of a couple of alums in academia, two journalists, and a few other alums I’ve known who could be described that way. Good for them, but that’s rare. The thing anyone can do is make sure you’re playing and not being played, whatever your grades, and I think some of the students who end up with a partially negative sense of their time here are the ones who felt they had no choice but to be played. You always have choices.

Still, I also think it’s partly our fault as a faculty in this place, partly the fault of the collective culture of academia, and partly the fault of various pressures and expectations put upon our students by themselves and by the people who matter to them in their personal lives. I do think we can do a better job of explaining what we do and why we do it. I suspect that if we did a better job, we’d do a lot of what we do differently. Intensity when you’re full of passion and commitment, lost in something abiding and authentic, is very different from a kind of multitasking, routinized, just-because intensity, the intensity of having three humdrum essays and a mid-term exam due on the same day. You can’t expect to deliver the former intensity consistently as a service to your students (by its nature it is elusive), but it would not be unreasonable to strive for it insistently.

Posted in Academia | 5 Comments