Falling Walls, Burning Buildings, Gutting Budgets

I’m in a very James-Scottian mood, torn between excitement and despair, watching the events in Egypt.

I fully appreciate the importance of thinking locally about these events and resisting easy attempts to swallow them up in glib narratives.

However, I also feel as if I’m glimpsing another manifestation of something as big as 1848 was across Western and Central Europe. Which, of course, was not something limited to that year alone, but was instead a political, economic and social moment of crisis and transition that stretched from the French Revolution to the late 19th Century.

This is not a novel thought, but I really feel that what is happening in Egypt is part of a comprehensive crisis of the modern nation-state as an institutional form, a crisis whose strongest initial eruption was the fall of the Berlin Wall in 1989 and which has hit a series of regimes around the world at a steady rate ever since.

This is not a particular crisis of authoritarian, socialist or even pronouncedly statist regimes, however: I think it has had manifestations in various liberal democracies as well.

The modern nation-state can no longer provide services it once seemingly provided with ease. It no longer protects its citizens from the vicissitudes of a globalized economy or restrains the excesses of financial capitalism, and in most cases doesn’t even pretend to do so. The state, wherever it is, has no trusted vision of progress, no dream of modernism, no hope of helping its citizens secure a future of satisfaction and comfort. (Indeed, in the US, we now hear ghastly visions of ‘winning the future’, consigning citizens to a perpetual vision of ever more stringently extorted productivity, never-ending competitive one-upmanship.)

Everywhere the words of bureaucrats, ministers and presidents are sick, cynical, passionless and self-interested jokes designed largely to secure the authority of political classes through the tired rehearsal of well-worn gestures, and everywhere populations know those performances as perverse and unamusing pantomimes. Everywhere the nation-state tends towards bloat, corruption, inflexibility, paralysis.

Yes, I’m generalizing wildly, and at another moment, I would doubtless write a dismissive critique of my own words. Some states are trusted more, some states retain some sense of vision, some still deliver many of the services or fulfill the roles that they historically have filled. In some cases, the population is so intertwined with the state that popular disgust towards the state is nearly equivalent to self-loathing; corruption flows through the whole of societies, not just within the government, very much including the United States. (Note that as I write, the government of Kuwait has apparently just announced a new large monthly payment to all of its citizens for the next 14 months.) A few states are working hard to become more transparent, less corrupt, more flexible. But not many in any meaningful sort of way, and where they are, two steps forward often turns to three steps back.

Yes, rejection of the nation-state is taking on radically different local forms, evolving under highly particular circumstances. Not all revolts or reactions have been the same or even similar, even when the common structural problem is the senescence of the nation-state as we have known it. The Tea Party is one such rejection of the nation-state, but so very different than what is happening in Egypt, and not merely because of the obvious divergences of place, cultural practice, social forms and governmental regimes.

It is the radical divergence between places around the world that is so important: nowhere does anyone have a comprehensive, potentially sharable, plausibly global vision of the political order, of what we should be governed by (or how we ourselves ought to govern), of how to secure the parts of modernity that so many people in so many parts of the world have found desirable. Here people try to erect a clean technocracy, there they try a reformed liberal state, here they seek a competent or managerial authoritarian, there they try a furious sectarian nationalism, here a religious purification. Some just reject a single regime head, others the state, and still others want to set the entirety of the social fabric on fire.

But amid all this, the most ridiculous and obscure vision of all now appears to be an earlier claim that we had arrived at the end of history. (I know, many of you thought it silly when it was first said.) Progress and teleology have broken decisively on the anvil of the state’s enfeeblement. Small wonder that Hardt and Negri in Empire resort to unsatisfying, vaporous invocations of the Multitude while insisting that they can say nothing of what will come of their eventual revolt. Because indeed nothing can be said.

In the past decade, both global and local political classes have offered nothing but enfeebled, incremental, technocratic and self-absorbed fumblings in response to a succession of shared economic and social crises, hemmed in on all sides by both self-inflicted and exterior constraints. Not even evident self-interest can push some national elites towards reform: now in Egypt, yesterday in Zimbabwe, tomorrow who knows where? Rulers, ministers and bureaucrats continue to commit elaborate forms of social suicide, driving not only their people but their own fortunes towards the abyss, sometimes in the most transparently avoidable ways.

I’d welcome the uprisings and rejections save for the dreadful likelihood that in most cases nothing better lies behind them. No one knows the way out of this cul-de-sac; nobody has a better idea. In many cases, those most disaffected by or angry about the deterioration of the nation-state’s capacity and vision have still more horrible or destructive ambitions in mind, and the best thing we could hope for would be a bewildered, enfeebled liberal democracy weakly steered by weary technocrats lacking in all conviction.

Posted in Politics | 3 Comments

Skills, Competencies and Literacies, Oh My

One of the themes that’s come up in strategic planning here is the question of the teaching of skills, competencies, literacies (and related concepts) in our curriculum. I’ve initially been a bit taken aback at the strong discomfort some of my colleagues have expressed about an explicit attention to these themes.

However, I understand the larger reasoning that leads to their unease. The language of “skill” or “competency” lends itself all too easily to highly instrumental, pre-professional or managerial approaches to higher education, and the trend towards those approaches in many universities is accelerating. The humanities suffer especially acutely in that kind of move, but any kind of inquiry or teaching that tries to retain the spirit of the liberal arts struggles to hold position.

The reason to study history, literature, philosophy, theology, art history, dance or any other humanistic field is not to develop instrumental skills for their own sake. No less the study of chemistry, engineering, economics or linguistics. On the other hand, I do not think those subjects justify themselves entirely on their own: they are for something more. You study history to understand why the world is the way it is, to consider the varieties and possibilities of human life, to contemplate and not easily resolve the meaning of events and patterns, to try to inhabit past structures and modes of feeling. What comes of that? At a minimum, I’d hope for some kind of wisdom. And I’d equally hope that wisdom would lead to wariness about quick or glib attempts to apply that study to acting in the world. (A sentiment I shared some years ago with a graduating class at Swarthmore.)

However, I really can’t see why that deep purpose is in any sense incompatible with becoming a progressively more skilled, competent person in the practical tasks that confront us in our professional and personal lives. All of the abstract, somewhat pretty description I’ve offered so far is typically glossed as “critical thinking”. Critical thinking should make you a comprehensively more capable person: a better writer, a better speaker, more able to read, more able to interpret the thoughts and actions of others and to anticipate social trends and structures, more capable of designing research and collecting data, and so on.

If you assess or grade a student in a liberal arts course, you are forced to judge whether that student is doing the work of interpretation or critical thinking well or poorly, and whether they are improving in how they do that work. Skills or competencies are somewhere in there. I completely buy that they shouldn’t be viewed as detachable from the deeper, richer purposes of critical thinking, but I’m equally concerned about the opposite descriptive error. There’s no reason to describe the work of interpretation as having no practical use, as being necessarily antagonistic to instrumental purposes.

The study of literature can (indeed, ought to) inform the work of producing literature or other culture, most immediately. But it can have practical, concrete, generative effects on any human work, whether it’s being a mid-level bureaucrat at the World Bank, being a plumber, or working as a surgeon. And some of those improvements or cultivations can be connected to more granular skills: reading, writing, speaking, presenting, emotional understanding.

What I worry about when I hear this pushback against talk of skills, literacies and competencies is that it is easy for that to slide into a belief that liberal arts inquiry is distinguished by its comprehensive resistance to or rejection of the language of practicality, applicability or usage. Or that such talk, if not rejected, should be swallowed up in a fog of opacity and evasion, dissolved by irreducible complexities.

It is one thing for a hermit perched atop a mountain to insist that his knowledge is necessarily esoteric, that it can only be explored through shared ascetic experiences and never communicated as transmissible product. It’s another thing for a civic institution that imposes considerable collective costs on its supporting society to insist on the same thing: that education is so sublime and ineffable that it should only be lived, never narrated or applied. I think there’s a coherent position about inquiry or contemplation along those lines, but not when it’s coupled to the modern university or to the claims of contemporary academics to shared forms of expertise and authority that arise from a common base of technical and disciplinary training.

If a parent asks me, “What will my child get from studying with you and your colleagues for a price tag that will buy me a house in some real estate markets?” and my answer is solely, “They will understand the mysteries of the world a bit more deeply” or “They will be a better person”, those are legitimately repellent or unworthy answers for a great many people. (And we shouldn’t be particularly pleased with the parents who will be satisfied with the idea that we’re making a future elite a bit more cultivated and dignified.) I can’t understand why we would ever insist on those as solitary or exclusive answers. I would say instead, “They will be better at almost any job they choose to do and any life they choose to lead, in ways that I can describe quite concretely, and part of being better is that they will understand the mysteries of the world more deeply and will have begun to explore the art of being human within those mysteries more fully, in a more sustained way, than they would have if they had not gone to college”.

Posted in Academia, Swarthmore | 2 Comments

The Cold Call

We’ve known for a while that my daughter isn’t liking this year at school as much. Yesterday she revealed one of the things that’s bothering her, which is that in her perception the teacher frequently “cold calls” on her during math work, and that she’s felt humiliated by getting the answer wrong or not being able to calculate the answer quickly. This puts me in a tough spot, partly because I really don’t want to be one of those parents who questions the professionalism of a teacher at the drop of a hat (academics being especially prone to this). Not only is there an issue of respect involved, but there can also be unintended consequences from speaking to a teacher about an issue like this: a child who is very pointedly not called upon after a parental intervention might feel just as singled out or humiliated as otherwise.

What I did want to think about was the pedagogical justification for the “cold call”, for picking a student at random and asking them to produce an answer or comment about the material that the class is working on. It’s really not something I do in my own classroom, but I know colleagues who do it at times or do it a good deal. I can think of two major arguments for this technique. The first is competency-based: that there will be contexts outside the classroom in which students will have to produce relevant answers to direct queries quickly and accurately, so it’s best to give them practice at doing so inside the classroom. The second is the instrumental use of humiliation as a motivational tool: that a student who is not doing the work needs to be publicly shamed in some fashion.

The first argument makes some degree of sense to me, but on the other hand, I’m not clear that cold-calling is the best way to practice or learn how to give rapid, accurate answers to public queries. When I’m teaching a skill that is called for in professional or civic contexts, but I don’t entirely agree with the manner in which some of those contexts call for it, I try to make that clear, so that the skill itself is open to active questioning or consideration by students. More importantly, whether I’m happy with the instrumental framing of a particular competency or not, I try to make sure that we get to it in stages, that I consciously deconstruct everything that’s involved in a single demonstration or enactment of that skill. I’d try to do some of that explanation whether I was working with elementary school children or college students. This is part of how you persuade students that they need to do something that they may find unpleasant or unfamiliar. A persuaded student sticks with the task a lot longer than a commanded student.

The second argument? I concede that in some cases it might work, in some narrow sense of that word. Being humiliated unquestionably motivates a lot of human action and feeling. You don’t forget that experience in a hurry. Neither, however, do you easily forgive it, even when the person doing it to you seemed to genuinely have your best interest in mind. People who’ve been publicly shamed or humiliated do not reliably change their behavior so as not to be ashamed: it’s just as common for them to form a more abiding hatred of their tormentor and his or her purposes, or to adopt more tenacious habits of avoidance. If purposefully making a student feel ashamed is ever a legitimate pedagogical tool, I think it’s the equivalent of breaking the glass on a fire alarm: done only when the only alternative is to let a building burn down.

The really tricky thing is the intersection between the first objective and the second. I can’t anticipate every possible emotional reaction to a teaching technique or strategy that I commonly use and see as effective or necessary. I’ve been surprised in the past by hearing from a student that some approach I thought was fairly ordinary made them feel excited or passionate or alienated or unhappy. I can’t try to accommodate every possible temperament or background or I will end up teaching to none of them. I do think I can make a good guess when a strategy has a higher chance of disturbing or confusing some relatively common subset of my students, and that’s when I have to make a pretty smart call about whether the upside justifies the risk.

In my own professional vision, I’d put cold calling very high up on the list of “probably not worth it”: skills of rapid, accurate response to public queries can be built up (and put to the test) in other ways. The risks–particularly in this case to elementary-age girls and their engagement with mathematics–seem very high. Most people don’t develop a motivation to do something independently, with passion, that they associate with emotional pain and embarrassment.

Posted in Academia, Domestic Life | 19 Comments

Fun Home

As long as I’m talking about parenting, two anecdotes:

1) We teach on the MLK holiday, but I had time to take my no-school-that-day daughter out to lunch. We ended up talking about a bunch of things, but at one point she decided that she needed to know about the entire continuity of the character of Green Lantern. I attempted to escape the subject at several points, but she was fairly implacable. Somewhere around “and then it was revealed that Parallax was actually a yellow fear demon and Hal Jordan came back to life”, I was getting some pretty serious wtf glances from other diners.

2) My daughter’s birthday parties each year revolve around a theme that she selects, and I usually end up playing a character relevant to the theme. This year is Greek mythology, so I’ve decided to be Fredalus, Daedalus’ obscure and not very successful cousin.

The interesting thing to me about the interest in mythology is that it wasn’t really sparked by the Percy Jackson books: those followed on rather than preceded her engagement with the myths. She found them the way that I did, by reading the D’Aulaires’ Book of Greek Myths, well before the film of the first Percy Jackson book came out. What’s especially interesting is that there’s a lot of general interest in Greek mythology among the children we know and her cousins as well, and at least some of those kids took the same path to getting interested in the subject.

This is another one of those cases of cyclical cultural themes where a complex-systems approach is very helpful as an interpretative tool, partly because it helps defer some of the tendency to overread the intentional or instrumental meaning of this kind of recurrence (of the sort that I think WJT Mitchell slides into at times in The Last Dinosaur Book).

There’s certainly an “interested” historical explanation for why classical Greek literature and culture have remained in circulation within a Western society which has self-consciously defined the West as derived from Greco-Roman precedents, but the lightly sanitized (although the D’Aulaires version delightfully keeps some of the edgier elements in view) simplified apparatus of Greek mythology has some of the same root-level attractions as dinosaurs, Pokemon, Harry Potter, superhero comics or Star Wars. The individual stories tie into a comprehensively imagined and interconnected world and effectively teach children how to master two important kinds of systematic knowledge: taxonomy and intertextuality. That mastery then yields social rewards in relationship to other children who’ve also taken an interest in that cultural system. The D’Aulaires’ approach encourages a reader to learn the names of gods, demigods, heroes and kings, and to understand how different types or taxa interrelate within those stories. They also encourage young readers to not only relate the stories as they tell them but to relate their telling of stories to other tellings of the same stories by different authors and from different times.

I think these kinds of mythoi have some of the same “procedural” learning involved that you see in certain digital games and media: they not only teach content but teach a mode of learning content at the same time, in autodidactic combination.

All of which is a TL;DR way of saying that I’m happy to be Fredalus supervising a “cut the head off the Medusa” version of pin-the-tail-on-the-donkey.

Posted in Domestic Life, Popular Culture | 3 Comments

Smarter Than the Average Bear?

Apparently I’m not alone in not liking Amy Chua, the self-described “Tiger Mom”. If I were a superpowered person from the comic books, she would pretty much be my opposite number. I’m the “Bear Dad”: I hibernate a lot, amble along eating berries and grabbing salmon, doing the omnivorous whatever’s-cool thing and encouraging my cubs to do the same, or not, as it strikes them. Maybe every once in a while I have one of those awesome standing-on-my-hind-legs-to-confront-a-cougar moments like in an old Disney nature film, but then it’s back to eating some berries and putting on fat for the winter. I’m personally inclined to be like “free-range” parents such as Lenore Skenazy.

(Image: http://www.flickr.com/photos/trevin/58910501/sizes/m/in/photostream/)

I don’t have much to add to some of the common criticisms of Chua’s arguments about parenting, like her vaguely creepy racial mapping of parenting as cultural destiny.

Still, let me see if I can add one distinctive note to the public debate of the last week. If I have any professional weight to throw around here, it’s as someone who teaches in a highly selective institution of higher education that might be one of the destinations that “tiger moms” would aim their children towards.

I’d say yes, I do see some tiger children from time to time here, particularly among science majors. Some of Chua’s critics claim that children subjected to her kind of parenting habitually self-destruct the moment they’re out of reach of their controller. Anecdotally, I don’t really see that pattern. Sure, I can think of cases that seem to fit, but I can equally think of cases of young adults who were driven very hard by uncompromising parents who pretty much accept and embrace that vision when it comes time for them to fly out of the nest, and who pass it on to their children in turn. I’ve also seen more than a few “bear children” go from being gentle, omnivorous wanderers to being totally lost souls whose downward mobility is as precipitous as a waterfall.

Where Chua is just frankly wrong is in the proposition that bear children don’t “win prizes”, that tiger children are set for life, that they win and dominate. She’s wrong empirically: I frequently meet people who are at the top of their respective professions or situations who were raised in every way the opposite of Chua’s children. And she’s wrong morally: life is not an instrumental prize that you secure permanently and unambiguously at some magical point in your adulthood, nor should it be. Look what happened to some people who were, by the consensus of their peers, “winners” as they headed into their 20s. Look at how some of Wall Street’s winners shat the collective bed recently.

If I have seen a pattern, among my students and my parental peers alike, it’s that parents who try to be someone that they’re not, pursuing a parenting style that doesn’t come from their own life experience, are the ones who will create the most psychic havoc for their children and for themselves. That’s the really pernicious thing about figures like Chua, or indeed most folks who try to sell a complete parenting philosophy to an anxious middle-class public, whatever the recipe they’re peddling. Parents who are trying too hard to do what bourgeois consensus views as the right thing, who are too sensitive to the glances and petty remarks around the edges of a PTO meeting, who peer surreptitiously around the living room of neighbors to spy out their domestic rituals (half to ensure the conformity of the neighbors, half to assure the spy of his or her own conformity): those are the people whose kids are much more likely to massively disavow what they’ve been pushed or required to do, or to angrily lament the lack of earlier pushing or prodding by their permissive parents. More to the point, those are the kind of parents who inhabit the work of novelists writing about the domestic discontents of bourgeois families, who try too hard to perform an inauthentic self for too long and then one day skate out onto thin ice and fall right through.

I would never tell a tiger mom to be a bear. Of course, that’s a very bear dad thing to say, the essence of the whole-of-the-parenting-law-is-do-as-thou-wilt. I’m not saying that I don’t have opinions and advice about the parenting (and children) of others, but for me that’s a very intimate, complicated feeling. I wouldn’t presume to tell a stranger how to do it right, other than to say that it’s a mistake to do something because other people say to do it.

Unfortunately, middle-class life is perhaps of necessity a nervous condition built on a desperate quest for social distinction: always aspiring, never achieved. Inasmuch as Americans continue to maintain their collective belief that everyone in the United States is middle-class, that American identity is always aspiration, maybe they can never get to a point of accepting that an individual philosophy of parenting should grow naturally from the cultural soil of every individual life and come to rest at that point.

Chua and anyone else trying to sell parenting to anxiety-ridden people might learn a lesson from Ann Hulbert’s intellectual and social history of parenting advice. The main lesson would be humility: this has happened before and it (unfortunately) is likely to happen again. When Chua says that “permissive parenting” is a new thing, she’s just flat-out wrong. But then every advicemonger says the same thing each time: they have come to restore some past wisdom in the face of some present trend towards degeneration.

I suppose that’s why I don’t just tend to my own flock and let things happen as they ought to: the history of advice to parents is a long series of families knocked out of joint, separated from their common sense and practical wisdom, and worse yet, sometimes pulling the culture as a whole along for the ride.

Posted in Domestic Life | 3 Comments

Incentives for Faculty

I’ve had a bunch of conversations in the last five years that have turned on the question of what makes new practices or approaches attractive to tenure-track and contingent faculty, if anything.

This kind of discussion springs from a deeply rooted vision of cultural and social change that ultimately traces back to a very liberal, individualist and frequently capitalist understanding of human agency and motivation. If you think that individual actors respond primarily to centralized commands or strictures, or to preconscious and emotional drivers that have little to do with the surrounding environment, or to some higher rationality that is in no respect about self-interest, etcetera, then talk of incentives is already a misfire. Or if you think that any instrumental effort to encourage some practices and discourage others is a mistake or an irrelevance, equally so.

But if there’s any institution where “incentive” seems to me to be a fairly good way to approach deliberate or purposeful change, it’s tenure-track academia, in which individual faculty generally have considerable autonomy in developing their own interests, their modes of participation in institutional life, their work process and so on. Which I think is all to the good both for practical and philosophical reasons. A command approach doesn’t work and is a bad idea in any event. But given that some degree of institutional coordination is also an important need for higher education, you need to figure out some way to align or connect what individual faculty choose to do, and when there are opportunities or dangers in the larger context of higher education, some coordinated response is also important.

So what do faculty want? Some possible ways to look at this question:

1. Money. There are a lot of reasons why many colleges and universities don’t use this incentive too openly or actively, however. And some reason to think that a model of baseline salaries for most and huge salaries for a small sliver of highly desirable faculty at the top has even more perverse effects in relation to academia’s missions than it does in most businesses.

2. Security, e.g., promotion and tenure. Obviously this incentive works in the sense of motivating people to tolerate the process of graduate training, the perilous crapshoot of the academic job market, and the little tyrannies that junior academics often have to cope with on a daily basis. Precisely because it is both central and heavily mystified as a process, it is a very hard mechanism to consciously skew in some new direction in order to produce new outcomes. Even when administrators or influential faculty embrace the idea of crediting new modes of scholarly output in digital or new media formats, for example, shifting actual practices of assessment by peers is exceedingly difficult. Add to this the fact that tenure as an institution is dying, and the incentive value of expanding or extending its protections is thus unlikely to appeal to most administrations.

3. Enhanced autonomy. At least some faculty interested in new practices of publication, teaching, or engagement might find an offer of increased freedom from departmental or institutional strictures to be an attractive incentive. One of the dangers here is that the major reason why an institution might want to encourage experimentation or innovation in those practices is as a test of the viability of those experiments for wider adoption. If getting people to experiment involves detaching them from ordinary structures of governance or supervision, that possibly will limit the impact of those experiments as well as free the experimenter of the obligation to persuade colleagues of the advisability or generativity of their new ideas, cancelling out most of the positives for the institution as a whole.

4. New resources (technological tools, support positions, faculty lines, dedicated leaves). For one, these are potentially very expensive. For another, they risk making inflexible or structural changes well in advance of testing the viability of some new approach. And yet another problem: many faculty would not welcome the additional supervisory or administrative responsibilities that new resources typically entail.

5. Endorsement, validation, a kind word from on high. I wouldn’t underestimate how powerful a motivator this kind of appreciation can be. At some point, it’s possible that many people might expect such an endorsement to be backed up by something more substantial and be disillusioned if it weren’t. Administrative endorsement also often mobilizes as much antagonism as it does appreciation among faculty eager to protect their own prerogatives.

6. Nothing that a local administration can provide: if there are incentives, they’re extra-institutional, vested in disciplines or professional associations or public culture or government approval and so on.

7. The subservience and fawning obeisance of mere mortals. Sadly, I’d say I’ve met academics who seem to be primarily motivated by this objective. It would seem to me to be a bad idea to give them what they want as a matter of official or administrative policy.

8. Nothing at all: faculty who do interesting things that create new practices or spread new ideas are primarily self-motivated, and trying to underwrite or encourage their work might actually insult some of them.

Posted in Academia | 1 Comment

Blizzard Is CLU

x-posted to Terra Nova

I don’t understand why Tron: Legacy has come in for so much critical abuse. I like it as much as my colleague Bob Rehak does. Just taken as an action film, it’s considerably more entertaining and skillful than your usual Michael Bay explosion fest, with set-pieces a good deal more exciting than its predecessor. However, like the original Tron, the film also has some interesting ways of imagining digital culture and digital spaces, and more potently, some subtle commentary about some of the imaginative failures of the first generation of digital designers.

Some critics seemed disappointed that the film takes place in a closed system, the Grid, created by Jeff Bridges’ Kevin Flynn, expecting it to ape the original film’s many correspondences between its virtual world and the technology of mainframe computing and early connectivity. In the original Tron, once Kevin Flynn finds himself inside the world of software and information, he meets embodied programs that correspond to actual software being used in the real world, he has a companion “Bit” who can only communicate in binary, he has to make it to an I/O tower so that the program Tron can communicate with his user, and so on. Critics seemed to expect that Kevin Flynn’s son would be transported inside a world built on the contemporary Internet, that he would venture from Ye Olde Land of Facebook on a Googlemobile, past some pron-jpg spiders scrambling around the landscape of Tumblr, and then catch a glimpse of the deserted wasteland of Second Life.

The director wisely avoided that concept, but I nevertheless think the film is in fact addressing at least one “real” aspect of contemporary digital culture. Kevin Flynn, trapped inside the Grid for more than a decade, discovers that his basic aspirations in creating a virtual world of his own were fundamentally misdirected. He sets out to build a private, perfect world populated by programs of his own design. The complexity of the underlying environment that he creates turns out to be a “silicon second nature” that spontaneously generates a form of a-life that uses some of what he’s put into the environment but that also supersedes his designs and his intentions. Too late, he realizes that the unpredictability of this a-life’s future evolution trumps any aspiration he might have had in mind for his world. Too late because his majordomo, a program of his own creation, modeled on himself, called Clu, stages a coup d’etat and continues Flynn’s project to perfect the world by eliminating contingency, unpredictability, organicism, redundancy. In exile, Flynn realizes that the most perfect thing he’s ever seen is imperfect, unpredictable life itself: the son he left behind, the life of family and community, and the life he accidentally engendered within a computer-generated world.

Whether the analogy was intended or not, that narrative strikes me as a near-perfect retelling of the history of virtual world design from its beginnings to its current stagnant state. The first attempts to make graphically-based persistent virtual worlds as commercial products, all of them built upon earlier MUD designs, sometimes made a deliberate effort to have a dynamic, organic environment that changed in response to player actions (Ultima Online’s early model for resource and mob spawning). But even products like Everquest and Asheron’s Call offered environments which could almost be said to be shaped by virtual overdetermination: underutilized features, half-fleshed mechanics, sprawling environments, stable bugs and exploits that gave rise to entire subcultures of play, all contributing to worlds where the tangle of plausible causalities made it difficult or impossible for either players or developers to fully understand why things happened within the gameworld’s culture or what players might choose to do next.

Some of the next generation of virtual worlds, such as Star Wars: Galaxies, ran into these dynamics even more acutely. Blizzard, on the other hand, launched World of Warcraft with a clear intent to make a persistent-world MMO that was more tractable and predictable as well as one that had a more consistent aesthetic vision and a richer, more expertly authored supply of content.

That they succeeded in this goal is now obvious, as are the consequences of their success: other worlds have withered, faded or failed, unable to match either the managerial smoothness or content supply offered by Blizzard. Those that remain are either desperately trying to reproduce the basic structure of WoW or have moved towards cheap, fast development cycles and minimal after-launch support with the intent to make a profit from box sales alone, in the model of Cryptic’s recent products.

With the one major exception, as always the lone exception, of Eve Online. In terms of Tron: Legacy, Eve is the version of the Grid where the a-life survived. Though in the film the a-life that appears, the isomorphic algorithms, is said to be innocent, creative and imaginative, the moral nature of Eve’s organic, undesigned world is infamously rather the opposite.

But what Eve proves has also been proven by open-world single player games like Red Dead Redemption or the single-player version of Minecraft: many players crave unpredictable or contingent interactions of environment, mechanics and action. In RDR, if you take a dislike to Herbert Moon, the annoyingly anti-semitic poker player, you can go ahead and kill him, in all sorts of ways. He’ll be back, but more than a few players found some pleasure in doing their best to get rid of him in the widest range of creative ways. You can solve quests in ways that I’m fairly sure the designers didn’t anticipate, using the environment and the mechanics to novel ends. You can do nothing at all if you choose, and the world is full of things to do nothing with.

Open-world single-player games allow a range of interactions that Blizzard long since banished from the World of Warcraft. In the current expansion of WoW, I spent a few minutes trying to stab a goblin version of Adolf Hitler in the face rather than run quests on his behalf, even knowing, inevitably, that I would eventually end up opposing his Indiana-Jones-derived pseudo-Nazis and witnessing his death. I’d have settled for the temporary resolution that RDR allows with Herbert Moon, but WoW is multiplayer and Blizzard has decided that the players aren’t allowed to do anything that inconveniences, confuses or complicates the play of other players.

I don’t know that this is Blizzard’s fault, exactly: the imperfections of virtual worlds are precisely what so many of us have spent so much time discussing, worrying about, and trying to critically engage. Trolls, Barrens chat, griefers: you name it, we (players, scholars, developers) have fretted about it, complained about it, and tried to fix it.

The problem is that the fix has become the same fix CLU applied to the Grid: perfection by elimination, perfection by managerialism. What now strikes me as apparent is that this leaves virtual worlds as barren and intimidated as the Grid has become in the movie, and as bereft of the energetic imperfections of life. That way lies Zynga, eventually: the reduction of human agency in play to the repetitions of code, to binary choices, to clicks made when clicks are meant to be made.

Where the spirit of open worlds survives, it survives either because the worlds are open but the hell of other players has been banished and the game stays safely single-player or minimally multiplayer or because the world has surrendered to a Hobbesian state of nature, to a kind of 4chan zeitgeist.

I can’t help but wonder, as Flynn does, whether there’s some slender remnant possibility that is neither of these.

Posted in Games and Gaming, Popular Culture | 2 Comments

Footnotes to XKCD on University Websites

I spent some time before Christmas informally gathering several kinds of curricular information from the websites of the COFHE schools for use in Swarthmore’s strategic planning work.

Here’s some of what I informally learned in the process.

1. XKCD, typically, hits the bullseye. Most of us who use university and college websites heavily have just learned where the stuff we need actually is, but I pity anyone who has to find something for the first time, with the possible exception of prospective students, who manage to go directly to admissions-related resources and may actually want to see all the stuff about the university’s mission and so on.

2. In particular, finding a) graduation requirements and general-education requirements and b) requirements for specific departmental majors is about two orders of magnitude harder than finding out which Senator put a hold on a Presidential nominee. The range of ways and locations in which this information is listed was really dizzying.

3. It can be extremely difficult to tell from departmental pages what the research specializations, interests or work of faculty members actually are. Some departmental pages give really rich, interesting information about their faculty; some barely bother to list their individual names and contact information.

4. Looking at the history departments of COFHE schools, I eventually got a fairly confident sense of what the genuinely typical fields across that group tended to be, though variations in faculty size were fairly pronounced. Here’s an odd impression that I ended up having along the way: that the most atypical fields were often associated with the most senior faculty rather than the most junior. We tend to think of innovation as something that happens as a result of a deliberate decision to invest in a new or underrepresented field, but instead it sort of seemed to me that unusual fields were either the product of idiosyncratic faculty development or were survivals of the formerly typical distribution of fields in an earlier era.

Posted in Academia | 1 Comment

Mimesis and Interactivity

Here comes a bunch of blogging! Fasten your seat belts.

================

So yes, we got a Kinect at our house. I am the very model of the modern gamer tech geek. As an incremental change to the wand-driven interface design of the Wii and PS3, I admire it. I’m far more fascinated by the really imaginative hacking of the powerful capabilities of the device, and the unintended ends to which they may lead. I confess I was also a bit disappointed that the interface didn’t function like a combination of “Minority Report” and the Bat-Computer to the extent that I’d secretly hoped it might.

What frustrates me most about the Kinect, however, is not the device itself but the common misapprehension of some middlebrow game and digital media critics, most prominently Seth Schiesel of the New York Times, that the Kinect is the future of a naturalistic, real-world mode of interacting with digital appliances and media. Schiesel states the hope succinctly: that the banishment of game controllers, iPod dials, keyboards and other control devices in favor of intuitive motions of physical bodies and natural language commands is the end of a geek-favoring barrier to the consumption of digital media and the use of digital tools and the beginning of a great democratization of the digital.

This is in the end a very geek-oriented way of imagining why some media practices seem to adhere to geeks: that design is destiny, that technology intrinsically favors or excludes users because of its particular material or conceptual nature, usually a feature or architecture that a critic or designer believes can be and should be changed.

I don’t entirely disagree with this perspective. Design matters, and it matters in ways that are not purely a mirror of sociology or culture. This is even true of the Kinect or Wii or Sony Move control systems in particular. Schiesel and others are perfectly correct to say that kicking a virtual soccer ball or doing a virtual exercise routine with a motion-capture system is intuitive in a way that using a multi-button controller is not, and that this intuitive design permits many people to play some digital games when they would otherwise think that the effort of learning a control scheme doesn’t justify the reward of playing the game.

What bugs me about the middlebrow celebration of the downfall of the multibutton controller and its kindred devices (keyboards, etc.) is the naive understanding of mimesis buried inside that enthusiasm. The driving faith here is that representation and lived experience should have a 1:1 correspondence in order to rid ourselves of the work and difficulty that comes from a slippage between the two. There’s at least a kissing-cousin resemblance between this view and older positivist ideas, lingering on in some scientific and social-scientific circles, that we should tinker ceaselessly with language until all ambiguity is banished from it and it thus can be used for the efficient description of the real world.

Let’s say that Microsoft continues to hammer the bugs and quirks out of the Kinect, making its recognition of both language and motion closer and closer to how we hear and interpret speech and action with our own perceptual systems. Let’s even pretend that there won’t be a more and more obvious uncanny valley of some kind as it does so. As the system becomes more and more mimetic, at least in theory, will that truly rid of us of complex control schemes that only a geek could love?

Of course not, at least not if digital games work with the unreal, the imaginary, the impossible. What an odd thing that anyone should wish for games to become more restrictively mimetic to “reality” at a moment when digital technologies are otherwise opening up representational possibilities in film and television.

Stick for a moment even to sports games. A programmer could make a better and better Kinect-controlled soccer game, but if that is only going to involve those actual physical routines we use in a real game of soccer (which are themselves not something that human beings are born knowing, and are in some cases anything but intuitive: a game where you can’t use your hands? Not exactly a natural idea for a primate with opposable thumbs), two problems will quickly arise. First, if the action I see on the screen is to be synchronized with the action I perform in real-world space, the action can in general be no faster or slower than my real physical motion. Maybe you’re different than me, but I don’t play soccer at the speed of a Ronaldo or Beckham. So nothing in the digital game can appear to be enhanced from the world unless everything is enhanced or exaggerated to the same degree, and every computer-controlled player or physical action has to be as slow and boring to watch as I am in real life. Second, I can’t do anything that doesn’t involve a match between an on-screen avatar’s motions and my motions. The avatar can be first-person or third-person, but I can’t do something like control multiple avatars, or control markedly non-human objects or creatures, unless I learn to do something very imaginative, abstract or counter-intuitive with my body in real-world physical space. I can be a tiger in a Kinect game, but that either has to involve translating my normal bipedal ape motions into the motions of a four-legged feline or it has to involve my mimicking the motions of a four-legged feline.

From there, it’s a pretty short step to the Kinect version of having to memorize a series of finishing moves. It’s not as if this is something that digital media force upon our normally naturalistic, intuitive bodies. A boss fight in World of Warcraft has always seemed to me to have a very strong analogy to choreography, and I can easily see a Kinect-style future for a game of that kind where getting the right sequence of heals on the tank would look more like T’ai Chi than keyboard typing. But all of that will involve something as intricate and complex as contemporary controller interfaces (or real-world multiperson dance recitals). Without that slippage-filled interfacing complexity, I won’t be able to be a Jedi in a Kinect game: a game could interpret my raised hand as a Force choke, or a push as a Force push (much as it looks in films and cartoons), but I can’t tell my avatar to do an eight-foot-tall Jedi backflip without a gesture which is very fundamentally not an eight-foot-tall backflip.

We can’t be freed of the work of representation, the ambiguity of language. Why should we want to be? That is like imagining a freedom from life itself. It will be all to the good if the Kinect makes a game designer deep in his or her cubicled warrens wonder if the best way to connect a player’s actions with the attack of a fantasy warrior in an imaginary world is X O O X left trigger long-hold-on-X as opposed to the player making a fist in the air and waving it around. Anything that unsettles byzantine practices of culture by reminding us of their contingency is good, because that’s what catalyzes the creative discovery of the novel and unfamiliar. That creativity will be stillborn if it has to satisfy the expectation of the Schiesels of the world that they will never again have to learn something unfamiliar in order to control the unfolding of the imaginary.

Posted in Games and Gaming, Popular Culture | 3 Comments

Cheaters Sometimes Prosper

Historiann responds to the story of a cheating scandal at the University of Central Florida much the way that I do: before we get to talking about the misdeeds of the students, a 600-person course in strategic management taught very directly from a textbook, so directly that the professor felt comfortable using an exam provided by the textbook publishers, is really a far bigger issue.

Reading the astonishingly lengthy Wikipedia entry on strategic management provides a bit of insight in its own right. It’s a bit of a wonder to me that there are so many people who remain a-quiver with anger over the fading heyday of crit-theory jargon in the humanities when there are fields of professional training as chock-a-block with buzzwords as “strategic management” appears to be. I readily understand the need for people to act as strategic managers as described. I can see the validity of many of the debates that surround management in most organizations (business and otherwise): is strategy a matter of an astute, intuitive reading of the environment, or should it be driven by some kind of rigorous data collection, and so on.

But the idea that to be a strategic manager one ought to study strategic management? That’s mired in the big muddy of a lot of professional education. There are undoubtedly fields of professional work that have very steep requirements for highly specific formal knowledge before ever undertaking work of any kind. Still, even with professions like medicine and law where there is undoubtedly a very large body of formal knowledge that is necessary to successful practice, much of the real learning takes place through practice.

The establishment of formal certifications and informal expectations for professional degrees in many other professions is less about preparing people for the work they’re going to do and more about using educational institutions as a pre-screening device that saves employers the effort of having to consider an almost infinitely large pool of possible candidates for managerial or professional jobs. Some percentage of possible applicants will not have the money or the time to complete a degree. Some additional percentage will be unable to complete the certification due to life circumstances or sheer intellectual inability. At the end, potential employers get to look at a much smaller pool of applicants, but they’re not necessarily people who know much more about how to actually do the jobs for which they might be hired. Moreover, once sufficient numbers of certified people are established in a particular professional setting, they tend to busily go about altering the functioning of their work environment so that professional certification (and attendant jargon) is an effective social barrier to any outsider who might trespass.

Essentially this is mandarinism, 21st Century American-style. Work that ought to call for a fairly heterogeneous mix of people with different past experience and formal knowledge becomes a monoculture. Strategic management, for example, seems precisely the kind of thing where you might want someone whose sense of a business comes from the ground up, say in sales; another person who sees the big picture of an industry; another person who has very formal training in data collection and institutional research; another person with an unusual disciplinary background (say, an anthropologist or a former military officer). I’d wager you’d do as well or better, if you want to train “strategic managers”, simply to expose them to a fairly vanilla liberal arts core curriculum with some special emphasis on economics and organizational sociology.

I suspect most students in these kinds of professional programs understand this perfectly well: that what they’re paying for is to survive a more or less mechanical process of screening, that what they’re doing in their courses has little to do with what effective work in these professional fields might resemble, that universities are trolls collecting tolls from anyone who wants to clip-clop across the bridge.

In that context, it’s hardly surprising that students would cheat: it’s fairly difficult for faculties and administrators to act as if they’re upholding a sacred covenant when they’re really just gatekeepers churning six hundred people at a time through a purely textbook-driven course.

This is the same kind of cynicism that lets the now-infamous “Shadow Scholar” carve out a career as a ghostwriter of student papers. (Like a lot of folks who read the Chronicle article, I think Mr. Shadow is laying it on with a trowel, but I don’t doubt that there are people who do something like this for a living.) Over the last year, I’ve gotten emails from five online services, such as Course Hero, that seem to me to be trying to leverage social media into cheating aids. I was initially a bit puzzled about why such services might contact professors as potential contributors, but in some ways the only difference between teaching in some for-profit online universities or teaching in some professional programs and contributing to cheating sites is that the latter pay off so poorly. In the end, a lot of this comes down to the same thing: being part of the machinery that processes people who are understandably desperate to buy an entry into a middle-class job. If there are crackdowns on cheating, that’s simply because the screening function of the machinery breaks down if every single person who pays the money gets the certification, both for some of the student customers and for the end consumers, the businesses and organizations who are subcontracting the work of mandarinizing the white-collar workforce out to universities.

I’ve observed before that one solution to the problem involves reconstructing professional and graduate education (including, yes, Ph.Ds in academic subjects) so that the value added by actual education is the first, second and last priority. Which means smaller classes, less teaching to templates and textbooks, a more problem-driven and experiential focus to education, and less disciplinarity and technical narrowness in fields that by their nature ought to be about intellectual and professional heterogeneity. Which of course also means that the economies that govern many large universities would need a massive overhaul. Those unable to take that on? Maybe they should just go out of business altogether.

Barring that kind of reconstruction, it’s hard to get too exercised about aspirant mandarins cheating their way through the screening process. After all, in more than a few workplaces in contemporary American life, proficiency in cheating is arguably one of the best skills you could master. Bank managers are telling us all with a straight face that while they flagrantly cheated their way through the procedures governing foreclosure and holding mortgages, we need to just look past that and let them go about the necessary business of getting out of the mess that securitization of mortgages left them (and us) mired in. We’ve been told to overlook mind-blowing fraud in the supply of equipment and services to our Iraq and Afghanistan operations in the name of the greater good. If the cause is good, or even if it’s not, expediency now apparently trumps honesty throughout our political and social lives.

Accountability is for everyone or it’s for no one. Or maybe, as in the case of this cheating scandal, it’s only for the dumb chumps stupid enough to get caught and powerless enough that they have to don their sackcloth and ashes and duly perform their false regrets.

Posted in Academia, Politics | 4 Comments