Image of Africa courseblog

I’ve finally gotten my courseblog (and Twitter feed) for History 86, Image of Africa, fully set up. I’m really looking forward to this class: it’s become as much a class on the history of transmedia interactions as it is about the impact of representations of Africa. I’m also using the course to think about the disciplined use of online and digital resources, which should come out in some upcoming entries here.

Students will be posting their “spottings” of Africa-related tropes as the semester goes on, and I expect to post at least some of their written work from the course to the blog. If you’re interested in commenting there, send me an email at tburke1 @ swarthmore.edu.

Posted in Africa, Swarthmore | Comments Off on Image of Africa courseblog

The Temperament of Serpents

I went surf fishing in Delaware for the first time this summer. My previous experience had been limited to freshwater fishing, mostly for trout, some with a spinning reel using lures and bait, some fly-fishing.

So I spent some time reading about the gear, about reading the surf, about casting techniques, about what you’re fishing for and when you fish for it, and so on. I got myself a 10′ rod and spinning reel and some of the tackle I’d need. I read a few books on surf fishing and a few online forums where surf fishing enthusiasts were talking about their experiences. Then, when we were down there, I went to a nearby bait and tackle place, got my license and asked what folks were fishing for, knowing that August is a bad time of year for much of anything.

All of this was useful preparation, and helped me have that conversation at the store in a way that didn’t make me look like a complete idiot. I took advantage of formal and informal knowledge, and became in some slight way a knowledgeable person about what I was going to do.

When I got to the beach with my family, I knew how to rig my line, and how to use my existing knowledge of casting to cast much further than I normally would. What I didn’t know, and what formal knowledge could not let me know, is that in August, almost no one surf fishing in that part of Delaware was surf fishing for the sake of catching fish or even for the zen of fishing per se. Almost everyone there, and it was fairly crowded, was there with their cars (there’s a somewhat expensive permit that lets you drive on the beach if you are “actively surf fishing”). They were there for the sake of the beach, not really to fish. Basically, it was much closer to a big tailgating party than it was to the kinds of fishing I’d seen done along trout rivers and streams, where almost everyone is fairly intent on the act of fishing (which, to be fair, entails a fair amount of just soaking up the scenic environment).

I certainly didn’t have a problem with this (except for when the kids on one side of us kept getting in front of me when I wanted to cast) but it was a bit odd to have spent so much time learning about something only to find that almost everyone else just cast a line in the water and left it untouched for six hours (some of them using small freshwater rods, and more than a few people with no tackle at all on the line anyway). It was like getting all dressed up in formal clothing for a party where everyone else arrives in t-shirts and jeans. The word is that once the summer ends and the fish that anglers prize arrive in mid-September, the scene completely changes. And even in August, at night, it’s a completely different crowd.

The point being that formal knowledge is useful preparation for experience, but only if you’re prepared to abandon, modify, and adapt that knowledge rapidly when you discover that actual communities of use don’t go by the book. I worry a lot that most higher education skimps on or avoids entirely that moment where formal knowledge meets experience and gets put in its place.

This is an old debate in educational studies, of course, the familiar Punch and Judy drama of constructivism vs. guided instruction. But this is also an increasingly important way to think about crowdsourcing, about vernacular knowledge, about how communities of usage might interact more usefully with formal scholarly knowledge production and when the kind of knowledge that comes through long usage or experience ought to lead or trump scholarly investigation rather than the other way around.

One example that was on my mind came from a short note in last week’s New York Times Science Times. A reader sent in a question, asking for help in identifying a snake he found in Queens, NY. The answer from a herpetologist was that the snake was a brown snake, related to garter snakes and water snakes. Another expert chimed in that the typical temperament of the species is somewhere between ribbon snakes (“friendly”) and northern water snakes (“very grumpy”).

I think I’ve mentioned before on this blog that for much of my childhood, I was convinced that I would be a herpetologist. I had pet snakes (a rosy boa, a ribbon snake) and was somewhat infamous in my family for my avid interest in observing and catching snakes and lizards when the opportunity arose. I’ve retained much of this interest even if I’m not a professional herpetologist. While living in southern Africa, I caught (and released) chameleons, geckos and skinks and watched adders and cobras from a safe distance. (Yeah, don’t worry, I’m not going all Crocodile Hunter here.) I often catch garter, ribbon and water snakes in this region to show my daughter when we’re on hikes, though I wasn’t able to get my sister and wife to come and look at a hognose snake the last time I ran across one in the woods. Today we keep a pet ball python at home.

So this little note in the NY Times made sense to me. I suspect any naturalist with an interest in a particular species, family or class of animals develops a similar working, experiential sense of the variations in behavior or temperament that they observe when tracking, watching or handling those organisms. Gardeners end up with a working sense of the growth habits of particular plants and weeds and how they interact, pastoralists similarly with domesticated animals. Engineers and tinkerers develop a working feel for how mechanisms and technologies actually function and interact in real contexts of usage.

Vernacular experts are sometimes wrong. The shared understandings that form in communities of use not infrequently harden into orthodoxies. They can come to believe phantom patterns or endorse counterproductive or destructive practices based on received wisdoms. The urge to generalize is very strong whether you’re an academic expert or an everyday practitioner. I contacted one herpetologist to ask whether there was any scholarly study of “temperament” in various snake species. She said no, but she sagely observed that in her experience, species don’t have temperaments, but individual snakes do. That also seems right to me: if you were too convinced that all ball pythons are placid and well-suited to be pets, you’d be pretty confused when you ran across one that was aggressive the moment it came out of the egg.

But it also seems to me that conventional scholarly knowledge does even worse if it is asked to engage some of the observational truths that come from usage, experience and practice. If I asked disciplinary biology to formally study the varying temperaments of snake species, I suspect that one of the end results might be an argument that temperament doesn’t exist, that it’s impossible to study without running afoul of Hawthorne effects, or perhaps that temperament, if observable, can be explained in terms of adaptive advantage to different species in the context of their environments. None of which feels right to me: different individuals and species do seem to me to have “temperament” (a variation in how they react to handling or human proximity or captivity), it seems to me that naturalists with an interest in snakes can reach meaningful non-scholarly consensus about their observation of these behaviors, and if temperament exists, I’m thinking it’s a mistake to formalize an explanation of it in terms of adaptation (a spandrel or epiphenomenon if there ever was one).

I don’t think there’s any danger that scholarly herpetologists are about to crush this delicate vernacular flower under their feet: they have enough important research to do as it is, and there are few enough of them to do it. But in other contexts, academics have had a tendency to let their formalisms run roughshod over what communities of use know. Sometimes that leads in the long run to the academics getting egg on their faces, sometimes it leads to really productive changes in the practice and understanding of vernacular experts. Sometimes nothing much happens except mutual contempt.

This is where new technologies for meshing or amalgamating knowledge-producing communities that have very different norms and composition might lead to a much richer range of outcomes, to something genuinely new, where “crowdsourcing” has the possibility to become something that neither overcomes nor submits to academic knowledge, while academic knowledge retains some of its conventional advantages and strengths while addressing some of its persistent weaknesses. It seems to me that scholars have the potential to move into a new era where we could much more consistently distinguish between three declarations: 1. “This is a problem that we’re best suited to engage within our established institutional practices”; 2. “This is a problem that’s best left to people who’ve learned about it through intensive personal and community experience”; 3. “This is a problem that’s best engaged through meshing or connecting heterogeneous styles of knowing”.

Posted in Academia, Digital Humanities, Information Technology and Information Literacy, Production of History | 3 Comments

The Emperor’s New Interface

A beginning-of-the-semester raft of posts is on the way. Let me start off with a little appetizer of outrage before I get on to the long-winded equivocating, though. It seems like most librarians are willing to kiss and make up with Ithaka & JSTOR over the recent changes to its interface. Given JSTOR’s present indispensability, that’s wise on their part.

I, on the other hand, have no such inclination. Oh, I’m not uniquely angry at Ithaka’s leadership, mind you. Nor do I think this was a conspiracy to get users to accidentally pay for content that their libraries already own. But if you want a single moment that reveals how flatly insane the entirety of academic publishing actually is, this is that moment. All the fig leaves fell to the ground for a couple of weeks.

To sum it up, as I have before at this blog: academic institutions (and grant-giving agencies outside of academia) subsidize scholarly research through sabbaticals, through supporting laboratories and libraries, through travel funds and so on. When scholars report and disseminate their research in short-form articles or papers, they have traditionally done so by giving away the written report to outside publishers. (Or worse yet, the researchers have had to pay someone to disseminate or publish their findings, a cost also borne by the universities or by granting agencies.) Scholars have then given away something else to the publishers: the work of peer review, done on an entirely voluntary basis, which is the primary value-added that makes the publications desirable in the first place. These publishers have then sold the published results back to universities, often at very high profit-seeking mark-ups.

What do scholars get out of disseminating or publishing their research? Primarily they gain reputation, which may indirectly produce financial rewards. Only very rarely does an academic receive direct financial gain from the act of publication itself. How do you gain reputation? Through the widest possible circulation of the research publication.

So: in the system as it existed from about 1970 to the present, universities had to pay twice (or more, if you count supporting peer review as a form of academic labor) for research, and because publishers held the rights to the work that was donated to them, work did not circulate as widely as it could. Quite the contrary: conventional publication sharply limited its circulation.

That was one thing when the publishers were bearing the costs of the physical production of print. Digital publishing is not cost-free (UI design, storage, interoperability and preservation all cost something), but it has none of the burdensome overhead of print.

So why do we tolerate the rank insanity of this system now that we can completely obliterate it? Peer review is completely mobile to digital publication: it was already done remotely anyway. Editorial boards are completely mobile to digital publication. There’s still a place for book publication that is handled by presses, books which potentially have slightly wider audiences than one hundred fifty research libraries. There’s still a role for a few wider-circulation print journals that also reach wider audiences. But the vast majority of academic publication can avoid the middlemen entirely, which would simultaneously save money and serve the purpose of scholarly publication far better.

Now, JSTOR’s interface redesign didn’t actually change anything about who owns what material. All it did was briefly showcase the nudity of the emperor, reveal more nakedly how much of our academic patrimony we gave away for decades and decades, and make us rattle our beggar’s cups a bit more. At least in the fairy tale, when the town sees the emperor is naked, they don’t close their eyes until they can get back to imagining him as regally clothed again. No, they run the salesmen out of town and laugh the emperor back into his palace.

Posted in Academia, Information Technology and Information Literacy, Intellectual Property | Comments Off on The Emperor’s New Interface

Tomato Tomatoe

I’ve got a steady flow of tomatoes from the garden now, though I’ve lost a few to blossom-end rot this year, I think because it’s been so hot and relatively dry. So far I’ve made a spicy wine-and-tomato sauce with a few of the bird peppers from the garden, a tomato-and-mint soup, and several rounds of tomato-mozzarella-avocado salad (I’ve started dressing it with tomato water mixed with a touch of olive oil, lime juice and soy sauce, and this really works well). My favorite thing from the garden this year, though, was the fresh cranberry beans soaked and then fried lightly, added to some thin slices of zucchini from the garden that I dipped in chickpea flour and fried with chorizo and garlic.

Working with fresh vegetables from the garden helps me put general foodie preoccupations in perspective. This New York Times piece on expensive boutique ice cream raises the question of when it makes sense to prefer local or high-end foods and when it doesn’t. I like locavores and slow-food advocates because the consequence of their advocacy is often very good food. But the more religious versions of both turn me off. I don’t think it’s at all clear that eating local is always a net plus in environmental terms, for example. I know it’s not always a net plus in terms of taste or quality. There’s nothing better than heirloom tomatoes from your own garden, but plenty of things that I have grown over the last decade aren’t measurably better-tasting for having come from my own yard. When I find that’s the case, I stop growing them. (I also stop growing them when it turns out that the local varmints can’t keep their paws and beaks off of them.) The mainstays are tomatoes, beans and greens, all of which seem better to me grown right here.

In terms of local foods, cheese and dairy can often be superior, but that’s often because of the way the dairy is run or the skill with which the cheese is made, not because it’s local. Local meats can be better, but that’s generally the case only if there’s something different about the conditions under which the animals are kept or about the breed quality (especially with heirloom breeds). Eggs are different: a freshly-laid egg is a thing of wonder. Local produce is better if it’s something where spoilage is a factor over longer distances or if it’s a fruit or vegetable where mass production has totally destroyed flavor in favor of standardization and shippability (tomatoes or apples). And all of this applies if you’ve got the money to pay for distinctiveness: none of these locavore preferences scales at all well to mass production. I was down at the Italian Market in Philadelphia earlier this week, and honestly, in some cases, I don’t see that the produce or meat there outdoes a good-quality supermarket, except that you can get more cuts and things like tripe from the butchers there.

All of this goes double or triple for prepared or manufactured foodstuffs. There are mainstream brands that I think are superior to up-market organics, and in some cases better than what you might make yourself. I can make corn tortillas from scratch and then cut them up and fry them, but honestly, there are a number of brands of tortilla chips that would outdo anything I can do at a cheaper price, without the labor. Good food is good food: it can come from a factory or from the little old lady next door, from a big farm or from a garden.

Posted in Domestic Life, Food | 11 Comments

Geeking Out About Dragons and Alt-History

I’ve talked about Naomi Novik’s Temeraire series before, which is an alternate history focused on the premise that many of the major governments of the world between 1600 and 1800 have had access to intelligent dragons as military, economic and cultural resources.

Novik’s series is focused on the adventures of a British naval captain and his accidentally-acquired dragon, who turns out to be a highly intelligent and strong-willed member of a breed previously found only in China. Over time, Captain William Laurence and the dragon, Temeraire, have grown increasingly estranged from the British military and now from British society as a whole. What originally started as a bit of a mash-up of Anne McCaffrey’s Pern and Patrick O’Brian’s Aubrey/Maturin novels has developed its own distinctive feel.

As I’ve noted previously, Novik’s alternative history has the escalating feel of galloping away from her in a way that I find kind of intriguing if also perilous to the coherence of the series.

The changes that her story has made to world history are now so comprehensive that they’re plainly straining her ability to keep all the balls in the air, which I think is one reason why the newest volume in the series sometimes feels a bit boring and glum, like it is stalling for time. Still, I really enjoy thinking through the cascading sequence of alternate events and conditions that she’s set in motion, like those thought-experiments where legal scholars sit down and try to figure out what laws would govern vampirism or lycanthropy if they were real.

Novik deserves a lot of credit for not just returning the status quo in each book to a kind of Napoleonic-era + dragons baseline. That’s what a lot of her fans seem to want: the comfort of keeping early 19th Century British military officers as British military officers, in a setting where the British Empire is a pleasantly nostalgic backdrop to the action. There are a lot of complaints from readers that the characters are “too modern”, the plot developments too politically correct. I think in many cases, these are readers who don’t really know much about the actual history of the British Empire (and therefore regard it as impossible that there should have been actual British people in 1800 who were anti-imperialist or at least indifferent to imperialism) and are more comfortable with non-Western people in such tales being nothing more than background elements. I agree that Novik is starting to use Temeraire as a kind of ‘modern’ critic of imperialism, but given that the European-trained dragons in some respects function as “anthropologists from Mars” (e.g., they’ve previously not had much exposure to human institutions or knowledge, but Temeraire’s sharp interest in these subjects has changed things), it’s not at all unreasonable that he should ask some basic questions, such as why lodging a claim of territorial sovereignty based on Captain Cook getting off his boat briefly makes any sense whatsoever.

Spoilers ahead for Tongues of Serpents.

Sending Temeraire and Laurence off to Australia is a solid low-key follow-up to the last volume’s major developments: it gets the characters away from the major global events unfolding, and lets Laurence slowly come to the next stage of his development as a character, turning his back on the British Empire for good. Temeraire clearly has already come to the point of regarding imperialism as nonsense, though in a dragonish fashion.

But enough information gets added to the picture of Novik’s alternative world that the next volume honestly should take place in a setting that is thoroughly unlike the early 19th Century in any respect: this is no longer just Napoleonic Europe + dragons. Here’s what I noted:

There are now at least two other major “dragonish” species of creatures in this world, with serious political and military implications. There are sea serpents, some of them trained and under the control of a renascent Chinese empire, and there are bunyips in the Australian outback, which aren’t under human control but are clearly intelligent and hostile to human beings.

In the meantime, an alliance of African kingdoms using weaponized dragons has attacked European ports in the Mediterranean in retaliation for the slave trade and has been given naval transport by Napoleon to Brazil, where it has continued its attacks, now on slave plantations. (I complained earlier about Novik’s idea that “the Tswana”, a single state/people from southern Africa, could have crossed the rest of the continent to North and West Africa as if it were more or less unpeopled and then carried out military operations from there, so I’m taking her continued mentions of “the Tswana” as being an alliance of multiple African states. Because that’s what makes sense to me.)

So let’s sum this up: Britain no longer has effective naval superiority in the eastern Pacific because of Chinese sea serpents, plus China is no longer the enfeebled Qing China of the early 19th Century, but instead under leadership determined to push back on European advances. African states are working in alliance to destroy the slave trade, European states no longer have territorial footholds in West or Southern Africa, and with the aid of Napoleon, Africans have begun an invasion of the New World.

In addition, we hear a bit more about North America in this volume, including the proposition that dragons there are increasingly being used by both European and Native American merchants for air transport of commodities rather than as military assets.

All that adds up to an utterly different, almost alien world, quite aside from there being dragons and such:

*No plantation slavery past 1810 anywhere in the world, assuming that the African alliance doesn’t meet meaningful resistance. Huge implications not just for the New World and Europe, but for Africa.
*Air transport of goods within continental landmasses, so no need for railroads or even canal-building in North America. (This is assuming dragon husbandry can produce sufficient numbers of animals to meet increased demand + sufficient food for the dragons. Industrialization of meat production might come earlier in this world!)
*Societies previously vulnerable to European expansion are strongly defended: it’s hard to see how Europeans would gain imperial hegemony over the Australian outback, China, or Africa.

Now add to this that whether she knows it or not, Novik is laying the groundwork for some kind of dragon liberalism, that Temeraire is more or less heading in the direction of a dragonish version of the Enlightenment. I’m not sure Novik will want to pull the trigger on this particular mantelpiece gun, but it’s hard to see how she can avoid it. I keep wondering why Temeraire hasn’t read Rousseau, Voltaire, Adam Smith, John Locke, Montesquieu and so on, given his interests. (Maybe he has and I just missed it, but…) Conversely, of course, imagine what Enlightenment thinking would have looked like if there was another unmistakably sentient species sharing the planet with human beings, and what the intellectual consequences of news about the emancipated status of dragons in Chinese society in particular might have been within European society. Dragon Chartism can’t be far off. Though Novik has also done more and more in each volume to establish what the dragonish version of “reason” looks like, and it’s not entirely human. Dragons have a psychologically dependent relationship on the human that they imprint upon at birth, and dragons have an avid near-instinctive interest in loot and riches that has nothing to do with accumulation in the human sense.

Of course, the extent to which Novik is engaged in world-building is also raising a lot of questions not just about future events in her series, but about the implausibility of the past of her world. How exactly did Europeans engage in post-1492 expansion in the New World if the Incas and other Native American states had dragons? (We know that the Inca and Aztec Empires resisted Iberians successfully, but on the other hand, we now know also that Portugal has extensive holdings in Brazil. These are hard to reconcile.) What exactly do the Americas look like, anyway, and where on earth were all those African slaves going to? (Something I wondered about the last time I posted on the series.)

Why did West African states tolerate the slave trade in the first place? Did the Mongols use dragons, and wouldn’t that have made a difference in their conquests whether they did or not? More importantly, why have intelligent dragons ever tolerated subservience to humans? How could dragons make any ecological sense whatsoever given their need for huge amounts of meat? Some dragon breeds in the books are able to eat two or three large mammals per day. (Not to mention economic sense: even a small dragon force would have put a huge burden on most preindustrial societies.) I’m hard pressed to understand dragon evolution in any respect, even given centuries of artificial selection.
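To put rough numbers on the feeding problem (my own back-of-envelope assumptions here, not Novik’s): even a modest aerial corps of twenty large dragons eating two large mammals a day apiece works out to 20 × 2 × 365, or roughly 14,600 animals a year, a staggering permanent draw on any preindustrial society’s herds before a single human has been fed.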

Etcetera. But like I said, I enjoy the extent to which Novik is at least allowing these kinds of questions to slowly rise to the surface in the series. I really do think it’s time for her to move into a completely new narrative line and start putting the dragons into the politics of the European Enlightenment: a dragon-rights campaign would make perfect sense, given the direction of the series so far.

Posted in Books, Popular Culture, Sheer Raw Geekery | 3 Comments

Evidence Is Old-Fashioned?

So, more wailing and gnashing of teeth about Andrew Breitbart.

The New York Times has a piece on plagiarism that reviews an increasingly prominent argument that contemporary college students simply don’t know that copying the words of another writer verbatim is plagiarism, that they’ve grown up in a different kind of textual environment that will eventually produce new norms for everyone.

I’m sympathetic to certain versions of this claim. I’d agree that many students are poorly taught how to cite online material. I’d agree that there really are new kinds of text-making practices in digital environments that arise out of networked or collective systems for sharing information.

What we’ve come to understand as plagiarism is a relatively short-term consequence of a highly individualized and relatively recent conception of authorship, creativity and property rights. Many years ago, I was surprised to find that 18th and 19th Century European travel writers sometimes committed what I saw as outright plagiarism, reproducing or directly paraphrasing work by an earlier traveller. Over time, I began to realize that for some writers, this was a “best practice”: if you didn’t have time to visit an area along your route, but someone else had, then include what they had to say, but fold it into your own authoritative account.

But I’m enough of a devotee of our recent view of authorship and creativity (and property) to think that the norms established around plagiarism during the 20th Century need some kind of continuing defense, just with sufficient awareness of the changes in textual production and circulation.

What really worries me is what’s happening to the larger purpose of the analytical writing which tempts some to plagiarism. The thing I’m honestly afraid of is that we’ve come to a point where the professional value of learning to build strong arguments based on and determined by a command over solid evidence is in rapid decline.

I think that in the last four decades of 20th Century American life, the ability to build a strong case whose factual foundation could withstand fairly determined examination by a variety of critics paid off in a wide variety of professional and personal contexts. I’m not saying that the quality of knowledge claims in that era was always beyond dispute: quite the opposite. A lot of social research from that time turns out to have been flawed in its claims, in its evidence, in its rhetoric, in its method. But I do think that both academics and non-academic professionals often tried hard to get it right, changed features of the arguments they were inclined to make based on evidence, and when their evidence was found seriously wanting, abandoned or strongly modified views that they’d previously held.

There are a zillion reasons why that spirit has receded strongly from public life. It’s not all about the sudden surrender of the Republican Party hierarchy to a populist fringe that treats all evidence as infinitely malleable to its needs, and evidentiary debates as the culturally perverse hobby of an elite it disdains. But that’s the latest and strongest fruit to hang from long-growing roots and branches. The upshot is that we’re in a moment where it’s not clear that there are any meaningful professional, social or personal consequences to believing whatever you want and unabashedly cutting “evidence” to fit the Procrustean bed of your beliefs. Evidence or facts are becoming a rhetorical flourish, like opening a letter “Dear Sir:”, or calling an openly totalitarian nation “the Democratic Republic of”. You include “evidence” because that’s the form, but the substance hardly matters.

So here’s the question, then: am I committing a kind of futureward malpractice if I tell students that the quality of their evidence matters? Is this one more way that I’m just an academic training people to be academics and ignoring the future needs of other professions and careers in the world as it actually is? I know this sounds like dramatic pearl-clutching, but I look at the case of Breitbart and a seemingly endless parade of other pundits and writers wrong about small facts and big facts, casually mangling and manipulating evidence, and I don’t see that it hurts any of them. I don’t see that the mainstream media cares much any longer, if it ever did, about enforcing a different standard. I don’t see that this kind of writing or speaking means anything negative for a political career or a career in public service. Business, law, medicine: if you’re on top, you’re not going to get called to account for any distortion, no matter how gross, and if you’re not on top, you’ll be producing distortions on command for those at the top.

It’s not just the professions, either. There’s one blog that I really love to read that has a regular commenter who has a near-perfect style that combines the recirculation of right-wing talking points, the undisguised evasion of unwanted ‘frames’, and a passive-aggressive retreat into personal and anecdotal accounts when directly challenged, a style for which Ronald Reagan should have been awarded a patent. I think this style probably makes this person successful at producing outcomes in her everyday civic and professional life. I know that when I’m in everyday civic contexts and I come up against someone who fuses that kind of approach with a dogged determination to have their way, I just say screw it and walk away unless the stakes are the highest possible. (And that’s partly how we get to situations where the stakes are the highest possible, because of incremental erosion on smaller issues.)

So maybe that’s the kind of writing and speaking we need to train our students to do: rhetorically effective, infinitely mutable on substance, entirely about affect and audience rather than just sensibly attentive to them. At what point is it perverse to continue making buggy whips while the Ford plant churns away right next door?

Posted in Academia, Politics | 10 Comments

I Want My AuthenticiTV

I largely believe in the everyday critical capacity of contemporary audiences. In many ways, I think cultural consumers today are the most sophisticated in human history. To some extent, that’s because their toolkits, both intellectual and technological, have a lot of flexibility and capacity, but also it’s because the volume, fecundity and range of contemporary expressive culture is so staggering and its interpenetration with everyday life so thorough that people can’t help but know more than they think they know about texts and artifacts.

This is one reason why I am so profoundly irritated by conventional media-effects hand-wringing, which often strikes me as much less intellectually sophisticated in its simplistic ideas about the mimetic powers of representation, the extent to which showing or describing or making an action or an image causes it to happen in the real world, than the average man-in-the-street. (There are other reasons to dislike media-effects research, not the least of which is that empirical work in the field is often manipulative or tissue-paper thin.)

That said, I don’t think audiences are thoroughly ironized postmodernists whose multilayered consumption of culture is always knowing and masterful, who are never tricked into taking the wrong text or performance too seriously. That’s how some cultural producers cynically try to talk themselves out of trouble at times, mind you. The hard thing about our cultural moment from the audience perspective is that sometimes we do take our cultural pleasures seriously, sometimes we do expect authenticity or truth, sometimes we don’t want to be tricked. Sometimes we don’t want to wander in a metatextual maze, even if there’s a kindly Daedalus around to provide a thread. When audiences invest in authenticity or expect truth, the last thing they want to hear is a producer or author telling them, “Oh, grow up, this is show business, this is just a product, this is how the game is played”. And when audiences divide on this expectation, that’s when you can expect the rhetorical blood to flow–and in some situations, maybe not-so-rhetorical.

Three examples I’ve run into in the last week.

One, the Hexbug Nano. 1) I’ve always been interested in robotic toys. 2) I’ve long argued that the potential “killer app” toy or pastime for digital-age kids would be a more fully functioning, complex version of something like Pokemon, a collect-them-all world of creatures that had digital genetics and evolution, creatures which could interact to create successive generations of new creatures with new combinations of attributes or capacities. So I noticed the Hexbug Nano in a Toys R Us while we were travelling and my daughter and I picked one up to mess around with.

The packaging suggested to me that this might be the toy that combined 1 and 2, that the little bugs would interact to share some kind of persistent genetic information. (The package suggests that there is a “rare mutation” inside.) What you get instead is an attractively packaged bristlebot. Which is as fun as a bristlebot, meaning, fun for a little while, but nothing like what the packaging implies. I thought this video was especially eye-rolling, as it implies that what the Nanos are doing is intentional, involves learning, or is otherwise responsive to environmental cues, instead of being random motion. What we have here is a digital Sea Monkey, really.
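Purely as a geeky thought experiment, here’s a minimal sketch, in Python, of the kind of “persistent genetic information” I was hoping the packaging implied. Everything in it is invented for illustration (nothing Hexbug actually does): each critter carries a genome of a few traits, and two critters that interact produce offspring that recombine those traits, with an occasional rare mutation.

import random

# Hypothetical trait ranges for a toy critter; every name here is invented.
TRAITS = {
    "speed": (1, 10),
    "leg_length": (1, 5),
    "curiosity": (0, 100),
}

def random_genome():
    """A founder critter: each trait drawn uniformly from its range."""
    return {t: random.randint(lo, hi) for t, (lo, hi) in TRAITS.items()}

def breed(parent_a, parent_b, mutation_rate=0.05):
    """Offspring inherit each trait from one parent at random, with a
    small chance of a fresh random value (the "rare mutation")."""
    child = {}
    for trait, (lo, hi) in TRAITS.items():
        if random.random() < mutation_rate:
            child[trait] = random.randint(lo, hi)  # mutation
        else:
            child[trait] = random.choice((parent_a[trait], parent_b[trait]))
    return child

population = [random_genome() for _ in range(4)]
for _ in range(3):  # each "interaction" between two bugs adds offspring
    a, b = random.sample(population, 2)
    population.append(breed(a, b))
print(population)

Even something that simple, persisted in the toy’s memory and exchanged when two bugs bump into each other, would give kids a reason to collect, trade and selectively breed. That’s the toy I wanted to buy.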

But for at least some consumers, that’s enough: the movement of the bugs is entertaining and they had no other ideas about what they were getting. Or in the case of many kids, their imagination trumps the reality, and they’re perfectly happy attributing intentionality to the bugs. As far as that goes, though, I think they’d be better off just building a maze out of household objects and catching some living insects: the variety of responses will be richer, the attributional fantasy more compelling.

So if it’s good enough for some, why am I crying in my beer? This is where culture’s mysteries arise: why do I invest when others stay safely on the surface, unruffled? Why do I demand something more honest (or something more ambitious)?

A different case where almost no one in the audience is content with taking things for what they are, settling for what producers offer, is this season of Top Chef. Reading across a diversity of fan sites and comment boards, there’s a nearly universal dislike for this season, and some evidence in early falling ratings that this isn’t just a lot of talk. The negative reaction, as I read it, is largely focused on the uncharismatic cast, the lesser quality of the food, and most especially on the editing of the show, which has highlighted gossipy, negative and petty interactions between the cast members.

What Top Chef viewers are saying back to the producers is that they’re not content to watch the show in a deeply ironic, postmodern fashion, knowing that it’s-just-a-reality-show and that whatever they’re seeing is simply the storyline that the producers have decided to show them. Instead, they’re claiming that at least some past competitions have had the virtue of authenticity, that the people and the food and the emotions have been real, and the reputational stakes have had genuine meaning in the careers and lives of the contestants. Frequently, you see self-described fans contrasting the show to other reality programs, arguing that it’s “classy”, “real”, “collegial” by comparison.

In last week’s episode, the editing strongly implied that one chef stole another chef’s dish and presented it as his own, leading to the possible thief winning that week’s competition. The show’s chief judge and leading spokesman, Tom Colicchio, wrote a blog entry that more or less dismissed the scenario, arguing both that there was no way to know whether it happened, and that if it had happened, it wasn’t a big deal. The negative response has been voluminous, with almost three times the number of comments as for any other episode.

What I find interesting is that almost all the viewers are having the same triple-layered reaction that I myself had to the episode. First, they know that the editing of the program was intended to provoke this intense response, that the producers want people to be involved and angry and blogging and linking, and that in all likelihood, this week’s episode will somehow resolve the narrative line of the theft of the pea puree rather like the way old cliffhanger serials got their hero out of trouble: by rolling back in time and showing you a scene of his escape that you never saw in the earlier episode. Second, a lot of people are angry not just at what the storyline of the episode contained, but at the violation of the cultural contract between audience and producers. In other words, they don’t want to give the producers the satisfaction of reacting the way that the producers want them to react, because the very fact that the producers are willing to reveal so nakedly the style and technique of their manipulation of events disrupts the investment that the audience wants to make in Top Chef.

So third, a lot of people are trying to figure out what the appropriate metatextual response is: stop watching? Write critiques? Generate negative buzz? Or ignore the show altogether? The problem being, when you do invest or trust in the authenticity of a cultural work, it’s hard to think metatextually, because you don’t want to. But one thing I also know is that the producers of the show are playing with fire: when you force audiences to switch codes, when you pull back the curtain to show the little man playing with levers, when you break those contracts, you often kill your gold-laying goose. The really interesting metatextual question for me is why that happens as often as it does, why producers find it so hard to understand how their audience thinks, what pleasures and experiences and investments they’re deriving from a work of culture.

One more example that I’m planning to talk about in a subsequent blog entry: the case of Andrew Breitbart and Shirley Sherrod. This is a complicated example, already ably dissected by a whole range of online writers and journalists. But of course, also enabled in the first place by online writers and journalists, and that’s the problem in a nutshell with the contemporary American public sphere. What are we to do as audiences when we want to exercise a selective ability to take some writing and reportage as authentic, to really invest in its communicative and factual capacity, and yet we also know full well that virtually no one producing that content, in any medium or any format, cares any longer to make good on that investment? Or, more disturbingly, what are we to do when we suspect retroactively that we’ve never had that capacity? Mooning about for some past moment when reporters dished up the truth, tirelessly and ruthlessly investigating the hard fact, is less credible (and less emotionally satisfying) than believing the tooth fairy left you a quarter under your pillow. Even so, it feels to me that in some past moment, if you’d been caught burning the Reichstag this flagrantly, you would have been shuffled off to some dusty, unpaid corner of the public sphere to edit a hand-mimeographed newsletter for an audience of ten or twenty local cranks. Instead of being rewarded in a world where no publicity is ever bad publicity.

Getting Top Chef back to where its audience wants it to be strikes me as being at least plausible, and whether or not anybody ever makes the digital-genetics robot Pokething of my dreams is not really of any importance except maybe to some manufacturer’s pocketbook. The public discussion of the most important questions of our day? Not so much.

Posted in Blogging, Consumerism, Advertising, Commodities, Politics, Popular Culture | 3 Comments

Tenure from a Wide Angle

So I see that the issue of tenure has come up again while I was away. No surprise: it will keep coming up until some stable new institutional norm for academic employment emerges.

What is much clearer now than when I first began to be involved in these debates is that tenure in higher education has already been abolished. That’s right: you can all stop talking now about whether it’s a good idea or a bad idea for higher education as a whole, because it is no longer a common institutional practice. It remains at top-tier universities and colleges as a perk, as something that makes their jobs attractive to desirable employees. Like all the perks and features that make skilled people want to work for Google, or the bonuses that make Goldman Sachs the place to be for an investment banker. As such, I expect some version of it to remain at the institutions which can afford it. If you asked me whether I’d prefer to triple my salary or have a very strong guarantee of continued employment, I’d choose the latter, particularly in this economy.

You can argue against tenure in these terms if you’re against incentives in general. I don’t see too many critics of tenure with a consistent view along those lines. You can argue against it if you think it is a poor incentive for attracting the people that elite institutions should really want, but then you’ll have to tell me who they ought to want instead, why they should want them, and what alternative incentive would attract them. You can argue against it if you think it is more costly (in money or otherwise) than comparable incentives that would work equally well. But in these terms, it’s not particularly different from what other successful businesses and institutions do to retain workers. It’s only different in that much of the U.S. economy has moved to being a buyer’s market rather than a seller’s market as far as labor goes, and most workers don’t have incentives or perks or bonuses; instead they’re treated as disposable units who have no choices or options. Including, unfortunately, most of higher education. It seems to me that the problem here is the general evolution of work and the diminution of middle-class life in the United States, and tenure per se is something of a red herring in that discussion.

I’ve long voiced my skepticism about whether tenure protects academic freedom in terms of the subject matter and methodology of scholarly research. Even with tenure, both administrators and colleagues who are determined to make life difficult for a maverick (or nutcase) researcher can do so. Most scholars are disinclined to be mavericks anyway, both because consensus thinking in many disciplines is fairly reasonable, useful and truthful and because scholarship by its nature is inclined to be incremental and collective. When scholars want to break from consensus, they are often so strongly motivated by their ideas or insights that they would do so whether or not they had tenure. Tenure is also such a powerful incentive that acquiring it tends to suppress rather than reward iconoclastic thinking. I don’t find that tenure makes me feel safer to have new ideas or explore new intellectual terrain. Incidentally, I also don’t feel that it protects me much in the classroom, and maybe that’s a good thing: the classroom is not a space for me to just spout off about whatever impulsive thought has come to mind. (That’s what blogs are for!) Even with tenure, I should always think carefully about how I teach, because that’s what doing that job well involves.

Where I do feel protected by tenure is with regard to institutional policy and action, in the autonomy I have to shape my courses, participate in governance, enforce what I see as due diligence, have opinions about administrative policy. If you look at institutions without tenure, or with very weak tenure protections, it’s clear that this is the domain where faculty need strong security of some kind. When faculty blow the whistle on profligate presidents, refuse to cooperate with corrupt collegiate athletics, disagree strongly with the dictates of administrators or trustees, defend the integrity of their departments or curricula, they are often the targets of direct and sometimes strikingly crude retaliation. When those faculty are contract or adjunct faculty, they often get shown the exit.

Contemporary American universities are decentralized by their very nature, and by and large that decentralization is what allows them to be as excellent and productive as they are. Individual faculty and groups of faculty share very significantly in the management and custodianship of their universities and colleges. When that managerial share declines and administration becomes more centralized and hierarchical, the quality of a university falls in proportion to that shift.

The thing is, this too is a wider issue than the university, because I think most workplaces and institutions have the same issues with excessive hierarchy and centralization, and we’re all paying the price of that development. In general, we need both businesses and civic institutions to have a flatter, more decentralized character, to use networks more effectively to accomplish their work, and to strongly protect the autonomy of skilled people to do what they do best and to speak out clearly about errors and failures where they see them.

When it works, tenure doesn’t just protect faculty whistleblowers, but also motivates faculty to be good custodians of their institutional future. We could use that in every workplace. Both British Petroleum and the United States as a whole would be better off if the workers at Deepwater Horizon had been able to voice their concerns not just to the top of their corporate hierarchy but to all stakeholders and concerned parties, including the public. Bradley Manning is a fucking hero, whatever his motivations: the citizens of the United States, Afghanistan and the world deserved to know all along what Wikileaks has now revealed. Sunlight really is the best disinfectant, and we need strong legal and institutional guarantees that sunlight will shine hard and long into every nook and cranny of our lives. So in this sense, let’s think less of tenure as a narrow privilege of intellectual freedom that belongs uniquely to a single profession and more as one possible way to produce organizations which are productively decentralized, generous in the autonomy they provide to motivated and skilled workers or contributors, and which strongly protect people on the inside who want to talk freely with the outside world about what works and doesn’t work in their organizations.

Posted in Academia | 4 Comments

Camp Grenada

Back from our big summer camping trip, this time in Acadia National Park. Fun, but there was a bit of a curse on this particular expedition. First the valve on our mattress broke and we slept on the hard and pointy rocks that make up the campsites at Blackwoods. (I ended up coveting the soft moss around each campsite as a result.) On childhood backpacking trips, I was used to sleeping on bedrolls, but I have to admit that I’ve gotten accustomed to inflatable mattresses in the years since. We picked up a cheap replacement and duct-taped its also-leaking valve shut.

This turned out to be a trip-saving move, as a huge thunderstorm rolled over the camp that night, dropping what seemed like an inch or so of rain during the evening. Our tent is pretty watertight and I’d pitched it so that water flowed underneath rather than pooling, but we still had quite a bit of water inside. I’d thought to put everything up on our chairs, so only a few towels and whatnot got soaked. Most of the camp cleared out that morning, as most people seemed to have had everything soaked.

Then I agreed to go whale-watching with my daughter. I’ve never been seasick before, but it’s been a while since I was on a boat. For some reason this particular voyage really got to me at both ends of my digestive system, and it was well over a day before I felt halfway human again.

And then it rained heavily again. But we did have some great hikes and several totally beautiful days. Plus this year I found some decent campwood, which made cooking over the fire a much more relaxing experience.



Posted in Domestic Life | Comments Off on Camp Grenada

Extension Tutorials?

On a topic somewhat related to the use of customized writing services: I keep getting emails from a Canadian company that is basically trying to sell itself as a short-term research assistant for hire. I’m not terribly impressed with the company’s web page, and I at least worry that one of the uses to which its services could be put would be students or researchers trying to fake having done work that was actually done by someone else. On the other hand, I can see a lot of legitimate uses that people in some fields might have for a service of this kind: a non-fiction writer needing a solid briefing document on some specialized subject, a novelist looking for background, a researcher who needs a quick run-down of canonical research in an unfamiliar field.

For some reason, this company’s services did make me think about another possible business model that I think really could gain traction in higher education, partly in response to a meeting I had here at the college this week. One of the questions at this meeting was, “Is there some way to use information technology to leverage access to specializations or expertise which we can’t afford to support directly through a tenure-track faculty member?”

That’s a familiar question in higher education these days. The answer at many large universities is depressingly not “Sure, let’s use technology”, but instead, “Let’s just hire some more very poorly paid adjuncts!” In one sense, though, I think that answer has it right. The technology in and of itself can’t really solve that problem at all: it’s only a means for facilitating a solution. The solution involves reorganizing academic labor in some fashion.

Let’s suppose that a smaller or more focused college or university has a student who is hitting the limits of what their institution can provide in one specialized subject area, or a student who is carrying out a culminating research project that extends beyond the competency of the tenure-track faculty but that the faculty nevertheless agree is an exciting and legitimate project.

You can’t ever hire enough faculty to solve that problem, even if you use low-paid adjuncts, which you shouldn’t. So what I was thinking about was some kind of retainer model for a guided but also autodidactic experience. Say, one where the institution contacts a specialist who matches the student’s needs, offers a fee if the specialist designs a directed reading, “meets” with the student using Skype four times or so in a semester, and then is flown out for a face-to-face meeting with the student at the end of the semester.

The problem with that model is that if I were asked for the names of colleagues who could do that work for a student I was advising, my first inclination would be to name the people I know best, who are mostly senior enough that I think they’d have no interest in doing that work even if the fee was generous. The people I’d love to name would be ABDs or just-completed Ph.Ds whom I haven’t met yet who might still be on the job market, or under-employed as adjuncts. The problem is finding them.

So here’s a rough business model for doing something like this: a student with the need for an extension tutorial of this kind is identified by faculty. Faculty advisors have a conversation with the student and come up with a specific description of the field or area of interest. Perhaps two months before the next semester, the institution posts a CFP with this description on the relevant area of its institutional website and distributes it to relevant professional associations. In most cases, you’d limit replies to ABDs or Ph.Ds, but there might be fields or interests where you’d have other qualifications. The institution would list the compensation (say, $3,000 per tutorial, something like that?). Make the tutorial count for general education credit but don’t have it assessed for a grade (just credit/no credit), since the whole point is that this is for highly motivated students who develop strong specific subject interests later in their study which aren’t served by the institution: it’s about developing expertise for the sake of the expertise. The business opportunity would come in finding and vetting a qualified pool of candidates capable of answering these CFPs, in matching candidates to queries, and in setting up the network that connects students to the people offering the tutorials, though I can see ways to do this that cut the middleman out completely. A toy sketch of the matching step follows.
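Just to make that matching step concrete, here’s a minimal sketch in Python. Every name, field tag and figure in it is invented for illustration; it’s a thought experiment about the data the middleman would traffic in, not a real system.

from dataclasses import dataclass, field

@dataclass
class Candidate:
    # A respondent to a tutorial CFP; all fields invented for illustration.
    name: str
    status: str                       # e.g. "ABD" or "PhD"
    fields: set = field(default_factory=set)

@dataclass
class TutorialCFP:
    # What the posting institution circulates.
    institution: str
    description: str
    fields: set = field(default_factory=set)
    fee: int = 3000                   # compensation per tutorial

def match(cfp, candidates, min_overlap=1):
    """Rank vetted candidates by how many of the CFP's field tags they share."""
    scored = [(len(cfp.fields & c.fields), c) for c in candidates]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [c for score, c in scored if score >= min_overlap]

pool = [
    Candidate("A. Scholar", "ABD", {"african history", "media studies"}),
    Candidate("B. Scholar", "PhD", {"early modern europe"}),
]
cfp = TutorialCFP("Small College", "Directed reading on representations of Africa",
                  {"african history", "media studies"})
print([c.name for c in match(cfp, pool)])   # -> ['A. Scholar']

The real work, of course, would be the vetting and the trust relationships, which no data structure supplies.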

Posted in Academia | 5 Comments