Cheats and the Cheaters That Use Them

To foil cheaters, some universities have adopted an impressive range of new technologies, according to a New York Times story this week. All that stuff looks pretty expensive, especially if it isn’t just a showy pretense designed to fool would-be cheaters into thinking someone might be watching.

Now I’d take a slightly different approach, if I were those universities busily installing monitors and watching for pen cameras. In at least some contexts, I’d let students go ahead and cheat.

Let’s break this down a bit. There are two different kinds of cheating at issue in the article. The first is cheating on an exam or problem set in a very large lecture-based course. The other is plagiarized writing submitted for courses of varying sizes.

Why is either kind of cheating a concern? What’s the consequence of cheating to a university or to society as a whole? Primarily, the cost is that universities could end up certifying graduates as having strong or accomplished expertise in particular areas and have that certification mean absolutely nothing as far as actual competence goes. Which you’d think would then lead to a loss of faith in that certification, and hence to a loss of interest in preferentially hiring graduates with such certifications. Secondarily, in the case of plagiarism, we’re also concerned with the theft of someone else’s intellectual property. I suppose you might think that cheating and plagiarism are also moral problems, but I’m not terribly convinced that many big universities care that much about the development of the moral or ethical character of their graduates, nor is it clear that they should, to any great degree. This is first and foremost about protecting the value of the credentials.

So let’s look at the exam situation. What’s the issue if a student cheats on a final exam in an 800-person lecture course and gets an “A” that they didn’t deserve? Let’s say this is a necessary premed intro course in Biology or Chemistry, and the student is a premed. Surely it’s going to show in the next course up the sequence that the student doesn’t have the knowledge they pretended to have? Or maybe the next course after that? Some time before graduation? So all you have to say is, “If you cheat, you might get away with it today, but you’re going to crash and burn next year. We teach this sequentially because you have to learn it sequentially, not because we’re cruel ogres who hate you all.” Right?

Well, here’s the problem. In a lot of Mighty Big Universities these days, it’s possible that the student who cheats in the 1000-person class will cheat in next year’s 800-person class and will cheat in the following year’s 600-person class. And no one will ever really see that student with any specificity or individuality from the first day they enroll to the last day of classes, and most of the people who will teach that student will be graduate students or adjuncts who are handling very large numbers of students for very poor compensation and generally don’t have the time or relationship to the institution to do more than basic due diligence in assessing student performance. In those kinds of universities, the skilled cheater whose primary tests of competence are going to be formal exams could arguably get away with it all the way to the end and into medical school. Where, we all hope, they will in fact finally get caught, though I think we also all know that’s not necessarily the case either.

So the answer isn’t really technology. The answer is pedagogy. It’s ok to deliver introductory courses in lecture format to large groups of students. It’s not ok to not follow that up with much smaller, more face-to-face courses where every single individual student’s daily, continuous facility with the subject matter is made visible to an instructor. It may turn out that staffing that kind of curricular design is more expensive than installing monitors and watermarking note paper and so on. But that’s what really makes the credential worth something, makes it an assurance that every graduate has been observed and assessed by an expert as possessing the competencies that he or she is certified as possessing. If a large introductory biology class in the first year was always followed by a 30-person lab course where the primary means of assessing student performance were not formal exams but instead involved their ability to produce relevant knowledge on an ongoing basis in the lab, you’d spot cheaters more effectively than if you installed every technological safeguard imaginable.

Plagiarized writing is a more complicated issue in certain respects, but it’s still easier to deal with through pedagogy than it is by perfecting Turnitin.com’s algorithms.

First, again, if a student consistently progresses over four years into smaller courses where they receive more individual attention from a professor, usually it’s going to be obvious when there is a considerable gap between the student’s formal written work and their capabilities in discussion or in types of writing that can’t be cut-and-paste plagiarized, such as short response papers, timed writing exercises in class, or hybrid forms like blog entries. If not within a single class, the discrepancy will often become visible across several classes, and is often the first thing that tips a professor off that they need to examine a student’s writing more closely: when their performance in a current course is wildly at odds with their performance in a past class where the kind of writing and subject matter was very similar. (This tends to mean either that something really emotionally or personally difficult is going on in the student’s life, that the professors in each course have wildly divergent standards, or that the student has at some point submitted work not their own.)

Second, the easiest way to beat most plagiarism is to come up with essay prompts that are highly customized to a course, and to individualize or customize your course designs in some respect. I constantly shift my readings mostly to please myself and to chase what I think are the most interesting issues or questions in a particular subject area, but it doesn’t hurt to be using material that’s not commonly used in classes and to craft assignments for which it would be almost impossible to find plagiarizable material. I think most humanists and social scientists can find a way to accomplish this in upper-level courses.

The only thing you can’t beat that way is the student who is willing to pay high prices for a skilled analytic writer to produce a customized product that responds to the prompt. I’m sure there are services like that out there. I’m unfortunately pretty certain that it’s not just students that use them, but some researchers (after all, isn’t that what pharmaceutical companies have been caught doing for some researchers?).

I’m also unfortunately sure that this is an area where it’s a bit harder to say that crime does not pay. I have no trouble saying with a straight face that habitual cheating in subjects where knowledge is concrete, technical and cumulative will eventually have disastrous consequences for the cheater. When it comes to writing or humanistic knowledge? Well, there are more than a few professors who’ve gotten away with plagiarism in the last decade, after all. There are non-fiction authors and journalists who’ve had whole careers stuffed full of fabulism and misrepresentation and never had to pay any price for it. I’ve known people who’ve written reports for governments or think tanks who’ve spotted blatant plagiarism in the work of peers and known that it was pointless to call attention to it.

Here maybe you really do have to fall back on the proposition that education is partly about ethical training, with the main proposition here being, “Don’t develop bad habits that will seriously limit your potential later on.” Not so much the threat that you’ll be caught and punished, revealed and ruined, but that in more subtle ways, you’re willingly taking on the role of being a bottom-feeder and a second-rater at the age of 21, when it’s a bit early to angle for that as your place in life. There’s a human limit for living with being Christian de Neuvillette, no matter how many Cyrano-for-hires you may rely upon in the interim.


Islands in the Mire

Try to imagine, if you will, a person buried alive, growing desperately weak, fighting and struggling…

to reach up through the dirt in order to grab some nearby stones to pile on top of his grave.

That’s what watching the contemporary scene of mainstream journalism is like these days. Every day there’s some new bit of incremental suicide. Just take the last week. We had mainstream reporters taken by surprise because a reporter used access to a source in order to report the news rather than as an end in itself. I especially loved the TV journalist midway through this Jon Stewart segment who scolds Michael Hastings, saying that he won’t be given access to the same sources in the future. We had writers like Caitlin Flanagan continuing to write out their own mental hang-ups as if they were peer-reviewed social science and being given significant soapboxes by major press organizations to do so.

And we had David Weigel, resigning because in a listserv he expressed opinions about the people he covered.

Nobody wants the press to be objective. Demands for objectivity are the same as diving to the ground clutching your ankle in the World Cup: they’re meant to play the referee into calling a foul (or refusing to call one). What most readers want is a press that cares about the truth. This is not the same thing as objectivity. Most of the time it’s exactly the opposite, particularly when the truth is subtle, contradictory or ambiguous, as it often is. If I came across a reporter covering American politics who did not have pungent, strong opinions about the personal character of most of the people he was covering, I’d assume he was a dullard, incompetent, a zombie or an alien. Or some mixture of those.

A reporter who is interested in the truth is going to need to have the mental discipline to see issues from multiple angles, and to have a lack of ego when it counts, to have the ability to detach from their own customary or comfortable perspective. That also is not the same thing as objectivity, and it most certainly is not the lack of a customary perspective. Journalism is not compiled at the Vulcan Science Academy.

What reporters need is skill with the getting and processing of information and the ability to communicate what they’ve learned clearly, vividly and concisely. As far as I can see, that describes Weigel pretty well. But his bosses decided to act as if they were a bunch of Victorian spinsters getting the vapors at the sight of a man’s naked ankle because of his public comments, when they’ve very evidently tolerated (even welcomed) reportage and commentary with vastly more contempt for truthful inquiry, simply because its author didn’t slip up and reveal his or her axe-grinding enslavement to some spinner’s brand in a public comment.

Right, I know, here comes the usual recital of past Pulitzer Prize-winning investigative reporting that we’re supposed to be so eternally grateful for that we imagine that those few islands of dedication justify an endless swampy expanse of hackery. No matter how pretty those verdant outposts are, they can’t hide the stench of decay from the surroundings.


The Gods That Underachieved

The opening anecdote of this New York Times article on cyber-bullying led me to think about some bigger issues than what middle schoolers are doing on Facebook.

The striking thing about the incident that opens the article is that the parents of the girl who received the abusive, sexualized messages don’t contact the parents of the boy who allegedly sent those messages. Why not? Because the two fathers coach in the same league. So they want the school to do for them what they cannot do for themselves, to act as a quasi-judicial institution, to police what its students do in their everyday life.

I wouldn’t be at all surprised if the same parents, in other contexts, complained about the size or pervasiveness of government. The peculiarity of our time is that people all around the planet know that the high modernist state failed to live up to its promises of the perfected management of human societies through technocracy. Those who live in democratic societies know that their progress towards being fairer, more just or more open is at best stalled. Those who live in authoritarian societies must increasingly wonder at whether change can ever come, as the one area where the capacity of the contemporary nation-state continues to show improvement is its ability to mobilize violence against its own citizens and to manage their dissent.

But that 20th Century story is like a phantom limb: if we gaze directly at it, we know it is gone, but distracted by other sights, we feel it still there. We still imagine that the technocratic state, bursting with capacity and always-improving expertise, can do something, anything, to fix both huge and intimate problems.

Scoundrels like Louisiana governor Bobby Jindal (or indeed, most of his political party) try to play this expectation for all it is worth, simultaneously angrily calling for government to use its missing limb, saying that their own local political body has all the muscle that the bigger government now lacks, and claiming that they’re glad that the nation-state is an amputee and they wouldn’t have it any other way. It’s like a bedtime story where the prospect of being transported to fairyland is set up simultaneously as magically possible, a wistfully recalled but long-gone nostalgic artifact of childhood, and disparaged as a foolish distraction from the real business of life.

We expect the government to have prevented the spill in the Gulf or to be able to stop it somehow, and yet we also expect that it be the complicit cause of the spill and incapable of fixes–and all the while, think that our own expectations for energy availability in our homes and work have nothing to do with the problem. The reality of limited state power and capacity as a permanent fixture of the foreseeable future never seems to take hold. Our unconscious expectations haven’t changed.

What this leaves behind is an evacuated sense of the social. I’m as uneasy as ever about the view of Putnam-style communalists that the answer to this problem is to revivify the sociality of the 19th or 20th Centuries, sometimes because they imagine that the state would be an important force in making that happen (which seems to me to be another flexing of the phantom limb), and sometimes because they overestimate the counterposing muscularity of the social to solve the problems that the technocratic state cannot solve. Take the sports-coaching father, reluctant to confront his colleague. Put the onus back on him to do so, comprehensively refuse him access to any juridical or institutional solution, and there would still be good reasons for that father to just let it slide, for the same reason that most of us don’t confront our neighbors or our co-workers or strangers on the road about some legitimate gripes, about problems that really could stand to be solved. The social isn’t the antidote for utopian expectations of technocracy any more than herbal tea is a substitute for brain surgery.

What I’m left with in some cases is a kind of queasy mix of postmodernism and pragmatism: that all instruments are limited and all solutions are partial. That what we really need is a political and social language for describing better outcomes, incremental improvements, and probable solutions for the significant majority of what we expect both from our governments and from ourselves as social actors. I’m not saying that utopian expectations or radical demands for reform or dramatic mobilizations for action need to disappear entirely from the picture, but that we need a way to sharply delineate the circumstances under which that way of thinking is helpful. Right now, we all have a tendency to fall back on the assumption that technocracy can do anything (even that it can instrumentally prevent good solutions from being implemented) and that expertise is the product of a positivist science marching inevitably towards perfected knowledge of everything.


It’s Complicated

“It’s complicated”. Jon Stewart tweaked Obama last week for repeatedly framing his approach to issues with this phrase, and he’s not the first to do so. That complaint was very likely the reason why Obama felt the need to awkwardly declaim that the reason he consults with experts is that he’s trying to make sure that he kicks the right asses.

Since many issues are in fact complicated, it’s hard for me to say that it’s wrong to say that they are.

Stewart suggests that Obama could just simplify those complicated issues. Fair enough. More to the point, though, I think what you can do is simplify the reasons why something’s complicated, because not all complexity is created equal and not all complexity demands an equally complicated political or personal response.

Policies and issues are complicated because:

1) Complication: the political and institutional process by which decisions are reached and implemented is very multilayered and decentralized, maybe to the point that it’s not even clear who actually would be in charge of implementing a policy. Simplification: clearly describe to your publics what you want to do, delegate someone to get it done, and make it clear that somebody’s getting fired and/or agencies are getting shut down if it doesn’t happen. (If that’s not just a posture, then start setting up a quiet process of internal organizational reform intended to improve implementation of decisions.)

2) Complication: the political process of implementing a decision involves exquisitely intricate compromises that are so carefully calculated that they would collapse if they were spoken of directly. Simplification: in a deal of this kind, all the parties want to come to an agreement, but generally need to be seen to be delivering contradictory results to their divergent constituencies. So here I think the simple thing is that someone in a leadership position has to be willing to scuttle the deal if the participants are so sensitive that they’re keeping it from happening, and to let everyone understand that. (The barb here might be to make it privately clear that you’re going to pick someone as the fall guy if that happens, with attendant pressures to follow.) But yes, this kind of complication ties your hands when it comes to simplifying things for a wider public. That’s just what happens with some decisions in life. Happens all the time in family or personal life, for example, where getting everyone on the same page means not talking too explicitly about what’s going on.

3) Complication: the technical issues involved in a policy problem are authentically difficult; it’s not clear that there is a viable technical solution to the problem as it stands. Simplification: be really clear about the moral, ethical or social stakes involved in the issue. Be equally clear that you’re really screwed on the technical front. Technical complexity doesn’t need to occlude moral or ethical commitments. And sometimes you really should just get up and say, “Look, for the moment, we’re just fucked, ok?”

4) Complication: the moral or ethical issues involved are authentically ambiguous. E.g., this is the kind of thing that adults say to each other when they have desires they’d rather not have, take actions they wish they hadn’t taken, have feelings that they can’t resolve, can’t decide whether to rob Peter or pay Paul, or know full well that no matter what they do, a good person is going to get screwed or cheated. Simplification: there isn’t any beyond brutal honesty. This is the point where you say to your publics, “It’s time to be grown-ups together and stop asking Daddy to buy you the pretty bauble in the window and make all the bad monsters magically go away.”

So which of these applies to the current situation with the oil spill, or to health care? 1 and 2, certainly. With the spill, 3). I’m not sure 3) is much involved in health care: I think it’s an alibi that lets policy-makers and stakeholders pretend that they’re still researching and have yet to find the correct technical answer.

4)? Not much involved in the oil spill at all, unless you really feel that the answer to energy dependency really is “drill baby drill”. In which case, sure, get up and say, “Look, grow up, this is the eggs breaking that are needed to make the energy omelet”. We’ve had plenty of Serious People telling us that you need torture and assassination and expensive invasions and violations of the Constitution to fight terrorists, so why not some Serious People telling us that obtaining oil from our own waters or land is worth any negative consequences. Is 4) involved in health care? Maybe, if you think that our decisions in that arena will always and inevitably amount to some kind of rationing of a scarce resource.


Huge Untapped Natural Resources

Many moons ago, in my first teaching gig at a New England prep school’s summer session, I was responsible for a unit on Africa. I poked around in the school’s library and found an old educational film intended for American kids that had been made in the early 1960s.

It was a quintessential bit of Cold War geopolitics. The film’s basic presentation of contemporary Africa went something like this:

1. Africa has huge untapped natural resources that will benefit American industries!
2. There are many new nations in Africa that have just been freed from British and French imperialism. They are full of hope! We are their friends!
3. But: watch out for Commies.
4. There are huge untapped natural resources!
5. There are dams and railroads and a few skyscrapers in Africa, all of them brand-new! Also: people there have many age-old traditions which will fade as there are more dams and skyscrapers and cars.
6. There are huge untapped natural resources!

The narrator’s voice as he said “huge untapped natural resources” (at least six times in a ten-minute film) was so full of unironic lust and desire that the students in my class were howling with laughter by about the third time he said it.

This particular trope applied to Africa may have felt new to American audiences, but it certainly went back to the mid-19th Century and the origins of modern European imperialism on the continent. Africa’s untapped resources weren’t always seen as mineral or natural: in late 19th Century British politics, Joseph Chamberlain made a lot out of the idea that future African workers would be important consumers of British manufactured goods.

Still, I think the most interesting or characteristic working out of the trope during the imperial era of African history came in Frederick Lugard’s Dual Mandate. I’m reminded of Lugard’s tortured and contradictory thinking about the question of Africa’s material wealth and its relationship to imperial rule because it very much echoes, in depressing and ominous ways, the manner in which Afghanistan’s resource wealth has been suddenly “discovered” by the New York Times at a moment of political and managerial crisis for American military power in Afghanistan.

Lugard was writing in part to justify the continuation of British imperial rule in Africa to a public that sometimes viewed Africa as a secondary or burdensome commitment (as opposed to India). He was also writing to try and solidify what he considered to be the orthodox managerial strategy for British officials in Africa, a sort of “operator’s manual” for future bureaucrats.

Lugard’s central contradictory argument went something like this:

1. The British Empire rules in Africa primarily as part of a “civilizing mission”. Its presence is substantially for the benefit of Africans themselves. British rule is intended to gently modify African institutions, society and culture so that Africans can enter the modern world without changing their essential beliefs, culture or identity. This process is expected to take a long time. Africans who already act like modern, liberal individuals, who demand political and civil rights, who are urban, cosmopolitan and ‘detribalized’ are inauthentic, un-African individuals who are the unfortunate result of missionary education or other intrusions by alien institutions, and should be ignored or contained.
2. The British Empire rules in Africa primarily for the benefit of Great Britain itself. African societies were incapable of making proper use of their enormous material wealth or labor power and because of their violent, disorganized character, posed a serious threat to the security of any effort to develop that wealth or use that labor for the good of the global economy. The British Empire will maintain the peace and organize economic enterprise in Africa through firm imperial control, which will in the long run benefit Africans themselves.

Some scholars, most particularly Mahmood Mamdani, see in Lugard’s formulation a totally instrumental kind of conscious contradiction that strengthened imperial power, as opposed to mere cynicism or a more accidental, incoherent kind of contradiction. I tend towards the latter interpretation: I read Lugard as switching helplessly between the humanitarian and nationalist explanations (and between the internal contradictions that attend on both logics of imperial rule).

Anyway, it seems pertinent today because both of these ideas are raging like a forest fire through American policy at the moment, in ways that are almost direct reproductions of Lugard’s rhetoric. “We’re here to help reconstruct Afghanistan’s government and culture so that it can be a productive part of the global system!” and “Hey, there’s a lot of lithium in them thar hills, and they’re not really mining it properly by themselves! This is how we’ll get rewarded in the end for spending blood and treasure now.”

If you take Mamdani’s view, this contradiction is entirely to be expected, a revelation that American intervention is and has always been imperial in character. I also think it reveals that there’s an imperial element to Afghanistan and Iraq, but one that is as confusing to American policymakers and military officials as I think it was to Lugard. The impulses that drive the decision to intervene aren’t the same as the forces that shape the management of an intervention. Since I tend to think there’s something to the proposition that modern European imperialism involved the proverbial “fit of absent-mindedness”, I tend to think that Lugard was more or less throwing every justification he could think of at the wall to see what would stick.

As are American policymakers. Because every action of this kind, whatever its initial justifications, creates a clientele of officials, consultants, experts, lobbyists, politicians, demagogues and so on who become dependent on its continued existence. So when there’s a political threat to the ongoing operation, they throw everything they have at the wall to see what sticks. And the “wall” in this case is the New York Times or whatever other publication or reporter they can get to compliantly print leaks or briefings without asking inconvenient questions. Say, in this case, questions like “Um, haven’t we known about these resources for at least three years? Or maybe more like twenty years?”


iFlit? kk: uBore.

One thing you can say for the first wave of blogs: ubiquitous self-publishing was an unintended cure for the tendency of editors or publishers in old-media publications to seize the microphone for their own indulgence. A hundred thousand commenters up on their soapboxes has helped to expose insipid or banal old-media editorializing for what it is, and raised the bar for what counts as a stylistically distinctive column or an original perspective.

At least, that’s what I keep hoping will happen. Waiting for old-media outlets to use the current information culture to find fresh voices and gifted stylists is a bit like watching a drowning man scorn a bunch of float cushions in order to clutch at an anvil. The major newspapers stick to columnists who comfortingly echo the old op-ed culture of forty years ago (or who were writers in the op-ed culture of forty years ago): the arguments, the analysis, the riffs are all as predictable and narrow as Marmaduke or The Family Circus on the comics pages. It’s like deciding to hop on a Viking’s funeral boat while thinking it’s a cruise ship bound for the Caribbean.

I know: speaking of tired riffs, this is one of mine. But I was set off this morning listening to a commentary by the local NPR station’s commentator, who was hired last year. Is he a fresh new voice? No, he’s a former columnist and editorial page editor for the Philadelphia Inquirer named Chris Satullo. Does he have a stylistically distinctive voice or some interesting new angles on public debates? No, almost all of his opinions are generic and tired. Occasionally on local or state politics he’s able to leverage what he knows about the players and the issues into something marginally more interesting. Otherwise, though, it just drives me nuts that a media organization which is dealing with a changing information landscape decides that the best way to respond is by putting the aural equivalent of lukewarm Wheatena on the airwaves. Maybe that’s what the average NPR listener likes, I suppose, but that’s the Marmaduke Syndrome in action: stick with your shrinking audience of smugly complacent white suburbanites right until you cross the event horizon into an endless void.

Satullo’s latest commentary on information technology is, I suppose, meant to anticipate this objection by arguing that the old way of doing things was better. It’s certainly yawn-inducing as an opinion coming from someone with more than two decades of newspaper experience: reading it is like spotting the 3,545th cat in Katzenstein holding up the tail of the cat in front of him (for those of you who read your Dr. Seuss). Whether you want it in short or long form, you can find pretty much the same opinion from a whole host of old-media defenders, ad nauseam.

As an accidental (or so I assume) self-parody, on the other hand, the column has some potential. Satullo takes note of new research suggesting that multi-taskers are less efficient at completing a job and end up knowing less than people who focus tightly on a single task or learning one thing at a time. That research has struck me as credible. Satullo then argues that this is a regression back to the early stages of human evolution, leaving behind some kind of cognitive progress we achieved in between one million years ago and the first email being sent. Well, I’m sure there’s some bowdlerized bit of evo-psych out there that invites that jump. But this is the very problem that Satullo complains about and tries to pin on current information technology, the careless use of poorly digested factoids to flatter your existing prejudices.

The alternative isn’t to get a Ph.D in cognitive science and insist that radio commentaries read like monographs. The alternative is to ask more unsettling kinds of skeptical questions that require no special knowledge to compose but may require a bit of investigation to answer.

Is it true, for example, that the American media environment of 1960 or 1980 was more truthful, that lies in the public sphere were pervasively exposed by journalists and experts? Would we be able to trust experts and journalists to selflessly serve the public interest now if only it were not for Wikipedia, Google and all that digital rot subverting the establishment? Really? You really want to say that there were fewer lies in public life in 1955 or 1975? That when we let experts go about their business undogged by bloggers and online rumors, human progress marched on unimpeded?

Or if you really want to go into more interesting territory, what is knowledge, anyway? What made us more knowledgeable as a society, a world, a species? When did that happen and why? What has come of being more knowledgeable, and what do we stand to lose? Those questions don’t take you back to the East African savannah a million years ago: they take you back to the late 19th Century, or to the late 18th Century, or to the Renaissance, or to medieval monasteries, or other more intricate destinations that don’t allow for dully banal answers.

If you’ve got a bit of guts, tell me what’s going to happen in the near-term future if the digital age goes on as it has so far. Make some predictions, put yourself on the line. Past media hysterics have a pretty low batting average when it comes to doomsaying. Violence on TV was supposedly making future generations of Americans progressively more and more violent, but when heavy TV viewers came of age, violent crime rates started dropping and continued to drop even as the content of television became more violent.

Another original direction: tell me how you think things could be different than they are. If you’re just going to complain about the follies of your fellow humans, then the stylistic bar is really set high: you better be Mark Twain or H.L. Mencken or at the least, Molly Ivins. Because the only thing that kind of complaint could possibly have going for it is to be entertaining. Otherwise, if you’re serious about the criticism? Show me the roadmap to a different future. Don’t end with “remains to be seen” or anything remotely like it. Which is exactly what Satullo does in this commentary: he says, “we’re maddening creatures” and concludes that Steve Jobs can’t change that. So what’s the problem? Sit back, relax. Wait, we need to take care that our gadgets don’t drive us “even more mad”? Why? I thought we’re safely maddening. Ok, so how do we “take care”? Is just listening to our designated daily pearl-clutching from a Respected Columnist sufficient for “taking care”? No doubt, for Satullo.

Some years ago, Thomas Geoghegan wrote a book called Which Side Are You On? Trying to Be for Labor When It’s Flat on Its Back that I quite liked. I find myself feeling the same way about mainstream journalism. There’s so much value embedded within its traditions and so much potentiality bottled up inside of it. Which is why it is so infuriating to see those traditions so ill-served and that value so poorly developed.

Posted in Blogging, Cleaning Out the Augean Stables, Information Technology and Information Literacy | 8 Comments

Bench Pressing

Here in the heart of my middle age, I keep thinking about living in suburbia. I remember as a college student being sure that’s what I didn’t want to do, but now I have to admit that I find it largely satisfying. That is substantially just the lived difference between being a college student and being a middle-aged man. But I’m also mindful of how I was prompted, as someone who aspired to be an intellectual, to think that the suburbs were dead and sterile, the cities were cultural and cutting-edge, and the rural was a sort of last bastion of authenticity (but also supposedly boring). There was a whole subgenre of films and novels about the repellent and artificial character of suburban life that flourished between the 1960s and 1980s. It’s a trope which I think has declined in influence but which still has some punch to it.

It’s a densely historical and layered vision, and it’s more than just a weapon in a long-running culture war. The suburbs didn’t happen by accident and they aren’t a pure expression of the natural market desires of postwar Americans. But they aren’t the spiritually and culturally vacated wasteland of The Ice Storm or American Beauty, either. Or maybe that’s just what the Fred Flintstones and Harry Angstroms think when they look about, mere heartbeats before thwarted desire or disease or failure come for a visit?

So at least I suppose those who write fiction from the peak of Mount Olympus may think when they gaze down on the tract-house mortals in their folly. But Saint Steven of the Extra-Terrestrial is there to bless the Big Wheels and Habitrails of yesteryear, so all may be well.

Me? I dunno, my cup doesn’t quite runneth over, but it’s pretty full. Yesterday, before the rains fell, I prowled around my little half-acre world and was happy.

And hey, the bench I’ve been working on is coming along pretty well.

Posted in Domestic Life | 18 Comments

Double Consciousness of Double Standards

Ah, the African Renaissance. Can you feel those winds of change?


(photo by Chris Nevins)

Feels more like a boat becalmed in the middle of the Sargasso Sea with no breeze in sight. Statues that charmingly invoke North Korean aesthetics? Check. The absurdity of the dictatorial rulers of Equatorial Guinea sponsoring a UNESCO award intended to celebrate inquiry in the life sciences? Check. Why stop there? How about the Robert Mugabe Prize for Investigative Journalism? The Charles Taylor Prize for Peace, if the investigators ever manage to find his hidden money? It’s been a while since we had an African head of state crown himself emperor, so maybe we’re about due for that too.

At a deeper level, the statue in Senegal is a fairly good symbol for the key concepts behind the African Renaissance as described by Thabo Mbeki, Olusegun Obasanjo, and Yoweri Museveni. All their talk of the need for African solutions to African problems, and the accompanying concession that African problems are significantly due to African actors, hasn’t led to any lessening of the veneration of the state as the single vector for delivering whatever reforms might be needed. Inasmuch as Mbeki and his peers have ever addressed civil society, social movements, individual rights, or cultural habitus, they’ve tended to assume that the goal of the African Renaissance is to subsume society within the state, that reform is marked by the more perfect unification of government and people. The idea that society is fundamentally different from the state, or that reform might involve the constraint or limiting of state power, doesn’t enter the picture.

It’s a pretty short and entirely coherent step from there to the ugly monumentality of the African Renaissance Monument.

To be honest, though, if I leave aside those deeper arguments about the nature of state power and the intractability of the nationalist imagination in postcolonial Africa, the other feeling I can’t help but have when I read about the monument or the Obiang Prize is that so much of the official action of African governmental representatives on the global stage grates because it appears so amateurish, marked by what Achille Mbembe has tried to describe as “the banality of power and the aesthetics of vulgarity” (though Mbembe has bigger theoretical fish to fry than I do in this more humble reaction). I’ve previously been struck by the same feeling about some of the fumbling managerialism that’s swept through South African universities recently. I hate this kind of managerialism everywhere, but it feels even worse when it’s got that derivative, hack-job aura about it.

The problem is that I’m only too aware that this is a perspective prompted by how and when African government action receives coverage in the international press. Smart, effective, technocratically assured government, or responsible engagement of local communities by national representatives, doesn’t get any ink. And there are plenty of examples of stories that could be covered that would fit that description without any need to be a nationalist cheerleader.

Moreover, while I know I try hard for personal consistency, it’s also true that African actions that get mocked or criticized are often precisely the same activities that occasion little comment or objection when you find them in the history or contemporary affairs of Western societies. Prizes created by corrupt, wealthy individuals that seem to bear little resemblance to their founders’ actual ethics during their lifetimes? If you’re going to freak out about the Obiang Prize, I suppose you ought to freak out about the Rhodes Scholarship or the Nobel Prize. Or if you feel, as I guess I do, that once some plutocrat or dictator lets his gold slip from his grip, it’s free to do good things in the world, maybe that puts the Obiang Prize in some perspective. (Though there’s a meaningful difference in governance between an independent foundation that gives awards from an endowment and a UNESCO committee.) If you think an ugly statue that invokes the worst of socialist realism in a country with better or more urgent uses for funding is a problem, then there’s a lot of monumental work across the world that ought to offend. Shit, the United States carved up a fucking mountain with the heads of its presidents: why doesn’t that strike us as being just as ridiculous as all those Lenin statues that ended up in junkyards, or Mr. Hero-of-the-African-Renaissance with his adoring wife a half-step behind him?

Maybe it’s enough to say: because Lincoln et al. were actually good leaders. Or that Cecil Rhodes wasn’t as odious as Teodoro Obiang Nguema Mbasogo, although that strikes me as quibbling about which circle of hell we’re peering at. More, I think, it comes down to the luxury of history: those things that the West has venerated with monumentality or glossed with pleasantries don’t seem as provocatively vulgar or amateurish just because there’s a patina of time lending them a respectable sheen. Postcolonial African governments act in the intense but careless scrutiny of the now, and are punished, always, for somehow failing to leapfrog into a future most of us ardently envision. Even when you know all the reasons why you shouldn’t have double standards, and should either slather scorn around with abandon or look tolerantly upon all follies aware that this too shall pass, it’s hard to live up to that knowledge. I can’t even be certain that I should try to.

Posted in Africa | 3 Comments

Big-Tent Problems

One of the best experiences I’ve had in my career to date has been participating in a relatively informal group that meets irregularly at Bryn Mawr to talk about complex systems, emergence, and information theory among other topics. I’ve had a hard time making the meetings more recently due to my teaching schedule, but I still really cherish this group of people and the kind of conversations they’ve nurtured, and the entire spirit of the group. I think it’s not an accident that the group is informal and only subsidized indirectly by administrative funding. This is one of the points that I find myself making on a cyclical basis to foundation officers who want to help higher education change some of its practices of assessment or to embrace new models for organizing curricula and research. Frequently, the harder you try to make change happen, and the more formal your funding and structuring of such promotional efforts are, the less interesting and effective the results. If there isn’t some group of people already trying to do things differently, you can’t make it happen just with money.

I was musing a bit about the conversations that this group tends to get into, which have tended over time to circle back to some of the same themes and disagreements, as is to be expected any time people are involved in discussion over the long term. So here’s one thing I was thinking about: what intellectual issues and questions by their nature require discussion between a very heterogeneous group of disciplines and intellects for innovative solutions or some kind of forward motion to emerge?

Almost any problem or question could probably benefit from having more than one perspective or angle devoted to it, but for many academic questions or policy problems, the natural range of useful contributions ought to be fairly narrow. Clearly certain kinds of novel thinking about how to plug the oil well in the Gulf of Mexico are desperately needed, but I don’t think a conference room full of poets, evolutionary biologists, linguists and political scientists would have much to contribute to the immediate technical question of how to stop the oil from leaking or what the best interim strategy is for cleaning up or mitigating its damage. (Yes, they could help us understand the event, interpret its consequences, or talk about how political institutions should deal with such issues.)

Also, all disciplines need help from outside their own community to answer the question of why they should study what they study. No discipline can answer the question “so what?” self-sufficiently. But this is a different kind of issue. I’m focused here just on intellectual and applied problems where heterogeneity in methods, bodies of knowledge and perspective are a requirement for progress. A few examples, and I’d be glad to hear of more along these lines:

1) SETI.

Paul Davies’ The Eerie Silence is a persuasive critique of the intellectual and programmatic shortcomings of SETI to date. Davies points out that SETI investigations have embedded a whole bunch of anthropocentric assumptions about information, communication, technology and evolution inside their efforts to pick up signs of intelligent life elsewhere in the universe. Against this critique, the conventional defense of SETI (that we haven’t listened to more than a minuscule fraction of star systems, using more than a minuscule fraction of the spectrum in which communication might be broadcast) seems pretty weak.

I don’t think Davies really goes far enough in trying to open up the issues involved, however–and I think part of that is the desire to keep the questions safely knowable through science, however speculative. That’s characteristic of these types of problems that need heterogeneity of some kind: the current “owners” of the issue are reluctant to turn the speculative dials all the way up to 11 because that seems likely to result in lines of investigation or discussion which aren’t useful or productive. For example, Davies raises old questions about SETI’s teleological understanding of technological evolution, which pop up in the last two parameters of the classic form of the Drake equation. As Davies observes, this is one of many places where SETI has been too intellectually narrow in its focus, too inclined to look in the mirror of a certain kind of mid-20th Century modernist-rationalist vision of human history and to write that across the stars.

But even Davies is reluctant to really open up those questions. He points out that we can rethink questions about the origin of life simply by asking whether there were multiple abiogenesis events on Earth itself, and thus “alien” lifeforms already within our biosphere. Ok, but similarly you can open up questions about what information and communication are, what intelligence is, what the directionality (if any) of the culture of intelligent beings is over time, and about what the multiple contingencies and accidents of technological history might be. Again, just using Earth and its history. To do that, you’d need historians, anthropologists, linguists, computer scientists, philosophers, translators, cultural critics, and economists in the room, and not just the usual suspects or people who are already inclined to buy into the conventional embedded narratives that lead people to SETI. If that conversation is to be at all useful at opening up the problem of alien intelligence and whether or how it might signal its existence, it has to explore the whole of the possibility space without rushing to “things we can plausibly detect or investigate”.

2) Artificial intelligence.

Here I think the lesson’s already been learnt. Go back to the postwar beginnings of AI research and look at blithe pronouncements by Minsky, Simon and other early scholars in the field about how human-equivalent AI would be relatively easy to create. It’s easier to convince people that fresh approaches and unsettling questions are necessary when they’ve hit a brick wall or when they’ve had to eat a few helpings of humble pie. This is not to say that the biggest possible tent for AI research is now buzzing with happy, collaborative discussions: there are a lot of long-standing epistemological and methodological rivalries, and it can be very hard to get some of those constituencies into a mutual and exploratory discussion. But when you look over the whole field as a noncombatant, it’s hard not to be impressed by the presence of multiple disciplines and perspectives.

3) Economic development.

It is really hard to create opportunities for people who want to raise skeptical or critical questions about the normative backdrop of development work or policy-oriented studies of development to be in the same room as policy makers, NGO administrators, or scholars who work within those normative boundaries. (Skeptics who have appropriate professional or disciplinary backgrounds, like William Easterly, get heard, but that’s about the limit of what’s seen as intelligible.) On the other hand, many theoretical, historical or ethnographic studies of development projects are remote from a whole range of pragmatic or lived choices and challenges. I think it’s past time to wheel the whole apparatus of development and aid into the workshed for some fundamental redesigns, the kind of reworking where every assumption and idea receives a mandatory dose of heavy skeptical review that ranges from basic points of philosophy and ethics to finely calibrated technical questions.

4) Education

Even more than development, education is a domain of study and policy where most of the main stakeholders need a time out, to go and work for a bit in a clean room with some unaccustomed partners and novel artifacts and resources. Maybe more than development, there is a substantial amount of existing heterogeneity to work with, even within education studies itself–but if an open conversation about development might consist of introducing several hermetically sealed groups to one another, when it comes to education, different groups and approaches are all too conscious of each other’s existence. I know that the stakes are very high, and the antagonisms between various players are really deep-seated, so this may not be a realistic hope.

5) Cultural creation.

I feel like there’s a lot that people who study or interpret expressive culture could add in a discussion of how to create culture, but also vice-versa. This is always the hope in literature departments that combine working writers or translators with literary critics, or art departments that combine studio artists with art historians and critics. Sometimes it works, sometimes it doesn’t–but I think the conversation could be even more productive with a much wider range of participants, including some researchers and intellectuals who wouldn’t even necessarily see themselves as directly doing work that has implications for producing expressive culture.

Posted in Academia | 4 Comments

Imaginary Menaces

The dedicated Kool-Aid drinkers who brought us the entertaining proposition that risks should be made public and any profits that derived from public investment should be made relentlessly private have left a number of curious inversions in their wake. Witness the odd rhetorical twist in the opening of today’s New York Times story about AT&T’s decision to introduce pricing tiers affecting data usage by cellphone owners.

The tiers seem like a perfectly reasonable business move to me. You have a small number of customers (primarily iPhone users) who are making heavy demands on your network and costing you more than they’re worth. You adjust your pricing so that the most expensive users pay a fee that makes them profitable. For many users, their fees will actually go down, as they will opt for and stay under the 2-gigabyte limit of the middle tier plan.
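To make the arithmetic concrete, here’s a minimal sketch of the tier logic. The prices are my assumption based on what AT&T announced at the time, not figures from the article: a $15/200 MB tier with $15 per additional 200 MB, a $25/2 GB tier with $10 per additional gigabyte, against the old $30 unlimited plan.

```python
import math

# Assumed June 2010 AT&T prices (not from the article):
#   DataPlus: $15 for 200 MB, then $15 per extra 200 MB block
#   DataPro:  $25 for 2 GB, then $10 per extra 1 GB block
#   The grandfathered unlimited plan was $30.

def monthly_cost(plan: str, usage_mb: float) -> float:
    """Return the monthly bill in dollars for a given plan and usage."""
    if plan == "unlimited":
        return 30.0
    if plan == "dataplus":
        # Each started block of 200 MB over the allowance costs $15.
        overage_blocks = max(0, math.ceil((usage_mb - 200) / 200))
        return 15.0 + 15.0 * overage_blocks
    if plan == "datapro":
        # Each started gigabyte over the 2 GB allowance costs $10.
        overage_gb = max(0, math.ceil((usage_mb - 2048) / 1024))
        return 25.0 + 10.0 * overage_gb
    raise ValueError(f"unknown plan: {plan}")

def cheapest_plan(usage_mb: float) -> str:
    """Pick the cheapest plan for a given monthly usage."""
    return min(("dataplus", "datapro", "unlimited"),
               key=lambda p: monthly_cost(p, usage_mb))
```

On those assumed numbers, a light user at 150 MB pays $15 instead of $30, a typical user under 2 GB pays $25, and only the heavy user who blows past 2 GB has any reason to mourn the unlimited plan.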

But the article starts by describing and sneering at “data hogs” as if they were a public problem, using classic tragedy-of-the-commons flourishes. The only reason to think about this as a problem of public goods has to do with how we allocate and sell off spectrum. But if you had an issue with that, it wouldn’t be with what customers do, but with the entire idea of selling the public interest in spectrum to private companies. Once we’ve decided to handle it that way, what AT&T does to extract profit from the spectrum it holds rights to is its problem, subject to regulation. If AT&T decides to offer unlimited usage plans in order to draw iPhone customers, ok. If AT&T finds out that eats into their profits, ok. That’s AT&T’s problem.

What a few customers do under an unlimited usage plan is, well, use what they’ve been offered. If I go to an all-you-can-eat restaurant and Orson Welles shows up and proceeds to eat an entire cow’s worth of beef, it’s not Orson Welles’ problem, nor is it the problem of other customers, and it especially is not the problem of society at large. (Yes, ok, obesity itself may pose an issue in terms of public goods, but that’s a much bigger question. Just as “open access to online information” is a much bigger question than this data plan’s pricing.) Would you run a story in the business section on the problem of people who eat too much at all-you-can-eat restaurants, sighing with relief that at last they’re being surcharged and society is saved? The restaurant makes that offer gambling that it will draw people who will eat at best modestly more than they would have otherwise. If they calculate wrongly, they’ll have to change their pricing or their market image or their food quality or something. Or they’ll go out of business. Which is also their problem.

Posted in Information Technology and Information Literacy, Politics | 2 Comments