On the Arrival of Rough Beasts

One of the things I find most interesting about the history of advertising is the long-running conflict between the “creatives” and their more quantitative, data-driven opponents within ad agencies. It’s a widespread opposition between a more humanistic, intuitive, interpretative style of decision-making and professional practice and a more rules-driven, empirical, formalistic approach.

The methodical researchers are almost always going to have to create advertisements and construct marketing campaigns by looking at the recent past and assuming that the near-term future will be the same. In an odd way, I think their practices have been the analog equivalent to much of the algorithmic operations of digital culture, trained through the methodical tracking of observable behavior and the collection of very large amounts of sociological data. If you know enough about what people in particular social structures have done in response to similar opportunities, stimuli or messages, the idea goes, you’ll know what they will do the next time.

My natural sympathies, however, are with the creatives. The creatives are able to do two things that the social science-driven researchers can’t. They can see the presence of change, novelty and possibility, even from very fragmentary or implied signs. And they can produce change, novelty and possibility. The creatives understand how meaning works, and how to make meaning. They’re much more fallible than the researchers: they can miss a clue or become intoxicated with a beautiful interpretation that’s wrong-headed. They’re restricted by their personal cultural literacy in a way that the methodical researchers aren’t, and absolutely crippled when they become too addicted to telling the story about the audience that they wish was true. Creatives usually try to cover mistakes with clever rhetoric, so they can be credited for their successes while their failures are forgotten. However, when there’s a change in the air, only a creative will see it in time to profit from it. And when the wind is blowing in a stupendously unfavorable direction, only a creative has a chance to ride out the storm. Moreover, creatives know that the data that the researchers hold is often a bluff, a cover story, a performance: poke it hard enough and its authoritative veneer collapses, revealing a huge hollow space of uncertainty and speculation hiding inside of the confident empiricism. Parse it hard enough and you’ll see the ways in which small effect sizes and selective models are being used to tell a story, just as the creatives do. But the creative knows it’s about storytelling and interpretation. The researchers are often even fooling themselves, acting as if their leaps of faith are simply walking down a flight of stairs.

This is only one manifestation of a division that stretches through academia and society. I think it’s a much more momentous case of “two cultures” than an opposition between the natural sciences and everything else. If you want to see this fault line somewhere else besides advertising, how about in media-published social analysis of this year’s presidential election in the United States? Glenn Greenwald and Zaid Jilani are absolutely right that not only have the vast majority of analysts palpably misunderstood what was happening and what was going to happen, but that most of them are now unconvincingly trying to bluff once again at how the data makes sense, the models are still working, and the predictions are once again reliable.

The campaign analysts and political scientists who claim to be working from rock-solid empirical data will never see a change coming until it is well behind them. Up to the point of its arrival, it will always be impossible, because their models and information are all retrospective. Even the equivalents of the creatives in this arena are usually wrong, because most of them are not really trying to understand what’s out there in the world. They’re trying to make the world behave the way they want it to behave, and they’re trying to do that by convincing the world that it’s already doing exactly what the pundit wants the world to do.

The rise of Donald Trump is only the most visible sign of the things that pundits and professors alike do not understand about which way the wind is blowing. For one, Trump’s rise has frequently been predicted by one set of intuitive readers of American political life. Trump is consequence given flesh, the consequence that some observers have said would inevitably follow from a relentless disregard for truth and evidence that’s been thirty years in the making, from a reckless embrace of avowedly instrumental and short-term pursuit of self-interest, from a sneering contempt for consensus and shared interests. He’s the consequence of engineering districts where swing votes don’t matter and of allowing big money to flood the system without restraint. He’s what many intuitive and data-driven commenters have warned might happen if all that continued. But the election analysts can’t think in these terms: the formal and understood rules of the game are taken to be unchanging. The analysts know what they know. The warning barks from the guard-dogs are just an overreaction to a rustle in the leaves or a cloud over the moon.

But it’s more than that. The pundits and professors who got it wrong on Trump (and who are, I think, still wrong in understanding what might yet happen) get it wrong because the vote for Trump is a vote against the pundits and professors. The political class, including most of the Republican Party but also a great many progressives, has gotten too used to the idea that they know how to frame the narrative, how to spin the story, how to massage the polls, how to astroturf or hashtag. So many mainstream press commenters are now trying to understand why Trump’s alleged gaffes weren’t fatal to his candidacy, and they’re stupidly attributing that to some kind of unique genius on Trump’s part. The only genius that Trump has in this respect is understanding what was going on when his poll numbers grew rather than dropped after those putative gaffes. The content of those remarks was and remains secondary to his appeal. The real appeal is that he doesn’t give a shit what the media says, what the educated elite say, what the political class says. This is a revolt against us–against both conservative and progressive members of the political class. So of course most of the political class can’t understand what’s going on and keep trying to massage this all back into a familiar shape that allows them to once again imagine being in control.

Even if Trump loses, and I am willing to think he likely will by a huge margin, that will happen only because the insurgency against being polled, predicted, dog-whistled, manipulated and managed into the kill-chutes that suit the interests of various powers-that-be has not yet coalesced into a majority, and moreover, is riven internally by its own sociological divisions and divergences. But even as Trump was in some sense long predicted by the gifted creatives who sift the tea leaves of American life, let me also predict another thing: that if the political class remains unable to understand the circumstances of its own being, and if it is not able to abandon its fortresses and silos, the next revolt will not be so easily contained.

Posted in Academia, Oath for Experts, Oh Not Again He's Going to Tell Us It's a Complex System, Politics | 1 Comment

Dramatic Arc

Me at the beginning of a class meeting where I’ve assigned one of my favorite books.

Me realizing that maybe a quarter of the class read it with any real attention despite the fact that I already said it’s going to be an essay question on the final.

Me inside as we wind down the class.

Posted in Academia, Swarthmore | 6 Comments

Cost Control Is a Progressive Value

If you are a long-time reader of this increasingly intermittent blog, you know I have some recurrent frustrations and fascinations that I return to again and again. Sometimes in fact the blog is intermittent because I am afraid I am becoming a bore on those themes.

One of these recurrent issues for me is the financial sustainability of higher education. On one hand, I aspire to some skepticism about the neoliberal attempt to produce scarcity where it does not need to exist, an attempt made out of a belief that the mindset of scarcity produces proper decisions about value, that homo economicus behaves wisely whereas people who feel they live amid plenty and security waste resources and underproduce value. It’s not just that the moral underpinnings of that view are barren and repellent, it’s also that it is plainly empirically untrue whether we’re talking universities or companies. Realism about limits and constraints is a good thing, but artificially producing constraints in order to compel a winner-take-all struggle and squeeze productivity out of people has already destroyed much of what the 20th Century usefully accomplished towards the forging of saner, kinder, and more richly meaningful societies.

But the opposite of phony scarcity is not “we’re rich, so we can do whatever we want”. Which is a view that I hear sometimes from local and national colleagues and students: that in order to reject scarcity we must never be deterred by, or involved in, determinations of the financial and material limits to our resources. That, for one, is one of the major ways that tenure-track faculty in many institutions became at least passively complicit in the casualization of academic labor. Acting as if one’s own teaching load or service obligations or labor is a matter of strictly personal or departmental negotiation with an administrative head, and as if the larger implications of the outcome of those negotiations are somebody else’s business, is how in some cases we ended up with curricula that dictated that matriculants had to take courses that couldn’t possibly be staffed out of the available tenure-track labor force. Faculty at some institutions participated slowly and incrementally in the building of a curriculum that could never possibly be staffed by even the wealthiest institution and then blamed administrators for the shift to impoverished, marginalized and excluded laborers.

Though of course they are to blame in many universities, and for doing a great deal that has made all sorts of financial situations worse. Faculty and students may sometimes push for and be appallingly naive about institutional growth, but one of the basic reasons to have academic administrations in the first place is to keep a university or college close to its essential mission, and to resist relentlessly additive expansion of that mission.

The late Marshall Berman’s book All That Is Solid Melts Into Air has a marvelous analysis of Faust as the “primal Growthman”, as the quintessential example of a modern archetype, the person dedicated to a vision of modernity as not only ceaselessly mutable but dedicated to the replacement of all that we have with more than we have, with the making of all things into bigger things, with accumulation and expansion. It’s not a surprise to me that the most heedless and energetic Fausts of our own times are now restlessly looking to “disrupt” anything that seems to expand too slowly. Some of that’s about money, about financial Alexanders who weep because there are no worlds left to conquer, no investments left to make. Some of it is very nearly a religious or sacred belief: that whatever seems to stand still is an offense. No wonder too that people like Elon Musk are shilling for Mars colonies and asteroid mines. Faust is hungry for the same reason Cookie Monster is: tired of waiting for a batch from the oven, he’s eaten the spoon, the pan, the mixer, the table. The cookies are as good as eaten already: they were eaten before they were even mixed.

Do not feed the Faust. That’s really what reaching sustainability is going to be about. At every moment, in every conversation, in every plan and meeting and process, any progressive academic who pays even the remotest attention to sustainability in higher education (or elsewhere) is going to invariably ask: what can we repurpose or reuse? If we want novelty, or change, or difference, if we believe in originality and innovation, what can we do differently? So in that sense, one thing we should always be alarmed about is any sign that administrators (or colleagues) have found a new source of revenue. Even if that’s about making up cuts in public support, which is how University of California administrators defend bringing in more out-of-state students with relaxed admission standards, new revenue (or even replaced revenue from a new source) is always imagined as temporary but effectively becomes permanent from the moment it is integrated into an operational budget. Whatever was done to secure new revenue will have to be done forever after. Growth quickly comes to require itself unless it’s limited, finite and finished all in a single momentary flash, unless it is only for a single specific purpose.

I think in a way most of us know it, and that’s why academics are so temperamentally conservative. We know that whatever new things we do will eventually be at the cost of something we are already doing, unless we sign on as little apprentice Fausts. But that’s the harder habit that sustainability in all our lives will eventually call upon us to accept and even embrace, to live impermanent lives. We will need to build yurts where now we build fortresses, to move on as the intellectual seasons change. I think we can offer working lives of security and satisfaction within an academy that doesn’t grow. Impermanence in the work we do and the missions we accept is not precarity. But the only way we get there is to accept that if we don’t talk about cost and limits and budgets in this spirit, no one else in our present worlds will. That kind of talk cannot be outsourced, it cannot be deferred, it is not someone else’s business. When temptation comes in the form of bigger and more (though not in the form of restoration and preservation, which are sorely needed), we’ll have to be able to turn it down.

Posted in Academia, Politics | 2 Comments

A Chance to Show Quality

Romantic ideals of originality still remain deeply embedded in how we recognize, cultivate and reward merit in most of our selective systems of education, reputation and employment. In particular we read for the signs of that kind of authentic individuality in writing that is meant to stand in for the whole of a person. Whether it’s an essay for admission to college, a cover letter for a job, an essay for the Rhodes or Fulbright, an application for research funding from the Social Science Research Council or the National Science Foundation, we comb for the signs that the opportunity-seeker has new ideas, has a distinct sensibility, has lived a life that no one else has lived. Because how else could they be different enough from all the other worthies seeking the opportunity or honor so as to justify granting them their desires?

Oh, wait, we also want to know, almost all of the time, whether the opportunity-seeker is enough like everyone else that we can relate their talents, ideas, capabilities, plans and previous work to the systems which have produced the applicants. We want assurances that we are not handing resources, recognition and responsibility to a person so wholly a romantic original that they will not ever be accountable or predictable in their uses. We want to know that we are selecting for a greatness that we already know, a merit that we already approve of.

This has always been the seed that grows into the nightmare of institutions, that threatens to lay bare how much impersonality and distance intrudes upon decisions that require a fiction of intimacy. Modern civic institutions and businesses lay trembling hands on their bankrolls when they think, however fleetingly, that there is a chance that they’re getting played for fools. That they are dispensing cheese to mice who have figured out what levers to push. That when they read the words of a distinctive individual, they are really reading the words of committees and advisors, parents and friends. That they are Roxane swooning over Christian rather than Cyrano, or worse, that they are being catfished and conned.

The problem is that when we are making these choices, which in systems of scarcity (deliberately produced or inevitably fated) must be made, we never really decide what it is that we actually value: unlikeness or similarity, uncertainty or predictability, originality or pedigree. That indecision more than anything else is what makes it possible for people to anticipate what the keepers of a selective process will find appealing. Fundamentally, that boils down to: a person with all the qualifications that all other applicants have, and a personal experience that no one else could have had but that has miraculously left the applicant even more affirmed in their qualifications. Different in a way that doesn’t threaten their sameness.

I’ve been involved in a number of processes over the years where those of us doing the selecting worried about the clear convergence in some of the writing that candidates were doing. We took it to be a sign that some candidates had an advantage that others didn’t, whether that was a particularly aware and canny advisor or teacher, or it was some form of organized, institutional advice. I gather that there are other selective institutions, such as the Rhodes Foundation, that are even more worried, and have moved to admonish candidates (and institutions) that they may not accept advice or counsel in crafting their writing.

The thing is, whenever I’ve been in those conversations, it’s clear to me that the answer is not in the design of the prompt or exercise, and not in the constraints placed on candidates. It’s in the contradictions that selective processes hold inside themselves, and in the steering currents that tend to make them predictable in their tastes. When you try to have it all, to find the snowflake in the storm, and yet also prize the snowfall that blankets the trees and ground with an even smoothness, you are writing a human form of algorithm, you are crafting a recipe that it takes little craft to divine and follow. The fault, in this case, lies in us, and in our desires to be just so balanced in our selection, to stage-manage a process year in and year out so that we get what we want and yet also want what we get.

Maybe that was good enough in a time with less tension and anxiety about maintaining mobility and status. But I suspect the time is coming where it will not be. Not because people seek advantage, but because anything that’s predictable will be something relentlessly targeted by genuine algorithms. Unpredictability is never a problem for applicants or advisors; it is always a problem for the people doing the selection or the grading or the evaluation. If you don’t want students to find a standard essay answer to a standard essay prompt, you have to use non-standard prompts. If you don’t want applicants to tell you the very moving story of the time they performed emergency neurosurgery on a child in the developing world using a sterilized safety pin and a bottle of whisky, you have to stop rewarding applicants who tell you that story in the way that has previously always gotten your approval. If what we want is genuine originality, the next person we choose has to be different from the last one. If what we want is accomplished recitation of training and skills, then we look for the most thorough testing of that training. When we want everything, it seems, we end up with performances that very precisely thread the needle that we insistently hold forth.

Posted in Academia, Information Technology and Information Literacy, Swarthmore | 3 Comments

Opt Out

There is a particular kind of left position, a habitus that is sociologically and emotionally local to intellectuals, that amounts in its way to a particular kind of anti-politics machine. It’s a perspective that ends up with its nose pressed against the glass, looking in at actually-existing political struggles with a mixture of regret, desire and resignation. Inasmuch as there is any hope of a mass movement in a leftward direction in the United States, Western Europe or anywhere else on the planet, electoral or otherwise, I think it’s a loop to break, a trap to escape. Maybe this is a good time for that to happen.

Just one small example: Adam Kotsko on whether the Internet has made things worse. It’s a short piece, and consciously intended as a provocation, as much of his writing is, and full of careful qualifiers and acknowledgements to boot. But I think it’s a snapshot of this particular set of discursive moves that I am thinking of as a trap, moves that are more serious and more of a leaden weight in hands other than Kotsko’s. And to be sure, in an echo of the point I’m about to critique, this is not a new problem: to some extent this is a continuous pattern that stretches back deep into the history of Western Marxism and postmodernism.

Move #1: Things are worse now. But they were always worse.

Kotsko says this about the Internet. It seems worse but it’s also just the same. Amazon is just the Sears catalogue in a new form. Whatever is bad about the Internet is an extension, maybe an intensification, of what was systematically bad and corrupt about liberalism, modernity, capitalism, and so on. It’s neoliberal turtles all the way down. It’s not worse than a prior culture and it’s not better than a prior culture. (Kotsko has gone on to say something of the same about Trump: he seems worse but he’s just the same. The worst has already happened. But the worst is still happening.)

I noted over a decade ago the way that this move handicapped some forms of left response to the Bush Administration after 9/11. For the three decades before 9/11, especially during the Cold War, many left intellectuals in the West practiced a kind of High Chomskyianism when it came to analyzing the role of the United States in the world, viewing the United States as an imperial actor that sanctified torture, promoted illiberalism and authoritarianism, acted only for base and corrupt motives. Which meant in some sense that the post-9/11 actions of the Bush Administration were only more of the same. Meet the new boss, same as the old boss. But many left intellectuals wanted to frame those actions as a new kind of threat, as a break or betrayal of the old order. Which required saying that there was a difference between Bush’s unilateralism and open sanction of violent imperial action and the conduct of the United States during the Cold War and the 1990s, and that the difference was between something better and something worse. Not between something ideal and something awful, mind you: just substantively or structurally better and substantively or structurally worse.

This same loop pops up sometimes in discussions of the politics of income inequality. To argue that income inequality is so much worse today in the United States almost inevitably requires seeing the rise of the middle-class in postwar America as a vastly preferable alternative to our present neoliberal circumstances. But that middle-class was dominated by white straight men and organized around nuclear-family domesticity, which no progressive wants to see as a preferable past.

It’s a cycle visible in the structure of Howard Zinn’s famous account of American history: in almost all of Zinn’s chapters, the marginalized and the masses rise in reaction to oppression, briefly achieve some success, and then are crushed by dominant elites, again and again and again, with nothing ever really changing.

It’s not as if any of these negative views of the past are outright incorrect. The U.S. in the Cold War frequently behaved in an illiberal, undemocratic and imperial fashion, particularly in the 1980s. Middle-class life in the 1950s and 1960s was dominated by white, straight men. The problems of culture and economy that we identify with the Internet are not without predicate or precedent. But there is a difference between equivalence (“worse now, worse then”) and seeing the present as worse (or better) in some highly particular or specific way. Because the latter actually gives us something to advocate for. “Torture is bad, and because it’s bad, it is so very very bad to be trying to legitimate or legalize it.” “A security state that spies on its own people and subverts democracy is bad, and because it’s bad, it’s so much worse when it is extended and empowered by law and technology.”

When everything has always been worse, it is fairly hard to mobilize others–or even oneself–in the present. Because nothing is really any different now. It is in a funny kind of way a close pairing to the ahistoricism of some neoliberalism: that the system is the system is the system. That nothing ever really changes dramatically, that there have been in the lives and times that matter no real cleavages or breaks.

Move #2: No specific thing is good now, because the whole system is bad.

In Kotsko’s piece on the Internet, this adds up to saying that there is no single thing, no site or practice or resource, which stands as relatively better (or even meaningfully different) apart from the general badness of the Internet. Totality stands always against particularity, system stands against any of its nodes. Wikipedia is not better than Amazon, not really: they’re all connected. Relatively flat hierarchies of access to online publication or speech are not meaningful because elsewhere writers and artists are being paid nothing.

This is an even more dispiriting evacuation of any political possibility, because it moves pre-emptively against any specific project of political making, or any specific declaration of affinity or affection for a specific reform, for any institution, for any locality. Sure, something that exists already or that could exist might seem admirable or useful or generative, but what does it matter?

Move #3: It’s not fair to ask people how to get from here to a totalizing transformation of the systems we live under, because this is just a strategy used to belittle particular reforms or strategies in the present.

I find the sometimes-simultaneity of #2 and #3 the most frustrating of all the positions I see taken up by left intellectuals. I can see #2 (depressing as it is) and I can see #3 (even when it’s used to defend a really bad specific tactical or strategic move made by some group of leftists) but #2 and #3 combined are a form of turtling up against any possibility of being criticized while also reserving the right to criticize everything that anyone else is doing.

I think it’s important to have some idea about what the systematic goals are. That’s not about painting a perfect map between right now and utopia, but the lack of some consistent systematic ideas that make connections between the specific campaigns or reforms or issues that draw attention on the left is one reason why we end up in “circular firing squads”. But I also agree that it’s unfair to argue that any specific reform or ideal is not worth taking up if it can’t explain how that effort will fix everything that’s broken.

Move #4: It’s futile to do anything, but why are you just sitting around?

That is, this is another form of justifying a kind of supine posture for left intellectuals–a certainty that there is no good answer to the question “What is to be done?” but that the doing of nothing by others (or their preoccupation with anything but the general systematic brokenness of late capitalism) is always worth complaining about. Indeed, that the complaint against the doing-nothingness of others is a form of doing-something that exempts the complainer from the complaint.

——-

The answer, it seems to me, is to opt out of these traps wherever and whenever possible.

We should historicize always and with specificity. No, everything is not worse or was not worse. Things change, and sometimes neither for better nor worse. Take the Internet. There’s no reason to get stuck in the trap of trying to categorize or assess its totality. There are plenty of very good, rich, complex histories of digital culture and information technology that refuse to do anything of the sort. We can talk about Wikipedia or Linux, Amazon or Arpanet, Usenet or Tumblr, without having to melt them into a giant slurry that we then weigh on some abstracted scale of wretchedness or messianism.

If you flip the combination of #2 and #3 on its head so that it’s a positive rather than negative assertion, that we need systematic change and that individual initiatives are valid, then it’s an enabling rather than disabling combination. It reminds progressives to look for underlying reasons and commitments that connect struggles and ideals, but it also appreciates even the least spreading motion of a rhizome as something worth undertaking.

If you reverse #4, maybe that could allow left intellectuals to work towards a more modest and forgiving sense of their own responsibilities, and a more appreciative understanding of the myriad ways that other people seek pleasure and possibility. That not everything around us is a fallen world, and that not every waking minute of every waking day needs to be judged in terms of whether it moves towards salvation.

We can’t keep saying that everything is so terrible that people have got to do something urgently, right now, but also that it’s always been terrible and that we have always failed to do something urgently, or that the urgent things we have done never amount to anything of importance. We disregard both the things that really have changed–Zinn was wrong about his cyclical vision–and the things that might become worse in a way we’ve never heretofore experienced. At those moments, we set ourselves against what people know in their bones about the lives they lived and the futures they fear. And we can’t keep setting ourselves in the center of some web of critique, ready to spin traps whenever a thread quivers with movement. Politics happens at conjunctures that magnify and intensify what we do as human beings–and offer both reward and danger as a result. It does not hover with equal anxiety and import around the buttering of toast and the gathering of angry crowds at a Trump rally.

Posted in Blogging, Information Technology and Information Literacy, Oh Not Again He's Going to Tell Us It's a Complex System, Politics | 4 Comments

#Prefectus Must Fall: Being a True History of Uagadou, the Wizarding School

So there’s been a spot of disagreement about how to think about state systems in Africa in relationship to J.K. Rowling’s world-building for her Harry Potter novels. I feel a bit bad about perceptions that I was being unfair, but I also mostly continue to feel that this is just the latest round in a long-standing interdisciplinary tension (arguably all the way back into Enlightenment philosophy) about what exactly can be compared about human societies and on what basis the comparison ought to be made. I think that’s a discussion in which African societies have often been described as having a deep history of not having what Europe has, with the comparison serving to explain disparities and inequalities in the present-day. I am not the first to react strongly to that mode of comparison.

But I also do feel that it’s important in some sense not to have a dispute that is both scholarly and political completely overwhelm the possibility of giving useful guidance to J.K. Rowling and other creators who work with fantasy or speculative fictions. In general, I would like to see specialists in African history and anthropology be prepared to provide useful, digestible knowledge not only to fiction writers but also to other non-specialists. Which means, I think, showing how it could be possible to draw upon specific African histories and experiences to create and imagine fictions and stories that incorporate African inspirations rather than to treat Africa as a zone of exclusion because it’s too difficult or touchy.

So: a bit of fanfiction, intended to demonstrate how to subtly rework what Rowling has already said about her wizarding world.

—————————

#PrefectusMustFall
A MANIFESTO

For a month now the instructors at Uagadou have dutifully assembled to ward off attempts by students, particularly those in Ambatembuzi House, to cast kupotea on the statue of Peter Prefectus that has been at the foot of the Great Stairway for the past sixty years.

Prefectus’ own nkuni spirit has joined the teachers in defending his statue, though as always it is hazy and distracted, only half here, half wandering indistinctly in the halls of England’s Ministry of Magic. We say that they must allow the spell to be cast: let him go home once and for all. There are few left in Prefectus House, anyway. The white wizards who still live in Africa go to Hogwarts, Ilvermorny or Durmstrang, as do some number of Africans.

Prefectus Must Fall. Though we students love Uagadou and what we learn here, it is time for this school to be a truly African school. Not the “African” of silly affectations like using hands instead of wands that a few teachers introduced forty years ago in an attempt to get away from Prefectus’ wholesale importation of the curriculum of Hogwarts! Let us rediscover the real history of African magic, of the many magical styles and ways of learning from Africa!

We know the truth now. This old, rotting, half-real castle shivering in the mountains isn’t a thousand years old, it’s 110 years old. Or more to the point, it’s a thousand-year-old school that was stolen and stuffed inside an imposter’s cheap recreation of the school that never let him be a teacher. Peter Prefectus was a fourth-rate wizard stuck in a basement of the British Ministry of Magic who decided that if he couldn’t teach at Hogwarts, he’d go off to Africa just like the Muggle officer Harry Johnston and make a Hogwarts there.

There was a school here once, back before the kingdom of Bunyoro rose. It wasn’t for all Africans everywhere, but Swahili and Ituri and Khoisan wizards from the coast and the jungle and the forests all came. People from the shores of the big lakes came, people from the hills and savannah came. That’s where Peter Prefectus built his fake Hogwarts, where that old school was. The leaders of that ancient school foolishly let him and helped lift the stones and cast the spells. They felt they needed to understand what was happening, and to learn the magics that Prefectus offered, but all they did was sell out our heritage!

They don’t tell you when you get sorted that Prefectus was an incompetent who had the cheek to believe that his teachers and pupils were incapable of any real magic anyway. He never learned an African language, not one, but made the students learn spells like “expelliarmus” and “impedimenta”. He hired other European wizards and let them bully and hurt and even kill the Africans who came there. We had wizards like Grindelwald and Voldemort here too, but they were in charge and no one came to the rescue, not for us.

We know the truth. Prefectus must fall.

Prefectus stole two schools! The ancient one of the lakes and then he had the cheek to try to steal a name from near to another old African place of magical learning, the school which today still exists at Kumbi Saleh in the ruins of Ghana. Hard times for it now, harried by sinister wizards hiding in the Sahara who believe that all magical schools should be destroyed. That is another reason Prefectus must fall: it is time for Uagadou to do its part in helping other African wizards in their struggles. Kumbi Saleh should not have to wait for a half-hearted delegation of wizards from Beauxbatons and Durmstrang to save it from attack. We should not hear any longer from our headmaster and teachers that it is “against tradition” for Uagadou to play a role.

Uagadou, even in disrepair, is still wealthier than our real comrades at the ancient academies in Kumbi Saleh and Axum. We should help them and work with them and learn from their wisdom about wizarding. We should be working with the “moving school” of Eshu, the secret society of West African wizards who have no castle or building, but who move tirelessly from one site of ancient power to the next, from Old Oyo to Benin to Kumasi, walking the ways that they know. We should talk to the small schools that meet all over the continent, and reach out to wizards too poor or endangered to think of coming here. Uagadou should train far more Africans than it does, and stop just being for a small handful of families made powerful by their dealings with the European wizards.

Prefectus Must Fall! Unite to liberate our school and our peoples! Leave off the lies, cast away the glossy brochures that arrive by Dream Messengers to entice you here. Face the truth!

Posted in Africa, Sheer Raw Geekery | 8 Comments

On Uagadou, the African Wizarding School

I have a good deal to say on the plausibility of a wizarding school in J.K. Rowling’s fantasy world, and the first thing would be that I should know better than to send Twitter to do a blog’s job, I guess. There is a good deal wrong with Henry Farrell and Chris Blattman’s defense of Rowling’s imagination. To some extent more wrong than Rowling herself. You may from the outset roll your eyes and say, “It’s imaginary, let it go” and I hear you, but in fact the kinds of imaginary constructions of African societies and African people that operate in fantasy, science-fiction and superhero universes are actually rather instructive guides to how Western-inflected global culture knows and understands the histories of African societies as a history of absence, lack or deficit rather than as histories of specific presence, as having their own content that is in many ways readily knowable.

Let’s start from the very beginning, with Rowling’s expansion of her world-building in Harry Potter. When she recently imagined what the whole world in her fantasy universe looks like, what did she say about it?

1. That most nations in her world do not have their own wizarding schools. Most wizards are “home-schooled”.
2. That distance education (“correspondence courses”) is also used to train wizards.
3. That the eleven wizarding schools that do exist in the world share some common characteristics that derive from the common challenges and affordances of magic. They tend to be remote, often in mountainous areas, in order to insulate themselves from Muggles, to attempt to stay out of wizard politics as much as possible, and to maintain some independence from both Muggle and wizard governance.
4. That there is an International Confederation of Wizards to whom a budding wizard can write (via owl) to find out about the nearest wizard school.
5. So far, Rowling has announced that there are three wizarding schools in Europe, one in North America (on the East Coast), one in Japan, one in Brazil (in the rainforest), and one in Africa called Uagadou, pronounced Wagadu. As far as I know the others aren’t announced yet.

What of Uagadou?

1. It’s pronounced Wa-ga-doo. Farrell and Blattman take this to be a reference to the place of the same name associated with the ancient empire of Ghana. (Which was located in what is now Mali and Mauritania in West Africa.)
2. There are smaller wizarding schools in Africa, but Uagadou has an “enviable” international reputation and is a thousand years old.
3. It enrolls students from all over the continent.
4. Much magic, maybe all magic, comes from Africa.
5. Wands are European inventions; African wizards just use their hands.
6. Uagadou doesn’t use owls for messages, it uses Dream Messengers.

In response to Twitter complaints that this is just more “Africa is a country” thinking, where the entire continent gets one school that is an undifferentiated mass of African-ness, without specific location, Rowling has responded first to say, “Students from all over” and second, that Uagadou is in Uganda, in the “Mountains of the Moon”, by which she probably means the Rwenzori Mountains in western Uganda.

——

Farrell and Blattman set out to defend Rowling, saying that it is plausible that all of sub-Saharan Africa would only have one wizarding school. (I’m guessing that before she’s done, there will be a wizarding school in Egypt or otherwise near to North Africa, so let’s leave that aside.) Farrell and Blattman do so by saying that Sub-Saharan Africa didn’t have a “state system”. In an initial tweet, I expressed my irritation by noting that there were states in Africa, to which Farrell replied that their article concedes that there were. Just that material environments “conspired against” state development until colonialism, and that the few states that did exist were far apart, and thus that there was no state system, no competitive relationship between states, and thus that states did not become strong through such competition, unlike in Europe or Japan, where there were more rivalrous relationships between states because of the relative scarcity of land.

I think I am right to say that Farrell and Blattman’s acknowledgement that there were states is essentially prophylactic, meant to head off precisely the kind of Twitter objection I offered. The substance of their piece is still this: Africa had an absence of something that Europe had a presence of, and that this is what makes Rowling’s fantasy a historically plausible one, that rivalrous states that form a state system that is about control over a scarce resource (land) could lead to having multiple wizarding schools, and that Africa’s absence of these things means that having only one makes sense too. “There has been a relatively solid state” in England for a thousand years, they say, so of course Hogwarts. Uagadou, in contrast, must have formed in the absence of a state. And maybe it shares a name with a place that was thousands of miles away because perhaps “the school began in a faraway territory, before it hid itself in the remote mountains of central Africa, fleeing slave raiders and colonial powers”.

———-

I have on occasion expressed frustration with Africanists for insisting that non-specialists must go deep inside the particulars of specific African histories in order to win the right to talk about them. And the similar inclination of many practicing historians to view large-scale comparative history or the more universalist aspirations of many social scientists with suspicion. But this is a case where some of that suspicion is warranted, I think. Partly because Farrell and Blattman insist on the tangible historical plausibility of Uagadou in Rowling’s fantasy world and they then toss in just enough history to be tangibly wrong.

Here’s the thing. First, if I were going to construct what is essentially a fantasy counterfactual of a relationship between the place Wagadu and some other place in sub-Saharan Africa, that a group of wise and knowledgeable wizards moved from an important trading community in the empire of Ghana to somewhere else in Africa, I’d at least stick to historically plausible routes of movement and connection. Moving from Wagadu to the eastern side of the Rwenzori Mountains is roughly like imagining that an ancient group of Irish wizards relocated to Ukraine in order to get away from British landlords. It’s very nearly random, and that’s the problem. It’s exquisitely well-meaning of Rowling to want to imagine Uagadou in the first place, and to draw respectfully on African history for the name of the place. But it doesn’t make sense in terms of very real histories that can be described for what they actually were, not in terms of some abstracted absence in comparison to Europe.

Equally, I’d wonder at the counterfactual that has Uagadou moving a thousand years ago, before the trans-Atlantic slave trade, and at the height of state-building (even state system building) in the upper Niger and Sahel. It’s not as if the idea of great institutions of learning and teaching built through the revenues of trade is a fantasy in that region of West Africa at that specific time: there were real institutions of that kind built in Timbuktu and Gao at exactly that moment which depended on very real long-distance connections between Muslim polities in Egypt and North Africa and the major states and polities of the West African interior. Why would Uagadou want to get away from all of that in 1016 CE? Even if Farrell and Blattman want Africa’s supposed lack of state systems to be the magic variable that produces more than one wizarding school, Uagadou’s birthplace has exactly that. And if even the wise wizards of Uagadou decided they had to leave, why the east side of the Rwenzori mountains, to which the peoples of their home region had no links whatsoever?

But hey, at least Farrell and Blattman’s defense is intact in the sense of western Uganda not having a state system, right? That would have made Uagadou different than other wizarding schools coping with state systems! Except that the region between western Lake Victoria and the Rwenzoris was another place in sub-Saharan Africa where multiple states and polities with sometimes rivalrous relationships go back at least three or four hundred years. If Uagadou was really trying to move to a place where there weren’t very many human beings or there weren’t states or there weren’t state systems (or it arose in such a place, if we discard the relationship between the name Wagadu and Uagadou), western Uganda isn’t the place to put the imaginary school.

———

Ultimately this is why I think Farrell and Blattman’s defense of Rowling is more problematic than Rowling herself. I think Rowling is trying to do the right thing, in fact, to include Africa and Africans in her imaginary world, and she’s not just reaching for lazy H. Rider Haggard or Edgar Rice Burroughs tropes of cities in jungles and excitable natives yelling Ungowa! Bwana! But the fact is that the way she picks up a name to stand in for a more respectful conception of Africanity still underscores the degree to which the history of African societies is a kind of generic slurry for most people. If I had imaginary Scots-named people running around in an imaginary Pomerania dotted with imaginary Finnish place names, most readers of my fantasy would understand that I was doing some kind of mash-up, and if I didn’t have some infodump of an alternate history at some point to explain it, they’d likely regard what I was doing as random or incoherent.

Farrell and Blattman are trying to provide a kind of scholarly imprimatur for that same sort of mashup, but the histories of the places that come into view in Rowling’s imagination are knowable and known. If you ask me to provide the fictional background of a wizarding school in western Uganda and why it is the only one in sub-Saharan Africa and admits pupils from all over a very large continent, the last thing I’m going to do is start farting around with gigantic generalizations about states and state systems that immediately frame Africa as a place which has a lack, an absence, a deficit, that is somehow naturalized or long-running. I’m going to build my plausibility up from the actual histories of African societies.

So maybe I’m going to talk about the historical world of western Uganda for what it was, for which I have a more than adequate scholarly literature, and try to imagine what a wizarding school there looks like that makes sense in that history. And the first thing I think is that it isn’t a castle in the mountains if it’s a thousand years old and it isn’t distinguished from European wizarding just by using hands rather than wands. I start to think about what magical power in western Uganda might be like, even in a world full of magical power.

If I start to think about why there’s only one school, and why the whole continent uses it, I stop thinking about a thousand years and start thinking about two hundred. I stop messing around with giant social scientistic abstractions and start thinking about colonialism. Which, to head off Farrell and Blattman’s likely objection, they do too–but not as an explanation for Rowling’s fantasy Africa being in a state of relative global deprivation. I start thinking about why Uagadou is in fact like Hogwarts, physically and otherwise. Perhaps why the University of the Witwatersrand is not wildly different from Oxford in the generalities of its institutional functioning. I think about the world in the last three hundred years, and why institutions in modern nation-states resemble each other in form even if they don’t in power or privilege or relative resources or impact. And then I wonder why Rowling doesn’t simply go there too.

The answer, in some sense, would be that Rowling’s descriptions of the wizarding schools want to retain some whimsy and some friendliness to a young-adult sensibility. But you can imagine African magics in a globalized fantasy from within their imaginary histories rather than from outside and even stay friendly to a young adult sensibility: as Vicki Brennan noted, that’s a good description of Nnedi Okorafor’s Akata Witch. And Rowling’s Harry Potter books inscribe the history of World War II into the wizarding world, and racism and fascism into the conflicts wizards face today. Why isn’t colonialism Dark Magic of a particularly troubling sort–the kind that suppresses many African ways of learning wizardry and then leaves behind a single, limited institution for learning magic that is built on a template that comes from somewhere else?

There’s a plausible history for Uagadou right there, but it can’t be a thousand years old if that’s the case. This is the basic problem.

You can tell a story that imagines fantastic African societies with their own institutions arising out of their own histories, somehow protected or counterfactually resistant to the rise of the West. But you have to do that through African histories, not with an audit of African absence and some off-the-shelf environmental determinism.

You can tell a story that imagines that imaginary wizarding schools arise only out of histories with intense territorial rivalries within long-standing state systems, but then you have to explain why there aren’t imaginary wizarding schools in the places in the world that fit those criteria rather than frantically moving the comparative goalposts around so that you are matching units like “all of Sub-Saharan Africa” against “Great Britain”. And you have to explain why the simultaneous and related forms of state-building in West Africa and Western Europe created schools in one and not the other: because Asante, Kongo, Dahomey, and Oyo are in some sense part of a state system that includes England, France and the Netherlands in the 17th and early 18th Centuries.

You can tell a story about how many different ways of learning wizarding in an imaginary Africa were suppressed, lost, denigrated, marginalized or impoverished, leaving a single major institution built on an essentially Western and modern model, and write colonialism into your world of good and evil magic. If you have a faux Hitler in the Dark Wizard Grindelwald, why not a faux Rhodes or a faux Burton as another kind of dark wizard?

That’s not what Rowling has put out so far. And it’s definitely not the kind of thinking that Farrell and Blattman offer in an attempt to shore up Rowling. All they offer is a scholarly alibi for Africa-is-a-country, Africa-is-absence, Africa-can-be-mashup-of-exotic-names.

Posted in Academia, Africa, Sheer Raw Geekery | 10 Comments

On the Deleting of Academia.edu and Other Sundry Affairs

Once again with feeling, a point that I think cannot be made often enough.

Social media created and operated by a for-profit company, no matter what it says when it starts off about the rights of content creators, will inevitably at some point be compelled to monetize some aspect of its operations that the content creators did not want to be monetized.

This is not a mistake, or a complaint about poor management practices. The only poor practices here are typically about communication from the company about the inevitable changes whenever they arrive, and perhaps about the aggressiveness or destructiveness of the particular form of monetization that they move towards.

The problem is not with the technology, either. Uber could have been an interface developed by a non-profit organization trying to help people who need rides to destinations poorly serviced by public transport. It could have been an open-source experiment that was maintained by a foundation, like Wikipedia, that managed any ongoing costs connected to the app and its use in that way. And that’s with something that was already a product, a service, a part of the pay economy.

Social media developed by entrepreneurs, backed by venture capital, will eventually have to find some revenue. And there are only three choices: they sell information about their users and content creators, even if that’s just access to the attention of the users via advertisements; they sell services to their users and content creators; they sell the content their creators gave to them, or at least take a huge cut of any such sales. That’s it.

And right now except for a precious few big operators, none of those choices really let the entrepreneurs operate a sustainable business. Which is why so many of the newer entries are hoping either to threaten a big operator, get a payout and walk away with their wallets full (and fuck the users), or to amass such a huge amount of freely donated content that they can sell their archive and walk away with their wallets full (and fuck the users).

If the stakes are low, well, so be it. Ephemeral social conversation between people can perhaps safely be sold off, burned down and buried so that a few Stanford grads get to swagger with all the nouveau-richness they can muster. On the far other end, maybe that’s not such a great thing to happen to blood tests and medical procedures, though that’s more about the hideous offspring of the social media business model, aka “disruption”.

But nobody at this point should ever be giving away potentially valuable work that they’ve created to a profit-maker just because the service that hosts it seems to provide more attention, more connection, more ease of use, more exposure.

Open access is the greatest idea in academia today when it comes to making academia more socially just, more important and influential, more able to collaborate, and more able to realize its own cherished ideals. But open access is incompatible with for-profit social media business models. Not because the people who run academia.edu are out of touch with their customer base, or greedy, or incompetent. They don’t have any choice! Sooner or later they’ll have to move in the direction that created such alarm yesterday. They will either have amassed so much scholarship from so many people that future scholars will feel compelled to use the service–at which point they can charge for a boost to your scholarly attention and you’ll have to pay. Or they will need to monetize downloads and uses. Or monetize citations. Or charge on deposit of anything past the first article. Or collect big fees from professional associations for services. Or they’ll claim limited property rights over work that hasn’t been claimed by authors after five years. Or charge a “legacy fee” to keep older work up. You name it. It will have to happen.

So just don’t. But also keep asking and dreaming and demanding all the affordances of academia.edu in a non-profit format supported by a massive consortium of academic institutions. It has been and remains perfectly possible that such a thing could exist. It is a matter of institutional leadership–but also of faculty collectively finally understanding their own best self-interest.

Posted in Academia, Information Technology and Information Literacy, Intellectual Property | 2 Comments

Technologies of the Cold War in Africa (History 90I) Syllabus

I saw last year that some smart academics were using Piktochart to design more graphical, visual syllabi, so I took a stab at it.


Posted in Academia, Africa, Swarthmore | Comments Off on Technologies of the Cold War in Africa (History 90I) Syllabus

“Dates Back Millennia”

You know, I have less of an unqualified hatred for the “dates back millennia” line than I used to. I’m thinking this as I see my feed fill up with friends and colleagues complaining about Obama’s use of it in his speech to talk about the Middle East. To some extent, historians overreact to its use by politicians for two separate reasons.

The first is that of course it’s factually wrong and not at all innocently so. Which is to say that this line of explanation, whether offered as a quick throw-away or as a substantive claim, looks away from the history of the 20th Century and the very decisive role played by European colonialism and post-WWII American intervention in structuring many supposedly “ancient hatreds”. In the case of Israel-Palestine, that is particularly convenient for the United States (and for Zionists), because the precise way in which the state of Israel came into being and the ways in which the current states of the Middle East were brought into the geopolitics of the Cold War are the major and direct causal underpinnings of contemporary conflicts. It runs away from mature responsibility and from genuine analytic understanding all at once.

The second reason for the reaction is that invoking “ancient hatreds” not only misdirects attention, it also naturalizes conflicts in the bodies and minds of the combatants. It’s a kind of shrug–what can one do?–but it also turns more to psychology than history as the toolset for thinking through current politics, which is at best futile and at worst creepy.

So why do I qualify my dislike? First I think among historians we all recognize that there’s a strong turn to the modern and contemporary among our students and our publics, a presentism that most of us criticize. But I think in moments like this, we contribute some to that presentism. We should leave a door open for times before the 20th Century to matter as causal progenitors of our own times and problems. Sure, that argument has to be made carefully (shouldn’t all historical arguments be thus?) but I actually think all of the past is weighing on the present, sometimes quite substantially so. “Ancient hatreds” isn’t quite the right way to put it, but there are aspects of conflict in the Middle East which do genuinely derive structure or energy from both the Ottoman period (early and late) and from times before that.

It’s also that I think we end up getting angry at politicians who are trying to cover over the traces of their own government’s recent historical culpability but in so doing forget that there are many other actors who also believe in and are motivated by the supposed antiquity of their actions. On some level, if they do think so, we ought to at least listen carefully and not quickly school-marm them about why the experts hold that they’re wrong. Authenticity is a strange twilight realm. If people believe that they are upholding something ancient, that has a way of becoming true enough in some sense even if they’re wrong about the history between them and that past moment and wrong about what the ancient history really was. It might be easier simply to focus on the culpability of some states and actors for the current situation and leave aside compulsively correcting their history in some cases.

But finally, as long as we’re talking culpability, the one problem with always, invariably locating conflict and hatred as having their most relevant origins in Western colonialism and in the decisions made during post-WWII decolonization is that we risk having our own version of a distraction from uncomfortable truth. As I noted, maybe sometimes there really is something older at play. There’s a really great book that the historian Paul Nugent wrote about the Ghana-Togo borderlands in West Africa that argues that, counter to the common trope that the Berlin Conference simply and arbitrarily created random and incoherent borders, the border there was both reflective of older 19th Century histories and fashioned in considerable part by the communities in the borderlands themselves. More uncomfortably, maybe sometimes there’s something far more recent and contingent at play–maybe sometimes in current global conflicts even our preferred causal story is, like the “ancient conflict”, of little real empirical relevance to combatants, who are instead being put into motion by the political and cultural histories of the last twenty years or even the last ten.

Posted in Academia, Politics, Production of History | Comments Off on “Dates Back Millennia”