Welcome to the Skinnerdome

A tremendous amount of the writing about the Presidential race this election season shows, primarily, that the effect of social media on public discourse is increasingly dire. I would characterize the majority of what I have read as arguing that the convictions people have declared are held only because of some form of prior social ideology or consciousness–that they are not based on anything “real” in terms of a particular candidate’s likely policies, rhetoric or record.

When we’re dealing with large-scale voting patterns, a certain amount of sociological musing is completely appropriate, because that’s what the patterns make visible–that this group of people likes a certain person more or less, etc. At that level, sociological thinking is explanatory and it is also an important part of arguing for or against particular candidates in terms of reading what they mean and what they will do.

But when we bring that into an address that’s meant to speak to particular individuals to whom we are connected via social media, it feels, first of all, reductive: as if those individuals whom we have chosen to be connected to are no more than their sociologies. More importantly, the arguments we have at this point feel straight out of B.F. Skinner: many writers in social media treat other people as if they are something to be conditioned–to be pushed this way or that way with the proper framing, punished with scolding and call-outs if they’re being sociologically bad, rewarded with praise and attention if they’re expressing the proper selfhood. We begin to act as the masters of our own little Skinner boxes rather than as human beings in rich conversation with other human beings. We begin to think of each person in our feeds as a subject to be punished and rewarded, conditioned and shaped. We stop thinking of our own reasons for believing in a particular candidate, for thinking *or* feeling what we think or feel. More importantly, we stop thinking of the reasons why someone else feels or thinks that way, and stop being curious about those reasons if they’re not being shared or enunciated. The difference in their views starts to be merely exasperating, the manifestation of an enemy sociality. Disagreement starts to be like an untrained puppy making messes in our space: we give treats, we hit noses with rolled-up newspaper. If the puppy doesn’t learn, we euthanize. We start thinking of “frames”, of rhetoric as the way to run our Skinner box. We don’t persuade or explain; we push and pull.

I think there’s a reason why formal debate named “ad hominem” as a logical fallacy. It’s not that arguments are never in fact a result of the personalities or sociologies of the people making them. We’ve all had to argue with people whose arguments are motivated by spite or some other emotional defect, or are defenses of their social privileges. But allowing ourselves the luxury of saying so during the discussion short-circuits our capacity to engage in future conversation–it becomes the default move we make. We start to have conversations only when someone who is a shining paragon of virtue in our eyes steps forward. (And that person increasingly may become someone who is emotionally and sociologically identical to ourselves.) One by one the human beings around us vanish, and so too does evidence, inquiry, curiosity. We end up in a landscape of affirmation and disgust, of reaction to stimuli: as we Skinner-box others, so too are we Skinner-boxed.

Posted in Blogging, Information Technology and Information Literacy, Politics | 3 Comments

Three Thoughts on the Nomination of Hillary Clinton

1. This is simple. If Hillary Clinton loses the election to possibly the most unpopular, polarizing and vulnerable Presidential candidate in American history, I will not blame her. I will, however, blame her supporters, aides and the Democratic party leadership. If she were to lose, it would not be because of her gender or even because of her own relative personal unpopularity. If she were to lose–and I hope very much and believe that she is going to win–it would be because her party is structurally hamstrung and continues to nominate uninspiring technocrats whose main electoral virtues are that they are not the Republican candidate and that they are competent and well-trained. In this sense, gender notwithstanding, Clinton is pretty much the same as Kerry, Gore and Dukakis. (Bill Clinton and Barack Obama are somewhat similar in their actual governance but at least have a more charismatic style of leadership.) If Clinton were to lose, it would not be the Republican Party that is over, but the Democratic Party: that would be a sign that the one major usefulness of the party is finally exhausted. If she loses, we all have a nightmare on our hands, but I for one intend to pause just long enough to remind her supporters that just about the only thing they had to offer was her alleged electability.

2. It is not the loss of Clinton that I fear most, though that prospect is so terrifying that it barely bears thinking of, because I don’t think she’s going to lose. It is her victory that worries me.

Let’s imagine a best-case scenario in which Clinton wins in a landslide, the Democrats regain the Senate, and the House shifts to a near 50-50 balance. A situation not that far off from 2008. What do I think we can anticipate as positive outcomes in that situation?

First, the leaders of the Republican Party will have finally gone off the cliff in their three-decade-long game of chicken with populist anger. That might well interfere with their ability to block or constrain the kinds of useful policy leadership that Clinton is capable of providing. The logjam at the Supreme Court and in other appointments might be broken. Some modest and incremental attempts to address income inequality might be in the offing.

Second, Clinton will continue the use of the executive branch as a kind of spoils system for the political class, meaning that she will largely not look across the country for appointees who bring a mixture of fresh thinking and meritocratic skill to executive leadership, but will instead pull from the same pool of Democratic stalwarts who already know to pitch their ideas within the confines of what the party leadership and its chief pool of financial backers are prepared to consider. That approach often brings competent and admirable people into government, but it also prevents the White House or the Congressional leadership from considering both new specific policy ideas and more fundamental challenges to their entire way of thinking about the nation’s (and the world’s) problems.

Third, this lack of freshness will be particularly evident in national security policy. America’s unending war will continue uninterrupted, and our intelligence and law enforcement agencies will largely have carte blanche to do as they see fit with little to no meaningful oversight.

So that’s not that bad compared to the alternative, certainly. Some good things will happen, some bad things, but the upper bound of the bad is nothing approaching Trump’s possible upper bound of transformative disaster. Why am I so worried about four years of Clinton?

It is not enough, because I think Clinton and all the people surrounding her are very nearly incapable of recognizing, let alone responding to, the actual crisis they will be facing. That crisis is not Islamic militants. It is not political stalemate or Republican obstructionism. It is not police brutality and the scourge of racism. It is not income inequality or a lack of financial regulation. It is not even or only the structural transformation of the global political economy through technological change and social reorganization. All of those seemingly disparate things are only symptoms of a general problem.

The general problem is that the modern liberal nation-state and its characteristic institutions are simply no longer capable of delivering on their baseline promises and possibilities to any national population anywhere. Even in nations that appear by most measures to be successful, the state withers due to its lack of vision. Liberalism cannot handle the extension of its rights to all who are entitled, and its major alleged champions increasingly endorse depraved forms of military and economic illiberalism in the name of its defense. The brief moment of reform in which capital seemed to be harnessed to social democracy is very nearly over, and the difference between illicit and licit economies now seems paper-thin at best. Very little policy gets made because it’s the right thing to do; most policy is about transfer-seeking. Every dollar is spoken for. Every play is a scrum in the middle that moves the ball inches, never yards. Political elites around the world either speak in laughably dishonest ways about hope and aspiration or stick to grey, cramped horizons of plausibly incremental managerialism. Young people all around the world recognize that there is little hope of living in a better or more comfortable or more just world than their parents and grandparents did, and that they must often live every day with the possibility of losing whatever they’ve gained, that they are one lost job or sickness away from falling without a safety net.

In the United States, what this all means in a more immediate sense is that Donald J. Trump is only the beginning. He may have a peculiarly American cast to his authoritarian populism, but he has his counterparts elsewhere in the world, many of whom have enjoyed or threaten to enjoy similar electoral success or other access to power. We reach out for analogies, fascism most prominently, but those are useful only in suggesting the dangerousness of our moment.

So I worry about a Clinton Presidency because it is at best likely to be a stalling action in the face of this gathering storm, and at worst may well accelerate and aggravate its arrival. Trump is only the herald; his successor will likely be a more fearsome, skillful and dangerously plausible version who will speak directly to the spirit of desperation in the hearts of many. Clinton doesn’t understand what’s out there in the world. She will allow herself to be lulled by the notion that the election of a woman on the heels of the election of an African-American is progress and that all opposition to them is just the dead-end reactionary impulses of a dying order. Her vision will be clouded by a swarm of blandishing pundits whose understanding of social change and political challenge is confined to horse-race predictions and the delivery of favors and services to various clients.

3. Let’s be optimistic and suppose that Clinton turns out to have depths I don’t suspect, or that the leadership of the Democratic Party and various liberal supporters don’t just spend two years schadenfreuding themselves about the Republicans and then start to panic when it becomes clear that Trump and Sanders voters were the canaries in the coal mine, a sign of profound alienation from the way things are. What could she actually do if my description of the scale and character of crisis were real?

The first thing she or other leaders could do is simply start talking honestly and straightforwardly about these problems–with a bit of real passion and anger mixed in. No more polling. No more Bill Clinton-style searching for small but popular initiatives that can get a favorable day or two in the news cycle. Only connect. That would be the first revolution. It’s what they don’t do at Davos or G8 meetings. It’s what they don’t do in Brussels or inside the Beltway. Start talking about what’s really going on out there and start talking to people in ways that are about what’s really going on.

The second thing she could do is talk about and explore, again with passion, anger and fearlessness, how a system of checks and balances has turned into a system of chokepoints and barriers. It’s not just Republican obstructionism, though that has contributed mightily. What we need is a genuine investigation of and national conversation about how thinking about the future turned so small and cramped, unless it’s jackass billionaires who want to be immortal and live on Mars. When people voice their frustration with the political system and register low approval ratings for almost the entirety of the political class, that’s what they’re responding to: that everything that requires a mixture of vision, will and competency gets sandbagged and obfuscated by people who either have something to gain from inaction or who are hoping to capture any action to their own exclusive advantage, whether that’s crafting a response to the Zika virus, dealing with crumbling infrastructure, or rebuilding an economy that works for most people. Most of us can see plainly what needs to be done on a variety of fundamental challenges in front of us–only a few of them are genuinely and irresolvably difficult in both moral and technical terms. The mystery, which a real leader might explore and confront, is what stands in the way of the doing. In many cases, it is the very systems that we presently believe exist to solve problems.

That’s all. I don’t expect magic solutions from anyone, especially Clinton. But I think visionary leadership now might simply be speaking to the scale and nature of the human crisis of the 21st Century, rather than trying to beat a few more years out of a queasy admixture of technocratic managerialism and a sort of insincere, half-hearted invocation of New Deal liberalism shorn of all passion or promises. It’s enough to recognize the crisis and speak to it. That alone is startling, and accounts in some measure for both Trump and Sanders having the success that they’ve had.

Just being in charge for the next four years? That’s enough to guarantee losing control of the future entirely, I fear.

Posted in Politics | 26 Comments

On the Arrival of Rough Beasts

One of the things I find most interesting about the history of advertising is the long-running conflict between the “creatives” and their more quantitative, data-driven opponents within ad agencies. It’s a widespread opposition between a more humanistic, intuitive, interpretative style of decision-making and professional practice and a more rules-driven, empirical, formalistic approach.

The methodical researchers are generally going to create advertisements and construct marketing campaigns by looking at the recent past and assuming that the near-term future will be the same. In an odd way, I think their practices have been the analog equivalent to much of the algorithmic operations of digital culture, trained through the methodical tracking of observable behavior and the collection of very large amounts of sociological data. If you know enough about what people in particular social structures have done in response to similar opportunities, stimuli or messages, the idea goes, you’ll know what they will do the next time.

My natural sympathies, however, are with the creatives. The creatives are able to do two things that the social science-driven researchers can’t. They can see the presence of change, novelty and possibility, even from very fragmentary or implied signs. And they can produce change, novelty and possibility. The creatives understand how meaning works, and how to make meaning. They’re much more fallible than the researchers: they can miss a clue or become intoxicated with a beautiful interpretation that’s wrong-headed. They’re restricted by their personal cultural literacy in a way that the methodical researchers aren’t, and absolutely crippled when they become too addicted to telling the story about the audience that they wish was true. Creatives usually try to cover mistakes with clever rhetoric, so they can be credited for their successes while their failures are forgotten. However, when there’s a change in the air, only a creative will see it in time to profit from it. And when the wind is blowing in a stupendously unfavorable direction, only a creative has a chance to ride out the storm. Moreover, creatives know that the data that the researchers hold is often a bluff, a cover story, a performance: poke it hard enough and its authoritative veneer collapses, revealing a huge hollow space of uncertainty and speculation hiding inside of the confident empiricism. Parse it hard enough and you’ll see the ways in which small effect sizes and selective models are being used to tell a story, just as the creatives do. But the creative knows it’s about storytelling and interpretation. The researchers are often even fooling themselves, acting as if their leaps of faith are simply walking down a flight of stairs.

This is only one manifestation of a division that stretches through academia and society. I think it’s a much more momentous case of “two cultures” than an opposition between the natural sciences and everything else. If you want to see this fault line somewhere else besides advertising, how about in media-published social analysis of this year’s presidential election in the United States? Glenn Greenwald and Zaid Jilani are absolutely right that not only have the vast majority of analysts palpably misunderstood what was happening and what was going to happen, but that most of them are now unconvincingly trying to bluff once again that the data make sense, the models are still working, and the predictions are once again reliable.

The campaign analysts and political scientists who claim to be working from rock-solid empirical data will never see a change coming until it is well behind them. Up to the point of its arrival, it will always be impossible, because their models and information are all retrospective. Even the equivalent of the creatives in this arena are usually wrong, because most of them are not really trying to understand what’s out there in the world. They’re trying to make the world behave the way they want it to behave, and they’re trying to do that by convincing the world that it’s already doing exactly what the pundit wants the world to do.

The rise of Donald Trump is only the most visible sign of the things that pundits and professors alike do not understand about which way the wind is blowing. For one, Trump’s rise has frequently been predicted by one set of intuitive readers of American political life. Trump is consequence given flesh, the consequence that some observers have said would inevitably follow from a relentless disregard for truth and evidence that’s been thirty years in the making, from a reckless embrace of avowedly instrumental and short-term pursuit of self-interest, from a sneering contempt for consensus and shared interests. He’s the consequence of engineering districts where swing votes don’t matter and of allowing big money to flood the system without restraint. He’s what many intuitive and data-driven commenters have warned might happen if all that continued. But the election analysts can’t think in these terms: the formal and understood rules of the game are taken to be unchanging. The analysts know what they know. The warning barks from the guard-dogs are just an overreaction to a rustle in the leaves or a cloud over the moon.

But it’s more than that. The pundits and professors who got it wrong on Trump (and who are, I think, still wrong in understanding what might yet happen) get it wrong because the vote for Trump is a vote against the pundits and professors. The political class, including most of the Republican Party but also a great many progressives, has gotten too used to the idea that it knows how to frame the narrative, how to spin the story, how to massage the polls, how to astroturf or hashtag. So many mainstream press commenters are now trying to understand why Trump’s alleged gaffes weren’t fatal to his candidacy, and they’re stupidly attributing that to some kind of unique genius on Trump’s part. The only genius that Trump has in this respect is understanding what was going on when his poll numbers grew rather than dropped after those putative gaffes. The content of those remarks was and remains secondary to his appeal. The real appeal is that he doesn’t give a shit what the media says, what the educated elite say, what the political class says. This is a revolt against us–against both conservative and progressive members of the political class. So of course most of the political class can’t understand what’s going on and keep trying to massage this all back into a familiar shape that allows them to once again imagine being in control.

Even if Trump loses, and I am willing to think he likely will by a huge margin, that will happen only because the insurgency against being polled, predicted, dog-whistled, manipulated and managed into the kill-chutes that suit the interests of various powers-that-be has not yet coalesced into a majority, and moreover, is riven internally by its own sociological divisions and divergences. But even as Trump was in some sense long predicted by the gifted creatives who sift the tea leaves of American life, let me also predict another thing: that if the political class remains unable to understand the circumstances of its own being, and if it is not able to abandon its fortresses and silos, the next revolt will not be so easily contained.

Posted in Academia, Oath for Experts, Oh Not Again He's Going to Tell Us It's a Complex System, Politics | 1 Comment

Dramatic Arc

Me at the beginning of a class meeting where I’ve assigned one of my favorite books.

Me realizing that maybe a quarter of the class read it with any real attention despite the fact that I already said it’s going to be an essay question on the final.

Me inside as we wind down the class.

Posted in Academia, Swarthmore | 6 Comments

Cost Control Is a Progressive Value

If you are a long-time reader of this increasingly intermittent blog, you know I have some recurrent frustrations and fascinations that I return to again and again. Sometimes in fact the blog is intermittent because I am afraid I am becoming a bore on those themes.

One of these recurrent issues for me is the financial sustainability of higher education. On one hand, I aspire to some skepticism about the neoliberal attempt to produce scarcity where it does not need to exist–an attempt rooted in the belief that the mindset of scarcity produces proper decisions about value, that homo economicus behaves wisely whereas people who feel they live amid plenty and security waste resources and underproduce value. It’s not just that the moral underpinnings of that view are barren and repellent; it is also plainly empirically untrue, whether we’re talking about universities or companies. Realism about limits and constraints is a good thing, but artificially producing constraints in order to compel a winner-take-all struggle and squeeze productivity out of people has already destroyed much of what the 20th Century usefully accomplished towards the forging of saner, kinder, and more richly meaningful societies.

But the opposite of phony scarcity is not “we’re rich, so we can do whatever we want”–a view that I sometimes hear from local and national colleagues and students, that in order to reject scarcity we must never be deterred by or involved in determinations of the financial and material limits of our resources. That, for one, is one of the major ways that tenure-track faculty in many institutions became at least passively complicit in the casualization of academic labor. Acting as if one’s own teaching load or service obligations or labor is a matter of strictly personal or departmental negotiation with an administrative head, and as if the larger implications of the outcome of those negotiations are somebody else’s business, is how in some cases we ended up with curricula that dictated that matriculants had to take courses that couldn’t possibly be staffed out of the available tenure-track labor force. Faculty at some institutions participated slowly and incrementally in building a curriculum that could never possibly be staffed by even the wealthiest institution, and then blamed administrators for the shift to impoverished, marginalized and excluded laborers.

Administrators are of course to blame in many universities, for doing a great deal that has made all sorts of financial situations worse. Faculty and students may sometimes push for and be appallingly naive about institutional growth, but one of the basic reasons to have academic administrations in the first place is to keep a university or college close to its essential mission, and to resist relentlessly additive expansion of that mission.

The late Marshall Berman’s book All That Is Solid Melts Into Air has a marvelous analysis of Faust as the “primal Growthman”, as the quintessential example of a modern archetype, the person dedicated to a vision of modernity as not only ceaselessly mutable but dedicated to the replacement of all that we have with more than we have, with the making of all things into bigger things, with accumulation and expansion. It’s not a surprise to me that the most heedless and energetic Fausts of our own times are now restlessly looking to “disrupt” anything that seems to expand too slowly. Some of that’s about money, about financial Alexanders who weep because there are no worlds left to conquer, no investments left to make. Some of it is very nearly a religious or sacred belief: that whatever seems to stand still is an offense. No wonder too that people like Elon Musk are shilling for Mars colonies and asteroid mines. Faust is hungry for the same reason Cookie Monster is: tired of waiting for a batch from the oven, he’s eaten the spoon, the pan, the mixer, the table. The cookies are as good as eaten already: they were eaten before they were even mixed.

Do not feed the Faust. That’s really what reaching sustainability is going to be about. At every moment, in every conversation, in every plan and meeting and process, any progressive academic who pays even the remotest attention to sustainability in higher education (or elsewhere) is invariably going to ask: what can we repurpose or reuse? If we want novelty, or change, or difference, if we believe in originality and innovation, what can we do differently? So in that sense, one thing we should always be alarmed about is any sign that administrators (or colleagues) have found a new source of revenue. Even if that’s about making up cuts in public support, which is how University of California administrators defend bringing in more out-of-state students with relaxed admission standards, new revenue (or even replaced revenue from a new source) is always imagined as temporary but effectively becomes permanent from the moment it is integrated into an operational budget. Whatever was done to secure new revenue will have to be done forever after. Growth quickly requires more growth unless it’s limited, finite and finished all in a single momentary flash, unless it is only for a single specific purpose.

I think in a way most of us know it, and that’s why academics are so temperamentally conservative. We know that whatever new things we do will eventually be at the cost of something we are already doing, unless we sign on as little apprentice Fausts. But that’s the harder habit that sustainability in all our lives will eventually call upon us to accept and even embrace: to live impermanent lives. We will need to build yurts where now we build fortresses, to move on as the intellectual seasons change. I think we can offer working lives of security and satisfaction within an academy that doesn’t grow. Impermanence in the work we do and the missions we accept is not precarity. But the only way we get there is to accept that if we don’t talk about cost and limits and budgets in this spirit, no one else in our present worlds will. That kind of talk cannot be outsourced, it cannot be deferred, it is not someone else’s business. When temptation comes in the form of bigger and more (though not in the form of restoration and preservation, which are sorely needed), we’ll have to be able to turn it down.

Posted in Academia, Politics | 2 Comments

A Chance to Show Quality

Romantic ideals of originality remain deeply embedded in how we recognize, cultivate and reward merit in most of our selective systems of education, reputation and employment. In particular we read for the signs of that kind of authentic individuality in writing that is meant to stand in for the whole of a person. Whether it’s an essay for admission to college, a cover letter for a job, an essay for the Rhodes or Fulbright, an application for research funding from the Social Science Research Council or the National Science Foundation, we comb for the signs that the opportunity-seeker has new ideas, has a distinct sensibility, has lived a life that no one else has lived. Because how else could they be different enough from all the other worthies seeking the opportunity or honor so as to justify granting them their desires?

Oh, wait, we also want to know, almost all of the time, whether the opportunity-seeker is enough like everyone else that we can relate their talents, ideas, capabilities, plans and previous work to the systems which have produced the applicants. We want assurances that we are not handing resources, recognition and responsibility to a person so wholly a romantic original that they will not ever be accountable or predictable in their uses. We want to know that we are selecting for a greatness that we already know, a merit that we already approve of.

This has always been the seed that grows into the nightmare of institutions, that threatens to lay bare how much impersonality and distance intrudes upon decisions that require a fiction of intimacy. Modern civic institutions and businesses lay trembling hands on their bankrolls when they think, however fleetingly, that there is a chance that they’re getting played for fools. That they are dispensing cheese to mice who have figured out what levers to push. That when they read the words of a distinctive individual, they are really reading the words of committees and advisors, parents and friends. That they are Roxane swooning over Christian rather than Cyrano, or worse, that they are being catfished and conned.

The problem is that when we are making these choices, which in systems of scarcity (deliberately produced or inevitably fated) must be made, we never really decide what it is that we actually value: unlikeness or similarity, uncertainty or predictability, originality or pedigree. That indecision more than anything else is what makes it possible for people to anticipate what the keepers of a selective process will find appealing. Fundamentally, that boils down to: a person with all the qualifications that all other applicants have, and a personal experience that no one else could have had but that has miraculously left the applicant even more affirmed in their qualifications. Different in a way that doesn’t threaten their sameness.

I’ve been involved in a number of processes over the years where those of us doing the selecting worried about the clear convergence in some of the writing that candidates were doing. We took it to be a sign that some candidates had an advantage that others didn’t, whether that was a particularly aware and canny advisor or teacher, or it was some form of organized, institutional advice. I gather that there are other selective institutions, such as the Rhodes Foundation, that are even more worried, and have moved to admonish candidates (and institutions) that they may not accept advice or counsel in crafting their writing.

The thing is, whenever I’ve been in those conversations, it’s clear to me that the answer is not in the design of the prompt or exercise, and not in the constraints placed on candidates. It’s in the contradictions that selective processes hold inside themselves, and in the steering currents that tend to make them predictable in their tastes. When you try to have it all, to find the snowflake in the storm, and yet also prize the snowfall that blankets the trees and ground with an even smoothness, you are writing a human form of algorithm, you are crafting a recipe that it takes little craft to divine and follow. The fault, in this case, lies in us, and in our desires to be just so balanced in our selection, to stage-manage a process year in and year out so that we get what we want and yet also want what we get.

Maybe that was good enough in a time with less tension and anxiety about maintaining mobility and status. But I suspect the time is coming where it will not be. Not because people seek advantage, but because anything that’s predictable will be something relentlessly targeted by genuine algorithms. Unpredictability is never a problem for applicants or advisors, always for the people doing the selection or the grading or the evaluation. If you don’t want students to find a standard essay answer to a standard essay prompt, you have to use non-standard prompts. If you don’t want applicants to tell you the very moving story of the time they performed emergency neurosurgery on a child in the developing world using a sterilized safety pin and a bottle of whisky, you have to stop rewarding applicants who tell you that story in the way that has previously always gotten your approval. If what we want is genuine originality, the next person we choose has to be different from the last one. If what we want is accomplished recitation of training and skills, then we look for the most thorough testing of that training. When we want everything, it seems, we end up with performances that very precisely thread the needle that we insistently hold forth.

Posted in Academia, Information Technology and Information Literacy, Swarthmore | 3 Comments

Opt Out

There is a particular kind of left position, a habitus that is sociologically and emotionally local to intellectuals, that amounts in its way to a particular kind of anti-politics machine. It’s a perspective that ends up with its nose pressed against the glass, looking in at actually-existing political struggles with a mixture of regret, desire and resignation. Inasmuch as there is any hope of a mass movement in a leftward direction in the United States, Western Europe or anywhere else on the planet, electoral or otherwise, I think it’s a loop to break, a trap to escape. Maybe this is a good time for that to happen.

Just one small example: Adam Kotsko on whether the Internet has made things worse. It’s a short piece, and consciously intended as a provocation, as much of his writing is, and full of careful qualifiers and acknowledgements to boot. But I think it’s a snapshot of this particular set of discursive moves that I am thinking of as a trap, moves that are more serious and more of a leaden weight in hands other than Kotsko’s. And to be sure, in an echo of the point I’m about to critique, this is not a new problem: to some extent this is a continuous pattern that stretches back deep into the history of Western Marxism and postmodernism.

Move #1: Things are worse now. But they were always worse.

Kotsko says this about the Internet. It seems worse but it’s also just the same. Amazon is just the Sears catalogue in a new form. Whatever is bad about the Internet is an extension, maybe an intensification, of what was systematically bad and corrupt about liberalism, modernity, capitalism, and so on. It’s neoliberal turtles all the way down. It’s not worse than a prior culture and it’s not better than a prior culture. (Kotsko has gone on to say something of the same about Trump: he seems worse but he’s just the same. The worst has already happened. But the worst is still happening.)

I noted over a decade ago the way that this move handicapped some forms of left response to the Bush Administration after 9/11. For the three decades before 9/11, especially during the Cold War, many left intellectuals in the West practiced a kind of High Chomskyianism when it came to analyzing the role of the United States in the world, viewing the United States as an imperial actor that sanctified torture, promoted illiberalism and authoritarianism, acted only for base and corrupt motives. Which meant in some sense that the post-9/11 actions of the Bush Administration were only more of the same. Meet the new boss, same as the old boss. But many left intellectuals wanted to frame those actions as a new kind of threat, as a break or betrayal of the old order. Which required saying that there was a difference between Bush’s unilateralism and open sanction of violent imperial action and the United States during the Cold War and the 1990s and that the difference was between something better and something worse. Not between something ideal and something awful, mind you: just substantively or structurally better and substantively or structurally worse.

This same loop pops up sometimes in discussions of the politics of income inequality. To argue that income inequality is so much worse today in the United States almost inevitably requires seeing the rise of the middle-class in postwar America as a vastly preferable alternative to our present neoliberal circumstances. But that middle-class was dominated by white straight men and organized around nuclear-family domesticity, which no progressive wants to see as a preferable past.

It’s a cycle visible in the structure of Howard Zinn’s famous account of American history: in almost all of Zinn’s chapters, the marginalized and the masses rise in reaction to oppression, briefly achieve some success, and then are crushed by dominant elites, again and again and again, with nothing ever really changing.

It’s not as if any of these negative views of the past are outright incorrect. The U.S. in the Cold War frequently behaved in an illiberal, undemocratic and imperial fashion, particularly in the 1980s. Middle-class life in the 1950s and 1960s was dominated by white, straight men. The problems of culture and economy that we identify with the Internet are not without predicate or precedent. But there is a difference between equivalence (“worse now, worse then”) and seeing the present as worse (or better) in some highly particular or specific way. Because the latter actually gives us something to advocate for. “Torture is bad, and because it’s bad, it is so very very bad to be trying to legitimate or legalize it.” “A security state that spies on its own people and subverts democracy is bad, and because it’s bad, it’s so much worse when it is extended and empowered by law and technology.”

When everything has always been worst, it is fairly hard to mobilize others–or even oneself–in the present. Because nothing is really any different now. It is in a funny kind of way a close pairing to the ahistoricism of some neoliberalism: that the system is the system is the system. That nothing ever really changes dramatically, that there have been in the lives and times that matter no real cleavages or breaks.

Move #2: No specific thing is good now, because the whole system is bad.

In Kotsko’s piece on the Internet, this adds up to saying that there is no single thing, no site or practice or resource, which stands as relatively better (or even meaningfully different) apart from the general badness of the Internet. Totality stands always against particularity, system stands against any of its nodes. Wikipedia is not better than Amazon, not really: they’re all connected. Relatively flat hierarchies of access to online publication or speech are not meaningful because elsewhere writers and artists are being paid nothing.

This is an even more dispiriting evacuation of any political possibility, because it moves pre-emptively against any specific project of political making, or any specific declaration of affinity or affection for a specific reform, for any institution, for any locality. Sure, something that exists already or that could exist might seem admirable or useful or generative, but what does it matter?

Move #3: It’s not fair to ask people how to get from here to a totalizing transformation of the systems we live under, because this is just a strategy used to belittle particular reforms or strategies in the present.

I find the sometimes-simultaneity of #2 and #3 the most frustrating of all the positions I see taken up by left intellectuals. I can see #2 (depressing as it is) and I can see #3 (even when it’s used to defend a really bad specific tactical or strategic move made by some group of leftists) but #2 and #3 combined are a form of turtling up against any possibility of being criticized while also reserving the right to criticize everything that anyone else is doing.

I think it’s important to have some idea about what the systematic goals are. That’s not about painting a perfect map between right now and utopia, but the lack of some consistent systematic ideas that make connections between the specific campaigns or reforms or issues that draw attention on the left is one reason why we end up in “circular firing squads”. But I also agree that it’s unfair to argue that a specific reform or ideal is not worth taking up if it can’t explain how that effort will fix everything that’s broken.

Move #4: It’s futile to do anything, but why are you just sitting around?

That is, this is another form of justifying a kind of supine posture for left intellectuals–a certainty that there is no good answer to the question “What is to be done?”, but that the doing of nothing by others (or their preoccupation with anything but the general systematic brokenness of late capitalism) is always worth complaining about. Indeed, that the complaint against the doing-nothingness of others is a form of doing-something that exempts the complainer from the complaint.


The answer, it seems to me, is to opt out of these traps wherever and whenever possible.

We should historicize always and with specificity. No, everything is not worse or was not worse. Things change, and sometimes neither for better nor worse. Take the Internet. There’s no reason to get stuck in the trap of trying to categorize or assess its totality. There are plenty of very good, rich, complex histories of digital culture and information technology that refuse to do anything of the sort. We can talk about Wikipedia or Linux, Amazon or Arpanet, Usenet or Tumblr, without having to melt them into a giant slurry that we then weigh on some abstracted scale of wretchedness or messianism.

If you flip the combination of #2 and #3 on their head so that it’s a positive rather than negative assertion, that we need systematic change and that individual initiatives are valid, then it’s an enabling rather than disabling combination. It reminds progressives to look for underlying reasons and commitments that connect struggles and ideals, but it also appreciates the least spreading motion of a rhizome as something worth undertaking.

If you reverse #4, maybe that could allow left intellectuals to work towards a more modest and forgiving sense of their own responsibilities, and a more appreciative understanding of the myriad ways that other people seek pleasure and possibility. That not everything around us is a fallen world, and that not every waking minute of every waking day needs to be judged in terms of whether it moves towards salvation.

We can’t keep saying that everything is so terrible that people have got to do something urgently, right now, but also that it’s always been terrible and that we have always failed to do something urgently, or that the urgent things we have done never amount to anything of importance. We disregard both the things that really have changed–Zinn was wrong about his cyclical vision–and the things that might become worse in a way we’ve never heretofore experienced. At those moments, we set ourselves against what people know in their bones about the lives they lived and the futures they fear. And we can’t keep setting ourselves in the center of some web of critique, ready to spin traps whenever a thread quivers with movement. Politics happens at conjunctures that magnify and intensify what we do as human beings–and offer both reward and danger as a result. It does not hover with equal anxiety and import around the buttering of toast and the gathering of angry crowds at a Trump rally.

Posted in Blogging, Information Technology and Information Literacy, Oh Not Again He's Going to Tell Us It's a Complex System, Politics | 4 Comments

#Prefectus Must Fall: Being a True History of Uagadou, the Wizarding School

So there’s been a spot of disagreement about how to think about state systems in Africa in relationship to J.K. Rowling’s world-building for her Harry Potter novels. I feel a bit bad about perceptions that I was being unfair, but I also mostly continue to feel that this is just the latest round in a long-standing interdisciplinary tension (arguably all the way back into Enlightenment philosophy) about what exactly can be compared about human societies and on what basis the comparison ought to be made. I think that’s a discussion in which African societies have often been described as having a deep history of not having what Europe has, with the comparison serving to explain disparities and inequalities in the present-day. I am not the first to react strongly to that mode of comparison.

But I also do feel that it’s important in some sense not to have a dispute that is both scholarly and political completely overwhelm the possibility of giving useful guidance to J.K. Rowling and other creators who work with fantasy or speculative fictions. In general, I would like to see specialists in African history and anthropology be prepared not only to provide useful, digestible knowledge to fiction writers but also to non-specialists. Which means, I think, showing how it could be possible to draw upon specific African histories and experiences to create and imagine fictions and stories that incorporate African inspirations rather than to treat Africa as a zone of exclusion because it’s too difficult or touchy.

So: a bit of fanfiction, intended to demonstrate how to subtly rework what Rowling has already said about her wizarding world.



For a month now the instructors at Uagadou have dutifully assembled to ward off attempts by students, particularly those in Ambatembuzi House, to cast kupotea on the statue of Peter Prefectus that has been at the foot of the Great Stairway for the past sixty years.

Prefectus’ own nkuni spirit has joined the teachers in defending his statue, though as always it is hazy and distracted, only half here, half wandering indistinctly in the halls of England’s Ministry of Magic. We say that they must allow the spell to be cast: let him go home once and for all. There are few left in Prefectus House, anyway. The white wizards who still live in Africa go to Hogwarts, Ilvermorny or Durmstrang, as do some number of Africans.

Prefectus Must Fall. Though we students love Uagadou and what we learn here, it is time for this school to be a truly African school. Not the “African” of silly affectations like using hands instead of wands that a few teachers introduced forty years ago in an attempt to get away from Prefectus’ wholesale importation of the curriculum of Hogwarts! Let us rediscover the real history of African magic, of the many magical styles and ways of learning from Africa!

We know the truth now. This old, rotting, half-real castle shivering in the mountains isn’t a thousand years old, it’s 110 years old. Or more to the point, it’s a thousand-year-old school that was stolen and stuffed inside an imposter’s cheap recreation of the school that never let him be a teacher. Peter Prefectus was a fourth-rate wizard stuck in a basement of the British Ministry of Magic who decided that if he couldn’t teach at Hogwarts, he’d go off to Africa just like the Muggle officer Harry Johnston and make a Hogwarts there.

There was a school here once, back before the kingdom of Bunyoro rose. It wasn’t for all Africans everywhere, but Swahili and Ituri and Khoisan wizards from the coast and the jungle and the forests all came. People from the shores of the big lakes came, people from the hills and savannah came. That’s where Peter Prefectus built his fake Hogwarts, where that old school was. The leaders of that ancient school foolishly let him and helped lift the stones and cast the spells. They felt they needed to understand what was happening, and to learn the magics that Prefectus offered, but all they did was sell out our heritage!

They don’t tell you when you get sorted that Prefectus was an incompetent who had the cheek to believe that his teachers and pupils were incapable of any real magic anyway. He never learned an African language, not one, but made the students learn spells like “expelliarmus” and “impedimenta”. He hired other European wizards and let them bully and hurt and even kill the Africans who came there. We had wizards like Grindelwald and Voldemort here too, but they were in charge and no one came to the rescue, not for us.

We know the truth. Prefectus must fall.

Prefectus stole two schools! The ancient one of the lakes, and then he had the cheek to try to steal a name from near another old African place of magical learning, the school which today still exists at Kumbi Saleh in the ruins of Ghana. Hard times for it now, harried by sinister wizards hiding in the Sahara who believe that all magical schools should be destroyed. That is another reason Prefectus must fall: it is time for Uagadou to do its part in helping other African wizards in their struggles. Kumbi Saleh should not have to wait for a half-hearted delegation of wizards from Beauxbatons and Durmstrang to save it from attack. We should not hear any longer from our headmaster and teachers that it is “against tradition” for Uagadou to play a role.

Uagadou, even in disrepair, is still wealthier than our real comrades at the ancient academies in Kumbi Saleh and Axum. We should help them and work with them and learn from their wisdom about wizarding. We should be working with the “moving school” of Eshu, the secret society of West African wizards who have no castle or building, but who move tirelessly from one site of ancient power to the next, from Old Oyo to Benin to Kumasi, walking the ways that they know. We should talk to the small schools that meet all over the continent, and reach out to wizards too poor or endangered to think of coming here. Uagadou should train far more Africans than it does, and stop just being for a small handful of families made powerful by their dealings with the European wizards.

Prefectus Must Fall! Unite to liberate our school and our peoples! Leave off the lies, cast away the glossy brochures that arrive by Dream Messengers to entice you here. Face the truth!

Posted in Africa, Sheer Raw Geekery | 8 Comments

On Uagadou, the African Wizarding School

I have a good deal to say on the plausibility of a wizarding school in J.K. Rowling’s fantasy world, and the first thing would be that I should know better than to send Twitter to do a blog’s job, I guess. There is a good deal wrong with Henry Farrell and Chris Blattman’s defense of Rowling’s imagination, to some extent more wrong than Rowling herself. You may from the outset roll your eyes and say, “It’s imaginary, let it go” and I hear you, but in fact the kinds of imaginary constructions of African societies and African people that operate in fantasy, science-fiction and superhero universes are actually rather instructive guides to how Western-inflected global culture knows and understands the histories of African societies as a history of absence, lack or deficit rather than as histories of specific presence, as having their own content that is in many ways readily knowable.

Let’s start from the very beginning, with Rowling’s expansion of her world-building in Harry Potter. When she recently imagined what the whole world in her fantasy universe looks like, what did she say about it?

1. That most nations in her world do not have their own wizarding schools. Most wizards are “home-schooled”.
2. That distance education (“correspondence courses”) is also used to train wizards.
3. That the eleven wizarding schools that do exist in the world share some common characteristics that derive from the common challenges and affordances of magic. They tend to be remote, often in mountainous areas, in order to insulate themselves from Muggles, in order to attempt to stay out of wizard politics as much as possible, and to maintain some independence from both Muggle and wizard governance.
4. That there is an International Confederation of Wizards to whom a budding wizard can write (via owl) to find out about the nearest wizard school.
5. So far, Rowling has announced that there are three wizarding schools in Europe, one in North America (on the East Coast), one in Japan, one in Brazil (in the rainforest), and one in Africa called Uagadou, pronounced Wagadu. As far as I know the others aren’t announced yet.

What of Uagadou?

1. It’s pronounced Wa-ga-doo. Farrell and Blattman take this to be a reference taken from the place of the same name associated with the ancient empire of Ghana. (Which was located in what is now Mali and Mauritania in West Africa.)
2. There are smaller wizarding schools in Africa, but Uagadou has an “enviable” international reputation and is a thousand years old.
3. It enrolls students from all over the continent.
4. Much magic, maybe all magic, comes from Africa.
5. Wands are European inventions; African wizards just use their hands.
6. Uagadou doesn’t use owls for messages, it uses Dream Messengers.

In response to Twitter complaints that this is just more “Africa is a country” thinking, where the entire continent gets one school that is an undifferentiated mass of African-ness, without specific location, Rowling has responded first to say, “Students from all over” and second, that Uagadou is in Uganda, in the “Mountains of the Moon”, by which she probably means the Rwenzori Mountains in western Uganda.


Farrell and Blattman set out to defend Rowling, saying that it is plausible that all of sub-Saharan Africa would only have one wizarding school. (I’m guessing that before she’s done, there will be a wizarding school in Egypt or otherwise near to North Africa, so let’s leave that aside.) Farrell and Blattman do so by saying that sub-Saharan Africa didn’t have a “state system”. In an initial tweet, I expressed my irritation by noting that there were states in Africa, to which Farrell replied that their article concedes that there were. Just that material environments “conspired against” state development until colonialism, and that the few states that did exist were far apart, and thus that there was no state system, no competitive relationship between states, and thus that states did not become strong through such competition, unlike in Europe or Japan, where there were more rivalrous relationships between states because of the relative scarcity of land.

I think I am right to say that Farrell and Blattman’s acknowledgement that there were states is essentially prophylactic, meant to head off precisely the kind of Twitter objection I offered. The substance of their piece is still this: Africa had an absence of something that Europe had a presence of, and that this is what makes Rowling’s fantasy a historically plausible one, that rivalrous states that form a state system that is about control over a scarce resource (land) could lead to having multiple wizarding schools, and that Africa’s absence of these things means that having only one makes sense too. “There has been a relatively solid state” in England for a thousand years, they say, so of course Hogwarts. Uagadou, in contrast, must have formed in the absence of a state. And maybe it shares a name with a place that was thousands of miles away because perhaps “the school began in a faraway territory, before it hid itself in the remote mountains of central Africa, fleeing slave raiders and colonial powers”.


I have on occasion expressed frustration with Africanists for insisting that non-specialists must go deep inside the particulars of specific African histories in order to win the right to talk about them. And the similar inclination of many practicing historians to view large-scale comparative history or the more universalist aspirations of many social scientists with suspicion. But this is a case where some of that suspicion is warranted, I think. Partly because Farrell and Blattman insist on the tangible historical plausibility of Uagadou in Rowling’s fantasy world and they then toss in just enough history to be tangibly wrong.

Here’s the thing. First, if I were going to construct what is essentially a fantasy counterfactual of a relationship between the place Wagadu and some other place in sub-Saharan Africa, that a group of wise and knowledgeable wizards moved from an important trading community in the empire of Ghana to somewhere else in Africa, I’d at least stick to historically plausible routes of movement and connection. Pairing Wagadu with the eastern side of the Rwenzori Mountains is roughly like imagining that an ancient group of Irish wizards relocated to Ukraine in order to get away from British landlords. It’s very nearly random, and that’s the problem. It’s exquisitely well-meaning of Rowling to want to imagine Uagadou in the first place, and to draw respectfully on African history for the name of the place. But it doesn’t make sense in terms of very real histories that can be described for what they actually were, not in terms of some abstracted absence in comparison to Europe.

Equally, I’d wonder at the counterfactual that has Uagadou moving a thousand years ago, before the trans-Atlantic slave trade, and at the height of state-building (even state system building) in the upper Niger and Sahel. It’s not as if the idea of great institutions of learning and teaching built through the revenues of trade are fantasies in that region of West Africa at that specific time: there were real institutions of that kind built in Timbuktu and Gao at exactly that moment which depended on very real long-distance connections between Muslim polities in Egypt and North Africa and the major states and polities of the West African interior. Why would Uagadou want to get away from all of that in 1016 CE? Even if Farrell and Blattman want Africa’s supposed lack of state systems to be the magic variable that produces more than one wizarding school, Uagadou’s birthplace has exactly that. And if even the wise wizards of Uagadou decided they had to leave, why the east side of the Rwenzori mountains, to which the peoples of their home region had no links whatsoever?

But hey, at least Farrell and Blattman’s defense is intact in the sense of western Uganda not having a state system, right? That would have made Uagadou different than other wizarding schools coping with state systems! Except that the region between western Lake Victoria and the Rwenzoris was another place in sub-Saharan Africa where multiple states and polities with sometimes rivalrous relationships go back at least three or four hundred years. If Uagadou was really trying to move to a place where there weren’t very many human beings or there weren’t states or there weren’t state systems (or it arose in such a place, if we discard the relationship between the name Wagadu and Uagadou), western Uganda isn’t the place to put the imaginary school.


Ultimately this is why I think Farrell and Blattman’s defense of Rowling is more problematic than Rowling herself. I think Rowling is trying to do the right thing, in fact, to include Africa and Africans in her imaginary world, and she’s not just reaching for lazy H. Rider Haggard or Edgar Rice Burroughs tropes of cities in jungles and excitable natives yelling Ungowa! Bwana! But the fact is that the way she picks up a name to stand in for a more respectful conception of Africanity still underscores the degree to which the history of African societies is a kind of generic slurry for most people. If I had imaginary Scots-named people running around in an imaginary Pomerania dotted with imaginary Finnish place names, most readers of my fantasy would understand that I was doing some kind of mash-up, and if I didn’t have some infodump of an alternate history at some point to explain it, they’d likely regard what I was doing as random or incoherent.

Farrell and Blattman are trying to provide a kind of scholarly imprimatur for that same sort of mashup, but the histories of the places that come into view in Rowling’s imagination are knowable and known. If you ask me to provide the fictional background of a wizarding school in western Uganda and why it is the only one in sub-Saharan Africa and admits pupils from all over a very large continent, the last thing I’m going to do is start farting around with gigantic generalizations about states and state systems that immediately frame Africa as a place which has a lack, an absence, a deficit, that is somehow naturalized or long-running. I’m going to build my plausibility up from the actual histories of African societies.

So maybe I’m going to talk about the historical world of western Uganda for what it was, for which I have a more than adequate scholarly literature, and try to imagine what a wizarding school there looks like that makes sense in that history. And the first thing I think is that it isn’t a castle in the mountains if it’s a thousand years old and it isn’t distinguished from European wizarding just by using hands rather than wands. I start to think about what magical power in western Uganda might be like, even in a world full of magical power.

If I start to think about why there’s only one school, and why the whole continent uses it, I stop thinking about a thousand years and start thinking about two hundred. I stop messing around with giant social scientistic abstractions and start thinking about colonialism. Which, to head off Farrell and Blattman’s likely objection, they do too–but not as an explanation for Rowling’s fantasy Africa being in a state of relative global deprivation. I start thinking about why Uagadou is in fact like Hogwarts, physically and otherwise. Perhaps why the University of the Witwatersrand is not wildly different from Oxford in the generalities of its institutional functioning. I think about the world in the last three hundred years, and why institutions in modern nation-states resemble each other in form even if they don’t in power or privilege or relative resources or impact. And then I wonder why Rowling doesn’t simply go there too.

The answer, in some sense, is that Rowling’s descriptions of the wizarding schools want to retain some whimsy and some friendliness to a young-adult sensibility. But you can imagine African magics in a globalized fantasy from within their imaginary histories rather than from outside and even stay friendly to a young adult sensibility: as Vicki Brennan noted, that’s a good description of Nnedi Okorafor’s Akata Witch. And Rowling’s Harry Potter books inscribe the history of World War II into the wizarding world, and racism and fascism into the conflicts wizards face today. Why isn’t colonialism Dark Magic of a particularly troubling sort–the kind that suppresses many African ways of learning wizardry and then leaves behind a single, limited institution for learning magic that is built on a template that comes from somewhere else?

There’s a plausible history for Uagadou right there, but it can’t be a thousand years old if that’s the case. This is the basic problem.

You can tell a story that imagines fantastic African societies with their own institutions arising out of their own histories, somehow protected or counterfactually resistant to the rise of the West. But you have to do that through African histories, not with an audit of African absence and some off-the-shelf environmental determinism.

You can tell a story that imagines that imaginary wizarding schools arise only out of histories with intense territorial rivalries within long-standing state systems, but then you have to explain why there aren’t imaginary wizarding schools in the places in the world that fit those criteria rather than frantically moving the comparative goalposts around so that you are matching units like “all of sub-Saharan Africa” against “Great Britain”. And you have to explain why the simultaneous and related forms of state-building in West Africa and Western Europe created schools in one and not the other: because Asante, Kongo, Dahomey, and Oyo are in some sense part of a state system that includes England, France and the Netherlands in the 17th and early 18th centuries.

You can tell a story about how many different ways of learning wizarding in an imaginary Africa were suppressed, lost, denigrated, marginalized or impoverished, leaving a single major institution built on an essentially Western and modern model, and write colonialism into your world of good and evil magic. If you have a faux Hitler in the Dark Wizard Grindelwald, why not a faux Rhodes or a faux Burton as another kind of dark wizard?

That’s not what Rowling has put out so far. And it’s definitely not the kind of thinking that Farrell and Blattman offer in an attempt to shore up Rowling. All they offer is a scholarly alibi for Africa-is-a-country, Africa-is-absence, Africa-can-be-mashup-of-exotic-names.

Posted in Academia, Africa, Sheer Raw Geekery | 10 Comments

On the Deleting of Academia.edu and Other Sundry Affairs

Once again with feeling, a point that I think cannot be made often enough.

Social media created and operated by a for-profit company, no matter what it says when it starts off about the rights of content creators, will inevitably at some point be compelled to monetize some aspect of its operations that the content creators did not want to be monetized.

This is not a mistake on the company’s part, nor is my point a complaint about poor management practices. The only poor practices here are typically about communication from the company about the inevitable changes whenever they arrive, and perhaps about the aggressiveness or destructiveness of the particular form of monetization that they move towards.

The problem is not with the technology, either. Uber could have been an interface developed by a non-profit organization trying to help people who need rides to destinations poorly served by public transport. It could have been an open-source experiment maintained by a foundation, like Wikipedia, that managed any ongoing costs connected to the app and its use in that way. And that’s with something that was already a product, a service, a part of the pay economy.

Social media developed by entrepreneurs, backed by venture capital, will eventually have to find some revenue. And there are only three choices: they sell information about their users and content creators, even if that’s just access to the attention of the users via advertisements; they sell services to their users and content creators; they sell the content their creators gave to them, or at least take a huge cut of any such sales. That’s it.

And right now, except for a precious few big operators, none of those choices really let the entrepreneurs operate a sustainable business. Which is why so many of the newer entries are hoping either to threaten a big operator, get a payout and walk away with their wallet full (and fuck the users), or to amass such a huge amount of freely donated content that they can sell their archive and walk away with their wallet full (and fuck the users).

If the stakes are low, well, so be it. Ephemeral social conversation between people can perhaps safely be sold off, burned down and buried so that a few Stanford grads get to swagger with all the nouveau-richness they can muster. On the far other end, maybe that’s not such a great thing to happen to blood tests and medical procedures, though that’s more about the hideous offspring of the social media business model, aka “disruption”.

But nobody at this point should ever be giving away potentially valuable work that they’ve created to a profit-maker just because the service that hosts it seems to provide more attention, more connection, more ease of use, more exposure.

Open access is the greatest idea in academia today when it comes to making academia more socially just, more important and influential, more able to collaborate, and more able to realize its own cherished ideals. But open access is incompatible with for-profit social media business models. Not because the people who run academia.edu are out of touch with their customer base, or greedy, or incompetent. They don’t have any choice! Sooner or later they’ll have to move in the direction that created such alarm yesterday. They will either have amassed so much scholarship from so many people that future scholars will feel compelled to use the service–at which point they can charge for a boost to your scholarly attention and you’ll have to pay. Or they will need to monetize downloads and uses. Or monetize citations. Or charge on deposit of anything past the first article. Or collect big fees from professional associations for services. Or they’ll claim limited property rights over work that hasn’t been claimed by authors after five years. Or charge a “legacy fee” to keep older work up. You name it. It will have to happen.

So just don’t. But also keep asking and dreaming and demanding all the affordances of academia.edu in a non-profit format supported by a massive consortium of academic institutions. It has been, is, and remains perfectly possible that such a thing could exist. It is a matter of institutional leadership–but also of faculty collectively finally understanding their own best self-interest.

Posted in Academia, Information Technology and Information Literacy, Intellectual Property | 2 Comments