Mucking Out Mead (28 July 2020)

Via Mohamad Bazzi of New York University, I learned last week about several articles published in the last few years by Lawrence Mead, also of NYU. I had a vague awareness of Mead as a kind of post-Moynihan “pathology of poverty” scholar who had had some influence over public policy in the 1990s, but otherwise I hadn’t really encountered his work in detail before. Bazzi was responding to a July 2020 article in the journal Society entitled “Poverty and Culture”. After I read it, I looked at a 2018 article by Mead in the same journal, titled “Cultural Difference”. The two substantially echo each other and are tied to a 2019 book by Mead, which I really dread to look at.

1. In “Poverty and Culture”, Mead is talking about “structural poverty” (though he doesn’t use the term), and yet does nothing to reference the very large body of comparative social science on structural poverty published between 1995 and 2020. His references to poverty scholarship are entirely to work from the mid-1990s or before.

2. Paragraph 3 in the article chains together a set of assertions: low-income neighborhoods lack “order”, marriage is in steep decline, poor children do poorly in school, and “work levels among poor adults remain well below what they need to avoid poverty”. These require separate treatment, but they are chained together here to form a composite image: structural poverty causes “disorder”, it is tied to low rates of marriage and school performance, and it’s because the poor don’t work enough. This is sloppy inferential writing, but it is only an appetizer before a buffet of same.

3. Poverty arises, says Mead, from not working or from having children out of wedlock who are not supported. Not just here but throughout his article (and similar recent work), Mead seems completely unaware of the fact that in the contemporary United States, some people in structural poverty or who are close to the federal government’s official poverty line are in fact employed. It also takes some astonishing arrogance and laziness to say that arguments that racial bias, lack of access to education, or lack of access to child care play a role in causing structural poverty have been flatly and indisputably disproven—with only a footnote to your own book written in 1992 as proof of that claim.

4. On page 2 (and in other recent Mead writings) we arrive at his core argument, which is basically a warmed-over version of Huntington’s “clash of civilizations”, though even that goes unreferenced; he has a few cites of modernization theory and then one of Eric Jones’ European Miracle and McNeill’s Rise of the West, again without acknowledging or seeming to even be aware of the vast plenitude of richly sourced and conceptually wide-ranging critiques of modernization theory and Jones’ 1987 book. He doesn’t even seem aware that McNeill’s own later work cast doubt on the idea that the West’s internal culture was the singular cause of European domination after 1500.

5. So let’s spend time with the intensely stupid and unsupportable argument at the heart of this article that vaguely poses as scholarship but in fact is nothing of the sort. Mead argues that Europeans who came to the Americas were all “individualists” with an inner motivation to work hard in pursuit of personal aspiration and that they all “internalized right and wrong” as individually held moralities, whereas Native Americans, blacks and “Mexicans absorbed by the nation’s westward expansion” were from the “non-West” and were hence conformists who obeyed authoritarian power and who saw ethics as “more situational and dependent on context”, “in terms of what the people around them expect of them”.

6. So today’s poor are “mostly black and Hispanics, and the main reason is cultural difference. The great fact is that these groups did not come from Europe…their native stance toward life is much more passive than the American norm…they have to become more individualist before they can ‘make it’ in America. So they are at a disadvantage competing with European groups—even if they face no mistreatment on racial grounds”. This, says Mead, explains “their tepid response to opportunity and the frequent disorder in their personal lives”.

7. This entire argument would not be surprising if you were reading the latest newsletter from Stormfront or the local chapter of the Klan. But as scholarship it is indefensible, and that is not merely a rejection of the ideological character of the argument. Let me muck out the stables a bit here in factual terms just so it is clear to anyone reading just how little Mead’s argument has to do with anything real.

8. Let’s start with the African diaspora and the Atlantic slave trade. In West and Central Africa between 1400 and 1800, what kinds of societies, in contact with the Atlantic world and drawn into the slave trade, are we dealing with in terms of moral perspectives, attitudes towards individualism and aspiration, views of work, and so on?

9. First off, we’re not dealing with one generically “African” perspective across that vast geographical and chronological space, and we’re not dealing with collective or individual perspectives that remained unchanged during that time. I’m going to be somewhat crudely comparative here (but what I’m calling crude is still about ten orders of magnitude more sophisticated than Mead’s crayon scrawling: in his 2018 essay “Cultural Difference”, Mead says “most blacks came from Africa, the most collective of all cultures”). Consider then these differences, quickly sketched:
a. Igbo-speaking communities in the Niger Delta/Cross River area between 1600 and 1800 famously did not have chiefs, kings or centralized administrative structures but were woven together by intricate commercial and associational networks, and in these networks both men and women strove to ascend in status, reputation and wealth (both for themselves and their kin). There was a strong inclination to something we might call individualism, a tremendous amount of emphasis on aspiration and success, and something that resembled village-level democracy.
b. Mande-speaking societies associated with the formation of the empire of Mali in the upper Niger and the savannah just west of the Niger, and with subsequent “tributary” empires like Kaabu in upper Guinea, were structured around formal hierarchies and around the maintenance of centralized states with an emperor at the top. But they also invited Islamic scholars to pursue learning and teaching within their boundaries (and built institutions of learning to support them) and reached out to make strong new ties to trans-Saharan merchants. Moreover, the social hierarchies of these societies also had a major role for groups of artisans often called nyamakalaw: blacksmiths, potters, weavers, and griots or “bards”, who not only were a vibrant part of market exchange but who also had an important if contested share of imperial authority that involved a great deal of individual initiative and aspiration.
c. The Asante Empire, one of a number of Akan-speaking states in what is now Ghana, rose to pre-eminence in the 18th and 19th centuries, and both its rulers and its merchant “middling classes” showed a tremendous amount of personal ambition and investment in individual aspiration, as did their antagonists in the Fante states to the south, who were heavily involved in Atlantic trade (including the slave trade) and who were very much part of Atlantic commercial and consumer culture. Cities like Anomabu and Cape Coast (and others to their east) were commercial entrepôts that in many ways resembled other cosmopolitan Atlantic port cities in Western Europe and the Americas.
d. (I can keep going like this for a long while.) But let’s throw in one more, just because it’s illustrative, and that’s the Kingdom of Dahomey. It was an authoritarian state—though so was most of “the West” in the 17th and 18th centuries, more on that soon—but it was also deeply marked by religious dissent from those who profoundly disagreed with their ruler’s participation in the Atlantic slave trade, as a number of scholars have documented, as well as by very different kinds of personal ambitions on the part of its rulers.
e. The upshot is that you cannot possibly represent the societies from which Africans were taken in slavery to the Americas as conformist, as uniformly authoritarian, as fatalistic or uninterested in personal aspiration, or as unfamiliar with competitive social pressures. I’m hard-pressed to think of any human society that matches that description, but certainly none of the relevant West or Central African societies do. It’s not merely that they don’t match, but that they had substantially different ideas and structures regarding individual personhood, labor, aspiration, social norms, political authority, etc. from one another.

10. Let’s try something even sillier in Mead’s claims (if that’s possible), which is the notion that “Hispanics” or “Mexicans” are “non-Western” in the sense that he means. Keep in mind again that the argument depends very much on a notion of cultural difference as original sin—he doesn’t even take the easy Daniel Patrick Moynihan route of arguing that the poor are stuck in a dysfunctional culture that is a consequence of structural poverty—an argument that has a lot of problems, but it is in its way non-racial (it’s the same claim that undergirded J.D. Vance’s Hillbilly Elegy, for example): culture is a product of class structure which then reinforces class structure in a destructive feedback loop. Mead is pointedly rejecting this view in favor of arguing that cultural difference is an intact transmission of the values and subjectivities of societies from 500 years ago into the present, and that the impoverishing consequences of this transmission can only be halted by the simultaneous “restoration of internal order” (e.g., even tougher policing) and the black and brown poor discovering their inner competitive individualist Westerner and letting him take over the job of pulling up the bootstraps.

11. Right, I know. Anyway, so Mead has a second group of people who are carrying around that original sin of coming from the “non-West”, full of conformism and reliance on authoritarian external commands and collectivism and avoidance of individual aspiration: “Hispanics”, which at another point in the article he identifies more specifically as “Mexicans”. I would need a hundred hands to perform the number of facepalms this calls for. Let’s stick to Mexico, but everything I’m going to say applies more or less to all of Latin America. What on earth can Mead mean here? Is he suggesting that contemporary Latinos in the United States who have migrated from Mexico, are the descendants of migrants from Mexico, or are the descendants of people who were living within the present-day boundaries of the United States back when some of that territory was controlled either by the nation-state of Mexico or earlier as a colonial possession of Spain, are somehow the product of sociohistorical roots that have nothing to do with “the West”?

12. Mead does gesture once towards the proposition that by “Western” he really means “people from the British Isles and northern Europe”; at other times, he seems to be operating (vaguely) with a conception of “Western” that can include anybody from Europe. He could always make the move favored by Gilded Age British and American racists and claim that Spain, Portugal, Italy, and Greece are not really Western, that their peoples were lazy collectivists who liked authoritarian control, and so on—it’s consistent with the incoherence of the rest of the argument, but he may sense that the moment he fractures the crudity of “Western” and “non-Western” to make more specific claims about the sociopolitical dispensations of 1500 CE that produced contemporary “cultural difference”, he’s screwed. In his 2018 essay, it becomes clearer why he would be screwed by this, because then he couldn’t contrast European immigrants from Italy and Eastern Europe in the late 19th Century with the really-truly “culturally different” black and brown people—if he drops Spain out of “Western” (by which he really means “white”), he’s going to lose his basis for saying that Giovanni DiCaprio had a primordial Western identity but Juan Castillo is primordially non-Western.

13. He’s screwed anyway, because there is no way you can say that Mexican-Americans are “non-Western” because they derive their contemporary cultural disposition from some long-ago predicate that is fundamentally different than that of white Americans, and that this has nothing to do with the ways that societies in the Americas have structured racial difference and inequality. What does he even think this ancient predicate is? That Mexican-Americans are reproducing the sociocultural outlook of Mesoamerican societies that predate Spanish conquest? That Spain was non-Western, or that the mestizo culture of early colonial Mexico was totally non-Western? I can’t even really figure out what he thinks he is thinking of here: the Occam’s Razor answer is “well, he’s a bigot who wants to explain African-American and Latino poverty as a result of a ‘cultural difference’ that is a proxy for ‘biological difference’”, because his understanding of the histories he’s flailingly trying to invoke is so staggeringly bad that you can’t imagine that he is actually influenced by anything even slightly real in coming to his conclusions.

14. To add to this, he clearly knows he’s got another problem on his hands: why Asian-Americans aren’t in structural poverty in the same way, considering that most of his Baby’s First Racist History Stories conceptions of “cultural difference” would seemingly have to apply to many East, Southeast and South Asian societies circa 1500 as well. (And to Europe too, but hang on, I’m getting there.) In his 2018 essay, he’s got some surplus racism to dispense on them: he says some of them “become poor in Chinatowns” (citing for this a 2018 New York Times article focused on “Crazy Rich Asians”), and that despite the fact that they do well in school, Asians do not “assert themselves in the creative, innovative way necessary to excel after school in an individualist culture” and “fall well short of the assertiveness needed to stand out in America”. But he’s not going to get hung up on them because they pretty well mess up his argument, much like anything remotely connected to reality does.

15. Another reality that he really, really does not want to even mention, because he can’t have any conceivable response to it, is “well, what about persistent structural poverty in parts of the United States where the poor are white? And not just white, but whiteness that has pretty strong Scots-Irish-English roots, like in parts of Appalachia?” In terms of how he is conceptualizing cultural difference, as a cursed or blessed inheritance of originating cultures five or six hundred years old, he’s completely screwed by this contemporary structural fact. He can’t argue that it’s just a short-term consequence of deindustrialization or globalization—the structural poverty of Appalachia has considerable historicity. It used to give white supremacists fits back in the early 20th Century too.

16. Moreover, of course, everything I’ve said above about the complexity of the West and Central African origins of people taken across the Atlantic as slaves goes very much for Europeans arriving in the Americas. The idea that the Puritans, for example, represent a purely individualistic Western culture pursuing individual aspiration, not ruled by and conforming to external authority, is a laughably imprecise description of the communities they made. The sociopolitical and intersubjective outlooks derived from the local origins of various Europeans arriving in the Americas between 1500 and 1800 were substantially different from one another. The states that many came from were absolutist, hierarchical, authority-driven, and the cultures that many reproduced were patriarchal, controlling, and not particularly anything like Mead’s sketch of “Western” temperaments, which is just a kind of baby-talk version of the Protestant work ethic, a concept which actual historians doing actual research have complicated and questioned in a great many ways. Moreover, as many scholars have pointed out, the conflicts between these divergent origins were substantial until many colonists found that the threat of Native American attacks and slave revolts pushed them towards identifying with a common “white” identity.

17. Speaking of slavery, it’s another place where the entire premise of Mead’s article is just so transcendently awful and transparently racist. Mead is arguing that somehow the cultural disposition of a generic “Africa” survived intact through enslavement, which even the most enthusiastic historian of black American life would not try to claim for more positive reasons, and that slavery had no culture-making dimension in its own right. The debate about African influences, “Africanisms” and so on in the African diaspora is rich, complicated, and long-standing among scholars who actually do research, but that same research amply documents how the programmatic violence of slavery aimed to destroy or suppress the diverse African heritage of the enslaved. That research also documents the degree to which Africans in the Americas participated in the making of new creole or mixed cultures alongside people of European, Native American, and Asian descent. It’s easy to see why Mead has to make this flatly ridiculous claim and avoid seeing slavery as a culture-making (and culture-breaking) system, because seeing it that way leads right away to the proposition that structural poverty among African-Americans has causal roots in enslavement, in post-Civil War impoverishment, and in racial discrimination and segregation in the 20th Century. It also takes some spectacular, gross misperception, by the way, to see slave-owners collectively as canonical examples of “Western” hard-working, aspiration-fulfilling individualists. Right, right, having a hundred slaves plow your fields for you under threat of torture and death is the essence of inner-driven individualism and hard work.

18. I’m leaving completely aside in all of this an entirely different branch of absurdity in the article, which is that Mead says nothing about growing income inequality and lack of social mobility in the United States over the last thirty years, and nothing about what life is actually like for people who are working minimum-wage jobs with all of what he calls “Western” motivations—with an individualist sensibility, with aspirations for improvement, and so on. He might say that getting into the historical details about Western and non-Western cultural differences is just beyond his remit in a short article connected to a long project. I don’t think he can say that legitimately, because extraordinary claims call for extraordinary evidence, even in a short article. But there is no way that he can excuse not citing or being even aware of the last thirty years of social science research on structural poverty in the United States. The footnotes in both his 2020 article and his 2018 article are like time-capsules of the 1990s, with the occasional early-2000s citation of scholars like Richard Nisbett.

19. I’ve bothered to lay all this out because I want people to understand that many critiques that are dismissed breezily as ideological or “cancel culture” derive from detailed, knowledgeable, scholarly understandings of a given subject or concept—and that in many cases, if a scholar or intellectual argues that another scholar should not have a platform to publish and speak within, it is because the work they are producing shows extraordinary shoddiness, because the work they are producing is demonstrably—not arguably, not contentiously, but unambiguously—untrue. And because it is so dramatically bad, that work has to raise the question of what that scholar’s real motivation is for producing it. Sometimes it’s just laziness, just a case of recycling old work. That isn’t anything that requires public dismissal or harsh critique.

But when the work is not only bad but makes morally and politically repellent claims, it’s right not merely to offer public criticism but to raise questions about why a respectable scholarly journal would offer a place to such work: it mocks the basic ideals of peer review. It’s right to raise questions about why a prestigious university would regard the author of such work as a person who belongs on its faculty and tout him as an expert consultant in the making of public policy. That may be an accurate description of his role in setting policy on poverty in the past, and his past work may not be as awful as this recent work (though the contours of some of this thinking are visible in it, and reveal anew just how deeply flawed the public policy of the Clinton Administration really was). This is not about punishing someone for past sins, nor for their political affiliations. It is about what they have chosen to put on the page recently, and about the profound intellectual shoddiness of its content, in service to ideas that can only be called racist.

All Grasshoppers, No Ants (20 July 2015)

It would be convenient to think that Gawker Media’s flaming car-wreck failure at the end of last week was the kind of mistake of individual judgment that can be fixed by a few resignations, a few pledges to do better, a few new rules or procedures.

Or to think that the problem is just Gawker, its history and culture as an online publication. There’s something to that: Gawker writers and editors have often cultivated a particularly noxious mix of preening self-righteousness, inconsistent-to-nonexistent quality control, a lack of interest in independent research and verification, motiveless cruelty, and gutless double standards in the face of criticism. All of which were on display over the weekend in the tweets of Gawker writers, in the appallingly tone-deaf decision by the writing staff to make their only statement a defense of their union rights against a decision by senior managers to pull the offending article, and in the decision to bury thousands of critical comments by readers and feature a minuscule number of friendly or neutral comments.

Gawker’s writers and editors, and for that matter all of Gawker Media, are only an extreme example of a general problem that is simultaneously particular to social media and widespread through the zeitgeist of our contemporary moment. It’s a problem that appears in protests, in tweets and blogs, in political campaigns right and left, in performances and press conferences, in corporate start-ups and tiny non-profits.

All of that, all of our new world with such people in it, crackles with so much beautiful energy and invention, with the glitter of things once thought impossible and things we never knew could be. Every day makes us witness to some new truth about how life is lived by people all around the world–intimate, delicate truths full of heartbreaking wonder; terrible, blasphemous truths about evils known and unsuspected; furious truths about our failures and blindness. More voices, more possibilities, more genres and forms and styles. Even at Gawker! They’ve often published interesting writing, helped to circulate and empower passionate calls to action, and intelligently curated our viral attention.

So what is the problem? I’m tempted to call it nihilism, but that’s too self-conscious and too philosophically coherent a label. I’m tempted to call it anarchism, but then I might approve rather than criticize. I might call it rugged individualism, or quote Aleister Crowley about the whole of the law being do as thou wilt. And again I might rather approve than criticize.

It’s not any of that, because across the whole kaleidoscopic expanse of this tumbling moment in time, there’s not enough of any of that. I wish we had more free spirits and gonzo originals calling it like they see it, I wish we had more raging people who just want the whole corrupt mess to fall down, I wish we had more people who just want to tend their own gardens as they will and leave the rest to people who care.

What we have instead–Gawker will do as a particularly stomach-churning example, but there are so many more–is a great many people who in various contexts know how to bid for our collective attention and even how to hold it for the moments where it turns their way, but not what to do with it. Not even to want to do anything with it. What we have is an inability to build and make, or to defend what we’ve already built and made.

What we have is a reflexive attachment to arguing always from the margins, as if a proclamation of marginality is an argument, and as if that argument entitles its author to as much attention as they can claim but never to any responsibility for doing anything with that attention.

What we have is contempt for anybody trying to keep institutions running, anybody trying to defend what’s already been achieved or to maintain a steady course towards the farther horizons of a long-term future. What we have is a notion that anyone responsible for any institution or group is “powerful” and therefore always contemptible. Hence not wanting to build things or be responsible. Everyone wants to grab the steering wheel for a moment or two but no one wants to drive anywhere or look at a map, just to make vroom-vroom noises and honk the horn.

Everyone’s sure that speech acts and cultural work have power but no one wants to use power in a sustained way to create and make, because to have power persistently, in even a small measure, is to surrender the ability to shine a virtuous light on one’s own perfected exclusion from power.

Gawker writers want to hold other writers and speakers accountable for bad writing and unethical conduct. They want to scorn Reddit for its inability to hold its community to higher standards. But they don’t want to build a system for good writing, they don’t want to articulate a code of ethical conduct, they don’t want to invest their own time and care to cultivate a better community. They don’t want to be institutions. They want to sit inside a kind of panopticon that has crudely painted over its entrance, “Marginality Clubhouse”, a place from which they can always hold others accountable and never be seen themselves. Gawker writers want to always be “punching up”, mostly so they don’t have to admit what they really want is simply to punch. To hurt someone is a great way to get attention. If there’s no bleeder to lead, then make someone bleed.

It’s not just them. Did you get caught doing something wrong in the last five years? What do you do? You get up and do what Gawker Media writer Natasha Vargas-Cooper has done several times, doing it once again this weekend in a tweet: whoever you wronged deserved it anyway, you’re sorry if someone else is flawed enough to take offense, and by the way, you’re a victim or marginalized and not someone speaking from an institution or defending a profession. Tea Party members and GamerGate posters do the same thing: both of their discursive cultures are full of proclamations of marginality and persecution. The buck stops somewhere else. You don’t make or build, you don’t have hard responsibilities of your own.

You think people who do make and build and defend what’s made and built are good for one thing: bleeding when you hit them and getting you attention when you do it. They’re easy to hit because they have to stand still at the site of their making.

This could be simply a complaint about individuals failing to accept responsibility for power–even with small power comes small responsibility. But it’s more than that. In many cases, this relentless repositioning to virtuous marginality for the sake of rhetorical and argumentative advantage creates a dangerous kind of consciousness or self-perception that puts every political and social victory, small and large, at risk. In the wake of the Supreme Court’s marriage decision, a lot of the progressive conversation I saw across social media held a celebratory or thankful tone for only a short time. Then in some cases it moved on productively to the next work that needs doing with that same kind of legal and political power, to more building. But in other cases, it reset to marginality, to looking for the next outrage to spark a ten-minute Twitter frenzy about an injustice, always trying to find a way back to a virtuous outside untainted by power or responsibility, always without any specific share in or responsibility for what’s wrong in the world. If that’s acknowledged, it’s not in terms of specific things or actions that could be done right or wrong, better or worse, just in generalized and abstract invocations of “privilege” or “complicity”, of the ubiquity of sin in an always-fallen world.

On some things, we are now the center, and we have to defend what’s good in the world we have knowing that we are there in the middle of things, in that position and no other. To assume responsibility for what we value and what we do and to ensure that the benefits of what we make are shared. To invite as many under our roof as can fit and then invite some more after that. To build better and build more.

What is happening across the whole span of our zeitgeist is that we’ve lost the ability to make anything like a foundational argument that binds its maker as surely as it does others. And yet many of us want to retain the firm footing that foundations give in order to claim moral and political authority.

This is why I say nihilism would be better: at least the nihilist has jumped off into empty space to see what can be found when you no longer want to keep the ground beneath your feet. At least the anarchist is sure nothing of worth can be built on the foundations we have. At least the free spirit is dancing lightly across the floor.

So Gawker wants everyone else to have ethics, but couldn’t describe for a moment what its own ethical obligations are and why they should be so. Gawker hates the lack of compassion shown by others, but not because it has anything like a consistent view about why cruelty is wrong. Gawker thinks stories should be accurate, unless they have to do the heavy lifting to make them so.

In this pattern of desires they are typical, and it’s not a simple matter of hypocrisy. It is more a case of the relentless a-la-carte-ification of our lives: we speak and demand and act based on felt commitments and beliefs that have the half-life of an element created in a particle accelerator, blooming into full life and falling apart seconds later.

To stand still for longer is to assume responsibility for power (small or large), to risk that someone will ask you to help defend the castle or raise the barn. That you might have to live and work slowly for a goal that may be for the benefit of others in the future, or for some thing that is bigger than any human being to flourish. To be bound to some ethic or code, to sometimes stand against your own desires or preferences.

Sometimes to not punch but instead to hold still while someone punches you, knowing that you’re surrounded by people who will buoy you up and heal your wounds and stand with you to hold the line, because you were there for them yesterday and you will be there with them tomorrow.

Rebooting (9 January 2015)

Folks who follow this blog, or my social media presence generally, have probably noted that I’ve not had much to say lately.

Part of that is feeling overly busy, but it’s more the consequence of a growing sense of perplexity and unease about online discourse, about academia, and about the political moment (both domestic U.S. and global). I feel as if I’m losing my voice, or as if it’s not worth speaking up. Or even, sometimes, that the risks of speaking outweigh any benefits to myself or to others. It’s also about my inability to easily distinguish between my feelings about all of that and my middle-aged anomie. One of the great failings of some public writers is a narcissism that encourages them to confuse a confessional for an analysis, to think that their moment in life is the world’s moment. Just because at fifty you’ve seen what seem like a few recurrent cycles, have learned to hurt and be hurt, or have seen predictions fulfilled and consequences dealt out, doesn’t mean you’re right about what you think you’re seeing now. Though perhaps that’s middle-aged anomie as well, at least my version of it. I started this blog uncertain about whether to trust my own readings and arguments, and have become less trusting with each passing year.

But just as all honest writing–perhaps most of all truthful fiction–has its inevitable cruelties, so too does all argument have its narcissism, its vanities. To give voice to any opinion at all about how the world ought to be? That means you hold your own thought in higher regard, even if just for the moment that thought is ventured, than the thought of someone else. To share an insight into the way the world is means you think you know something that others haven’t, can’t or won’t see. So very well. Here’s to self-regard, however provisional, and to trying to see clearly, even when it hurts.

I want to start a new year of writing in public with a series of fragments that will repeat each other, as well as some old themes at this weblog. An exploration of this moment in public conversation, in politics, in the lives of academic communities. Rather than tie up all my thoughts in a single logorrheic essay, as I usually do, I’ll try to break up this exploration into smaller overlapping parts and see if it all weaves together. Call the whole thing Grasping the Nettle, and we’ll see how it goes.

Gamergate. Shit, We’re Still Only in Gamergate. (17 October 2014)

A couple of nights ago, I got up to go to the bathroom. Still only partially awake, I flushed and stumbled back to bed, only to hear the gushing sounds of the toilet overflowing. I seriously considered just letting it keep going, but I did a U-turn and went back to plunge out the blockage and sop up the mess with towels.

That’s how I feel about writing about what’s going on with what has stupidly become known as “Gamergate” in the last month or so. (The title itself flatters the pretensions of the worst people drawn to it.) I really don’t want to, I’ve been trying to avoid it, but this whole thing is not going to go away. The truth is, for those of us who know both the medium and its audiences, the last month is not a sudden rupture that changed everything. It’s just an unveiling of a long-festering set of wounds.

That dense nest of pain and abuse raises such complex feelings and interpretations in me. I hardly know where to begin. I’m just going to set out some separate thoughts and hope that they ultimately connect with one another.

1) If there is such a thing as “a gamer”, meaning someone defined in part by their affinity for video and computer games as a cultural form, I’m a gamer. Games have been as important to me as books, both as a leisure activity and as a source of inspiration and imagination. Before I ever venture any deeper into the stakes of Gamergate, my most elemental reaction is raw disgust with other gamers who have the unmitigated arrogance to represent their feelings, their reactions, their ugliness as “what gamers think”, as if they’re the “us” being put upon by some other “them”. On several forums that I used to frequent before this last month, I’ve had the displeasure of reading other long-time participants anoint themselves as the representative voice of “gamers”. My first impulsive thought is always, “Look here, sonny jim, I was playing Colossal Cave Adventure on the campus network in 1983, and Apple Trek on an Apple IIe when you weren’t even a lustful thought in your parents’ minds, so don’t tell me what real gamers think. I didn’t vote for you. You don’t represent me. You don’t represent most of the people who play games.”

2) As a result of my background, at academic meetings about digital culture and games, I’ve often identified myself, somewhat jokingly, as a “native informant” rather than a scholar who comes to games as an object of study with no prior affinity for them. (Which of course earned me a pious, self-righteous correction at one meeting from a literary scholar who wasn’t aware that I also work on African history about how I might not know that the word ‘native’ has a complex history…) In that role, I’ve often found myself suggesting that there were insider or “emic” ways to understand the content and experience of game and gameplaying that many scholars rode roughshod over in their critique of that content. In particular, I’ve tried to suggest that there are dangers to reductive readings that only take an interest in games as a catalog of racialized or gendered tropes whose meaning is held to be understood simply from the act of cataloging. Equally, I’ve observed that seeing games as directly conditioning the everyday social practices and ideologies of their audiences (particularly in the case of violence) is both demonstrably wrong as an empirical argument and is also a classic kind of bourgeois moral panic about the social effects of new media forms, something that often leads to empowering the state or other forms of authority in very undesirable ways. I’ve argued, and still would argue, that at least some kinds of mobilizations through social media against racist or sexist culture are both too simplistic in their interpretations of content and counterproductive in their political strategies. I’m not going to stop arguing that certain kinds of cultural activism are stuck on looking for soft targets, that they avoid the agonizingly difficult and painstaking work of social transformation.

But this is another reason I hate the people associated with “Gamergate”. They are working hard to prove me wrong in all sorts of ways. I’d still argue that the kind of tropes that Anita Sarkeesian has intelligently catalogued are subverted, ignored or reworked by the large majority of players, but it seems pretty undeniable at this point that there is a group of male gamers whose devotion to those tropes is deeply ideological in the most awful ways and that it absolutely informs the way they think of themselves across the broad spectrum of their social lives, including their real relationships to women. It seems pretty undeniable at this point that there are men who identify as “gamers” who are willing to threaten and harm simply to protect what they themselves articulate as a privileged relationship to gaming.

3) But then, my protestations about complexity have always been checked by my own experience as a game-player and as an academic thinking about games. I’ve always known that the “Gamergate” types were out there in considerable numbers. Ethnographic studies of game culture have been thinking about this issue for years. Players themselves have been thinking and talking about it, every time they’ve tried to think of ways to defeat griefing, ways to keep female players from being harassed, ways to make more people feel comfortable in game environments.

In one of my essays for the now-defunct group blog Terra Nova, I noted how odd it was to find myself in virtual worlds like Ultima Online and World of Warcraft playing alongside teenagers and adult men whom I intuitively recognized as the kind of people who had bullied me when I was a kid. Profane, aggressive, given to casually denigrating or insulting others, enjoying causing other people inconvenience and even real emotional pain, crudely racist, gleefully sexist. Not all of them were all of that, but many of them were at least some of that. In many environments, there were enough men like that to ensure that everyone else stayed away, or avoided many of the supposed affordances of multiplayer gaming. But maybe this is part of the problem: that geeks and nerds, especially those of us who identified that way back when it got you a lot of contempt and made you a target for bullies, convinced themselves that being victimized automatically conferred some kind of virtue on you. Maybe the problem is that I and others always felt that “Barrens chat” was the work of some Other who had infiltrated our Nerd Havens, when in fact it was always coming from inside the room. I remember once in junior high school when the jocks were bullying a mentally disabled kid by shoving him inside the shed where all the equipment was kept and then holding the door closed on him. They yelled for a couple of the geeky kids, including me, to come help them keep the door shut while the boy cried and banged and tried to get out. And it was so uncharacteristic for the jocks to ask us to join in that we almost did it, just out of relief at being included.

Being a target doesn’t vaccinate you against becoming an abuser later on. In fact, it creates for some gamers a justification for indulgent kinds of lulz-seeking bad behavior, a lethal combination of narcissistic anarchism and the sort of revenge-fantasy thinking that’s normally only found in the comic-book monologues of supervillains.

4) What I’ve seen since “Gamergate” became a thing is that some of the older male gamers who have always been clear that they were just as annoyed by subliterate teenage brogamers on Xbox Live, that they also hated griefers and catasses in MMOs, that they also think badly of the creepiest posters on Reddit; lots of these guys who postured as the reasonable opponents of extremists of any kind have turned out to be not at all the disinterested or moderate influence they imagine themselves to be. I’ve watched guys who claim to think that everyone’s being overexcited by this controversy become profoundly overexcited themselves, and very much in a one-sided way against “games journalists”, “neckbeards”, “feminists”, “the media”, “social justice warriors” and so on. At around the one-hundredth post professing not to care very much about the whole thing, you have to turn in your “I don’t care” card. Most of them say, half-heartedly, that of course it’s bad to harass or issue death threats, with all the genuine commitment of Captain Louis Renault saying he’s shocked about the gambling in the back room of Rick’s Cafe Americain. They usually go on to specify a standard for harassment that disqualifies anything besides Snidely Whiplash tying Penelope Pitstop to the railroad tracks, and a standard for “real death threats” that disqualifies anything that doesn’t end with someone getting killed for real.

I can’t quite say I’m shocked by these non-shocked people, but I have found myself deeply disturbed to see significant groups of formerly reasonable-signifying male posters in various forums accepting without much dissent sentiments of tremendous moral vacuity like, “If you post feminist criticisms of games, then you just have to expect to get harassed and attacked” or “Well, some guy on XBox Live threatened to rape me during a game of Call of Duty, you just shrug it off”. I’ve been wondering just how wrong I am about people in general online when I think the best of them, or how misguided I am to try to see the most interesting possibilities in how someone else thinks, if it turns out that when the crunch comes, the people I’ve thought would have their hearts in the right place are instead too busy frantically defending their right to download Jennifer Lawrence nudes to care about much else.

5) The assertion by many “Gamergate” posters that they represent the economic lifeblood of the gaming industry is just demonstrably wrong. And this is an old point that should have long since had a stake driven through its heart. The current criticism is focused on various indie games, which the gamergaters charge wouldn’t get any attention at all if “social justice warriors” weren’t promoting them. But the fact is first that the most economically successful games in the history of the medium have not been made with the sensibilities of the most devotedly “gamerish” game-players in mind. Moreover, the history of video and computer games is full of interesting work that didn’t cater to a narrow set of preferences. Today’s “indie games” have many precursors. Arguing for the diversification of tropes, models, mechanics is good for gaming in every possible way. It’s not that companies should stop making games for these “gamers”, it’s more that the major commercial mystery of the gaming industry is that so MANY games should be made for them, considering how much money there is to make when you make a good game that appeals to other people too or instead. Maybe this is what accounts for the intensity of the reaction right now, that we are finally approaching the moment where games will be made by more kinds of people for more kinds of people. Fan subcultures are often disturbingly possessive about the object of their attachment, but this has been an especially ugly kind of upswelling of that structure of feeling.

6) Many of the most strident gamergate voices are bad on gender issues but they’re also just a nightmare in general for everyone involved in game development (except for when they ARE game developers). These are the guys who hurl email abuse and death threats when they don’t like the latest patch, when they think a game should be cheaper (or free), when they have a different idea about what the ending to a game should be, when they don’t like a character or the art design or a mechanic. These are the people who make most games-related forums a cesspool of casually-dispensed rhetorical abuse. These are the people who make it a near-religious obligation to crap on anything new and then to be self-indulgently amused by their own indiscriminate dislike. So much of the fun–the enchantment–of gaming has already been well and truly done in by gamergaters in other ways: they have destroyed the village they allegedly came to save. Much of what they do now is a bad dinner theater re-enactment of the anti-establishment sentiments of an earlier digital underground, one that elevates some of the troubling old tendencies and subtexts into explicit, exultant malice.

Heroic Measures (The Modest Proposal Remix Edition) (13 January 2014)

Bill Keller has spent the last two years in a dull and very public exasperated talking-to with the rest of the world for not being enough like Bill Keller. Since the New York Times helpfully selected him among all the writers whose opinions ought to be published periodically within the pages of the New York Times, out of gratitude for his nostalgic reprise of William Randolph Hearst’s brilliant use of the press to start wars, he has written periodically about his mild and grudging regrets for misleading the entire country, how he knew Nelson Mandela personally and it turns out Mandela could sometimes be a jerk, how he sort of liked Doris Kearns Goodwin’s new book, and some other stuff that his buddies on the opinion page have had opinions about. He hasn’t tweeted copiously through it all, but he’s been reading some tweets, occasionally. Even by contemporary standards of old-media irrelevance, he’s irrelevant. A rapt audience of a few Times editors, a few other pundits and a couple of old people follows his marshmallow-soft narrative of truisms, hackneyed repetitions, noncommittal middle-of-the-roadisms, and smug posturings, occasionally annoying a larger audience enough to warrant a few angry tweets and blog posts.

In the last entry or so, his tone has changed slightly; his condescension has become a little less forgivable. As 2014 began, the insufferable and privileged character of old-media punditry that had colonized the major American daily newspapers became much less tolerable. He was deemed too much of an asshole to just ignore. He is now lighting up the Internet with fury, serving as linkbait for the New York Times, which has embraced him as a source of new-media advertising revenue.

Bill Keller is still alive, still writing, though you wouldn’t guess it by reading him. The column has become less about prolonging his career and more about defending his wife’s column.

“The words of my column become words that express why I’m paternalistically disappointed by most folks for not being enough like me, not dying the way I think they ought to die, and doing other things that really they should know better about,” he might as well write after reading a collective blast of tweeted exasperation. “The ebb and flow of not-Keller America, of not-Keller world. And so, too, inevitably, of all the things that Keller has done before.”

Recombinant Friedman (29 May 2013)

My rephrasing of Thomas Friedman’s column today:

“One of the best ways to learn about the changing labor market, if you can’t find a taxi driver to have a conversation with on your way to the airport, is to find some well-connected guy who used to work at Goldman Sachs, who has a start-up that he’s desperately trying to flog, and let him tell you all about how great his start-up idea is.

If you let that guy write half your column for you, you’ll discover that today’s employers, unlike yesterday’s, would like their employees to have useful skills. And they don’t care where you got your skills from: they’ll be glad to undervalue and underpay you for those skills no matter where they’re from, and discard you like an old toilet paper roll once they’ve decided that they want some other skills.

With some pluck and drive, you can work your way up from the mail room! Oh, wait, those are the old notes. Checking. Ah! Just take the energy to teach yourself neurosurgery, high-tech manufacturing assembly, preparation of petri dish cultures, Python, persuasive analytic writing, and graphic design when you get home from working two different low-wage service jobs, and you’re sure to find an employer looking for those skills who will overlook you because there’s no real way on a resume to show that you’ve acquired those skills through self-study.

Trust me, though, you won’t regret not going to college. In this bold new world where employers actually want people who can do the jobs they’re hiring for, I and my taxi driver can tell you that a world of opportunity awaits if you know the right people at Davos.”

On Diamond (Not Again!) (7 February 2013)

I don’t really mean to get drawn into recurrent arguments about Jared Diamond’s work, because my actual feelings about the actual books are rather mixed and indifferent. Guns, Germs and Steel reads well; it’s a useful teaching book for fueling a discussion about the merits and limits of materialism and environmental determinism, and it can provoke a very interesting conversation about moral responsibility, global inequality and the post-1450 expansion of Europe (almost in spite of itself). I appreciate that Diamond thinks his argument in GGS is strongly anti-racist, I appreciate why others think it has the opposite effect, and I think that neither is entirely correct. Even in terms of synthesizing works, however, I think there are better choices for most of Diamond’s signature arguments.

I appreciate that Collapse is, in a way that I find awkward and roundabout, trying to think about the question of determinism. I appreciate that his current book is working hard not to get drawn into sentimentality about hunter-gatherers, and that Diamond believes himself to be steering a middle course between ethnocentric arrogance and romanticism about ‘noble savages’. I appreciate that Diamond thinks The World Until Yesterday is deeply appreciative of “traditional societies” and so is baffled to be accused of hating on them.

I also appreciate that his audiences are looking not just for a clear writer who seems knowledgeable about many issues, but for “big theories” of human history and culture that do not require a Ph.D.’s worth of knowledge and training to understand or articulate.

The problem is that there are a lot of problems with Diamond’s work: his command over the literatures he’s synthesizing, the selectivity of his synthesis, and the uncharitable and at times incomprehensible framing he gives to any potential objections (when he can be bothered to acknowledge that such a thing could exist). Scholars who try to point out these things politely get ignored or acknowledged in passing, as in Razib Khan’s update to his post at Gene Expression. I’ve been in a number of discussions over the years with people who like Diamond’s books who then say, “But yeah, he gets a lot of things wrong” or “yeah, his theory is really overexaggerated and simplistic”, as if that’s not even worth talking about and as if you’re a hater for even wanting to talk about it. Small wonder that some scholars, particularly anthropologists, lose their shit when there’s a new Diamond book out there. Sometimes you lose your shit when people insist that they don’t really want to talk about all the people (including you) who are not losing their shit. Why doesn’t Khan want to talk about Alex Golub’s careful, detailed response to Diamond’s book? Why isn’t Golub the “typical anthropologist” for Khan, but some folks working for an NGO are? Probably because that would take a longer, smarter entry.

I agree with Khan that sometimes the shit-losing leads people to say things that are just as problematic–to sneer at Diamond’s readers, to condemn anybody who tries to have a “big theory” about human history and culture, or to go too big in characterizing what’s wrong with his work. But have some sympathy here, because Diamond and a few others in his intellectual neck of the woods, like Steven Pinker, specialize in cherrypicking big fields of scholarly work for a few friendly citations and then acting as if what they’ve found is the entirety of those fields. Diamond and Pinker also seem unable to resist setting up straw-man versions of legitimate criticisms (and then a few of their critics can’t seem to resist falling into that characterization).

In an earlier comment, I mentioned at least a few areas where there seems to me to be a genuine debate with a range of legitimate positions that require respect, if not agreement, in terms of Diamond’s latest (as well as Pinker’s latest book, which has some overlap):

1. Maybe New Guinea isn’t representative of all modern “traditional societies”, let alone hunter-gatherers in all of human history. Maybe there is considerably more variety in terms of violence and many other attributes than Diamond lets on. Maybe he’s not even paying attention to the full range of anthropological or historical writing about New Guinea. Maybe Diamond isn’t even living up to his own stated interest in the variations between such societies.

2. Maybe modern hunter-gathering societies are not actually pristine, unchanging survivals of an earlier era of human history, but instead the dynamic consequence of large and small-scale migrations of agriculturalists and, even more recently, industrial workers. At least in some cases, that might be why hunter-gatherers inhabit remote or marginal environments: not because of preference, but as a response to the sometimes-violent movement of other human societies into territories that they used to inhabit. That means that taking whatever it is that they have been doing in the 20th Century (violence or otherwise) as evidence of what they’ve always done is a mistake.

3. Maybe defining violence or war in a rigorous, consistent, measurable and fully comparative way is much harder than Diamond or Pinker think it is.

4. Maybe between what Diamond calls a “traditional society” and modern “WEIRD” societies (Western, educated, industrialized, rich and democratic) there are lots of other models. Maybe “between” is the wrong term altogether, since it implies that there’s a straight developmental line between “traditional society” and modernity, an old teleological chestnut that most anthropologists and historians would desperately like to get away from. I haven’t read very far into the book yet, but Diamond doesn’t seem to have any idea, for example, that there have been numerous societies in human history in which many connected communities shared culture and language at high levels of population density and economic complexity while nevertheless lacking a “state” in the usual sense. What are those? Also: maybe Diamond frequently confuses “traditional” with “premodern”. Much of the time when he says, “Well, we modern WEIRD people do X, ‘traditional societies’ do Y,” the “Y” in question would apply equally to large premodern states and empires.

Or to summarize: maybe Diamond is pushing way, way too hard for a clean distinction between two broadly drawn “types”: “traditional society” and “modern society”, and is distorting, misquoting, truncating or overlooking much of what he read (hard to tell what he read, since there are no footnotes) to make the distinction come out right.

This is not nit-picking, this is not complaining about a spelling error or some mildly errant footnote on p. 79. This is not pedantry. This is important. The more airtight you want to make your universalisms, the more leaks you tend to spring, and the more leaks you spring, the faster your boat sinks. A “big theory” that’s advanced with generosity and gentleness, one that grants its own provisional character, is a sturdier way to inspire discussion and create understanding. As Golub points out, that is not what Diamond is doing, so much so that his description of other ways of thinking is very nearly incoherent.

Good, simple, accessible synthesis does not require a lack of generosity towards the scholarship that makes it possible. And a good synthesis should always be as much a guide to the possibilities of interpretation around a complex subject as it is a defense of a singular interpretation.

The State of the Art III: Facebook (and 500px and Flickr) as a Window Into Social Media https://blogs.swarthmore.edu/burke/blog/2013/01/23/the-state-of-the-art-iii-facebook-and-500px-and-flickr-as-a-window-into-social-media/ Wed, 23 Jan 2013 17:13:11 +0000

III. The Business Model as Belief and Reality

Why is Facebook such a repeatedly bad actor in its relationship to its users, constantly testing and probing for ways to quietly or secretly breach the privacy constraints that most of its users expect and demand, devising stratagems to invade their carefully maintained social networks? Because it has to. That’s Facebook’s version of the Red Queen’s race, its bargain with investment capital. Facebook will keep coming back and back again with various schemes and interface trickery because if it stops, it will be the LiveJournal or BBS of 2020, a trivia answer and a nostalgic memory.

That is not the inevitable fate of all social media. It is a distinctive consequence of the intersection of massive slops of surplus investment capital looking desperately for somewhere to come to rest; the character of Facebook’s niche in the ecology of social media; and the path-dependent evolution of Facebook’s interface.

Analysts and observers who are content with cliches characterize Facebook as sitting on a treasure trove of potentially valuable data about its users, which is true enough. The cliched view is that what’s valuable about that data is names associated with locations associated with jobs associated with social networks, in a very granular way. That’s not it. That data can be mined easily from a variety of sources and has been mined relentlessly for years, since before social media was even an idea. If an advertiser or company or candidate wants to find “professors who live in the 19081 ZIP code who vote Democratic and shop at Trader Joe’s in Media,” they can buy that information from many vendors. If that were all Facebook was holding, it wouldn’t have any distinctive wares, even imagined ones, to hawk. All it could do is offer them at a bargain rate, and in the global economy, you can’t undercut the real bargain sellers of information. Not that this would keep Facebook from pretending that it has something to sell, because it has a bunch of potentially angry investors ready to start burning effigies.

What Facebook is holding is a type of largely unique data that is the collaborative product of its users and its interface. But if I were a potential buyer of such data, I’d approach my purchase with a lot of caution even if Facebook managed to once and for all trick or force its users into surrendering it freely to anyone with the money to spend. If my goal is to sell something to Facebook users, or to know something about what they’re likely to do in the future, then in buying Facebook’s unique data, what I’m actually learning about is a cyborg, a virtual organism, that can only fully live and express itself inside of Facebook’s ecology. Facebook’s distinctive informational holding is actually two things: a more nuanced view of its users’ social networks than most other data sources can provide, and a record of expressive agency.

On the first of these: the social mappings aren’t easily monetized in conventional terms. Who needs to buy knowledge about any individual’s (or many individuals’) social networks? Law enforcement and intelligence services, but the former can subpoena that information when it needs to and the latter can simply steal it or demand it with some other kind of legal order. Some academics would probably love to have that data, but they don’t have deep pockets and they have all sorts of pesky ethical restrictions that would keep them from using it at the granular level that makes Facebook’s information distinctive. Marketers don’t necessarily need to know that much about social networks except when they’re selling a relatively long-tail niche product. That’s a very rare situation: how often are you going to be manufacturing a TARDIS USB hub or artisanal chipotle-flavored mustache wax and not know exactly who might buy such a thing and how to reach them?

Social networks of this granularity are only good for one thing if you’re an advertiser or a marketer: simulating word-of-mouth, hollowing out a person and settling into their skin like a possessing spirit. If that’s your game, your edge, the way you think you’re going to move more toothpaste or get one more week’s box office out of a mediocre film, then Facebook is indeed an irresistible prize.

The problem is that most of us have natively good heuristics for detecting when friends and acquaintances have been turned into meme-puppets, offline and online. Most of us have had that crawling sensation while talking to someone we thought we knew, when suddenly we trip across a subject or an experience that rips open what we thought we knew and lets some reservoir, some pre-programmed narrative, spill out of our acquaintance: some fearful catechism, some fully formed paranoid narrative, some dogma. Or sometimes something less momentous, just that slightly amusing moment when a cliche, slogan or advertising hook speaks itself from a real person’s mouth like a squeaky little fart, usually to the embarrassment of any modestly self-aware individual.

Facebook could, and probably will, eventually wear down its users’ resistance and stop labeling or marking or noting when it is allowing a paying customer to take over their identities to sell something to their social networks. We’ll still know that’s happening to a friend, up until the day that an AI can use all that data to simulate our personal distinctiveness so convincingly that there’s no difference between the AI’s performance and our own. At which point, so what? Because then my accurately simulated self will just be selling or persuading on behalf of that which I would, with all my heart, sell or persuade, in the voice I would normally use to persuade with.

As long as Facebook’s potential customers want to use my social networks to sell something I wouldn’t sell, in a way I wouldn’t sell it, most of the people who “know” me through Facebook will know that it’s not me doing that, and they will know it better in proportion to the amount of information I’ve provided to them all through Facebook. (That is, the best protection from being puppeteered is, paradoxically, more exposure rather than less.)

So what of the other unique information Facebook holds, a record of everything I’ve “liked”? Surely that’s information worth having (and thus worth paying Facebook for) for anyone desperate to sell me products, persuade me to join a cause, or motivate me to make a donation? Not really (or not much), for two reasons. First, because existing sources of social and demographic data are generally good enough to target potential customers. If you know who the registered Democrats with graduate-level education making more than $75,000 a year are in Delaware County, Pennsylvania, you have a very good understanding of their likely buying habits and of the causes to which they are likely to donate. If you’re selling something that has a much more granular target market, it’s almost certainly more efficient and cheaper to use a more traditional media strategy or to rely on social networks to sell it for you simply because they’re interested in it. If you’re the budget-photography company YongNuo, you don’t need to spend money mining my Facebook likes and posts to see that I’m interested in moving into studio-based strobist photography: existing networks of hobbyists and professionals are sufficient to acquaint me with your products. If you’re trying to sell a Minecraft pendant necklace, your potential customers are going to do a fine job of notifying each other about your product.

Second, and more to the point: if I’m trying to sell you a product or a cause and I find you through data-mining your pattern of “likes” on Facebook, what is it that I’ve found? Maybe not the “you” that actually buys things, shows up to political rallies, writes checks to a non-profit. I’ve found the algorithmic cyborg that clicks “like” on Facebook, half-human and half-interface, formed out of the raw stuff of things that are clickable and linkable and feed-compliant. That is a set that sometimes overlaps with what can be bought and done and given in the rest of our lives, and sometimes very palpably does not. If my sales or success depended on the liking of Facebookborgs reliably translating into behavior elsewhere, I’d be on very thin ice. And I’d just as soon not pay much to get onto thin ice.

-----

So what about the rest of social media? Do they have something to sell, something worth investing in? Sometimes they do, and that brings me back to Flickr and 500px, where I started this series. What Flickr and 500px have to sell, first and foremost, is not information but services: data storage, a form of publication, and access to a community with a narrower focus than “all of life”. Both of them have at least a rough model for how to make a bit more revenue on the back end, by facilitating the sale of community members’ images to external buyers (while giving the creator of the image a cut of the revenue). That is not a business model that is going to make them megabillions, but it’s very likely a sustainably profitable enterprise when all is said and done. It rests on a fragile foundation, as Flickr in particular has discovered. Your paying customers have to care enough about the social capital they have invested in the service to pay for it, the publishing interface has to be updated to look contemporary and run on contemporary hardware, and the archive has to be searchable enough that external buyers (whether it’s someone looking for a canvas to hang on their wall or a media organization looking for stock footage) can sift through it. All of which takes work from a labor force that has to be kept lean and cheap. One slip and your users, the real source of your value, are going to pack up their bags and their content for the next new thing. When that starts to happen, it can cascade quickly into collapse. And if you do something to try to slow the flight of content and participation, by making content difficult to extract or erase, you might spark the equivalent of a bank panic.

There’s one other social media business model that demonstrably works, if very much in the spirit of 21st Century financial capitalism: the digital version of a pump-and-dump. Set up a specialized social media service, lure in a venture firm or investor that’s looking to bet a bit of money on the next new thing, spend a bit of money on an interface design, and put on a dog-and-pony show that gets the restless digerati in the door and providing some kind of content. If dumb luck is really with you, maybe you stumble into the next YouTube or Twitter: you somehow provide a space or tool in a niche that no one knew existed. If dumb luck is sort of with you, you’re Instagram, and you get bought up by bigger fish who need to prove to their investors that they’re working towards a profitable business model and are using acquisitions as a distraction from tough questions. In that case, your business model is to be someone else’s business model, only you can’t say as much without shining a spotlight on a naked emperor’s private parts. In the worst case (probably), you burn someone’s money, earn some salary, get some experience, and have a story or two to tell your next investor, or at least build a resume that gets you hired at a real company.

Social media that provide a service that is sufficiently valuable that people will pay for it, however little, have a business model that is not only sustainable but that doesn’t require them to constantly breach the trust of their users or work against what their communities want.

Social media that have no business model except trying to monetize the information that users provide to them will, sooner or later, be required to breach trust and demolish whatever is useful in their service, to come back again and again with new interfaces and terms of service that lie or conceal or divert. Even if they get away with it for a time, they’re selling a product that is far less valuable than the innumerable info-hucksters and digital prophets (or even protectors of privacy) think it is. In some ways, it might be best if Facebook just got it over with and gave itself permission to sell every last scrap of information it’s holding: what we might all discover is that there’s hardly anyone at all who will pay for that service at the scale and price that Facebook needs them to pay.

The Shameless Style https://blogs.swarthmore.edu/burke/blog/2012/05/07/the-shameless-style/ Mon, 07 May 2012 12:56:04 +0000

When people feel no shame, watch out. Any cultural or political system that depends on participants living up to minimal commitments to quality, integrity, or diligence, where there are no consequences besides embarrassment for outright discarding those commitments, can thrive right up to the moment that someone stands up with a big shit-eating grin and blithely drops through the ground floor. As they fall, they often pull everyone with them into an abyss, because no one wants to be the last chump trying to live up to quaint standards and expectations.

The Chronicle of Higher Education has decided to be a case in point by giving Naomi Schaefer Riley a chance to pull their coverage of academic institutions and life through the event horizon of shamelessness. If you haven’t read the columns, this summary will do: in column 1, Riley deemed some dissertations in Black Studies, and their graduate-student authors, to be worthless, ridiculous wastes of resources and effort, without reading the dissertations in question or knowing anything about the topics covered. Riley judged, for example, that “black midwifery” in the 19th Century U.S. is self-evidently an absurdly overspecialized and pointless subject to study. In column 2, Riley doubled down and said that in writing a column about dissertations in a publication about higher education, it was “not her job” to read those dissertations.

I am maybe less surprised than some that the Chronicle of Higher Education has so aggressively punched a hole into the sewer, though I’m hoping that they can rethink it a bit before jumping in behind their columnist. I’m not so surprised because the CHE has plainly struggled with digital culture, like many mainstream media publications that had a sense as late as the early 2000s that their business model was immortal and invulnerable. When you don’t have a commanding vision of your own, you’re vulnerable to the pitches of snake-oil salesmen. In this case, the Chronicle fell for the idea that the only way to command attention in an online age is to hire a bunch of rodeo clowns and let them enrage the bulls, that you get your daily dose of eyeballs by handing a megaphone to the most aggressively careless person you can find and then coming along afterwards to say that you only wanted to “start a discussion”.

This isn’t the only way to build a readership: there are plenty of other models. It’s an especially unsuitable way to approach building a digital presence for a publication devoted to academia. I’m feeling somewhat vindicated about my own decision not to move this blog inside their architecture when I was invited to do so. I don’t think I’d want to move in any event, as I’m proud of this blog’s continuing presence inside the information architecture of my own institution. But publications that suddenly decide to throw their considerable weight behind the next new thing often make bad mistakes as they rush to catch up. They learn the wrong lessons from consultants and advisors rather than working their way towards authenticity along the slower, harder road of home-grown practices.

I’m not particularly interested in rewarding the Chronicle for bowing to the shameless style. If they want to continue to try and get my eyeballs–or links–with this kind of writing, I’m not going to oblige. And I think that if it keeps up, I’ll be asking my colleagues to reconsider our institutional subscriptions to the print edition.

Saying It Again https://blogs.swarthmore.edu/burke/blog/2012/01/10/saying-it-again/ Tue, 10 Jan 2012 18:03:14 +0000

From the department of pointless but compulsory exercises: every single time Rick Santorum or anyone with similar views says the following two things:

a) What, you want gay marriage? What’s next, legitimating polygamy?

and

b) The only form of marriage that any human society in all of human history has ever legally sanctioned is between one man and one woman,

the following rejoinder should be automatic from anyone in the audience to whom these things are being said:

c) Actually, Rick, the most commonly sanctioned or legalized form of marriage in human history across a wide span of societies has been polygamy, albeit with numerous variants. You might notice this if you actually read the Bible like you claim to.

However, there’s something more at stake in this special cultural conservative version of an all-Cretans-are-liars paradox. It’s not just a question of whether it’s ignorance or cynicism lurking behind political pandering.

What this paired sentiment expresses more deeply is a have-your-cake-and-eat-it-too vision of modernity and progress among cultural conservatives, and not just in the United States. I see something of the same in the most skilled recyclers of the tradition-modernity relation that was given its undead power under colonial rule in 20th Century African societies.

If I were able to actually have a conversation with Santorum in which the historical reality of sanctioned polygamy in most human societies was made impossible for him to ignore or soundbite into oblivion, I’m willing to bet that the likely way out of the trap would be to argue that contemporary life has overcome that old evil, that we’ve progressed. Santorum and other American Christian conservatives would likely put the origin of that progress somewhere other than where secular liberals would. They’d probably ascribe it to the rise of Christianity, all the way back to the early Church, whereas a more secular (or at least not religiously conservative) view would probably be that contemporary companionate, monogamous marriage (or any companionate, monogamous relationship, really) is a direct consequence of the working out of liberal individualism and rights-based personhood after 1750.

But it really doesn’t matter which claim you turn to. If you think that the relative eclipse of polygamy (still practiced and legally as well as morally sanctioned in many parts of the world) is a good thing, as I presume Santorum does given his suggestion that legally sanctioning gay marriage would open the door to polygamy, then you believe in progress: that some aspects of the human condition have improved over time through the deliberate efforts of human beings to reform or change their social structures. And the moment you believe in that, saying “It’s natural for people to live a certain way, all societies have done it that way” is off the table as a justification of contemporary policy, whether or not your claim about the naturalness of living that way is true.

(Which, in fact, Santorum’s claim about the universality of nuclear families and monogamous marriages is not. Not in any way, including what it implies about homosexual practices. The foundation stone of ‘the Western tradition’, classical Greece, very much included sanctioned homosexual relationships between male citizens, for example.)

The moment you accept that progress is the real explanation for a transformation in human practices that you defend or endorse, you shouldn’t be able to invoke the universal, unchanging natural character of that practice against some other argument for yet another change or reform.

And yet, of course, this is done all the time, because the rhetorical alternatives are to either embrace arbitrary bigotry or construct some weird Tower-of-Babel claim about the future consequences of reform. E.g., in the case of gay marriage, if modern companionate relationships are a good example of progress, that means that we’re capable of changing how we legally and socially sanction and regulate marriage or relationships for the better. If we’re capable of that, why not include sanctioning companionate relationships between same-sex couples? With the invocation of unchanging, natural traditions disallowed, the only ‘why nots’ left are: because we should hate or despise same-sex couples for fundamentally arbitrary or non-rational reasons; or because sanctioning same-sex relationships would lead to further bad consequences. American cultural conservatives often take a stab at the second argument in public discourse (indeed, that’s where Santorum leads into his ‘oh noes bestiality-will-be-legal’ line), but this is an even easier set of arguments to puncture: either the imagined consequences are those which already follow in full measure from legally sanctioned heterosexual relations, or they involve a vision that legal sanction is the same as contagion, that it creates practices that would not otherwise exist, a belief that has a lot of odd collateral implications.
