First Thought: On Generosity and Teaching

In the last week, I’ve heard a lot of staff and faculty talk about how hard it is going to be to teach and advise students in the near-term future, how much they now feel they have to second-guess everything they might say, and how much they fear that whatever they say might be misquoted, appropriated, or misused.

Teaching and advising and counseling are all intimate practices if they’re done in the way that Swarthmore and other small liberal-arts colleges aspire to do them. Which means they involve vulnerability on all sides. It means that they can, potentially, take up as much emotional and intellectual energy as both teacher and student have to spare and then go well beyond that. One reason that I think committed scholars at large research universities were at least passively acquiescent in the gradual outsourcing of some undergraduate instruction to adjuncts and graduate students starting in the 1980s was simply that they were grateful to be relieved of the heavy emotional and temporal weight that committed face-to-face teaching and counseling places upon professionals.

That weight is harder and heavier to carry depending upon the way that students and colleagues read your identity and personality. A professor or administrator with an open emotional affect may find themselves with more clients than someone with a more forbidding or severe affect, which is no one’s fault: it’s just how we read people and how we present to be read. But that affect may not match up with what we can actually provide to students and advisees, either intellectually or emotionally: a severe-seeming person may connect in more ways with more people than someone who seems open and inviting.

More pressingly, students often expect or demand support, empathy and guidance from women, people of color, and GLBTQ faculty and staff in ways that they don’t from white male professionals. I’m really glad when I can mentor or connect with a student in a way that goes beyond the classroom. I’m also very aware that there are students who wouldn’t think to seek me out for that guidance but would make very strong demands of faculty or staff whom they perceive as being closer to their own identities and backgrounds. As one of my colleagues likes to put it, “the students aren’t going to be in your office crying on your desk very often, but they’re there every day doing it to me.”

Even when students are right about their perceptions of a similarity of background, that means the work of teaching and advising is unevenly distributed (which is a major reason why it is critical to pursue faculty diversity), and those faculty and staff thus have to do more to protect their time and their energy. And sometimes the students are wrong: some likenesses they imagine and bring into a teaching or counseling experience are not there at all. Which puts that teacher or advisor in a very complicated position, not wanting to spurn or turn away from the responsibilities of the job, needing somehow to get things clear at the start, but also not wanting to have to spend the rest of their professional lives condemned to recount deeply personal histories of their own lives every September to a student who is seeking somebody or something to hold onto. That’s my privilege, to not face that dilemma. It’s also a place where students need somehow to find more places to put their trust, to take more chances, while also being more heedful and self-aware themselves about what they’re asking of the teachers and advisors they turn to preferentially.

Trust is the big word here. You cannot trust again and again when your trust is always broken, never returned. I know that. But teaching and counseling, if they’re going to be small and focused and human and intimate, can’t happen in the absence of generosity. Generosity that leads to creativity, to taking chances, that produces a diversity of models and practices. This week has reminded me yet again that Swarthmore and places like it have badly oversold themselves as bubble utopias, but even if we finally gave that pretense up, our remaining distinction will rest on practices that require this special kind of generosity, one that is increasingly in short supply in the world around us. If we can’t manage that much, we might as well come out with our hands up and surrender to the MOOC.

————–

I’ve been married for almost 27 years. When I was dating my wife in college, I was the child of an upper middle-class professional family and she was from a working-class background. I did a lot of what would now be called “microaggression” early in our relationship, around social class. Obviously I was forgiven or tolerated for it, for which I’m very grateful. Life and time and experience and my wife taught me lessons that my undergraduate courses couldn’t. I’m still learning. But we also both know that even after 27 years and knowing each other better than anyone else, we often still don’t know exactly what the other person needs that day, is feeling or thinking. Sometimes we guess wrongly, occasionally profoundly so. I sometimes think, given that families and partners and long-time friends often get that kind of thing wrong, how can I or anyone else hope to get it right with a student that we’ve only known for a few weeks or at best a few years?

Good teaching and advising (and learning and being advised) require the freedom to experiment, to come at matters from several angles, to throw things at the wall and see what sticks. Which means they also require making mistakes and courting misrecognition. There is an irreducible element of risk. No amount of sensitivity training can get away from that risk. The only way to avoid it is to brutally standardize all moments of human interaction, to create a technocratic infrastructure that regulates and records anything that might involve the risk of getting it wrong. The history of the last century shows that in complex modern institutions, if you invite that technocracy in the door, it rarely needs a second invitation, and it never departs until it manages to thoroughly hollow out and distort the host institution.

Doing what we do right–in a human way, without technocracy–does involve taking responsibility for mistakes, both individually and institutionally. Generosity in teaching and advising requires honesty and transparency to work, and a strong measure of humility as well, on all sides. That much needs a good deal of repair and rethinking at Swarthmore and institutions like it, plainly. Nor should anyone call systematic forms of racism or discrimination a “mistake”. The security guards that Seema Jilani describes at the White House Correspondents’ Dinner weren’t making a mistake, weren’t engaged in ordinary oversight or misrecognition. Whoever is pissing on the Intercultural Center isn’t just “Whoops, I didn’t get that this bothers you”. Sexual assault isn’t a case of “oh, I didn’t realize that wasn’t consensual”.

But if there isn’t other space left in most of what we do as teachers, advisors and community members to make mistakes, space left to misunderstand, space left to not give in all cases what someone needs or wants, space left to talk past or not hear, space left to try things, space left to give what we individually know how to give and refuse what we individually aren’t able to provide, we’re pretty well done. All the creative, generative possibilities for the future of a liberal arts education depend upon that.

Posted in Academia, Swarthmore | Comments Off on First Thought: On Generosity and Teaching

The Humane Digital

As a way of tackling both the question “whither the humanities” and the thorny issue of defining “digital humanities” in relationship to that question, I’ll offer this: maybe one strategy is to talk about what can make intellectual work humane.

First, let’s leave aside the rhetoric of ‘crisis’. Yes, if we’re talking about the humanities in academia, there are changes that might be called a crisis: fewer majors, fewer resources, a variety of vigorous attacks on humanistic practice from inside and outside the academy. Are the subjects of the humanities–expressive culture, everyday practices, meaning and interpretation, the philosophy and theory of human life, and so on–going to end? No. Will there be study and commentary upon those subjects in the near-term future? Yes. There will be a humanities, even if its location, authority and character will be much more unstable than they were in the last century. If we want to speak about and defend the future of the humanities with confidence, it is important to concede that a highly specific organizational structuring of the highly specific institution of American higher education is not synonymous with humane inquiry as a whole. Humane ways of knowing and interpreting the world have had a lively, forceful existence in other kinds of institutions and social lives in the past and could again in the future. To some extent, we should defend the importance of humane thinking without specific regard for the manner of its institutionalization in part to make clear just how important we think it is. (E.g., that our defense is not predicated on self-interest.) Even if we think (as I do) that the academic humanities are the best show in town when it comes to thinking humanely.

I keep going back to something that Louis Menand said during his talk at Swarthmore. The problem of humanistic thought in contemporary American life is not with a lack of clarity in writing and speaking, it is not with a lack of “public intellectuals”. The problem, he said, is simply that many other influential voices in the public sphere do not agree with humanists and the kind of knowledge and interpretation they have to offer.

With what do they disagree? (And thus, who are they that disagree?) Let’s first bracket off the specifically aggrieved kind of highly politicized complaint that came out of the culture wars of the 1980s and 1990s and is still kicking around. I don’t think that’s the disagreement that matters except when it is motivated by still deeper opposition to humanistic inquiry.

What matters more is the loose agglomeration of practices, institutions and perspectives that view human experience and human subjectivity as a managerial problem, a cost burden and an intellectual disruption. I would not call such views inhumane so much as anti-humane: they do not believe that a humane approach to the problems of a technologically advanced global society is effective or fair, and they hold instead that we need rules and instruments and systems of knowing that overrule intersubjective, experiential perspectives and slippery rhetorical and cultural ways of communicating what we know about the world.

The anti-humane is in play:

–When someone works to make an algorithm to grade essays

–When an IRB adopts inflexible rules derived from the governance of biomedical research and applies them to cultural anthropology

–When law enforcement and public culture work together to create a highly typified, abstracted profile of a psychological type prone to commit certain crimes and then attempt to surveil or control everyone falling within that parameter

–When quantitative social science pursues elaborate methodologies to isolate a single causal variable as having slightly more statistically significant weight than thousands of other variables rather than just craft a rhetorically persuasive interpretation of the importance of that factor

–When public officials build testing and evaluation systems intended to automate and massify the work of assessing the performance of employees or students

At these and many other moments across a wide scale of contemporary societies we set out to bracket off or excise the human element, to eliminate our reliance on intersubjective judgment. We are in these moments, as James Scott has put it of “high modernism”, working to make human beings legible and fixed for the sake of systems that require them to be so.

Many of these moments are well-intentioned, or rest on reliable and legitimate methodologies and technologies. As witnesses, evaluators, and interpreters, human beings are unreliable, biased, inscrutable, ambiguous, irresolvably open to interpretation. Making sense of them can often be inefficient and time-consuming, without hope of resolution, and sometimes that is legitimately intolerable.

Accepting that this is the irreducible character of the human subject (the one universal that we might permit ourselves to accept without apology) should be the defining characteristic of the humanities. The humanities should be, across a variety of disciplines and subjects, committed to humane ways of knowing.

So what does that mean? Humane knowledge should be:

Incomplete. E.O. Wilson recently complained that the humanities offer an “incomplete” account of culture, ethics and consciousness (and kindly offered to complete the account by removing the humanities from the picture completely). What Wilson sees as a bug is in fact a feature. The humanities are and should be incomplete by design—that is, there should be no technology or methodology, even as an imagined future possibility, that would permit complete knowledge via humane inquiry, nor should we ever want such a thing to begin with. A humane knowledge accepts that human beings and their works are contingent in interpretation. Meaning that much, if not absolutely anything, can be said about their meaning and character. And they are contingent in action. Meaning that knowledge about the relatively fixed or patterned dimensions of human nature and life is a very poor predictor of the future possibilities of culture, social life, and the intersubjective experience of selfhood.

Slow. As in “slow food”, artisanal. Humane insights require human processes and habits of thought, observation and interpretation, and even those processes augmented by or merged into algorithms and cybernetics should be in some sense mediated by or limited to a hand-crafted pace. At the very bottom of most of our algorithmic culture now is hand-produced content, slow-culture interpretation: the fast streams of curation and assemblage that are visible at the top level of our searching and reading and linking rest on that foundation. This is not a weakness or a limitation to be transcended through singularity, but a source of the singular strength of humane thought. We use slow thought to make and manipulate algorithmic culture: social media users understand very quickly how to ‘read’ its infrastructures but it is slow thought, gradual accumulations of experience, discrete moments of insight, that permit that speed. There is no algorithmic shortcut to making cultural life, just shortcuts that allow us to hack and reassemble and curate what has been and is made slowly.

Dedicated to illegibility. By this I do not mean “difficult writing” in the sense that has inspired so much debate within and about the humanities. By this I mean a permanent, necessary suspicion baked into our knowledge about all political and social projects that require a human subject to be firmly legible and compliant to the needs of governance in order to succeed in their operations. Often the political commitments of humanists settle down well above this foundational level, where they are perfectly fine as the choices of individual intellectuals and may derive from (but are not synonymous with) humane commitments. That is to say, our political and social projects should arise out of deeply vested humane skepticism about legibility and governability but as a general rule many humanists truncate or limit their skepticism to a particular subset of derived views.

Is this a riff on Isaiah Berlin’s liberal suspicions of the utopian? Yes, I suppose, when it’s about configuring the human subject so that it is readily understandable by systems of power and amenable to their workings. But this is also a riff on “question authority”: the point is that if power can be in many places, from a protest march to a drone strike, the humane thinker has to be a skeptic about its operations. Humane practice should always be about monkey-wrenching, always be the fly in the ointment, even (or perhaps especially) when the systems and legibility being made suit the political preferences of a humane thinker.

Playful, pleasurable, & extravagant. My colleague in a class I co-taught last semester made me feel much more comfortable with my long-felt wariness about the influence of Bourdieu-ian accounts of institutions and culture, and about how in particular they’ve had a troubling effect on humanistic inquiry that often amounts to functionalism by another name. She read Michele Lamont’s How Professors Think as calling attention to how much academics do not simply make judgments as an act of capital-d Distinction, as bagmen for a sociological habitus. Instead, she argued that it was evidence for the persistence of an attention to aesthetics, meaning and pleasure that is not tethered to the sociological (without arguing that this requires depoliticizing the humanities). That our intellectual lives not only should be humane but that they already are.

This is very much what I mean by saying that humane knowledge should be playful and even extravagant: that every humanistic work or analysis should produce an excess of perspectives, a variety of interpretations, that it should dance away from pinning culture to the social, to the functional, to the concrete. Humane work is excess: we should not apologize meekly for that or try to recuperate a sense of the dutifully instrumental things we can do, even as we ALSO insist that excess, play and pleasure are essential and generative to any humane society. That their programmatic absence is the signature diagnostic of cruelty, oppression and injustice. This is what I think Albie Sachs was getting at in 1990 when he said that with the beginnings of negotiations for the end of apartheid, South African artists and critics should now “be banned from saying culture is a weapon of the struggle”. Whatever fits the humane to a narrow instrumentality, whatever yokes it to efficiency, is ultimately anti-humane.

So what of the digital? Many defenders of the humane identify the digital as the quintessence of the anti-humane, recalling the earlier advent of computational or cliometric inquiry in the 1970s and 1980s. Should we prefer a John Henry narrative: holding on to the last gasp of the humane under the assault of the machine?

Please, please, no. Digital methods, digital technologies and digital culture are already a good habitus of humane practice and the best opportunity to strengthen the human temperament in humanistic inquiry.

Again and again, algorithmic culture has confronted the inevitable need for humane understanding, often turning away both because of its costs (when the logic of such culture is to reduce costs by eliminating skilled human labor) and because of a lack of skill or expertise in humane understanding among the producers and owners of such culture. I’ve long observed, for example, that the live management teams for massively-multiplayer online games frequently try to deal with the inevitable slippages and problems of human beings in digital environments by truncating the possibilities of human agency down to code, making people as much like a codeable entity as possible, engineering a reverse Turing test. And they always fail, both because they must fail and because they don’t understand human beings very well.

This is an opportunity for humane knowledge (we can help! Give us jobs!) but also often evidence of the vigor of humane understandings and expertise: the human subject as we understand it recurs and reinvents itself so insistently even in expressive and everyday environments that see a humane sensibility as an inconvenience or obstacle.

But this is not just an extension of the old; it is sometimes, in a very exciting way, genuinely new. “Big data” and data analytics are seen by some intellectuals as an example of opposition to the humane. But in the hands of many digital humanists or practitioners of “distant reading”, they demonstrate that the humane can become strange in very good ways. Schelling’s “segregation model” is not an explanation of segregation but a demonstration that there are interpretations and analyses that we would not arrive at on our own, a reworking without mastery. The extension and transformation of the humane self through algorithmic processing is not its extinction: approached in the right spirit, it is the magnification of the humane spirit as I’ve described it.
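
To make the example concrete, here is a minimal sketch of a Schelling-style model (in Python; the grid size, threshold and other parameters are illustrative choices of mine, not Schelling’s originals). Agents of two types relocate whenever too few of their neighbors resemble them, and even quite mild individual preferences aggregate into stark segregation, a pattern nobody in the model intends:

    import random

    SIZE = 20          # the world is a SIZE x SIZE grid
    EMPTY_FRAC = 0.1   # fraction of cells left empty so agents have room to move
    THRESHOLD = 0.3    # an agent is unhappy if fewer than 30% of its neighbors match it
    STEPS = 50         # rounds of moving to simulate

    def make_grid():
        # Fill the grid with two agent types ('A', 'B') and some empty cells (None).
        return [[None if random.random() < EMPTY_FRAC else random.choice('AB')
                 for _ in range(SIZE)] for _ in range(SIZE)]

    def neighbors(grid, x, y):
        # Occupied cells in the 8-cell neighborhood around (x, y).
        out = []
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                if (dx, dy) == (0, 0):
                    continue
                nx, ny = x + dx, y + dy
                if 0 <= nx < SIZE and 0 <= ny < SIZE and grid[nx][ny] is not None:
                    out.append(grid[nx][ny])
        return out

    def unhappy(grid, x, y):
        agent = grid[x][y]
        nbrs = neighbors(grid, x, y)
        if agent is None or not nbrs:
            return False
        return sum(n == agent for n in nbrs) / len(nbrs) < THRESHOLD

    def step(grid):
        # Every unhappy agent moves to a randomly chosen empty cell.
        empties = [(x, y) for x in range(SIZE) for y in range(SIZE) if grid[x][y] is None]
        movers = [(x, y) for x in range(SIZE) for y in range(SIZE) if unhappy(grid, x, y)]
        random.shuffle(movers)
        for x, y in movers:
            if not empties:
                break
            ex, ey = empties.pop(random.randrange(len(empties)))
            grid[ex][ey], grid[x][y] = grid[x][y], None
            empties.append((x, y))

    def mean_similarity(grid):
        # Average fraction of like-type neighbors: a crude index of segregation.
        scores = []
        for x in range(SIZE):
            for y in range(SIZE):
                agent, nbrs = grid[x][y], neighbors(grid, x, y)
                if agent is not None and nbrs:
                    scores.append(sum(n == agent for n in nbrs) / len(nbrs))
        return sum(scores) / len(scores)

    grid = make_grid()
    print("similarity before: %.2f" % mean_similarity(grid))
    for _ in range(STEPS):
        step(grid)
    print("similarity after:  %.2f" % mean_similarity(grid))

The point of the sketch is not predictive accuracy; it is that running the model tells you something you would not have arrived at by introspection, which is the kind of reworking without mastery I have in mind.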

This is not a CP Snow “two cultures” picture, either. Being humane is not limited to the disciplines conventionally described as the humanities. Natural science that is centrally interested in phenomena described as emergent or complex adaptive systems, for example, is in many ways close to what I’ve described as humane.

We might, in fact, begin to argue that most academic disciplines need to move towards what I’ve described as humane because all of the problems and phenomena best described or managed in other approaches have already been understood and managed. The 20th Century picked all the low-hanging fruit. All the problems that could be solved by anti-humane thinking, all the solutions that could be achieved through technocratic management, are complete. What we need to know next, how we need to know it, and what we need to do falls much more into the domains where humane thinking has always excelled.

Posted in Academia, Defining "Liberal Arts", Digital Humanities, Oh Not Again He's Going to Tell Us It's a Complex System | 3 Comments

On the Clery Act Complaint

I didn’t support the campaign to have fraternities restricted or banned at Swarthmore, largely for reasons I articulated earlier this academic year.

I do support the students who’ve filed a Clery Act and Title IX complaint against Swarthmore, and similar groups of students at other campuses.

I might disappoint those students by qualifying that support in the following manner: that I don’t think they yet have a completely clear view of the alternative processes or outcomes that they’d prefer. Once you start a conversation about the difficulties involved in building a better system, you might have some appreciation for why most administrations in higher education have settled for so long for the complex, contradictory and unsatisfying systems of reporting, counseling, and judiciary review that have been built up over the last three decades.

Let’s start with the support first, though. The first and foremost reason that I think this development is a very good thing for Swarthmore and institutions like it is that the filing plus an independent review by consultants will at last create a documented, independent body of testimony and evidence about student experiences and administrative procedures that everyone can use as the standard reference point for going forward.

Ever since I arrived at the college in the mid-1990s, I’ve known students who have alleged that the college handled reports of abuse, assault, harassment or rape inadequately. I’ve had a pretty wide range of committee assignments since I’ve been at Swarthmore, but one area that I’ve had no involvement with is judiciary procedures. The one time I served on the Dean’s Advisory Committee almost two decades ago, we mostly discussed alcohol policy, though with little address to the role of alcohol in assault or rape allegations. Without personal experience, I’ve had nothing else to use to evaluate those student allegations except trust in the students I know and trust in my colleagues in the administration, which have pulled in opposite directions. I do know from experience that students sometimes are profoundly wrong or exaggerated in what they say about other aspects of internal process or decision-making at the college. At the same time, I’ve known that sometimes faculty and administration don’t accurately hear or mentally transcribe what they say to students. So without anything direct to go on, it’s been hard to know what to say. Was this a common problem? A sporadic one? What kind of problem was it: a problem with specific procedures, with particular staff members, with a generalized culture, with a specific kind of incident, with the entire society around us? Most of the students I’ve heard from are students I trust very deeply, but they’ve almost always been telling me about what friends or friends of friends have said, not speaking about their own direct experiences. The constant thrumming of discontent has always worried me.

One reason that those stories were vague or indirect was the completely legitimate reason that victims often don’t want to go public, don’t want to have to endure skepticism and hostility, don’t want to have to repeatedly tell a story of trauma, don’t want to be responsible for educating everyone else about victimization.

Another and more important reason, it turns out, is that we have told students that our judiciary procedures require absolute confidentiality from everyone involved in a hearing, so that the students who took the step to most clearly document cases of harassment, assault or rape believed they were required to keep that documentation secret, whether or not they were satisfied with the outcomes of the process. (The student publication the Daily Gazette has published a very good series of investigative reports on this issue that helped to bring this point forward.) So our procedures, intentionally or not, have helped to maintain an environment where it is impossible for the community to have documented knowledge or awareness of the incidence, character or resolution of assault and rape and yet equally where survivors and victims can do little but informally or privately testify about their experiences. Small wonder then that there has been a recurrent, corrosive murmur about the untrustworthiness of institutional process: there has been no way for that murmur to be anything more.

Which is why clear, documented, transparent scrutiny from several different bodies is a good and necessary outcome. It’s the only way to move forward.

Whatever the investigations find, however, there are some persistent contradictions in the advocacy of the students filing the complaints that will prove hard to resolve.

Some changes will be easy to make, and may already have been made. Particularly on reading the investigations in the Daily Gazette, I don’t have any hesitation about saying the following:

1) That a student reporting harassment, assault or rape should never, ever be asked if he or she was drunk or in any way culpable. I don’t even think it’s relevant to ask whether the complainant clearly said no or made an objection to harassing behavior. As many critics have pointed out for decades, that puts the onus on the person being approached to say or do something, as if the person making an approach can safely assume until something is said that it’s ok to make sexual advances or remarks. If someone’s reporting, the baseline assumption should be that there’s something to report and that the person reporting is a victim of another person’s actions.

2) That sympathy for and counseling to victims be absolutely hard-wired into the reporting process. It’s my impression that the college has moved pretty forcefully in this direction already.

3) That once a report is made, there’s a public record of the report, without names, creating a verified, public database about the incidence of such reports and their resolution. I think from 2011 onward, Swarthmore and most other colleges have fixed that part of the process, in response to federal requirements.

4) That some accommodations of victims should be made much more expeditiously than we have done. For example, moving accuser or accused to other dorms should happen without a lot of hassle or delay. To be honest, I think that should even happen more quickly in cases where there are strong personality conflicts–most residential colleges have tended to treat most friction in living spaces as a “learning experience” that relates back to diversity.

5) That we can do way better than force victims to sit down with the accused in a small room and have to testify to peers, faculty and administrators who may or may not have training or experience relevant to rape and assault cases, in an environment that is at best indifferent to the mental well-being of the victim.

6) That we shouldn’t ever restrain victims from speaking about their experiences. Confidentiality is a powerful but exceedingly dangerous sociopolitical technology that should be used only in very specific and limited contexts. Academic institutions are prone to the massive overuse of confidentiality across a very broad range of practices and procedures, and this is one of them.

However, rethinking judiciary procedures specifically is likely to be a bigger problem.

Here’s the chief contradiction I see among the advocate groups who have been pushing for changes at Swarthmore and other campuses. Some of the students involved, including some at Swarthmore, argue that they would like to see much more expeditious actions taken against reported rapists or assailants, most typically quick movement on expelling offenders and creating some form of permanent record or notation of the reasons for their expulsion. (See for example Tucker Reed’s account of her experiences at USC.)

At the same time, some advocates respond that they would prefer for action to be taken by administrative processes within their institutions rather than by the external legal system, for a number of reasons. First that the legal system is by any standard even slower at producing results; second that it is often far more violating or traumatic for victims than the worst collegiate procedure; and most interestingly third that many of the victims report some degree of compassion or concern for the future of their attackers, preferring that they simply be removed from the community rather than suffer criminal penalties. (At the same time, most victims quite legitimately reject outright that they themselves be compelled via a judiciary process to participate in the rehabilitation or education of their attackers.)

All of these points make sense but they pull in opposite directions. For one, the idea that we should be sufficiently sympathetic to rapists or assailants within a community to not seek criminal penalties, just removal, is in tension with the frequently-repeated dictum that rape is rape, that we shouldn’t see rape that involves two drunken acquaintances who’ve had consenting encounters in the past as any different or lesser than “stranger rape” that leaves the victim severely injured. Nor is it fair to leave the determination of whether to treat rape as a criminal violation reported to the police as a burden on the victim. But requiring or mandating a criminal report in all instances creates problems: by their own account, many victims might be less likely to make a report if they knew that was the outcome, both out of anxiety about the process (knowing, among other things, what an adversarial system will do to distort or manipulate the victim’s experiences) and even out of reluctance to visit criminal penalties on their attackers.

If that leaves colleges like Swarthmore with a need to have a better or different judicial process, what should that look like? I’ve always been a bit unhappy with the pseudo-judicial systems that many universities maintain: they’re demonstrably prone to manipulation in other ways and on other issues besides rape and assault.

If the desired outcome is that an attacker be expelled quickly, and that there be a record of the reason for the expulsion, that’s going to create some serious burdens on the institution. First because that’s a fairly serious penalty that requires something like, if not identical to, the presumption of the American judicial system: a presumption of innocence until due process is observed. Let’s say you expel a student after three years and put on their transcript, “Expelled for sexual assault”. The expelled student can rightfully say that you have deprived them of the benefits of three years of tuition and the expected lifetime benefits of completing a Swarthmore education. Considering that in recent years, students have sued universities even over what seem like open-and-shut issues like low grades in a course, it’s not unreasonable to expect that more frequent expulsions with clear transcription of the cause will lead to litigation, with potentially large damages being sought.

Meaning that the standards for a finding of assault in an internal procedure would have to be high enough to withstand scrutiny in a civil proceeding, and also that they probably ought to be that high anyway, in that being expelled and having a transcript with a note as to cause is a serious penalty, even if not as serious as a criminal finding of rape or assault might be. If so, that’s likely to run counter to what the critics of current policies are seeking, in several respects. The counseling of victims would have to be utterly firewalled away from a judicial procedure (the degree to which that hasn’t been the case is the source of a lot of frustration, in that we’ve formerly had deans serving both as the advisors for a judicial procedure and as counselors to victims), and the judicial procedure would have to operate with something like the presumption of innocence for the accused, if not with an openly adversarial approach to evidence and questioning, which walks us right back into the problem of seeming unsympathetic or skeptical towards victims. In a small school, it’s going to be very difficult to have one institutional structure that puts no burden on victims, openly acts as their advocate and counselor, and promises them justice, redress and healing, and then have another institutional structure that can’t promise anything of the sort, and then have those two structures interact to produce a coherent and decisive outcome very quickly. It’s going to be equally hard not to have that kind of two-sided approach, however. The students seeking change shouldn’t expect that it’s going to be easy or even possible to create an internal procedure that does all or even much of what they advocate.

They should also consider that building such a system might have many unexpected or unanticipated consequences. For example, if the process has to withstand legal scrutiny, it might be hard to keep other kinds of criminal actions (involvement in drugs, underage drinking, non-sexual violence, even intellectual property violations) wholly out of the loop of that system, to make other kinds of infractions subject to more informal, confidential mediation. We might almost have to have three (or more) separate systems or to define sexual assault, rape and harassment as offenses which are so completely different in impact and gravity that nothing else requires the same kind of process or procedure. In the case of the latter approach, it probably won’t be long before someone argues that some other class of offense or harm is equally serious and requires the same kind of handling.

I suppose as something of an afterword, I’d also suggest that the students bringing the complaint should not be quite so cynical about the possible outcome of an independent review. I support the filing of the Clery Act complaint because I think the more investigative processes the better and because it’s the only way to create more trust in the long-term in the community. But it’s important not to treat investigations of this kind as a zero-sum contest that can either be won or lost, and to therefore “work the ref” by creating a pre-emptive narrative about the intentional insincerity of your opponent and therefore the pervasive untrustworthiness of everything “they” are doing. I think a probing investigation–whether by consultants or the federal government–is likely to find that the mistakes and problems of institutional process over the last twenty years are a product of messy histories (ranging from the crazy-quilt contradictions of in loco parentis at colleges to an ever-shifting legal environment). If so, there is no “they” who has acted with devious intention, nor any “they” who are out to hurt the institution. When I look at this filing, I don’t see sides, I see a lot of people who want the best for Swarthmore and higher education and who are doing the right thing as much as they can in the way they see best, and in the case of the students, doing it with great courage and determination. This isn’t zero-sum: it’s the rare kind of dispute where there are ways for everyone to come out a winner, if only everyone will leave enough space for that to happen and have enough generosity to agree that once we get past the obvious changes, there will be difficult puzzles that can’t be so easily solved.

Posted in Academia, Swarthmore | 36 Comments

Scan Here For More Information

We still have a Saturn that we bought brand-new not too long after the company began manufacturing. At 130,000 miles, it’s not exactly going strong, but it’s been a good car.

I have to admit that we bought it as much for sentimental reasons as any kind of hard-bitten consumer evaluation of price or quality. Meaning that we liked the promotional language about reinventing the relationship between labor and management, about rethinking the work process of car production, and about getting rid of the unsettling environment in car retailing where salesmen try to figure out how much they can get away with in setting a price.

I remember too that we used to buy Ben and Jerry’s ice cream not just because we liked it but also because we liked what we heard about their wage scale.

What’s curious is that of all the ethical commitments that liberal-leaning consumers with discretionary income try to maintain today (dolphin-safe tuna! locally-sourced food! environmentally-safe detergents!), the circumstances of workers rarely if ever figure into the imagination, and yet it’s not been so long since the treatment of workers did have a place at that somewhat trendy table. Now? You can see the banners at Whole Foods that mark off the company’s ethical commitments and not expect to see anything about its laborers or even about the labor conditions at the point of supply. That’s not just because the owner of the company is something of an infamous asshole about labor and regulation; it’s par for the course. Apple moved to deal with rumbles about labor conditions among its Chinese suppliers before they became a major issue, but it’s hard to imagine consumers making this a major part of their brand preferences or even forgoing certain products entirely. I don’t say that as an accusation against others: I can’t imagine myself not having a mobile device or desktop computer out of scruples about the workplace ethics of the producers.

What I can imagine is that I might be willing to pay more for a product that came with guarantees about workplace conditions. That is more or less how “ethical consumption” operates in general: as a form of upscaling. That’s where there’s a standard that’s going strong: fair trade. But it’s interesting to see how the application of fair-trade branding has been deep but narrow, confined to certain product categories, and how little the standards have changed the overall picture.

Ethical consumption built around labor standards runs into the same wall that similar kinds of branding efforts encounter: that they mean absolutely nothing without a trusted independent auditor who has extensive access to all parts of the production process or some other kind of extensive and transparent access to information about the manufacturer or supplier. A lot of products that are labeled as green or organic turn out to be little more than just that: labels.

I know that many activists are deeply suspicious of ethical consumption as a concept, indeed of consumption as a domain of meaningful agency or worthwhile causality. That’s a big conversation that I’ve been involved in for my entire life as a scholar. I’ve never accepted this disdain for consumption. But the time has come perhaps for different campaigns to come together to push for a general change, and labor issues should be the major reason for that banding together.

Our legal system insists that investors in public companies are entitled to information, and that the same information should be available to all of them at the same time. We also believe as a matter of policy that consumers are entitled to some information about finished products (nutrition, expiration dates, location of manufacture) but on the whole, consumers have much less available to them unless the manufacturer subscribes to an independent audit. That’s what should change. Every product I buy, whatever it is, should come with a small scannable tag that contains full disclosure of its site of manufacture, the supply chains for its components, the labor conditions in those manufacturing sites, the materials in the product, and so on. Falsifying that information should be a crime and expose the manufacturer to civil penalties.
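
Purely as a thought experiment, here is a sketch (in Python; every field name, value and URL below is hypothetical, not an existing standard or any company’s actual data) of what the disclosure record behind such a scannable tag might contain:

    import json

    # Hypothetical disclosure record for a single product. Every field name and
    # value here is illustrative; no real standard, supplier or auditor is implied.
    disclosure = {
        "product": "denim jeans (hypothetical example)",
        "final_assembly": {"country": "Bangladesh", "facility_id": "unspecified"},
        "components": [
            {"part": "cotton fabric", "origin": "unspecified", "supplier": "unspecified"},
            {"part": "zippers and rivets", "origin": "unspecified", "supplier": "unspecified"},
        ],
        "labor_conditions": {
            "independent_audit": True,
            "auditor": "unspecified",
            "last_audit_date": "unspecified",
            "report_url": "https://example.org/audits/placeholder",
        },
        "materials": ["cotton", "polyester thread", "metal hardware"],
    }

    # The scannable tag need only carry this record, or a pointer to it.
    print(json.dumps(disclosure, indent=2))

Whether the tag carries the record itself or just a pointer to it matters less than that the record exists, is complete, and is legally warranted to be true.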

In a digital age, keeping that tracking information associated with a product should be little additional burden to a company (I hope none of them would pretend that they themselves don’t really know where products are coming from or how they’re made?). If the companies can track me around the web, it’s only fair that I should be able to track them in turn. The only reason not to share it is that you don’t want it known by consumers. If I’m content, like Matthew Yglesias, with the proposition that poor countries not only do but should have lax safety standards, then that’s fine: I can go ahead and buy clothes made in those countries without hesitation. If I’m not content and actually think, unlike Yglesias, that there is something I can and should do about that situation, it would be a good thing to actually know that I’m looking at a pair of jeans made in Bangladesh rather than waiting for the brand name of those jeans or of the retail outlet that sells them to show up in the rubble of a collapsed building. Even libertarians (supposedly) believe in information, right?

Posted in Politics | 7 Comments

Outside the Classroom

One of the questions about “ethical pedagogy” that I keep circling around, as long-time readers of this blog know, is what kinds of teaching or engagement faculty should pursue with students beyond formal courses. It is a cliche to say that learning mostly happens “outside the classroom”, but as far as residential undergraduate colleges go, it’s a vitally necessary declaration. Of all the things that bothered me about Benjamin Ginsberg’s self-gratifying attack on all things administrative in higher education, The Fall of the Faculty, it was his scorn for “learning outside the classroom” that annoyed me the most. Because the disdain of faculty for anything outside of the formal classroom is yet another way that we are paving the road to adjunctification and MOOCification.

The problem, both ethical and practical, is how to carry out deliberate pedagogy inside the life of a residential college community. If my classroom pedagogy is built around eliciting and honing the widest variety of modes of interpretation and expression for my students, about exploration and freedom, my teaching in community has to be as well. There are no grades to hand out, no assignments to mark up. But another thing that’s different is that in community what students and colleagues say and do has a direct impact on my own life as a professional. The classroom is a bounded space: a student’s explorations of the content can hit the walls around the course hard but that’s still a learning experience that the student and I can almost always recuperate positively. A college community is a much bigger space–and its boundaries stretch out into the world. Even if there weren’t a pedagogical dimension, faculty have to care about how their community is perceived, about what it does in the world.

The stakes are different than inside a class, and yet in a way they’re not different at all.

Some years ago, I was speaking with several students I knew and liked about this problem. They told me that they felt frustrated that faculty largely ignored students who were involved in activism, or if they paid attention to them, they did so as passive cheerleaders. Real decision-making, they complained, was inaccessible to them.

I wrestled a bit with a feeling of irritation, because I’d had the conversation before. I’ve had it since. I wrestled a bit with a feeling of chagrin, because I’d been one of the students who’d said just this sort of thing to faculty in the 1980s. I wrestled with a feeling of weariness, knowing how I would have taken it then if a 45-year old had told me he’d been one of the students who’d said this sort of thing back in his salad days yadda yadda yadda. Which of course is the first reason we don’t have these conversations more often: a sense of futility, a sense that all this has happened before and will happen again, a sense that we can’t help but be assholes in some way if we get involved in the discussion. Which is in so many ways counter to the ethos of education, which rests always on the belief that we can somehow find a way to teach what we ourselves couldn’t learn nearly so well under the same circumstances before.

So I couldn’t just walk away.

I tried to make a few points. That, for one, how an institution decides anything is opaque to everyone, not because secrets are hidden but because deciding and changing and acting are particles and waves all at once: here, in some unpredictable meeting or moment, a decision is made by a few people or even just one person; over there, in some diffuse, fuzzy way, a hundred people wake up every morning and edge towards doing something in a different way. That a person who the flow charts say has all the authority in the world can find it impossible to simply make something happen, and that a thousand people can think they’re grasping a nettle and find instead that they’re dancing with the wind.

This is very zen.

I also pointed out that the students in question hadn’t really bothered to ask some basic questions about how things worked, and they still weren’t asking them–I was sitting right there ready to answer, but they’d rather complain and demand instead of being curious and asking open-ended or basic questions. One student countered that this was, essentially, above their pay grade–that they should be able to propose and someone (faculty, administration, “Swarthmore” in the abstract) should then dispose. Which really did nettle me as much as it would if a student skipped four or five class sessions and then asked me what was on the test. I’ve been pointing out for almost two decades that most of what students want to know about how things work is readily knowable, but they have to put the work in–and when they’re given indirect answers, accept that sometimes that’s because the way things work is indirect, and when answers involve confidentiality, it’s because there’s something that probably should be confidential. For all of the invocations of democracy, consultation and consensus, I find that students harbor the belief that when it’s something they want, there should be a quicker and more authoritarian way of making it happen, that all it takes is a declaration, a policy, an order. Or, conversely, that there are no costs to turning every decision into a massively consultative and communal process, as if no one will have to actually work on that process.

We ran over a few other points. That students, no matter how passionate they are about the college, have a different and more short-term interest in it than faculty, staff, alumni or trustees do. That a decision that will have consequences for the next forty years affects other groups much more than it affects them. That maybe there are a few things yet that they don’t know about Swarthmore, the world or themselves, that the position of a student–and of any scholar–ought to involve a measure of humility by design.

I explained why I didn’t agree with these students on the specific issue of the moment that concerned them. We had a good discussion on the particulars, and I gave them a few ideas about where they could push further with some hope of gaining ground if they were serious about the issue. And then these folks went back a week later to demanding what they demanded and complaining that no one was listening to them. Now this, I have to say, I didn’t do back when I was a student: I was actually much more drawn to the faculty who were doubtful about South African divestment first because I learned things about the issue from them (they tended to be the people who knew something about the issue) and because it was much more useful for sharpening my own advocacy than talking to someone who patted me on the head and told me to keep fighting the good fight. And they were vastly better than the administrators whose job it was to placate and defer us, or the faculty who frankly thought they could manipulate us into serving as foot soldiers in their own intramural conflicts with colleagues. But even in the case of administrators and trustees, I understood very well that they were doing work in meeting with us and listening to us, that we were asking for time and attention in an environment where both were scarce. I appreciated what we got in that respect even when I didn’t agree with the outcomes. Now, mind you, the faculty I gravitated to were being good teachers in challenging me–they didn’t treat me as they would a serious professional enemy, which would have been terrifying. But the more there is a high emotional and political cost to that kind of engagement on the faculty side because students regard it as impossible or insulting that there should even be a challenge, the less it will happen.

—-

Which brings me to the instance of the moment that has me thinking in these circles again. The college has invited our alumnus Robert Zoellick to be one of the honorary degree recipients at graduation this year, and some students have argued in terms that invite no dissent that this is absolutely unacceptable: that Zoellick is a key “architect of the Iraq War” and that his association with the World Bank, Goldman-Sachs and the Republican Party makes him moral anathema regardless of whether there are highly specific decisions or initiatives for which he bears primary blame that would warrant such condemnation.

And I read the discussion and thought, “Is there even a point to entering it? Would I be welcome at all? Would this just be The Man sticking his nose in?” But then of course if no one does, that just feeds this sense that no one is listening and no one cares. More importantly, it leaves this particular viewpoint a free space to define what the wider community thinks and believes. And if I’m thinking pedagogically, even the students who question the invitation could make their case more effectively, because right now they’re making a case that is disturbingly broad and rather careless with evidence.

I teach a class where one of my possible “teaching outcomes” is to help students who come in with a critique of the World Bank and development institutions to refine, strengthen and extend that critique. I’m personally a very strong critic of the Bank’s history and many of its current policies and a strong critic of the history of development institutions and the idea of development as a whole. I agree with Matt Taibbi’s description of Goldman-Sachs as a “great vampire squid wrapped around the face of humanity”. I think George W. Bush’s two administrations were a major political calamity. I think the Iraq War was an extraordinary failure, a violation, and that many people who should face serious professional consequences for their role in it have escaped very lightly.

But I don’t think there’s evidence that Zoellick played any particularly important or central role in that war and I think it’s troubling that students have implied that he did. I don’t think working at Goldman-Sachs should make someone absolute anathema. I think Zoellick did a considerably better job at the World Bank than his predecessor and that the Bank’s influence and actions demand serious consideration and nuanced evaluation. All of which and more can be a subject for considerable disagreement between political allies–or political opponents.

None of which affects how I think of my own students past, present and future, and the students of this college who came well before I ever taught here. It would take an extraordinary act of specific malice and evil before I would think to disown a student, an alum, from this imagined community. Because the central value of a liberal arts education, as I see it, is that we exert no mastery or ownership over what our students will become, and love them all for what they are and will be. If there were time enough, I’d invite every damn one of them to get an honorary degree and give a speech some day, so I can hear about what they’ve done and thought and become.

I heard an extraordinarily moving thing from a colleague at a recent panel on the future of the liberal arts. She decided to speak on behalf of what her students had told her they valued about the liberal arts, and the first thing they said was, “freedom”. Sure, there are distribution requirements and major requirements, they agreed, but otherwise they felt that they were free now and forevermore to develop their interests, passions, skills and aspirations as they saw fit and that the college believed deeply in that freedom. You cannot believe deeply in that freedom and then tell me that a person as accomplished in his professional life as Robert Zoellick isn’t a worthy part of this college’s legacy. You can’t tell me that there is only a narrow band of acceptable political and social commitments possible from a “Swarthmore education”, that if a student idly confessed that he might want to work for Goldman-Sachs, we should grab his ear, march him down to the Dean’s Office and expel him right there and then for conduct unbecoming of a social justice crusader. You can’t tell me that we’re about pluralism and diversity and then set a standard that more or less says “Republicans and bankers need not apply”. This isn’t neutrality or dispassion: it’s the deepest passion possible, for freedom and empowerment through education. This is what it means to have that politics: that you not just tolerate but embrace the many ways that students will go forth into life and make some use of what they’ve known and done and learned in their education. You exercise no veto over them: not now, not then, not tomorrow or the next day.

You can’t even tell me that somehow the fix is in, that this is just favoring the 1% or whatever. That’s where a student who complains ought to do himself or herself a favor and ask, “So how do these invitations go out anyway?” And here’s how: first, Swarthmore has a tradition of strongly favoring its own alumni when it comes to honorary degrees, which is a tradition I love even as it risks a certain insularity. Second, it’s open to anyone–even students–to make a suggestion at the beginning of the academic year, keeping in mind the preference for alumni. Third, it’s a sleepy, pleasant little committee that looks at the suggestions each year, and gently tries to get a list down to a manageable size of people who might be available and who might say yes, looking for pluralism and also for who might be comfortable giving the kind of speech that works in that setting. (Which, no matter who we’ve invited, has almost never been sharply political or score-settling, because our speakers generally choose to be more welcoming and inspirational to the entirety of the community gathered that day: families, faculty, students, alumni, staff, people from town.) Fourth, that same process has year after year after year in all the years I’ve taught at Swarthmore brought extraordinary crusaders for social justice, committed left-wing activists, dedicated philanthropists, remarkable researchers and scientists, poets and dreamers all to the graduation. There’s nothing more bad faith than criticizing the process the one time out of fifty that it produces an outcome that you don’t care for when the other forty-nine times you wouldn’t dream of complaining.

So is this teaching or scolding? I don’t know: both, I suppose, but that line gets crossed in the classroom too sometimes. Is it welcome? Is it helpful? Does it move things ahead or shut them down? I don’t know: but I know that what’s at stake is both more and less than the education of students this year–and all years.

Posted in Academia, Swarthmore | 17 Comments

Stagnation

By now, I think everyone knows that the new Sim City is a flaming car wreck, the gaming equivalent of Ishtar or Hudson Hawk, the kind of misfire that raises serious questions about its corporate creator and the entire industry.

But it’s only one of many so-called “AAA” titles in gaming in the last year to raise those questions. Even products that appear successful or well-regarded document the consistent aesthetic underperformance of the most expensive, lavishly produced work the gaming industry is selling–and I think that underperformance is in turn severely limiting the potential commercial and cultural impact of games. AAA titles are increasingly easy for everyone outside of a small, almost entirely male, subculture to ignore. The only games that really matter to the culture as a whole right now are either independent products like Minecraft or light “casual games” that are mostly played on mobile platforms. (A further symptom of the cluelessness of the industry is that many developers therefore conclude that it’s the platform, not the game itself, that consumers prefer, and so they just move “it” into mobile, whatever “it” might be.)

Two examples of failures that the gaming subculture has anointed as successes: Far Cry 3 and Bioshock Infinite.

Far Cry 3 is ostensibly an “open world” game, a form that at its best is one of the most powerfully distinctive ways that a digital game can be unlike any other modern media text or performance. It’s a very hard form to produce, demanding a huge amount of content, a lot of QA testing, and flexible, responsive AI scripting. Small wonder that most studios steer clear of the form, or falsely claim to have done it.

Far Cry 3’s first problem is just that: it’s not really an open world. The player can in theory linger wherever they want and do what they want. But then it turns out that there’s a whole set of hidden “gates” throughout the gameworld that require the player to progress through the plot, to watch the cutscenes, to do what’s required of them by the developer. In a genuine open world, the player can go almost anywhere or do almost anything and progress the narrative at their discretion, which might open up a few new locations or new content, but as a supplement to the general environment rather than one more step in a linear sequence.

Progressing through the content wouldn’t be so bad in Far Cry 3 if the content weren’t so bad. And here we hit the second problem that is far more general to the industry: that the writing is not only for a very particular subculture of very particular men, it is largely BY that same subculture. Far Cry 3 is an almost laughably bad pastiche of racialized cliches: it makes Cameron’s Avatar seem like the most sophisticated postmodern rethinking of those tropes by comparison. What is worse both in narrative and gameplay terms is that the player is forced into inhabiting the subject position of one of those cliches. Rather than playing a cipher who simply witnesses and traverses the setting or a specific character who has an alienated or sideways relation to the gameworld, you have to be a character whose arc goes from “spoiled wealthy white American frat boy” to “low-rent dudebro version of John McClane killing dark-skinned drug dealers and bandits on a tropical island” to “white messiah saving natives”. You have to listen to “yourself” saying painfully stupid things throughout the entire game while saving painfully stupid friends. Perhaps worst of all for an allegedly open-world game, your character is frequently forced to do dumb things: walk into traps and trust the untrustworthy. (Plus you end up in QTEs for a number of key plot resolutions, which is like adding an extra turd to the shit sundae.) Far Cry 3 wants to be Grand Theft Auto but no one making it had an ear for the Rockstar aesthetic: all of the “interesting” people your character deals with make GTA IV’s Roman Bellic seem like a soothing, well-balanced presence by comparison. The only people who could possibly enjoy Far Cry 3 for its diegetic elements are the narrow demographic that wrote the game and that identifies with the protagonist.

What especially annoys me (and quite a few other commenters on digital games) is that the head writer, Jeffrey Yohalem, shrugs off all the criticism of the narrative and content because he claims it’s all meant to be stupid. Yes, a graduate of the English Department at Yale University is deploying the argument that he is subverting racist tropes by making them so enragingly stupid that they force players into a Brechtian relation with the game’s text, alienating them from the narrative “skin” lying over the gameplay structure. Or alternatively, that the game’s content seems racist because he’s forcing you to consume that content through the perspective of the young, naively racist protagonist and therefore forcing a confrontation with his subjectivity. If there’s anything that this argument makes me have Brechtian feelings towards, it’s whatever body of cultural theory Yohalem thinks he’s deploying in good faith to make this bad faith apologia for a clumsy example of what’s wrong with a lot of AAA games.

In contrast, Bioshock Infinite has a very well-imagined and literate conceptual and visual setting, which has led a ton of middlebrow game critics to raid the thesaurus looking for sufficient quantities of superlatives. Middlebrow criticism of popular genres and forms, particularly geeky ones, always carries a certain undertone of desperation as it tries to convince mainstream cultural critics that they too are dealing with art, or at least the potential for art.

The problem with Bioshock Infinite, which takes place in an alternate-history version of an early 20th Century American experiment in communal living, in this case a city in the sky defined by racial purity and evangelical Christianity, is that it isn’t much of a game. It’s described as a first-person shooter by most critics, but it’s largely a visually and narratively sophisticated reprise of an almost-dead genre of game, the “adventure game”, whose best-known example even today is the game Myst. Much of Bioshock Infinite consists of wandering in static environments clicking on objects to find out whether they will give you items or money that you can use later, or more narrative detail about the gameworld and the situation. This experience is periodically interrupted by combat setpieces where your character dispatches small squads of local law enforcers and by periodic dialogues with your companion Elizabeth, who is the other keystone to the eventual resolution of the plot. (The first being your own character.)

But just as Far Cry 3 forces you to endure your character’s cluelessness, Bioshock Infinite creates a very strange hybrid point-of-view and locks you into it. Your character knows things you do not know: about his past, about his motivations, about the events that set the game’s story in motion. Small details are revealed at first through environment and through your character’s occasional mutterings, then later by Elizabeth’s comments on you and your situation (and your subvocalized responses). But this creates a constant bizarre and uncomfortable tension: you are controlling the actions of a character who treats some of what you regard as novel or mysterious as expected or known, or who is blase and indifferent about some of what you (the player) find interesting, engaging or infuriating about the world of Columbia (the flying city).

Moreover, the gameworld only looks like it is three-dimensional. Like its two predecessors, Bioshock Infinite is the quintessential “roller-coaster ride”: there is almost nothing that actually turns on your choices or actions as a player, and almost every environment can only be traversed in one way even if it looks like there are multiple pathways through it. You can choose how you want to kill your enemies–shoot them, burn them, rip them up with a whirling saw. You can choose whether to look at all the extra narrative content provided–none of it is needed to progress, but since it’s the game’s major virtue, why skip it? You can sort of choose whether to click on every barrel or crate to gather ammunition and food, but in normal mode, you could skip that and make it through easily enough. Otherwise you are herded through the experience of the game like a cow through one of Temple Grandin’s soothing kill chutes. If you die, nothing really happens; it’s no more than a momentary inconvenience. You can’t jump off Columbia even if you try. You can’t go in most of the storefronts or buildings. You can’t talk to people, just listen to them speak their one line of prerecorded atmospheric dialog. It is absolutely the essence of being on something like Mr. Toad’s Wild Ride in Disneyland. You stay inside the car, the environment swings around and beeps and bloops and moves and that’s it.

The consumer-side question Bioshock Infinite ends up posing is: why spend $50 to watch a 3-hour animated film that has some very good art design, one fairly engaging if rather stock supporting character, an interesting underlying setting, a “trick ending” straight out of the M. Night Shyamalan school of scriptwriting, and repetitively staged intervals of interactivity that very nearly amount to the 21st Century version of a William Castle cinematic gimmick? The distinctive affordances of the medium go largely unused, and there is little point to experiencing the game more than once.

At least Bioshock Infinite has an imaginative soul inside of it, unlike Far Cry 3. But it shows again how culture industries routinely miss the mark, not just or even mostly about artistic aspiration but about the economic potential of the forms, genres and technologies that they supposedly mobilize to such fearsomely profitable effect. It may be that Bioshock Infinite or Far Cry 3 will make money for their producers, but the inefficiency of the relation between input and output in their cases ought to give anyone with an investment interest in the future of digital games serious pause. Particularly because the number of unquestionable disasters like Sim City is also rather hard to ignore. Consumers don’t necessarily prefer casual games, mobile games or games like Minecraft because they don’t like long, intricate games that take advantage of the medium’s distinctiveness. They just don’t want to waste their time and money on games that were written for 16-year-old boys who spend most of their time texting misanthropic comments to other teenagers or on games that don’t really have any “game-like” qualities.

Posted in Games and Gaming | 3 Comments

If It Gets You Tang, Space Foodsticks and Miniaturization, Then Go Ahead and Fly to the MOOc

So I remain firmly in the camp of people grumpy about the hype over MOOCs. Not so much about the reality of MOOCs, which is something that most of the hypesters remain defiantly unacquainted with.

Digitization in higher education has already often been a powerful, transformative force, and some of what MOOCs do with digitization will be good–occasionally in the way that NASA going to the moon was good, not so much for the moon itself but for all the stuff you have to build to get there.

Just to mention two examples of collateral good things about MOOCs evident in recent weeks. The first is that it’s increasingly possible they may pretty much kill off preceding forms of for-profit online education, most of which make even the most half-assed MOOC look good by comparison not just in terms of the quality of the experience but the cost of the service. More importantly in this context, MOOCs do their business out in the open. You can see what they are, you can see who teaches them. The University of Phoenix and its ilk made it impossible to see who was teaching, what their qualifications or skill as a teacher might be, or what a class was like until you were signed on as a paying customer, often carrying the heavy debt required to get in the door.

Which leads to the second good thing about MOOCs: even if they don’t succeed in being either the magic edu-topia or devilish capitalist plot that their advocates and detractors envision them to be, they probably will be an important force in bringing more academics into contact with a wider range of the public. Not that MOOCs are unique in that respect–it’s the same kind of thing that Wikipedia, digital culture, crowdsourcing and so on are doing. But it’s one more potentially useful push in this direction.

Most professors are used to a classroom that is a closed environment, to students who more or less accept the curricular strictures and disciplinary parameters that define a given course, and to the ultimate control that students’ need for certification or assessment has on their behavior in a course.

Professors who have either engaged wider publics about their discipline or research or who have taught in contexts where most of the students are adults seeking enrichment have learned to be open to a far wider range of challenges and to be far less directive about getting to a single predetermined set of outcomes.

You have to make a similar adjustment if you’re teaching a MOOC to thousands of students and many of them have no need for a certification of completion and have a heterogeneous range of reasons for an interest in the subject. Which is what made the story about an economics professor bailing out of a Coursera MOOC so interesting. I basically agree with Professor Catherine Prendergast, who was participating in the course as a student, that the professor was completely unused to having his disciplinary orthodoxies questioned by people defined as “students” and didn’t know how to cope with a group that didn’t simply obey his instructions about what they could and could not discuss.

MOOCs aren’t the best or most generative way I can think of to open classrooms and subject expertise to different kinds of feedback and pressure, but they are A way for that to happen. Yes, that means that Thomas Friedman’s latest blandulations have some validity to them, but roughly for the same reason that it’s possible to think that an astrological forecast has some truth in it: throw enough conventional wisdom and irrefutable fortune-cookie sloganeering at the wall and some of it’s bound to stick.

Posted in Academia, Information Technology and Information Literacy | 5 Comments

“Our Rate Even for Original, Reported Stories is $100”

About two years after I’d started blogging, a journalist friend of mine gently needled me about what I was doing. “You’re going to put us all out of business if you keep giving away all that stuff for free,” he commented. This was right when the bottom was beginning to fall out of print journalism as Craigslist eviscerated revenue from classifieds, other advertising was chasing readers online, and subscription revenue continued its downward trends in many urban markets.

I begged to differ. I still do, but with less blithe assurance than I had back in 2005. There were more blogs then than now, but what I was doing and most bloggers were doing didn’t really undercut what should have been the distinctive content advantage of print journalism. What online content creators were doing, however, without necessarily knowing it, was unbundling print journalism, and most audiences were paying for the whole bundle.

Around 2005-2007, you could get reviews of both popular and elite culture online, in a much wider range of attitudes and writing styles. You could read a vastly better and wider range of opinion and commentary on the news than the moribund, hopelessly establishment editorial pages of virtually any print newspaper or magazine. You could get better classifieds via Craigslist. You could often get better really local news via neighborhood listservs and similar community sources. If you didn’t care much about anything beyond the lede, you could get a quick feed of the major and minor news of the day through various aggregator blogs.

What the unbundling of journalism demonstrated is that most of its readers had completely forgotten the history of how all of that came to be bundled in the newspaper in the first place, a bundling that eventually created reasonable livelihoods for staff writers. Unbundling also revealed that the assumption that readers most valued long-form reportage by skilled and experienced writers was false. That was the most expensive kind of content to create (large salaries for the punditocracy squatting on the editorial pages aside) but not really what most of the audience was buying.

Fast forward to now, and this is how we’ve arrived in a world where Nate Thayer is asked by the Atlantic to donate his content for free so that he can “gain a platform for exposure”. This is a bad place to have arrived. So what’s to be done about it?

A few thoughts:

1) Comic books and long-form serious reportage may be in the same place. Meaning, they’ve been sold primarily in serial, short format through a very particular retail architecture, and both the creators and the retailers of this content have built their lives around the cash flow that this format created. But increasingly if audiences are willing to pay for work, they’d rather buy it in long-form: trade paperbacks for comics, non-fiction books or other long-form formats for reportage. Adjusting to this change in the market is going to kill some businesses outright: the comic-book store, the average daily newspaper. But it doesn’t have to put all the creators out of business–they just have to find new distributors and get used to creating work intended to be bought and consumed at that longer scale.

2) Readers who want original information need to stop visiting sites that want to cheap their way around acquiring it, and sites that want those readers and their eyeballs need to stop relying on a model of spraying content out like a firehose. Yes, I’m talking boycott, at two different ends. More people who create content, even folks who are indeed trying to “get exposure”, need to refuse to let organizations that can afford to pay publish their material for free and to enforce ownership rights when those organizations go ahead and try to do it anyway, and more readers who consume content need to pressure sites like the Atlantic to cut that shit out or they’ll stop clicking. Either go ahead and be the HuffPo, and there’s only room for a few aggregator sewers at the bottom, or earn your eyeballs with distinctive content that you paid for.

3) However we get there, a publication platform that allows reasonably priced, no-hassle micropayments for a la carte purchasing of medium-length reportage and other writing that has minimal DRM could have a huge impact. Similarly, I can imagine something a bit like Kickstarter on a much smaller and more focused scale that would let non-fiction writers raise an advance for long-form work that requires travel or similar expenses in return for copies of the work when it’s produced.

4) Long-form reportage has to be saved from the last “investigative journalists” of the mainstream print and television media. Meaning, the people who have done it well and want to keep doing it well have to very clearly distinguish what they do from the insider-access blowjobbery that has been mistaken for “investigative reporting” ever since Woodward and Bernstein sauntered into the scene. Among other things, this means giving up once and for all the tedious formulation of “objectivity” that dominates mainstream American newspapers while equally rejecting the hack-job partisanship of think-tanks and the Beltway punditocracy. A long-form reporter in the new marketplace has to have a real voice, a distinctive style, a good eye–and thus create something that stands out and is worth the money being asked for it. This is the key thing about all cultural markets that are emerging: the work that does more than enhance the reputation capital of its creator, that is a valuable commodity in its own right, has to be distinctive and better. It has to be, in this context, the opposite of the highly standardized craftwork that was the pride and joy of mainstream print journalism in the latter half of the 20th Century. The long-form reportage that people will buy has to throw out its style guides, its pyramids, its sense of belonging to a guild and a profession. Of course, in an America where nothing else (healthcare or retirement savings, for example) is even remotely friendly to individuals who are selling their own content or services directly, this is not a happy-looking alternative, not yet. Happy or not, it’s the alternative to having to give up work for free to a freshly minted M.A. in online journalism who has been trained to give up work for free. (This might be where unbundling eventually goes for many professions, including higher education…)

Posted in Blogging, Information Technology and Information Literacy, Intellectual Property | 8 Comments

Digital Learning Is Like a Snow Leopard (Real, Beautiful, Rare and Maybe To Be Outdated by a New Operating System)

Maybe it’s just because it’s my obsession of the moment, but the digital camera strikes me as the single greatest example of a new “disruptive” technology that permits a fundamentally new kind of learning experience.

However, precisely because digital photography is the best example, it also defines a limit or border: a great many other discrete skills, tasks and competencies do not have the crucial characteristics of digital photography and are therefore not nearly so open to pedagogical transformation through digitization or online communication.

What gives a digital camera a special kind of capacity for auto-didactic learning? Two things: speed and cost. As in fast and none. Add to that an unusually robust communicative infrastructure for sharing, circulating and viewing photographs, so that aspirant photographers will always have a very large, responsive community of fellow learners available to them.

Each picture taken with a digital camera is an experiment with the medium. The photographer can immediately see in the viewfinder the consequences of every press of the shutter trigger. Change the composition, the focus, and if it’s a DSLR, almost every setting, and see a new outcome. Processing software allows images, particularly those taken in RAW format, to become something radically different in a few seconds of adjustment. There is no cost to each experiment once the initial investment is made, save storage to accommodate images on-camera and in some kind of archive.

The sharing and circulation of images was one of the earliest defining practices of the Internet and is now one of the focal points of very large social media communities. Photographers who share images–and even those who just view–are now able to see exceptionally large flows of cultural work on a daily basis and to interpret the action of algorithmic practices of rating, ranking, curating, and so on–a focus of some recent writing here at this blog.

So here we have a technology whose intrinsic properties and emergent interactions with communities of use and production incidentally make it extremely easy to learn new techniques, approaches and themes. Digital photographers not only can discover through trial and error material facts about light, composition, tonality and subject but also can watch large assemblages of algorithmic culture collectively “discover” preferences and tropes, which also instruct the learning photographer not just about how to shoot but what to shoot. When you see the five hundredth image of a Southeast Asian fisherman throwing a net that is backlit by a setting or rising sun, you can come to understand both what the human eye and the algorithmic culture of the moment “like” about that trope. And maybe, as with the best learning, you can abstract that insight to other compositions and ideas. What thin or diaphanous objects catch light? What other kinds of ‘exoticized’ subjects and scenes draw attention? And so on.

This is not to say that all digital photography is auto-didactic, or that all digital photographers are equally capable of taking up these lessons. But there is an unusual density of online resources available to instruct photographers at key moments in their learning. Take for example David Hobby’s seven-year-old site Strobist.com, with its Lighting 101 course, which I think deserves to be regarded as one of the first fully-realized and implemented “massively open online courses”, well before Thrun and Norvig’s Stanford AI class. The site and the course have had a major impact on digital photography over those seven years, evidence of the degree to which this particular medium has a learning community that scales up to a global expanse.

And of course there are limits here: for one, involving costly equipment and the vested cultures that gently and not-so-gently push at each other around that equipment. A photographer with an iPhone camera and an open-source or cheap processing app will have a harder time learning some of the concepts and techniques that are privileged in some cultures of photographic production and consumption. For another, there are forms of technological literacy that auto-didactic approaches to digital photography assume rather than offer, and moments in learning where the presence of a regular, physically present teacher would provide more efficient, transformative or richer understanding.

Most importantly, the camera and the software and the communities cannot provide the answer to the question, “Why do I take photographs, and what kind of art or product do I want to make?” A technology and a large community of fellow learners and users do not tell you how to think about visuality, and how images speak to audiences and each other. A teacher–or an overall education–might make progress on that front, but the device and its infrastructure will not answer of their own accord.

————

The thing is, much of the rest of what people might want to learn in 2013–for fun, for enlightenment, for immediate application to their careers–does not share either of the key attributes of digital photography: speed and cost.

Take something similarly basic to the production of digital images, and even more widely necessary in the contemporary world both in private life and in work: writing. Writing is not fast and it is not, despite word processors and spell-checkers and a host of other small embedded technologies of production, nearly so intuitive to the wetware of the human brain and body. The metaphor of the eye has all sorts of misleading or deceptive applications to photography but it is a good enough baseline for explaining why most people can take–or view–a picture more intuitively than they can write a letter or an essay.

You cannot write quickly and intuitively enough to experience near-simultaneous iterations of multiple examples of the same instance of writing. And therefore writing has a different kind of cost in labor time, even if it has some of the same frictionless cost of other digital culture. (E.g., the real cost in paper AND time of typing or handwriting multiple drafts is different than the cost of storing many digital files.) You can’t distribute writing to as many online audiences with as close a consensus about what they’re seeking, and the attention that reading takes is in shorter supply. (Hence: TL;DR.)

So just as with photography, questions about “what is writing”, “why write” and so on don’t self-answer, but the medium itself doesn’t even have the automatic affordances of digital photography.

Now try something as inescapably material as carpentry or emergency medicine. Here the costs of the necessary materials are very high, the process of learning them through use is very slow, and the dangers of improper or incomplete practice are extraordinary and multilayered. Or try something where the learning process is inevitably and always social, interactive and/or institutional: counseling, driving, military training. Digital or virtual processes might leaven or enrich educational experience but they can’t even begin to replace the conventional approach.

This is one of several places that the enthusiasm for MOOCs is going to founder in short order: digitization is only revolutionary in its automating possibilities for education in the exception, not the norm, and those exceptions are structured by real, physical limits long before they involve entrenched assumptions. Where digitized approaches to writing or emergency medicine or carpentry compare favorably to existing educational services, that won’t be because of the technology of digitization but because, as Clay Shirky has more or less argued recently, much existing education already sucks as much as digitized education is going to suck. But this is a very different challenge, then, from “digitize it all”: it is “make it suck much less, whatever it is and however it is delivered”.

Posted in Academia, Digital Humanities, Information Technology and Information Literacy | 1 Comment

A Different Diversity

Following on Carl Edgar Blake II’s description of his abilities, let’s go back to the question of whether faculty in higher education ought to have doctorates, whether doctoral study in some form roughly resembling its present structure is the best kind of training for undergraduate-level teachers or academic researchers.

The research part of this question is easy: yes, at least until the nature of scholarship itself changes in some substantial form. (And there’s yet another issue for another day.) The fit between scholarship as practiced in a dissertation (however long it takes to research and write it) and the vast majority of scholarly work across the disciplines is close.

For teaching? For stewardship over a curriculum? Perhaps the fit is not so precise. Going back to Michael Bérubé’s address on the situation of the humanities in academia, there is the issue of how a department (in the humanities or any other subject) might train graduate students to do other things besides be professors when all the people in that department are professors who were trained to be professors. Louis Menand’s recent talk at Swarthmore remarked on the same problem, with Menand concluding that he couldn’t imagine how to advise or teach a graduate student who wanted to apply her degree to a profession outside of academia, even while he conceded that it is increasingly urgent that graduate programs have this kind of flexibility.

This Catch-22 sentiment is a common one. I hear it even at Swarthmore and other small liberal-arts colleges: the coupling of a belief that any course of study can lend itself to any kind of working future (indeed, any manner of living a life) with an admission that professors usually don’t have the training or knowledge to advise students about any specific profession except for academia itself.

———-

Let’s start with that claim before asking what, if anything, needs to change about graduate training. When this point is made defensively (as I think it was in Menand’s talk), it’s troubling. It brutally undercuts one of the most common claims about the “liberal arts” as an ideal: that they enable students to learn how to learn, to become active agents in interpreting and imagining the world, to acquire knowledge as needed and wanted. If in fact this is true, how can it be that the faculty who teach within a liberal arts approach are incapable of enacting the supposed virtues of that course of study? Such that they cannot be expected to understand professions outside of academia or help a student see connections between what they have studied in college and their future aspirations or goals?

There is a legitimate point that we can make more carefully. I cannot really advise students about careers in museums, development organizations, non-profit community groups, carpentry or graphic design (for a few examples) if the advice a student is seeking from me is about the specific conditions of employment and specific ‘insider culture’ of those professions unless I just happen to have studied them in my own scholarship or have had past professional involvement with them. (In the latter case, that’s probably only useful if I’m a relatively junior faculty member.) The only job market where I have valuable insider knowledge is the academic job market.

But that shouldn’t be an excuse to shoo away a potential advisee. Because I do have two things I can help the student with. First, I should be able to help a student see how their own studies give them potential insight into the kind of work (or in other cases, the sort of living) done in any given field. If I can’t help a student see how their work in a history major can give them useful ideas about how to approach museum exhibition or advertising or law enforcement, I’m not much of a teacher, nor am I living up to the typical argument for a “liberal arts” approach to education. Second, if I can’t sit down with my student and learn together some of what there is to know out there about the “insider culture” of a given profession, find some contacts (maybe alumni, maybe faculty, maybe staff, maybe none of the above) and thus give the student a much clearer and more focused agenda when they do find themselves talking to a career advisor, I’m also not doing a great job as an advisor and teacher. I should be able to show students how to learn and understand what they want to learn and understand whether or not it’s my own area of specialized knowledge, because that’s what we claim our students are learning how to do.

It’s just that it is easier to do for an area where I have more knowledge and experience, and the texture and detail of my advice in those cases will be richer and deeper. When we’re busy, we naturally emphasize trying to match any questions to the person best qualified to answer them. If I’m talking to a student who wants to be a civil engineer, it’s inefficient for me to spend a lot of time acquiring the spot knowledge to help them get to the next step of that goal when there are a bunch of engineers on the other side of the garden at the back of my building. But if I’m talking with a history major who is interested in careers in design and technology and wants to know how history might help inform that future, I should have a bunch of ideas and suggestions readily at hand. It’s only when we get down to brass tacks like, “So what kinds of previous experience or graduate education do entry-level employees in product design typically have?” that I need to say, “Ok, I’m not the best person to ask.”

————-

The question then becomes, does academia need more people who are the best people to ask about a wider range of life experiences and careers? Large research universities with professional schools often do have a bigger range of other kinds of training and experience within their faculty, quite intentionally so in many cases. Small liberal-arts colleges usually don’t: the primary training and work experience of most faculty is academic from beginning to end. When a faculty member has spent time doing other work prior to commencing doctoral study, that often doesn’t figure as much as it could in how the community knows that person and how that person produces knowledge and interpretation within the community.

Not long before he died, my father asked me if it was possible for a successful lawyer with long experience like him to spend the last five or ten years of his working life teaching. I said that it might be that some law schools would be interested, but probably not if he did not already have a connection to them and not if he hadn’t done some form of legal scholarship in his field of expertise. I also thought that there were community colleges that might be interested, and in fact, he had taught a few courses in that setting already. I think he would have been a great teacher in almost any setting: I could easily see him teaching a course on law or labor relations in a college like Swarthmore.

So why don’t we recruit someone like that more often to teach? There are some practical barriers. One-off courses taught by outsiders tend to dangle from the edge of an undergraduate curriculum, poorly integrated into the larger course of study. You can’t plan around taking such a course if you’re a student or directing students to such a course if you’re an advisor. Increasing the supply of such courses more steadily is a short road to adjunctification, which is especially corrosive to small teaching-centered residential colleges.

And if we had longer-term contracts aimed at recruiting this kind of “experiential diversity” in a faculty, how would we know what the content of a candidate’s experience and thinking amounted to? How would we be able to assess who could teach well in a typical liberal-arts environment? You wouldn’t be hiring someone to be a pre-professional trainer: you’d be looking instead for someone who could teach about the ideas, the problems, the open questions, in a broad domain of practice and knowledge. Hiring someone like my dad to teach “NLRB Regulations I” at a place like Swarthmore would be totally out of place with everything else the institution is doing. But a course like “An Insider’s Look at the Culture of Legal Practice in American Society, 1960-1995” might fit in perfectly. While I think he was a natural teacher, I don’t think he could have walked in off the street to teach a course like that any more than I could have walked into Swarthmore at the start of my first year of graduate school and taught a survey in African history.

If you set out to consciously diversify the range of experiences and training present in a typical liberal-arts faculty, you’d really have to be looking for and having an active preference for people like Toby Miller: compatible with and knowledgeable about the internal cultures and practices of academia, trained in some fashion close to the normal course of study, but with a much more wide-ranging set of previous experiences and a conscious dedication to using those experiences to provide a different angle or interpretation of “the liberal arts”.

Miller recounts his working history: “radio DJ, newsreader, sports reporter, popular-culture commentator, speech-writer, cleaner, merchant banker, security guard, storeman-packer, ditch digger, waiter, forester, bureaucrat, magazine and newspaper columnist, blogger, podcaster, journal editor, youth volunteer, research assistant, suicide counsellor, corporate consultant, social-services trainer, TV presenter and secretary”. If a candidate showed up in a job search for Swarthmore with that resume, I don’t think we’d actively discriminate against him if he had his i’s dotted and t’s crossed in a ‘normal’ form of graduate training. But neither would we see any of that past history as a qualifying asset likely to make the candidate a usefully different kind of teacher or advisor.

I’m as guilty of this perspective as anyone. Tenure-track hires are weighty decisions that can have consequences for thirty or forty years–and therefore tend to produce risk-aversion in even the most flighty or idiosyncratic person. Someone who has an innovative, edgy research project or teaching style but whose graduate training is otherwise familiar seems about as much of a risk as most of us want to take: hiring someone whose professional identity is as much vested in what they did before or outside of academia is often too unnerving unless the discipline in question has a particular preference or tolerance for certain kinds of outside-of-academia work (say, as in the case of economics). Considering that Toby Miller’s idiosyncratic path is partially what informs his sharp critique of the institutionalization of the humanities in American academia, it might be that legitimate worries (“can this person teach? can they do well those things that we’re confident the institution should be doing?”) can’t easily get away from fears about what an outsider sensibility can do to an insider’s lifeworld.

I don’t underestimate the practical problems. I criticized Menand for saying that he can’t imagine how to advise anyone but an aspirant professor about their career choices, but here I’ll have to cop a similar plea. I can’t easily imagine in actual practice how we’d go about having a few Toby Millers by deliberate design rather than happy accident. But I can imagine that students, faculty and staff would benefit a lot if we could dream up a way to accomplish that objective.

Posted in Academia, Defining "Liberal Arts", Generalist's Work, Swarthmore | 2 Comments