Students should feel free to experiment with ideas, and to get them wrong. It’s ok to throw out an interpretation that is flawed, an argument that’s an experiment. An undergraduate shouldn’t feel that there’s a mob armed with rotten fruit waiting for the first slip. I see this a lot at Swarthmore: students who just don’t want to be wrong, and so never risk much. I see it in academic writing, too. We qualify what we say so heavily because we want a fall-back position in case we draw heavy fire.

At the same time, part of being able to persuade is being persuadable. Part of making mistakes usefully is knowing and saying when you’ve made a mistake. Part of experimenting is knowing that sometimes it’s going to blow up. Then you have to clean up your mess, humbly and patiently.

In all the talk about Yale student Aliza Shvartz’s art project, the detail that consistently sticks with me is not the project itself, or what that project says about abortion rights or pedagogy or art. It’s the aggressively entitled tone of her defense of her work, the intellectual entitlement behind it. Shvartz wrote,

It creates an ambiguity that isolates the locus of ontology to an act of readership. An intentional ambiguity pervades both the act and the objects I produced in relation to it. The performance exists only as I chose to represent it. For me, the most poignant aspect of this representation — the part most meaningful in terms of its political agenda (and, incidentally, the aspect that has not been discussed thus far) — is the impossibility of accurately identifying the resulting blood. Because the miscarriages coincide with the expected date of menstruation (the 28th day of my cycle), it remains ambiguous whether there was ever a fertilized ovum or not. The reality of the pregnancy, both for myself and for the audience, is a matter of reading.

This ambivalence makes obvious how the act of identification or naming — the act of ascribing a word to something physical — is at its heart an ideological act, an act that literally has the power to construct bodies. In a sense, the act of conception occurs when the viewer assigns the term “miscarriage” or “period” to that blood

This is what I call the porcupine strategy. Make yourself as pointy, sharp and inflated as you can, and hope that any predators will just go away. The problem with this particular porcupine act is that it’s not fooling anyone. Scholars who know something about the theories Shvartz is fumbling to deploy know full well that she’s said very little that makes sense in this passage, that it’s close to being a random assemblage of words. Observers who don’t know anything about those theories just see it as babble.

I hesitate to discuss another case any further, because I think the principal actor is an unfortunate figure. But reading an interview with the former Dartmouth writing instructor who threatened to sue her own students, and a long profile of her that makes it clear that litigiousness is a standard strategy for her, it’s hard to leave the case completely alone. Partly because these are the cases that end up framing public awareness of academia, like it or not. Partly, however, because the professor in this case also used the porcupine strategy at many points. In fact, she turns it into an explicit pedagogical credo: one of the things that upset her was that the students questioned her authority, not just as the manager of the classroom but as an expert in her fields of specialization.

At another point, she tries to offer a definition of postmodernism. Look, I grant you that this is a very difficult thing to do. When I was a senior in college, I had an oral exam that was part of the honors program at my university. The basic set-up was that the panel of faculty could ask you questions about anything–all knowledge. So I prepared in the areas where I thought I was weak, and didn’t think much about the humanities. At some point, a line of questioning about historical methodology led to postmodernism, and I was asked point-blank: what is postmodernism? Total meltdown on my part.

These days, I’m a little more prepared when a student asks me that question. But part of my preparation is to leaven my answer with a little humility about my own knowledge, but also about postmodernism as a concept. The Dartmouth professor, in contrast, seems so self-absorbed, so humorless, and, most importantly, doesn’t really make much sense once you attend to the actual definition or description of postmodernism she offers:

Postmodernism has different definitions, but I’m going to give you the definition according to the guy that invented the term—and he’s Jean-François Lyotard. He wrote a book called The Postmodern Condition, which was published in 1984 in America. The book basically outlines what is called the state of knowledge in post-industrial societies, that because of the influx of computer knowledge, information society, that we are going to have a change in what is known as expert knowledge versus lay knowledge. And I’m sure this will resonate with you because when you go to the computer, you access the Internet and you can get all this information.

Prior to the computer industry or information technology, this was not possible. There was a strict division between expert knowledge and lay knowledge. Expert knowledge of course would be defined as science; science was, according to positivism, the way by which we arrive at knowledge, a truth by the scientific method. Postmodernism was a challenge to that. It challenged the fact that science was the only way of arriving at truth. It was saying that we would have a leveling of the playing field in knowledge. The second thing that it’s about is art, which in the period of modernism and literature—when you go back to [Emile] Zola or the modernist authors—for them, for them art was about the misting of reality. And art should follow the scientific method—that literature and art should follow the tenets of science. According to Lyotard, in the postmodern society, art and literature were going to be in something of a dichotomous relationship with science. In other words, art and literature were going to be now put on the same level as science.

There’s another element to postmodernism prior to the information society in philosophy. The philosophy was about going after knowledge for knowledge’s sake, so you had people just talking about philology, biology, economics, just for the sake of knowledge. But for Lyotard, knowledge would be about efficiency; it would be about doing things better. Knowledge would be not for the sake of knowledge, but for the sake of productivity and technical efficiency. So that’s what postmodernism is about; it has nothing to do with the overthrowing of capitalism. It has nothing to do with it; in fact, postmodernism appropriated many of the tenets of capitalism in what it was talking about. It was not considered a liberal or leftist way of looking at life, although many postmodernists have been thought of as being left-wing or liberal. It was not in any way like that—I just wanted to qualify that.

This is again a porcupine approach: back off, I know postmodernism and you do not, I am an authority and you are not, don’t question me. Again, it doesn’t work because those of us who know the texts and arguments the professor is using know that this is a marginally coherent explanation of postmodernism, and those who don’t won’t be able to make any sense out of this explanation but will not be particularly intimidated or impressed by it.

In both cases (and many more like them) the problem with the porcupine approach is that it is pursued at the moment that academic work has become visible to a wider public. It’s one thing if a scholar in one discipline gets irritated in an intramural debate with a scholar in another discipline and wants to end the discussion by asserting authority through obscurantism. It’s another thing if a scholar or student is trying to defend the integrity of their scholarship or teaching against skeptical outsiders.

There’s been a bit of talk about the privileged attitudes of some Ivy League students recently. When I see someone pull the porcupine strategy, that’s when I see privilege asserting itself. It’s both more ethical and more prudent to be pre-emptively open to criticism and dialogue at such moments. Aliza Shvartz’s project was always going to draw heavy fire, but I think dangling half-formed chunks of critical theory like a sacred totem about her neck lost her any last shreds of legitimacy. When you have something to defend, you’d better figure out precisely what it is that you need to defend, where the heart of the matter is, and you’d better speak to that as clearly and openly as you can.


16 Responses to Porcupine

  1. Jmayhew says:

    Lyotard didn’t invent the term postmodernism! That in itself is a dead giveaway that she doesn’t know what she’s talking about.

  2. hestal says:

    I wish I could explain it, but this article hits home to me. It really hits hard.

    I am an old man, and most of my life has been spent in thought. I became a mathematician, but didn’t like pure math. I wanted to produce some result that actually contributed to the common good – mainly because I was taught that people are supposed to do that.

    Just as I was leaving college computers came along and I spent the next 30 years or so designing complex universes in my imagination that, when finally run through the computer, produced a few print images that actually did a lot of good for a lot of people. I am proud to say that some of the systems I developed nearly fifty years ago are still in use and still affect, for the better, the daily lives of millions of Americans. I honestly believe that one of these systems has done more good for American society than anything else since WWII – well, maybe the polio vaccine was more important.

    And this was not hard. All it required was the right equipment and not very much money. And it required a willingness to rethink the world. It required plain words and clear thought, along with complete dedication to the integrity of the result. If a system produces a result, it must be correct and on time. These systems brought useful order into the disorder and confusion that are too often a natural part of life. Without some of these systems, life would have been very different, and very hard for the last half century.

    Since I retired, in 1995, I have tried to apply this approach to redefining more than one public system, and I always strike a stump. Porcupine defenses are everywhere. I have been attacked personally, even threatened, by people who immediately dismiss my proposals and ideas by naming them with a term that makes no sense, but which either changes the subject or shuts down discussion altogether. It has happened to me more than once on this blog.

    And I have seen this, and sometimes had it directed at me, in the academic world. And I am willing to concede that much of what goes on in that world contributes to progress, but I assert that this incredible drivel of clashing definitions impedes progress to a great degree.

    I long to see progress accelerated, and I know, I positively know, it can be, but not without the participation of those who educate our children. So I wonder what the hell is the goal of the academic world: accumulating knowledge, or applying knowledge to the world? Unfortunately I think it is the former. And since every child cannot enter the academic world for which it is being prepared, then progress is slowed. Our children should be prepared to apply knowledge to the world, and I don’t see it happening. I have grandchildren, and many aspects of their schooling worry me. I hired hundreds, at least, of products of our education system from dropouts to PhD’s, and I felt that they were not up to the tasks the world presented.

    So the porcupine defense is not solely part of the academic world, it is part of the world of people who deal in imagined worlds that do not produce a result of concrete value. These people have nowhere to turn when presented with a new idea or a challenge. They have no connection with the real world. You can’t have battles of run-on definitions of terms if you are driving toward a real answer.

    As my Uncle Earl, a butcher and a high school dropout, used to say, “If you can’t butcher a hog, what good are you?” He would always smile, but I knew he meant it. After a while I even got it.

    So I went to my 1987 Britannica and found that “postimpressionism” was listed, but “postmodernism” was not. I googled “postmodernism” and was confronted with many definitions that agreed on little and said little. One of them was quite long and applied the term to many aspects of our cultural life, and the author said at the end that she got her doctorate from Stanford University in the field. I guess she knew what she was talking about; at least I think she knew where she got her degree.

  3. Postmodernism is mostly porcupine, in my experience: the absurd essentialism of the definitions of other systems of thought, the assumption that people are one-dimensional thinkers unless they’re postmodernists themselves, the rejection of anything like communicative fluency….. it’s not accidental, and for all the clunkiness of the definitions offered, I find them entirely consistent with the spirit of postmodernism.

    It’s a particularly toxic set of ideas for “misunderstood artist” types who’ve inherited the modernist tradition of rule-breaking-as-artistic-success. Compound that with youth….

    In the orthodox Jewish tradition, one does not study mysticism until one is married (to provide balance) and at least thirty years old (to provide perspective and maturity). I think we may need to institute similar rules for post-structuralism.

  4. Carl says:

    I think this is right on, and the comments too; I even agree with most of what Hestal says, without entirely sharing the zeal for ‘progress’. At least part of postmodernism is to wonder at the costs of progress; but as he says, he’s helped people and that’s not a small thing.

    In my experience the ‘porcupine’ doesn’t get trotted out by really privileged people. It’s a stigma-management strategy, in Goffman’s sense, by people who feel one-down. There’s a lot of that around the academy, which dissonantly retains its elite mythos even as it is increasingly dominated by minorities and children of the petty bourgeoisie.

  5. peter55 says:

    Hestal: That you googled the term “postmodernism” and — lo! and behold! — found lots of different, conflicting definitions for it has to be one of the funniest statements I have read in some time!

  6. hestal says:

    Peter55: you are welcome. I am always glad to spread a little happiness wherever I go. Especially to those who need it so.

  7. hestal says:

    Carl, I wonder why you put “progress” in quotes. Is it not a real word? Am I using it incorrectly? Do you mean to imply that there is more to it than I know? Do you think that I do not understand the concept of “costs?” Do your comments mean to say that “progress” is a sham; that on balance it “costs” more than it gains and therefore is a net loss? What precisely do “postmodernists” “wonder” about the “costs” of “progress?” Guys like me wonder if guys like you have a point, ever.

  8. Carl says:

    OK, Hestal, I can see the game here already and I’ll only play it as long as there seems to be some point to it. I’m not spoiling for the nth version of this fight.

    Yes, my friend, guys like me have a point, always. The question is if guys like you are willing or able to get it, ever. I am cautiously hopeful, so here goes.

    The ‘scare quotes’ on progress are part of a way of looking at things in what’s generically called discourse analysis where we don’t assume that the word simply means what it’s commonly taken to mean. Words are tools people accomplish things with. Sometimes that purpose is clear communication; often it’s more than that, an attempt to control how the world is interpreted. So saying we’ve had progress is a way of saying the world is getting better. It’s possible to disagree about that.

    Here are two of many ways this is so. Computers require chips that are made largely of silicon. It turns out that fabricating silicon in a computery way requires the use and then disposal of solvents that are highly toxic when encountered directly, and subtly carcinogenic when encountered indirectly. Any sort of computer progress has this cost associated with it. Once Silicon Valley figured this out, most of the chip production was moved to foreign lands where publics had less power to protest. Now they are paying our costs for computer-related progress – while also enjoying incomes that would not otherwise have been available. So that looks like not pure progress, but a tradeoff in which some people disproportionately win and other people disproportionately lose.

    Here’s another relevant example. Health care and food production have clearly progressed in the modern world and lots of people live longer, healthier lives as a result. Sweet. But complicating this vision of progress is the problem that out of the 6-7 billion living humans who are being kept alive in this new way, several billion live in conditions of horrifying squalor. They’re alive in an entirely new, and really quite appalling way. In a sense many people would call scientific and objective, more people are miserable right now than at any prior time in human history. Progress.

    Furthermore, the people who fully enjoy this progress don’t seem to be as content as people in traditional societies were. In fact, they’re miserable too and gobble antidepressants and other compensating drugs at a fabulous rate. Perhaps it’s because they live in consumer societies in which any sense of meaning is quickly commoditized and reduced to a cash transaction.

    Pal, I like my happy little life, the product of exactly the kind of progress you seem to have in mind. I am grateful. I am grateful to the people who risk cancer to make the wonders of the internet available to me, and I am grateful to the billions whose abject poverty keeps labor prices low and makes my abundant food affordable. I am grateful for the many spiffy things I do not need that I can buy on sale. I am not convinced that keeping me in this luxury counts as progress. I am not, as you have already detected, on board with the Ayn Rand project.

    I feel that I’ve fully answered your questions in the spirit you asked them and given you food for a moment’s thought. It’s not that you have nothing to teach me, hestal. I have already learned, considered, and integrated your perspective into mine. If you had done the same with my perspective you would not have asked the questions you did. I feel I know you so well that I’d like to offer you Erik Erikson’s _Identity: Youth and Crisis_ as a gift. The stage you seem to be in is ego integrity vs. despair. I compliment you on a life of accomplishment and wish you luck in fully accepting your own value.

    If I’ve gotten you wrong, notice I only had a few more paragraphs to work with than you did when you jumped to conclusions about me.

  9. Jerry White says:

    Re: “He wrote a book called The Postmodern Condition, which was published in 1984 in America.” La condition postmoderne was published in 1979 in France. And anyway, it was originally commissioned by the government of Quebec, by their university directorate if memory serves (I’m at home so don’t have the text in front of me). And while I would hesitate to identify who really used the term first, as a general rule Charles Olson really does deserve a mention, esp. in an American context.

    Anyway, just repeating what Tim said, I suppose, that this is a marginally OK definition at best, and I say this as a guy who is really not all that invested in the concept in the first place…….

  10. fridaykr says:

    In a previous life I trafficked in the postmodern theories of Jameson, Lacan, Derrida, and Zizek, and even published in Postmodern Culture. I found that portions of the turgid work of these writers provided me with useful concepts that allowed me to ask productive questions in my work, which I enjoyed for the sake of its own playful solipsism. But I also recognized that the deployment of these heuristics was very much a language game, one with its own tacit rules and investments in specific institutions and professional hierarchies. And I never thought I was saying anything terribly important or profound.

    Aliza Shvartz’s art project and her own articulation of this project showcase the very worst tendencies in theoretically saturated work in the humanities. Although she claims she wants her project to elicit “discourse” or a conversation, as the many posts on this site have observed or implied, she seems less interested in a conversation and more interested in arrogating the terms on which her project will be discussed.

    In effect, Shvartz reveals that the deployment of jargon most often is done to attain status — it’s not what is said that matters, it’s how you say it, and in what context. The whole episode is ripe for parody, if it isn’t one already.

  11. Western Dave says:

    Am I the only person besides David Harvey who uses postmodern to describe the world after 1973? Postmodern is as inherently useful as Jacksonian or medieval. It shares the same limitations as those terms. The Postmodern era is marked by a shift to flexible accumulation from Fordist production, the globalization of capital (not just commodities), the rise of two-way instantaneous global communication and the culmination of time-space compression (the fact that you can get anything to anywhere relatively quickly) that began with the rise of the railroads. It probably precedes an age that will be named after whatever we use to replace fossil fuels.

  12. Bill McNeill says:

    For my money, the Shvartz quote you cite is only a middling example of the porcupine strategy. Even without the benefit of training in lit-crit jargon I can tell what she’s trying to say and come up with a better way to say it. So the quills aren’t particularly threatening here, but no matter, your point is well taken, and I’ve certainly seen this strategy deployed plenty of times in my own corner of academia.

    It’s frustrating to watch this rhetorical trick get deployed, but it also makes you more appreciative when someone who knows what they’re talking about goes out of their way to be humble. The best example of this I ever encountered was in the Feynman Lectures on Physics, a collection of undergraduate physics lectures by Richard Feynman that’s become a pedagogical classic. In particular, in these books Feynman is praised for the way he manages to illustrate mathematically dense accounts of scientific phenomena with clarifying examples that appeal to students’ basic physical intuitions. There’s a point in Volume III, however, during the exposition of a particularly difficult bit of quantum physics when Feynman’s talent for clever illustrations fails him and all he is able to do is grind through the formal exposition. At this point, his tone becomes apologetic and he says essentially, “Now I’m going to write a bunch of equations on the board and even though I’m convinced they’re correct, the fact that I can’t give you a wholly satisfying explanation why they’re correct is a sign that I don’t fully understand what’s going on here.” Sorry, I can’t find my copy, so I can’t give you the exact quote, but this for me has always been the gold standard of both intellectual humility and the proper attitude of skepticism towards the most recondite and offputting aspects of one’s own field’s jargon.

    Call it the anti-porcupine strategy–I dunno, the tribble.

  13. Bill McNeill says:

    For example, Shvartz says “the act of identification or naming…literally has the power to construct bodies”, to which I can only reply, literally literally? She can’t possibly mean that actual human zygotes blipped into existence because some right-to-lifers were taken in by her stunt. Not only is that idiotic, it goes against the relativistic thrust of her first two sentences. The passage quoted above isn’t so much an example of intimidating obscurantism but the kind of half-baked argumentation intimidating obscurantism is intended to cover up.

    Still, Shvartz deserves some praise. Artists of her ilk are supposed to be shockmeisters, and that makes it hard to have an impact. You can’t just slop some bodily fluids onto a bit of religious imagery and call it a day because ho-hum we’ve seen it all before. The fact that Shvartz was able to push buttons hard enough to break into the national press demonstrates a certain mastery of her genre. She’s a worthy heir to P.T. Barnum whether she knows it or not.

  14. Bob Rehak says:

    Wow, Tim, thanks for putting me on to the Priya Venkatesan story. Reads like a transcript of the bad dreams I’ve had since entering higher ed. Before grad school, I would dream about the usual anxious stuff — showing up unprepared for an exam, or even more hyperbolically, learning at the end of the semester that there was a class I signed up for and never attended. Nowadays, as a faculty member, the dreams are about unruly and disrespectful students. It’s always such a relief to wake up and remember that my students are actually quite wonderful; sounds, though, like Venkatesan’s stuck in the nightmare, perhaps one that she herself catalyzed.

  15. Timothy Burke says:

    Yes. I think everybody’s had a class where everything is not working out quite right, and where there’s a weird antagonism developing. And it’s not easy to shrug it off, I agree. But this is really where the best path out is not to increase the psychic and real-world stakes.

  16. Josh in Philly says:

    Western Dave, I always think of postmodern art/literature, postmodern theory/philosophy, and the postmodern era as slightly different things, although the use of the adjective is itself a statement that they’re connected. For the first, Leslie Fiedler and Ihab Hassan were among the early namers of the category; for the second, Lyotard is indeed a pioneer; for the third, Harvey and Jameson are responsible for most of our understanding of it, although Daniel Bell has his place and the first person to claim that we were in a postmodern era may have been Macaulay.
