On Not Going Back to School

I think I’ll toss in a bit for Kio Stark’s Kickstarter project Don’t Go Back to School, which aims to be a how-to guide for independent learning.

What Stark is planning to argue (and enable) connects to one of the thoughts behind my own warnings about graduate school, namely, that I do not want prospective students to think that an MA or Ph.D. (or a J.D., etc.) is primarily about learning how to do something, or an extension of the spirit of a liberal arts education. It can be, but usually it’s not. (Exhibit A for the prosecution: the recent article in the New York Times that pointed out that most U.S. law schools don’t really teach their students how to be lawyers, unless the kind of law they’re going to practice is some weird, rarified domain where scholarly approaches to law have some unusual weight.) Graduate school is primarily about credentialling for particular professional objectives. That’s not particularly wholesome, but that’s the way it is for now. If the goal is to pick up a new bit of concrete knowledge or skill, there are other and better ways to do it. If the goal is to extend a lifelong engagement with knowledge and critical thinking, graduate school will generally get in the way.

That said, a couple of cautionary thoughts about the project. First, while it’s possible that someone could self-train to understand and interpret neuroscience (for one example), there really are quite a large number of expert domains where understanding and practicing are different matters. An autodidact reader of neuroscience could learn to interpret and evaluate research, teach or write about the field, and imagine or advocate new directions for study or experiment, but it’s still pretty reasonable to have a bright, sharp fence up around “do neuroscientific experimentation on living subjects” and “conduct neurological interventions, surgical or otherwise, on living subjects”. I think it’s very true even there that existing researchers and doctors learn most of what they learn through experience rather than in formal classroom settings, but this is one of many cases where requiring certification of expertise, and limiting that certification to appropriate institutions, is the only way to hope for some kind of baseline minimum qualification before we collectively permit someone to engage in practices that have very high potential for harming people. Maybe you lose the occasional autodidactic genius who would come up with a completely new medical or research technique that way, but I think you also lose a lot of Dr. Frankensteins and quacks. Pope Brock’s Charlatan, a history of John R. Brinkley, the quack doctor who built a thriving practice on surgically inserting goat testicles into the scrotums of American men looking to revive their sexual potency, is a pretty good reminder of why American society increasingly embraced formal education and certification as a requirement for some kinds of expert practice.

Second, I completely believe that you can learn techniques of autodidacticism from people like Cory Doctorow and Quinn Norton, that at least some of how they learn new things is reproducible. As a self-identified generalist, I feel I can show other people how I do what I do in a way that’s partially reproducible. At the same time, just as I know that I hit some pretty firm cognitive limits in certain domains of intellectual practice, I do feel that there are some people who are just not going to be able to be autodidacts no matter how clear and reproducible the instructions on the box might be. Some people don’t think that way, some people weren’t brought up that way, and some people have adapted so strongly to the structure of formal education that it would do them more harm than good to try to do without it. It’s Stark’s project, but my meddling-kids advice would be that the most irritating thing about a how-to project might be when it implies that its advice has a potentially global or universal scope. Even with projects, ideas and approaches that I like, I’m finding that I’m very unsatisfied if there isn’t serious attention given to shortcomings, failures and limit conditions. It’s good to interview people who are successful self-learners, but there have got to be some casualties out there too, whether it’s people who tried to learn how to operate a table saw on their own and cut their thumbs off, or people who have dedicated themselves to the independent mastery of calculus via a dozen routes and eventually had to surrender.

This entry was posted in Academia.

13 Responses to On Not Going Back to School

  1. Kio Stark says:

    Thanks so much for your thoughts on this. I do agree that there are lots of situations in which grad school is a good and/or necessary thing, and I don’t advocate abolishing it. Just trying to open people’s minds about independent learning and do what I can to enable it. The book will actually include some voices about what’s good about grad school, and I really like the idea of including a failure counterexample. I’m going to go hunting for someone with that kind of story.

  2. Cara says:

    It strikes me, too, about grad school, that aside from credentialing it also gives you access to people with whom you can discuss the things you’re learning. One of the risks of autodidacticism is becoming kind of a crank — it seems vital to me not to spend *all* your days in your basement reading about history by yourself, but to discuss it with other historians as well, especially ones who are more broadly acquainted with the field than you are.

    I now do hope to be an academic, God help me, but I initially went back to grad school only out of a vital interest in meeting and having conversations with people interested in what I was interested in — I switched fields between undergrad and grad, and when I tried to self-teach, I found myself at sea. I could read and understand books, but I didn’t understand how they fit into the larger discussion. What grad school gave me was access to people who could help me build a map into which to place my knowledge.

  3. ikl says:

    The NYTimes article wasn’t evidence of anything except the incompetence of its author. Although the article quoted a number of people who had, to varying degrees, thoughtful and interesting things to say, it made errors that suggest a lack of care bordering on bad faith. For example, it cited a philosophy article, written by a philosopher, published in a philosophy journal as an example of the arcane nature and practical irrelevance of legal scholarship (of course, a suitable law review article could easily have been found – the point is that the author didn’t care about the facts enough to make even a trivial effort to get them right). It also complained that Criminal Law courses don’t cover plea bargaining (unmentioned was that this is generally part of Criminal Procedure, which is a distinct course). There were various other oddities as well – even when the article made reasonable points, the examples used to support them were often dubious.

    I’m not really sure what you mean by “weird rarified domain where scholarly approaches to the law have unusual weight”. In my admittedly brief experience, law firm partners care a lot about hiring associates with good legal judgment. This is something that law schools may or may not be especially good at teaching, but it is, for the most part (along with a basic outline of major areas of legal doctrine), what they are actually trying to teach (through issue spotter exams and the like). Law firms, for obvious reasons, have an edge in teaching graduates “how to be a lawyer” in the sense of how to practice law – although the economics of doing so have gotten more difficult of late.

    In my particular area of practice, tax, one was expected to have gotten the basics of tax law in law school, although the firm seemed to be open-minded about how much you needed to have studied as long as you could learn quickly. Good legal judgment and business judgment, however, were not negotiable. Associates who can’t be trusted to make independent decisions are useless – even if the partner ultimately makes the hard decisions, the associate has to recognize what is important and what’s not, because often if the associate does not recognize it, the partner will never see it. I’m not sure how representative this is across practice areas. But what I am more confident in is that legal judgment and analytical ability are more important to high-level practice than knowledge of where to file merger documents and the like (an example from the NYT article).

  4. Timothy Burke says:

    Thanks for that, ikl. The article seemed plausible to me, but it’s useful to know that there’s a very strong argument against the picture it painted. And of course some of the debate surrounding a portrait like that tracks against “liberal arts” vs. “vocational” divides in general, and I’m squarely on the side of believing that a “liberal arts” approach is in the end a better way to learn how to do something concrete – so in this sense, I don’t think that just rehearsing the nitty-gritty, procedural, applied work of lawyering is what law schools ought to be doing. You can’t learn how to make a good argument (either in a brief or in a courtroom) as a mechanical exercise that involves following three or four reproducible steps.

  5. ikl says:

    There are a number of serious issues with the current model of legal education, but the article was basically a hatchet job written by somebody too ignorant of the basic landscape to distinguish serious from spurious critiques.

  6. lemmy caution says:

    “For example, it cited a philosophy article, written by a philosopher, published in a philosophy journal as an example of the arcane nature and practical irrelevance of legal scholarship (of course, a suitable law review article could easily have been found – the point is that the author didn’t care about the facts enough to make even a trivial effort to get them right).”

    The philosopher was looking for a job as a law school professor, though. The application of Derek Parfit’s utilitarian ethics to law is interesting but not anything a lawyer really needs to know.

    There is a belief system at elite law schools that values the abstract over the practical. It is hard to weed out, since moving toward the practical risks reducing the status of the law school. The effects of good grades from an elite law school sure are practical enough.

  7. ikl says:

    No, the philosopher who wrote the article teaches at the University of Vermont. Google it. The paper is available online. The author was not the job candidate interviewed for the article, who does apparently write about the non-identity problem – the example would have made somewhat more sense if he had been.

  8. Jay Scott says:

    I went to grad school because nobody around me was interested in what I was interested in. I thought I could learn faster and better and more if I immersed myself in a world where everybody had similar interests. And I was right–I had misadventures along the way, but my basic goal paid off.

    Now I work for a university, and as a perk I can take classes in almost anything. But I don’t, because I believe I can learn more efficiently by myself.

    I think that for a well-rounded autodidact, school should be another tool in the toolbox. Consider learning a foreign language: You can learn to read and write by yourself, and you can find partners on the Internet and learn to speak and understand. But empirically, most people who try to learn a language by themselves fail, either through a dropoff in motivation or through lack of understanding of how to learn. Some can succeed, but for most people I recommend finding a good class, partly because it will help keep your motivation up.

  9. Vielle says:

    Hm. A how-to manual for autodidacts… Am I the only one who sees the irony?

    I wanted to comment on a slippery little passage in the original post, though: “Graduate school is primarily about credentialling for particular professional objectives…If the goal is to pick up a new bit of concrete knowledge or skill, there are other and better ways to do it. If the goal is to extend a lifelong engagement with knowledge and critical thinking, graduate school will generally get in the way.”

    (1) Credentialling is related to professionalization; it usually serves to restrict access to a profession, for reasons having more to do with status and cultural clout than safety. (Historically, excluding women has been part of the project, too; the AMA arose in part as a reaction to the growing number of midwives taking care of pregnancy and childbirth and thus eating into physicians’ potential client base.) Although you’re right that safety can be used to justify professionalization, the latter does not guarantee the former: just think of all the unhealthy trends that have been promulgated by MDs; the high rate of iatrogenic illness; the startling rates of morbidity and mortality from prescription drugs (which are, in essence, credentialled or authorized substances). In any case, grad schools produce far more Ph.D.s, especially in the humanities, than universities can produce faculty positions. No need to restrict access through credentials here.

    (2) Does anyone go to graduate (as opposed to professional) school to learn “a concrete knowledge or skill”? This sounds to me more like learning a craft. But grad school should be a great place in which to get intensive training in how to think and write, how to devise good research questions, how to judge sources, etc., in addition to developing relationships with colleagues. It’s much harder after the student phase to find people who will pay close attention to your writing/thinking.

    (3) Academia has its problems, but to say that graduate school “gets in the way of” a lifelong engagement with learning and critical thinking is going a tad too far. Only a small minority of people go to grad school; most these days go into the corporate world. Business is certainly the dominant culture—think how often we hear Americans referred to as “consumers” or “workers” instead of citizens—but it is not exactly known for self-criticism and ongoing humility before the vast realm of the unknown. Indeed, practical capitalist types poke fun at the caution that characterizes academic discourse. The fact that completing undergraduate and professional (MD, JD, DDS, and MBA) programs tends to correlate with conservative voting, even in this wacky decade, indicates either that there is a self-selection process for the students who choose those paths (and that most of them are not challenged in their beliefs), or that the students go in without a particular predilection but find conservative ideology more palatable after their training. Graduate school may not achieve enough in getting people to think critically, but it sure beats the alternative.

  10. Trevor Owens says:

    The potential for the web to connect us with people and let us teach ourselves is really exciting, and I wholeheartedly agree with your way of construing higher education primarily as a credentialing mechanism.

    However, I find it difficult to square this with the role that a lot of these conversations end up playing in bigger arguments about supporting education as a public good. Aside from the merits of the teach-yourself model, I think there is a bigger reason why groups like the Gates Foundation are supporting the idea of DIY education (e.g., http://diyubook.com/), and I think it is largely part of Gene Glass’s diagnosis of the trajectory of educational reform in Fertilizers, Pills & Magnetic Strips. In short, DIY education fits very nicely as part of an argument for defunding education as a public good.

    Like I said, I think this is really exciting stuff, and I want to be on board, but I also am hopeful that we can come up with ways to think about integrating this kind of education with public education, if only to help more broadly provide opportunities and show the possibilities to people who won’t otherwise be aware of them.

    I realize that this public good argument is much less of a concern at the graduate level than at each previous level of education, but there are similar arguments being made about undergraduate education as well.

  11. lemmy caution says:

    I was thinking about this guy:

    http://www.lawschool.cornell.edu/faculty/bio.cfm?id=422

    http://ww3.lawschool.cornell.edu/faculty/faculty_cvs/Herstein.pdf

    You are right that the other article was written by a philosopher.

    Admittedly, access to actually useful law review articles is better than ever thanks to the internet.

  12. The only self-directed people I know who seem to do OK are programmers. I don’t know if it’s the nature of the work (binary? forgive the pun… either the code works or it doesn’t, regardless of any traditional degree or certification) or something else. Other professions seem to want more of that traditional “authentication.” Or am I making too much of this?

    As a non-financially focused pursuit, however, I’m right with you. The day I don’t learn something new is the day I die, etc.

  13. sarah says:

    It would be a lot easier to make use of syllabi and class lectures available online if it were possible to buy online copies of course readers, or for people not affiliated with a university to get access to academic journals (and books!).
