Walk This Way

Swarthmore had its commencement this morning, on a very pleasant late spring day. I like to sit closer to the front so I can see the graduates and the audience, but it does mean I have to pay attention, since they can see me.

One thing that I always notice, year after year, is the striking individualism of every single graduate’s way of walking across the stage to receive the diploma and a handshake from the president. Some of them walk confidently across with a faint smile. Some are somber. A few scurry furtively across like a night-dwelling creature who has been surprised by a bright light. Some of them shake Al Bloom’s hand like they are his fraternity brothers, others shake his hand like they’re afraid he has a joy buzzer in it. A few skip across in delirious joy, waving to friends and family. Some shake their diplomas gleefully at the crowd. Some look at us to their right, some look at family to their left, most stare fixedly ahead, as if they’re afraid the stage will come to an abrupt end and they will pitch off into the bushes. This year, one man scowled fearsomely at Al, and then winked knowingly at him. Caps occasionally pitch off of tall or hirsute heads, usually right when the graduate leans to receive a diploma.

For the students I know well, somehow the walk always captures their personality perfectly: shy, bold, indifferent, quietly confident, boisterous, performative. A window is framed by twenty steps, and then they’re back in the crowd, one more face among many.

Posted in Academia, Swarthmore | 6 Comments

When Do You Shelve a Course?

One of my courses this semester felt slightly off. I’ve taught this course enough times to see some variations in how and when the students are engaged by this material. This group seemed the least engaged I’ve seen, even though there were plenty of dynamic students and good end-of-term papers, and the readings seem as interesting as ever from my perspective.

It may just be the mix of students. One of the best things any institution could do for a junior professor is give them two sections of the same class, just to see the difference that the accidental mix of different personalities can make in a course. I had that experience early in my career at Swarthmore, and it gave me a sense of perspective I couldn’t have had any other way. Or maybe it’s the size: the class was too large by about ten people. We got into that dreaded pattern where only a few people were highly active in discussion, while other people settled into silence and from there drifted in and out of connection to the course as a whole.

However, I think one of the most dangerous things for any professor is to just stick with a particular elective or topical course no matter what, to assume that some material or approach is timeless.

Many years ago, I taught a class on colonialism and gender where I decided almost immediately after the end of the semester that it was the last time I would teach it. It wasn’t a bad class from the perspective of the students, judging from their evaluations. The problem was that the historiographical debates that I was trying to discuss with the students didn’t seem debatable as far as they were concerned. I wanted to ask them how and when to choose between focusing on gender and focusing on other topics or perspectives. Their answer was that you can always do it all: race/class/gender/sexuality, plus anything else that interests you or seems important. I wanted to talk about whether “gender” as a category made sense cross-culturally or across time, and they seemed to find that a non-question. I wanted to read some of the critiques of feminism as a Western, white, middle-class project and figure out how to consider, accept or reject those critiques. They shrugged. They were very interested in some of the historiography, particularly material on masculinity in colonial societies. Even when they weren’t interested in the material, their disengagement was thoughtful. These were smart, thoughtful students. So I took their reaction seriously. It suggested to me that a historiography that had meant something to me while I was in graduate school no longer meant much to contemporary students.

When I talk to colleagues who are disappointed by similar disengagement from or indifference to a long-running course, I often ask them to consider whether the shelf life for the course has expired, at least in the form that they’re accustomed to teaching. If that’s true, you can put it aside, but you can also try to strip it back to some essential or deeper question posed by the material. It may be that the scholarship you’re teaching about has gotten too baroque, too consumed by its own miniature intellectual history, too much a game for insiders. But it may also be that the course is about issues that current students are no longer prepared to study or no longer much interested in thinking through.

On the other hand, the whole point of scholarship is to preserve an ongoing conversation about subjects that matter. One of the basic missions of any curriculum is stewardship, to define what it is that students do not yet know and what it is that we believe they should know and be able to do by graduation. Some classes you’d never want to retire even if students no longer had an intuitive or instinctive engagement with the subject matter. I might decide that a course on some particular aspect of African history or cultural history is no longer viable, but I’m not about to decide that African history as a whole is no longer worth teaching.

It may be easier for me than for some people to just rip up my syllabi and start fresh, because I routinely do that even with courses I teach regularly. I don’t keep lecture notes from year to year: I always re-do my outlines. The more your own teaching practice is built around a legacy of preparation, the harder it may be to write that preparation off as a sunk cost. Also, the more responsible you are to a core curriculum that has been worked out as a shared or collective project, the less leeway you have. No matter what, though, you’ve got to do some spring cleaning from time to time on your personal catalog.

Posted in Academia | 2 Comments

Letting Go and Keeping Weird

I’ve had my finger hovering over the send button of an irritated email addressed to the Alumni Office at my alma mater, Wesleyan University, for about two weeks now. The main reason I don’t send it is that I would be equally annoyed to be the recipient of a similar email from an alumnus of my own institution, Swarthmore College.

It’s really hard to accept change in an institution that you cherished as it was. (Yet more declension narratives!) If alumni have a useful role, in fact, it’s as guardians of the essential traditions and values of their own institution.

“Essential” is the tricky thing here. No college or university in the United States is anything like what it was seventy-five years ago. In the early 20th Century, most private universities and colleges were far more strongly tied to a religious denomination. They were shabbily genteel places that educated a small fraction of a social elite. They were nothing like the multimillion-dollar institutions that inspire dread and frenzy among high school juniors and their parents every spring.

The stereotypical personalities that most individual colleges and universities are known for today came together fairly recently, partly invented or distilled by the first generation of consultants, guides and counselors who appeared as a byproduct of the growing selectivity and centrality of higher education after 1960.

I think most selective colleges and universities today feel some ambivalence about their reputational brand. Swarthmore is sometimes uneasy about its reputation for seriousness, over-intellectualism and masochism, for example. The stronger your perceived niche or character, the more it limits the pool of potential applicants and matriculants. Moreover, the institution’s personality can become more and more exaggerated as it attracts more and more of the same kind of prospective student.

When I was looking as a prospective college student in the early 1980s, I was sure of a few things. I wanted to be in a small college rather than a large university. I wanted to be on the East Coast, largely because of my Californian-derived romantic (and mostly inaccurate) conception of the East Coast. Other than that, I was fairly open to the possibilities. Like many teenagers on their college tour, I let my impressionistic reaction to each place guide my later decisions. I liked the look and feel of Wesleyan, and I liked my interviewer. (I hated my interviewer at Swarthmore: he was a supercilious jerk.) What I also liked at Wesleyan was a vague, funky sense of weirdness and eccentricity.

Hence my not-sent email. A short item about Wesleyan in the New York Times talked about the college in relation to Obama’s commencement speech there. (He managed to refer to Wesleyan as Wellesley, which happens a lot.) Along with Obama’s speech, Wesleyan has been in the news lately for a near-riot that laid bare some long-standing town-gown tensions around the university. But what caught my eye was the discussion of the administration’s attempts to “mainstream” Wesleyan’s institutional culture, in part by insisting that some long-standing student traditions be renamed.

I lived for two years in West College, the epicenter of this pressure for change. We had two big dorm parties called Zonker Harris Day and Uncle Duke Day, both of which involved 80s college students re-enacting their impressionistic understanding of 60s counterculture, the drugs and rock-and-roll parts. It wasn’t just these two days: the whole dorm had a cheerfully weird feeling to it most of the time. I liked it, though once I got there, I realized I was personally pretty square in a lot of ways.

It seems entirely possible to me that this schtick has gone well past its expiration date. But when administrators and faculty get involved in trying to rebrand the institution, including scuffing over names and associations that interfere with the image they’re trying to communicate, that’s usually a problem.

It’s okay to try and remind your applicants and public that there’s more to your institution than its most exaggerated reputation. Not all or even many Swarthmore students are overintellectualized masochists who play misery poker and complain that anywhere else it would have been an A. Even in the 1980s, relatively few Wesleyan students were the kind who wished they’d been in Chicago in 1968.

If something is really past its expiration date, though, then the current participants will rename it themselves, or give it up, even despite pressure from alumni attached to the good old days. If some day Swarthmore students decided that the annual event Screw Your Roommate was stupid or irrelevant, they’d stop doing it. Current Swarthmore students did decide to rename a long-running student interest group dedicated to games, science fiction and geek culture.

Here I’m not speaking as an alumnus or a professor. This is my general position on cultural change within long-lived institutions. When the change comes from the top as a dictate, that’s usually a counterproductive mistake that can have perverse and unintended consequences. (Among them in this case is that Wesleyan might become a bland safety school, since the students attracted by a weirder Wesleyan are a highly desirable fraction of its student body.)

But I don’t think I’ll press the send button on that email, because I equally think that the cultural life of a college or university isn’t something for alumni to try and micromanage.

After all, we’ll always have West College.

Posted in Academia, Swarthmore | 9 Comments

In My Day…

I seriously hate declension narratives. Anything that starts out with, “Once upon a time, there was a golden age, and then the barbarians came and wrecked it all…” gets me going for my guns, even when it’s a reasonable enough story, and now and again there is something to claims of degeneration, failure and loss. The problem is that even the reasonable arguments drift quickly into the borderlands of exaggeration and from there often just go ahead and boldly march into being a big lie.

A very large number of the popular narratives of decline and fall that have circulated in American society for the last thirty years or so, for example, take conditions that were a brief, specific consequence of the post-WWII reorganization and affluence of American society and start to reframe them first as a general part of the entire 20th Century, then as something basic to American history all the way back to colonial settlement, and then leap the Atlantic and usually plow straight for the Aegean, coming to rest in Greece, Rome or Jerusalem.

If you want to argue that there was some social or institutional condition in 1955 or 1960 in the U.S. that was admirable and is now lost, you might have an argument. If you’re going to start getting all moony about the loss of the entire history of Western Civilization, you’ve got a harder job ahead of you, and typically those who do so don’t treat it as a harder job, but instead puff and blow more insistently the more eye-rollingly hilarious their glossing of the past becomes.

A prime example: this book review by Christopher Orlet, via Erin O’Connor. I hope that it’s not an accurate description of the book it purports to review, since Orlet’s comments are a generic sort of declensionist rant that I suspect he could serve up whether he was reviewing a book about higher education, discussing the new Indiana Jones film, or grousing to his family over dinner about life in general.

Try this passage:

AT ONE TIME the purpose of a university education was to give future leaders an opportunity — before they shouldered the dull burdens of civic responsibility — to explore the purpose and value of life. By instilling a strong sense of history, of reason, of logic, of the best of what has been thought and said, a background in the Humanities would prepare a young scholar for whatever may lie ahead.

This, at least, had been the belief going back to Plato’s Republic.

Or how about this one:

The Sixties Generation broke with this four-thousand-year tradition. If the bugbears of early 20th Century radicals were the consumer-driven economy and the thoughtless pursuit of material comfort, then the Baby Boomers’ bete noire was Western Civilization and all it entailed.

From then on, social change, rather than concerns about work and consumption, would be paramount on college campuses. Such change would not come from the government or the people, but from the university, since the university was uniquely situated to tackle moral issues. After all where else could one find so many smart, morally superior persons? First, however, the university, and its Humanities departments (the propagandizer of the elitist, racist, sexist, imperial tradition of Western culture) must change and adapt.

In the subsequent 40 years the radicals and their political agenda have triumphed unopposed on the college campus, so much so that today’s student is compelled to conform to an intolerant progressive doctrine if he hopes to receive his sheepskin. Students are now told that there is a single right answer and, like the Sphinx, only he, the professor, possesses it.

As I said over at O’Connor’s blog, is Orlet seriously trying to argue that there has been a continuous tradition of scholarly humanistic inquiry for the last 4,000 years that was about liberty, freedom, open-ended explorations of the purpose and value of life, that stood unblemished until the dirty hippies came along forty years ago and turned it to indoctrination?

I can’t think of a better proof of declension in the quality of historical education, if Orlet claims to be historically literate. Seriously, it would help, if you want to complain about the declining quality of the humanities, to not be a historical dunderhead on a fantastic scale, to demonstrate some degree of erudition. I’d really like to hear more about the democratic humanities in the West as practiced in universities and their open encouragement of free-thinking liberalism in 8th Century Europe. Or, hell, why not classical Athens? Right, right, I know, short reviews demand generalizations, just making a point to start debate, the usual excuses.

I think this is a generic kind of fallacy that slips into declensionist stories, and not just conservative ones, a misremembering and compression of the details and messiness of history as we have lived it. I’m not going to be so much of a prig for accuracy as to argue that fantasies about the past don’t sometimes have a constructive, healthy relationship to transformations of the present. The general problem with delusions about decline, however, is that they mislead us into thinking that we are trying to restore some past covenant or arrangement when what we are really trying to do is create something that has yet to exist. On that confusion, both bad and good projects often run aground, but not before they do a lot of collateral damage in the process.

Posted in Academia | 43 Comments

Research Notes: GI Joe

Sorting through some old papers in my version of spring cleaning, I found some notes I wrote while watching two episodes of the 80s version of GI Joe while I was working on Saturday Morning Fever. It was one of the shows of the 1980s that I didn’t have a personal memory of, so I felt kind of obligated to watch it. The episodes in question were “Cobra Stops the World” and “The Revenge of Cobra”.

So here they are as I wrote them circa 1997 or so: intellectual work in progress! Keep in mind that I hadn’t ever seen the cartoon, a rare lacuna in my kidvid knowledge. Keep in mind, too, that this was before Wikipedia or any pop-culture wiki: researching this stuff used to be hard, you young whippersnappers.

———–

Women in the military.

Everybody has a gadget, single-note personality formula.

Advanced technology with no general consequences, very comic-book.

Cobra guys bail out, escape exploding buildings–Standards & Practices obviously very hard at work to make sure that there is no impression that bad guys are getting killed (that line wouldn’t be crossed for quite a while afterwards). No wonder Cobra is always back next week, strong as ever, they never lose a single guy.

“Yo Joe”. This is not an inspiring battle cry.

Sparks, guy with big backpack.

“Eat hot knuckle, snakeface!”

Whooops on the no death thing–Joe does seem to trap a bunch of Yanomami miners in a diamond mine with a big explosion. Off-screen civilian collateral damage ok?

Who is red-haired woman? “Scarlet”.

This is like “Where in the World is Cobra Commander?” Educational!

We are at Cobra’s secret base in Patagonia. It has a giant Cobra-shaped tower on it. “Secret”.

Cobra Commander is a graduate of the Lex Luthor school for last-minute escapes while heroes stand around congratulating themselves. Cartoon villains have low self-esteem; they expect to lose, because they spend so much time preparing to escape.

“Revenge of Cobra”

Gung Ho. Refugee from Village People? I am sorry, but this is one of the gayest things I’ve seen.

New one-note people: woman with laser javelin, guy with a dog.

“Snake is sneak spelled sideways”.

Cobra’s battlecry: “Cobra!” Catchy.

Duke abandons helicopter to fight 25 guys hand-to-hand. Tactical brilliance.

Roadblock: kind of a low-rent Muhammad Ali, but a step up from Black Vulcan, etc. as far as cartoon multiculturalism goes.

Zartan: “While GI Joe and Cobra battle for the land, I rule the swamp!”. Wow, what an accomplishment.

More not-secret secret hideouts.

Destro: why don’t you stick with your inventions, man. So it failed once, try again. Why keep making new world-ruling thingies?

What is Cobra’s revenue base?

Why not just nuke Cobra’s headquarters if they’re controlling the weather from it? That would solve the problem real quick.

Shipwreck: looks like the old GI Joe doll, the big manly bearded guy. “Don’t worry, I’m not with Cobra. I’m with myself and anybody who can pay my price”. Mercenary or gigolo?

Someday someone in some pop culture should get trapped under a closing door and have their legs crushed. Just saying.

Lots of scene recycling, standard 80s animation, almost porn-like with its repetition. Yes, the people who are complaining about toy tie-ins may have a little bit of a point on this one. Some character designs appealing, though.

Posted in Popular Culture | 6 Comments

The Summer Scorecard

1. This is the kind of issue my colleague Bob Rehak thinks about so very well, but I was really struck watching Prince Caspian at how poorly the WETA house style serves that film. The script was pretty decent considering that, of Lewis’ four “core” Narnia stories, I would flag Prince Caspian as the one with the biggest storytelling issues in terms of adaptability to a film. The book starts slow, spending a much longer time in the ruins of Cair Paravel, and then proceeds into an extended flashback. There’s only a bit of dramatic tension among the protagonists, early on, when Lucy can see Aslan and the others cannot. Caspian and the Pevensies only meet late in the story, and have few strong interactions. There’s only one major battle with a few offstage conflicts. The adaptation shortens the opening, sharpens the dramatic tension, adds a major battle sequence and tries to give Peter and Susan more of a dramatic arc (largely at the cost of making Peter a pouty whiner, unfortunately).

This is all fine, but WETA’s visuals end up making Prince Caspian feel like a cadet version of Lord of the Rings, and I almost think that would have happened even if there had never been a Peter Jackson Lord of the Rings. Narnia somehow seems to me to need a wispier, less pseudorealist style, to look more earnestly fairy-tale.

I remember my viscerally negative reaction to seeing Disney’s The Black Cauldron in 1985 as a college student. Before I had any reaction to the (numerous) storytelling failures in the film, there was the fundamental issue that the cherubic roundedness of the Disney house style was simply and fundamentally the wrong choice for Lloyd Alexander’s Welsh-inspired fantasies.

WETA’s work is not technological destiny. The visual aesthetics found in digital games suggest the range of what’s possible: if you can have Okami, Grand Theft Auto and World of Warcraft, you can have satyrs, centaurs and dryads who don’t look like they wandered over from a forgotten corner of Middle-earth and took a left turn at Lantern Waste.

2. Speaking of works that demonstrate what’s visually possible, Speed Racer deserves a lot more critical appreciation than it’s garnered so far. I’m not a huge fan of the original cartoon, but I enjoyed the film a lot. The visuals alone are very interesting (and yes, they’ll probably give you a headache in parts), but the film is also just a lot of fun. Chris Sims is perfectly right to call out the sheer awesomeness of a film that features Racer X punching an upside-down Viking racecar driver. The entire cross-country race in the middle of the film is great, in fact, full of interesting images and great action sequences. Heck, I even found Spritle and Chim-Chim amusing, which is a wholly new sensation in my life. Yes, the film could use some further editing: snip out one or two bits of Mom Racer talking about how proud she is of Speed and a few other character bits, maybe, to shave it to two hours or under.

Posted in Popular Culture | 5 Comments

Slides and Chalkboards

I have two questions, both in reaction to Margaret Soltan’s insistent complaints about PowerPoint usage in university classrooms.

1. How common is repeated, regular PowerPoint usage by professors in higher ed classrooms? (E.g., not the occasional usage to show visual material, but the routine use of it for lectures.) Do we have any systematic idea? Does it differ significantly between less and more selective institutions? Between disciplines? (I’m fairly sure that usage is heavier in the sciences and social sciences.)

2. What’s the difference between bad usage of PowerPoint in lectures and bad lectures that involve hand-outs, overhead transparencies and writing on the chalkboard? Are we just complaining about old wine in new bottles here? Is the real culprit professorial droning at classrooms of 200+ students followed by recite-repeat-and-forget examinations? I think it’s at least plausible that the technology is just giving us a new reason to pay attention to a pedagogy whose effectiveness has been suspect for two generations.

Posted in Academia | 22 Comments

I Agree!

I don’t usually post simple applause entries, but Patricia Nelson Limerick pretty much speaks for me in every respect in this column for the Chronicle of Higher Education.

I think she’s particularly correct when she argues that the “lament of the humanities” about a lack of support is a consequence of one mode of practice in the humanities being actively hostile to connecting to public discourse. I am ok with intellectuals who believe that they should reject the public sphere as it exists, but then they shouldn’t complain that there isn’t any public or institutional underwriting of that rejection.

Posted in Academia | 1 Comment

The Wrong End?

One of the basic problems with Culture War 2.0 as it involves higher education is that both the critics and defenders of academia spend so much time focused on a small number of elite and well-resourced institutions. There are questions about academia that involve Harvard, Swarthmore or Michigan as much as they involve any other college or university, but many of the perennial issues under discussion look very different when we’re talking about the full range of institutions of higher education in the United States.

Take assessment, for example. It’s true that elite private and public institutions need to do a better job of figuring out whether or not their teaching is accomplishing its goals. In my view, it’s also true that standardized national testing would be a truly stupid and counterproductive way to help figure that out, and would run counter to much of the best pedagogical practices at those institutions.

The conversation is already badly malformed if it starts at that part of the spectrum. Consider, in contrast, this story from Inside Higher Ed about Norfolk State University, a historically black university. A biology professor there has been denied tenure, and there is a strong consensus among all parties involved about the reason why: he failed too many students. Not because he was erratic or vindictive in his grading, but because the university administration tells the faculty that 70 percent of their students should pass a course. The professor failed many of his students simply because they attended fewer than 80 percent of the class sessions, which the university maintains is the required attendance to pass a course. In fact, the average attendance in his courses was 66 percent, and as he notes, that means he should actually have given out even more Fs than he did.

Read the IHE story further and you see that only 12 percent of students graduate from Norfolk State in four years, and only 30 percent in six. To be fair, there is a deeper question here than Norfolk State’s policies. The administration there, like many at less selective colleges and universities, maintains that its task is to take students with inadequate K-12 preparation and find a way to compensate for that preparation, to educate the students up to a higher standard. That is a numbingly difficult mission to tackle. Four or even six years is too short a time to make up for thirteen, especially when some of the things you’re trying to make up for are intrinsically harder for adults to absorb and learn.

This is where the debate over national standards for assessment and accreditation really should kick into gear. What do we all think is the acceptable minimum standard of ability and knowledge for a college-educated student? If I see that someone has passed biology at a college, whether it’s Swarthmore or Norfolk State, what do I have a right to expect of that student? More to the point, what do we really want to do as a nation about the problem of inadequate educational preparation? I understand that the most appropriate solution is to ensure that every student gets a good K-12 education, but as most of us are now painfully aware, that’s a daunting public policy problem that has resisted an ideologically varied spectrum of responses.

So should colleges and universities be charged with making up the difference in preparation, and what happens when an established minimum standard meets up with seriously inadequate preparation? If you enforce the standard stringently, many students at the large number of four-year institutions like Norfolk State will fail: a graduation rate of thirty percent over six years is probably higher than what would follow on strict standards enforcement. So then what? Are we prepared as a society to say to young adults who’ve gone that far and failed, “Sorry about the terrible educational preparation provided by your government, but you’re just going to have to lump it with whatever life and career you can find for yourself”?

How will we know where the institutions are at fault and where the student is at fault, where the student could have learned or performed and simply didn’t? Whether it’s Swarthmore or Norfolk State, those are hard questions to answer. Take any two B minus grades I might give on an essay and I’ll have a private opinion about whether that’s a “B minus, because you can do better and you didn’t bother or you screwed this one up a bit” or a “B minus, because that’s about the best you can do for now, given your understanding of writing and of the subject matter”. The policy consequences of that question are different when we’re talking about the typical experience of higher education in the United States rather than the highly selective elite institutions, though. Even if there’s a social cost to the whole society to failing the students at Norfolk State who didn’t show up to half their classes, I think the institution has to enforce that standard. (Though on the other hand, if the students who didn’t show up performed strongly on appropriately challenging tests or assessment measures, that would pose a different kind of puzzler.)

A lot of the time, academic bloggers let these conversations get driven by the local circumstances and questions they face, which is only natural. But at least some of the time, we should be talking about what is typical in higher education rather than what is exceptional. This doesn’t apply only to debates about assessment or national policy. If, for example, we’re talking about a professor or instructor whose questionable pedagogy, odd scholarship or public pronouncements have drawn attention, it matters what end of the academic labor market we’re talking about. On one hand, in an ideal world, a $15,000/year full-time salary really ought to buy something less reliable than a $100,000/year full-time salary when it comes to probity, competence and commitment. On the other hand, if we have questions about academic qualifications or competence that we suspect aren’t just a matter of idiosyncratic or personal failure, those questions matter far more at the institutions that serve the vast majority of college students in the United States. Or if, like Margaret Soltan, you’re worried about the misuse of instructional technology and laptops in the classroom, the most pressing question is about what the average university experience is like rather than about what an exceptional or unusual instructor might creatively choose to do with instructional technology.

Posted in Academia | 5 Comments

Swarthmore News

Swarthmore’s president, Al Bloom, announced today that at the end of his next contract in August of 2009, he will be ending his service as president of the college.

This is a big deal for all of us. He’s the only president I’ve worked for at Swarthmore, having led the college for 17 years, three years longer than I’ve been here. It is very hard for me to imagine what the place will be like without Al.

I know faculty are supposed to grouse about their presidents, but I think he’s done a consistently terrific job. Among other things, I’ve found his great enthusiasm for the college, its students, its faculty, its staff and its alumni to be completely charming and wholly sincere. I’m going to miss his leadership and presence.

Posted in Swarthmore | Comments Off on Swarthmore News