Rules and Regs

The College has just submitted its Periodic Review Report (or PRR) to the Middle States Commission on Higher Education, our accrediting agency. The PRR is an “interim” report, provided at the midpoint between our decennial self-studies. Though it is not quite the bustle of a self-study – e.g. the bulk of the work is accomplished by one committee that works with others across campus, rather than a multitude of committees, and there is no on-site visit from a team of examiners – it is an important accreditation event that takes a great deal of time and work to prepare. A team of external evaluators will examine our document and accompanying materials to determine whether we are making progress in the areas identified in our last self-study and continue to meet the standards of excellence required by the Commission. They will make recommendations to the Commission, and the Commission will review our accreditation status in the fall.

Our PRR steering committee worked through last year to revisit our last self-study and the report of our examiners, and to assess just how we have been doing in the many areas addressed. The last five years have been a time of considerable change at the College – we came through the economic downturn, had a presidential transition, developed a new strategic plan, and are in the implementation stages of many new initiatives arising from the plan. The committee had a lot to grapple with. Their work was assembled and a report drafted through the summer and fall, and shared with the community for comment and input. A final draft was prepared this spring. Summarizing all of that activity in one document was difficult, but the result attests to how busy we have all been!

Middle States Federal Regulation Form

We were in great shape on the timing of this process; we had planned and paced ourselves well. Our review gave us a good sense of the areas where we have made good progress and those we need to focus on. We would complete our report before the deadline. But there is another part of the PRR submission process that has been revised since our last accreditation event – the “Verification of Compliance with Accreditation-Relevant Federal Regulations.”

Of course we comply with all federal regulations, but this new form requires an assemblage of evidence, and that task had not been on our radar. Doh! So much for being ahead of schedule. Off we scurried to locate our written policies and to figure out what evidence we could gather to demonstrate that we are indeed adhering to them. We discovered that we actually did need to clarify and better codify a few things, particularly how we assign credit value to a course. This is one of those things we have done the same way for longer than anyone can remember. We’ve never had a problem, so it didn’t occur to us that our policy might not be crystal clear. But it wasn’t! So we revisited and tweaked our policy and will be giving it more attention going forward. This was not a fun exercise, but it is probably exactly the result intended by the new requirement. A vague policy that relies on everybody simply following tradition could be a disaster waiting to happen – and one we have now avoided.

Posted in Uncategorized | Comments Off

In Case of IR…

photo by Andy Shores

Posted in Institutional Research Profession, Miscellanea | Tagged | Comments Off

Anecdotes versus Data: A False Dichotomy

Photo by Image Editor

Anecdotes often get a bad rap – sometimes deservedly. We have all seen examples of narratives plucked from the public smorgasbord and used to prop up a pre-conceived ideology. Given the prevalence of this often irresponsible and manipulative use of narrative [discussed further in the Huffington Post’s “The Allure of an Anecdote”], it is easy to lose faith in the power of stories. This periodically leads to a surge in demand for hard data and evidence regarding everything from healthcare to higher education. But data and statistics take their fair share of heat as well. For one thing, it turns out that data analysis is subjective too. Data can be manipulated, massaged, and infused with bias. And strictly ‘objective’ quantitative analysis tends to come across as cold, devoid of feeling, and uninteresting. We know enough to know that numbers never tell the whole story. Standardized testing alone is a grossly inadequate assessment of educational enrichment, and when organizations uncompromisingly focus on ‘the bottom line,’ it makes most of us uncomfortable at best.

This methodological tension is an exemplar of how the solution is rarely to be found in the extremes. Unfortunately, these two approaches to knowing the world have such strong advocates and detractors that we are often drawn toward diametrically opposed camps along a false continuum. Compounding the problem is that shoddy and irresponsible research at both ends of the spectrum is regularly circulated in mainstream media outlets.

This divorce is particularly problematic given that quality science, good journalism, and effective research tend to integrate the two. So-called “hard data and evidence” need narrative and story to provide validity, context and vitality. On the other hand, anecdotes and narratives need “hard data and evidence” to provide reliability, and to help separate the substantive from the whimsical. In responsible and effective research and analysis, the methodological dichotomy is brought into synergy, working together as structure and foundation, flesh and bone. The Philadelphia Inquirer printed a series on poverty in 2010 that serves as a good example from the field of journalism [“A Portrait of Hunger”]. Done effectively, data and narrative are inextricably melded into a seamless new creation.

In my short time thus far in Institutional Research at Swarthmore, I have been impressed by many things, one of which is the simultaneous respect for research and evidence-based decision making alongside respect for stories, nuance, and humanity. When the values and mission of a college call for an environment that respects both, it facilitates the practice of effective and balanced institutional research.

Posted in Data and Reporting, Institutional Research Profession | Comments Off

Time flies…

The fall has been whizzing by, and here it is Halloween already. I shouldn’t be surprised; it’s been busier than ever. (I know. I say that every year.) We’ve completed two massive projects involving tracking student enrollment and outcomes over multiple cohorts and years, and another one that was small potatoes after those. The Associate Provost and I have spent a ton of time building on the considerable work of our Middle States Periodic Review Report (PRR) Steering Committee to create a first draft of the report. Our new IR staff members and I have worked together to get up to speed (including the fall freeze and fall IPEDS reporting). We’ve fielded four – no, five! – surveys (so far), and are getting pretty darned good at Qualtrics. (Nothing like troubleshooting to help you learn something.) I’ve gone through the CITI training for IRB and feel incredibly ethical. And of course we’ve dealt with all the usual ad hoc requests and miscellany. But with some of this big stuff behind us, and what is turning out to be a terrific IR team, it’s like the sun coming out. For the first time in a year it seems we’re almost caught up. We freeze employee data tomorrow, so that may not last long.

Posted in Institutional Research Profession | Comments Off

Happy New Year!

photo by Darwin Bell (http://www.flickr.com/photos/darwinbell/)

Having worked in higher education for all of my adult life, I’ve never gotten over that “kid” feeling that September represents a new year.   More than January, it offers new beginnings and possibilities.   Faculty members come back from their summer activities recharged, and with new ideas and projects.   Our students return, literally, in earnest.   The quiet, sunny paths become challenging to navigate as people Have to Get Somewhere.   My new year’s resolution for this fall is to enjoy these moments.   I love helping a first-year student find a building, or hearing a student talk excitedly on their cell phone to a parent about a new class.  Or seeing a faculty member help a new colleague understand our customs and practices.  (“What’s a WA?!”)  When not too busy helping faculty and staff with their new ideas and projects, Institutional Research has a moment to catch its breath before our fall freeze, and watch the excitement.    My wish for the College in this new year is peace, love, and understanding.

Posted in Institutional Research Profession, Swarthmore | Comments Off

A New Era

I am so very happy to share the news that today marks the beginning of a new era for Swarthmore’s Institutional Research Office! We are now fully staffed, in our new configuration. Pamela Borkowski-Valentin is our Data and Reporting Officer, and Jason Martin is our Institutional Research Associate.

The Data and Reporting Officer is responsible for creating and managing primary IR datasets (including admissions, student, course, degree, faculty, and staff data extracted from Banner; survey data; and peer data), and using this data to meet both internal and external informational needs. She produces the Common Data Set, the Fact Book and other basic data summaries; responds to mandated state and federal reporting; and addresses ad hoc requests for information.

Pamela has a BA in Sociology, and Master’s degrees in Social Service and in Law and Social Policy (from Bryn Mawr). She has worked for the last five years as a Research Analyst in the Montgomery County Community College (MCCC) IR office, where she supported the data needs of faculty and administrators by gathering and analyzing institutional data, creating and conducting surveys, and writing reports on a range of topics. She is adept at working with internal data as well as using external data to inform institutional questions. Pamela has a broad understanding of the challenges facing college students, gained and enhanced through her background in social work, her internship experience at the Bryn Mawr Civic Engagement Office, and her work with two particular programs at MCCC: the KEYS program, which focuses on disadvantaged students, and the Minority Male Mentoring Program.

The Institutional Research Associate is a social scientist, expected to use descriptive, exploratory, and inferential statistical techniques to focus on special issues and research projects. He is primarily responsible for more in-depth studies, such as examining the factors that influence students to change majors or the effectiveness of peer mentoring programs.

Jason has a BA, an MA, and recently completed a PhD, all in Sociology. His dissertation topic was “Marketization in the Nonprofit Arts and Culture Sector: An Organizational Analysis.” He has worked for the last six years as a Research Analyst for the Metropolitan Philadelphia Indicators Project (MPIP), as well as doing some consulting work. At MPIP he conducted both quantitative and qualitative (e.g. focus groups) research studies that examined how region-based factors such as education and economy relate to the activities of non-profit organizations. He has a broad base of statistical skills to help him reveal meaning in data, and he is skilled at reporting findings clearly, without sacrificing rigor or “talking down” to his audience.

I hope our Swarthmore friends will stop by to welcome Pam and Jason!

Posted in Uncategorized | Comments Off

Numbers, numbers…

Most IR people are fascinated with numbers, logic, probability, and statistics, which is part of what draws us to our field. We like to poke at data, and think about what they can and cannot tell us about the phenomena they reflect. It’s not surprising that many in the profession think that Nate Silver is something of a god. And so when one of our favorite numbers gurus addresses a topic in higher education, our day is made!

Yesterday in his blog, Five Thirty Eight, Nate Silver talked about what the changing numbers of college majors do and do not tell us about college programs and whether or not some majors are suffering from an increased emphasis on career-focused programs. He uses data from the National Center for Education Statistics – data provided by Institutional Researchers through our IPEDS reporting – and employs a standard IR approach in offering alternative perspectives on numbers: using a different denominator. No spoilers here; I couldn’t possibly do it justice anyway. Please go read.
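
To make the denominator point concrete without spoiling Silver’s analysis, here is a minimal sketch in Python. The counts below are entirely invented – they are not Silver’s figures and not NCES/IPEDS data – and serve only to show how the same counts can tell two different stories depending on what you divide by.

```python
# Hypothetical counts only -- not NCES/IPEDS data and not Silver's numbers.
# The point: the same raw counts look different depending on the denominator.

years = {
    1990: {"majors": 50_000, "all_degrees": 1_000_000, "college_age_pop": 25_000_000},
    2010: {"majors": 65_000, "all_degrees": 1_600_000, "college_age_pop": 30_000_000},
}

for year, d in sorted(years.items()):
    share_of_degrees = d["majors"] / d["all_degrees"]          # denominator 1: all degrees awarded
    share_of_population = d["majors"] / d["college_age_pop"]   # denominator 2: college-age population
    print(f"{year}: {share_of_degrees:.1%} of all degrees, "
          f"{share_of_population:.2%} of the college-age population")

# With these made-up numbers, the major shrinks as a share of all degrees
# (5.0% -> 4.1%) while growing relative to the college-age population
# (0.20% -> 0.22%) -- same counts, two different stories.
```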

As an added bonus, Silver mentions his own undergraduate experience at the University of Chicago and advocates broad, diverse studies.   He didn’t explicitly mention “liberal arts education,” but at least a few of his readers’ comments do.   Oh well, you can’t have everything!

Posted in Data and Reporting, Higher Education | 1 Comment

Reflecting

I’m taking a little break from the phone calls and conversations with the amazing pool of candidates who have applied for the two IR positions to think about all of the changes taking place at Swarthmore. Commencement always makes me wistful, but there’s a lot to look forward to.

Everyone has been very busy this year working on the key initiatives that have come from our recent planning process, and these are starting to take shape.   A campus master plan that helps us to prepare for the changes that these initiatives may bring is entering its final stage of preparation and presents exciting possibilities.   This spring our students have challenged the College to more deeply engage with many issues of concern, and the poise and compassion of Swarthmore’s leadership in responding to these challenges has been an inspiration to me.   And of course, I’m looking forward to having a fully staffed Institutional Research Office which can support the College’s effectiveness in all of these efforts.

After this incredibly busy year, commencement this weekend, and alumni weekend next weekend, a relatively quieter period to recover and re-energize will surely be welcomed by all of us!

Posted in Swarthmore | Comments Off

Why IR is hiring

With an increasing amount of my time over the past two years spent helping the Provost’s Office and the College in general guide our assessment efforts, the IR Office has been struggling mightily to keep up with our work. In January we were approved for an additional limited-term position in the IR office to help offset the loss of my time. The need for the position will be reevaluated in three years, which corresponds to the term of the new (second) Associate Provost position. This is not an accident. Since both positions are intended to relieve overloads caused, at least in part, by our needs for assessment and institutional effectiveness leadership, it makes sense to review this new infrastructure a few years down the road to see how it is serving us as our assessment processes improve and our work becomes more routine.

With this additional IR position and Alex’s departure, I’ll in a way be replacing both Alex and myself, as I continue focusing more on assessment and on our upcoming accreditation mid-term report. But while Alex and I shared much of the responsibility for IR reporting and research in the past, I’ll be structuring the two positions to more distinctly reflect these two key roles. A Data and Reporting Officer will have primary responsibility for data management and for routine and ad hoc reporting for internal and external purposes. An Institutional Research Associate (the limited-term position) will focus more on special studies and is expected to bring advanced analytic skills to our projects. These two positions, and mine, will share somewhat in responsibilities and have just enough overlap to serve us in those many “all hands” moments. It should be an exciting time for Institutional Research – and for assessment!

Posted in Assessment, Data and Reporting, Institutional Research Profession, Swarthmore | Comments Off

Keeping Score

President Obama announced the new “College Scorecard” in his State of the Union address, and the interactive online tool was released the next day. The intended purpose of the tool is to provide useful information to families about affordability and student success at individual colleges. Since then, the IR community has been buzzing. Much of the data in the tool is reported via IR offices, and many of us are already being asked to explain the data and the way it is presented. Several of our listservs became quite busy as my colleagues compared notes on glitches in the lookup feature of the tool (zip code searches were problematic early on) and the accuracy of the data, and debated the clarity of the labels and the wisdom of the simple presentation.

This project is an example of a wonderful goal that is incredibly hard to execute well. Seeing all the press coverage (both mainstream and higher ed press) and hearing from my colleagues, I think about the balancing act such a project requires. It seems reasonable that after thorough development and testing, there would be a point at which the best course of action is to just move forward and release the tool even though it is not perfect. But where is that point? One could argue whether this was the right point for the Scorecard project, but all of the attention is creating increased awareness among the public, as well as pressure on the designers for improvement and on colleges for accuracy and accountability.

I wonder how many people remember the clunky online tool COOL (College Opportunities On Line) from the early 2000s, and the growing pains it went through as it evolved into the College Navigator, a pretty spiffy – and very useful – tool for families to find a wealth of information about colleges? These things evolve, and if they are not useful and effective, they won’t survive. The trick is not doing more harm than good while the kinks are worked out.

What’s in the Scorecard and where did it come from?   The Scorecard has six categories of information:  Undergraduate Enrollment, Costs, Graduation Rates, Loan Default Rate, Median Borrowing, and Employment.   Information about the data and its sources can be found at the Scorecard website, but it takes a little work!   Click on the far right square that says “About the Scorecard” on the middle row of squares.  From the text that spins up, click “Here”, which opens another window (not sure if these are “pop-ups” or “floating frames”), and that’s where the descriptions are.

The data for the first three items come from our reporting to the federal government through IPEDS (the Integrated Postsecondary Education Data System), which I have posted about before. Here is yet another reason to make sure we report accurately! The next two categories, Loan Default Rate and Median Borrowing, get their data from federal reporting through the National Student Loan Data System (NSLDS). The last item, Employment, provides no actual data, but rather a sly nudge for users of the system to contact institutions directly.

While each of these measures creates its own challenge to simplicity and clarity of explanation, one of the more confusing, and hence controversial, measures is “Costs.” The display says “Net price is what undergraduate students pay after grants and scholarships (financial aid you don’t have to pay back) are subtracted from the institution’s cost of attendance.” This is an important concept, and we all want students to understand why they should not just look at the “sticker price” of a college, but at what students actually pay after accounting for aid. Some very expensive private colleges can actually cost less than public institutions once aid is factored in, and this is a very difficult message to get out! But the more precise definition behind the scenes (that floating frame!) says “the average yearly price actually charged to first-time, full-time undergraduate students receiving student aid at an institution of higher education after deducting such aid.” The first point of confusion is that this net price is calculated only for first-time, full-time, aided students, rather than averaged across all students. The second is the actual formula, which takes some more digging. It uses the “cost of attendance,” which is tuition, fees, room, and board, PLUS a standard estimate of the cost for books, supplies, and other expenses. The aid dollars include Pell grants, other federal grants, state or local government grants (including tuition waivers), and institutional grants (scholarship aid that is not repaid). And the third point that may cause confusion is, of course, the final, single figure itself, which is an average – while no one is average.
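
To make the formula a bit more concrete, here is a simplified sketch in Python of how that single average might be assembled. Every figure and student record below is invented, and the real IPEDS net price calculation has many more rules; this only mirrors the basic logic described above: restrict to first-time, full-time students receiving grant aid, subtract their grant aid from the cost of attendance, and average.

```python
# A simplified, invented example of the net price logic described above.
# Nothing here is real data, and the actual IPEDS rules are more involved.

# Hypothetical cost of attendance: tuition & fees + room & board + an
# estimate for books, supplies, and other expenses.
COST_OF_ATTENDANCE = 43_000 + 12_000 + 1_500  # = 56,500 (invented figures)

# (first_time_full_time, total_grant_aid) for a handful of invented students.
# Grant aid bundles Pell, other federal, state/local, and institutional grants.
students = [
    (True, 38_000),
    (True, 20_000),
    (True, 0),        # no grant aid -> excluded from the net price average
    (False, 15_000),  # not first-time/full-time -> excluded
    (True, 5_000),
]

aided = [grants for ftft, grants in students if ftft and grants > 0]
avg_net_price = sum(COST_OF_ATTENDANCE - grants for grants in aided) / len(aided)

print(f"Average net price (first-time, full-time, aided): ${avg_net_price:,.0f}")
# -> $35,500 with these invented numbers; full-pay students never enter the average.
```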

Will a family dig that deep?   Would they understand the terminology and nuances if they did?   Would they be able to guess whether their student would be an aid recipient, and if so, whether they’d be like the average aid recipient?   The net price presentation that already exists in the College Navigator has an advantage over the single figure shown in the Scorecard, because it shows the value for each of a number of income ranges.   While aid determinations are based on much more than simple income, at least this presentation more clearly demonstrates that the net price for individuals varies – by a lot!
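
Continuing the sketch, the Navigator-style presentation amounts to grouping before averaging – the same subtraction, broken out by family income range rather than collapsed into one number. The brackets and records below are again invented, purely to illustrate why the breakdown is more informative.

```python
from collections import defaultdict

# Same invented cost of attendance as the sketch above.
COST_OF_ATTENDANCE = 56_500

# (family_income_bracket, total_grant_aid) for invented aided FTFT students.
records = [
    ("$0-30k", 50_000), ("$0-30k", 48_000),
    ("$30-48k", 42_000),
    ("$48-75k", 30_000), ("$48-75k", 34_000),
    ("$110k+", 8_000),
]

net_prices_by_bracket = defaultdict(list)
for bracket, grants in records:
    net_prices_by_bracket[bracket].append(COST_OF_ATTENDANCE - grants)

for bracket, net_prices in net_prices_by_bracket.items():
    print(f"{bracket}: ${sum(net_prices) / len(net_prices):,.0f}")

# With these invented records the averages run from about $7,500 at the low
# end to $48,500 at the top -- a single overall average hides a wide range.
```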

Posted in Data and Reporting, Higher Education, Institutional Research Profession | Comments Off