Reflecting

I’m taking a little break from the phone calls and talking with the amazing pool of candidates who have applied for the two IR positions to think about all of the changes taking place at Swarthmore.   Commencement always makes me wistful, but there’s a lot to look forward to.

Everyone has been very busy this year working on the key initiatives that have come from our recent planning process, and these are starting to take shape.   A campus master plan, which will help us prepare for the changes these initiatives may bring, is entering its final stage and presents exciting possibilities.   This spring our students have challenged the College to engage more deeply with many issues of concern, and the poise and compassion of Swarthmore’s leadership in responding to these challenges has been an inspiration to me.   And of course, I’m looking forward to having a fully staffed Institutional Research Office that can support the College’s effectiveness in all of these efforts.

After this incredibly busy year, commencement this weekend, and alumni weekend next weekend, a relatively quieter period to recover and re-energize will surely be welcomed by all of us!

Why IR is hiring

With an increasing amount of my time for the past two years spent with the Provost’s Office and the College in general helping to guide our assessment efforts, the IR Office has been struggling mightily to keep up with our work.   In January we were approved for an additional limited term position in the IR office to help offset the loss of my time.  The need for the position will be reevaluated in three years, which corresponds to the term of the new (second) Associate Provost position.   This is not an accident.  Since both positions are intended to relieve overloads caused, at least in part, by our needs for assessment and institutional effectiveness leadership, it makes sense to review this new infrastructure a few years down the road to see how it is serving us, as our assessment processes improve and our work becomes more routine.

With this additional IR position and Alex’s departure, I’ll be, in a way, replacing both Alex and myself, as I continue focusing more on assessment and on our upcoming accreditation mid-term report.   But while Alex and I shared much of the responsibility for IR reporting and research in the past, I’ll be structuring the two positions to reflect these two key roles more separately.   A Data and Reporting Officer will have primary responsibility for data management and for routine and ad hoc reporting for internal and external purposes.  An Institutional Research Associate (the limited term position) will focus more on special studies and is expected to bring advanced analytic skills to our projects.   These two positions, and mine, will share some responsibilities and have just enough overlap to serve us in those many “all hands” moments.   It should be an exciting time for Institutional Research – and for assessment!

Keeping Score

President Obama announced the new “College Scorecard” in his State of the Union address, and the interactive online tool was released the next day.  The intended purpose of the tool is to provide useful information to families about affordability and student success at individual colleges.  Since then, the IR community has been buzzing.   Much of the data in the tool is reported via the IR offices, and many of us are already being asked to explain the data and the way it is presented.  Several of our listservs became quite busy as my colleagues compared notes on glitches in the lookup feature of the tool (zip code searches were problematic early on) and the accuracy of the data, and debated the clarity of the labels and the wisdom of the simple presentation.

This project is an example of a wonderful goal that is incredibly hard to execute well.   Seeing all the press coverage (both mainstream and higher ed press) and hearing from my colleagues, I think about the balance of such a project.   It seems reasonable that after thorough development and testing, there would be a point at which the best course of action is to just move forward and release it even though it is not perfect.   But where is that point?  One could argue whether this was the correct point for the Scorecard project, but all of the attention is creating increased awareness by the public, as well as pressures on the designers for improvement, and on colleges for accuracy and accountability.

I wonder how many people remember the clunky online tool, COOL (College Opportunities On Line), from the early ’00s, and the growing pains it went through as it evolved into the College Navigator, a pretty spiffy – and very useful – tool for families to find a wealth of information about colleges?   These tools evolve, and if they aren’t useful and effective, they won’t survive.   The trick is not doing more harm than good while the kinks are worked out.

What’s in the Scorecard and where did it come from?   The Scorecard has six categories of information:  Undergraduate Enrollment, Costs, Graduation Rates, Loan Default Rate, Median Borrowing, and Employment.   Information about the data and its sources can be found at the Scorecard website, but it takes a little work!   Click on the far right square that says “About the Scorecard” on the middle row of squares.  From the text that spins up, click “Here”, which opens another window (not sure if these are “pop-ups” or “floating frames”), and that’s where the descriptions are.

The data for the first three items come from our reporting to the federal government through the IPEDS (Integrated Postsecondary Education Data System), which I have posted about before.   Here is yet another reason to make sure we report accurately!  The next two categories, Loan Default Rate and Median Borrowing, get their data from federal reporting through the National Student Loan Data System (NSLDS).   The last item, Employment, provides no actual data, but rather a sly nudge for users of the system to contact the institutions directly.

While each of these measures creates its own challenge to simplicity and clarity of explanation, one of the more confusing, and hence controversial, measures is the “Cost.”   The display says “Net price is what undergraduate students pay after grants and scholarships (financial aid you don’t have to pay back) are subtracted from the institution’s cost of attendance.”  This is an important concept, and we all want students to understand why they should not just look at the “sticker price” of a college, but at what students actually pay after accounting for aid.   Some very expensive private colleges can actually cost less than public institutions once aid is factored in, and this is a very difficult message to get out!

But the more precise definition behind the scenes (that floating frame!) says “the average yearly price actually charged to first-time, full-time undergraduate students receiving student aid at an institution of higher education after deducting such aid.”  The first point of confusion is that this net price is calculated only for first-time, full-time, aided students, rather than averaged across all students.   The second is the actual formula, which takes some more digging.   It uses the “cost of attendance,” which is tuition, fees, room, and board, PLUS a standard estimate of the cost for books, supplies, and other expenses.   The aid dollars include Pell grants, other federal grants, state or local government grants (including tuition waivers), and institutional grants (scholarship aid that is not repaid).   And the third point that may cause confusion is, of course, the final, single figure itself, which is an average, while no one is average.
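To make the formula concrete, here is a minimal back-of-the-envelope sketch in Python.  All of the dollar figures are made up for illustration – they are not Swarthmore’s numbers or actual federal data – but the arithmetic follows the definition above: average grant aid among aided students only, subtracted from the full cost of attendance.

```python
# Sketch of the Scorecard's net price arithmetic (hypothetical figures).

# Cost of attendance: tuition, fees, room, and board, PLUS a standard
# estimate for books, supplies, and other expenses.
COST_OF_ATTENDANCE = 57_000

# Grant/scholarship aid (aid that is not repaid: Pell, other federal,
# state/local, and institutional grants) for first-time, full-time
# AIDED students only -- students receiving no aid are excluded.
grant_aid = [30_000, 42_000, 18_000, 50_000]

avg_aid = sum(grant_aid) / len(grant_aid)   # average across aided students
net_price = COST_OF_ATTENDANCE - avg_aid
print(net_price)  # 22000.0
```

Note how sensitive the single published figure is to who is in that `grant_aid` list – which is exactly why averaging over aided students only can mislead a family whose student would receive little or no aid.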

Will a family dig that deep?   Would they understand the terminology and nuances if they did?   Would they be able to guess whether their student would be an aid recipient, and if so, whether they’d be like the average aid recipient?   The net price presentation that already exists in the College Navigator has an advantage over the single figure shown in the Scorecard, because it shows the value for each of a number of income ranges.   While aid determinations are based on much more than simple income, at least this presentation more clearly demonstrates that the net price for individuals varies – by a lot!

Transition

Just after the winter holidays, Alex shared with me the wonderful and sad news that he would be moving on to another position outside the College.   It’s a great opportunity for him to advance, and also to be closer to his family.   But we’ll miss him a lot!    Alex’s last day was January 25th.    His departure is a loss to the office and to the College.

Check out Alex’s new gig!
Passaic County Community College – Institutional Research and Planning

Walking the Walk

(photo by :::mindgraph:::)

Although we provide support to other areas of the College undertaking assessment projects, like many IR offices ours had yet to fully engage with our own assessment.  Sure, we had articulated goals for our office some time ago, and had reflected on our success on the simpler, numeric ones, but only this year (as part of increased College-wide efforts) did we begin to grapple seriously with how to assess some of the more complex ones.

One of our take-aways from the Tri-College Teagle-funded Assessment project that we’ve been involved with was that the effort to design rigorous direct assessment is key, and the work that goes into thinking about this may sometimes be even more important than the measurement itself.   It’s one thing to observe this, and quite another to experience it firsthand.

As Alex and I looked at our goals, each phrase begged for clarification!  We don’t want to just conduct research studies, we want to conduct effective studies.  But what do we mean by “effective”?   How do we meet our audience’s needs?   How do we even identify those needs?   What is the nature of the relationship between the satisfaction of a consumer of our research and the quality of the research?   And on and on.   Before even identifying the sorts of evidence of our effectiveness that we should look for, we had already identified at least a dozen ways to be more proactive in our work.

The other revelation, which should not have been a surprise because I have said it to others so often, is that there are seldom perfect measures.   Therefore, we need to look for an array of evidence that will provide confidence and direction. There are many ways to define “effective” research – what is most important (and manageable) to know?

And finally, it’s easy to get caught up in these discussions and in setting up processes to collect more and more evidence.   We could spend a lot of time exploring how a research study informed someone’s work and decision-making, but that time could instead have been used to expand the study.   We have to find a balance, so that we do the assessments that are most helpful and don’t end up distracting ourselves from the very work we’re trying to improve.

“Optimal” Faculty to Staff Ratio

An article in the Chronicle today reports on a study by two economists about the optimal faculty to staff ratio.  The study is focused on Research 1 and 2 public institutions, but I couldn’t stop myself from applying the simple math formula to a small liberal arts college, such as Swarthmore, to see what would happen.

We are actually freezing our employee data today, and so I don’t yet have current numbers, but based on last year’s data we had 944 employees – 699 full-time.  The study identifies the optimal ratio as 3 tenure-track faculty to each full-time professional administrator.  Using IPEDS reporting definitions, we had 162 tenured and on-track faculty members last year, and 242 full-time professional administrators (Executive/ Administrative/ Managerial, and Other Professional).   That’s a conservative estimate of “professional administrators,” because it’s unclear to me from the paper which categories are included in the final equation.   All non-faculty staff are considered at different points in their modeling.

So if that 3 to 1 ratio were desirable here, we would need to add 564 tenure-track faculty.   I don’t know how the 242 administrators would manage all the new buildings and infrastructure we’d need.   And our student to faculty ratio would drop to about 2:1.   Alternately, we could get rid of about 188 professional administrators to drop their total to 54.   In that case our 162 faculty would have to start managing housing, administering grants, raising funds, supporting IT, doing IPEDS reporting, etc., in addition to all their regular responsibilities.  I’m sure they’d enjoy that.
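For anyone who wants to check my arithmetic, the whole exercise fits in a few lines of Python, using the figures quoted above from last year’s IPEDS reporting:

```python
# Applying the study's "optimal" 3:1 ratio of tenure-track faculty to
# full-time professional administrators to last year's figures.

TARGET_RATIO = 3      # tenure-track faculty per professional administrator
faculty = 162         # tenured and tenure-track faculty (IPEDS definition)
administrators = 242  # full-time professional administrators

# Option 1: keep all 242 administrators, hire faculty to reach 3:1.
faculty_to_add = TARGET_RATIO * administrators - faculty
print(faculty_to_add)  # 564

# Option 2: keep the 162 faculty, shrink the administrative staff.
admins_allowed = faculty // TARGET_RATIO
admins_to_cut = administrators - admins_allowed
print(admins_allowed, admins_to_cut)  # 54 188
```

Either way the formula is doing all the work – which is rather the point about applying a model estimated on large research universities to a small liberal arts college.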

Guess I’ll just have to wait until these researchers tackle this issue for liberal arts colleges.

Time for a prediction

(photo by Cillian Storm)

I don’t know, is it me?   I think it gets quieter and quieter each year after US News releases its rankings.   Has the publication that all of higher education loves to hate lost its impact?  I saw very little press yesterday, and not even much buzz on the IR listservs, in response to the release of US News’ annual rankings.   Maybe it’s all the bratty little upstart rankings that have begun to get more attention, or that we’ve just reached a point of rankings saturation and there’s nothing more to say.

I’m not big on making predictions.   In fact, whenever anyone asks me to predict what our rank will be, I make a lame joke about leaving my dice at home.   But US News depends heavily on these rankings in their business model, and I wonder if they’re missing the press they used to get.   What they need is some controversy!  I predict that it’s time for US News to “tweak” its methodology, which will result in some upsets in the rankings and presto!  More press!   They could even just update their Cost of Living Adjustment on the Faculty Salary measure – as far as I can tell, they’ve been using the same index since 2002.   That would certainly be defensible, and could have the effect of shaking things up.   But mark my words, SOMETHING will change next year!

The Importance of IPEDS

The IR responsibility of providing summary data to the federal government through the Integrated Postsecondary Education Data System (IPEDS) sounds like as much fun as completing tax forms.   And as a matter of fact that analogy pretty much captures it!   It’s an obligation of all institutions that participate in any kind of Title IV funding programs (federal student financial aid), which means that like death and taxes, it affects just about all of us.  Assembling and providing this information is not always easy, but it’s a responsibility that we take very seriously, and we do our best to work effectively with our colleagues internally so that we provide the most accurate data possible.

I recently attended a workshop to become a “trainer” for IPEDS.  The Association for Institutional Research (AIR) works with the National Center for Education Statistics (NCES) to provide training and support for both submitting data and using the data that NCES makes available to the public.   It’s a really wonderful program of online tutorials, face-to-face workshops, and other activities that promote understanding of this important resource, and I’m excited about being involved.  I’ve always been a girl scout about this stuff anyway, but the workshop reinforced just how valuable and PUBLIC! a resource this is.  Once submitted (and after the agency’s review and consistency checking) this information becomes available to the public through the IPEDS Data Center.   That means that anyone can use it …and they do!   Policy analysts, legislators, reporters, grant agencies, prospective students, administrators at peer institutions, accreditors, job-seekers, higher education researchers – the list is endless.    The accuracy of the data can reflect on individual institutions – you really don’t want to show up on the U.S. Department of Education’s list of institutions with the fastest-increasing tuition because you couldn’t be bothered to double-check your numbers – but it also has implications for policy, research conclusions, and many other decisions that affect the higher education community.