The Sport of IR

image via theunforgettablebuzz.com

I was watching the NFL season-opening game last night. I’m not actually a football fan, but when your husband writes a book connected to football, it’s one of the sacrifices you make. (I have also watched DOTA tournaments with my son, and thought it made about as much sense as professional football. What can I say, I love my guys.) I was struck by the between-play graphics of the players and their stats, and got to wondering (it wasn’t as if the game held my attention) what kinds of pictures and stats would be shown on a highlights reel of Institutional Researchers. (You don’t know, it could happen.)

I imagined a shot of myself at a recent discussion of the merits of assessment, face straining to appear as though I was hearing the comments for the first time instead of the hundredth. (That would be akin to football’s “tough warrior” pose.) For many in IR the classic shot would be hunched over a computer, reading glasses on, vertical concentration lines above the nose… followed by a sudden arm-raised celebration when the stat program finally ran correctly. The casual shot would be a laughing one, as most IR folks I know are very friendly and have a fantastic sense of humor.

The stats would be fun. An old IR friend from Maryland used to count down his retirement by the number of factbooks he had left to do: “three more factbooks until I retire!” So a count of factbooks you’ve done could be one stat. Another might be how many accreditation self-studies you’ve supported. Twice earlier in my career I managed to change jobs just after a reaccreditation was completed at the new institution. I thought it might be a wise strategy to continue, but alas, I fell in love with my current institution. How many Presidents or other senior staff members you’ve adjusted to would certainly be a reflection of a certain level of skill attainment (or at least endurance). The speed with which one could generate a presentation-quality summary table would be like the 40-yard dash time. The IR Director is kind of like a quarterback, so there could be special stats for that player. I’m getting pretty good at evading tackles and the hurry-up offense, though I’m not sure how to measure that.

Our business cards ought to be more like sports trading cards that make use of these kinds of images and stats.   These would be much more fun to share at meetings and conferences.   (Excuse me while I call the folks at VistaPrint…)

 

Posted in Institutional Research Profession, Planes overhead | Leave a comment

Video and SPSS Syntax: Deleting Select Cases Using the National Student Clearinghouse Individual Detail Return File

There may be some situations where you would want to delete select records from an individual return file. For example, you may have a project where you are looking at student enrollment after graduation or transfer, and it is decided that your particular project will only include records for which a student was enrolled for more than 30 days in a fall/spring term or more than 10 days in a summer term. Or, you may have six years of records for a particular cohort, but you only want to examine records for four years. In both of these cases, you would want to delete the records that don’t fit your criteria before analyzing your data.

In this video & in the syntax posted below, I’m going to:

  1. Remove records for enrollment of 30 days or less for a fall/spring term or 10 days or less for a summer term
  2. Remove records that occurred outside of a particular date range (while I don’t highlight the beginning part of the range in the video, having a range allows you to make sure you are not picking up courses, such as late-ending summer courses, that you may not want in your file).

I have kept the syntax separate for the two so that they can be used independently of each other if need be for different projects. Additionally, while I selected 30 days as my cut off for a fall/spring term and 10 days as my cut off for a summer term for this example, selecting the appropriate number for a project can get complicated, especially if winter term or other accelerated sessions are in the mix.
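The filtering logic described above can also be prototyped outside of SPSS. Here is a minimal sketch in Python (the post’s actual syntax is SPSS; the field names, term labels, and dates below are illustrative placeholders, not the Clearinghouse’s real column names):

```python
from datetime import date

# Hypothetical records shaped like a detail return file: one row per
# enrollment, with a term type and begin/end dates (illustrative fields).
records = [
    {"id": "A1", "term": "fall",   "begin": date(2013, 9, 1),  "end": date(2013, 12, 15)},
    {"id": "A1", "term": "summer", "begin": date(2014, 6, 2),  "end": date(2014, 6, 6)},
    {"id": "B2", "term": "spring", "begin": date(2014, 1, 20), "end": date(2014, 2, 5)},
]

# Cutoffs from the post's example: more than 30 days for fall/spring,
# more than 10 days for summer, plus a project-specified date range.
MIN_DAYS = {"fall": 30, "spring": 30, "summer": 10}
RANGE_START, RANGE_END = date(2013, 8, 1), date(2014, 5, 31)

def keep(rec):
    days = (rec["end"] - rec["begin"]).days
    in_range = RANGE_START <= rec["begin"] <= RANGE_END
    return days > MIN_DAYS[rec["term"]] and in_range

filtered = [r for r in records if keep(r)]  # only A1's fall record survives
```

As with the SPSS syntax, the two criteria are independent, so either condition can be dropped from `keep` for a project that needs only one of them.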

I’m going to run the deletion of cases in two ways:

  1. Keeping only those with records found that meet the criteria (deleting the no record found cases, the cases where the only record found reflected too short an enrollment period, and the cases where the only record found fell after a particular date).
  2. Keeping a record for everyone, including the no record found cases and, after modification, those with only one record that has a) too few enrollment days or b) occurs outside of a project-specified date range.
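The second approach, keeping a row for everyone, can be sketched the same way. This is an illustrative Python sketch rather than the post’s SPSS syntax, and the field names are hypothetical:

```python
from collections import defaultdict

# One row per returned record; "N" marks a no-record-found case
# (hypothetical field names, for illustration only).
rows = [
    {"id": "S1", "record_found": "Y", "days_enrolled": 95},
    {"id": "S2", "record_found": "Y", "days_enrolled": 12},
    {"id": "S3", "record_found": "N", "days_enrolled": 0},
]

MIN_DAYS = 30  # the post's fall/spring cutoff

by_student = defaultdict(list)
for r in rows:
    by_student[r["id"]].append(r)

output = []
for sid, recs in by_student.items():
    good = [r for r in recs
            if r["record_found"] == "Y" and r["days_enrolled"] > MIN_DAYS]
    # Keep qualifying records; otherwise keep a single placeholder row
    # so the student is not dropped from the file.
    output.extend(good if good else [{"id": sid, "record_found": "N", "days_enrolled": 0}])
```

Students whose only records fail the criteria end up looking like no-record-found cases, which is exactly the modification described in option 2.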

This syntax opens in a txt document; simply copy/paste all into an SPSS syntax file and run.
Note that a few comments have been added to the syntax file since the video was recorded.

Deleting Select Cases From a Detail Individual Return File

Posted in Data and Reporting, Higher Education, Institutional Research Profession, Swarthmore | Tagged | Leave a comment

Video and SPSS Syntax: Admit/Not Enroll Project Using the National Student Clearinghouse Individual Detail Return File

I use the National Student Clearinghouse individual detail return file and SPSS syntax in this video to capture the first school attended for students who were admitted to my institution, but who did not enroll (names listed are not real applicants). In a future video, I’ll work on the same project using the aggregate report. I almost always use the individual detail return file since it provides so much information, but it does have a limitation that impacts this project.

If a school or student blocks records, they will appear in the aggregate report, but they will not appear in the individual detail return file. Some schools block the individual records (although they can still provide graduation information). So, if a lot of your admitted students do enroll at one of these schools, you might decide to use the aggregate report. However, the individual detail file provides more information, which allows for more flexibility.

Also keep in mind: if you are working with multiple years, admits from earlier years have had longer to attend school, since the National Student Clearinghouse returns records up to the date the file is run. You may want to review the earliest enrollment dates by year; if nearly all of your admit/not enrolls attend school shortly after their admit cohort term, you may be fine, but if you see a large percentage attending school for the first time years after they were admitted to your institution, you may want to cut records after a particular date before unduplicating (while I don’t review those steps in this video, I will detail them in a future post).
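The core of the unduplication step, taking the earliest enrollment record per student as the “first school attended,” can be illustrated with a small Python sketch (the post itself uses SPSS; the names and fields here are made up):

```python
from datetime import date

# Hypothetical detail-return rows after "no record found" cases
# have been removed (illustrative field names, not real students).
rows = [
    {"id": "S1", "college": "College X", "begin": date(2014, 9, 2)},
    {"id": "S1", "college": "College Y", "begin": date(2015, 1, 12)},
    {"id": "S2", "college": "College Z", "begin": date(2014, 8, 25)},
]

# Sort by enrollment begin date, then keep the first row seen per student.
first_school = {}
for r in sorted(rows, key=lambda r: r["begin"]):
    first_school.setdefault(r["id"], r)
```

Each admitted-but-not-enrolled student then contributes exactly one row, carrying the name of the first institution they attended.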

This syntax opens in a txt document; simply copy/paste all into an SPSS syntax file and run.

Admit Not Enrolled Individual Detail Return File SPSS Syntax


Posted in Data and Reporting, Higher Education, Institutional Research Profession, Swarthmore | Tagged | Comments Off

Decisions, Decisions

I’ve been working with data from the National Student Clearinghouse (NSC) for a while now. A lot of wonderful information can be found in the NSC data, but the detailed return file can sometimes be a bit difficult. There are so many ways the data can be sliced, and it can sometimes be challenging to determine how best to work with the data to present meaningful information to stakeholders.

For those unfamiliar with NSC data, the detail return file for a Subsequent Enrollment (SE) search lists all enrollment records for a person that occur during or after a specific search date. This means that, depending on the search date, each person potentially has many, many rows of data in the return file, each reflecting a different enrollment. Students will have a record for each enrollment term at an institution, but they will also have multiple enrollment records in a specific term if they took classes at more than one institution that term (this can be true even if the student dropped a class at one of the institutions), or if they were taking a class the same term at a school that reports parts of the school differently (for example, “School A” and “School A Graduate”). There are also separate rows for graduation records (and sometimes multiple rows for the same graduation).

One of the major decisions that must be made when working with NSC data is the trumping order that determines which record or records are selected for each student. Trumping schemes depend on the particular project; the same return file could be utilized in different ways for different projects, each with a goal of highlighting different information.
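A trumping order is essentially a sort key applied per student. As an illustration (in Python rather than the SPSS syntax these posts use, with a made-up scheme in which graduation beats enrollment, 4-year beats 2-year, and earlier dates break ties):

```python
from datetime import date

# Hypothetical rows and a project-specific trumping scheme (illustrative only).
rows = [
    {"id": "S1", "level": "2-year", "graduated": False, "begin": date(2013, 9, 1)},
    {"id": "S1", "level": "4-year", "graduated": False, "begin": date(2014, 9, 1)},
    {"id": "S2", "level": "2-year", "graduated": True,  "begin": date(2013, 9, 1)},
]

def rank(rec):
    # Lower tuples win: graduation first, then 4-year, then earliest begin date.
    return (
        0 if rec["graduated"] else 1,
        0 if rec["level"] == "4-year" else 1,
        rec["begin"],
    )

best = {}
for r in rows:
    if r["id"] not in best or rank(r) < rank(best[r["id"]]):
        best[r["id"]] = r
```

A different project would simply swap in a different `rank` function; the selection loop stays the same.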

Example of what some of the pieces of an individual detail return file might look like:


For example, one project may focus on graduates of a particular graduating class, or transfer-outs who had a similar final term (no matter when they first enrolled), to examine subsequent enrollment and graduation at other institutions. While it would be important to keep the graduation record at the other school, since we also want to examine success in this project, for those who didn’t graduate, do we want to use their first chronological record, or the last? Do we want to select 4-year schools over 2-year schools (but what if they graduated from a 2-year school)? Do we want to include all records, or cut those who weren’t enrolled for x number of days (and does that number need to vary between a summer and a fall/spring term)?

Another project may focus on the progression & completion of a cohort, which would utilize multiple terms of data. However, it would still need to be decided if records with fewer than x number of days of enrollment should be cut, and how to handle multiple records in the same term. Additionally, as start and end dates at schools differ (and a few have winter terms that can overlap with the spring terms of other institutions), clear term definitions need to be set while knowing that there is no perfect answer.

Over the next few months, I’d like to share some of the SPSS syntax that I’ve developed in working with NSC data. For my first project, I’ll be posting a short video explanation & syntax for using NSC data to examine the enrolling institution for students who were admitted to your institution, but who did not enroll.

Posted in Data and Reporting, Institutional Research Profession, Swarthmore | Tagged | 2 Comments

Rules and Regs

The College has just submitted its Periodic Review Report (or PRR) to the Middle States Commission on Higher Education, our accrediting agency. The PRR is an “interim” report, provided at the midpoint between our decennial self-studies. Though it is not quite the bustle of a self-study – e.g. the bulk of the work is accomplished by one committee that works with others across campus, rather than a multitude of committees, and there is no on-site visit from a team of examiners – it is an important accreditation event that takes a great deal of time and work to prepare. A team of external evaluators will examine our document and accompanying materials to determine whether we are making progress in the areas identified in our last self-study, and continue to meet the standards of excellence required by the Commission. They will make recommendations to the Commission, and the Commission will review our accreditation status in the fall.

Our PRR steering committee worked through last year to revisit our last self-study and report of our examiners, and assess just how we have been doing in the many areas addressed.  The last five years have been a time of considerable change at the College – we came through the economic downturn, had a presidential transition, developed a new strategic plan, and are in the implementation stages of many new initiatives arising from the plan.   The committee had a lot to grapple with.  Their work was assembled and a report drafted through the summer and fall, and shared with the community for comment and input.   A final draft was prepared this spring.  Summarizing all of that activity in one document was difficult, but the result attests to how busy we have all been!

Middle States Federal Regulation Form

We were in great shape on the timing of this process; we had planned and paced ourselves well. Our review gave us a good sense of the areas where we have made good progress and those we need to focus on. We would complete our report before the deadline. But there is another part of the PRR submission process that has been revised since our last accreditation event – the “Verification of Compliance with Accreditation-Relevant Federal Regulations.”

Of course we comply with all federal regulations, but this new form requires an assemblage of evidence, and this task had not been on our radar. Doh! So much for being ahead of schedule. Off we scurried to locate our written policies and figure out what evidence we could gather to demonstrate that we are indeed adhering to them. We discovered that we actually did need to clarify and better codify a few things, particularly how we assign credit value to a course. This was one of those things that we have done the same way for as long as anyone living can remember. We’ve never had a problem, and so it didn’t occur to us that our policy might not be crystal clear. But it wasn’t! So we revisited and tweaked our policy and will be giving it more attention going forward. This was not a fun exercise, but it is probably exactly the result intended by this new requirement. Having a vague policy that relies on everybody simply following tradition could be a problem waiting to happen – and one which we have avoided.

Posted in Higher Education, Swarthmore | Comments Off

In Case of IR…

photo by Andy Shores


Posted in Institutional Research Profession, Miscellanea | Tagged | Comments Off

Anecdotes versus Data: A False Dichotomy

Photo by Image Editor

Anecdotes often get a bad rap – sometimes deservedly. We have all seen examples of narratives plucked from the public smorgasbord and used to prop up a preconceived ideology. Given the prevalence of this often irresponsible and manipulative use of narrative [discussed further in the Huffington Post’s “The Allure of an Anecdote”], it is easy to lose faith in the power of stories. This periodically leads to a surge in demand for hard data and evidence regarding everything from healthcare to higher education. But data and statistics take their fair share of heat as well. For one thing, it turns out that data analysis is subjective too. Data can be manipulated, massaged, and infused with bias. And strictly ‘objective’ quantitative analysis tends to come across as cold, devoid of feeling, and uninteresting. We know enough to know that numbers never tell the whole story. Standardized testing alone is a grossly inadequate assessment of educational enrichment, and when organizations uncompromisingly focus on ‘the bottom line,’ it makes most of us uncomfortable at best.

This methodological tension is an exemplar of how the solution is rarely to be found in the extremes. Unfortunately, these two approaches to knowing the world have such strong advocates and detractors that we are often drawn toward diametrically opposed camps along a false continuum. Compounding the problem is that shoddy and irresponsible research at both ends of the spectrum is regularly circulated in mainstream media outlets.

This divorce is particularly problematic given that quality science, good journalism, and effective research tend to integrate the two. So-called “hard data and evidence” need narrative and story to provide validity, context and vitality. On the other hand, anecdotes and narratives need “hard data and evidence” to provide reliability, and to help separate the substantive from the whimsical. In responsible and effective research and analysis, the methodological dichotomy is brought into synergy, working together as structure and foundation, flesh and bone. The Philadelphia Inquirer printed a series on poverty in 2010 that serves as a good example from the field of journalism [“A Portrait of Hunger”]. Done effectively, data and narrative are inextricably melded into a seamless new creation.

In my short time thus far in Institutional Research at Swarthmore, I have been impressed by many things, one of which is the simultaneous respect for research and evidence-based decision making alongside respect for stories, nuance, and humanity. When the values and mission of a college call for an environment that respects both, it facilitates the practice of effective and balanced institutional research.

Posted in Data and Reporting, Institutional Research Profession | Comments Off

Time flies…

The fall has been whizzing by, and here it is Halloween already. I shouldn’t be surprised; it’s been busier than ever. (I know. I say that every year.) We’ve completed two massive projects involving tracking student enrollment and outcomes over multiple cohorts and years, and another one that was small potatoes after those. The Associate Provost and I have spent a ton of time building on the considerable work of our Middle States Periodic Review Report (PRR) Steering Committee to create a first draft of the report. Our new IR staff members and I have worked together to get up to speed (including the fall freeze and fall IPEDS reporting). We’ve fielded four, no, five! surveys (so far), and are getting pretty darned good at Qualtrics. (Nothing like troubleshooting to help you learn something.) I’ve gone through the CITI training for IRB and feel incredibly ethical. And of course we’ve dealt with all the usual ad hoc requests and miscellany. But with some of this big stuff behind us, and what is turning out to be a terrific IR team, it’s like the sun coming out. For the first time in a year it seems we’re almost caught up. We freeze employee data tomorrow, so that may not last long.

Posted in Institutional Research Profession | Comments Off

Happy New Year!


photo by Darwin Bell

Having worked in higher education for all of my adult life, I’ve never gotten over that “kid” feeling that September represents a new year.   More than January, it offers new beginnings and possibilities.   Faculty members come back from their summer activities recharged, and with new ideas and projects.   Our students return, literally, in earnest.   The quiet, sunny paths become challenging to navigate as people Have to Get Somewhere.   My new year’s resolution for this fall is to enjoy these moments.   I love helping a first-year student find a building, or hearing a student talk excitedly on their cell phone to a parent about a new class.  Or seeing a faculty member help a new colleague understand our customs and practices.  (“What’s a WA?!”)  When not too busy helping faculty and staff with their new ideas and projects, Institutional Research has a moment to catch its breath before our fall freeze, and watch the excitement.    My wish for the College in this new year is peace, love, and understanding.

Posted in Institutional Research Profession, Swarthmore | Comments Off

A New Era

I am so very happy to share the news that today marks the beginning of a new era for Swarthmore’s Institutional Research Office! We are now fully staffed, in our new configuration. Pamela Borkowski-Valentin is our Data and Reporting Officer, and Jason Martin is our Institutional Research Associate.

The Data and Reporting Officer is responsible for creating and managing primary IR datasets (including admissions, student, course, degree, faculty, and staff data extracted from Banner; survey data; and peer data), and using this data to meet both internal and external informational needs. She produces the Common Data Set, the Fact Book and other basic data summaries; responds to mandated state and federal reporting; and addresses ad hoc requests for information.

Pamela has a BA in Sociology, and Master’s degrees in Social Service and in Law and Social Policy (from Bryn Mawr). She has worked for the last five years as a Research Analyst in the Montgomery County Community College (MCCC) IR office, where she supported the data needs of faculty and administrators by gathering and analyzing institutional data, creating and conducting surveys, and writing reports on a range of topics. She is adept at working with internal data as well as using external data to inform institutional questions. Pamela has a broad understanding of the challenges facing college students, gained and enhanced through her background in social work, her internship experience at the Bryn Mawr Civic Engagement Office, and her work with two particular programs at MCCC: the KEYS program, which focuses on disadvantaged students, and the Minority Male Mentoring Program.

The Institutional Research Associate is a social scientist, expected to use descriptive, exploratory, and inferential statistical techniques to focus on special issues and research projects. He is primarily responsible for more in-depth studies, such as the factors that influence students to change majors or the effectiveness of peer mentoring programs.

Jason has a BA, an MA, and recently completed a PhD, all in Sociology. His dissertation topic was “Marketization in the Nonprofit Arts and Culture Sector: An Organizational Analysis.” He has worked for the last six years as a Research Analyst for the Metropolitan Philadelphia Indicators Project (MPIP), as well as doing some consulting work. At MPIP he conducted both quantitative and qualitative (e.g. focus groups) research studies that examined how region-based factors such as education and economy relate to the activities of non-profit organizations. He has a broad base of statistical skills to help him reveal meaning in data, and he is skilled at reporting findings clearly, without sacrificing rigor or “talking down” to his audience.

I hope our Swarthmore friends will stop by to welcome Pam and Jason!

Posted in Uncategorized | Comments Off