Part Two: From Geographic Location to Neighborhood Profile
In Part One of this two-part blog post I explained how to start with a list of street addresses and, using Google Fusion Tables, map those locations onto an interactive Google Map. This tool alone can be very useful and powerful in the context of institutional research and administration. Spatial analysis becomes significantly more powerful, however, when you use these known locations to learn more about the specific communities and neighborhoods of students and alumni. Through the use of spatial analysis software, these “point” locations can be tied directly to zip codes, census tracts, block groups, Congressional districts, etc. From there, geographic data from the Census Bureau’s American Community Survey or other sources can be used to build a community and neighborhood profile of students and alumni. Such a profile is only a proxy for the individual, and we always need to be aware of the ecological fallacy, but you can gain immense and detailed understanding of a group just by learning about its spatial location.
The following is a guide for taking individual records (including street addresses), overlaying geographic boundaries (such as tracts, zip codes, etc.), joining (or combining) the individual records with their respective geographic descriptors (e.g. Student A lives in zip code 12345), and finally, joining geography-based data from the Census Bureau’s American Community Survey to those individual records (e.g. Student A lives in zip code 12345, which has a population of 3,500, a median household income of $65,000 per year, and so on).
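As a minimal sketch of that final joining step, here is how individual records that have already been tied to a geography can be combined with geography-level data. This assumes pandas, and every column name and value below is illustrative, not taken from any actual student or ACS file:

```python
import pandas as pd

# Hypothetical individual records, already tied to a zip code
# (names and columns are illustrative only)
students = pd.DataFrame({
    "student_id": ["A", "B", "C"],
    "zip_code": ["12345", "12345", "67890"],
})

# Hypothetical zip-level figures in the style of an ACS extract
acs = pd.DataFrame({
    "zip_code": ["12345", "67890"],
    "population": [3500, 8200],
    "median_hh_income": [65000, 52000],
})

# Left join: every student keeps their row and gains the
# profile of the geography they live in
profiled = students.merge(acs, on="zip_code", how="left")
print(profiled)
```

The same pattern works for any geographic descriptor (tract, block group, district) once each record carries that identifier; the earlier overlay step is what supplies it.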
Many offices at the College receive requests for information from agencies, researchers, peer institutions, and others, and it can be difficult to judge which are worthwhile and which are not. A department chair recently asked me about how to decide which requests require a response. Continue reading IR Triage
Part One: Making Use of Mapping in Institutional Research
A good visual can be a helpful tool: the job of an Institutional Researcher is to keep the attention of people who have many important demands on their time and little of it to spend wading through long explanations. Enter mapping and spatial analysis. Maps generally make for familiar, easy-to-read, and aesthetically pleasing images that grab viewers’ attention and, when carefully constructed, do a very good job of communicating information. They are nice to look at, but they can also tell us relevant and important things about our institution. In this post (Part One) I illustrate a simple but effective technique for mapping point locations. In a follow-up post (Part Two), I discuss some of the deeper applications of mapping and spatial analysis in Institutional Research.
I was watching the NFL season-opening game last night. I’m not actually a football fan, but when your husband writes a book connected to football, it’s one of the sacrifices you make. (I have also watched DOTA tournaments with my son, and thought it made about as much sense as professional football. What can I say, I love my guys.) I was struck by the between-play graphics of the players and their stats, and got to wondering (it wasn’t as if the game held my attention), what kinds of pictures and stats would be shown on a highlights reel of Institutional Researchers. (You don’t know, it could happen.) Continue reading The Sport of IR
There may be some situations where you would want to delete select records from an individual return file. For example, you may have a project where you are looking at student enrollment after graduation or transfer, and it is decided that your particular project will only include records for which a student was enrolled for more than 30 days in a fall/spring term or more than 10 days in a summer term. Or, you may have six years of records for a particular cohort, but you only want to examine records for four years. In both of these cases, you would want to delete the records that don’t fit your criteria before analyzing your data.
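The filtering described above can be sketched with pandas. The thresholds come from the example in the text (more than 30 days in fall/spring, more than 10 days in summer), but the column names and sample rows are assumptions for illustration:

```python
import pandas as pd

# Illustrative enrollment records; these columns are assumed,
# not the actual layout of any return file
records = pd.DataFrame({
    "student_id":    ["A", "A", "B", "C"],
    "term_type":     ["fall", "summer", "spring", "summer"],
    "days_enrolled": [95, 8, 20, 15],
})

# Keep fall/spring enrollments over 30 days and summer
# enrollments over 10 days; drop everything else
is_regular = records["term_type"].isin(["fall", "spring"])
keep = (is_regular & (records["days_enrolled"] > 30)) | (
    ~is_regular & (records["days_enrolled"] > 10)
)
filtered = records[keep]
print(filtered)
```

The six-years-to-four case works the same way: build a boolean condition on an academic-year column and keep only the rows that satisfy it.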
I use the National Student Clearinghouse individual detail return file and SPSS syntax in this video to capture the first school attended for students who were admitted to my institution, but who did not enroll (names listed are not real applicants). In a future video, I’ll work on the same project using the aggregate report. I almost always use the individual detail return file since it provides so much information, but it does have a limitation that impacts this project.
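The video uses SPSS syntax, but the core "first school attended" step can also be sketched in pandas: sort each student's enrollment records by start date and keep the earliest one. The column names and rows below are mock data, not the actual NSC detail-file layout:

```python
import pandas as pd

# Mock NSC-style detail rows: one row per enrollment spell,
# possibly several per student (all values illustrative)
nsc = pd.DataFrame({
    "student_id":   ["A", "A", "B"],
    "college_name": ["Elsewhere U", "Transfer State", "Other College"],
    "begin_date":   pd.to_datetime(["2013-09-01", "2014-09-01", "2013-08-25"]),
})

# Sort by start date, then take each student's earliest record
# to identify the first school attended
first_school = (
    nsc.sort_values("begin_date")
       .groupby("student_id", as_index=False)
       .first()
)
print(first_school)
```

In practice the detail file would first be limited to admitted non-enrollees before this step, which is the part the aggregate report handles differently.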
I’ve been working with data from the National Student Clearinghouse (NSC) for a while now. A lot of wonderful information can be found in the NSC data, but the detailed return file can sometimes be a bit difficult. There are so many ways the data can be sliced, and it can sometimes be challenging to determine how best to work with the data to present meaningful information to stakeholders.
The College has just submitted its Periodic Review Report (or PRR) to the Middle States Commission on Higher Education, our accrediting agency. The PRR is an “interim” report, provided at the midpoint between our decennial self-studies. Though it is not quite the bustle of a self-study – e.g. the bulk of the work is accomplished by one committee that works with others across campus, rather than a multitude of committees; there is no on-site visit from a team of examiners – it is an important accreditation event that takes a great deal of time and work to prepare. Continue reading Rules and Regs
Anecdotes often get a bad rap – sometimes deservedly. We have all seen examples of narratives plucked from the public smorgasbord and used to prop up a preconceived ideology. Given the prevalence of this often irresponsible and manipulative use of narrative [discussed further in the Huffington Post’s “The Allure of an Anecdote”], it is easy to lose faith in the power of stories. This periodically leads to a surge in demand for hard data and evidence regarding everything from healthcare to higher education. But data and statistics take their fair share of heat as well. For one thing, it turns out that data analysis is subjective too. Data can be manipulated, massaged, and infused with bias. And strictly ‘objective’ quantitative analysis tends to come across as cold, devoid of feeling, and uninteresting. We know enough to know that numbers never tell the whole story. Standardized testing alone is a grossly inadequate assessment of educational enrichment, and when organizations uncompromisingly focus on ‘the bottom line,’ it makes most of us uncomfortable at best.
This methodological tension is an exemplar of how the solution is rarely to be found in the extremes. Unfortunately, these two approaches to knowing the world have such strong advocates and detractors that we are often drawn toward diametrically opposed camps along a false continuum. Compounding the problem is that shoddy and irresponsible research at both ends of the spectrum is regularly circulated in mainstream media outlets.
This divorce is particularly problematic given that quality science, good journalism, and effective research tend to integrate the two. So-called “hard data and evidence” need narrative and story to provide validity, context and vitality. On the other hand, anecdotes and narratives need “hard data and evidence” to provide reliability, and to help separate the substantive from the whimsical. In responsible and effective research and analysis, the methodological dichotomy is brought into synergy, working together as structure and foundation, flesh and bone. The Philadelphia Inquirer printed a series on poverty in 2010 that serves as a good example from the field of journalism [“A Portrait of Hunger”]. Done effectively, data and narrative are inextricably melded into a seamless new creation.
In my short time thus far in Institutional Research at Swarthmore, I have been impressed by many things, one of which is the simultaneous respect for research and evidence-based decision making alongside respect for stories, nuance, and humanity. When the values and mission of a college call for an environment that respects both, it facilitates the practice of effective and balanced institutional research.