Survey length and response rate

We are often asked what can be done to bolster response rates to surveys. There are many ways to encourage responding, but one factor that is often dismissed by those conducting surveys is the length of the survey itself. People are busy, and with so many things in life demanding our attention, a long survey can be particularly burdensome, if not downright disrespectful.

Below is a plot of the number of items on recent departmental surveys and their response rates.  The line depicts the relationship between length of survey and responding (the regression line, for our statistically-inclined friends).

[Scatterplot of survey length (number of items) and percent responding, showing an inverse relationship: the longer the survey, the fewer responses.]

Aside from shock that someone actually asked a hundred questions, what you should notice is that as the number of items goes up, responding goes down. This is a simple relationship, determined from just a small number of surveys. Even if I remove the two longest surveys, a similar pattern holds. Of all the things that could affect responding (appearance of the survey, affiliation with the requester, perceived value, timing, types of questions, and many, many other things), that this single feature can explain a chunk of the response rate is pretty compelling!
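For our statistically inclined friends, here is a minimal sketch of the kind of regression behind the plot. The numbers are made up for illustration only; they are not our actual departmental survey data.

```python
# A sketch of regressing response rate on survey length,
# using hypothetical data (not the actual departmental surveys).
from scipy import stats

# Number of items on each (hypothetical) survey and its response rate (%)
num_items = [10, 15, 20, 25, 40, 60, 80, 100]
response_rate = [72, 68, 65, 60, 52, 45, 38, 30]

# Ordinary least-squares regression of response rate on number of items
result = stats.linregress(num_items, response_rate)
print(f"slope: {result.slope:.2f} percentage points per additional item")
print(f"r-squared: {result.rvalue**2:.2f}")

# Re-fit without the two longest surveys to check that the pattern holds
trimmed = stats.linregress(num_items[:-2], response_rate[:-2])
print(f"slope without the two longest surveys: {trimmed.slope:.2f}")
```

The slope estimates how many percentage points of response rate are lost with each added item, and the r-squared gives a rough sense of how much of the variation in responding survey length alone can account for.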

The “feel” of length can be softened by layout – items with similar response options can be presented in a matrix format, for example.  But the bottom line is that we must respect our respondents’ time, and only ask them questions that will be of real value and that we can’t learn in other ways.

Moral:  Keep it as short as possible!

(For more information about conducting surveys, see the “Survey Resources” section of our website.)

Experiences that matter

[Image: kitten looking in a mirror sees a lion.]

We recently heard a talk by Josipa Roksa, coauthor (with Richard Arum) of Academically Adrift, the study that concluded that students aren't learning very much in college and that captured the attention of the higher ed community and the public earlier this year. Hers was the keynote address at the June conference in Philadelphia of the Higher Education Data Sharing ("HEDS") consortium, a group of over 100 liberal arts colleges and a few universities to which Swarthmore belongs. We share research and planning tools and techniques. It's a great group of IR types, and Alex and I were lucky to have the meeting in our back yard.

At the meeting Roksa shared with our group some of the findings from the two years of research conducted since the book was completed. Among other things, the researchers have explored experiences that positively affect student performance. Some of the things that mattered were: faculty having high expectations for students; more rigorous course requirements; time students spent studying alone (time spent in informal group study had a negative impact!); and department of major (some majors showed more gains than others). One of the hopeful notes Roksa struck at the end of her talk was that researchers are now having some success identifying the practices that improve student learning; the key is to ensure that more students actually get to experience them.

This got me thinking about the importance of expectations and norms (maybe my roots as a social psychologist are showing). Swarthmore is a place where intense intellectual activity is just part of the ethos. What is interesting is that while the faculty are certainly demanding of students, students' interest in working hard is a self-perpetuating characteristic. They choose to come here because that is the environment they see when they visit, and that's what they want. Once here, they do work hard, reinforcing the norm. We're very fortunate to have an environment where practices critical for positive learning experiences are so firmly established. It's easier to consider the implications of studies such as this when a strong foundation is already in place.

References

Arum, R., & Roksa, J. (2011). Academically adrift: Limited learning on college campuses. Chicago: University of Chicago Press.

The Higher Education Data Sharing (HEDS) Consortium (http://www.e-heds.org/) assists member institutions in planning, management, institutional research, decision support, policy analysis, educational evaluation, and assessment.