
Superficial Survey Choice: An Experimental Test of a Potential Method for Increasing Response Rates and Response Quality in Correctional Surveys

  • Original Paper
  • Journal of Quantitative Criminology 30, 265–284 (2014)

Abstract

Objectives

Drawing on prior theoretical and empirical work on survey participation, this study develops one potential method for increasing response rates and response quality in correctional surveys. Specifically, we hypothesize that providing inmates with a superficial survey choice (SSC)—that is, a choice between completing either of two voluntary surveys that are actually differently ordered versions of the same questionnaire—will increase their motivation both to participate in a given survey and to respond thoughtfully to the questions asked therein.

Methods

We test the effectiveness of this method by evaluating its impact on unit nonresponse, item nonresponse, and answer reliability. To do this, we analyze experimental data from a recent survey of male inmates incarcerated in a medium-security private prison.

Results

Findings indicate that the overall response rate is higher among inmates who are provided a survey choice. In addition, the evidence shows that the SSC method increases the percentage of individual items completed, the number of demanding questions completed, and the reliability of reported responses.

Conclusion

The results from the analyses are consistent with the hypotheses that motivated this study and suggest that the SSC method holds promise as a tool for correctional researchers.
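
As a concrete illustration of the outcome measures described in the abstract, the following minimal sketch shows how unit nonresponse, item nonresponse (Percent Complete), the Demanding Item Count, and a simple reliability check might be computed from a flat data file. The file name, column names, and the list of "demanding" items are assumptions made for illustration; they are not the authors' actual data structure.

```python
# Hedged sketch: computing the outcomes named in the abstract from a
# hypothetical flat file. All column names, the file name, and the list of
# "demanding" items are assumptions made for illustration only.
import pandas as pd

df = pd.read_csv("inmate_survey.csv")              # one row per sampled inmate
item_cols = [c for c in df.columns if c.startswith("q")]
demanding = ["q12", "q27", "q44"]                  # assumed demanding items

# Unit nonresponse: share of sampled inmates who completed no survey at all.
response_rate = df["returned"].mean()

participants = df[df["returned"] == 1].copy()

# Item nonresponse: percent of individual items each participant answered.
participants["percent_complete"] = participants[item_cols].notna().mean(axis=1) * 100

# Response quality: number of demanding items answered.
participants["demanding_item_count"] = participants[demanding].notna().sum(axis=1)

# Answer reliability: association between two alternate self-control indices.
reliability = participants["self_control_1"].corr(participants["self_control_2"])

print(f"Unit response rate: {response_rate:.2f}")
print(participants.groupby("ssc_condition")[["percent_complete",
                                             "demanding_item_count"]].mean())
print(f"Self-Control 1 / Self-Control 2 correlation: {reliability:.2f}")
```

Comparing these quantities across the choice and no-choice conditions is, in essence, what the Results section summarizes.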


Notes

  1. Indeed, based on their experiences interviewing inmates, Fox et al. (2011: 318) suggest that one takeaway lesson for researchers conducting correctional surveys is to “anticipate low response rates.”

  2. Consider, for example, how unlikely it is that an adult resident of the United States has never received a request to participate in an academic or government-sponsored survey, market research, or a political poll.

  3. In total, 150 inmates participated in the survey. However, one inmate in the experimental group began circling every possible response category after question 44. Although we counted responses as not missing for inmates who circled two answers on a single question (e.g., “White” and “Latino” on the race measure), this inmate is an exceptional case. Because circling every option on purpose makes item response impossible to measure meaningfully, we dropped this case from the analyses (a sketch of this screening step appears after these notes). The data reported in the paper are therefore based on a sample of 149 inmates rather than 150. Results from ancillary analyses with this inmate included were substantively identical to those reported in the text.

  4. Examples of questions with below-average item response rates include: “How many different times have you been convicted of a crime including those that resulted in probation, jail, prison, or a fine?” and “In the year before you came to prison, what type of neighborhood did you live in … mostly White, mostly Asian, mostly Hispanic, mostly Black, or racially mixed?”

  5. Levene’s test for homogeneity of variance was significant, indicating that the variances in the experimental and control groups are not homogeneous. To address this, we ran simulated ANOVAs: given the observed pattern of sample sizes and standard deviations, and assuming the group means are equal, we simulated data repeatedly and assessed the Type I error rate that would be expected (a sketch of this simulation appears after these notes). Simulated ANOVAs were conducted for both Percent Complete and Demanding Item Count. The simulated p value remained below 0.01 for Percent Complete and below 0.001 for Demanding Item Count, indicating significant mean differences across the two groups on both outcomes.

  6. In supplementary analyses, we reestimated the models for Percent Complete and Demanding Item Count after imputing missing values on the control variables with two different strategies: mean imputation and multiple imputation (m = 10) (a sketch of both strategies appears after these notes). The results were substantively similar to those reported in the text, with the exception that the negative coefficient for the variable Black became significant in the multiple imputation models.

  7. The regression of Self-Control 2 on Self-Control 1 for inmates in the experimental and control groups was also conducted using both mean imputation and multiple imputation (m = 10) for missing values on the control variables (a sketch of this comparison appears after these notes). The results were substantively similar to those obtained using listwise deletion.
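
Note 3 describes dropping one experimental-group inmate who circled every response option after question 44, while still treating an occasional double-circle as a valid answer. The sketch below shows one way such a case could be flagged; the data layout (a list of circled options per item), the file name, and all column names are assumptions, not the authors' actual coding scheme.

```python
# Hedged sketch for note 3: flag a respondent who circles every available
# option on (nearly) all items after question 44, while a stray double-circle
# (e.g., "White" and "Latino" on the race item) still counts as a valid answer.
# The storage format and all names here are illustrative assumptions.
import pandas as pd

OPTION_COUNTS = {"q45": 5, "q46": 4, "q47": 5}     # assumed options per late item

def circled_everything(row, items, threshold=0.9):
    """True when the respondent circled every option on nearly all `items`."""
    all_circled = [len(row[q]) == OPTION_COUNTS[q] for q in items]
    return sum(all_circled) / len(all_circled) >= threshold

# Each cell holds the list of options circled on that item, e.g. ["White", "Latino"].
responses = pd.read_pickle("parsed_responses.pkl")  # hypothetical parsed file
late_items = list(OPTION_COUNTS)
responses["suspect"] = responses.apply(circled_everything, axis=1, items=late_items)

analysis_sample = responses[~responses["suspect"]]  # drop the flagged case (n = 149)
```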
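
The simulated ANOVA in note 5 can be approximated as follows: hold the observed group sizes and standard deviations fixed, assume equal means, repeatedly generate data, and record how often a standard one-way ANOVA rejects. The group sizes, standard deviations, and replication count below are placeholders rather than the study's actual values.

```python
# Hedged sketch for note 5: expected Type I error rate of a one-way ANOVA when
# the group variances differ but the true means are equal. The ns, sds, alpha,
# and number of replications are placeholders, not the study's values.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2013)
ns = (75, 74)                 # assumed experimental and control group sizes
sds = (8.0, 16.0)             # assumed unequal standard deviations
n_sims, alpha = 10_000, 0.05

rejections = 0
for _ in range(n_sims):
    groups = [rng.normal(loc=0.0, scale=sd, size=n) for n, sd in zip(ns, sds)]
    _, p_value = stats.f_oneway(*groups)   # ANOVA on data with truly equal means
    rejections += p_value < alpha

print(f"Simulated Type I error rate at alpha = {alpha}: {rejections / n_sims:.3f}")

# The variance heterogeneity itself would be checked with Levene's test, e.g.:
# stats.levene(experimental_outcome, control_outcome)
```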
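
Note 6's two imputation strategies could be reproduced roughly as below, here using pandas mean-filling and the chained-equations (MICE) implementation in statsmodels; the model formula, variable names, and file name are stand-ins rather than the authors' exact specification or software.

```python
# Hedged sketch for note 6: re-estimate a model of Percent Complete after
# (1) mean imputation and (2) multiple imputation with m = 10. The formula,
# variable names, and file are illustrative assumptions, and the model
# variables are assumed to be numeric.
import pandas as pd
import statsmodels.api as sm
from statsmodels.imputation.mice import MICE, MICEData

df = pd.read_csv("inmate_survey.csv")
formula = "percent_complete ~ ssc + age + black + prior_convictions"

# (1) Mean imputation of missing values on the control variables.
mean_filled = df.fillna(df.mean(numeric_only=True))
print(sm.OLS.from_formula(formula, data=mean_filled).fit().summary())

# (2) Multiple imputation by chained equations with m = 10 imputed datasets;
#     estimates are pooled across imputations.
mice = MICE(formula, sm.OLS, MICEData(df))
print(mice.fit(n_burnin=10, n_imputations=10).summary())
```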
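
For note 7, the listwise-deletion version of the reliability check might look like the sketch below: regress the Self-Control 2 index on the Self-Control 1 index within each group and compare the slopes. The comparison uses the equality-of-coefficients z-test from Paternoster et al. (1998), which the article cites, but the variable names and this particular implementation are assumptions rather than the authors' exact procedure.

```python
# Hedged sketch for note 7: slope of Self-Control 2 on Self-Control 1 in the
# experimental (SSC) and control groups, compared with the Paternoster et al.
# (1998) z-test:  z = (b1 - b2) / sqrt(SE_b1**2 + SE_b2**2).
# Column names and the data file are illustrative assumptions.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("inmate_survey.csv").dropna(      # listwise deletion
    subset=["self_control_1", "self_control_2", "ssc"]
)

def slope_and_se(group: pd.DataFrame):
    """Slope and standard error from regressing SC2 on SC1 within one group."""
    X = sm.add_constant(group["self_control_1"])
    fit = sm.OLS(group["self_control_2"], X).fit()
    return fit.params["self_control_1"], fit.bse["self_control_1"]

b_ssc, se_ssc = slope_and_se(df[df["ssc"] == 1])
b_ctl, se_ctl = slope_and_se(df[df["ssc"] == 0])

z = (b_ssc - b_ctl) / np.sqrt(se_ssc**2 + se_ctl**2)
print(f"SSC slope {b_ssc:.2f}, control slope {b_ctl:.2f}, z = {z:.2f}")
```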

References

  • American Association for Public Opinion Research (2011) Standard definitions: final dispositions of case codes and outcome rates for surveys. American Association for Public Opinion Research, Ann Arbor
  • Beatty P, Herrmann D (2002) To answer or not to answer: decision processes related to survey item nonresponse. In: Groves RM, Dillman DA, Eltinge JL, Little RJA (eds) Survey nonresponse. Wiley, New York
  • Cacioppo JT, Petty RE (1982) The need for cognition. J Pers Soc Psychol 42:116–131
  • Camp SD (1999) Do inmate survey data reflect prison conditions? Using surveys to assess prison conditions of confinement. Prison J 79:250–268
  • Camp SD, Gaes GG, Klein-Saffran J, Daggett DM, Saylor WG (2002) Using inmate survey data in assessing prison performance: a case study comparing private and public prisons. Crim Justice Rev 27:26–51
  • Cannell CF, Miller PV, Oksenberg L (1981) Research on interviewing techniques. In: Leinhardt S (ed) Sociological methodology 1981. Jossey-Bass, San Francisco
  • Carmines EG, Zeller RA (1979) Reliability and validity assessment. Sage, Thousand Oaks
  • Catania JA, Binson D, Canchola J, Pollack LM, Hauck W, Coates TJ (1996) Effects of interviewer gender, interviewer choice, and item wording on responses to questions concerning sexual behavior. Public Opin Q 60:345–375
  • Cavusgil ST, Elvey-Kirk LA (1998) Mail survey response behavior: a conceptualization of motivating factors and an empirical study. Eur J Mark 32:1165–1192
  • Childers TL, Skinner SJ (1979) Gaining respondent cooperation in mail surveys through prior commitment. Public Opin Q 43:558–561
  • Cialdini RB (1988) Influence: science and practice. Scott, Foresman, Glenview
  • Curtin R, Presser S, Singer E (2005) Changes in telephone survey nonresponse over the past quarter century. Public Opin Q 69:87–89
  • de Leeuw E, de Heer W (2002) Trends in household survey nonresponse: a longitudinal and international comparison. In: Groves RM, Dillman DA, Eltinge JL, Little RJA (eds) Survey nonresponse. Wiley, New York
  • Dijkstra W, Smit JH (2002) Persuading reluctant recipients in telephone surveys. In: Groves RM, Dillman DA, Eltinge JL, Little RJA (eds) Survey nonresponse. Wiley, New York
  • Dillman DA, Smyth JD, Christian LM (2009) Internet, mail, and mixed mode surveys: the tailored design method, 3rd edn. Wiley, Hoboken
  • Fox K, Zambrana K, Lane J (2011) Getting in (and staying in) when everyone else wants to get out: 10 lessons learned from conducting research with inmates. J Crim Justice Educ 22:304–327
  • Gouldner AW (1960) The norm of reciprocity: a preliminary statement. Am Sociol Rev 25:161–178
  • Grasmick HG, Tittle CR, Bursik RJ Jr, Arneklev BJ (1993) Testing the core empirical implications of Gottfredson and Hirschi’s general theory of crime. J Res Crime Delinq 30:5–29
  • Groves RM, Couper MP (1998) Nonresponse in household interview surveys. Wiley, New York
  • Groves RM, Cialdini RB, Couper MP (1992) Understanding the decision to participate in a survey. Public Opin Q 56:475–495
  • Groves RM, Singer E, Corning A (2000) Leverage-saliency theory of survey participation: description and illustration. Public Opin Q 64:299–308
  • Groves RM, Fowler FJ Jr, Couper MP, Lepkowski JM, Singer E, Tourangeau R (2009) Survey methodology, 2nd edn. Wiley, Hoboken
  • Guerino P, Harrison PM, Sabol WJ (2011) Prisoners in 2010. Bureau of Justice Statistics, Washington
  • Haney C (2012) Prison effects in the age of mass incarceration. Prison J. Published online before print at http://tpj.sagepub.com/content/early/2012/09/11/0032885512448604
  • Hanson RK, Letourneau EJ, Olver ME, Wilson RJ, Miner MH (2012) Incentives for offender research participation are both ethical and practical. Crim Justice Behav. Published online before print at http://cjb.sagepub.com/content/early/2012/06/15/0093854812449217.abstract
  • Hart TC (1998) Causes and consequences of juvenile crime and violence: public attitudes and question-order effect. Am J Crim Justice 23:129–143
  • Higgins GE (2007) Examining the original Grasmick scale: a Rasch model approach. Crim Justice Behav 34:157–178
  • Hinrichs JR (1975) Effects of sampling, follow-up letters, and commitment to participation on mail survey response. J Appl Psychol 60:249–251
  • Huebner BM (2003) Administrative determinants of inmate violence: a multilevel analysis. J Crim Justice 31:107–117
  • James JM, Bolstein R (1990) The effect of monetary incentives and follow up mailing on the response rate and response quality in mail surveys. Public Opin Q 54:346–361
  • Jenness V, Maxson CL, Sumner JM, Matsuda KN (2009) Accomplishing the difficult but not impossible: collecting self-report data on inmate-on-inmate sexual assault in prison. Crim Justice Policy Rev 21:3–30
  • Jiang S (2005) Impact of drug use on inmate misconduct: a multilevel analysis. J Crim Justice 33:153–163
  • Junger-Tas J, Marshall IH (1999) The self-report methodology in crime research. Crime Justice 25:291–368
  • Knauper B (1999) The impact of age and education on response order effects in attitude measurement. Public Opin Q 63:347–370
  • Krosnick JA (1991) Response strategies for coping with the cognitive demands of attitude measures in surveys. Appl Cognit Psychol 5:213–236
  • Krosnick JA (2002) The causes of no-opinion responses to attitude measures in surveys: they are rarely what they appear to be. In: Groves RM, Dillman DA, Eltinge JL, Little RJA (eds) Survey nonresponse. Wiley, New York
  • Krosnick JA, Alwin DF (1987) An evaluation of a cognitive theory of response order effects in survey measurement. Public Opin Q 51:201–219
  • Lefcourt H (1982) Locus of control: current trends in theory and research. Erlbaum, Hillsdale
  • McFarland SG (1981) Effects of question order on survey response. Public Opin Q 45:208–215
  • Mears DP (2008) Accountability, efficiency, and effectiveness in corrections: shining a light on the black box of prison systems. Criminol Public Policy 7:143–152
  • Mears DP (2010) American criminal justice policy. Cambridge University Press, New York
  • Mears DP (2012) The prison experience: introduction to the special issue. J Crim Justice 40:345–347
  • Miller PV, Cannell CF (1982) A study of experimental techniques for telephone interviewing. Public Opin Q 46:250–269
  • Minton TD (2012) Jail inmates at midyear 2011—statistical tables. Bureau of Justice Statistics, Washington
  • Monahan KC, Goldweber A, Cauffman E (2011) The effects of visitation on incarcerated juvenile offenders: how contact with the outside impacts adjustment on the inside. Law Hum Behav 35:143–151
  • Mowen JC, Cialdini RB (1980) On implementing the door-in-the-face compliance technique in a business context. J Mark Res 17:253–258
  • Nagin DS, Cullen FT, Jonson CL (2009) Imprisonment and reoffending. Crime Justice 38:115–200
  • Oksenberg L, Vinokur A, Cannell CF (1979) Effects of commitment to being a good respondent on interview performance. In: Cannell CF, Oksenberg L, Converse JM (eds) Experiments in interviewing techniques: field experiments in health reporting, 1971–1977. Survey Research Center, University of Michigan, Ann Arbor
  • Paternoster R, Brame R, Mazerolle P, Piquero A (1998) Using the correct statistical test for the equality of regression coefficients. Criminology 36:859–866
  • Petersilia J (2003) When prisoners come home: parole and prisoner reentry. Oxford University Press, New York
  • Petersilia J, Deschenes EP (1994) Perceptions of punishment: inmates and staff rank the severity of prison versus intermediate sanctions. Prison J 74:306–328
  • Pew Center on the States (2008) One in 100: behind bars in America in 2008. The Pew Charitable Trusts, Washington
  • Pew Research Center (2012) Assessing the representativeness of public opinion surveys. Pew Research Center, Washington
  • Schwartz B (2004) The paradox of choice: why more is less. Harper Perennial, New York
  • Singer E (2002) The use of incentives to reduce nonresponse in household surveys. In: Groves RM, Dillman DA, Eltinge JL, Little RJA (eds) Survey nonresponse. Wiley, New York
  • Singer E, Van Hoewyk J, Maher MP (2000) Experiments with incentives in telephone surveys. Public Opin Q 64:171–188
  • Slotboom A-M, Kruttschnitt C, Bijleveld C, Menting B (2011) Psychological well-being of incarcerated women in the Netherlands: importation or deprivation? Punishm Soc 13:176–197
  • Smoyer AB, Blankenship KM, Belt B (2009) Compensation for incarcerated research participants: diverse state policies suggest a new research agenda. Am J Public Health 99:1746–1752
  • Sykes GM (1958) The society of captives: a study of a maximum security prison. Princeton University Press, Princeton
  • Tonry M, Petersilia J (2000) American prisons at the beginning of the twenty-first century. In: Tonry M, Petersilia J (eds) Prisons. The University of Chicago Press, Chicago
  • Tourangeau R, Rips LJ, Rasinski K (2000) The psychology of survey response. Cambridge University Press, Cambridge
  • Travis J (2005) But they all come back: facing the challenges of prisoner reentry. The Urban Institute Press, Washington
  • Trulson CR, Marquart JW, Mullings JL (2004) Breaking in: gaining entry to prisons and other hard-to-access criminal justice organizations. J Crim Justice Educ 15:451–478
  • Trussell N, Lavrakas PJ (2004) The influence of incremental increases in token cash incentives on mail survey response: is there an optimal amount? Public Opin Q 68:349–367
  • Visher CA, O’Connell DJ (2012) Incarceration and inmates’ self perceptions about returning home. J Crim Justice 40:386–393
  • Wakai S, Shelton D, Trestman RL, Kesten K (2009) Conducting research in corrections: challenges and solutions. Behav Sci Law 27:743–752
  • Wolff N, Shi J (2011) Patterns of victimization and feelings of safety inside prison: the experience of male and female inmates. Crime Delinq 57:29–55
  • Wolff N, Shi J, Schumann BE (2012) Reentry preparedness among soon-to-be-released inmates and the role of time served. J Crim Justice 40:379–385
  • Wright KN (1989) Race and economic marginality in explaining prison adjustment. J Res Crime Delinq 26:67–89
  • Wright KN (1991) A study of individual, environmental, and interactive effects in explaining adjustment to prison. Justice Q 8:217–242

Author information


Corresponding author

Correspondence to Justin T. Pickett.

Appendix: Items Included in Low Self-Control Indices

Low Self-Control 1

  1. I often act on the spur of the moment without stopping to think.
  2. I dislike really hard tasks that stretch my abilities to the limit.
  3. Sometimes I will take a risk for the fun of it.
  4. I like to get out and do things more than I like to read or contemplate ideas.
  5. I try to look out for myself first, even if it means making things difficult for other people.
  6. I lose my temper pretty easily.
  7. When I’m really angry, other people better stay away from me.

Low Self-Control 2

  1. I am more concerned with what happens to me in the short run than in the long run.
  2. I often do whatever brings me pleasure here and now, even at the cost of some distant goals.
  3. I frequently try to avoid projects that I know will be difficult.
  4. The things in life that are the easiest to do bring me the most pleasure.
  5. I sometimes find it exciting to do things for which I might get into trouble.
  6. I almost always feel better when I am on the move than when I am sitting and thinking.
  7. I seem to have more energy and a greater need for activity than most other people my age.
  8. I’m not very sympathetic to other people when they are having problems.
  9. When I have a serious disagreement with someone, it’s usually hard for me to talk calmly about it without getting upset.
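
If item-level responses to the two batteries above were available, their internal consistency could be summarized with Cronbach's alpha. The function below is a generic implementation of the standard formula; the column naming scheme and data file are assumptions, and this is not a claim about how the article computed its reliability estimates.

```python
# Generic Cronbach's alpha for the two low self-control item batteries above.
# Column names (sc1_1 ... sc1_7 and sc2_1 ... sc2_9) are assumed for illustration.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    items = items.dropna()
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

df = pd.read_csv("inmate_survey.csv")              # hypothetical item-level data
alpha_sc1 = cronbach_alpha(df[[f"sc1_{i}" for i in range(1, 8)]])
alpha_sc2 = cronbach_alpha(df[[f"sc2_{i}" for i in range(1, 10)]])
print(f"Low Self-Control 1 alpha = {alpha_sc1:.2f}")
print(f"Low Self-Control 2 alpha = {alpha_sc2:.2f}")
```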


About this article

Cite this article

Pickett, J.T., Metcalfe, C.F., Baker, T. et al. Superficial Survey Choice: An Experimental Test of a Potential Method for Increasing Response Rates and Response Quality in Correctional Surveys. J Quant Criminol 30, 265–284 (2014). https://doi.org/10.1007/s10940-013-9203-4
