SAGE Journal Articles

Select SAGE journal articles are available to give you more insight into chapter topics. These are also an ideal resource to help support your literature reviews, dissertations and assignments.

The links will open in a new window.

Desimone, L. M. and Le Floch, K. C. (2004) Are We Asking the Right Questions? Using Cognitive Interviews to Improve Surveys in Education Research, Educational Evaluation and Policy Analysis, 26(1), pp. 1-22.

Abstract: Improving the validity and reliability of surveys is a critical part of the response to the call for improved rigor of education research, policy analysis and evaluation. Too often we create inquiry tools without validating our measures against how respondents interpret our questions, and therefore collect data of questionable quality. The purpose of this article is to demonstrate how cognitive interviews can be a useful method for improving the reliability and validity of surveys used in education research.

Kano, M., Franke, T., Afifi, A. A. and Bourque, L. (2008) Adequacy of Reporting Results of School Surveys and Nonresponse Effects: A Review of the Literature and a Case Study, Educational Researcher, 37(8), pp. 480-490.

Abstract: To ensure accurate interpretation of research findings, researchers should report details about their research design, data collection method, and response rates when presenting findings from survey research. A review of 100 peer-reviewed articles reporting the results of survey research on K–12 schools with principals as the designated respondents revealed that such information is often not reported. Few studies examined or even acknowledged the potentially biasing effects of nonresponse.

Desimone, L. M., Smith, T. M. and Frisvold, D. E. (2010) Survey Measures of Classroom Instruction: Comparing Student and Teacher Reports, Educational Policy, 24(2), pp. 267-329.

Abstract: This analysis contributes to efforts to improve the use and understanding of survey data in education policy research by asking: How different are student and teacher reports of classroom instruction? Do student, class, or teacher characteristics account for any of the differences? Using National Assessment of Education Progress (NAEP) data, we compare the responses of middle-school students and their teachers to the same questions about mathematics instruction. We found low correlations and small significant mean differences between student and teacher reports.

Moy, P. and Murphy, J. (2016) Problems and Prospects in Survey Research, Journalism & Mass Communication Quarterly, 93(1), pp. 16-37.

Abstract: Over the last few decades, survey research has witnessed a number of developments that have affected the quality of data that emerge using this methodology. Using the total survey error (TSE) approach as a point of departure, this article documents chronic challenges to data quality. With the aim of facilitating assessments of data quality, this article then turns to best practices in the disclosure of survey findings based on probability and nonprobability samples.

Wilhelm, A. G. and Andrews-Larson, C. (2016) Why Don’t Teachers Understand Our Questions? Reconceptualizing Teachers’ “Misinterpretation” of Survey Items, DOI: 10.1177/2332858416643077.

Abstract: This study examined sources of inconsistency between teachers’ and researchers’ interpretations of survey items. We analyzed cognitive interview data from 12 middle school mathematics teachers to understand their interpretations of survey items focused on one aspect of their practice: the content of their advice-seeking interactions. Through this analysis we found that previously documented conceptualizations of sources of misinterpretation within teacher surveys (e.g., structural complexity, use of reform language) did not adequately account for all of the inconsistencies between the survey items and teachers’ interpretations.