SAGE Journal Articles

Click on the following links. Please note that these will open in a new window.

Journal Article 11.1. Cambois, E., Grobon, S., Oyen, H. V., & Robine, J.-M. (2016). Impact of Question Wording on the Measurement of Activity Limitation: Evidence From a Randomized Test in France. Journal of Aging and Health, 28(7), 1315–1338. DOI: 10.1177/0898264316656504.

Learning Objectives: Compose questions using appropriate vocabulary and tone; Avoid using “problem words”; Compose questions that are clear, specific, and unbiased; Compose clear, specific, and unbiased closed-question response choices.

Summary: Compares alternative strategies to simplify the wording of the Global Activity Limitation Indicator.

Journal Article 11.2. Lenzner, T. (2012). Effects of Survey Question Comprehensibility on Response Quality. Field Methods, 24(4), 409–428. DOI: 10.1177/1525822X12448166.

Learning Objectives: Compose questions using appropriate vocabulary and tone; Compose questions that are clear, specific, and unbiased; Compose clear, specific, and unbiased closed-question response choices.

Summary: Conducts an experiment using a web survey to assess the impact of respondents’ question comprehension on response quality.

Journal Article 11.3. Baghal, T. A. (2017). Last Year Your Answer Was …: The Impact of Dependent Interviewing Wording and Survey Factors on Reporting of Change. Field Methods, 29(1), 61–78. DOI: 10.1177/1525822X16645073.

Learning Objectives: Compose questions using appropriate vocabulary and tone; Compose questions that are clear, specific, and unbiased.

Summary: Assesses strategies for asking questions to reduce recall errors in reports of change in longitudinal studies.

Journal Article 11.4. Lenzner, T. (2014). Are Readability Formulas Valid Tools for Assessing Survey Question Difficulty? Sociological Methods & Research, 43(4), 677–698. DOI: 10.1177/0049124113513436.

Learning Objectives: Compose questions using appropriate vocabulary and tone; Compose questions that are clear, specific, and unbiased; Compose clear, specific, and unbiased closed-question response choices.

Summary: Evaluates the performance of readability scoring programs as tools for assessing survey question wording.