SAGE Journal Articles

Access the full-text SAGE journal articles spotlighted in the following chapters of Designing Experiments for the Social Sciences by clicking on the links below. 

Chapter 1: Discovering Cause and Effect

From Study Spotlight 1.2: “Discovering Effects and Explaining Why,” page 5

Journal Article 1: Hay, C., Wang, X., Ciaravolo, E., & Meldrum, R. C. (2015). Inside the black box: Identifying the variables that mediate the effects of an experimental intervention for adolescents. Crime & Delinquency, 61(2), 243-270.

Learning Objective: Explain how cause and effect works in an experiment

Summary: This study serves as a good example of an experiment that finds effects and explains what caused them. It addresses the relationship between program participation and reduced delinquency by considering the risk factors that mediate the effects of a comprehensive intervention on juvenile offending. The analysis draws on data from the Children at Risk program, a 2-year multimodal intervention with random assignment that has been shown to reduce delinquency among high-risk early adolescents.

Chapter 4: Types of Experiments

From Study Spotlight 4.2: “A Study Using the Solomon Four-Group Design,” page 95

Journal Article 1: Genç, M. (2016). An evaluation of the cooperative learning process by sixth-grade students. Research in Education, 95(1), 19-32.

Learning Objective: Summarize the different types of experiments using Campbell and Stanley’s typology

Summary: This study offers a model of the Solomon Four-Group Design. A total of 135 sixth-grade students attending the same school took part in the study, which investigated the effectiveness of cooperative learning on primary school students' achievement in science lessons and examined their views on the cooperative learning process.

Chapter 8: Sampling and Effect Sizes

From How to Do It 8.1: “Rationales for Subjects,” page 214

Journal Article 1: Bennion, E. A., & Nickerson, D. W. (2011). The cost of convenience: An experiment showing e-mail outreach decreases voter registration. Political Research Quarterly, 64(4), 858-869.

Learning Objective: Think critically about the use of students and subjects from various sources

Summary: This study provides an example of how researchers with professional experience in experimental research have described and justified their choice of sample, in this case college students. The hypothesis tested was that downloading forms may impose higher transaction costs than traditional outreach for some people and thereby decrease electoral participation. A randomized, controlled experiment tested this hypothesis by encouraging treatment participants via e-mail to use online voter registration tools.

Chapter 9: Stimuli and Manipulation Checks

From Study Spotlight 9.1: “Creating Realistic Stimuli,” page 250

Journal Article 1: Chen, G. M. (2013). Losing face on social media. Communication Research, 42(6), 819-838. doi:10.1177/0093650213510937.

Learning Objective: Create realistic stimuli

Summary: The author of this study goes to great lengths to create a realistic social-networking site for this experiment, which explores whether even relatively minor face-threatening acts of rejection or criticism on a social-networking site similar to Facebook lead to increases in self-reported negative affect and retaliatory aggression, compared with a control. A mediation model demonstrated that face-threatening acts had a direct effect on negative affect and an indirect effect on retaliatory aggression through negative affect. Findings are discussed in relation to face theory and politeness theory.

Chapter 10: Instruments and Measures

From How to Do It 10.5: “Writing up the Instrument and Measurement Sections,” page 318

Journal Article 1: Zerback, T., Koch, T., & Kramer, B. (2015). Thinking of others: Effects of implicit and explicit media cues on climate of opinion perceptions. Journalism and Mass Communication Quarterly, 92(2), 421-443.

Learning Objective: Summarize the advantages and disadvantages of questionnaires as the instrument of an experiment

Summary: This study made use of a questionnaire, and the instrument items and index statistics are reported in the text. The researchers examined the relative impact of survey data (an explicit cue) and arguments (an implicit cue) on climate of opinion judgments. The authors provide a useful example of how to write up the instrument and measurement sections.

Chapter 11: The IRB and Conducting Ethical Experiments

From Study Spotlight 11.1: “Risks and Wrongs in Social Science Research,” page 339

Journal Article 1: Oakes, J. M. (2002). Risks and wrongs in social science research: An evaluator’s guide to the IRB. Evaluation Review, 26(5), 443.

Learning Objective: Describe the purpose of the Institutional Review Board and its principles when conducting an experiment

Summary: Having an Institutional Review Board (IRB) review and monitor the use of human subjects is now fundamental to ethical research. Yet social scientists appear increasingly frustrated with the process. This article aims to assist evaluators struggling to understand and work with IRBs. The author theorizes about why IRBs frustrate researchers and insists there is only one remedy: We must accept the legitimacy of IRB review and (a) learn more about IRB regulations, imperatives, and the new pressures on them; and (b) educate IRBs about social scientific methodologies and empirically demonstrable risks. A research agenda and tips are offered.