Replication & Building Evidence

This Spotlight features perspectives, research, and resources related to building evidence in STEM education, particularly through replication. While NSF and the DRK-12 portfolio have traditionally focused on the early stages of the research cycle (i.e., exploratory, design and development), there has been increasing attention to, and funding of, studies that build evidence of the efficacy and effectiveness of preK-12 STEM education innovations (models, tools, and technologies). Recently, the DRK-12 program funded a replication study featured in this Spotlight, as well as research to support a more consistent definition of “replicable” and to promote consistency in the design and statistical analysis of replication studies. NSF also funded a data hub to support reproducible research practices such as data sharing and study registration. Other NSF-funded studies, such as CASPIR Math (featured below), look closely at core components and the conditions under which interventions can be effectively adapted to various contexts, raising questions about exact replication. In two perspective pieces, Jinfa Cai comments on the role of conceptual replication studies, and Christopher Wilson discusses how to communicate effectively with practitioners and others about the evidence we are building.
 
In this Spotlight...

Researcher Perspectives

Conceptual Replications in STEM Education Research

Jinfa Cai, Kathleen and David Hollowell Professor of Mathematics Education, University of Delaware

Despite abundant and frequent calls for replication studies from research communities (e.g., Shavelson & Towne, 2002) and funding agencies (e.g., IES & NSF, 2013), the number of such studies remains stubbornly small. For example, in an analysis of all articles published since 1900 in the top 100 psychology journals, Makel, Plucker, and Hegarty (2012) found that only about 1% were replication studies. Moreover, from the top 100 education journals, as ranked by 5-year impact factor, Makel and Plucker (2014) found that only 0.13% of articles were replication studies, with the majority of successful replications authored by the same individuals who had carried out the initial studies. At the Journal for Research in Mathematics Education (JRME), among all research articles published from JRME’s inception in 1970 through 2016, only about 3% clearly intended to replicate prior studies (Cai et al., 2018).

There are many reasons for the limited number of published replication studies. One major reason is a lack of clarity: What are replication studies? Why should the field of STEM education research engage in them? How should reviewers, journal editors, and funding agencies evaluate them?...

 

Reporting Findings to Decision-Makers and Project Participants

Christopher Wilson, Research Division Director, BSCS Science Learning

As DRK-12 researchers conducting empirical studies of interventions in science education, we know that the findings from our studies are important to multiple audiences. While the dissemination plan might be one of the last sections we write in our proposals, and one of the last pieces we consider in a project’s timeline, it is probably the most important activity we engage in. I’ll always remember the advice my wife’s PhD advisor gave her during her studies on adolescence and animal behavior: “If you’re not publishing, you’re not doing science; you’re just watching hamsters mating in a basement.” The former is presumably more justifiable than the latter.

At BSCS Science Learning, we’re finding that the results of our research studies are important to an increasingly broad range of audiences. In the past, we might have begun projects expecting that in the final year we’d start the often endless process of publishing papers in research journals and presenting findings at national research conferences. Remember those? In more recent years, as the evidence base for the efficacy of instructional materials and professional development programs has become more established, we’ve become more involved in scaling up these effective interventions....


Featured Projects

Comparing the Efficacy of Online to In-Person Professional Development Formats for Improving Student Outcomes of a Student-Teacher-Scientist Partnership Program (NSF #2010556)

PI: Catrina Adams | Co-PIs: Joseph Taylor, Anne Westbrook
Target Audience: High school biology teachers and students and scientist mentors
STEM Discipline: Biology, Botany

Description: This newly funded project utilizes PlantingScience, a student-teacher-scientist partnership (STSP) program. It builds on the Digging Deeper study (DIG; NSF #1502892), which found that implementing PlantingScience in combination with high-quality, in-person, collaborative teacher/scientist professional development had positive and statistically significant effects on student achievement and attitudes relative to business-as-usual methods. The project has two components: 1) a replication study to determine whether the DIG findings hold, and 2) the addition of an online format for collaborative professional development, with a comparison of the effectiveness of the online and in-person PD formats for improving student outcomes. A cluster-randomized design will be employed to answer these impact questions. Read the complete abstract. View a video from an earlier PlantingScience project.

What are you learning and/or what new questions are arising from your work related to replication methodologies or studies? A major benefit of our three-arm experimental design is that it includes a direct replication of the DIG study. Although the number of intervention studies in education continues to increase, very few replication studies have been conducted; a recent analysis found that only 0.13 percent of education articles were replications (Makel & Plucker, 2014). Thomas Brock, director of the Institute of Education Sciences (IES), has stated that there needs to be more emphasis on replication studies that confirm or unpack existing findings (Sparks, 2017). In 2016, IES and other agencies convened a group of experts to advise on how best to advance evidence beyond an efficacy study; one outcome was a call for more replication studies (NCSER et al., 2016; Taylor & Doolittle, 2017). As districts and states base decisions more directly on research findings, they will increasingly look for evidence from multiple studies that an intervention would work in their school environments. Replicating the DIG project would help corroborate its past positive outcomes and provide stronger evidence that the program is effective and generalizable.
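To make the school-level randomization concrete, the sketch below shows one way whole schools (clusters) might be assigned to three arms. The arm labels, school names, and counts are our own illustrative assumptions, not details taken from the study protocol.

```python
# Minimal sketch of school-level (cluster) random assignment to three study
# arms. Arm labels, school names, and counts are illustrative assumptions,
# not details from the actual study protocol.
import random

ARMS = ["business-as-usual", "in-person PD", "online PD"]  # hypothetical labels

def assign_clusters(school_ids, arms=ARMS, seed=2020):
    """Assign whole schools to arms so that every teacher and student in a
    school shares one condition, with arm sizes kept nearly equal."""
    rng = random.Random(seed)   # fixed seed makes the assignment auditable
    ids = list(school_ids)
    rng.shuffle(ids)
    # Round-robin over the shuffled list yields near-balanced arms.
    return {school: arms[i % len(arms)] for i, school in enumerate(ids)}

if __name__ == "__main__":
    schools = [f"school_{n:02d}" for n in range(1, 31)]
    assignment = assign_clusters(schools)
    for arm in ARMS:
        count = sum(1 for a in assignment.values() if a == arm)
        print(f"{arm}: {count} schools")
```

Randomizing whole schools rather than individual teachers or students keeps a single condition intact within each building, which is the defining feature of a cluster-randomized design.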

In Makel & Plucker’s (2014) study, the authors distinguish between direct and conceptual replications. As we envision it, the replication portion of our study would likely be considered a mix of the two. Like a direct replication, we anticipate using the same measures of student outcomes that were used in the DIG study, and we anticipate that the sample of teachers and students will be comparable to the DIG study; the same researchers and staff will conduct the replication study. However, since the DIG study was conducted, we have adjusted the PlantingScience program and improved the online platform’s user interface based on formative feedback from DIG teachers. In addition, we improved the in-person professional development offered to teachers and scientists based on feedback from participants and from our external evaluators. The development portion of the current project involves making additional minor changes to both the in-person and online PD, for example, replacing some generic classroom video examples with new video that is specific to the program, and there may be areas of the in-person PD that we need to change to make it comparable to the newly developed online PD. Beyond these changes, we also anticipate collecting data that we did not collect during DIG, for example, linking student outcome data to student projects, which will let us take a closer look at how characteristics of the dialog between scientist mentors and student teams relate to student outcomes.
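Because assignment in a cluster-randomized design happens at the school level, student outcomes within a school are not independent, and the analysis has to account for that. The sketch below shows one standard approach, a linear mixed model with a random intercept per school, run on synthetic data; the variable names and effect sizes are invented, and this is illustrative only, not necessarily the project’s analysis plan.

```python
# Hypothetical sketch of a cluster-aware analysis for a school-randomized
# design: a linear mixed model with a random intercept per school. The data
# are synthetic and the variable names are invented; illustrative only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_schools, students_per_school = 30, 20

school = np.repeat(np.arange(n_schools), students_per_school)
# Treatment is assigned per school, then shared by all of its students.
condition = np.repeat(rng.integers(0, 3, n_schools), students_per_school)
school_effect = np.repeat(rng.normal(0, 0.5, n_schools), students_per_school)
outcome = 0.2 * condition + school_effect + rng.normal(0, 1, school.size)

df = pd.DataFrame({"school": school, "condition": condition, "outcome": outcome})

# The random intercept absorbs between-school variation, which matters
# because treatment varies at the school level, not the student level.
model = smf.mixedlm("outcome ~ C(condition)", df, groups=df["school"])
print(model.fit().summary())
```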

Suggested Readings: We recommend the following readings that shaped our decision to include replication as a major focus of the current project:


Collaborating Around Sustainability of Processes and Instructional Routines (CASPIR) in Mathematics (CASPIR Math) (NSF #1907681)

PI: Alison Castro Superfine | Co-PIs: Shelby Cosner, Benjamin Superfine, Yue Yin
Target Audience: K-8 school districts
STEM Discipline: K-8 mathematics

Description: The aim of the CASPIR Math Project is to co-develop and implement the Elementary Mathematics Leadership (EML) model, a multi-component professional development (PD) intervention designed to enhance the organizational capacities of schools and districts to support improvements in K-8 mathematics teaching and learning. The EML model uses a Design-Based Implementation Research (DBIR) process in which researchers, professional developers, and school personnel collaborate to co-develop the PD from the district level through the teacher level. The EML model is grounded in the idea of “productive adaptation” to varying district contexts and is partly aimed at developing principles for productively adapting similar interventions in other settings. Read the complete abstract.

What are you learning and/or what new questions are arising from your work related to replication methodologies or studies? The CASPIR Math Project is grounded in the idea that, because districts have different contexts, adaptive integration of interventions is important as they go to scale. As researchers studying Design-Based Implementation Research and Improvement Science have argued, successful “scaling up” depends on local actors who make continual, coherent adjustments to interventions as they move through the various levels of an organization. The EML model employed by CASPIR accordingly focuses on “problems of practice,” co-identified by researchers and school district personnel, that necessarily vary and change over time. At the same time, however, the EML model always involves three essential components that remain the same across districts: (1) gathering particular types of data, (2) designing and implementing multi-level PD on the basis of these data, and (3) engaging in collaborative and iterative re-design over a series of years. As such, the EML model raises difficult questions about the replicability of interventions that are specifically designed to change with the setting and over time.

To address such questions, the CASPIR Math Project collects and analyzes data not only on the effectiveness of the EML model but also on the process of implementing and adapting it. In addition to collecting data on student achievement, teachers’ math knowledge and instructional practices, and school and district administrator and organizational capacities, the project captures data about the collaborative, iterative co-design process between the project team and school and district personnel. Moreover, the project aims to articulate the factors external to districts and schools (e.g., policies, labor relations) and internal to them (e.g., teacher efficacy, organizational capacities) that might support or inhibit productive adaptation of the EML model. In this way, the EML model is partly aimed at developing principles for productively adapting similar interventions in other settings.

Suggested Readings and Resources: The following research related to DBIR and Improvement Science has informed our work:

  • Bryk, A., Gomez, L., Grunow, A., & LeMahieu, P. (2015). Learning to improve: How America’s schools can get better at getting better. Cambridge, MA: Harvard Education Press.
  • Penuel, W., Fishman, B., Cheng, B., & Sabelli, N. (2011). Organizing research and development at the intersection of learning, implementation, and design. Educational Researcher, 40(7), 331-337.


Additional Resources

The following publications and resources provide an introduction to several aspects of building evidence in STEM education research, including ways of sharing data.

Select Publications

Data Repositories and Services

  • AEA RCT Registry is the American Economic Association's registry for randomized controlled trials in economics and other social sciences.
  • The Center for Open Science helps researchers manage and archive their research, connects and builds open science communities, and supports and conducts research on scientific practices.
  • Databrary is a digital data library specialized for storing, managing, preserving, analyzing, and sharing video. Databrary also provides a set of tools that enable researchers to upload video and other materials as they are generated, thus reducing barriers to sharing.
  • DataCite provides persistent identifiers (DOIs) for research data and other research outputs. Organizations within the research community join DataCite as members in order to assign DOIs to all of their research outputs, making those outputs discoverable and their associated metadata available to the community.
  • ECR Research Data Hub: Coming Soon!
  • EdArXiv (Education Archive) is a free, open source, non-profit service that allows researchers to post and search working papers, unpublished work, conference materials, articles under review (preprints), and author-formatted versions of published work that the author has permission to post (postprints).
  • ICPSR maintains a data archive in the social and behavioral sciences, offers educational activities including a summer program on quantitative methods, and sponsors research that focuses on the emerging challenges of digital curation and data science.
  • The Qualitative Data Repository (QDR) is a dedicated archive for storing and sharing digital data (and accompanying documentation) generated or collected through qualitative and multi-method research in the social sciences.
  • Re3data is a global registry of research data repositories spanning academic disciplines.
  • The Registry of Efficacy and Effectiveness Studies (REES) is a database of causal inference studies in education and related fields.