
Using Semantic Similarity to Assess Adherence and Replicability of Intervention Delivery

Researchers are rarely satisfied to learn only whether an intervention works; they also want to understand why and under what circumstances interventions produce their intended effects. These questions have led to increasing calls for implementation research to be included in high-quality studies with strong causal claims. Of critical importance is determining whether an intervention can be delivered with adherence to a standardized protocol, and the extent to which an intervention protocol can be replicated across sessions, sites, and studies. When an intervention protocol is highly standardized and delivered through verbal interactions with participants, a set of natural language processing (NLP) techniques termed semantic similarity can provide quantitative summary measures of how closely intervention sessions adhere to a standardized protocol, as well as how consistently the protocol is replicated across sessions. Given the intense methodological, budgetary, and logistical challenges of conducting implementation research, semantic similarity approaches have the benefit of being low-cost, scalable, and context-agnostic. In this paper, we demonstrate how semantic similarity approaches may be applied in an experimental evaluation of a coaching protocol on teacher pedagogical skills in a simulated classroom environment. We discuss strengths and limitations of the approach, and the most appropriate contexts for applying this method.
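The core idea can be illustrated with a minimal sketch: represent the protocol script and each session transcript as vectors, then score adherence as the cosine similarity between them. The snippet below uses simple bag-of-words term-frequency vectors and hypothetical protocol and transcript strings for illustration; the paper's actual method and text representations may differ (e.g., richer embeddings rather than raw term counts).

```python
import math
from collections import Counter

def cosine_similarity(a: str, b: str) -> float:
    """Cosine similarity between bag-of-words term-frequency vectors.

    A score near 1.0 suggests the session closely tracks the protocol
    wording; a score near 0.0 suggests little lexical overlap.
    """
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    # Dot product over terms the two texts share
    dot = sum(va[t] * vb[t] for t in set(va) & set(vb))
    norm = (math.sqrt(sum(c * c for c in va.values()))
            * math.sqrt(sum(c * c for c in vb.values())))
    return dot / norm if norm else 0.0

# Hypothetical protocol script and two session transcripts (illustrative only)
protocol = "begin by greeting the student and stating the lesson objective"
session_adherent = "begin by greeting the student and stating the lesson objective clearly"
session_divergent = "talk about the weekend and then collect the homework"

print(cosine_similarity(protocol, session_adherent))   # high adherence score
print(cosine_similarity(protocol, session_divergent))  # low adherence score
```

Averaging such scores across sessions, or examining their spread, gives the kind of summary adherence and replicability measures the paper describes; in practice one would likely substitute embedding-based similarity for raw term counts to capture paraphrase.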

Keywords
Natural language processing, data science, treatment fidelity, treatment adherence, replication
Document Object Identifier (DOI)
10.26300/n5qj-7310

EdWorkingPaper suggested citation:

Anglin, Kylie L., and Vivian C. Wong. (). Using Semantic Similarity to Assess Adherence and Replicability of Intervention Delivery. (EdWorkingPaper: -312). Retrieved from Annenberg Institute at Brown University: https://doi.org/10.26300/n5qj-7310
