Editorial | July 2012
Intervention Fidelity: A Necessary Aspect of Intervention Effectiveness Studies
Author Affiliations
  • Susan L. Murphy, ScD, OTR/L, is Assistant Professor, Department of Physical Medicine and Rehabilitation, University of Michigan, 9th Floor, 300 North Ingalls Street, Ann Arbor, MI 48109-2007, and Research Health Science Specialist, Geriatric Research, Education and Clinical Center, Veterans Affairs Ann Arbor Health Care System; sumurphy@umich.edu
  • Sharon A. Gutman, PhD, OTR, FAOTA, is Editor-in-Chief, American Journal of Occupational Therapy, and Associate Professor, Columbia University, Programs in Occupational Therapy, New York; ajoteditor@aota.org
Article Information
Evidence-Based Practice / From the Desk of the Editor and Associate Editor
American Journal of Occupational Therapy, July/August 2012, Vol. 66, 387-388. doi:10.5014/ajot.2010.005405
In a recent editorial, we highlighted information commonly absent from manuscripts that report intervention effectiveness (Gutman & Murphy, 2012). One critical element often missing from manuscripts is a discussion of intervention fidelity, that is, the extent to which the intervention is delivered as intended (Gearing et al., 2011). To assess fidelity, researchers need to determine at the study design stage the methods they will use to assess and monitor the reliability and validity of the intervention (Bellg et al., 2004; Borrelli, 2011). Reporting fidelity methods in the published article is crucial: it allows readers to judge the quality of the study and to replicate it, and it helps intervention developers understand how various factors may have influenced the study's outcome.
The Consolidated Standards of Reporting Trials (CONSORT) guidelines (Altman et al., 2001), now required by most medical journals for reporting clinical trials, have been expanded to include reporting of intervention fidelity in nonpharmacological trials (Boutron, Moher, Altman, Schulz, & Ravaud, 2008). The expanded criteria offer detailed guidance on how to report this information. Because fidelity methods are infrequently and inconsistently reported in journals, our intent in this editorial is to discuss the five basic components of intervention fidelity (Bellg et al., 2004; Borrelli, 2011; Gearing et al., 2011):
  1. Intervention design
  2. Training of providers
  3. Intervention delivery
  4. Receipt of intervention
  5. Enactment of skills gained from the intervention.
Intervention Design
Aspects of trial design pertaining to intervention fidelity include the content and dose of the intervention and the use of any comparison groups (Bellg et al., 2004). Researchers should describe in detail the number, length, and frequency of intervention sessions. The researchers should articulate the underlying theoretical framework or clinical guidelines that provided the foundation for the intervention and specify the intervention’s “active ingredients” (Borrelli, 2011). In designing a trial, researchers should try to plan for potential setbacks and consider alternative strategies before the setbacks occur. One potential setback, for example, is provider dropout (Bellg et al., 2004; Gearing et al., 2011); a potential solution is to train extra providers at the outset of the study so that backup providers with the requisite skills are available.
Training of Providers
To ensure fidelity, it is necessary to be certain that multiple providers administer the same intervention in the same manner; for this reason, many researchers develop and standardize training procedures. Training may be done initially and throughout study implementation to allow for turnover in providers and to keep providers from deviating from the standardized procedures over time (a phenomenon known as therapist drift; Bellg et al., 2004; Borrelli, 2011; Gearing et al., 2011). Before beginning intervention delivery, researchers should assess providers’ acquisition of skills by written test, direct observation, or a combination of the two (Bellg et al., 2004). Often, providers who have particular credentials or experience are selected for a study; training plans may need to be adapted if providers have different levels of experience or education (Borrelli, 2011).
Intervention Delivery
Researchers should report any methods used to standardize the interventions; this information is especially important so that others may replicate the study (Boutron et al., 2008). Researchers may use written intervention manuals, which assist in ensuring fidelity by helping to control for provider differences, ensuring adherence to the intervention protocol, and maintaining the distinct features of the intervention and comparison treatments (Bellg et al., 2004). Written manuals also provide a concrete means to articulate the active ingredients of the intervention, ensuring consistency in how and when these ingredients are delivered.
Receipt of Intervention
The first three components of fidelity focus primarily on providers and how treatment is delivered. Fidelity methods also involve a fourth component: how participants received an intervention (e.g., whether they understood the intervention content and how relevant they thought the intervention was to daily life). Researchers can assess this component of fidelity by tracking attendance at sessions and administering measures such as pre- and postintervention assessments of knowledge gained. It also may be important to assess participant self-efficacy in implementing newly taught behaviors or strategies. Borrelli (2011) described several ways to enhance fidelity of intervention receipt.
Enactment of Skills Gained From the Intervention
The fifth basic component of fidelity pertains to how people apply the intervention content in daily life. Measuring enactment differs from assessing study outcomes because measurement of enactment occurs throughout the study and not just at an endpoint (Borrelli, 2011). An outside observer can assess enactment using a checklist or other objective measurement. For instance, if a goal of an intervention was medication management and treatment involved teaching the participant to organize a pillbox, the ability to organize the pillbox would be a skill involved in treatment receipt, but taking the medication appropriately would be enactment of the skill in daily life (Bellg et al., 2004).
Conclusion
Intervention fidelity is an important aspect of designing and implementing intervention effectiveness studies. Assessment of intervention fidelity not only is important for replication of the study but also provides crucial information to researchers for interpreting the effects of the intervention.
References
Altman, D. G., Schulz, K. F., Moher, D., Egger, M., Davidoff, F., Elbourne, D., et al. (2001). The revised CONSORT statement for reporting randomized trials: Explanation and elaboration. Annals of Internal Medicine, 134, 663–694. Retrieved from http://medicine.johnstrogerhospital.org/cru/images/education/b2dc147afb1f58f18800c9c5d48e984e.pdf
Bellg, A. J., Borrelli, B., Resnick, B., Hecht, J., Minicucci, D. S., Ory, M., et al.; Treatment Fidelity Workgroup of the NIH Behavior Change Consortium. (2004). Enhancing treatment fidelity in health behavior change studies: Best practices and recommendations from the NIH Behavior Change Consortium. Health Psychology, 23, 443–451. http://dx.doi.org/10.1037/0278-6133.23.5.443
Borrelli, B. (2011). The assessment, monitoring, and enhancement of treatment fidelity in public health clinical trials. Journal of Public Health Dentistry, 71, S52–S63. http://dx.doi.org/10.1111/j.1752-7325.2011.00233.x
Boutron, I., Moher, D., Altman, D. G., Schulz, K. F., & Ravaud, P. (2008). Extending the CONSORT statement to randomized trials of nonpharmacologic treatment: Explanation and elaboration. Annals of Internal Medicine, 148, 295–309. Retrieved from www.swmed.edu/utsw/facdir/cma/fileuploads/9/facdir104960/AnnIntMed2008.pdf
Gearing, R. E., El-Bassel, N., Ghesquiere, A., Baldwin, S., Gillies, J., & Ngeow, E. (2011). Major ingredients of fidelity: A review and scientific guide to improving quality of intervention research implementation. Clinical Psychology Review, 31, 79–88. http://dx.doi.org/10.1016/j.cpr.2010.09.007
Gutman, S. A., & Murphy, S. L. (2012). From the Desk of the Editor and Associate Editor—Information commonly unreported in intervention effectiveness studies. American Journal of Occupational Therapy, 66, 7–10. http://dx.doi.org/10.5014/ajot.2012.003673