Jim Hinojosa; The Evidence-Based Paradox. Am J Occup Ther 2013;67(2):e18–e23. https://doi.org/10.5014/ajot.2013.005587
Many occupational therapy practitioners consider evidence-based practice (EBP) to be the means by which occupational therapy can prove the validity of its services and thus support the legitimacy of our profession. The unquestioned acceptance of EBP as the way to establish credibility concerns me; unchallenged acceptance of any idea concerns me. Do practitioners accept EBP as the paradigm for guiding occupational therapy practice and research solely because it is presented as what we must do? I believe that practitioners must examine the implications for our profession of accepting EBP without question. In this article, I review EBP, present criticisms and concerns voiced by other professions, and, finally, examine the implications of adopting an EBP perspective that replaces theory-directed practice.
As the British Medical Journal was dropped onto their doormats, little did doctors realise on that autumnal Saturday morning in 1948 that they were witnessing a momentous event. The first medical randomised control trial had appeared on the world scene, and a new era of medicine had begun. The experiment was quite straightforward. Patients with tuberculosis, unaware they were taking part in a clinical study, were randomised to receive streptomycin or nothing. With a plunge of a syringe, 3000 years of medical treatment based on experience and perceived effectiveness were at an end. (p. 1824)
patients and nurses are living, thinking, feeling and acting individuals, relating to one another in different ways and situations, where communication in a broad sense always takes place in time and space. The trusted notion that truth corresponds to objective reality does not take into account the fact that acting is contextual, that interpretation always is at stake, and that it is difficult (impossible) to produce generalizability from one group of nurses or patients to another (external validity). (p. 38)
Understand the strengths and limitations of EBP in order to use it effectively to guide practice. Practitioners need to acknowledge EBP as a tool we can use in our clinical decision making; it is not the principal way to identify useful and meaningful interventions.
Use the findings of quantitative research as one form of evidence to inform practice and acknowledge the importance of qualitative research, clinical knowledge, and the uniqueness of each client.
Continue to develop a broader and more inclusive system for the evaluation and use of evidence, such as a hierarchy of evidence that reflects the profession, building on the work of Tomlin and Borgetto (2011) or an alternative model.
Learn and understand the language of EBP, such as the number needed to treat (NNT; Armstrong, 2007; Cook & Sackett, 1995), a statistic that clearly informs us about the real effect of an intervention. The NNT identifies the number of clients one needs to treat for one additional beneficial outcome (Walter, 2001); a brief worked example follows these recommendations. Using this measure of statistical and clinical significance, we can share our clinical uncertainty honestly with clients to support our philosophical belief in client-centered care. Researchers also need to conduct studies with samples large enough to detect the real effects of intervention.
Prepare for the inevitable arguments that payers will use to limit payment for services on the basis of a lack of evidence that meets EBP standards as they attempt to cut costs or justify rationing decisions.
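To illustrate how the NNT works, consider a purely hypothetical example; the figures below are illustrative and are not drawn from any study cited in this article. The NNT is the reciprocal of the absolute risk reduction (ARR), the difference between the proportion of clients who achieve a beneficial outcome with the intervention and the proportion who achieve it without. If 60% of clients improve with the intervention and 40% improve without it, then

\[
\mathrm{ARR} = 0.60 - 0.40 = 0.20, \qquad \mathrm{NNT} = \frac{1}{\mathrm{ARR}} = \frac{1}{0.20} = 5 .
\]

In other words, on average five clients would need to receive the intervention for one additional client to achieve a beneficial outcome, a figure that can be shared with clients in plain terms when discussing the likely benefit of a course of intervention.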