Research Article
Issue Date: September/October 2015
Published Online: November 02, 2015
Updated: April 30, 2020
Impact of a Curricular Change on Perceived Knowledge, Skills, and Use of Evidence in Occupational Therapy Practice: A Cohort Study
Author Affiliations
  • Teal W. Benevides, PhD, OTR/L, is Assistant Professor, Department of Occupational Therapy, College of Health Professions, Thomas Jefferson University, Philadelphia, PA; teal.benevides@jefferson.edu
  • Tracey Vause-Earland, MS, OTR/L, is Assistant Professor, Department of Occupational Therapy, College of Health Professions, Thomas Jefferson University, Philadelphia, PA
  • Robert Walsh, MS, OTR/L, is Occupational Therapist Resident, The Johns Hopkins Hospital, Baltimore, MD. At the time of the study, he was a Teaching Associate, Department of Occupational Therapy, Thomas Jefferson University, Philadelphia, PA
American Journal of Occupational Therapy, November 2015, Vol. 69, 6912185010. https://doi.org/10.5014/ajot.2015.018416
Abstract

OBJECTIVE. We evaluated the impact of a curriculum revision that emphasized experiential use of evidence in clinical environments on occupational therapy graduates’ attitudes, perceived knowledge and skill, and use of evidence in practice.

METHODS. We used a retrospective cohort design to compare two curriculum cohorts of recent graduates exposed to different evidence-based practice (EBP) educational approaches. Responses on a validated survey of attitudes, knowledge/skill, and use of evidence in practice were compared using t tests and Mann–Whitney U tests for Cohort 1 (n = 63) and Cohort 2 (n = 62) graduates.

RESULTS. Findings suggest similar attitudes and use of evidence between cohorts; Cohort 2 reported statistically greater perceived knowledge of and skill in EBP.

CONCLUSIONS. Emphasis on experiential learning in school, with reinforcement of skills in clinical learning environments, was not sufficient to change graduates’ use of evidence. Although the curriculum revision improved perceived knowledge and skill, our findings suggest that system-level or other factors may influence use of evidence after graduation.

Evidence-based practice (EBP) has evolved as a client-centered approach to decision making that traditionally begins with a clinical question, followed by appraisal of research evidence, application of evidence to clients on the basis of individual needs, and evaluation of clinical outcomes with reflection (Haynes, 2002). Use of evidence by health care professionals improves health care delivery (Lugtenberg, Burgers, & Westert, 2009) and increases professional identity (Law & MacDermid, 2008). EBP is highly valued and considered important to health care delivery (Bennett et al., 2003; Humphris, Littlejohns, Victor, O’Halloran, & Peacock, 2000; Jette et al., 2003), but clinicians perceive it as time consuming and difficult (Bennett et al., 2003; Humphris et al., 2000; Jette et al., 2003; Lyons, Brown, Tseng, Casey, & McDonald, 2011). Occupational therapy accreditation standards state that graduates should be prepared to access, appraise, and use the latest evidence supporting the efficacy of occupational therapy practice (American Occupational Therapy Association, 2012). Implementing effective educational approaches to instilling positive attitudes and knowledge and reinforcing use of EBP in practice is therefore of paramount importance for educators of occupational therapy graduate students and similarly trained health professionals.
Understanding barriers to and supports of practitioners’ use of EBP is relevant for educators. The largest barriers practitioners identify are lack of time, lack of resources to search for relevant literature, poor organizational support, lack of skills in interpreting statistical analyses, poor ability to evaluate the quality of research, and decreased competence in applying findings to clinical practice (Curtin & Jaramazovic, 2001; Humphris et al., 2000; Karlsson & Törnquist, 2007; Lyons et al., 2011). Of these identified barriers, those related to practitioner skill and competence may be addressed in entry-level education. Factors related to improved use of evidence include perceived confidence in skills and optimism regarding EBP (McCluskey, 2004), previous training in database use and critical appraisal, greater educational qualifications, and fewer years of clinical experience (Bennett et al., 2003; Jette et al., 2003). These studies have suggested that practitioners need to spend time learning, practicing, and experiencing the process of EBP to gain confidence.
Few studies have explicitly examined how occupational therapists’ training or education influences knowledge about the use of evidence in practice. Two articles (McCluskey & Lovarini, 2005; Welch & Dawson, 2006) evaluated practitioner training approaches to improve clinician skills and use of evidence. Welch and Dawson (2006) suggested that collaborative learning groups and peer support resulted in greater perceived confidence in use of evidence and transfer of skill into practice (N = 5), but McCluskey and Lovarini (2005) found that an EBP workshop with outreach support improved knowledge and skill (N = 114) but did not improve actual use of evidence in practice.
A systematic review of the medical school literature revealed differences in the impact of didactic versus experiential learning on student attitudes, knowledge, skill, and use of evidence. Didactic teaching methods were found to improve only knowledge in postgraduates, not attitudes or use, but experiential learning in clinical practice was beneficial for improving attitudes and use of evidence in day-to-day practice (Coomarasamy & Khan, 2004).
Limited research has evaluated the educational approaches used to train occupational therapy graduate students before entry into practice and whether these approaches change knowledge, skill, and use of evidence. The purpose of this study was to examine the impact of an occupational therapy curricular change on recent entry-level master’s-degree graduate students’ attitudes, perceived knowledge of and skill in EBP, and self-reported use of evidence in clinical settings.
Theoretical Framework
Rogers’ (1983)  Diffusion of Innovations theory provides a framework for understanding how an innovation, such as the process of EBP, is recognized and ultimately used by parties who would benefit from its implementation. Diffusion is “the process by which an innovation is communicated through certain channels over time among the members of a social system” (Rogers, 1983, p. 11). Diffusion of Innovations theory has four main components: innovation, communication channel, social system, and time (Rogers, 1983). The innovation, EBP, is a practice perceived as new by the person. The communication channel is the sharing of information with another to reach mutual understanding. The social system must support joint problem solving to accomplish the goal of adoption, which is influenced by social structure, values, and culture. Creating a culture in which students are comfortable and encouraged to ask questions and evaluate current practice is an important component of diffusion. The Diffusion of Innovations theory also recognizes the time dimension in adopting an innovation (Rogers, 1983). Time affects how the innovation is presented, communicated, and valued in an organization.
Rogers (1983)  labeled five stages in the diffusion of an innovation (the innovation–decision process): knowledge, persuasion, decision, implementation, and confirmation. Students are typically provided with knowledge and, in the persuasion stage, with positive social attitudes and a rationale for adoption with didactic learning. However, the decision, implementation, and confirmation stages require explicit action rather than passive receipt of information. To facilitate the decision, implementation, and confirmation stages, it is important to provide an opportunity in which students are explicitly required to use EBP and receive positive feedback and encouragement in its use, for example, during actual clinical experiences while still in school, in contrast to an educational model that solely provides students with knowledge and persuasive arguments in favor of EBP but does not require its implementation while students are in clinical practice.
Aim and Research Questions
This study examined the impact of an explicit curricular revision in EBP content, viewed through the lens of the Diffusion of Innovations theory (Table 1). Two naturally occurring graduate cohorts were compared; both cohorts received knowledge of and persuasion to use EBP, but Cohort 2 received additional support for decision, implementation, and confirmation of EBP in multiple courses. Our research questions were as follows:
  1. What is the difference in attitude, perceived knowledge and skill, and reported use of research evidence between occupational therapy graduates of Cohort 2 and graduates of Cohort 1?

  2. Do the Cohort 2 graduates with more experience in searching, interpreting, and applying research evidence report different research skill barriers to engaging in EBP?

Table 1.
Curricular Comparison of EBP Content for Cohorts 1 and 2

Steps in the EBP Process and Cohort | No. of Times a Stage of the Diffusion of Innovations Theory Was Emphasized in a Course for Each Step in the EBP Process

Ask a clinical question
  Cohort 1: K
  Cohort 2: K K K K P P P P D D I I C C
Acquire research evidence from databases
  Cohort 1: K K K
  Cohort 2: K K K K P P P D D I I C C
Appraise research evidence critically
  Cohort 1: K K D D
  Cohort 2: K K K K P P P D D I I C C
Apply evidence in clinical practice
  Cohort 1: D D D D D D
  Cohort 2: D D D I I I C C C
Assess or reflect on application of evidence
  Cohort 1: D
  Cohort 2: D D D I I I C C C

Note. Some courses emphasized more than one stage (e.g., knowledge and persuasion). Stages of the Diffusion of Innovations Theory (Rogers, 2003) are as follows: K = knowledge, P = persuasion, D = decision, I = implementation, and C = confirmation. EBP = evidence-based practice.
Method
Design and Participants
We implemented a retrospective cohort design using survey methodology to examine differences in graduates’ attitudes, perceived knowledge and skill, and reported use of evidence in practice between two existing graduate cohorts of master’s degree students completing an occupational therapy curriculum within the past 5 yr. Ethics approval to conduct the study was obtained from the university’s institutional review board before the procedures were implemented.
Survey Instrument
Questions regarding attitudes toward and knowledge and use of EBP were adapted with permission from the Evidence-Based Practice (EBP) Questionnaire, a survey instrument with established test–retest reliability and content validity (Jette et al., 2003). Survey questions were modified for occupational therapy practice but retained the original order of the published survey’s attitude, knowledge, and use questions.
The survey instrument contained 55 items and took approximately 20 min to complete. We used SurveyMonkey (SurveyMonkey, Palo Alto, CA; https://www.surveymonkey.com/) to collect and store data, which were imported into IBM SPSS (Version 20.0, IBM Corp., Armonk, NY) for analysis at the conclusion of the survey period. The main survey contained questions on three areas: (1) attitudes and beliefs regarding EBP, (2) perceived knowledge of and skill in using EBP, and (3) self-reported frequency of use of evidence in practice (including use of literature in practice, frequency of database searches, and frequency of reading research articles). Respondents were asked to indicate their top three barriers to engaging in EBP on a checklist and via an open-ended “other” response. Additional questions, included in the survey but not described in this article, ascertained respondents’ access to literature at home and in their clinical setting (four items) and use of clinical practice guidelines (six items). Interested readers are referred to the original survey for information on survey wording (Jette et al., 2003).
Survey design features were used to reduce respondent bias and missing responses. Negatively worded items were embedded with positively worded items to reduce positive response bias for attitude items. We used skip logic in SurveyMonkey to reduce the time burden for respondents, and we placed demographic questions at the end of the survey to enhance responses on the EBP questions before survey fatigue.
Curricular Differences and Educational Intervention
Graduates of both cohorts had the same faculty at the time of their education and had similar exposure to a variety of clinical sites during their training. Primary EBP changes to the revised curriculum included (1) inclusion of didactic and case-based learning experiences in searching for, critiquing, and applying evidence in all intervention courses; (2) embedding an asynchronous online EBP course taken concurrently with full-time clinical fieldwork experiences to foster and encourage the use of evidence in practice; and (3) changing the final capstone master’s project from a research-based project to a systematic review of the literature based on a clinical question. The two cohorts thus had quantitatively and qualitatively different learning experiences regarding EBP.
Other non-EBP curricular changes included (1) shortened program length (e.g., Cohort 2 was in a six-semester master’s degree program, and Cohort 1 was in a seven-semester master’s degree program); (2) reduction of the number of elective credits required to graduate for Cohort 2 (3 credits) compared with Cohort 1 (9 credits); (3) inclusion of three 1-day patient simulation experiences while on campus for Cohort 2; and (4) inclusion of formal interprofessional coursework for Cohort 2. We chose to examine graduates within the past 5 yr to minimize the impact of clinical expertise on knowledge, attitudes, and perceived use of evidence in practice. It has been suggested that graduates evolve past the novice stage approximately 5 yr after graduation (Rassafiani, Ziviani, Rodger, & Dalgleish, 2008).
Power Analysis and Sampling
We conducted power analysis using G*Power 3.1.3 (Faul, Erdfelder, Lang, & Buchner, 2007) to estimate sample size to detect differences between two groups on three outcome variables (attitudes, knowledge/skill, use) using analysis of variance. With a small effect size of .25, power of .80, and an α set to .05, the sample size was estimated at 128 participants (64 in each group). The available sampling frame included 264 graduates who had graduated within the past 5 yr, of whom 245 had valid email addresses (Cohort 1, n = 135; Cohort 2, n = 110).
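The reported estimate can be reproduced by hand: for two groups, Cohen’s f = .25 is equivalent to a standardized mean difference of d = .50, and a closed-form normal-approximation formula gives roughly the same answer as G*Power’s exact noncentral calculation. A minimal Python sketch, for illustration only (the authors used G*Power):

```python
from statistics import NormalDist

def n_per_group(d, alpha=0.05, power=0.80):
    """Approximate sample size per group for a two-sided two-group
    comparison with standardized effect size d (normal approximation)."""
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)  # 1.96 for alpha = .05
    z_beta = z(power)           # 0.84 for power = .80
    return 2 * ((z_alpha + z_beta) / d) ** 2

# Cohen's f = .25 with two groups corresponds to d = .50
print(round(n_per_group(0.5)))  # 63; the exact t-based calculation
                                # rounds up to 64 per group (N = 128)
```

The normal approximation slightly underestimates the t-based requirement, which is why software reports 64 per group rather than 63.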
The sample was recruited over a 6-wk period through email and online mechanisms in place to communicate with alumni. To increase response rates, participants were invited to enter their name in a raffle for an iPad 2. Participants were not required to complete the survey to enter the drawing.
Data Analysis
Data were downloaded in SPSS format and stored offline for analysis. Negatively worded items were reverse coded. All Likert-scale items used for analysis resulted in item scores ranging from 1 (poorest attitude, least knowledge, or least use) to 5 (most positive attitude, most knowledge, or most use). Non-Likert knowledge items were collapsed from a four-item response category to a three-item response category, in which the “do not need to understand” and “don’t understand but would like to” categories were combined, consistent with Jette et al. (2003). Use items were based on frequency of reported use.
After data-cleaning procedures, Likert-scale items were summed to allow comparison of responses between groups using parametric statistics. The attitude sum score was based on 9 Likert-scale items (range = 9–45). The knowledge/skill sum score was based on 7 Likert-scale items and 8 items reflecting knowledge/understanding on a 3-point scale (range = 15–59). The use sum score was based on 3 items (range = 3–15). Using summed Likert-scale items is deemed appropriate if the final scale is thought to reflect the underlying construct of interest (Polit & Beck, 2008) and, because of the robustness of parametric approaches, it is considered statistically valid in educational research (Norman, 2010). We examined normality and homogeneity of variance of the summary score distributions before using parametric statistics. The use summary score was positively skewed, and thus we used a Mann–Whitney U test to examine differences in groups on this variable. We used analysis of variance to examine cohort differences in sum scores for attitude and knowledge/skill. We set an α of .05 for all analyses. No adjustment for multiple comparisons was made for primary analyses, which is consistent with the level of research in this area (Rothman, 1990).
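The scoring pipeline and the nonparametric comparison can be sketched in Python. This is an illustrative re-implementation with made-up scores, not the authors’ SPSS analysis; the ANOVA on the attitude and knowledge/skill sum scores is omitted, and the Mann–Whitney normal approximation leaves out the tie correction for brevity:

```python
from statistics import NormalDist

def reverse_code(score, scale_max=5):
    """Reverse-code a negatively worded Likert item so that
    5 always means the most positive response."""
    return scale_max + 1 - score

def mann_whitney_u(x, y):
    """Mann-Whitney U with a normal approximation for the p value.
    Ties receive average ranks; the tie correction to the variance
    is omitted for brevity."""
    combined = sorted((v, i) for i, v in enumerate(x + y))
    ranks = [0.0] * len(combined)
    i = 0
    while i < len(combined):  # assign average ranks across tied runs
        j = i
        while j + 1 < len(combined) and combined[j + 1][0] == combined[i][0]:
            j += 1
        for k in range(i, j + 1):
            ranks[combined[k][1]] = (i + j) / 2 + 1
        i = j + 1
    n1, n2 = len(x), len(y)
    u1 = sum(ranks[:n1]) - n1 * (n1 + 1) / 2  # first n1 indices belong to x
    u = min(u1, n1 * n2 - u1)
    mu = n1 * n2 / 2
    sigma = (n1 * n2 * (n1 + n2 + 1) / 12) ** 0.5
    p = min(1.0, 2 * NormalDist().cdf((u - mu) / sigma))
    return u, p

# Hypothetical summed "use" scores (3 items, possible range 3-15)
cohort1 = [6, 7, 9, 5, 8, 10, 7]
cohort2 = [8, 6, 9, 7, 10, 9, 8]
u, p = mann_whitney_u(cohort1, cohort2)
print(u, round(p, 2))
```

In practice a library routine (e.g., SPSS or SciPy) would also apply the tie correction and, for small samples, an exact p value.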
After primary analyses, we conducted unplanned secondary analyses using χ2 to examine differences in frequency of response on specific items between the cohorts. For items with low cell sizes for item categories, response was collapsed into two categories, and Fisher’s Exact Test (FET) was used. Positively worded Likert-scale items were collapsed into agreement (agree and strongly agree) and disagreement (neutral, disagree, and strongly disagree). Negatively worded items were collapsed into agreement (neutral, agree, and strongly agree) and disagreement (disagree and strongly disagree).
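The collapse-and-FET step can be illustrated with a small exact-test sketch. The function below implements the standard two-sided Fisher’s Exact Test for a 2 × 2 table from the hypergeometric distribution (the authors ran their analyses in SPSS); the example counts are the collapsed “EBP places unreasonable demand” frequencies from Table 3:

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher's Exact Test p value for the 2 x 2 table
    [[a, b], [c, d]]: sum the probabilities of all hypergeometric
    outcomes no more likely than the observed table."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    denom = comb(n, col1)
    def prob(x):
        return comb(row1, x) * comb(n - row1, col1 - x) / denom
    p_obs = prob(a)
    lo = max(0, col1 - (n - row1))   # smallest feasible count in cell a
    hi = min(row1, col1)             # largest feasible count in cell a
    return sum(prob(x) for x in range(lo, hi + 1)
               if prob(x) <= p_obs * (1 + 1e-9))

# Collapsed counts in the style of Table 3, "EBP places unreasonable
# demand": Cohort 1, 39 disagree / 24 other; Cohort 2, 49 / 13
print(round(fisher_exact_2x2(39, 24, 49, 13), 2))
```

Note that "two-sided" can be defined several ways for Fisher's test; this minimum-likelihood definition is the one most statistical packages use.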
Results
We obtained a total of 132 survey responses. Two participants started but did not complete the survey, and 5 participants did not provide an answer regarding year of graduation, resulting in a final sample of 125 participants (Cohort 1, n = 63; Cohort 2, n = 62). The total response rate was high (53.8%).
Graduates in Cohort 1 were significantly older (mean [M] = 29.16 yr, standard deviation [SD] = 2.93) than graduates in Cohort 2 (M = 27.37 yr, SD = 3.91), t (121) = 2.88, p = .005. We found no significant differences in gender representation between Cohort 1 (95.2% women) and Cohort 2 (88.1% women), FET, p = .20. Both groups had more women than men, which is consistent with the population of occupational therapists. We found no differences between the first and second cohort in frequency of respondents enrolled in doctoral courses (3.2% and 1.6%, respectively, FET, p = .51). Both cohorts had similar levels of membership in occupational therapy professional practice organizations at the national or state level (59.7% and 68.3% for Cohorts 1 and 2, respectively), χ2(1) = 0.99, p = .32. Cohort 1 had significantly more graduates who were clinical fieldwork supervisors (68.3%) than Cohort 2 (38.7%), χ2(1) = 10.43, p = .001, representing the expected finding that graduates with less than 1 yr of experience were not permitted to be clinical fieldwork supervisors. Additional practice characteristics for each cohort are shown in Table 2.
Table 2.
Comparison of Cohort Practice Characteristics

Characteristic                        Cohort 1 (n = 63), n (%)   Cohort 2 (n = 62), n (%)   χ2(1)   p
Practice area^a                                                                             14.67   .001*
  Children and youth                  34 (54.0)                  12 (19.4)
  Productive aging                    5 (7.9)                    8 (12.9)
  Rehab, disability, participation    24 (38.1)                  39 (62.9)
  Mental health                       0 (0)                      1 (1.6)
  Academic and fieldwork education    0 (0)                      1 (1.6)
  Missing                             0 (0)                      1 (1.6)
Practice location                                                                           0.40    .82
  Rural                               6 (9.5)                    5 (8.1)
  Urban                               27 (42.9)                  23 (37.1)
  Suburban                            30 (47.6)                  32 (51.6)
  Missing                             0 (0)                      2 (3.2)
No. of full-time OTs at facility                                                            1.28    .74
  <5                                  32 (50.8)                  29 (46.8)
  5–10                                12 (19.0)                  15 (24.2)
  11–15                               7 (11.1)                   8 (12.9)
  >15                                 12 (19.0)                  8 (12.9)
  Missing                             0 (0)                      2 (3.2)
Hours worked per week                                                                       0.75    .86
  <20                                 3 (4.8)                    3 (4.8)
  20–30                               2 (3.2)                    1 (1.6)
  31–40                               28 (44.4)                  24 (38.7)
  >40                                 30 (47.6)                  33 (53.2)
  Missing                             0 (0)                      1 (1.6)
Clients seen per day^b                                                                      1.64    .65
  <5                                  10 (15.9)                  5 (8.1)
  5–10                                44 (69.8)                  46 (74.2)
  11–15                               8 (12.7)                   8 (12.9)
  >15                                 1 (1.6)                    1 (1.6)
  Missing                             0 (0)                      2 (3.2)

Note. Percentages may not total 100 because of rounding. OTs = occupational therapists.
^a As a result of cells with cell counts <5, comparison was made between the first three categories (Cohort 1, n = 63; Cohort 2, n = 59).
^b As a result of cells with cell counts <5, comparison was made after collapsing the last two categories.
* Significant at the p ≤ .01 level.
Research Question 1: Differences in Overall Attitudes, Perceived Knowledge/Skill, and Use
Cohort 1 demonstrated an overall attitude summary score (M = 31.02, SD = 4.17) similar to that of Cohort 2 (M = 32.23, SD = 3.64), F(1, 123) = 2.98, p = .09. We found a statistically significant difference between the cohorts on the overall knowledge/skill summary score, with Cohort 1 reporting lower knowledge/skill (M = 39.40, SD = 5.4) than Cohort 2 (M = 41.49, SD = 5.1), F(1, 122) = 4.93, p = .03. We found no statistically significant difference in the overall use summary score between cohorts (U = 1,951.5, p = .99).
Secondary Item Analyses of Attitude, Knowledge/Skill, and Use Items
Secondary item analysis of individual attitude items revealed several positively worded attitude statements that were endorsed more frequently by graduates of Cohort 2 (Table 3). Similarly, several items on the negatively worded attitude response set were disagreed with more frequently (representing more positive attitude) by graduates of Cohort 2 (Table 3). Secondary item analysis of individual knowledge items revealed that Cohort 2 was significantly more likely than Cohort 1 to endorse specific statements regarding knowledge and understanding of research concepts (Table 4; see also Supplemental Table 1, available at http://otjournal.net; navigate to this article, and click on “Supplemental”). Item analysis of individual use items revealed no differences in endorsement of frequency of searching the literature, frequency of reading research literature related to practice, or frequency of use of literature to make clinical decisions between cohorts (Supplemental Table 2).
Table 3.
Attitude Items Indicating Positive View of EBP

Survey Item                                                           Cohort 1 (n = 63), n (%)   Cohort 2 (n = 62), n (%)   FET, two-tailed p
Agreement with positively worded attitude items
  EBP is necessary.                                                   59 (93.6)                  59 (95.1)                  .99
  Literature is useful in my daily practice.                          51 (80.9)                  47 (75.8)                  .52
  EBP improves patient care.                                          58 (92.0)                  62 (100)                   .06
  EBP will improve my reimbursement.                                  11 (17.5)                  13 (21.0)                  .66
  EBP helps me make decisions.                                        50 (79.4)                  52 (83.9)                  .65
Disagreement with negatively worded attitude items
  EBP places unreasonable demand.                                     39 (61.9)                  49 (79.0)                  .05*
  EBP does not take limitations of my practice setting into account.  16 (25.4)                  14 (22.6)                  .84
  Strong evidence is lacking to support my interventions.             21 (33.3)                  23 (37.0)                  .71
  EBP does not take patient preferences into account.                 17 (27.0)                  28 (45.2)                  .04*

Note. Agreement includes both agree and strongly agree categories, collapsed for frequency analysis. Disagreement includes both disagree and strongly disagree categories, collapsed for frequency analysis. EBP = evidence-based practice; FET = Fisher’s Exact Test.
* Significant at the p ≤ .05 level.
Table 4.
Agreement With Knowledge and Skill Items

Survey Item                                             Cohort 1 (n = 63), n (%)   Cohort 2 (n = 61), n (%)   FET, one-tailed p
Learned foundations of EBP in school                    63 (100)                   60 (98.4)                  .49
Received search strategy training in school             61 (96.8)                  59 (96.7)                  .68
Familiar with medical search engines (e.g., MEDLINE)    60 (95.2)                  61 (100)                   .13
Confident in my ability to find literature              43 (68.3)                  51 (83.6)                  .04*
Learned critical appraisal skills in school             57 (90.5)                  61 (100)                   .02*
Confident in critical review skills                     38 (60.3)                  45 (73.8)                  .08
Gained additional EBP skills outside school             26 (41.3)                  26 (42.6)                  .51

Note. Agreement includes both agree and strongly agree categories, collapsed for frequency analysis. EBP = evidence-based practice; FET = Fisher’s Exact Test.
* Significant at the p ≤ .05 level.
Research Question 2: Differences in Reported Barriers Between Cohorts
We found no differences in frequency of reported barriers between Cohorts 1 and 2 (Supplemental Table 3). The most frequently reported barrier for both cohorts was lack of time (92% of both cohorts). The top three barriers for both cohorts were lack of time, resources, and generalizable findings within studies. The barriers reported least frequently (less than approximately 10% of each cohort) were those associated with graduates themselves: lack of critical appraisal skills, interest, or search skills.
Discussion
Our first main finding suggests that the curricular change did not result in improved attitudes, but both cohorts had overall positive attitudes toward EBP, confirming existing literature (Bennett et al., 2003; Humphris et al., 2000; Jette et al., 2003). Curricula for both cohorts heavily emphasized the Diffusion of Innovations stages of knowledge and persuasion for EBP, both of which likely influenced the positive attitudes toward EBP.
Our second main finding was that Cohort 2 displayed greater self-reported knowledge/skill and understanding of research concepts. Unplanned secondary analyses revealed that Cohort 2 reported greater perceived knowledge in searching databases for existing literature, learning critical appraisal skills in school, and understanding terms such as meta-analysis, heterogeneity, and publication bias. Our findings are consistent with other research findings that a multicomponent EBP intervention (curriculum) leads to improved perceived knowledge (Coomarasamy & Khan, 2004; McCluskey & Lovarini, 2005).
Although the content on interpretation of statistics remained the same in both cohorts, Cohort 2 was exposed to additional didactic and practical experience in formal coursework and in full-time clinical fieldwork in developing a replicable search strategy, using a variety of databases to search the literature, and interpreting and synthesizing results of multiple articles on a clinical topic of interest. Rogers’ (2003)  Diffusion of Innovations theory describes three types of knowledge: (1) awareness–knowledge (recognition that EBP exists), (2) principle–knowledge (how and why EBP works), and (3) how-to knowledge (using an innovation correctly), all of which are essential to adoption. Awareness–knowledge and principle–knowledge were stressed in both curricula. Greater emphasis was placed on how-to knowledge in the revised curriculum by practicing EBP in real-world settings.
The third finding showed no significant difference between cohorts in use of EBP. Monthly database use was low: Only 19.1% and 27.4% of respondents from the first and second cohorts, respectively, reported performing a database search more than once in the past month. A majority of respondents reported reading two to five articles per month (50.8%–54.8%), and a majority in both cohorts (58.1%–63.5%) reported using research evidence to inform clinical decision making. Our findings suggest that both cohorts in our study read more literature than has been reported elsewhere: More than 50% of our sample were reading literature, in contrast to the findings of McCluskey and Lovarini (2005) that approximately 40% of their participants were reading any research literature at follow-up. A study by Salls, Dolhi, Silverman, and Hansen (2009) of approximately 900 licensed Pennsylvania occupational therapists found that only 12% of therapists reported using electronic databases more than once a month, 32.7% reported reading articles more than once a month, and 24.2% reported using literature once a month or more to inform clinical decisions. Graduates from both curricula in our study exceeded this reported use of literature, suggesting that recent graduates as a whole are more likely to engage in EBP, consistent with findings from other authors (e.g., Dysart & Tomlin, 2002). Although many studies, including ours, have focused on the amount of evidence use in practice, no studies have evaluated the relationship between actual use of evidence and improved client outcomes in occupational therapy; this relationship should be investigated in the future.
Our second question investigated whether graduates with more experience in searching, interpreting, and applying research evidence reported fewer research skill barriers to engaging in EBP. Although no statistically significant differences were found between our cohorts, the similar pattern of barriers experienced by both cohorts is somewhat consistent with the literature: Systemic and organizational factors are among the most frequently reported barriers to implementation and confirmation of EBP (Curtin & Jaramazovic, 2001; Humphris et al., 2000; Karlsson & Törnquist, 2007; Lyons et al., 2011).
The barriers to EBP can be examined through the four components of the Diffusion of Innovations theory: the presentation and accessibility of the innovation, how EBP is communicated, the social system or organization in which the person works, and the time dimension (Rogers, 1983). Although Cohort 2 engaged in EBP during 6 mo of full-time clinical fieldwork, they continued to report difficulties with system barriers after graduation. The top reported barriers were system and organizational factors, and these factors may supersede practitioners’ belief in the importance of EBP and their perceived skill in performing it. Time and tools to engage in EBP must be valued and recognized as best practice within the organization. To increase the rate of adoption, supports for EBP, such as access to scientific evidence via open-access peer-reviewed journals and freely available databases, have previously been recommended (Funk, Tornquist, & Champagne, 1995). Health care administrators must demonstrate their belief in the value of EBP by creating a culture that promotes scientific inquiry and by providing opportunities to search, appraise, and evaluate the effects of EBP.
In contrast to the existing literature (Lyons et al., 2011; Thomas, Saroyan, & Dauphinee, 2011), our respondents did not highly endorse barriers reflecting characteristics of the innovation, the adopter, or communication channels. Fewer than one-third of respondents identified barriers related to skill or knowledge, and fewer than half in both cohorts reported that the evidence was not applicable to their clients. Our research suggests that future interventions aimed at improving use of EBP in practice should address systemic barriers in the workplace that prevent practitioners from engaging in these behaviors, which is especially relevant in the context of health care reform. Health care reform will demand greater productivity from practitioners and, possibly, decrease budgets for health care organizations such as hospitals, which will need to emphasize solutions that maximize efficient use of evidence.
Limitations
Our analyses assume differences between the cohorts on the basis of their educational history. However, other noneducational influences may have affected our graduates’ attitudes, knowledge/skill, and use of evidence after they entered clinical practice. Clinical expertise may have been a confounding variable, although we explicitly sampled practitioners within 5 yr of graduation from our curriculum to limit the impact of clinical expertise on EBP attitude, knowledge/skill, and use (Rassafiani et al., 2008).
History threats to internal validity (e.g., additional training, increased clinical experience) may have influenced our results, although we demonstrated through item analysis that there were no differences between cohorts in perceived knowledge gained through noncurricular educational experiences or in the frequency of practitioners who had pursued doctoral coursework. Other researchers may want to examine longitudinal change in attitude, knowledge, and use of EBP during the student phase, after graduation, and several years after graduation. Although we used a reliable and valid survey instrument to collect data (Jette et al., 2003), the use summary score comprised only three items, fewer than the number of items for the other areas studied, which may have affected the findings.
Implications for Occupational Therapy Education
Students in health professional academic programs learn the core components of EBP in the classroom, but practicing health professionals report limited use of evidence in practice. Our study suggests the following:
  • Educators should embed experiential learning in the curriculum to enhance student knowledge of the EBP process.

  • Outside factors after graduation appear to have an impact on the use of evidence in practice, regardless of student use of evidence in clinical learning environments.

  • Additional inquiry regarding system barriers in practice is warranted, as are intervention studies in the workplace to improve evidence use in practice.

  • Practitioners have a responsibility to advocate for resources that would improve their ability to use evidence in practice, including requests for time to review and critique literature as part of the daily workload and for access to databases or other sources of literature.

Conclusion
Similar to the findings of other studies, our curriculum revision had a greater influence on perceived knowledge and skill than on sustainable EBP behavior, even with the revised curriculum’s emphasis on the decision, implementation, and confirmation stages of EBP use. Our results indicate that practitioners in Cohort 2 valued the use of evidence and had greater knowledge and skills to engage in research activities but showed no significant difference from Cohort 1 in use of evidence. Positive attitude and knowledge alone are not sufficient to ensure EBP in real-world settings, even when supported by experiential learning and reinforcement of skills in clinical practice.
Acknowledgments
We thank the alumni who gave their time and participated in this study. Funding for the iPad2 raffle was provided by the Dean’s Office in the Jefferson School of Health Professions at Thomas Jefferson University in Philadelphia, PA. Dean and Chair Janice Burke, who developed and led the effort to evolve curricular content in the occupational therapy programs and provided feedback on this article, also deserves recognition for her support of this project. Finally, we thank the graduate research assistants who contributed to this study, especially Sarah Merhaut and Alexandra Wolfgang-Price.
References
American Occupational Therapy Association. (2012). 2011 Accreditation Council for Occupational Therapy Education (ACOTE®) standards. American Journal of Occupational Therapy, 66, S6–S74. http://dx.doi.org/10.5014/ajot.2012.66S6
Bennett, S., Tooth, L., McKenna, K., Rodger, S., Strong, J., Ziviani, J., . . . Gibson, L. (2003). Perceptions of evidence-based practice: A survey of Australian occupational therapists. Australian Occupational Therapy Journal, 50, 13–22. http://dx.doi.org/10.1046/j.1440-1630.2003.00341.x
Coomarasamy, A., & Khan, K. S. (2004). What is the evidence that postgraduate teaching in evidence based medicine changes anything? A systematic review. BMJ, 329, 1017–1022. http://dx.doi.org/10.1136/bmj.329.7473.1017
Curtin, M., & Jaramazovic, E. (2001). Occupational therapists’ views and perceptions of evidence-based practice. British Journal of Occupational Therapy, 64, 214–222. http://dx.doi.org/10.1177/030802260106400502
Dysart, A. M., & Tomlin, G. S. (2002). Factors related to evidence-based practice among U.S. occupational therapy clinicians. American Journal of Occupational Therapy, 56, 275–284. http://dx.doi.org/10.5014/ajot.56.3.275
Faul, F., Erdfelder, E., Lang, A. G., & Buchner, A. (2007). G*Power 3: A flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behavior Research Methods, 39, 175–191. http://dx.doi.org/10.3758/BF03193146
Funk, S. G., Tornquist, E. M., & Champagne, M. T. (1995). Barriers and facilitators of research utilization: An integrative review. Nursing Clinics of North America, 30, 395–407.
Haynes, R. B. (2002). What kind of evidence is it that evidence-based medicine advocates want health care providers and consumers to pay attention to? BMC Health Services Research, 2, 3. http://dx.doi.org/10.1186/1472-6963-2-3
Humphris, D., Littlejohns, P., Victor, C., O’Halloran, P., & Peacock, J. (2000). Implementing evidence-based practice: Factors that influence the use of research evidence by occupational therapists. British Journal of Occupational Therapy, 63, 516–522. http://dx.doi.org/10.1177/030802260006301102
Jette, D. U., Bacon, K., Batty, C., Carlson, M., Ferland, A., Hemingway, R. D., . . . Volk, D. (2003). Evidence-based practice: Beliefs, attitudes, knowledge, and behaviors of physical therapists. Physical Therapy, 83, 786–805.
Karlsson, U., & Törnquist, K. (2007). What do Swedish occupational therapists feel about research? A survey of perceptions, attitudes, intentions, and engagement. Scandinavian Journal of Occupational Therapy, 14, 221–229. http://dx.doi.org/10.1080/11038120601111049
Law, M. C., & MacDermid, J. (2008). Evidence-based rehabilitation: A guide to practice. Thorofare, NJ: Slack.
Lugtenberg, M., Burgers, J. S., & Westert, G. P. (2009). Effects of evidence-based clinical practice guidelines on quality of care: A systematic review. Quality and Safety in Health Care, 18, 385–392. http://dx.doi.org/10.1136/qshc.2008.028043
Lyons, C., Brown, T., Tseng, M. H., Casey, J., & McDonald, R. (2011). Evidence-based practice and research utilisation: Perceived research knowledge, attitudes, practices and barriers among Australian paediatric occupational therapists. Australian Occupational Therapy Journal, 58, 178–186. http://dx.doi.org/10.1111/j.1440-1630.2010.00900.x
McCluskey, A. (2004). New graduates: Experts in evidence-based practice. Australian Occupational Therapy Journal, 51, 173. http://dx.doi.org/10.1111/j.1440-1630.2004.00473.x
McCluskey, A., & Lovarini, M. (2005). Providing education on evidence-based practice improved knowledge but did not change behaviour: A before and after study. BMC Medical Education, 5, 40. http://dx.doi.org/10.1186/1472-6920-5-40
Norman, G. (2010). Likert scales, levels of measurement and the “laws” of statistics. Advances in Health Sciences Education: Theory and Practice, 15, 625–632. http://dx.doi.org/10.1007/s10459-010-9222-y
Polit, D. F., & Beck, C. T. (2008). Nursing research: Generating and assessing evidence for nursing practice (8th ed.). Philadelphia: Wolters Kluwer/Lippincott Williams & Wilkins.
Rassafiani, M., Ziviani, J., Rodger, S., & Dalgleish, L. (2008). Occupational therapists’ decision-making in the management of clients with upper limb hypertonicity. Scandinavian Journal of Occupational Therapy, 15, 105–115. http://dx.doi.org/10.1080/11038120701645425
Rogers, E. M. (1983). Diffusion of innovations. New York: Free Press.
Rogers, E. M. (2003). Diffusion of innovations (5th ed.). New York: Free Press.
Rothman, K. J. (1990). No adjustments are needed for multiple comparisons. Epidemiology, 1, 43–46. http://dx.doi.org/10.1097/00001648-199001000-00010
Salls, J., Dolhi, C., Silverman, L., & Hansen, M. (2009). The use of evidence-based practice by occupational therapists. Occupational Therapy in Health Care, 23, 134–145. http://dx.doi.org/10.1080/07380570902773305
Thomas, A., Saroyan, A., & Dauphinee, W. D. (2011). Evidence-based practice: A review of theoretical assumptions and effectiveness of teaching and assessment interventions in health professions. Advances in Health Sciences Education: Theory and Practice, 16, 253–276. http://dx.doi.org/10.1007/s10459-010-9251-6
Welch, A., & Dawson, P. (2006). Closing the gap: Collaborative learning as a strategy to embed evidence within occupational therapy practice. Journal of Evaluation in Clinical Practice, 12, 227–238. http://dx.doi.org/10.1111/j.1365-2753.2005.00622.x
Table 1.
Curricular Comparison of EBP Content for Cohorts 1 and 2

Steps in the EBP Process and Cohort | No. of Times a Stage of the Diffusion of Innovations Theory Was Emphasized in a Course for Each Step in the EBP Process
Ask a clinical question
  Cohort 1: K
  Cohort 2: K K K K P P P P D D I I C C
Acquire research evidence from databases
  Cohort 1: K K K
  Cohort 2: K K K K P P P D D I I C C
Appraise research evidence critically
  Cohort 1: K K D D
  Cohort 2: K K K K P P P D D I I C C
Apply evidence in clinical practice
  Cohort 1: D D D D D D
  Cohort 2: D D D I I I C C C
Assess or reflect on application of evidence
  Cohort 1: D
  Cohort 2: D D D I I I C C C

Note. Some courses emphasized more than one stage (e.g., knowledge and persuasion). Stages of the Diffusion of Innovations Theory (Rogers, 2003) are as follows: K = knowledge, P = persuasion, D = decision, I = implementation, and C = confirmation. EBP = evidence-based practice.
Table 2.
Comparison of Cohort Practice Characteristics

Characteristic | Cohort 1 (n = 63), n (%) | Cohort 2 (n = 62), n (%)
Practice area^a: χ²(1) = 14.67, p = .001*
  Children and youth | 34 (54.0) | 12 (19.4)
  Productive aging | 5 (7.9) | 8 (12.9)
  Rehab, disability, participation | 24 (38.1) | 39 (62.9)
  Mental health | 0 (0) | 1 (1.6)
  Academic and fieldwork education | 0 (0) | 1 (1.6)
  Missing | 0 (0) | 1 (1.6)
Practice location: χ²(1) = 0.40, p = .82
  Rural | 6 (9.5) | 5 (8.1)
  Urban | 27 (42.9) | 23 (37.1)
  Suburban | 30 (47.6) | 32 (51.6)
  Missing | 0 (0) | 2 (3.2)
No. of full-time OTs at facility: χ²(1) = 1.28, p = .74
  <5 | 32 (50.8) | 29 (46.8)
  5–10 | 12 (19.0) | 15 (24.2)
  11–15 | 7 (11.1) | 8 (12.9)
  >15 | 12 (19.0) | 8 (12.9)
  Missing | 0 (0) | 2 (3.2)
Hours worked per week: χ²(1) = 0.75, p = .86
  <20 | 3 (4.8) | 3 (4.8)
  20–30 | 2 (3.2) | 1 (1.6)
  31–40 | 28 (44.4) | 24 (38.7)
  >40 | 30 (47.6) | 33 (53.2)
  Missing | 0 (0) | 1 (1.6)
Clients seen per day^b: χ²(1) = 1.64, p = .65
  <5 | 10 (15.9) | 5 (8.1)
  5–10 | 44 (69.8) | 46 (74.2)
  11–15 | 8 (12.7) | 8 (12.9)
  >15 | 1 (1.6) | 1 (1.6)
  Missing | 0 (0) | 2 (3.2)

Note. Percentages may not total 100 because of rounding. OTs = occupational therapists.
^a As a result of cells with cell counts <5, comparison was made between the first three categories (Cohort 1, n = 63; Cohort 2, n = 59).
^b As a result of cells with cell counts <5, comparison was made after collapsing the last two categories.
*Significant at the p ≤ .01 level.
Table 3.
Attitude Items Indicating Positive View of EBP

Survey Item | Cohort 1 (n = 63), n (%) | Cohort 2 (n = 62), n (%) | FET, two-tailed p
Agreement with positively worded attitude items
  EBP is necessary. | 59 (93.6) | 59 (95.1) | .99
  Literature is useful in my daily practice. | 51 (80.9) | 47 (75.8) | .52
  EBP improves patient care. | 58 (92.0) | 62 (100) | .06
  EBP will improve my reimbursement. | 11 (17.5) | 13 (21.0) | .66
  EBP helps me make decisions. | 50 (79.4) | 52 (83.9) | .65
Disagreement with negatively worded attitude items
  EBP places unreasonable demand. | 39 (61.9) | 49 (79.0) | .05*
  EBP does not take limitations of my practice setting into account. | 16 (25.4) | 14 (22.6) | .84
  Strong evidence is lacking to support my interventions. | 21 (33.3) | 23 (37.0) | .71
  EBP does not take patient preferences into account. | 17 (27.0) | 28 (45.2) | .04*

Note. Agreement includes both agree and strongly agree categories, collapsed for frequency analysis. Disagreement includes both disagree and strongly disagree categories, collapsed for frequency analysis. EBP = evidence-based practice; FET = Fisher’s Exact Test.
*Significant at the p ≤ .05 level.
Table 4.
Agreement With Knowledge and Skill Items

Survey Item | Cohort 1 (n = 63), n (%) | Cohort 2 (n = 61), n (%) | FET, one-tailed p
Learned foundations of EBP in school | 63 (100) | 60 (98.4) | .49
Received search strategy training in school | 61 (96.8) | 59 (96.7) | .68
Familiar with medical search engines (e.g., MEDLINE) | 60 (95.2) | 61 (100) | .13
Confident in my ability to find literature | 43 (68.3) | 51 (83.6) | .04*
Learned critical appraisal skills in school | 57 (90.5) | 61 (100) | .02*
Confident in critical review skills | 38 (60.3) | 45 (73.8) | .08
Gained additional EBP skills outside school | 26 (41.3) | 26 (42.6) | .51

Note. Agreement includes both agree and strongly agree categories, collapsed for frequency analysis. EBP = evidence-based practice; FET = Fisher’s Exact Test.
*Significant at the p ≤ .05 level.
Supplemental Material