Table 3.
Author et al. (Year) | Main Study Characteristics (Country, Years of Study, Population, Sample Size) | Aim of the Study | Type of Study | Intervention (Delivery Option, Type, Duration, etc.) | Instrument (Name, Structure, Scale, etc.) | Main Findings | CASP Score |
---|---|---|---|---|---|---|---|
Justham and Timmons (2005) [39] | UK, 2002, post-registration nursing students (n = 292) | To assess the learning and attitudes of post-registration nursing students after a web-based statistics test was used to teach statistics | Quasi-experimental, pre- and post-test study without a control group | A 20-credit module titled ‘Evidence-Based Practice’, in which an online statistics test was administered via a WebCT site. The module addresses issues concerning the critical analysis and evaluation of a variety of sources of evidence relevant to patient/client care | Learning: statistics test administered via WebCT (objective test). Attitude: ‘Evaluation of the WebCT’ questionnaire sent by post (21 items) | Learning: first (pre) practice test mean percentage score 52.1%; second (post) assessed test 92.6% (p = 0.000). Attitude: according to qualitative data, students had positive views of the use of WebCT | 1 |
Long et al. (2016) [26] | USA and Middle East (ME), 2013–2014. Quasi-experimental arm: nursing students enrolled in introductory courses (US RN-BSN and MSN, and ME BSN) (n = 23). RCT arm: undergraduate nutrition and PharmD students (n = 159) | To report the effectiveness of the EBR tool in improving the overall online research and critical appraisal skills of learners engaged in EBP | Mixed-methods, quasi-experimental, and two-population randomized controlled trial (RCT) design | Intervention: EBR tool (an interactive, technology-based tool usable from a computer, smartphone, or iPad to support student acquisition of the online research and critical appraisal skills needed for EBP). Subjects received the same 30-min video training, standardizing the study protocol and explaining how to access and use the tool. The video protocol asked participants to work through all ten steps, open and explore every hyperlink, and answer the embedded questions to help guide the online literature search assignment. A library link specific to each institution was placed within the EBR tool, allowing participants to access their institution’s library | Self-report, EBR tool pre- and post-test. Research questions: does the EBR tool intervention (a) improve the overall research skills of users and (b) improve users’ ability to distinguish the credibility of online source materials? | Quasi-experimental results (nursing students). Research skills: US/BSN, T1 (M) = 3.50 (SD 0.70), T2 (M) = 2.88 (SD 0.98), d = 0.62 (SD 0.81), 95% CI 0.40–0.83, p = 0.001; ME/BSN, T1 (M) = 3.27 (SD 0.93), T2 (M) = 2.32 (SD 0.64), d = 0.95 (SD 0.84), 95% CI 0.58–1.32, p = 0.001. Ability to distinguish the credibility of online sources: US/BSN (58), p = 0.057; ME/BSN (22), p = 0.219; US/MSN (41), p = 0.070 | 3 |
Foronda et al. (2017) [41] | USA, 2016, master’s entry-level nursing students (n = 51) | To examine the impact of an in-class, group virtual simulation exercise on nursing students’ (a) cognitive knowledge of EBP and (b) affective knowledge about how evidence affects clinical decision-making | Quasi-experimental, pre- and post-test study design without a control group | Virtual simulation using the Internet-based CliniSpace platform, followed by a post-test. The entire exercise lasted about 30 min | Self-report questions: cognitive knowledge related to EBP (five multiple-choice questions); affective knowledge about how evidence affects clinical decision-making | Cognitive knowledge related to EBP: median pre-test score 60% (IQR = 20), median post-test score 80% (IQR = 20), p < 0.0001. Affective knowledge about how evidence affects clinical decision-making: pre-test, 35% (agree) and 63% (strongly agree); post-test, 23% rating it a 4 (agree) and 77% (strongly agree); no p value reported | 3 |
Rojjanasrirat and Rice (2017) [42] | USA, 2011–2012, nursing master’s students (n = 63) | To examine whether or not EBP content provided early in an online graduate research/EBP course would change students’ knowledge, attitudes, and practice of EBP from before to after taking the course | Quasi-experimental, pre- and post-test study design without a control group | Intervention: online research/EBP course. This 4-credit-hour graduate nursing course was taught across a 16-week trimester in the first semester of the MSN curriculum. Online classroom strategies were used to teach research content and the EBP process, including weekly asynchronous discussions, case study analysis, quizzes/exams, research critique, and a final EBP project that incorporated all elements of the EBP process | Self-report questionnaire: Evidence-Based Practice Questionnaire (EBPQ), validated by Upton, 2006 [48] (Skill/Knowledge, Attitude, Practice). A measure of facilitators of and barriers to learning EBP concepts, developed by the principal investigator, was also included (14 structured items) | Overall EBPQ mean scores significantly improved after taking the EBP course (t(63) = −9.034, p < 0.001). Practice of EBP: post-test M = 74.06 (SD = 9.04), pre-test M = 55.29 (SD = 11.57), t(63) = −12.78, p = 0.001. Knowledge of EBP: post-test M = 25.23 (SD = 9.9), pre-test M = 23.42 (SD = 9.41), p = 0.79. Attitudes toward EBP: post-test M = 21.11 (SD = 3.51), pre-test M = 20.69 (SD = 3.51), p = 0.43 | 3 |
Acronyms: evidence-based practice (EBP); focused interactive teaching (E-FIT); knowledge, attitudes and behavior (KAB); team-based learning (TBL); evidence-based research (EBR).
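The pre- and post-test contrasts in the Main Findings column are reported as paired-sample statistics (for example, the t values for Rojjanasrirat and Rice [42] and the d values with 95% CIs for Long et al. [26]). As a minimal sketch only, assuming the standard difference-score formulation and not necessarily the exact computation used by the original authors, such paired effect sizes and t statistics are typically obtained as:

```latex
% Sketch of paired-samples (pre/post) statistics, assuming the
% difference-score standardizer; not necessarily the authors' exact method.
d_i = X_{i,\text{pre}} - X_{i,\text{post}}, \qquad
\bar{d} = \frac{1}{n}\sum_{i=1}^{n} d_i, \qquad
s_d = \sqrt{\frac{1}{n-1}\sum_{i=1}^{n}\bigl(d_i - \bar{d}\bigr)^{2}},
\qquad
d = \frac{\bar{d}}{s_d}, \qquad
t_{(n-1)} = \frac{\bar{d}}{s_d/\sqrt{n}}
```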