ABSTRACT
OBJECTIVE To evaluate family physicians’ enjoyment of and knowledge gained from game-based learning, compared with traditional case-based learning, in a continuing medical education (CME) event on stroke prevention and management.
DESIGN An equivalence trial to determine if game-based learning was as effective as case-based learning in terms of attained knowledge levels. Game questions and small group cases were developed. Participants were randomized to either a game-based or a case-based group and took part in the event.
SETTING Ontario provincial family medicine conference.
PARTICIPANTS Thirty-two family physicians and 3 senior family medicine residents attending the conference.
INTERVENTION Participation in either a game-based or a case-based CME learning group.
MAIN OUTCOME MEASURES Scores on 40-item immediate and 3-month posttests of knowledge and a satisfaction survey.
RESULTS Results from knowledge testing immediately after the event and 3 months later showed no significant difference in scoring between groups. Participants in the game-based group reported higher levels of satisfaction with the learning experience.
CONCLUSION Games provide a novel way of organizing CME events. They might provide more group interaction and discussion, as well as improve recruitment to CME events. They might also provide a forum for interdisciplinary CME. Using games in future CME events appears to be a promising approach to facilitate participant learning.
Games are an innovative and challenging educational method.1 They have long been used as a teaching strategy in both child and adult education. They have also been used as a teaching strategy in medical education,2–10 predominantly to review and reinforce lecture material for undergraduate medical students. In contrast, the nursing literature contains examples of games being used as educational tools for practitioners.11–20
It is well known that games can incorporate concepts and principles of adult learning, including promoting self-learning and participation.21 Games also provide the opportunity for learners to reinforce previously learned information and acquire new knowledge.14,22 By involving repetition and allowing important points to be reiterated, games appear to increase retention and application.23 In addition, games are believed to connect theory and practice and to provide the opportunity for immediate feedback.1,14,17 Motivated by the inherent competition, games also provide opportunities for learners to serve as peer teachers, team leaders, and teammates.19 Games encourage interaction among learners, increase learners’ levels of motivation, and enhance the opportunity to learn from others.20

Unlike many other educational formats, game-based learning can bring fun and enjoyment to the learning experience and might encourage greater participation in group learning activities, engaging learners’ emotions as well as their intellects. This format could therefore contribute substantially to a wider repertoire of teaching and learning methods in continuing medical education (CME). Although many authors claim that games are as effective as more traditional educational methods, games have rarely been formally evaluated; positive claims rest largely on anecdotal evidence, and formal evaluations demonstrating that games are as effective a CME teaching and learning strategy as more traditional methods are lacking. The purpose of this study was to determine if a game-based format is as effective as traditional case-based learning for a CME stroke prevention and management program.
METHODS
Ethics approval was obtained from the University of Toronto Research Ethics Board.
Development of materials
Development and validation of the game. The game was based on the board game “Snakes and Ladders.” The board game concept was chosen because it facilitates small group interactivity among learners.6 Twenty-two multiple-choice and true-or-false questions were developed from materials adapted from a nationally accredited educational workshop, “Changing Dynamics of Stroke Prevention and Management.” These questions were read and discussed by each team of participants. Each game involved 3 teams (pairs) of physicians, and 1 trained moderator who facilitated the game, kept time, and had the answers to the game’s questions. A neurologist was also available as a resource to participants.
The game had been previously used with 52 physicians on 4 different occasions. Questions and rules were modified based on these physicians’ suggestions and research team observations.
Development of the cases. Cases were adapted from an accredited learning activity of the University of Ottawa Department of CME, la Fédération des Médecins Omnipracticiens du Québec, and Aventis Inc. These cases were based on results of a needs assessment of 250 family physicians. Case groups consisted of 5 to 7 participants and a trained facilitator. A neurologist was also available as a resource to participants.
The investigators reviewed the game and case questions to ensure that both covered the same materials in a comparable fashion.
Development of the knowledge test. A 40-item multiple-choice knowledge test, which included 4 clinical vignettes and 20 free-standing questions, was developed. Clinical vignettes have been validated as a method for measuring the competence of physicians and the quality of their actual practice.24 The questions were designed to address the topics covered in the game-based and case-based CME sessions. The candidate questions were reviewed by several family physicians and a neurologist. The 3-month posttest consisted of the same questions presented in a different order, with reordering of answer options. All questions are available upon request.
Development of the participant evaluation of the sessions. Participants completed an evaluation that included 11 statements, each rated on a 5-point scale. The statements explored participants’ enjoyment of the event, their subjective learning experiences, and whether they would attend future CME events using the same format.
Video. A 30-minute videotaped lecture by a neurologist on stroke prevention and management was viewed by all participants before playing the game or discussing the cases, providing both groups with a similar baseline of knowledge.
Sample size calculation. Using a Web-based equivalence trial sample size calculator,25 we determined that 25 participants per group would be necessary to determine that the traditional case-based group’s mean score was no more than 2.5 points greater (out of 40) than the game-based group’s mean score (and 17 per group for a difference of no more than 3.0 points). The sample size was calculated for a study power of 80%, a 1-sided α of .05, an assumed overall mean score of 28 (out of 40), and a standard deviation of 3.5. These estimates were informed by the administration of an earlier version of the knowledge test to a convenience sample of 17 primary care physicians.
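The per-group numbers reported above can be reproduced with the standard formula for a 1-sided (non-inferiority) comparison of 2 means, n = 2(z_α + z_β)²σ²/Δ² per group. The following is a minimal sketch using only the Python standard library; the specific Web-based calculator the authors used is not reproduced here, and the function name is illustrative.

```python
from math import ceil
from statistics import NormalDist

def noninferiority_n(delta, sigma, alpha=0.05, power=0.80):
    """Per-group sample size for a 1-sided (non-inferiority)
    comparison of 2 means, assuming equal variances.
    delta: largest acceptable true difference in means
    sigma: assumed common standard deviation"""
    z_a = NormalDist().inv_cdf(1 - alpha)        # 1-sided alpha
    z_b = NormalDist().inv_cdf(power)            # power = 1 - beta
    return ceil(2 * (z_a + z_b) ** 2 * sigma ** 2 / delta ** 2)

print(noninferiority_n(delta=2.5, sigma=3.5))    # 25 per group
print(noninferiority_n(delta=3.0, sigma=3.5))    # 17 per group
```

With the study's assumptions (σ = 3.5, 1-sided α = .05, power = 80%), a margin of 2.5 points gives 25 per group and a margin of 3.0 points gives 17 per group, matching the figures reported.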
Analysis
Another Web-based calculator26 was used to assess differences of proportions for the relevant demographic data (ie, the 2 × 2 tables) and for each of the 11 session-evaluation statements, comparing the proportion in each group “strongly agreeing” with the statement. A Web-based calculator for large contingency tables27 was used to analyze the remaining demographic data (ie, the 3 × 2 tables). The Simple Interactive Statistical Analysis Web-based calculator was used to conduct t tests and to calculate the lower limit of the 1-tailed 95% confidence intervals (CIs) for the comparison of mean scores.28 Owing to concerns about the distribution of the data, between-group comparisons of knowledge test scores and session evaluations were also conducted with the nonparametric Wilcoxon test, using the Institute of Phonetic Sciences’ Web-based statistical calculator.29 Because the parametric and nonparametric tests yielded similar results, only the parametric test results are presented. Statistical significance (ie, the α level) was set at .05 for all comparisons.
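For illustration, the differences-of-proportions comparisons can be approximated with a pooled 2-proportion z test, sketched below in Python with illustrative counts (eg, 16 of 17 vs 9 of 17 participants strongly agreeing). This is a sketch only: the exact method implemented by the cited Web-based calculator is not specified, and an exact or continuity-corrected test would yield somewhat different P values from this normal approximation.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z(x1, n1, x2, n2):
    """2-sided z test for a difference of 2 proportions,
    using the pooled standard error (no continuity correction).
    Returns the approximate P value."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)               # pooled proportion
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Illustrative counts only, not taken from the study tables
p_value = two_proportion_z(16, 17, 9, 17)
```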
The event
The CME event took place at an annual meeting of the Ontario College of Family Physicians in Toronto, Ont. Participants were recruited through e-mails sent before the conference and posters displayed at the conference; all physicians attending the conference were invited to attend. Participants were randomized in blocks of 6 to either the game-based or the case-based format (Figure 1). All participants viewed the 30-minute video, after which the 2 groups’ CME sessions ran simultaneously for 1 hour. In both formats there was 1 facilitator for each group of 6 participants, plus a neurologist in each of the 2 rooms. Facilitators were provided with a short explanation of the best answer for each question used in the sessions to reinforce participant learning.
Study design
RESULTS
Participants
Despite the participant incentives ($50 plus 2 family medicine texts) and extensive recruitment efforts, fewer than the target number of participants were recruited: 32 family physicians and 3 senior family medicine residents took part. Eighteen participants were randomized to game-based learning (3 groups of 6) and 17 to case-based learning (2 groups of 6 and 1 group of 5). The demographic questionnaire, completed by each participant, demonstrated the comparability of the 2 groups (Table 1).
Participant demographics
Knowledge test
Immediate posttest. On average, the game-based group scored 1.6 points lower (out of 40) than the case-based group (P = .24; lower limit of 95% CI −3.8) (Table 2).
Comparison of game-based and case-based test results by time of testing: The game-based group scored on average 1.6 points lower (P = .24; lower limit of 95% CI −3.8) than the case-based group on the immediate posttest and 0.3 points lower (P = .83; lower limit of 95% CI −2.5) on the 3-mo posttest.
Three-month posttest. Thirty-one (89%) of the 35 participants completed this test (15 in the game-based group and 16 in the case-based group). On average, the game-based group scored 0.3 points lower (out of 40) than the case-based group (P = .83; lower limit of 95% CI −2.5) (Table 2).
Participants’ evaluation of the session
At the conclusion of the CME session, and after completing the immediate posttest of knowledge, all but one of the participants completed the session-evaluation questionnaire. Game-based participants more frequently chose “strongly agree” (5 on the 5-point scale) for many of the statements (Table 3). A higher proportion of game-based versus case-based participants strongly agreed that the event was enjoyable (94% vs 53%; P = .02), that their attention was high throughout the event (88% vs 41%; P = .012), and that they would register for a similar event in the future (82% vs 41%; P = .034). The comments about the CME event were more strongly positive from the game-based participants (Table 4).
Results of participant evaluation of the session, comparing game-based and case-based participants responding “strongly agree” for each statement
Written comments from participants on the session evaluation
DISCUSSION
This study found, on average, that participants in the traditional case-based learning group scored slightly higher on the immediate posttest of knowledge than the game-based group participants did, but this difference was not statistically significant. Further, 3 months following the CME session both groups had very comparable scores on the knowledge test. Participant evaluations demonstrated that game-based participants reported higher satisfaction with the event than their case-based counterparts did, which could mean that the game-based CME format has the potential to enhance recruitment for future game-based CME events.
While the immediate posttest results suggested that the traditional case-based format led to higher levels of knowledge (as much as a 3.8/40 or 9.5% higher score on the knowledge test), the knowledge levels were similar on the 3-month posttest of knowledge (consistent with the case-based group’s knowledge being no more than 2.5/40 or 6.3% greater than that of the game-based group).
This narrowing of the difference between the 2 groups appears largely attributable to a greater decline in knowledge scores at 3 months among the case-based group. This observation raises the possibility that the game-based format leads to less long-term degradation of knowledge than the traditional case-based format does. This differential in knowledge “decay” between the 2 educational formats deserves further exploration in future studies of game-based and case-based CME.
Strengths and limitations
The sample size was smaller than we had hoped for, largely because of the challenge of recruiting participants. This might have been partly a result of the time required to participate (a total of 3 hours on the night of the event plus the 3-month posttest). It should be noted, however, that a game-based CME session without the evaluation components or video could be completed in half that time. Despite recruiting fewer participants than originally planned, the range of knowledge score differences suggested that the case-based group’s 3-month posttest knowledge scores would be no more than 6.3% (ie, 2.5 questions out of 40) greater than those of the game-based group. Finally, as is frequently the case in studies evaluating the effect of CME, this study measured only knowledge level and made no assessment of changes in participants’ clinical practice following the session.
Two study strengths were the use of an experimental design, which enhanced study group comparability, and the inclusion of a 3-month posttest, which enabled assessment of longer-term knowledge retention.
Conclusion
Presenting educational materials in a dynamic, innovative manner is a constant challenge for medical educators. This study found that knowledge gained through game-based learning was comparable to that gained through case-based learning, particularly 3 months after the event. In addition, game-based participants reported the educational experience as more enjoyable. These findings should encourage CME providers to consider using educational games more frequently. Future studies of this format should strive to include a larger number of participants. The interactivity and teamwork involved in this form of learning appear to make games a promising format for interprofessional continuing education.
Notes
EDITOR’S KEY POINTS
- Educators strive to identify innovative ways of providing continuing medical education (CME) programs. Learning through games is engaging and challenging, and there are many possible applications of this strategy for CME for health professionals.
- This study found that the long-term knowledge gained by family physicians through game-based learning was equivalent to that gained through traditional case-based learning, but participants enjoyed the game-based learning more and rated their level of satisfaction higher than those participating in case-based learning did.
- Game-based learning provides a novel and effective way of delivering CME programming.
Footnotes
- This article has been peer reviewed.
- Contributors: All authors contributed to the design of the project and the educational event; Drs Rothman and Harvey analyzed the data; Drs Telner, Bujas-Bobanovic, Marlow, and Harvey and Mr Chester drafted the manuscript; and all authors revised and approved the final version of the manuscript.
- Competing interests: Funding for this project was obtained from an unrestricted educational grant from Sanofi-aventis.
- Copyright © the College of Family Physicians of Canada