Med. Sci. Educ. (2016) 26:175–180. DOI 10.1007/s40670-016-0228-9

MONOGRAPH

Using an Instructional Design Model to Teach Medical Procedures

Lawrence Cheung 1,2

Published online: 19 January 2016 © The Author(s) 2016. This article is published with open access at Springerlink.com

Abstract Educators are often tasked with developing courses and curricula that teach learners how to perform medical procedures. This instruction must provide an optimal, uniform learning experience for all learners. If not well designed, this instruction risks being unstructured, informal, variable amongst learners, or incomplete. This article shows how an instructional design model can help craft courses and curricula to optimize instruction in performing medical procedures. Educators can use this as a guide to developing their own course instruction.

Keywords Instructional design model · Gagne’s Theory of Instruction · Medical procedures · Percutaneous chest tube insertion

* Lawrence Cheung
[email protected]

1 Department of Medicine, University of Alberta, Edmonton, AB, Canada

2 Division of Critical Care Medicine, University of Alberta, 3-129 Clinical Sciences Building, 11350 83 Avenue, Edmonton, AB T6G 2G3, Canada

Introduction

When educators design courses to teach learners how to perform medical procedures, they risk providing instruction that may be informal and unstructured [1], taught by supervisors who may lack competence in the procedure themselves [2], and based on instructional methods that are not supported by the medical education literature [3]. This, in turn, can lead to learners attempting procedures with which they are unfamiliar [4] and which they may perform incorrectly [5–7]. To reduce these pitfalls, educators can use an instructional design model when designing a course or curriculum. An instructional design model helps ensure that the learning objectives are clear, the instruction and learning experiences align with the objectives, the learning activities are similar amongst the learners, and the assessments used to determine competence are appropriate [8]. It serves as a blueprint that specifies the type, amount, and order of learning events that will occur [9]. This article will show educators how an instructional design model—in this case, Gagne’s theory of instructional design—can be used to design courses to teach procedures in medicine, using percutaneous chest tube insertion as an example.

Gagne’s Theory of Instructional Design

In Gagne’s theory of instructional design [10], developers of the lesson plan must first determine the type of outcome that the learners must achieve; then, they construct and tailor the instructional events necessary to achieve this outcome [9]. This model has been used to develop instructional plans to teach a variety of procedural [11–13] and cognitive skills [14–18]. Gagne’s theory of instructional design posits five learning outcomes and nine events of instruction.

Gagne’s Five Learning Outcomes

Gagne proposed five types of learning outcomes: attitudes, motor skills, memory or recall, complex or procedural knowledge, and learning strategies [19]. The latter three involve cognitive outcomes, while attitudes and motor skills involve affective and psychomotor outcomes, respectively. Complex or procedural knowledge, in turn, encompasses five subcategories: discriminations, concrete concepts, defined concepts, rules, and higher order rules or problem solving [20]. Using the example of percutaneous chest tube insertion, the learning outcomes would be motor skills (that is, procedural technique) and memory or recall (learning the indications, contraindications, and immediate complications of the procedure).

Gagne’s Nine Events of Instruction

Once educators have identified the learning outcomes, they must construct and organize the instructional events to achieve these learning outcomes. Gagne proposed nine events of instruction: gaining attention, informing the learner of the objectives, stimulating recall of prerequisite learning, presenting the stimulus material, providing learning guidance, eliciting the performance, providing feedback about performance correctness, assessing the performance, and enhancing retention and transfer [20]. Adopting Gagne’s nine events of instruction, we use the following instructional blueprint when teaching our learners (that is, residents in our respirology subspecialty residency program) percutaneous chest tube insertion.

Gaining Attention

Educators first need to gain, and maintain, the learners’ attention so that the latter can focus on the requisite learning. We use a pre-test (and a subsequent post-test near the end of instruction) to grab their attention and promote participation, as learners tend to view this approach favorably [21]. Relating their learning to the workplace also helps gain their attention [22], and we emphasize that learning this procedure is a practical skill required to manage patients during their training and clinical practice. This reinforcement also stimulates their intrinsic motivation and enhances their learning autonomy. Our instructors also judiciously use humor, anecdotes, and case-based examples to emphasize their teaching points, as these have been shown to capture attention [23, 24].

Informing the Learners of the Objectives

After gaining the learners’ attention, educators must state the learning objectives. Akin to what DeSilets [25] calls a “road map” showing the educational destination, learning objectives specify to the learners and instructors the skills that should be achieved and the outcomes that will be assessed [26]. Stated from the learner’s perspective, objectives should use action verbs [27, 28] that describe observable behaviors that the learners need to demonstrate [29–32]. For example, some of our objectives include “List the equipment needed for percutaneous chest tube insertion,” “Insert the introducer needle into the pleural space,” and “Insert the chest tube over the guide wire.” We also ask learners to state their own learning objective(s)—that is, what they intend to learn from the session. By focusing activities on their needs, we can engage and empower them during the learning process [33].

Stimulating Recall of Prerequisite Learning

Stimulating recall of prerequisite knowledge serves many functions. It establishes what the learners already know and reveals deficits in pre-existing knowledge that instructors must fill before further learning occurs. It helps the learners organize this pre-existing knowledge into conceptual schemas that can facilitate learning of new material [34] and activates prior knowledge to improve the information processing that will occur in the subsequent learning [35]. To do this, we review the answers to the initial pre-test that we administered while gaining the learners’ attention—this act of retrieving information enhances subsequent learning and recall. Our pre-test covers the anatomy of the chest wall, lungs, and pleural space; the appearance of a pleural effusion on ultrasound; the diseases that can cause a pleural effusion; and the diagnostic tests needed to ascertain the cause. In addition to lower order questions that test recall (for example, draw the anatomy of the chest wall, lungs, and pleural space), we also use higher order questions. For example, given the relevant anatomy, we ask questions on the complications that can occur during the procedure and the ways to avoid them—these higher order questions can promote deep thinking and learner engagement. We also ask individuals to explain their answers to the group, as this mindful use of prior knowledge facilitates the learning of the person who formulates the explanation [36].

Presenting the Stimulus Material

Here, new information is presented to the learner. Instructors must emphasize important learning points. In the case of teaching a procedural skill, instructors should not only emphasize the actions needed to perform the procedure correctly but also highlight the actions to avoid in order to decrease the risk of adverse events. When teaching percutaneous chest tube insertion, we use small groups of up to five members, as this has been shown to increase learning gains and learner satisfaction compared to larger groups [37]. We review the prerequisites for the procedure—this includes obtaining informed consent and ensuring availability of all the necessary equipment, space, and personnel.

Then, using photographs of each step of the procedure, we explain each step, such as localizing the effusion with ultrasound, positioning the patient appropriately for the procedure, donning the appropriate gowns, masks, and gloves, opening the sterile tray and organizing the equipment, and so on. For each step, we emphasize the correct actions to perform and the incorrect actions to avoid. Before the teaching session, we give each learner a handout outlining the procedure so that they can prepare themselves. To promote deeper learning, we encourage the learners to ask questions, and we invite group members to share their own procedural tips that they have observed in the past.

Providing Learning Guidance

Providing learning guidance involves modeling or showing the learner the correct performance. In some cases, instruction during this phase might be very similar to that provided during presentation of the stimulus material [9]. In our case of procedural learning, providing learning guidance involves a demonstration of the whole procedure—uninterrupted—from start to finish. This integrates all of the individual procedural steps that were taught during presentation of the stimulus. We first play a demonstrational video in its entirety without interruption; then, we replay the video, pausing intermittently to give the learners the opportunity to ask questions or offer comments during the second viewing.

Eliciting the Performance

To elicit performance, learners must be given a chance to practice and demonstrate the skill they are required to learn. Before eventually performing the procedure on patients, each learner is given the opportunity to perform the procedure on a manikin that emulates the chest wall anatomy with a pleural effusion. Learners view practice on a simulator as being as effective as practice on real patients [38], and simulators help learners achieve a variety of performance skills without compromising patient safety [39–46]. Each of our learners takes a turn performing the procedure while the instructor and other learners observe. Organizing the learners into groups of two—with each learner taking a turn performing and critiquing the procedure—can decrease the instructor-to-learner ratio [47]. We let each learner practice the procedure once, while recognizing that more complex procedures benefit from serial deliberate practice and feedback over several sessions [48, 49].

Providing Feedback About Performance Correctness

Practice by itself, without feedback, does not necessarily improve performance, as learners may be unable to accurately assess themselves and determine the improvements they need to make [50–54]. Feedback without the learner’s reflection on how to incorporate it is also unlikely to lead to improvement [55]. Thus, coaching and feedback, coupled with the learner’s self-appraisal, are needed to improve performance [56–59] and enhance future self-assessment ability [60]. This is especially true when the verbal feedback is given by an expert instructor already proficient with the procedural technique [61] and is tailored to the learners’ needs [62]. When providing feedback, our instructors ensure that their feedback includes components that help the learner improve, rather than feedback that is vague or unhelpful. For example, our feedback tries to follow an established pattern in which the instructor observes the learner’s performance, provides advice, and compares this assessment with the learner’s own assessment [63]. We allow our learners to reflect on the feedback during the process, and the feedback is provided in a safe, non-judgemental learning environment. Based on the experience of others, we aim to provide feedback that is specific and timely [64, 65], describes task performance [66–68], and incorporates the learner’s goals and anticipated outcomes [69]. Other learners in the group are also invited to supply feedback.

Assessing the Performance

After the learners have had a chance to improve their performance with feedback and reflection, they must then demonstrate the skill from start to finish, on the manikin, without interruption. Our goal is not to assess final competence, as this relies on more formal standard-setting methods. Rather, our purpose is to determine whether it is appropriate to allow the learner to perform the procedure on patients with ongoing supervision. Here, competence is not a one-time achievement; instead, it is a process, or what Leach [70] refers to as a “habit” of life-long learning, and the learner needs to continue to demonstrate the procedure in the clinical context with supervision [71, 72]. Learners must demonstrate the components of a checklist created by our faculty who have expertise in the procedure. If they complete the items in the proper sequence, they have satisfied our curricular standards for the procedure and can go on to perform it, with supervision, in the clinical setting. Learners who fail to demonstrate the skills in the checklist redo the individual components, with feedback and reflection, before trying again.

Enhancing Retention and Transfer

In this instructional event, retention refers to the learner’s ability to repeat the skill in future settings, and transfer refers to the learner’s ability to adapt these skills to different situations [9].


Much of the groundwork for this instructional event will have been established by the preceding instructional events. For example, retention will be enhanced when active learning occurs while presenting the stimulus material and providing learning guidance. Retention is also enhanced when the learner’s own self-appraisal is incorporated while providing feedback, and when the learner’s educational goals are accommodated while reviewing the learning objectives [55, 73, 74]. At this time, we administer the post-test and compare each learner’s performance to their pre-test to reinforce what they have learned. Furthermore, we give the learners access to the demonstrational video to review in the future before performing the procedure again. This blended learning allows them to review the procedure in the proper clinical context, enhancing retention [75].

To promote transfer of knowledge and skills, we discuss how to modify the procedural technique in a variety of situations, such as when the patient has limited mobility and cannot maneuver into the proper position. We also try to instill an attitude of what Fraser and Greenhalgh [76] refer to as “capability”—that clinicians must adapt to situations which are novel and with which they may feel some uncertainty [77–79].

Program Evaluation

Our program evaluation supports ongoing use of this instructional method. Before we structured our teaching, learners were asked whether the educational material was presented effectively and whether they felt they had acquired adequate procedural skills. Responses ranged from 2 (Disagree) to 5 (Strongly Agree) on a 5-point Likert scale for these two questions. After we implemented our structured teaching, subsequent cohorts of learners consistently reported scores of 4 (Agree) or 5 for both questions. We also began to assess the learners’ procedural performance on an objective structured clinical examination (OSCE) station. On global ratings of performance—with possible scores from 1 to 5 and a score of 3 or higher needed to pass the station—all learners have received scores between 3 and 5.

Conclusion

Using an instructional design model to craft components of the curriculum enables educators to structure the teaching so that all learners have a comparable learning experience. This structure, in turn, helps identify the specific program components that are effective and those that require improvement. While this article uses Gagne’s theory of instruction, many other educational paradigms could be used for procedural instruction, such as Mayer’s instruction based on cognitive load theory [80], Peyton’s four-step approach to procedural instruction [81], and Merrill’s First Principles of Instruction [82], to name a few. Also, while this article has used one procedural skill as an example, an instructional design model can be used to program instruction for a variety of cognitive and procedural skills. Further study is needed to assess the effect that implementing an instructional design model into a teaching program has on workplace (that is, clinical) performance, such as the effect on procedural complication rates or speed.

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

References

1. Mason WT, Strike PW. See one, do one, teach one—is this still how it works? A comparison of the medical and nursing professions in the teaching of practical procedures. Med Teach. 2003;25:664–6.
2. Wickstrom GC, Kelley DK, Keyserling TC, Kolar MM, Dixon JG, Xie SX, et al. Confidence of academic general internists and family physicians to teach ambulatory procedures. J Gen Intern Med. 2000;15:353–60.
3. Levinson AJ. Where is evidence-based instructional design in medical education curriculum development? Med Educ. 2010;44:536–7.
4. Yadla S, Rattigan EM. See one, do one, teach one: competence versus confidence in performing procedures. Virtual Mentor. 2003;5:1–4.
5. Davis JS, Garcia GD, Jouria JM, Wyckoff MM, Alsafran S, Graygo JM, et al. Identifying pitfalls in chest tube insertion: improving teaching and performance. J Surg Educ. 2013;70:334–9.
6. Elsayed H, Roberts R, Emadi M, Whittle I, Shackcloth M. Chest drain insertion is not a harmless procedure—are we doing it safely? Interact Cardiovasc Thorac Surg. 2010;11:745–8.
7. Griffiths JR, Roberts N. Do junior doctors know where to insert chest drains safely? Postgrad Med J. 2005;81:456–8.
8. Piskurich GM. What is this instructional design stuff anyway? In: Piskurich GM, editor. Rapid instructional design: learning ID fast and right. 3rd ed. Hoboken: Wiley; 2015.
9. Okey JR. Procedures of lesson design. In: Briggs LJ, editor. Instructional design: principles and application. 2nd ed. Education Technology Publications; 1991. p. 192–208.
10. Gagné RM. The conditions of learning. New York: Holt, Rinehart and Winston; 1965.
11. Ng JY. Combining Peyton’s four-step approach and Gagne’s instructional model in teaching slit-lamp examination. Perspect Med Educ. 2014;3:480–5.
12. Khadjooi K, Rostami K, Ishaq S. How to use Gagne’s model of instructional design in teaching psychomotor skills. Gastroenterol Hepatol Bed Bench. 2011;4:116–9.
13. Buscombe C. Using Gagne’s theory to teach procedural skills. Clin Teach. 2013;10:302–7.
14. Condell SL, Elliott N. Gagne’s theory of instruction—its relevance to nurse education. Nurse Educ Today. 1989;9:281–4.
15. Coulter MA. A review of two theories of learning and their application in the practice of nurse education. Nurse Educ Today. 1990;10:333–8.
16. Duan Y. Selecting and applying taxonomies for learning outcomes: a nursing example. Int J Nurs Educ Scholarsh. 2006;3:Article 10.
17. Miner A, Mallow J, Theeke L, Barnes E. Using Gagne’s 9 events of instruction to enhance student performance and course evaluations in undergraduate nursing course. Nurse Educ. 2015;40:152–4.
18. Belfield J. Using Gagne’s theory to teach chest X-ray interpretation. Clin Teach. 2010;7:5–8.
19. Gagné RM. The conditions of learning and theory of instruction. 4th ed. New York: Holt, Rinehart and Winston; 1985.
20. Gagne RM. Mastery learning and instructional design. Perform Improv Q. 1988;1:7–18.
21. Cao L, McInnes MD, Ryan JO. What makes a great radiology review course lecture: the Ottawa radiology resident review course experience. BMC Med Educ. 2014;14:22.
22. Jokinen P, Mikkonen I. Teachers’ experiences of teaching in a blended learning environment. Nurse Educ Pract. 2013;13:524–8.
23. Naftulin DH, Ware JEJ, Donnelly FA. The Doctor Fox Lecture: a paradigm of educational seduction. J Med Educ. 1973;48:630–5.
24. Collins J. Education techniques for lifelong learning. RadioGraphics. 2004;24:1185–92.
25. DeSilets LD. Using objectives as a road map. J Contin Educ Nurs. 2007;38:196–7.
26. Grant J. Principles of curriculum design. In: Swanwick T, editor. Understanding medical education: evidence, theory and practice. 2nd ed. West Sussex, UK: Wiley-Blackwell; 2013. p. 31–46.
27. Houlden RL, Frid PJ, Collier CP. Learning outcome objectives. Annals RCPSC. 1998;31:327–32.
28. Mager RF. The qualities of useful objectives. In: Mager RF, editor. Preparing instructional objectives: a critical tool in the development of effective instruction. 3rd ed. Atlanta, GA: The Center for Effective Performance Inc; 1997. p. 43–50.
29. Ballard AL. Getting started. Writing behavioral objectives. J Nurs Staff Dev. 1990;6:40–4.
30. Beitz JM. Developing behavioral objectives for perioperative staff development. AORN J. 1996;64:87–8, 92–5.
31. Ferguson LM. Writing learning objectives. J Nurs Staff Dev. 1998;14:87–94.
32. Wintergalen B, Skupien MB. Writing behavioral objectives for continuing education. Ariz Nurse. 1987;40:6, 15.
33. Beckert L, Wilkinson TJ, Sainsbury R. A needs-based study and examination skills course improves students’ performance. Med Educ. 2003;37:424–8.
34. van Kesteren MT, Rijpkema M, Ruiter DJ, Morris RG, Fernandez G. Building on prior knowledge: schema-dependent encoding processes relate to academic performance. J Cogn Neurosci. 2014;26:2250–61.
35. Verkoeijen PP, Rikers RM, Schmidt HG. The effects of prior knowledge on study-time allocation and free recall: investigating the discrepancy reduction model. J Psychol. 2005;139:67–79.
36. Pressley M, Wood E, Woloshyn VE, Martin V, King A, Menke D. Encouraging mindful use of prior knowledge: attempting to construct explanatory answers facilitates learning. Educational Psychologist. 1992;27:91–109.
37. Kooloos JGM, Klaassen T, Vereijken M, Van Kuppeveld S, Bolhuis S, Vorstenbosch M. Collaborative group work: effects of group size and assignment structure on learning gain, student satisfaction and perceived participation. Med Teach. 2011;33:983–8.
38. Bokken L, Rethans JJ, van Heurn L, Duvivier R, Scherpbier A, van der Vleuten C. Students’ views on the use of real patients and simulated patients in undergraduate medical education. Acad Med. 2009;84:958–63.
39. Hammoud MM, Nuthalapaty FS, Goepfert AR, Casey PM, Emmons S, Espey EL, et al. To the point: medical education review of the role of simulators in surgical training. 2008;199:338–43.
40. Hsu JL, Korndorffer JR Jr, Brown KM. Design of vessel ligation simulator for deliberate practice. J Surg Res. 2015.
41. Joyce KM, Byrne D, O’Connor P, Lydon SM, Kerin MJ. An evaluation of the use of deliberate practice and simulation to train interns in requesting blood products. Simul Healthc. 2015;10:92–7.
42. Kalaniti K, Campbell DM. Simulation-based medical education: time for a pedagogical shift. Indian Pediatr. 2015;52:41–5.
43. Lopreiato JO, Sawyer T. Simulation-based medical education in pediatrics. Acad Pediatr. 2015;15:134–42.
44. Michael M, Abboudi H, Ker J, Shamim Khan M, Dasgupta P, Ahmed K. Performance of technology-driven simulators for medical students—a systematic review. 2014;192:531–3.
45. Thomas GW, Johns BD, Marsh JL, Anderson DD. A review of the role of simulation in developing and assessing orthopaedic surgical skills. Iowa Orthop J. 2014;34:181–9.
46. Udani AD, Macario A, Nandagopal K, Tanaka MA, Tanaka PP. Simulation-based mastery learning with deliberate practice improves clinical performance in spinal anesthesia. Anesthesiol Res Pract. 2014;2014:659160.
47. Cason ML, Gilbert GE, Schmoll HH, Dolinar SM, Anderson J, Nickles BM, et al. Cooperative learning using simulation to achieve mastery of nasogastric tube insertion. J Nurs Educ. 2015;54:S47–51.
48. Bosse HM, Mohr J, Buss B, Krautter M, Weyrich P, Herzog W, et al. The benefit of repetitive skills training and frequency of expert feedback in the early acquisition of procedural skills. BMC Med Educ. 2015;15:22.
49. Dul J, Pieters JM, Dijkstra S. Instructional feedback in motor skill learning. Innovations in Education & Training International. 1987;24:71–6.
50. Burson KA, Larrick RP, Klayman J. Skilled or unskilled, but still unaware of it: how perceptions of difficulty drive miscalibration in relative comparisons. J Pers Soc Psychol. 2006;90:60–77.
51. Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L. Accuracy of physician self-assessment compared with observed measures of competence: a systematic review. JAMA. 2006;296:1094–102.
52. Eva KW, Cunnington JP, Reiter HI, Keane DR, Norman GR. How can I know what I don’t know? Poor self assessment in a well-defined domain. Adv Health Sci Educ Theory Pract. 2004;9:211–24.
53. Hodges B, Regehr G, Martin D. Difficulties in recognizing one’s own incompetence: novice physicians who are unskilled and unaware of it. Acad Med. 2001;76:S87–9.
54. Kruger J, Dunning D. Unskilled and unaware of it: how difficulties in recognizing one’s own incompetence lead to inflated self-assessments. J Pers Soc Psychol. 1999;77:1121–34.
55. Brookhart SM. Successful students’ formative and summative uses of assessment information. Assessment in Education: Principles, Policy & Practice. 2001;8:153–69.
56. Bonrath EM, Dedy NJ, Gordon LE, Grantcharov TP. Comprehensive surgical coaching enhances surgical skill in the operating room: a randomized controlled trial. Ann Surg. 2015.
57. Hamid Y, Mahmood S. Understanding constructive feedback: a commitment between teachers and students for academic and professional development. J Pak Med Assoc. 2010;60:224–7.
58. Strandbygaard J, Bjerrum F, Maagaard M, Winkel P, Larsen CR, Ringsted C, et al. Instructor feedback versus no instructor feedback on performance in a laparoscopic virtual reality simulator: a randomized trial. Ann Surg. 2013;257:839–44.
59. Kruglikova I, Grantcharov TP, Drewes AM, Funch-Jensen P. The impact of constructive feedback on training in gastrointestinal endoscopy using high-fidelity virtual-reality simulation: a randomised controlled trial. Gut. 2010;59:181–5.
60. Srinivasan M, Hauer KE, Der-Martirosian C, Wilkes M, Gesundheit N. Does feedback matter? Practice-based learning for medical students after a multi-institutional clinical performance examination. Med Educ. 2007;41:857–65.
61. Porte MC, Xeroulis G, Reznick RK, Dubrowski A. Verbal feedback from an expert is more effective than self-accessed feedback about motion efficiency in learning new surgical skills. Am J Surg. 2007;193:105–10.
62. Paschold M, Huber T, Zeissig SR, Lang H, Kneist W. Tailored instructor feedback leads to more effective virtual-reality laparoscopic training. Surg Endosc. 2014;28:967–73.
63. Alves de Lima AE. [Constructive feedback. A strategy to enhance learning]. Medicina (B Aires). 2008;68:88–92.
64. Bienstock JL, Katz NT, Cox SM, Hueppchen N, Erickson S, Puscheck EE. To the point: medical education reviews—providing feedback. Am J Obstet Gynecol. 2007;196:508–13.
65. Duffy K. Providing constructive feedback to students during mentoring. Nurs Stand. 2013;27:50–6; quiz 58.
66. Hills L. Giving and receiving constructive feedback: a staff training tool. J Med Pract Manage. 2010;25:356–9.
67. James IA. The rightful demise of the sh*t sandwich: providing effective feedback. Behav Cogn Psychother. 2014;43:1–8.
68. Kilminster S, Cottrell D, Grant J, Jolly B. AMEE Guide No. 27: effective educational and clinical supervision. Med Teach. 2007;29:2–19.
69. Nicol DJ, Macfarlane-Dick D. Formative assessment and self-regulated learning: a model and seven principles of good feedback practice. Studies in Higher Education. 2006;31:199–218.
70. Leach DC. Competence is a habit. JAMA. 2002;287:243–4.
71. Epstein RM, Hundert EM. Defining and assessing professional competence. JAMA. 2002;287:226–35.
72. Epstein RM. Assessment in medical education. N Engl J Med. 2007;356:387–96.
73. Chang A, Chou CL, Teherani A, Hauer KE. Clinical skills-related learning goals of senior medical students after performance feedback. Med Educ. 2011;45:878–85.
74. Kvam PH. The effect of active learning methods on student retention in engineering statistics. The American Statistician. 2000;54:136–40.
75. Hughes G. Using blended learning to increase learner support and improve retention. Teaching in Higher Education. 2007;12:349–63.
76. Fraser SW, Greenhalgh T. Coping with complexity: educating for capability. BMJ. 2001;323:799–803.
77. Rees C, Richards L. Outcomes-based education versus coping with complexity: should we be educating for capability? Med Educ. 2004;38:1203.
78. Rees CE. The problem with outcomes-based curricula in medical education: insights from educational theory. Med Educ. 2004;38:593–8.
79. Plsek PE, Greenhalgh T. The challenge of complexity in health care. BMJ. 2001;323:625–8.
80. Mayer RE. Applying the science of learning to medical education. Med Educ. 2010;44:543–9.
81. Walker M, Peyton JWR. Teaching in the theatre. In: Peyton JWR, editor. Teaching and learning in medical practice. Rickmansworth, UK: Manticore Publishers Europe Limited; 1998. p. 171–80.
82. Merrill MD. First principles of instruction. Educ Technol Res Dev. 2002;50:43–59.
