Medical Teacher, 2015, 1–7, Early Online

Continuous quality improvement in an accreditation system for undergraduate medical education: Benefits and challenges*

BARBARA BARZANSKY¹, DAN HUNT², GENEVIÈVE MOINEAU³, DUCKSUN AHN⁴, CHI-WAN LAI⁵, HOLLY HUMPHREY⁶ & LINDA PETERSON⁷

¹American Medical Association, USA; ²Association of American Medical Colleges, USA; ³Association of Faculties of Medicine of Canada, Canada; ⁴Korean Institute of Medical Education and Evaluation, Republic of Korea; ⁵Taiwan Medical Accreditation Council, Taiwan; ⁶University of Chicago, USA; ⁷Committee on Accreditation of Canadian Medical Schools, Canada


Abstract

Background: Accreditation reviews of medical schools typically occur at fixed intervals and result in a summative judgment about compliance with predefined process and outcome standards. However, reviews that occur only periodically may not be optimal for ensuring prompt identification and remediation of problem areas.

Aims: To identify the factors that affect the ability to implement a continuous quality improvement (CQI) process for the interval review of accreditation standards.

Methods: Case examples from the United States, Canada, the Republic of Korea and Taiwan were collected and analyzed to determine the strengths and challenges of the CQI processes implemented by a national association of medical schools and several medical school accrediting bodies. The CQI process at a single medical school also was reviewed.

Results: A functional CQI process should be focused directly on accreditation standards so as to result in the improvement of educational quality and outcomes, be feasible to implement, avoid duplication of effort and have both commitment and resource support from the sponsoring entity and the individual medical schools.

Conclusions: CQI can enhance educational program quality and outcomes if the process is designed to collect relevant information and the results are used for program improvement.

Practice points

• Medical schools should engage in interim review of their compliance with the accreditation standards of their country and act on the results in order to support a culture of continuous quality improvement (CQI).
• Accrediting bodies or associations that require or encourage CQI should be resourced to provide support through the creation of central evaluation tools and the provision of training.

Introduction

Accreditation is a process that recognizes educational institutions or programs based on their achieving defined standards or criteria (Blanch 1959). It provides notice to the public, students and future applicants and other stakeholder groups that an accredited institution or program has met, and continues to meet, these standards (American Psychological Association 2014). The development of accreditation systems for the review of medical schools is becoming more common around the world. The Foundation for Advancement of International Medical Education and Research (FAIMER) has collected information on medical school accrediting bodies from 106 countries. This information is available on the FAIMER website in the Directory of Organizations that Recognize/Accredit Medical Schools (DORA) (FAIMER 2014). Accreditation typically occurs at set intervals. Of the 106 countries with accrediting bodies listed in DORA, 58 provided information about the interval at which full reviews for accreditation occur. Fifty countries reported a fixed term for accreditation of medical schools, ranging from four to 10 years (mode = 5). Eight countries reported that the term of accreditation varied or was unspecified, and 48 countries did not provide information on the term of accreditation (FAIMER 2014). Regular accreditation reviews may be followed by required follow-up reports on steps taken to address identified areas of noncompliance with accreditation standards. The value of accreditation as a mechanism for quality assurance and improvement has been asserted and, at some level, demonstrated (van Zanten et al. 2008, 2012b; Boulet & van Zanten 2014).

Correspondence: Barbara Barzansky, PhD, MHPE, American Medical Association, 330 North Wabash Avenue, Suite 39300, Chicago, Illinois 60611, USA. Tel: 1 312 464 4690. Fax: 1 312 224 6912. E-mail: [email protected]

*This paper was adapted from a symposium presented at the 2014 meeting of the Association for Medical Education in Europe, Milan, Italy.

ISSN 0142-159X print/ISSN 1466-187X online/15/000001–7 © 2015 Informa UK Ltd. DOI: 10.3109/0142159X.2015.1031735



Table 1. Characterization of continuous quality improvement (CQI) processes.

Review managed by | Results reported to/Acted on by | Timing | All standards/Some standards
LCME | Accrediting body | Yearly | Some
AFMC | School | Mid-cycle | All
KIMEE | Accrediting body | 2 years after full review | Some
TMAC | Accrediting body | Yearly | Some
Single medical school (Univ. of Chicago, USA) | School | Quarterly/Yearly | Some

Abbreviations: LCME (Liaison Committee on Medical Education). AFMC (Association of Faculties of Medicine of Canada). KIMEE (Korean Institute of Medical Education and Evaluation). TMAC (Taiwan Medical Accreditation Council).

For example, in the United States, the upcoming review by the Liaison Committee on Medical Education (LCME) led one US medical school to involve faculty and students in a "change management" approach that resulted in "institutional transformation". The deadline imposed by the survey visit was cited as a stimulus to the identification of problems and the prompt development of solutions (Chandran et al. 2013). However, accreditation reviews that occur at intervals may not be optimal to support ongoing quality assurance and improvement activities related to the educational process and outcomes of the medical school. Rather, an ongoing approach that is linked to accreditation standards has been suggested as more valuable (Bishop 2004). Given that accreditation standards are considered to be indicators of institutional quality, review of compliance at short and regular intervals could support the creation of an internal culture of quality (Al-Shehri & Al-Alwan 2013). An interim review process would, therefore, be a useful supplement to the summative accreditation judgment of an external agency (i.e. the accrediting body) that occurs at fixed intervals. In the context of accreditation, the authors define continuous quality improvement (CQI) as a process both to monitor compliance with accreditation standards in the interval between full accreditation reviews and to act on the results. Based on the analysis of case examples, the authors provide summary recommendations to facilitate the implementation of CQI at local and national levels.

Options for the structure of CQI

A CQI process can be categorized based on several design variables. These variables are framed as questions that planners of a CQI process will need to address:

• Breadth of the review: Does the CQI review include all standards or just selected standards? If the latter, how are the specific standards chosen?
• Timing of review: When is the CQI review conducted? For example, is the review done at the mid-point of the accreditation cycle, yearly or at some other interval?
• Managing body: Is the CQI review conducted and managed by the accrediting body or by some other entity, such as an association of medical schools or the individual medical school?
• Formative or summative purpose: Who receives the results of the CQI process and how are the results used? Are the results used by the school solely for self-improvement or does the accrediting body receive and use the results to take an accreditation action?

How these questions are answered affects the structure of the resulting CQI program.
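As an illustration only, these four design variables can be thought of as a small structured record. The sketch below encodes them as a Python dataclass and fills it in with the AFMC row from Table 1; the class and field names are assumptions introduced for this example and are not terminology used by any accrediting body or in this paper.

```python
# Illustrative sketch only: the four design variables captured as a small
# structured record. Class and field names are assumptions made for this
# example; they are not terminology used by any accrediting body.

from dataclasses import dataclass

@dataclass(frozen=True)
class CQIProcessDesign:
    managed_by: str           # accrediting body, national association, or the school itself
    results_acted_on_by: str  # who receives and acts on the results
    timing: str               # e.g. yearly, mid-cycle, every two years
    breadth: str              # "all standards" or "some standards"
    formative_only: bool      # True if results are not used for accreditation actions

# Example populated from the AFMC row of Table 1.
afmc_interim_review = CQIProcessDesign(
    managed_by="AFMC (national association)",
    results_acted_on_by="School",
    timing="Mid-cycle",
    breadth="all standards",
    formative_only=True,
)
print(afmc_interim_review)
```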

Examples of CQI processes at the national level

This section contains examples of CQI processes as used in four countries and summarizes the strengths of the approach and its challenges, as identified by the authors. Table 1 categorizes each of the examples, based on the design variables described above. In all cases, the CQI review is linked, directly or indirectly, to the expectations for educational process and outcomes as contained in accreditation standards.

The Liaison Committee on Medical Education

The Liaison Committee on Medical Education (LCME), founded in 1942, is jointly sponsored by the American Medical Association (AMA) and the Association of American Medical Colleges (AAMC). The LCME accredits the 141 medical education programs leading to the MD degree in the United States; it does not accredit colleges of osteopathic medicine. Accreditation of Canadian medical schools is carried out in collaboration with the Committee on Accreditation of Canadian Medical Schools (CACMS). US MD-granting medical schools undergo a full accreditation review, including a self-study and a survey visit, every eight years. The LCME's interim review process applies to US medical schools, as there is a different process in effect in Canada. The LCME process aims to identify areas where medical schools may be out of compliance with standards in the period between full surveys, so as to allow prompt correction of deficiencies. The LCME has based its interim reviews on data from annual questionnaires that are developed and sent to LCME-accredited medical schools by staff from the LCME's sponsoring associations, the AMA and the AAMC. These questionnaires collect quantitative data on variables that have relevance to LCME accreditation standards, such as entering medical school class size, numbers of faculty, medical student tuition, graduating student debt and medical school finances. Each year, the AAMC generates a document, titled the Longitudinal Statistical Summary Report, which includes eight-year trend data on these and other variables.



The LCME has set benchmarks for changes in these variables over a set period of time (for example, a 10% increase in entering class size in one year or a 10% decrease in the number of faculty over the same time period). A report is generated annually for each medical school noting the variables where the benchmark was exceeded. The LCME secretariat staff then contacts medical schools with specific questions about the impact of the change. For example, LCME accreditation element 4.1 (sufficiency of faculty) requires that there be a sufficient cohort of faculty to support the medical education program and the other missions of the medical school (LCME 2014). Schools with a decrease in faculty numbers that exceeds the benchmark would be asked to provide evidence that the educational program had not been negatively affected. The school's response would be reviewed by the LCME for compliance with the relevant standard. If a decision of noncompliance was made, the school would be asked for follow-up on how the deficiency was addressed. In addition to this CQI system that is managed by LCME staff, the LCME has introduced a new accreditation requirement that schools have an internal CQI process to "ensure effective monitoring of the medical education program's compliance with accreditation standards" (LCME 2014). This expectation, codified in standards, is going into effect for schools being reviewed beginning in July 2015.
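The benchmark screen described above is essentially a threshold check on year-over-year trend data. The following is a minimal sketch of how such a screen could be implemented; the variable names, data structure and example values are assumptions made for illustration, and only the two 10% benchmarks come from the text. It does not represent the LCME's actual tooling.

```python
# Hypothetical sketch of a benchmark screen of the kind described above.
# The variable names, thresholds dictionary and example data are assumptions;
# only the two example benchmarks (a 10% change) come from the text.

ANNUAL_BENCHMARKS = {
    # variable: flagged when the one-year relative change crosses this value
    "entering_class_size": 0.10,   # flag a >10% increase in one year
    "faculty_headcount": -0.10,    # flag a >10% decrease over the same period
}

def relative_change(previous: float, current: float) -> float:
    """Fractional change from the previous year to the current year."""
    return (current - previous) / previous

def screen_school(trend: dict) -> list:
    """Return the variables whose most recent year-over-year change
    crosses the pre-set benchmark and therefore triggers follow-up."""
    flagged = []
    for variable, threshold in ANNUAL_BENCHMARKS.items():
        values = trend.get(variable, [])
        if len(values) < 2:
            continue  # not enough history to compute a change
        change = relative_change(values[-2], values[-1])
        if (threshold > 0 and change > threshold) or (threshold < 0 and change < threshold):
            flagged.append(variable)
    return flagged

# Example: a school whose faculty headcount dropped by 12.5% in one year.
example_trend = {
    "entering_class_size": [150, 152],   # +1.3%: not flagged
    "faculty_headcount": [800, 700],     # -12.5%: flagged for follow-up
}
print(screen_school(example_trend))  # ['faculty_headcount']
```

As the surrounding text notes, a flag of this kind is only a trigger for follow-up questions, not a finding of noncompliance.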

Strengths and challenges

The annual screening of schools using a standardized set of variables allows the identification of trends that may signal areas of noncompliance with accreditation standards soon after a problem occurs. This allows prompt follow-up by the LCME so that areas of noncompliance can be corrected without waiting for the end of the formal eight-year review cycle. However, the collection and interpretation of data are labor-intensive and depend on the development of questionnaires and the analysis of the results by the LCME's sponsoring organizations. This process also works only for quantitative variables. Many LCME accreditation standards depend on descriptive (i.e. qualitative), not quantitative, information and so cannot be monitored using this process. Even for quantitative variables, a change above the pre-set benchmark may not mean that a school is out of compliance with the relevant standard. For example, as noted previously, the LCME does not specify the number of faculty or the faculty-to-student ratio that should be in place. A decrease in the number of faculty greater than the benchmark may not mean that the quality of the medical education program is in jeopardy. Using trend data with pre-set benchmarks that may not be evidence-based results in a number of "false positives", where a school exceeds the benchmark and is asked for follow-up information but is not found to be in noncompliance. The new accreditation standard requiring CQI processes at each medical school may replace this approach, which has been labor intensive for the LCME and for the schools.

Association of Faculties of Medicine of Canada

The Committee on Accreditation of Canadian Medical Schools (CACMS), created in 1979, leads the process of accreditation of Canadian medical schools and is sponsored by the Association of Faculties of Medicine of Canada (AFMC) and the Canadian Medical Association (CMA). Canadian medical schools receive dual accreditation from CACMS and the LCME, as described above. In 2009, in order for Canadian schools to better respond to CACMS/LCME accreditation requirements, the AFMC Council of Deans voted to create a mandatory but completely formative Interim Review Process (IRP) in Canada. The IRP is completely independent of the regular CACMS accreditation review that occurs every eight years. The AFMC staff dedicated to the IRP have developed an extensive checklist that breaks down each element of the CACMS accreditation standards into actionable tasks that, if completed, should lead to success in complying with standards. Checklists are revised each year by the AFMC staff to ensure comparability with the current CACMS Standards and Elements. One core component of the program is the appointment of an Interim Review Coordinator (IRC) by the dean at each school. The IRC is a faculty member who is the lead for accreditation activities at the institution and reports to the dean or vice dean. About three years into its eight-year cycle, a school starts the IRP self-assessment process, which lasts 12–15 months and in which all accreditation standards are reviewed using the checklists. The process is led by the school's IRC with a committee of faculty and students. The resulting review documents are sent to an external peer reviewer, who is the IRC at another school. There is a visit at the school that includes the school's IRC, the external peer reviewer and faculty and students from the school as team members. The team prepares a report, with recommendations for areas to be improved or monitored, which is for the sole use of the medical school. The recommended follow-up is monitored. The AFMC holds biannual meetings of the IRCs to facilitate the exchange of knowledge and experience across all schools. AFMC staff are also available to support the IRCs at any time throughout the year. The AFMC, CMA and CACMS do not receive the results of the interim review for an individual medical school, but CACMS is informed that the process has occurred. As of the end of 2014, eight of the 17 Canadian medical schools have completed the interim review process, with an additional two schools due for review in 2015. To date, no school that has completed an IRP has had a full accreditation review.
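A checklist of the kind described above lends itself to a simple data representation for tracking completion during the self-assessment. The sketch below is illustrative only: the field names, element labels and task wording are assumptions, since the actual format of the AFMC checklist is not described in the paper.

```python
# Illustrative sketch of a standards checklist broken into actionable tasks.
# Field names, element labels and task wording are assumptions; the real
# AFMC checklist format is not specified in the paper.

from dataclasses import dataclass, field
from typing import List

@dataclass
class ChecklistItem:
    element: str           # accreditation element or topic the task maps to
    task: str              # actionable task derived from that element
    completed: bool = False
    notes: str = ""        # e.g. evidence gathered, follow-up needed

@dataclass
class InterimReviewChecklist:
    school: str
    items: List[ChecklistItem] = field(default_factory=list)

    def outstanding(self) -> List[ChecklistItem]:
        """Tasks not yet completed; candidates for follow-up before the next full review."""
        return [item for item in self.items if not item.completed]

# Hypothetical usage during a school's interim self-assessment.
checklist = InterimReviewChecklist(
    school="Example Medical School",
    items=[
        ChecklistItem("Sufficiency of faculty (element 4.1)",
                      "Document faculty numbers and compare with program needs"),
        ChecklistItem("Mid-clerkship feedback",
                      "Confirm feedback is delivered at the mid-point of each clerkship",
                      completed=True),
    ],
)
for item in checklist.outstanding():
    print(f"{item.element}: {item.task}")
```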

Strengths and challenges

Strengths of this approach include the enhancement of accreditation expertise at each of the 17 Canadian medical schools and the creation of a broad-based culture of CQI within the schools. The process reviews all standards, some of which may not have existed at the time of a school's last full accreditation review or may have been modified since then. The review, therefore, identifies specific current problem areas that can be addressed by the school well before the next full review.




For the process to function optimally, the IRC appointed at each school must be knowledgeable about medical school accreditation and must also be provided with the time both to coordinate the process at his or her own school and to serve as a peer reviewer at another institution. This relies on the dean both to support the IRC and to provide the other resources needed to implement the process.


Korean Institute of Medical Education and Evaluation


The Accreditation Board for Medical Education in the Republic of Korea was founded in 1997 and became the Korean Institute of Medical Education and Evaluation (KIMEE) through organizational reform in 2003. KIMEE was incorporated under the Ministry of Health in 2004 and certified by the Ministry of Education in 2014. There are 41 medical schools in South Korea. The first cycle of medical school accreditation began in 2000 and was completed by 2005. A second cycle lasted from 2007 to 2011. Also in 2011, a meta-evaluation of the accreditation process was conducted by external reviewers, who concluded that the process was working to make accreditation decisions but did not appear to lead to overall improvements in the medical education system. The review recommended that accreditation should use CQI to change the educational culture at medical schools. Steps have been taken to place more emphasis on quality improvement at the individual schools. The first was the introduction, in 2012, of standards related to CQI. Schools are being asked, as part of the accreditation review, to report who within the institution has responsibility for the internal CQI process and whether the school's CQI efforts are reflected in the accreditation outcome. Also in 2012, KIMEE began to require that schools submit a progress report every two years that focuses on improvement efforts and their results. During the second cycle (2007–2011), KIMEE also introduced standards that would lead to an "excellence" level of attainment for medical schools. These standards were voluntary, designed to recognize and promote outstanding medical school performance, and were being previewed for possible incorporation into the next (post-second) cycle of review as required standards (Ahn & Ahn 2014). KIMEE no longer uses the "excellence" level.

Strengths and challenges

The KIMEE process is supported by regulations, including the requirement in accreditation standards that CQI take place and the specification in KIMEE rules that a mandatory progress report be submitted every two years. Medical schools also are familiar with the CQI concept as implemented in hospitals. To date, however, CQI efforts across schools have been variable. Challenges include short terms for deans, leading to a lack of continuity in leadership. This inhibits the ability of institutional leadership to stimulate a culture of self-improvement in medical schools and results, in some institutions, in a focus on "passing the test" (i.e. meeting the minimum standards for accreditation). The tenure of CQI working group members within medical schools also is short, limiting the development of expertise in producing school reports. The requirement for schools to report improvements every two years has not included specific instructions as to content and format, leading to KIMEE receiving nonstandard information across schools.

Taiwan Medical Accreditation Council

Taiwan currently has 12 medical schools. The Taiwan Medical Accreditation Council (TMAC) was formed in 2000 as an independent accrediting body with the endorsement of the Ministry of Education (MOE) and the Conference of the Deans of Medical Colleges (CDMC) in Taiwan. Accreditation activities began in 2001. Full accreditation is awarded for a term of seven years, but a follow-up visit is conducted two or three years after the review. The first cycle of visits was completed in 2004, and a review of the results from both the initial and follow-up visits was conducted in 2008. The second cycle of reviews began in 2009. In addition to regular reviews, schools have been requested to report major changes annually. In 2009, TMAC began to review its accreditation standards, using the LCME standards as a reference. Taking into account Taiwan's own infrastructural tradition, regulatory norms and historical differences in medical practice, the revised accreditation standards were initially accepted by the CDMC and then tested in 2012 in several schools. The newly modified standards were ready to be fully implemented in 2014.

Strengths and challenges

From its inception to the present, TMAC has accomplished its mandate as the sole accreditation council for all medical schools in Taiwan. The decisions of TMAC have been fully supported by the MOE, and the recommendations from TMAC have been adopted by all medical schools for the improvement of medical education. In order to achieve CQI, TMAC has requested that an "annual update" be submitted by schools. However, the expectations for the content and format of annual reports have not, to date, been standardized, and the ability of the accrediting body to systematically review and provide feedback on the reports has been limited by the availability of personnel and by budget constraints. To strengthen quality improvement, TMAC has also contemplated an additional "mid-cycle accreditation visit" to serve as a "formative" assessment and to reduce the work in preparing for the next accreditation visit at the end of the cycle. TMAC has recognized the need for additional personnel to support its CQI activities; this will require further funding from the Ministry of Education.

A CQI process at an individual medical school

In order to determine how CQI linked to an accreditation process could be carried out at the level of a single medical school, one institution in the United States was selected that has had a CQI process in place for a number of years.





The University of Chicago, Pritzker School of Medicine, USA


The medical education program leading to the MD degree at the University of Chicago, Pritzker School of Medicine is accredited by the LCME. As directed by the Dean for Medical Education, there is a collection and review of data directly related to selected LCME accreditation standards. Some of the standards are reviewed quarterly and some annually. These standards come from the following school-determined categories: (1) areas identified as priorities by the medical school, such as direct observation of students' clinical skills, timeliness of reporting of students' grades, medical student mistreatment and student access to health services; (2) areas where the required process is "prone to slippage" (i.e. likely not to occur unless regularly managed), such as provision of feedback to students at the mid-point of a clerkship and financial aid counseling; and (3) new LCME standards. The CQI process has been carried out by the school on a voluntary basis and the results are not shared with the LCME. Data are collected using a variety of methods, for example, school-developed surveys of students and graduates; discussions with groups of student leaders; review of the results of national questionnaires, such as the Association of American Medical Colleges (AAMC) Medical School Graduation Questionnaire, which is completed by final-year medical students across the country; and student performance on internally and externally developed examinations. Technology available at the medical school, such as electronic survey tools, supports the process of data collection. Dashboards or trend tables are produced to facilitate review of results by educational leadership.
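As a purely illustrative sketch of the kind of trend table mentioned above, the following assembles yearly values for a few indicators into a plain-text summary for leadership review. The indicator names, years and values are invented for the example; the school's actual tools and metrics are not described beyond the categories listed above.

```python
# Illustrative sketch: turning yearly indicator data into a simple trend table.
# Indicator names and values below are invented for the example.

def trend_table(indicators: dict) -> str:
    """Render indicator values by year as a plain-text trend table."""
    years = sorted({year for values in indicators.values() for year in values})
    header = "Indicator".ljust(58) + "".join(str(y).rjust(8) for y in years)
    rows = [header]
    for name, values in indicators.items():
        cells = "".join(
            (f"{values[y]:.0f}" if y in values else "-").rjust(8) for y in years
        )
        rows.append(name.ljust(58) + cells)
    return "\n".join(rows)

# Hypothetical indicators tracked quarterly or yearly by the education dean's office.
data = {
    "% students directly observed performing clinical skills": {2012: 78, 2013: 85, 2014: 93},
    "% clerkship grades reported within 6 weeks": {2012: 70, 2013: 88, 2014: 95},
}
print(trend_table(data))
```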

Strengths and challenges

This is a comprehensive approach that involves collection and analysis of a breadth of information directly related to accreditation standards. A number of internal data collection instruments that are also used for other purposes, such as evaluating individual courses, have been adapted to include items relevant to the CQI review. As a consequence, the CQI process at the school does not result in a duplication of effort. This mitigates "accreditation fatigue", and the school's process will serve the requirement of the LCME for an internal CQI process. The results of the review are used in a timely manner by administrators and faculty committees that are empowered to bring about change. As the standards to be reviewed are selected by the institution, there is a need for school personnel to keep current with LCME standard changes or reinterpretations. There is also a tendency to focus data collection on past problem areas. Use of external survey data, while efficient, is vulnerable to changes in surveys that are made by others, which can limit the availability of trend data. The process also requires the ongoing availability of personnel and other resources to support data collection and interpretation.

Recommendations for implementing a CQI system

The authors believe that medical schools should review their compliance with at least a minimum set of accreditation standards in the interval between full accreditation reviews. For example, starting a review of compliance 12–18 months before a full accreditation review, which is the standard timeline for US medical schools to prepare for a full survey, may not provide the necessary time to identify gaps in compliance with standards, to correct them and to collect data on outcomes. This conclusion will be tested by the outcomes of the AFMC interim review process, which will provide more evidence of the efficacy of CQI activities when the Canadian schools undergo their next full accreditation reviews. A CQI process that is formative in nature would be useful even if the interval between full accreditation reviews is short. Accreditation is a summative process, with accreditation status a "high stakes" decision. A formative CQI process would allow problems related to compliance with accreditation standards to be addressed before formal action on accreditation status is taken. The following are summary recommendations for implementing a CQI process. The authors derived these recommendations from the case examples and believe that they are generally applicable.

Implement a focused and feasible process

The CQI process needs to be both focused and feasible and to avoid duplication of effort. Focused means that the information collected should link directly to accreditation standards, so that the result is meaningful for determinations of compliance with the selected standards. The authors recommend that, at a minimum, a core set of standards be included in a CQI process for all schools in a given country, based on a broad-based determination of which standards have significant impact on educational program quality and outcomes. A standardized process where similar information is collected across medical schools makes the work of the body conducting the review, whether the accrediting body or the school, easier. The aftereffects of the significant effort expended at a medical school to complete the work for a full survey and the required follow-up have been termed "accreditation fatigue" and were mentioned as a barrier to CQI activities in a number of the examples. The need to expend additional effort for a CQI process may not be welcome at either the individual medical school or the accrediting body. In a national process, the duplication of effort across medical schools should be reduced or eliminated. If a medical school is subject to review by multiple accrediting bodies, for example, those that review the medical school itself and its university, there could be opportunities to share data. Data gathered to support accreditation-related CQI can also be used in internal processes, such as strategic planning. It is important that CQI activities be supported as much as possible by the entity that manages the process and that there is a synergy of effort. For example, the AFMC has provided centralized resources for training and checklist development.



The ability to add questions to or use the results of data collection instruments created for other purposes, as in the case of the LCME and the University of Chicago, also can ease the effort burden on an individual medical school.


Ensure the process results in meaningful outcomes

To be meaningful, the results of CQI activities should support educational quality improvement as well as contribute to a good accreditation outcome. A standards-based process can meet both of these desirable outcomes. A study of the importance of 150 accreditation standards used around the world indicated that most were "at least important" (van Zanten et al. 2012a), implying that they have meaning for program quality. By identifying gaps in compliance with accreditation standards, CQI can provide information to improve medical education program quality and outcomes. This means that, in a formative process, appropriate stakeholder groups within the medical school should receive and act on the results of the review. If the CQI process is conducted for summative purposes, the accrediting body also should be prepared to take whatever type of action on the results is specified in its procedures. For both school and accrediting body recipients, the results of the review should lead to meaningful changes, the outcomes of which should be monitored.

Demonstrate institutional commitment to quality improvement

For the CQI process to work within a medical school, there needs to be institutional commitment that reduces the potential for "accreditation fatigue". Institutional commitment is both motivational (e.g. visible encouragement and support from leadership) and substantive (e.g. the provision of funding and other resources, such as technology and staff). Leadership plays an important role in creating an institutional culture dedicated to program improvement. One manifestation of commitment to quality improvement is ensuring that personnel are available to manage and engage in CQI activities. The AFMC interim review process includes the expectation for a funded faculty member (the IRC), with a time commitment of about 50%, who is responsible for managing the process at each school. This position is financially supported by and reports to the dean or vice dean of the medical school, which illustrates the importance with which it is viewed at the medical school and the AFMC. In a 2014 survey of US medical schools, 121 of 140 (86%) responding schools reported that there were personnel to support ongoing compliance with LCME accreditation standards, including 27 schools (19%) where there was a formal position with responsibility for quality assurance (LCME 2013). For example, at the Pritzker School of Medicine of the University of Chicago, the dean for medical education has responsibility for the school's CQI activities. Individual faculty members and students also need to be encouraged to participate in CQI activities. For example, the importance of completing relevant questionnaires needs to be explained to students, and students should be informed of positive changes that have resulted from their input. Faculty should be expected to become involved to varying degrees and provided with time to participate. Accrediting bodies or national associations that require medical schools to engage in a CQI process must themselves be appropriately and sufficiently staffed with personnel who have the expertise and time to identify or develop data collection tools, train school personnel to carry out data collection and analysis, and review school submissions. Appropriate staffing can support a standardized process that eases the workload at individual medical schools and aids decision-making, whether formative or summative.

Conclusion

CQI can enhance educational program quality if the process is designed to collect relevant information and the results are used to support program improvement and outcomes. Broad-based planning, ideally at a national level, and appropriate resource allocation are needed to make the CQI process optimally useful.

Glossary

Continuous Quality Improvement (CQI): The structured organizational process for involving personnel in planning and executing a continuous flow of improvements to provide quality health care, medical education or clinical care that meets or exceeds expectations (McLaughlin et al. 2004).

Notes on contributors

BARBARA BARZANSKY, PhD, MHPE, is Co-Secretary of the Liaison Committee on Medical Education at the American Medical Association.

DAN HUNT, MD, MBA, is Co-Secretary of the Liaison Committee on Medical Education at the Association of American Medical Colleges.

GENEVIÈVE MOINEAU, MD, is President and Chief Executive Officer of the Association of Faculties of Medicine of Canada and former Secretary to the Committee on Accreditation of Canadian Medical Schools.

DUCKSUN AHN, MD, FRSCS, is President of the Korean Institute of Medical Education and Evaluation.

CHI-WAN LAI, MD, is Chair of the Taiwan Medical Accreditation Council.

HOLLY HUMPHREY, MD, is Ralph W. Gerard Professor in Medicine and Dean for Medical Education at the University of Chicago, Pritzker School of Medicine.

LINDA PETERSON, PhD, MEd, is Assistant Secretary of the Committee on Accreditation of Canadian Medical Schools.

Declaration of interest: The authors report no declarations of interest.



References

Ahn E, Ahn D. 2014. Beyond accreditation: Excellence in medical education. Med Teach 36:84–95.

Al-Shehri AM, Al-Alwan I. 2013. Accreditation and culture of quality in medical schools in Saudi Arabia. Med Teach 35(Suppl. 1):S8–S14.

American Psychological Association. 2014. About accreditation. [Accessed 29 September 2014] Available from http://aaa.apa.org/accreditation/about/about-accreditation.aspx?item=2.

Bishop JA. 2004. The impact of the Academic Quality Improvement Program (AQIP) on the higher learning institutions' North Central Association accreditation. [Accessed 27 September 2014] Available from http://epublications.marquette.edu/dissertations/AA13153994/.

Blanch LE. 1959. The meaning of accreditation. In: Blanch LE, editor. Accreditation in higher education. Washington, DC: US Department of Health, Education, and Welfare. pp 3–8.

Boulet J, van Zanten M. 2014. Ensuring high quality patient care: The role of accreditation, licensure, specialty certification and revalidation in medicine. Med Educ 48:75–86.

Chandran L, Fleit HB, Shroyer AL. 2013. Academic medicine change management: The power of the Liaison Committee on Medical Education accreditation process. Acad Med 88:1225–1231.

FAIMER. 2014. Directory of Organizations that Recognize/Accredit Medical Schools (DORA). [Accessed 1 October 2014] Available from http://www.faimer.org/dora/index.html.

Liaison Committee on Medical Education. 2014. Functions and structure of a medical school. March 2014 edition. [Accessed 15 October 2015] Available from http://www.lcme.org.

LCME. 2013. 2013–2014 LCME Annual Medical School Questionnaire. Available from the files of the American Medical Association Medical Education Group.

McLaughlin CP, McLaughlin C, Kaluzny AD. 2004. Continuous quality improvement in health care: Theory, implementation, and applications. Sudbury, MA: Jones and Bartlett.

van Zanten M, Boulet JR, Greaves I. 2012a. The importance of medical education accreditation standards. Med Teach 34:136–145.

van Zanten M, McKinley D, Durante Monteil I, Pijano CV. 2012b. Medical education accreditation in Mexico and the Philippines: Impact on student outcomes. Med Educ 46:586–592.

van Zanten M, Norcini J, Boulet JR, Simon F. 2008. Overview of accreditation of undergraduate medical education programmes worldwide. Med Educ 42:930–937.

