Emergency Medicine Australasia (2014) 26, 461–467

doi: 10.1111/1742-6723.12271


Piloting an online incident reporting system in Australasian emergency medicine

Timothy J SCHULTZ,1 Carmel CROCK,2 Kim HANSEN,3,4 Anita DEAKIN1 and Andrew GOSBELL5

1Australian Patient Safety Foundation, University of South Australia, Adelaide, South Australia, Australia, 2Royal Victorian Eye and Ear Hospital, Melbourne, Victoria, Australia, 3The Prince Charles Hospital, Brisbane, Queensland, Australia, 4The University of Queensland, Brisbane, Queensland, Australia, and 5Australasian College for Emergency Medicine, Melbourne, Victoria, Australia

Abstract

Background: Medical-specific incident reporting systems are critical to understanding error in healthcare, but underreporting by doctors reduces their value.

Objective: We conducted a pilot study of the implementation of an online ED-specific incident reporting system in Australasian hospitals and evaluated its use.

Methods: The reporting system was based on the literature and the input of experts. Thirty-one hospital EDs were approached to pilot the Emergency Medicine Events Register (EMER). The pilot evaluated website usage and analytics, reporting behaviours and rates, and the quality of information collected in EMER. Semi-structured interviews of three site champions responsible for implementing EMER were conducted.

Results: Seventeen EDs expressed interest; however, due to delays and other barriers, reporting only occurred at three sites. Over 354 days, the website received 362 unique visitors and 77 incidents. The median time to report was 4.6 min. The reporting rate was 0.07 reports per doctor month, suggesting a reporting rate of 0.08% of ED presentations. Data quality, as measured by the number of completed non-mandatory fields and the ability to classify incidents, was very high. The interviews identified enablers (the EMER system, site champions) and barriers (chiefly the context of EM) to EMER uptake.

Conclusions: Collecting patient safety information from frontline doctors is essential to actively engage the profession in patient safety. Although the EMER system allowed easy online reporting of high quality incident data by doctors, site recruitment and system uptake proved difficult. System use by ED doctors requires dedicated and conscious effort from the profession.

Key words: adverse event, emergency medicine, incident reporting, near misses, safety learning system.

Introduction

Incident reporting is embedded as an essential element of quality and safety improvement in the cultures of most high-risk industries.1 Although there is strong support for incident reporting systems in healthcare,2,3 actual usage of such systems is relatively poor, particularly by doctors.4,5 Doctors' fear and mistrust of reporting systems, and the misuse of these systems by hospital administrators and regulators, are well documented,6 as are perceptions that error is inevitable and a potentially unmanageable feature of

Correspondence: Dr Timothy John Schultz, CEA-20 University of South Australia, Adelaide, SA 5000, Australia. Email: [email protected]

Timothy John Schultz, BA, BSc (Hons), PhD, Grad Dipl (Publ Hlth), Technical Director; Carmel Crock, MBBS, FACEM, ED Director; Kim Hansen, MBBS (HonsI), FACEM, Emergency Physician; Anita Deakin, BAppSci (Nurs), Research Fellow; Andrew Gosbell, PhD, Director of Policy and Research.

Accepted 7 July 2014

Key findings
• The role of incident reporting in improving safety and quality in Australasian EM has stalled in the last 15 years.
• An online, non-punitive, protected, independent and confidential system was suitable for ED doctors and collected high quality patient safety data.
• System uptake was slower than targeted and further implementation must improve engagement by the profession.

medical work that should be dealt with 'in-house' or through self-regulation.5,7 Waring6 proposes a clash between medical culture and patient safety culture. In particular, this clash centres on the use of reporting systems to systematically capture data that identify error, learn from its causation and reduce risk to future patients. Organisational factors, including patient safety culture and staff training, as well as incident reporting system features, such as access and ease of use, also influence system uptake by healthcare professionals.8–11 To address some of the known barriers to system use, reporting systems should be non-punitive, protected, confidential, independent from regulation, voluntary and systems-oriented, and should provide timely feedback.3 Efforts to implement incident reporting in emergency medicine (EM) have been met with some enthusiasm.12 However, in Australasian EDs there has been little progress towards improving incident reporting practices, systematic learning and

© 2014 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine



quality improvement since the late 1990s.13 The present study reports the development of a voluntary, online, anonymous incident reporting system designed for Australasian EM, the Emergency Medicine Events Register (EMER), together with its pilot implementation and an evaluation of that pilot.

Methods

Emergency Medicine Events Register website and reporting tool

The Project Steering Committee, comprising three emergency physicians, four patient safety researchers and a representative of the Australasian College for Emergency Medicine (ACEM), developed the incident reporting tool based on previous studies3,10,13–15 and clinical experience in EM. The online reporting form and associated website were designed to be visually engaging, user-friendly, easy to navigate and quick to use, a key specification being that reporting an incident should take no more than 5 min. The reporting tool collects data into a MySQL© database (Oracle Corporation, Redwood Shores, CA, USA) using the Joomla!® content management system (Open Source Matters, Inc., New York, NY, USA).

Pilot site recruitment

Potential pilot hospitals were identified by the Project Steering Committee and invited to join the study and nominate a site champion responsible for implementing EMER in their ED. It was anticipated that doctors would each report one incident per month during the pilot study; reporters were able to claim continuing professional development (CPD) points (0.5 points per incident). Interested sites were provided with detailed project information, a completed National Ethics Application Form and marketing material (flyers, a Microsoft PowerPoint presentation) to promote the project in their hospitals. Pilot site recruitment continued until September 2013. Certain types of incidents (e.g. diagnostic errors, airway management) were targeted over a 1 month 'burst reporting' period to help promote EMER near the conclusion of the pilot study.

Data analysis

All incidents entered into EMER were classified by an expert classifier (AD) using the AIMS™ (Advanced Incident Management System) patient safety classification. This classification provides a structured analytical framework to analyse all relevant information about an incident, including contextual factors surrounding the incident, contributing factors, how the incident was detected, mechanisms for recovery (for near misses), outcomes and factors that could have prevented the incident. AIMS has been designed to capture information ranging from near misses to sentinel events across the entire spectrum of healthcare.14 Classification of incidents using AIMS enabled identification of the healthcare incident type (HIT). HITs categorise incidents of a common nature, grouped according to shared, agreed features.16 Each incident can have multiple HITs assigned, but a principal incident type (PIT) must be identified: the PIT is the incident type that did, or potentially could have, most directly caused the most harm.
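The HIT/PIT relationship described above (each incident carries one or more healthcare incident types, exactly one of which is flagged as principal) can be sketched as a simple data model. This is an illustrative structure only, not the actual AIMS software; all names below are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class ClassifiedIncident:
    """One incident classified with AIMS-style incident types (illustrative only)."""
    incident_id: str
    hits: list[str]  # all healthcare incident types (HITs) assigned
    pit: str         # principal incident type: the HIT judged to have
                     # caused, or potentially caused, the most harm

    def __post_init__(self) -> None:
        # The PIT must be one of the assigned HITs.
        if self.pit not in self.hits:
            raise ValueError("principal incident type must be an assigned HIT")

# Example: an incident assigned two HITs, one flagged as principal.
incident = ClassifiedIncident(
    incident_id="EMER-0001",
    hits=["Clinical management", "Documentation"],
    pit="Clinical management",
)
```

Counting PITs across a set of such records would yield one count per incident, while counting HITs can yield more, which is why the HIT total (100) exceeds the PIT total (77) in the results below.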

Evaluation

EMER usage was assessed from website metrics using Google Analytics (Google Inc., Mountain View, CA, USA). Incident data collected by the reporting tool were also examined in terms of the number of fields completed, and the quality of data provided, by users. Semi-structured interviews of pilot site champions were conducted by a study author (TJS) and transcribed for thematic analysis.

Ethics approval

Overall ethics approval for the study was provided by the University of South Australia Human Research Ethics Committee (protocol 29020). The pilot hospitals independently determined whether site-level ethics approval was required. Use of EMER was approved as a quality assurance activity under Part VC of the Health Insurance Act 1973, affording qualified privilege protection to the collected data.

Results

Emergency Medicine Events Register reporting tool

The EMER website (http://www.emer hosted the online reporting form and supporting information. The reporting form initially had 25 data fields, two of which (date reported and identification number) were automatically generated; seven of the 23 user-entered fields were open-ended free text. All fields are summarised in Supporting Information Table S1, including the eight mandatory fields: country, hospital type, incident involved events, triage score, what happened, notifier designation, incident initiated and incident detected stages. During the pilot study, the Project Steering Group added two non-mandatory fields: handover problem and 'burst reporting' topics.
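One way to picture the form described above is as a single table in which the eight mandatory fields are enforced and the remainder are optional. The sketch below is purely illustrative: the actual EMER schema was not published, and all table and column names here are hypothetical (it uses SQLite in place of the MySQL database the paper names, to keep the example self-contained).

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE incident (
        -- the two automatically generated fields
        id            INTEGER PRIMARY KEY AUTOINCREMENT,
        date_reported TEXT DEFAULT CURRENT_TIMESTAMP,
        -- the eight mandatory user-entered fields, modelled as NOT NULL
        country              TEXT NOT NULL,
        hospital_type        TEXT NOT NULL,
        incident_involved    TEXT NOT NULL,
        triage_score         TEXT NOT NULL,
        what_happened        TEXT NOT NULL,  -- open-ended free text
        notifier_designation TEXT NOT NULL,
        incident_initiated   TEXT NOT NULL,
        incident_detected    TEXT NOT NULL,
        -- non-mandatory fields are nullable, e.g.:
        contributing_factors TEXT,
        handover_problem     TEXT
    )
""")

# A report supplying only the mandatory fields is accepted.
conn.execute(
    "INSERT INTO incident (country, hospital_type, incident_involved, "
    "triage_score, what_happened, notifier_designation, incident_initiated, "
    "incident_detected) VALUES (?, ?, ?, ?, ?, ?, ?, ?)",
    ("Australia", "Public", "Patient", "3", "Example narrative only",
     "ED Physician", "Initial assessment or treatment",
     "Initial assessment or treatment"),
)
count = conn.execute("SELECT COUNT(*) FROM incident").fetchone()[0]
```

The design choice this illustrates is the one the evaluation later measures: mandatory fields guarantee a minimum record, while completion of the nullable fields becomes an index of data quality.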

Pilot site recruitment

A total of 31 EDs were formally invited to join the pilot study. Although 17 EDs expressed interest, ethics delays and difficulties gaining executive support and appointing site champions meant most sites were not successfully recruited. At the completion of the study only three sites were actively participating (Table 1).

Website analytics

Access to the EMER website was not restricted to participating EDs. Over the 354 days of the pilot study (1 December 2012 to 19 November 2013), there were 362 unique visitors making 643 visits, averaging 2.56 pages per visit. The 'bounce rate', the proportion of website visits in which the homepage was exited without any interaction, was 53.65%. The mean time spent on the website per visit was 5.3 ± 9.2 min (median = 2.7 min, range 1 s–56 min). The vast majority of visits (565, 88%) were made from a desktop computer, with tablets used in 44 (7%) and mobile phones in 34 (5%) visits.
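The device-usage figures above are internally consistent; a quick arithmetic check (a verification sketch, not part of the study's analysis):

```python
# Visits by device type, as reported in the pilot
visits = {"desktop": 565, "tablet": 44, "mobile": 34}

total_visits = sum(visits.values())  # should recover the 643 visits reported
desktop_share = round(100 * visits["desktop"] / total_visits)  # ~88%
```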

Reporting rate

There were 77 incidents reported over the study period; a rate of 0.22




TABLE 1. Summary of pilot sites participating in the project

Site A
Descriptor: A major public teaching hospital based in a capital city. The ED has undergone recent growth in patient numbers (adults and children).
Annual attendances: 65 000
Ethics approach: Ethics approval and site-specific approval granted
Commence date: December 2012
Number of doctors: 66 (26 consultants, 25 trainees, 15 RMOs)
Estimated attendances during pilot: 62 400

Site —
Descriptor: A medium-sized private hospital based in a capital city, providing 24 h ED care. Consultants are on call 24 h per day.
Annual attendances: 25 000
Ethics approach: Ethics approval and site-specific approval granted
Commence date: March 2013
Number of doctors: —
Estimated attendances during pilot: 17 750

Site —
Descriptor: Outer metropolitan public hospital, medium-sized (adults and children). Other services offered: general surgery, general medical, orthopaedics, paediatrics, gynaecology, palliative care. Severe trauma patients are referred on. Workforce is largely non-specialist doctors (e.g. career medical officers, general practice trainees). ED was newly accredited by ACEM.
Annual attendances: 48 000
Ethics approach: No ethics approval was required – deemed quality improvement
Commence date: August 2013
Number of doctors: 35 (16 consultants, 19 CMOs)
Estimated attendances during pilot: 13 920

Total estimated attendances during pilot: 94 070

ACEM, Australasian College for Emergency Medicine; CMO, Career Medical Officer; ID, identifier; RMO, Resident Medical Officer.

incidents per day. Two incidents were reported from New Zealand and so were not from pilot sites. The reporting rate was 0.07 reports per doctor month (75/1136.5), equivalent to 0.79 reports per doctor per year. Based on the estimated patient attendances during the pilot study, the incidence of reporting was 0.08% (75/94 070) for the patient cohort. An example incident reported during the pilot study is presented in Box 1.
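The reported rates follow directly from the pilot's raw counts (75 pilot-site reports, 1136.5 doctor months, 94 070 estimated attendances); the arithmetic can be reproduced as follows:

```python
reports = 75            # incidents from pilot sites (77 minus 2 from New Zealand)
doctor_months = 1136.5  # summed across the three pilot sites
attendances = 94_070    # estimated ED presentations during the pilot

per_doctor_month = reports / doctor_months            # ~0.07 reports/doctor month
per_doctor_year = per_doctor_month * 12               # ~0.79 reports/doctor year
percent_of_attendances = 100 * reports / attendances  # ~0.08% of presentations
```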

Reporting behaviours

Incident reporting tended to occur intermittently and in clusters of 3–5 incidents separated by a week to 10 days. Most incidents (n = 54, 70%) were reported during the daytime (0800–1700 hours), although reports were consistently made during evening hours (1700–2400 hours; n = 21, 27%), with two incidents (3%) reported between 0000 and 0800 hours. Metrics from the online reporting form indicated that the mean time to report an incident was 5.7 ± 4.6 min (median = 4.6 min).

Incident data

Incident demographics

Most incidents were reported from Australia (97%) and from public hospitals (87%). The patient was the person impacted by the incident in 73 (95%) cases. ED physicians were the most active reporters (68 reports, 88%), followed by nurses (three reports, 4%), and ED registrars, Career Medical Officers and Resident Medical Officers (two reports each, 3%). The exact date of the incident was reported in 62 incidents (81%). Fifteen incidents (19%) occurred on the weekend and none were reported for public holidays (Table 2).

Advanced Incident Management System analysis

The AIMS analysis found that the most frequently occurring PITs were clinical management (53 incidents, 69%), medications/IV fluids (10, 13%) and organisation management/services (4, 5%). As an incident might involve more than one HIT, there were a greater number of HITs (n = 100), although the pattern was broadly similar to the PIT data, with the exception of a greater representation of documentation errors among the HITs (Table 3).

Data quality in Emergency Medicine Events Register

The quality of data entered into EMER was assessed using the 14 non-mandatory fields that were relevant for all incidents and in place for the whole project. Analysis (Table 4) shows that most fields were used most of the time (≥85% of incidents), with a mean of 13.4 ± 1 non-mandatory fields completed per incident from the maximum of 14 (median = 14, range 9–14); 91% of incidents had 13 or more completed fields. The extent to which the reported incidents could be classified using AIMS provides another important measure of data quality: all 77 incidents contained adequate information to allow classification of the PIT (Table 3). Review of the incidents indicated a




BOX 1. An incident reported into the Emergency Medicine Events Register during the pilot study

Patient: Female, aged 45–50, Category 3 triage score, presenting with a sore throat and fever.
What happened? Patient deteriorated with increasing SOB. ED Consultant attempted to intubate with RSI but unsuccessful. Anaesthetist called, requested Glidescope, Anti-fog and glycopyrrolate, all of which were unavailable in ED. ED's video laryngoscope used but battery ran out prior to first attempt. Anaesthetist unable to intubate patient in ED despite multiple attempts. Patient able to be ventilated with BVM. Surgeon called and came to ED to assess for open tracheostomy. ICU consultant attended and announced there were no ICU beds.
What were the contributing factors? Obese patient, inadequate equipment, inadequate warning for anaesthetist/surgeon.
Factors that reduced the impact of the incident: Day time hours, consultants available and attended immediately.
Consequence or outcome: Review of procedures and equipment.
How could the incident have been prevented? Additional preparation time for anaesthetist and surgeon with earlier warning.
Action taken: Patient was transported while being ventilated with BVM (sats >90%) when he was gassed down by anaesthetist with surgeons scrubbed and ready to do open trachy. ICU consultant later rang ED consultant to request ED registrar to transfer patient to another hospital. ED consultant declined as airway so difficult that it would not be appropriate to send with ED staff, needs consultant anaesthetist.
Notifier designation: ED Physician
Incident initiated: Initial assessment or treatment (e.g. misdiagnosis)
Incident detected: Initial assessment or treatment (e.g. misdiagnosis)

high level of descriptive data in most open-text fields, particularly for ‘what happened’, ‘what were the contributing factors’, ‘factors that reduced the impact of the incident’, ‘the consequence or outcome’ and ‘how could the incident have been prevented’ fields.

Site champion interviews

Four themes, described by 19 attributes, emerged from the site champion interviews. Each attribute was determined to be either a barrier or an enabler to implementing EMER at the pilot sites; there were seven barriers, 11 enablers and one neutral attribute (Table 5). The 'Pilot study characteristics' theme addressed specifics of the pilot study. The 'Context' theme identified a number of site- and EM-specific barriers to EMER implementation, including the challenge of promoting the project and educating staff about EMER and what constituted a reportable incident. Reaching shift workers and part-time staff required a high level of commitment; for example, at one site: 'weekly meetings, case reviews, live reporting, regular reminders, incentives of CPD; reminder flyers attached to computer screens with EMER logo, desktop icons'. Another site champion viewed the EMER implementation as unsuccessful but used minimal strategies (an education session and a grand round), commenting: 'I haven't been continuously mentioning it on the floor during shifts, and that is something I could do'. Time pressures were raised: 'the nature of our jobs is that there are all sorts of things that could be done if we had more time . . .', as was clarifying the role of EMER within the context of a hospital incident reporting system: 'It . . . is about learnings for EM and these are things that doctors historically don't put into State-wide reporting systems and that EM can use and learn from'. All of the attributes of the 'EMER system characteristics' theme, such as its ease of use and the use of burst reporting to highlight important issues, were considered enablers: 'It's quick, it's easy . . . there is no log in, no password. It's less than 5 min, there's not many compulsory fields'. The 'Site champions' theme identified their role in seeking out incidents and the use of one-to-one support with colleagues to assist EMER in their EDs: '. . . on the floor I was involved in listening out for, and seeking out, anything that could be potentially reported and questioning doctors about certain patients and events that may be appropriate to report'.

Discussion

This study describes the development of EMER as an online incident reporting system for Australasian EM. It is non-punitive, protected from discovery, confidential and independent from regulation. The pilot demonstrated that EMER was suitable for use by doctors, who submitted 96% of incidents, most (88%) by emergency physicians. The completion rate for non-mandatory fields, the richness of narrative in free-text fields and the ability to classify a PIT for all reported incidents showed that doctors using EMER produced high quality reports, a known strength of voluntary





TABLE 2. Incident demographics (n = 77)

Country: Australia 75 (97%); New Zealand 2 (3%)
Hospital type: Private 10 (13%); Public 67 (87%)
Person involved: Patient 73 (95%); Staff 3 (4%); Blank 1 (1%)
Designation of reporter: Career Medical Officer 2 (3%); ED Physician 68 (88%); ED Registrar 2 (3%); Nurse 3 (4%); Resident Medical Officer 2 (3%)
Date of incident: Approx – month surrounding 11 (14%); Approx – week surrounding 4 (5%); Exact date 62 (81%)
Weekend/holiday: Weekend 15 (19%); Holiday 0
Total: 77



TABLE 3. Summary of Advanced Incident Management System analysis

Incident type                                 PIT n (%)     HIT n (%)
Accidents/occupational health and safety       1 (1.3)       1 (1.0)
Behaviour/human performance                    1 (1.3)       2 (2.0)
Clinical management                           53 (68.8)     56 (56.0)
Documentation                                  3 (3.9)      10 (10.0)
Falls                                          1 (1.3)       1 (1.0)
Medical devices/equipment/property             3 (3.9)       8 (8.0)
Medications/IV fluids                         10 (13.0)     13 (13.0)
Organisation management/services               4 (5.2)       7 (7.0)
Security                                       1 (1.3)       2 (2.0)
Total                                         77 (100.0)   100 (100.0)

PIT, principal incident type; HIT, healthcare incident type.

incident reporting systems that focus on quality rather than quantity of incidents.17 Furthermore, for busy doctors in time-pressured ED settings, most incidents could be submitted to EMER within the targeted 5 min time frame. Despite addressing known barriers to incident reporting,4,9,17 piloting of EMER indicated that recruitment of hospital sites and doctors was less successful, and slower, than anticipated. Although there was general interest towards involvement in the pilot study from a relatively large number of sites, only three were actively reporting by the conclusion of the study. Furthermore, the reporting rate of 0.07 incidents per doctor per month was well below the targeted rate of one per doctor per month. Differences between sites in the number and intensity of strategies employed to

promote and educate staff about EMER appeared to influence uptake by doctors, which was variable across sites, suggesting the need to cultivate implementation skills among site champions. Site champions identified barriers to increasing engagement, including difficulties in reaching all ED staff with EMER training and education (due to shift-work, part-time staff and a lack of regular meetings), the need to provide one-to-one user support, and the need to actively seek out incidents. Potential duplication with hospital-based incident reporting systems required clarification. Sites were advised that EMER incidents should be detailed, able to inform the specialty, system redesign and/or education opportunities, and might also need to be entered into the hospital system. At one site, nurses used the hospital system and doctors used EMER, whereas at another site duplicate reporting was necessary if an incident could inform both systems. We did not have access to data from the hospital-based systems to measure any potential impact of EMER use.

There is little reliable data on the actual prevalence of reportable events in EDs, but a number of studies have consistently identified significant rates of preventable errors.18 The incident reporting rate in our pilot study was 0.08% per attendance. In comparison, Hashemi et al.19 identified a 0.16% (n = 179) reporting rate in a UK ED with 113 000 annual attendances; however, only 0.1% (n = 115) were defined as patient safety incidents and most (89%) reports were made by nurses. Brubacher et al.20 found a 0.14% incidence of events reported, mainly by nurses, across eight EDs in British Columbia, but suggested that this represented 'marked underreporting'. These findings suggest that, despite some barriers to recruiting doctors, a doctor-led reporting system can generate similar reporting rates to other systems. Other recent ED studies either do not provide the designation of reporters21–24 or rely on patient reports.25

Although intended for all doctors, site champions indicated that EMER was viewed as a consultant-level activity. Experience was important in recognising what constituted an adverse event or near miss: 'Emergency registrars weren't able to process why things went wrong and how it could have been better, looking at the underlying issues and sometimes they even struggled to say that this was an incident, this was something different that shouldn't have happened . . . it becomes a consultant level activity'. Additionally, junior staff might be more fearful of reporting competency concerns to senior staff, implying that social pressures and authority gradients might be important barriers to junior ED doctors reporting; these can potentially be addressed through cultural change.26 The support of consultants is therefore critical both in training junior staff to identify and report incidents, and in developing a patient safety culture that promotes incident reporting.2,6




TABLE 4. Summary of 17 non-mandatory Emergency Medicine Events Register reporting fields from 77 incidents

ID     Field                                                               Times used   % times used
5      Date incident occurred                                                  73           94.8
6      Date_is                                                                 77          100.0
7‡     Weekend or public holiday                                               15           19.5
8      Time band                                                               77          100.0
9      Person involved                                                         76           98.7
12     Medical specialty involved in the incident                              66           85.7
13     Age band                                                                74           96.1
14     Gender                                                                  75           97.4
15     Clinical presentation                                                   76           98.7
17     What were the contributing factors                                      75           97.4
18     Factors that reduced the impact of the incident                         71           92.2
19     Consequence or outcome details                                          76           98.7
20     How could the incident have been prevented                              74           96.1
21     Immediate action taken                                                  66           85.7
25     Did incident involve a failure associated with incorrect patient        76           98.7
26†    Handover failure                                                        40           51.9
27†‡   Burst reporting                                                          6            7.8

†Field added during the pilot. ‡Field not applicable to all incidents.
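The '% times used' values in Table 4 are simply each field's completion count divided by the 77 incidents; a quick check over a few fields (counts taken from the table):

```python
# Completion counts for a sample of non-mandatory fields (from Table 4)
counts = {
    "Date incident occurred": 73,
    "Medical specialty involved": 66,
    "Handover failure": 40,
    "Burst reporting": 6,
}

# Percentage of the 77 incidents in which each field was completed
percent = {name: round(100 * n / 77, 1) for name, n in counts.items()}
```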

Aside from some uptake of specialty-specific incident reporting systems in anaesthesia,27 intensive care28 and radiology,10 incident reporting by doctors is patchy and remains the domain of nurses. As one site champion stated, 'emergency doctors are busy and traditionally poor at adverse event reporting'. Nevertheless, although barriers to clinician reporting exist, community support for reporting is strong29 and doctors can be encouraged to incorporate incident reporting into quality improvement practices.10,13,30 Although site champions felt they were adequately supported in their roles, they encountered significant barriers and inertia in engaging ED doctors to report using EMER. Addressing these issues will be important in implementing EMER more broadly.

TABLE 5. Thematic analysis of site champion interviews – summary of four themes and 19 attributes, each classified as a barrier or enabler to EMER implementation

Theme 1: Pilot study characteristics
  1.1 Appeal of the project – Enabler
  1.2 Ethics approvals – Barrier
  1.3 National roll out – Neither
  1.4 Support for site champions – Enabler
Theme 2: Context
  2.1 Anticipating inertia – Barrier
  2.2 Challenge of promoting – Barrier
  2.3 Double reporting between EMER and hospital systems – Barrier
  2.4 EMER as an ED profession activity – Enabler
  2.5 First incident a hurdle – Barrier
  2.6 Importance of experience in identifying incidents – Barrier
  2.7 Nurse–doctor use – Enabler
  2.8 The difficulty of education in ED – Barrier
Theme 3: EMER system characteristics
  3.1 Burst reporting – Enabler
  3.2 Ease of the system – Enabler
  3.3 Incentives to report – Enabler
  3.4 System changes – Enabler
Theme 4: Site champions
  4.1 One-to-one support for colleagues – Enabler
  4.2 Role of the site champion – Enabler
  4.3 Seeking out incidents – Enabler

EMER, Emergency Medicine Events Register; ID, identifier.

Limitations

The estimated reporting rate might be inaccurate for three reasons: (i) a lower reporting rate at one pilot site (Hospital F), where the site champion considered the EMER implementation suboptimal; (ii) ED nursing staff contributing 4% of reports; and (iii) patchiness of reporting, as site champions felt that a small core group of doctors likely reported relatively frequently while the majority reported rarely or not at all. Additionally, the results might not be generalisable to other hospitals because recruitment was purposive.

Conclusion

Collecting adverse event and near miss information from frontline doctors directly involved in daily practice is essential to actively engage the profession in patient safety and improves measures such as patient safety culture.11,17,30 Despite the need for doctors to report near misses and adverse events to inform safety and quality within the profession, use of reporting systems tends to remain a professional activity conducted by nurses. EMER addresses many of the barriers to ED doctor participation in reporting incidents and demonstrates that high-quality reporting is possible. Increased uptake will only occur through dedicated and conscious effort from the profession.

Acknowledgements

We acknowledge the site champions who implemented EMER, all clinicians who submitted incidents, and the other members of the project steering group (Natalie Hannaford, Bill Runciman, John Vinen). The project was funded by ACEM, and the EMER website was developed by Alltraders Pty Ltd.

Author contributions

TJS led the evaluation and data analysis, and drafted the first version of the manuscript. CC conceived the project and chaired the project steering committee. KH supported the pilot site implementation and evaluation, and was proxy for the steering committee chair. AD supported the pilot site implementation and evaluation, and conducted data analysis. AG promoted the implementation and co-drafted the first version of the manuscript. All authors were members of the project steering committee, contributed to its conduct, and have reviewed and commented on the manuscript.




Competing interests

AG is Journal Manager for Emergency Medicine Australasia.


References

1. Reason J. Managing the Risks of Organisational Accidents. Aldershot: Ashgate, 1997.
2. Institute of Medicine. To Err Is Human: Building a Safer Health System. Washington, DC: National Academy Press, 2000.
3. Leape LL. Reporting of adverse events. N. Engl. J. Med. 2002; 347: 1633–8.
4. Kingston MJ, Evans SM, Smith BJ, Berry JG. Attitudes of doctors and nurses towards incident reporting: a qualitative analysis. Med. J. Aust. 2004; 181: 36–9.
5. Lawton R, Parker D. Barriers to incident reporting in a healthcare system. Qual. Saf. Health Care 2002; 11: 15–8.
6. Waring JJ. Beyond blame: cultural barriers to medical incident reporting. Soc. Sci. Med. 2005; 60: 1927–35.
7. Rosenthal M. How doctors think about medical mishaps. In: Rosenthal M, Mulcahy L, Lloyd-Bostock S, eds. Medical Mishaps. Buckingham: Open University Press, 1999; 141–53.
8. Braithwaite J, Westbrook MT, Travaglia JF, Hughes C. Cultural and associated enablers of, and barriers to, adverse incident reporting. Qual. Saf. Health Care 2010; 19: 229–33.
9. Evans SM, Smith BJ, Esterman A et al. Evaluation of an intervention aimed at improving voluntary incident reporting in hospitals. Qual. Saf. Health Care 2007; 16: 169–75.
10. Jones DN, Benveniste KA, Schultz TJ, Mandel CJ, Runciman WB. Establishing national medical imaging incident reporting systems: issues and challenges. J. Am. Coll. Radiol. 2010; 7: 582–92.
11. Weingart SN, Ship AN, Aronson MD. Confidential clinician-reported surveillance of adverse events among medical inpatients. J. Gen. Intern. Med. 2000; 15: 470–7.
12. Croskerry P, Sinclair D. Emergency medicine: a practice prone to error. CJEM 2001; 3: 271–6.
13. Vinen J. Incident monitoring in emergency departments: an Australian model. Acad. Emerg. Med. 2000; 7: 1290–7.
14. Runciman WB, Helps SC, Sexton EJ, Malpass A. A classification for incidents and accidents in the healthcare system. J. Qual. Clin. Pract. 1998; 18: 199–211.
15. Jones D, Runciman W. Principles of incident reporting. In: Croskerry P, Cosby KS, Schenkel SM, Wears RL, eds. Patient Safety in Emergency Medicine. Philadelphia, PA: Lippincott, Williams & Wilkins, 2009; 70–80.
16. Runciman W, Hibbert P, Thomson R, Van Der Schaaf T, Sherman H, Lewalle P. Towards an international classification for patient safety: key concepts and terms. Int. J. Qual. Health Care 2009; 21: 18–26.
17. Anonymous. Discussion paper on adverse event and error reporting in healthcare. Huntingdon Valley; 2000: 1–8. [Cited 24 Jan 2014.] Available from URL: Tools/whitepapers/concept.asp
18. Wears RL, Woloshynowych M, Brown R, Vincent CA. Reflective analysis of safety research in the hospital accident & emergency departments. Appl. Ergon. 2010; 41: 695–700.
19. Hashemi K, Khaliq W, Blakeley C. Patient safety incident reporting in an emergency department: a one-year review. Clin. Risk 2010; 16: 3–5.
20. Brubacher JR, Hunte GS, Hamilton L, Taylor A. Barriers to and incentives for safety event reporting in emergency departments. Healthc. Q. 2011; 14: 57–65.
21. Tighe CM, Woloshynowych M, Brown R, Wears B, Vincent C. Incident reporting in one UK accident and emergency department. Accid. Emerg. Nurs. 2006; 14: 27–37.

22. Thomas M, Mackway-Jones K. Incidence and causes of critical incidents in emergency departments: a comparison and root cause analysis. Emerg. Med. J. 2008; 25: 346–50.
23. Clunas S, Whitaker R, Ritchie N, Upton J, Isbister GK. Reviewing deaths in the emergency department: deaths in the department or deaths within 48 h. Emerg. Med. Australas. 2009; 21: 117–23.
24. Kahllberg AS, Goransson KE, Ostergren J, Florin J, Ehrenberg A. Medical errors and complaints in emergency department care in Sweden as reported by care providers, healthcare staff, and patients – a national review. Eur. J. Emerg. Med. 2013; 20: 33–8.
25. Friedman SM, Provan D, Moore S, Hanneman K. Errors, near misses and adverse events in the emergency department: what can patients tell us? CJEM 2008; 10: 421–7.
26. Friedman SM, Sowerby RJ, Guo R, Bandiera G. Perceptions of emergency medicine residents and fellows regarding competence, adverse events and reporting to supervisors: a national survey. CJEM 2010; 12: 491–9.
27. Runciman WB. Iatrogenic harm and anaesthesia in Australia. Anaesth. Intensive Care 2005; 33: 297–300.
28. Beckmann U, Bohringer C, Carless R et al. Evaluation of two methods for quality improvement in intensive care: facilitated incident monitoring and retrospective medical chart review. Crit. Care Med. 2003; 31: 1006–11.
29. Evans SM, Berry JG, Smith BJ, Esterman AJ. Consumer perceptions of safety in hospitals. BMC Public Health 2006; 6: 41.
30. Weingart SN, Callanan LD, Ship AN, Aronson MD. A physician-based voluntary reporting system for adverse events and medical errors. J. Gen. Intern. Med. 2001; 16: 809–14.

Supporting Information Additional Supporting Information may be found in the online version of this article at the publisher’s website: Table S1 Data fields collected during EMER pilot.

