Executive Summary

The Johns Hopkins Armstrong Institute for Patient Safety and Quality (Baltimore, MD) earned recognition as the runner-up for ECRI's 17th Health Technology Excellence Award for its development and implementation of an innovative safety-event-reporting software application. The system, called "Hero," has led to demonstrable improvements in the organization's reporting and learning culture.

To improve the organization's event reporting, the team at the institute evaluated the available event-reporting software applications. None of those systems, the team discovered, could effectively reduce the barriers to reporting and support productive use of the data. So the team decided to build a better one.

Because any given reporter might use the system only once or twice a year, the software application needed to be intuitive and efficient. Additionally, the design process considered how the system would be incorporated into the workflows of event reviewers, who spend the most time interacting with the software and data.

Key features of the new Hero application include:

  • A single, short submission form that makes reporting easier
  • A machine-learning algorithm that automatically categorizes event reports based on the event description
  • Communications functionality that supports seamless collaboration and transparency, allowing reporters to open and examine event reports they submitted and continue to communicate with reviewers
  • Innovative qualitative analytics using machine-learning algorithms that have been optimized using years of Johns Hopkins historical data

Because reporting volumes can be an unreliable metric, the team did not rely on them to measure the system's success. Instead, the team considered the key indicators to be user satisfaction, user engagement, the quality of the information being received, and the impact of any improvements made as a result of that information. In these areas, the team found evidence of the intervention's success.

_______________________________________________________

Who Should Read This

Patient safety officers, risk managers, quality assurance managers, biomedical and clinical engineers, health information technology personnel, nurses, clinical practitioners, and other allied health personnel

The Johns Hopkins Armstrong Institute for Patient Safety and Quality (Baltimore, MD) earned recognition as the runner-up for ECRI's 17th Health Technology Excellence Award for its development and implementation of an innovative safety-event-reporting software application. The event-reporting system, called "Hero," has led to demonstrable improvements in the organization's reporting and learning culture.

ECRI's Health Technology Excellence Award recognizes outstanding initiatives undertaken by member healthcare institutions to improve patient safety, reduce costs, or otherwise facilitate better strategic management of health technology. For details about the winning submission and other finalists, see The Health Technology Excellence Award: Recognizing Exceptional Health Technology Management.

ECRI congratulates Eileen Kasda, Lori Paine, Christine Robson, and the rest of the team at the Johns Hopkins Armstrong Institute for their years-long effort to identify common barriers to event reporting, to assess the benefits and drawbacks of existing systems, and to build and implement a more effective system.

The Johns Hopkins Hospital. (Image courtesy of Johns Hopkins Health System.)

The Challenge

To design and develop a safety-event-reporting software application that (1) makes it easier for frontline healthcare workers to report safety events, (2) increases transparency throughout the reporting and review processes, and (3) helps leaders and analysts make sense of the reports in this unique dataset and prioritize opportunities for improvement.


The Context

Healthcare organizations implement safety-event-reporting systems to provide a means for clinicians and other staff to document incidents that have—or that could have—adversely affected patient care or led to other harm. High-reliability organizations (HROs) rely on the data from such systems to identify risks and mitigate harm.

Like HROs, Johns Hopkins places significant emphasis on its reporting program. According to Eileen Kasda, Director of Patient Safety for Johns Hopkins Health System, Johns Hopkins Medicine (JHM) staff routinely report patient safety events. "Each of these reports represents valuable learning and improvement opportunities to keep patients, staff, and visitors safe."

However, efforts to take advantage of those opportunities and to implement improvements were hampered by common barriers to reporting. Barriers included factors related to staff awareness (e.g., an insufficient understanding of the process or its importance), organizational culture (e.g., fear of retribution), and system functionality (e.g., time-consuming or complex reporting processes), to name a few. Any such barriers can discourage potential reporters from speaking up, which reduces the opportunities for learning and improvement.

In an attempt to eliminate or minimize some of these barriers, the team at the Johns Hopkins Armstrong Institute for Patient Safety and Quality assessed event-reporting processes, both internally and externally. A key area of focus became event-reporting software applications. "We found that the available systems provided a poor user experience, both for reporters of events as well as for reviewers and leaders responsible for using this data to improve care delivery systems," explains Kasda.

Unable to find a system that could effectively reduce the barriers to reporting and support productive use of the data, the Johns Hopkins team decided to build a better one.

 

The Process

Gathering Information

Designing a better reporting system first involved assessing the current state—that is, identifying barriers to reporting in general, as well as any specific shortcomings of the organization's existing event-reporting system. The Johns Hopkins team met with patient safety experts from 30 organizations across the United States to learn what challenges others were facing and how they were tackling them. In addition, the team conducted focus groups and usability testing with reporters and reviewers of events from various disciplines across JHM to assess the current system and to gain insights that could be used to define requirements for a new event-reporting application.

The team identified the following common industry challenges:

Time to report. The amount of time it takes for frontline workers to submit an event report is known to be one of the top barriers to reporting. The Johns Hopkins team confirmed this finding by speaking with colleagues across the industry, by examining the peer-reviewed literature, and by reviewing the organization's own internal metrics. At Johns Hopkins, the median submission time for reporting a single event using the organization's legacy event-reporting software application was calculated to be over 9.5 minutes.

The persistent belief that event reporting is punitive. Despite the organization's initiatives to promote a reporting and learning culture, some clinicians remained skeptical that problems could be reported without repercussions. Physicians in particular were dubious, and in fact, reports from physicians accounted for only 5% of the event reports that had been submitted to the Johns Hopkins legacy system.

Lack of transparency or feedback after submitting a report. Reporters rarely heard back about an event, other than to see that their report was either "in progress" or "closed." In addition, reporters might not know to whom their report had been routed or how to provide additional information about the situation. This lack of engagement with reporters, and the absence of visible signs of progress on any given report, likely dissuaded some frontline workers from reporting future events.

Suboptimal report classification processes. When submitting events, reporters were instructed to classify the type of event they were submitting. This requirement could lead to confusion and frustration, since frontline workers are not well versed in safety event taxonomies.

Further, the event-reporting software applications that the Johns Hopkins team assessed allowed for only one category to be assigned per event; the event would then be routed to reviewers based on that single category. The team found this feature to be particularly limiting, since safety events commonly involve multiple factors and need to be examined through multiple lenses. For example, an event that involves a medication error could also involve a poor handoff and an equipment failure. If the report is assigned to only one category, reviewers may not recognize the need to examine those other contributing factors.
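The article describes this routing limitation conceptually rather than in code. As a rough sketch only, with hypothetical category names, reviewer groups, and event data, the difference between single-category and multi-category routing might look like the following.

```python
# Illustrative sketch only -- not the Hero implementation. It contrasts the
# single-category routing the article criticizes with routing on every
# assigned category. All names and data below are hypothetical.
from dataclasses import dataclass, field

# Hypothetical mapping from event category to the reviewer group that owns it.
REVIEWERS_BY_CATEGORY = {
    "medication": "pharmacy_safety_team",
    "handoff": "nursing_quality_team",
    "equipment": "clinical_engineering_team",
}

@dataclass
class EventReport:
    description: str
    categories: set[str] = field(default_factory=set)

def route_single_category(event: EventReport) -> set[str]:
    # Legacy-style routing: only one assigned category is used, so the other
    # contributing factors never reach the reviewers who should examine them.
    first = next(iter(event.categories), None)
    return {REVIEWERS_BY_CATEGORY[first]} if first else set()

def route_all_categories(event: EventReport) -> set[str]:
    # Multi-category routing: every assigned category notifies its reviewer group.
    return {REVIEWERS_BY_CATEGORY[c] for c in event.categories}

event = EventReport(
    description="Medication error compounded by a poor handoff and a pump failure.",
    categories={"medication", "handoff", "equipment"},
)
print(route_single_category(event))  # only one reviewer group sees the report
print(route_all_categories(event))   # all three reviewer groups see the report
```

Routing on every assigned category is one way to ensure that each contributing factor reaches a reviewer group positioned to examine it.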

Inadequate data analysis tools. The Johns Hopkins team found that the software applications on the market lack the robust analytics necessary to identify patterns and emerging areas of risk. Thus, reviewers and analysts are limited in their ability to turn the qualitative data contained in event reports into actionable insights that can ultimately be used to help improve patient safety.

The team's discussions with other organizations revealed that many of those organizations are exporting data to popular data visualization tools and creating quantitative dashboards in an effort to gain insight from their data. But as Kasda explains, "Event-reporting data are qualitative and often don't lend themselves well to quantitative methods. An overreliance on quantitative metrics could lead to incorrect conclusions about patient safety." For example: Reporting volumes are one quantitative metric that can be monitored, as a way to track whether more or fewer events are being reported. But this metric does not provide much meaningful information about the safety of a work area, since reporting patterns can be influenced by many factors unrelated to safety.

Designing a System

Because any given reporter might use the system only once or twice a year, the software application would need to be intuitive and efficient. Additionally, the designers of existing systems had not carefully considered workflows to support event reviewers, even though reviewers are the individuals who spend the most time interacting with the software and data. The Johns Hopkins team worked with user experience experts and human-centered designers from inside and outside of healthcare to leverage aspects of modern software design and apply them to patient safety reporting.

The design process considered not only what functionality a software system could provide, but also how that system would be incorporated into the users' workflow. "A key part of this process was to design the system around the workflows, rather than our workflows around the system," explains Lori Paine, Senior Director of Patient Safety at the Johns Hopkins Armstrong Institute and Johns Hopkins Hospital.

A team of software developers with a variety of technical skills was assembled to develop the software application. The development team worked closely with experts from Johns Hopkins and users of the application.


The Results

The System

The culmination of the team's efforts was the implementation of the Johns Hopkins "Hero" safety-event-reporting software application. The application went live in April 2021 across all JHM locations, including six hospitals, ambulatory surgery centers, the organization's home care group, and over 525 physician practice clinics.

According to the Johns Hopkins team, key features of the Hero application include:

  • A simplified submission form. The Hero system uses a single, short submission form that makes reporting easier.
  • Automated event categorization. Hero incorporates a machine-learning algorithm that uses a dynamic event-reporting taxonomy to automatically categorize event reports based on the event description. Reporters no longer need to spend time classifying the event. They simply focus on the free-text narrative describing the safety concern, and the algorithm suggests relevant categories after submission. The algorithm retrains regularly to improve performance based on human feedback. (A generic, simplified sketch of this kind of text categorization and similarity analysis appears after this list.)
  • Communications functionality that supports seamless collaboration and transparency. Reporters can open and examine events they submitted and continue to communicate with reviewers. Inspired by concepts from social media, the tool provides chat-like functionality that lets reviewers and reporters easily communicate and collaborate about risks identified within the event report and about mitigation strategies being implemented. Further, the application allows frontline staff to report confidentially (i.e., with their identity hidden) while still participating in the review process.
  • Innovative qualitative analytics using machine-learning algorithms that have been optimized using years of Johns Hopkins historical data:
    • One algorithm allows reviewing managers to see events with similar descriptions to the one they are currently reviewing, from across the health system, giving them the opportunity to review actions other areas may have taken.
    • Another algorithm clusters similar patterns of events together to allow for trending without relying on imperfect categorization of the reports or on human memory. This analysis has been customized to the user's access credentials, allowing users to review trends specific to their scope of responsibility (e.g., unit or work area, department, organization, health system).
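The article does not describe how the Hero algorithms are implemented. As a generic illustration only, automatic categorization of free-text event descriptions and retrieval of similar prior reports can be built from standard text-analytics components; the sketch below uses TF-IDF features, a simple multi-label classifier, and nearest-neighbor search. The reports, category labels, and probability threshold are invented for the example.

```python
# Illustrative sketch only -- not the Johns Hopkins Hero implementation.
# Shows two generic techniques the article describes at a high level:
#   1) suggesting event categories from a free-text description
#   2) surfacing previously reported events with similar descriptions
# All reports, category labels, and thresholds below are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier
from sklearn.neighbors import NearestNeighbors
from sklearn.preprocessing import MultiLabelBinarizer

# Hypothetical historical reports with reviewer-confirmed categories
# (events can carry more than one category, as the article notes).
historical_reports = [
    "Patient received 10x the ordered heparin dose after a pump misprogramming.",
    "Handoff between units omitted the pending critical lab result.",
    "Infusion pump alarmed repeatedly and was swapped mid-shift.",
    "Wrong-site surgery nearly occurred; time-out caught the error.",
]
confirmed_categories = [
    {"medication", "equipment"},
    {"handoff"},
    {"equipment"},
    {"procedure"},
]

# 1) Multi-label categorization: one binary classifier per category.
vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(historical_reports)
binarizer = MultiLabelBinarizer()
Y = binarizer.fit_transform(confirmed_categories)
classifier = OneVsRestClassifier(LogisticRegression(max_iter=1000)).fit(X, Y)

new_report = "Pump was programmed with the wrong rate and the dose was doubled."
x_new = vectorizer.transform([new_report])
scores = classifier.predict_proba(x_new)[0]
suggested = [c for c, p in zip(binarizer.classes_, scores) if p >= 0.3]  # illustrative cutoff
print("Suggested categories:", suggested)

# 2) "Similar events" lookup: nearest neighbors in the same TF-IDF space.
index = NearestNeighbors(n_neighbors=2, metric="cosine").fit(X)
_, neighbor_ids = index.kneighbors(x_new)
for i in neighbor_ids[0]:
    print("Similar prior report:", historical_reports[i])

# Periodic retraining on reviewer-corrected labels (the human feedback the
# article mentions) would simply repeat the fit steps on the updated data.
```

In practice, a system of this kind would be trained on a much larger corpus of reviewer-validated reports and retrained periodically as reviewers confirm or correct the suggested categories, consistent with the human-feedback loop the team describes.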

In addition to developing the software application, the team harmonized event-review workflows across JHM organizations, developed education modules, and implemented a new health system event-reporting policy.

In the year since implementation, the team made some enhancements to the tool and began evaluating the impact of their intervention.


The Impact

The Johns Hopkins team reports that the Hero system has successfully met two key goals for the project: (1) improving the user experience for reporters and (2) improving the organization's ability to make effective use of event-reporting data.

With respect to the user experience, the Hero system has yielded the following benefits:

  • A simplified reporting process. The median time required to submit an event report was reduced by 1.5 minutes per report compared with the organization's legacy application.
  • Improved feedback to reporters—described as a "total culture shift" by Kasda. With the Hero system, information-sharing is the rule, not the exception. The application defaults to sharing all follow-up information with the reporter. (This setting can be changed when appropriate, so that the follow-up is not visible to the reporter.) Feedback is emailed directly to the reporter; reporters aren't required to log back into the application to see it, which many might not think to do. As expected, this functionality led to a dramatic improvement in the rates of providing feedback. "Rather than being pleasantly surprised by getting feedback, reporters now expect it," adds Kasda, "and that's led to positive and productive ongoing dialogue among reporters and reviewers about specific safety concerns."
  • Increased physician engagement. Physician reporting has steadily increased over pre-implementation rates. Data from Johns Hopkins's analysis shows that the cumulative number of reports submitted by physicians, residents, and fellows increased by 37%.
  • User satisfaction. To measure user satisfaction with the Hero application, the Johns Hopkins team surveyed recent reporters and reviewers both before implementing the new system and six months after. Survey respondents answered questions using a 1 to 7 Likert-style scale. For the Hero system, responses to every question showed a statistically significant shift toward the more favorable end of the scale. (A generic sketch of one way such a shift might be tested appears after this list.) According to respondents, the new system makes it easier to enter an event, helps facilitate recognition of staff, and leads to meaningful changes.
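The article does not state which statistical test the team used. As a generic illustration of how a favorable shift in 1-to-7 Likert responses might be assessed, a rank-based comparison such as the Mann-Whitney U test is a common choice for ordinal data; the responses below are invented.

```python
# Illustrative sketch only: one common way to test whether Likert-style
# responses shifted after an intervention. The responses below are invented;
# the article does not state which test the Johns Hopkins team used.
from scipy.stats import mannwhitneyu

# Hypothetical 1-7 responses to "The system makes it easy to enter an event."
pre_implementation  = [3, 4, 2, 5, 3, 4, 3, 2, 4, 3]
post_implementation = [5, 6, 5, 7, 6, 5, 6, 4, 6, 5]

# One-sided test: are post-implementation responses stochastically larger?
statistic, p_value = mannwhitneyu(post_implementation, pre_implementation,
                                  alternative="greater")
print(f"U = {statistic}, p = {p_value:.4f}")  # p < 0.05 suggests a favorable shift
```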

With respect to the quality of the data analysis: The analytics built into the Hero system have improved the organization's ability to manage and analyze large amounts of qualitative data. This enhances its ability to progress from a submitted report to an effective solution.

"The Hero application helps our analysts see problems across units and entities that would not be readily visible to the human eye," explains Lori Paine. "These risks may have existed in the care delivery system and been reported in the previous application, but we didn't have the lens to see them."


Key Takeaways

With the Hero application, the Johns Hopkins team has begun shifting the mindset from thinking about events to thinking about risks. "Instead of reacting to harm, we envision a world where we can identify organizational blind spots and proactively mitigate against risks," notes Kasda. In addition to capturing events, the application also facilitates collecting "good catch" nominations. Such reports are particularly significant since they represent instances in which harm was averted, as well as opportunities for implementing measures that would prevent future harm. These situations provide data to recognize staff who go above and beyond to ensure safety.

The team considered several measures to assess the overall impact of the new event-reporting program. Significantly, the team recognized that reporting volumes would be an imperfect indicator of the program's success. For one thing, the intervention was implemented during the COVID-19 pandemic, when reporting volumes naturally fluctuated with pandemic-related changes. Instead, the team considered the key indicators to be user satisfaction, user engagement, the quality of information being received, and the impact of any improvements that can be made as a result of that information. In these areas, the team found evidence of the intervention's success.

Additionally, the team views the number of requests it has received for future enhancements and new workflows as another strong indicator of the success of the Hero system's development and implementation.

One final note from ECRI: Barriers to event reporting exist at virtually all healthcare organizations. That's one factor that prompted ECRI to highlight the value of event reporting, as well as the risks associated with ineffective reporting programs, in the 2023 edition of its Top 10 Health Technology Hazards report. (See Hazard #10: Underreporting Device-Related Issues May Risk Recurrence.) This project by the Johns Hopkins Armstrong Institute team illustrates that some of those barriers can be minimized by thoughtful system design. 


Topics and Metadata

Topics

Accidents; Biomedical Engineering; Culture of Safety; Hazard and Recall Management; Health Information Technology; Incident Reporting and Management; Quality Assurance/Risk Management; Root Cause Analysis

Caresetting

Ambulatory Care Center; Ambulatory Surgery Center; Assisted-living Facility; Behavioral Health Facility; Dialysis Facility; Emergency Department; Endoscopy Facility; Home Care; Hospice; Hospital Inpatient; Hospital Outpatient; Imaging Center; Independent Living Facility; Physician Practice; Rehabilitation Facility; Short-stay Facility; Skilled-nursing Facility; Substance Abuse Treatment Facility; Trauma Center

Roles

Allied Health Personnel; Biomedical/Clinical Engineer; Clinical Practitioner; Information Technology (IT) Personnel; Nurse; Patient Safety Officer; Quality Assurance Manager; Regulator/Policy Maker; Risk Manager

Information Type

Guidance

