
OVERVIEW
TIMELINE
April 2019 - January 2021
(3 phases)
TEAM
ROLE
Lead researcher
RESEARCH METHODS
Interviewing, Task analysis, Usability testing, Card sorting
Jefferson Health's Patient Safety and High-Reliability department requested a usability assessment prior to the enterprise-wide implementation of a new event reporting platform, RLDatix (rebranded as OnPoint after purchase), at Jefferson Health.
OnPoint is intended to help clinical staff participate in the continuous improvement of care, and to help clinical and operational leaders and managers monitor and improve safety, quality, and experience.
THE PROBLEM
Thomas Jefferson University Hospitals needed a new enterprise-wide event reporting platform to replace the assortment of software used across hospitals in the Jefferson network. Key issues with the existing systems were:
​
-
Fragmented system: Different locations within the Thomas Jefferson University Hospital network used different tools for event reporting which made it difficult to submit, monitor, assess, and resolve events.
-
Customization limitations: Localizing the product at different Jefferson sites was challenging.
-
Regulatory needs not met: Some locations did not offer native PA PSRS (Patient Safety Reporting System), a critical regulatory requirement.
-
Demotivation due to effort involved: The time required to fill out event report forms was a major factor in whether staff were motivated to submit them.
The goal of this assessment was to evaluate the effectiveness, efficiency, and intuitiveness of the new error reporting software. Expert opinions from stakeholders and feedback from target users were gathered to inform the design of the error reporting platform prior to implementation. The assessment would help ensure the safety of the staff and patients served at Jefferson Health.
"The value of your assessment is that we have an opportunity on the purchasing side of the contract to make demands related to resolving any usability concerns."
- Dr. Jonathan Gleason, Chief Clinical Officer
RESEARCH
Patient safety event reporting systems are ubiquitous in hospitals and are necessary for tracking patient safety events and improving quality issues. Event reporting systems provide insights that drive safer, better care and help manage, monitor, and analyze event reports for better outcomes within a clinical setting.
How does event reporting work?

PHASE I
Pre-purchase assessment through usability testing
During the vendor selection process, we conducted a task analysis with 10 nurses to understand how easily and fluidly they could submit event reports in RLDatix. Nurses are busy with patient care, which means submitting an event report is essentially competing for their attention. To be considered a suitable replacement, the platform had to be intuitive enough that nurses would not experience cognitive overload.
GOALS
-
Test product features with the target audience
-
Reveal friction points and confusing experiences
-
Identify expectations in order to establish familiarity and easy transition
We identified a set of users and continued to collaborate with them over the course of this project.

FINDINGS
-
The majority of participants understood and liked the general premise of RLDatix in comparison to their current process and were in favor of switching. They also felt that the learning curve would be minimal.
-
All clinical staff highly appreciated the feedback and progress-tracking features.
-
8/10 participants had difficulty interpreting the navigation icons.
-
The icons for different event categories and the detailed event reporting form consistently received positive feedback from users.
"I think the usability assessment was helpful in deepening our understanding of what we can do with the system in the future, which will be iterative. It also informed good conversations with the parent company on need for customization. It is interesting to consider the role of usability in rollout as well."
- Dr. Oren Guttman, Enterprise Vice President for Patient Safety & High Reliability
A detailed usability report was submitted to the Jefferson Health Patient Safety and High-Reliability department at the end of this phase. It supported the decision-making process of finalizing the contract for the product.
PHASE II
Post-purchase customization
A usability summit was organized to uncover unintentional functionality issues and unexpected errors on the platform. This was followed by a series of conversations with managers from different locations to define a form structure that worked best for collecting feedback on the risk forms.
GOALS
In Phase II, the focus turned to enhancing the event reporting form. Our main goal was to optimize the length of the form in order to increase the number of reports submitted.
​
-
Assess usability concerns related to visual design and functionality, focusing on icons, navigation, and font sizes.
-
Assess usability concerns related to content and cognitive load, focusing on hierarchy and form length.
We used the usability metrics listed below to assess how successfully participants filled out event report forms.


What does the event report form look like?
CARD SORTING EXERCISE
12 participants
joined remotely
*This project turned remote at the beginning of the pandemic.
We had to change our research strategy and find a way to conduct successful remote exercises. We wanted to better understand which categories and subcategories of data fields made sense to end users completing an error report.
We created a Miro-based card sorting exercise. The tasks focused on three aspects.

How should the categories be organized to create a logical flow of data entry?

How should the subcategories within each category be organized?

Which information should be mandatory for reporters and facilitators of events?
FINDINGS
As our participants organized the subfields under each category, they expressed concerns which we grouped under three main issues.
Confusing labels
"There are a lot of things that I want to report but I don't know how to classify them. That's what I think we struggle with right now."
Need for auto-populating field
"If information is not automatically pulled, I don't think the reporters should be expected to fill this out."
Redundant fields
"There are a lot of boxes here and people might not have the time to sit through each one."
"The summit exceeded our expectations, as we accomplished all of our goals and objectives for the time together, with clarity on the next steps, both for short-term and near-term goals."
- Dr. Oren Guttman, Enterprise Vice President for Patient Safety & High Reliability
A card sorting assessment report was created in collaboration with human factors consultant Matt Jesso. It informed the customization of the platform before implementation.
PHASE III
Post-implementation feedback
During the 'Go-Live' process, the quality and safety analytics team set up a 'Virtual Command Center' that focused primarily on measuring usage and technical issues. Our team was tasked with another round of usability tests to assess how users were adapting to the new event reporting platform.
GOALS
-
Understand users' transitional challenges and whether the form submission experience worked well for medical staff.
-
Identify any unique usability challenges they faced.
FINDINGS
Reordering the form sections
"I wish there was a hard stop when somebody is filling in the MRN; they wouldn't be able to complete it if they don't fill it out."
Additional training
"I wish we were provided more training on how to use it. Initially, we were filling out every single field and that was time consuming."
Improve searchability
"I can't close out a medication event until I type in the medication involved. It took me 15 minutes to find the right medication and it is, in my opinion, ridiculous."
NEXT STEPS
Training users
Additional training sessions can be conducted, and an expert can be appointed at each location to guide users.
​
Periodic user testing
Conduct user testing sessions with the configured application, rapid or robust depending on the available timeline.
​
Tracking and follow-ups
Continuous follow-up to improve the form (learning from usage stats, clicks, etc.).
The 'Go Live' event consisted of 12 status calls over the course of 4 weeks, tracking statistics such as event reports submitted and unique logins. At the end of the event, Jefferson reported:
5,500
unique logins on the platform
7% increase in event reporting from FY20
52% of events scored are 'Great Catches'
REFLECTIONS