CEO SUMMARY: Labs using Lean, Six Sigma, and similar quality management methods are now putting these tools to a new purpose. They are being employed to validate the accuracy of metrics designed to monitor and manage work processes directly related to turnaround times and customer satisfaction. In auditing how such data is collected, lab administrators are often surprised at the inaccuracy of the data sets collected and relied upon—often for years—to measure performance.
ACCURATE DATA IS THE CORNERSTONE of laboratory medicine. Accurate data is also fundamental to evaluating the performance of operational work processes in the clinical laboratory and pathology group.
In recent years, a small, but growing, number of laboratories have begun to question the accuracy of data used to measure the performance of individual work processes within their labs.
“Two things motivate these laboratories to question the precision of the data they collect as part of their ongoing quality assurance activities,” stated Rodney Momcilovic, a consultant with the ValuMetrix division of Ortho-Clinical Diagnostics, a business unit of Johnson & Johnson.
Measuring Individual Steps
“First, as a laboratory organization adopts the quality management methods of Lean, Six Sigma, and process improvement, its staff learns new ways to measure individual steps in a work process,” explained Momcilovic. “One consequence of these new skills is that staff begins to recognize flaws in how data is collected.
“The second reason is that the tools and methods used in Lean and continuous improvement give the laboratory’s staff the capability to fix those flaws in data collection,” he said. “Technology may also play a role in fixing the flaws in data collection, because middleware solutions make it easier for lab staff to capture more granular data in real time.”
Momcilovic made these comments to a packed room at last November’s Lab Quality Confab, conducted in San Antonio, Texas. His session was titled “Unlock Major Performance Gains by Managing Your Lab’s Work Flow with Real-Time Performance Measures.”
The keen interest of his audience in how to better utilize real-time data to guide performance improvement projects was itself a statement. It demonstrated how competitive forces in the laboratory marketplace are causing more lab administrators to look inward at their lab’s operations to identify the sources of errors.
These lab managers know that eliminating errors and shortening turnaround times produces a competitive advantage in the marketplace. As this happens, their lab raises the bar in the market, and other labs must step up their performance to maintain their competitive position.
“Essentially, these laboratories have decided to challenge the assumption that much of the data on work processes they have collected for years is accurate,” observed Momcilovic. “Too often, their finding is that this is bad data—and it has been bad data for all the years they collected it.”
Non-Clinical Data Accuracy
“At the heart of quality assurance in the laboratory is the regular audit of processes to be sure that the lab test results are accurate,” he said. “But how often do lab administrators audit the non-clinical data that they use in quality improvement efforts? Not often enough! That is why lab managers may unknowingly be using bad data to make critical decisions.”
“A great illustration of this is the time stamps in the LIS that laboratories routinely use to help measure their turnaround times,” continued Momcilovic. “Time stamps are only as reliable as the process for creating them.
“It’s the old cliché of ‘garbage in, garbage out’,” he added. “If the time stamp is entered incorrectly or at the wrong step in the process, you end up with a completely irrelevant measure.
“Here’s an example of a flawed time stamp process,” stated Momcilovic. “In a consulting project with a lab client, their monthly reports showed that patients waited an average of 11 minutes to have blood drawn, which they thought was very good.
“However, the lab was receiving negative patient satisfaction scores due to long wait times,” he said. “Customer feedback motivated lab administrators to further reduce wait times to improve patient satisfaction—despite the fact that the reports based on the time stamps said the lab’s wait times were good.
“To determine what was actually happening, I took my stopwatch to the outpatient waiting area,” he recalled. “We measured how long patients actually waited from the time they approached the registration desk until the time they were called for their blood draw.
“The findings confirmed that patient dissatisfaction was justified,” Momcilovic revealed. “Patients waited much longer than the time stamp data suggested. Many waited as long as 30 minutes before they were called by the phlebotomist.
“When the lab team reviewed how the time stamp data was collected, the bad data factor was quickly recognized,” he noted. “Clerks entered a time stamp when patients registered. That started the wait time measure.
“To mark the end of the patient wait time, the next time stamp was to be created when the order entry clerks completed the paperwork and moved it to the ready-to-draw basket,” he explained. “However, clerks often created the stamp when they started the paperwork.
“This generated bad data because the time stamp was only measuring the time from registration to the time the paperwork was moved to the order entry basket,” he added. “This was only a small portion of the patient’s actual wait time.
Simple Fix to the Problem
“The fix from bad data to good data was simple,” Momcilovic said. “The second time stamp was now collected at the point when the phlebotomist picked up the order and actually called the patient in for the blood draw.”
Armed with verified data, lab administrators had accurate metrics for improvement. “In this case, the bottleneck was the ordering process,” he explained. “Basic workflow changes were made to the two-person team doing the order entry. Average wait times—now measured by accurate time stamps—fell under eight minutes. Patient satisfaction scores rose.”
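The difference between the flawed and the corrected measurement comes down to which event supplies the second time stamp. A minimal sketch, using hypothetical event times and variable names (not from the article), shows how the choice of end point changes the reported wait:

```python
from datetime import datetime

def minutes_between(start, end):
    """Return the elapsed whole minutes between two time stamps."""
    return int((end - start).total_seconds() // 60)

# Hypothetical event times for a single patient visit.
registered      = datetime(2024, 1, 8, 9, 0)   # clerk time-stamps registration
paperwork_start = datetime(2024, 1, 8, 9, 11)  # clerk begins order entry (flawed end point)
patient_called  = datetime(2024, 1, 8, 9, 30)  # phlebotomist calls the patient (true end point)

flawed_wait = minutes_between(registered, paperwork_start)  # 11 minutes reported
actual_wait = minutes_between(registered, patient_called)   # 30 minutes experienced
print(flawed_wait, actual_wait)
```

With the flawed end point, the lab’s reports show an 11-minute wait; moving the second stamp to the phlebotomist’s call reveals the 30-minute wait the patients actually experienced.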
“Data collection errors like this are common in the lab,” Momcilovic said. “Such bad data makes it tough for lab administrators to reliably identify where the problems exist in their labs’ work processes.”
Momcilovic offered another common way that labs unwittingly collect bad data on work processes. “Assume that big batches of samples arrive in the lab at one time,” he said. “Maybe one accessioner decides to speed up the data entry process by giving all the samples the same drawn and received times.
“Here is an example of how this hides customer dissatisfaction,” he offered. “Take a sample drawn at 7 a.m., but entered by the accessioner as drawn at 7:44 and received at 7:45. If the result is ready at 8:30, the TAT, according to the computer, is 45 minutes. But for the physician waiting for results, it’s a TAT of 90 minutes, double what your data shows.
“If you use that bad data to benchmark your service, it hides the defects in your lab’s service,” Momcilovic observed. “In the meantime, the physicians are unhappy and your lab team cannot understand why.”
“Such a simple habit makes it impossible for management to use the time stamp data to know how long it takes the samples to get to the lab,” he continued. “It takes just one out of 10 accessioners taking this shortcut to seriously skew the time stamp data in the LIS.”
Important To Audit
Momcilovic recommends that laboratories regularly audit the process of data collection to improve its accuracy. “Your lab’s quality control (QC) team regularly audits the quality of analytical data. Why shouldn’t the data sets used to determine the performance of work processes also be audited?” he asked.
“The best audits physically watch how things are done,” he continued. “Lab directors are always surprised at what we find when we audit their processes. The process may be a good process. But should the people who perform that process take shortcuts, it could render the collected data unreliable.”
While variations in data may seem trivial, they can seriously skew statistical data, especially when the data is averaged over a month or over a year. “Seconds count, minutes matter, and hours add up,” he said.
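How much can one shortcut skew an average? A rough sketch, with invented volumes, applies Momcilovic’s one-accessioner-in-ten scenario to a month of samples:

```python
# Suppose 1,000 samples a month each actually take 60 minutes to reach the lab.
# Nine accessioners record true times; one (10% of volume) enters identical
# drawn/received stamps, so those samples appear to take 0 minutes in transit.
true_minutes = [60] * 1000
recorded_minutes = [60] * 900 + [0] * 100  # one in ten takes the shortcut

true_avg = sum(true_minutes) / len(true_minutes)              # 60.0 minutes
recorded_avg = sum(recorded_minutes) / len(recorded_minutes)  # 54.0 minutes
print(true_avg, recorded_avg)
```

Under these assumed volumes, a single accessioner’s habit shaves six minutes off the monthly average transit time, making the lab look faster than it is.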
Auditing can also help lab managers accurately identify the source of bottlenecks. “One client was preparing to purchase a new piece of equipment because doctors told him that it was taking too long to get results from the analyzer,” Momcilovic said. “When we audited the processes, he discovered that the analyzer was providing rapid results, but the results often sat waiting for verification by a med tech. If he hadn’t audited the process, he might have spent $250,000 and gotten no improvement.”
Implementing A Solution
“Instead, the lab administrator changed the process to ensure a regular time interval for frequent results review and release until autoverification could be incorporated,” he said. “This delivered a big improvement in speed at no cost.”
While physical auditing takes time, it is often the only way to verify that the data is accurate. “If you use that data to create budgets and determine staffing levels, to guide capital expenditures, or to alter processes in your laboratory, you need to know that the data is credible and that it truly measures what you are intending to measure,” said Momcilovic.
“If you do the audit and it turns out the data is reliable, you haven’t wasted your time. If the data is ever questioned, you’ll have first-hand knowledge of its credibility,” he added.