CEO SUMMARY: National headlines about erroneous Vitamin D results are a reminder to the lab industry of the imprecision and risks associated with home brew testing. According to one laboratory expert, every laboratory-developed test (LDT) must meet two high standards. One, accuracy, reproducibility, and transferability of the reported test result. Two, a reference range that is easily understood by clinicians and consistent with published studies and existing lab test methodologies.
LABORATORY-DEVELOPED TESTS (LDTs), commonly called “home brew” tests, remain a controversial subject across the lab industry. Recent FDA actions and publicity about a major failure in one lab’s home brew test program raise the prospect that tighter regulation of LDTs may soon follow.
Labs regularly create and introduce new home brew tests to clinicians. Thus, both the number of home brew assays and their collective test volume have grown steadily. This is particularly true in recent years because a variety of new technologies created opportunities for labs to develop useful diagnostic assays.
In the case of Quest Diagnostics Incorporated, its decision to develop a home brew assay using liquid chromatography-tandem mass spectrometry (LC-MS/MS) as a high-volume clinical test is probably now the best-known example of a laboratory-developed test. That’s because the assay became a national news story after Quest admitted it had reported erroneous results on tens of thousands of patients over an 18-month period. (See TDR, December 22, 2008.)
To provide laboratory directors and pathologists with insights about the issues associated with home brew assays, THE DARK REPORT contacted James Nichols, Ph.D., who is director of the high-complexity laboratory at Baystate Medical Center in Springfield, Massachusetts. An expert in lab test errors, Nichols is a frequent speaker on this and other topics at laboratory medicine conferences, both in the United States and abroad. He has more than 15 years of experience as the director of a high-complexity lab.
Setting Up The Test
“When I hear news about erroneous test results, such as with Quest Diagnostics’ internally-developed Vitamin D test, it focuses my attention on how well the test was set up,” Nichols commented. “Only Quest knows precisely what happened with its Vitamin D assay. From public comments, it believed it had a good calibrator. It set the assay to that calibrator, then later determined that the calibrator did not agree with the immunoassay.
“One interesting question is how Quest Diagnostics validated its assay and determined its reference ranges based on the results of that validation,” he added. “Any laboratory-developed test should be validated so that the numbers are interpretable by the physicians. That doesn’t always happen because of cost pressures confronting laboratories today.
Validate Reference Ranges
“For any home brew test, the laboratory must succeed in two dimensions,” said Nichols. “One, has the test been developed so that it produces a reliable, reproducible result that is clinically useful and transferable? Two, did the laboratory establish and validate reference ranges that can be understood by the physician?”
Nichols participates in several national laboratory committees and work groups. He is actively involved in helping professional bodies establish appropriate policies and guidelines in many areas of laboratory testing and operation, including laboratory-developed tests.
“It seems that laboratories are striving to bring up LDTs more quickly than they did in the past and at less cost,” Nichols explained. “When labs attempt to keep costs down, they tend to cut corners. That can affect the reliability of the test results for that home brew assay.
Trust In Laboratory Results
“Of course, there are minimum standards that must be met when validating each test and allowing it to be interpretable,” continued Nichols. “For most laboratorians, the issues behind validating new tests come down to trust in the laboratory results. You have to ask yourself, ‘If this were my assay, how much validation of the assay would I want to do?’ As it is, validation is left up to each laboratory.
“Laws in the various states tend to be very general in what a laboratory is required to do when it validates an assay,” Nichols noted. “Also, labs use a variety of methods when validating. In the case of Vitamin D assays, that might explain why different laboratories report Vitamin D results using different reference ranges. This creates a problem for physicians if they do not understand how the various numbers reported by different labs relate to their patients.
“An additional complication with home brew testing is that, for a specific assay, there may be several different methodologies to perform that particular test. That is true of 25(OH) vitamin D,” noted Nichols. “There are immunoassays for Vitamin D, and each has its own reference range. Next, there are labs using mass spec (a high-complexity assay) for home brew 25(OH) vitamin D testing. The labs using mass spec are calibrating their standards in different ways, and that means the reference ranges need to be revalidated in each of those settings.
“Typically what happens with high-complexity lab tests is that the lab will adopt a reference range from the literature,” he said. “If scientific or peer literature exists that identifies the target range, then a laboratory can adopt that range because that’s what physicians are used to seeing. Is this good or is it bad for a lab to adopt such a published range? In certain situations, there is no clear-cut answer to guide the laboratory.
“Let me explain,” he continued. “Typically when our lab sends out a test to a reference lab that we don’t know, I ask in-depth questions about how the lab validated the reference range. I want to know if the ranges were pulled out of the literature, particularly when tests for therapeutic drugs are involved.
“The laboratory performing that test may not have assessed data on 100 normal patients who are stable and free of complicating diseases and interferences,” explained Nichols. “If the lab doesn’t do so, how does it know that it has determined the proper range for that test? At a minimum, the lab should match the ranges published in the literature. Ideally, the lab should do much more to properly validate the test.
Can Doctor Interpret Results?
“Take 25(OH) vitamin D as an example. Some labs will calibrate against the immunoassay and some labs will calibrate against the reference standards available from NIST,” he said. “Both approaches are right. But this raises an important question for the laboratory performing the test: Do referring physicians know how to interpret the Vitamin D result reported by my lab? Or has the laboratory reported results in such a manner that the physician may fail to accurately understand what the results mean and how to proceed in treating the patient? This is an important question to answer when a laboratory validates a reference range, particularly when bringing up an assay developed in house.
“When a laboratory develops a test and it is the only lab performing that test, it must ask itself several hard—and essential—questions,” advised Nichols. “How does the lab know the test is accurate? How does the lab know that the number it produces and reports today will be the same number it produces and reports 10 years from now? Against what is the lab standardizing? How does the lab set the bar for this particular test?
“In the case of Vitamin D, these questions become particularly relevant because there are many labs offering Vitamin D tests,” stated Nichols. “There are manufactured test kits that have been cleared by the FDA and there are home brew tests. When a lab introduces a Vitamin D home brew test, it must ask itself, ‘What are clinicians used to seeing and do the results of my Vitamin D test merge with what physicians expect to see out in the field?’
“When we develop a home brew test here in the lab at the Baystate Medical Center, we operate to stringent standards,” he continued. “For example, I would want to make sure I have 100 males and 100 females of different age groups—particularly if age or gender is a factor that affects the test result.
“Of prime importance, I want a large enough group of individuals so that I can actually pinpoint the appropriate range that physicians would expect when interpreting the result,” stated Nichols. “In other words, I want to know: Is the result I produce when calibrating my home brew assay a normal result or is it disease?
“When a lab develops a home brew assay, it takes time and money to accomplish this evaluation,” he added. “It is not something that a laboratory can do overnight. And it’s often difficult to define what is ‘normal.’ This is particularly true of 25(OH) vitamin D, where the science is evolving and there is ongoing debate in the clinical community about what levels of Vitamin D are necessary for optimal health.”
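The reference-range study Nichols describes is often summarized, in clinical laboratory practice, as the central 95% of results from presumed-healthy donors (the approach codified in the CLSI EP28 guideline). The sketch below is a hypothetical illustration in Python: the donor values are invented, and the `reference_interval` helper is an assumption of this article, not part of any lab’s actual procedure.

```python
import random
from statistics import quantiles

def reference_interval(values):
    """Central 95% interval: the 2.5th and 97.5th percentiles of the data."""
    cuts = quantiles(values, n=40, method="inclusive")  # 39 cut points
    return cuts[0], cuts[-1]                            # 2.5% and 97.5%

random.seed(1)
# Invented 25(OH)D values (ng/mL) for 100 male and 100 female donors;
# a real validation uses characterized specimens, not simulated data.
cohort = {
    "male":   [random.gauss(30, 8) for _ in range(100)],
    "female": [random.gauss(28, 8) for _ in range(100)],
}
for sex, values in cohort.items():
    low, high = reference_interval(values)
    print(f"{sex}: {low:.1f}-{high:.1f} ng/mL")
```

Partitioning the cohort by sex (and, where the analyte warrants it, by age group) reflects Nichols’ point that demographic factors may shift the expected range.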
Share Samples For LDTs
In establishing a laboratory-developed test (LDT), there are other requirements. “If a lab wants to set up and run a test that is already being performed and reported by another laboratory, it must perform proficiency testing or, when proficiency testing is not available, share samples with that other lab to validate its results,” Nichols added. “That’s part of the College of American Pathologists (CAP) standards and part of the CLIA regulations. The purpose of this requirement is that, by sharing samples, the laboratory gains confidence in the transferability of its test results.
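At its simplest, the split-sample check Nichols mentions is a paired comparison of the two labs’ results. The Python sketch below uses made-up numbers and a hypothetical `bias_and_correlation` helper; a formal method comparison would follow a fuller protocol (for example, CLSI EP09) rather than this bare computation.

```python
def bias_and_correlation(ours, theirs):
    """Mean bias (ours - theirs) and Pearson correlation for paired results."""
    n = len(ours)
    bias = sum(a - b for a, b in zip(ours, theirs)) / n
    mean_a = sum(ours) / n
    mean_b = sum(theirs) / n
    cov = sum((a - mean_a) * (b - mean_b) for a, b in zip(ours, theirs))
    var_a = sum((a - mean_a) ** 2 for a in ours)
    var_b = sum((b - mean_b) ** 2 for b in theirs)
    return bias, cov / (var_a * var_b) ** 0.5

# Ten split samples, 25(OH)D in ng/mL (invented for illustration).
ours =   [18.2, 24.5, 31.0, 12.8, 40.1, 27.3, 35.6, 22.0, 15.4, 29.9]
theirs = [17.5, 25.1, 30.2, 13.4, 38.9, 26.8, 34.9, 22.7, 15.0, 28.8]
bias, r = bias_and_correlation(ours, theirs)
print(f"mean bias = {bias:+.2f} ng/mL, r = {r:.3f}")
```

A small mean bias and a high correlation are necessary, but not sufficient, evidence of transferability; the lab still has to judge whether any bias matters at clinical decision points.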
“Keep in mind, however, that Vitamin D can be problematic when a lab attempts to find the proper reference range,” he said. “If the lab’s patients live in Florida and get out in the sun a lot, it may have a different normal population than if the lab collected samples from people in Maine in the middle of winter who don’t get exposed to much sunlight.
“Additionally, the lab needs to know if the people it sampled are getting nutritional or vitamin supplementation,” said Nichols. “Each of these factors makes it challenging for a laboratory to validate its home brew Vitamin D assay. In my experience, this is not an overnight process because it is difficult to establish appropriate ranges for Vitamin D.
“As a general principle, the issue of validation is particularly troublesome with home brew tests,” observed Nichols. “News that Quest Diagnostics experienced an 18-month period of producing erroneous Vitamin D results from its home brew assay raises valid concerns about how laboratories are validating their home brew tests.
“Obviously, CLIA sets the minimum guidelines for what needs to be validated before a laboratory begins to report results produced by a home brew test,” stated Nichols. “For a high-complexity test, CLIA sets much more stringent standards than for moderate-complexity tests. In particular, CLIA mandates a much higher minimum set of standards that a lab must meet when establishing reference ranges, interferences, and analytical and, potentially, clinical sensitivity.
Validating The Assay
“At Baystate Medical Center, when our lab refers esoteric testing to other laboratories, we typically ask questions to make sure the lab has gone through the typical validations that are necessary to meet those standards,” he commented. “When a lab adds a new home brew test, it is not simply a research test with which it is tinkering. It will be used by clinicians. Thus, only if a laboratory properly validates its new home brew test will it have confidence that its test is ready for prime time.”
Nichols sees the expanding range of technologies that can be used to develop home brew assays as complicating efforts to standardize and regulate laboratory-developed tests. “There is such a wide variety in the types of tests and the types of rigor applied to validating test results that there may be no way to close the gap with more stringent regulations,” he observed.
“If regulation cannot solve the problem, it means that laboratory directors become the front line in meeting the challenge so that their lab produces results that are reliable, reproducible and transferable, as well as similar to the results reported by other labs,” recommended Nichols.
Issues With Vitamin D Test
“It has always been the role of the laboratorian to teach physicians in their hospital and in their community about the differences in testing and to explain why, when a lab test number has been reported, it can’t be assumed that the clinician can treat to that number. Laboratorians help physicians understand how to use that test result in conjunction with a clinical picture of that patient. In other words, no test result should be used in isolation. And, as noted earlier, Vitamin D is a particularly difficult and high-profile test.
“When an error in lab testing occurs, it is always easier in hindsight to criticize what that lab has done,” he said. “The important lesson from recent events comes down to this: The laboratory profession already has regulations, meaning the CLIA standards, which set a minimum level for what needs to be validated. But clearly, when a laboratory meets those standards, there is still the potential for error.”
Many Pitfalls With LDTs
As Nichols points out, there are many pitfalls associated with laboratory-developed tests. Furthermore, the complexity of new diagnostic technologies is likely to further challenge the effectiveness of existing regulations, requirements, and guidelines that a laboratory must follow when developing and introducing a home brew assay.