Our Editor Gets His Vitamin D Test Results From 9 Different Labs

Do different Vitamin D methods confuse doctors?

CEO Summary: Editor-In-Chief Robert L. Michel gave blood for the cause and it’s another laboratory industry first! To understand what doctors and patients see as national labs use different methodologies and reference ranges to report Vitamin 25(OH) D results, his blood was tested 24 times by nine laboratories. The results were unveiled at the Executive War College last May in New Orleans. These results are published here, along with comments from the All-Star Vitamin D Panel experts who discussed reasons why doctors might be confused and might misinterpret Vitamin D lab test results.

UNTIL RECENT YEARS, THE VITAMIN D TESTING MARKET was a rather quiet, uncontroversial corner of the lab testing marketplace. This was true because of the widespread acceptance and use of a long-established, FDA-cleared immunoassay test for Vitamin 25(OH) D.

However, this status quo in Vitamin D was disrupted when some national laboratories began performing Vitamin 25(OH) D testing using liquid chromatography–tandem mass spectrometry (LC-MS). For a variety of reasons, this different methodology introduced a new element of complexity for physicians and their patients.

In recent years, laboratory scientists and pathologists have begun to publicly discuss and debate the pros and cons of testing for Vitamin 25(OH) D by each of the available methodologies. Much of this discussion centers on analytical precision.

However, this scientific debate among laboratory testing professionals about the analytical precision of different methodologies often fails to recognize the needs of physicians and their patients. Clinical laboratory testing is done at the specific request of a physician who is evaluating and treating a patient. These physicians and patients are the true customers of the clinical laboratory. Thus, their needs and expectations for Vitamin 25(OH) D testing should be addressed in the public discussions of laboratory scientists.

During the All-Star Vitamin D Panel at the Executive War College in New Orleans last May, the perspective of the patient was introduced in a novel way. Robert L. Michel, moderator of the panel and Editor-In-Chief of THE DARK REPORT, shared the results of 24 Vitamin D tests performed on his blood by nine different laboratories in the United States. It was a revealing moment, both for the five experts on the panel and for the entire audience. Twelve of Michel’s Vitamin 25(OH) D results were reported by immunoassay methods and 12 were reported by home brew tandem mass spectrometry methods.

A sidebar on page 13 presents the Vitamin D results as reported to each of the two laboratories which received blood drawn from Michel at the same time. These two labs aliquoted Michel’s samples and sent two aliquots—about 21 days apart—to each send-out laboratory.

To illustrate why a physician and a patient could be confused, the sidebar on page 15 presents all the individual Vitamin D results reported on Michel’s blood so as to show the low-to-high range of numbers.

When viewed from the perspective of a physician and a patient, the potential for confusion—as well as misdiagnosis and/or mistreatment—was obvious. That’s because, although Michel’s Vitamin D level is clearly in the sufficient range (above 30 ng/mL), individual Vitamin D results reported his level to be as low as 36 ng/mL and as high as 66 ng/mL!

Are Physicians Confused?

This illustrates a problem that generally goes unaddressed when laboratory scientists discuss and debate the analytical accuracy and performance of different methodologies and reference ranges used in testing for Vitamin 25(OH) D. That problem is the potential for physicians and patients to be confused as they attempt to interpret results generated by different methodologies, in the context of reference ranges that themselves reflect no scientific consensus.

There are two dimensions to this problem of potential confusion. First, different Vitamin 25(OH) D testing methodologies have a recognized bias relative to each other. That bias can be quite significant between individual laboratories, based on how they have set up the particular Vitamin D methodology they use in their laboratory. Clinicians are frequently unaware of this bias when they evaluate Vitamin D results reported on their patients by different laboratories using different Vitamin D methodologies.

Second, the reference ranges used by various laboratories in the United States to report their Vitamin D results do not reflect a single standard supported by the general consensus of the scientific community.
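As a toy illustration of this second dimension, consider how a single numeric result can earn opposite clinical labels under two reference schemes. The cutoffs below are invented for illustration and are not drawn from any actual laboratory’s report:

```python
# Illustrative sketch: the same numeric result interpreted against two
# hypothetical reference ranges. Cutoffs are invented for illustration.

def interpret(value_ng_ml, cutoffs):
    """Return the first category whose lower bound the value meets.

    cutoffs: list of (lower_bound, label) pairs, highest bound first.
    """
    for lower_bound, label in cutoffs:
        if value_ng_ml >= lower_bound:
            return label
    return "deficient"

# Hypothetical Lab A: sufficiency begins at 30 ng/mL
LAB_A = [(30, "sufficient"), (20, "insufficient")]
# Hypothetical Lab B: sufficiency begins at 20 ng/mL
LAB_B = [(20, "sufficient")]

result = 20  # the same reported value, ng/mL
print(interpret(result, LAB_A))  # insufficient
print(interpret(result, LAB_B))  # sufficient
```

Under Lab A’s scheme the patient appears to need treatment; under Lab B’s, the same number reads as adequate. This is precisely the interpretive trap created when reference ranges lack a consensus standard.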

However, there is a de facto standard! It is familiar to any clinician who has practiced medicine since 1993 and has worked regularly with Vitamin 25(OH) D testing since that date. It is the first FDA-cleared predicate “device” or kit, the radioimmunoassay (RIA) manufactured by DiaSorin.

Since 1993, the broader medical profession has become familiar with the reference ranges associated with use of the FDA-cleared immunoassay. That means physicians who actively measure and manage the Vitamin D levels of their patients are quite familiar with the meaning of these reference ranges. They understand how immunoassay results reported on their patients should be interpreted relative to these broadly-accepted reference ranges.

Separately, over the past 20 years, a significant number of published clinical studies involving Vitamin 25(OH) D gathered data using the immunoassay methodology. Physicians aware of the findings of these studies have used the recommended Vitamin D levels and reference ranges suggested by these studies and based on use of the FDA-cleared predicate device/kit in the studies.

De Facto Vitamin D Standard

This de facto standard exists today in the clinical marketplace. It often goes unremarked and undiscussed when laboratory scientists debate the analytical accuracy of their preferred Vitamin D methodology.

But this de facto standard is the source of another practical problem for clinicians and patients. Assume a laboratory introduces a home brew test for Vitamin 25(OH) D into clinical use that has a significant bias relative to the FDA predicate device (which is the RIA manufactured by DiaSorin). Scientifically and ethically, how should the laboratory communicate the fact of this bias in the patient’s result to the physician—particularly if the physician has almost 20 years of clinical experience in use of the FDA-cleared methodology for testing Vitamin 25(OH) D?

Further, suppose the reporting laboratory uses essentially the same reference range that accompanies the FDA-cleared Vitamin 25(OH) D immunoassay. Scientifically and ethically, how should that laboratory alert the physician to the bias of the reported result? And how might that bias affect the physician’s interpretation of the patient’s result against a reference range that may be almost identical to that of the FDA-cleared predicate methodology?

Real-World Consideration

These are not theoretical questions. In the laboratory marketplace, competing laboratories are getting complaints about the “inaccuracy” of their Vitamin D results. Some doctors even accuse the laboratory they use of reporting flawed results. The laboratory accused of such misdeeds quickly recognizes that the complaining doctor is often confused because another laboratory serving his medical practice may be using a different Vitamin D methodology, with a bias that the doctor has not recognized—and that the reporting laboratory has not disclosed to him.

Certainly laboratory scientists recognize the problem created for clinicians by the lack of a standardized reference range for Vitamin 25(OH) D levels. But seldom does a pathologist or clinical chemist publicly address how and why the current situation could be troublesome for physicians, and how it might possibly contribute to less-than-ideal care for patients.

That was not the case with the All-Star Vitamin D Panel at the Executive War College. Using Editor Michel’s 24 Vitamin D results from nine laboratories, the experts were willing to acknowledge the two practical problems—from the perspective of physician and patient—created by: 1) bias in different methodologies that goes unrecognized by clinicians; and, 2) how laboratories establish their reference ranges for reporting Vitamin D results.

Coefficient Of Variation

“In looking at these 24 Vitamin D test results, what jumps out for me is the very tight spread of results, 40 to 48 ng/mL, among the laboratories which performed the test by immunoassay,” observed L.V. Rao, Ph.D., who is Director of the Core Laboratory and Immunology at the UMass Laboratory in Worcester, Massachusetts. “By contrast, labs performing the test by LC-MS have a much wider spread, 36 to 66 ng/mL.”

Rao had earlier shared with the audience the findings of his laboratory as it developed a home brew LC-MS assay to meet the requests of outreach physicians for this methodology. In THE DARK REPORT issue of July 20, 2009, the data presented by Rao was published, along with Rao’s analysis and comments. In evaluating results of the home brew LC-MS against the immunoassay, Rao stated that “The data showed a fairly acceptable correlation (r=0.80), but with significant bias (approximately 40%).”
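For readers who want the arithmetic behind such comparisons, the coefficient of variation is simply the standard deviation divided by the mean. A minimal sketch follows, using invented values chosen only to mimic the spreads under discussion (immunoassay results clustered near 40–48 ng/mL, LC-MS results scattered from 36 to 66 ng/mL); these are not Michel’s actual numbers:

```python
# Illustrative only: compute the coefficient of variation (CV = SD / mean)
# for two hypothetical sets of results on the same specimen. The values
# are invented to mimic the spreads discussed, not actual reported data.
from statistics import mean, pstdev

def cv_percent(results):
    """Coefficient of variation as a percentage (population SD / mean)."""
    return pstdev(results) / mean(results) * 100

immunoassay = [40, 42, 44, 45, 46, 48]   # hypothetical, ng/mL
lc_ms       = [36, 45, 50, 55, 60, 66]   # hypothetical, ng/mL

print(f"immunoassay CV: {cv_percent(immunoassay):.1f}%")  # about 5.9%
print(f"LC-MS CV:       {cv_percent(lc_ms):.1f}%")        # about 18.9%
```

Even with made-up numbers, the point survives: the wider the scatter of results on the same blood, the larger the CV, and the harder it is for a clinician to trust any single reported value.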

Similar points caught the attention of the other Vitamin D panelists as they viewed Michel’s 24 Vitamin D test results. “Three things stand out as I view these results,” stated Julian Barth, Ph.D., Consultant in Chemical Pathology & Metabolic Medicine, The General Infirmary at Leeds, Leeds, West Yorkshire, United Kingdom. “First—and a point which I think is quite important for your clinicians—is that all these labs use different reference ranges. It’s the same data. What are the reasons why they report these data framed by such different numbers for their reference ranges?

“Second, for laboratories using the immunoassay methodology, this data is a testament to DiaSorin’s manufacturing ability. The uniformity in the performance of the DiaSorin kits is stunning,” he noted.

“Third, I’d like to build on earlier comments about analytical accuracy and standardization,” Barth explained. “Going forward, Michel’s Vitamin D results demonstrate why a key need for mass spectrometry is to provide the same answer everywhere, in the same way that labs using the immunoassay kits are demonstrating standardization.”

Andre Valcour, Ph.D., spoke directly to the consequences of a physician attempting to understand the clinical significance of such a wide range of Vitamin D results and reference ranges. “If Michel’s physician called me, I could tell him or her that Michel’s Vitamin D levels are good. Most patients we see don’t have Vitamin D levels in the 40s, like Michel,” stated Valcour, who is Vice President and Laboratory Director at Laboratory Corporation of America in Burlington, North Carolina.

Confusing To Physicians

“But what would this range of results tell a physician if they were much lower?” questioned Valcour. “Let us say that Michel’s Vitamin D value by the DiaSorin method was 20, and his doctor sent me results from some of these labs. I would tell him that his patient is low and needs to be at a minimum of 32.

“Assume this doctor sends Michel’s sample to another lab and that lab reports it as a 20 with a reference range that says 20 is ‘sufficient.’ That confuses both Michel and his physician.

“Not surprisingly, the physician will say, ‘I don’t trust this test. This whole vitamin D stuff is hokum. I can’t trust the results because two laboratories tell me different things based on the same value. One tells me my patient is deficient. Another lab tells me my patient is completely replete’,” observed Valcour.

“This is too confusing,” he added. “It may discourage the physician from ordering the test. But the worst thing is it may also discourage both the physician and this patient from monitoring his Vitamin D levels. That’s not a good outcome, because so many new clinical studies indicate that maintaining higher levels of Vitamin D can contribute to better health and extend life expectancy.”

Valcour sees this confusion among physicians almost daily. “It is real world that different laboratories are reporting different numbers and using reference ranges that doctors find bewildering,” he said. “I deal with this situation every day. I inevitably get the call from the doctor who has sent a sample to another lab as well as to mine, and our two labs report different results on the same patient. Quite frankly, it’s time for laboratories to solve this Vitamin D testing issue and I’m really glad we are here today to talk about it.”

Tackling The Problem Of Bias

One panelist wanted to tackle the issue of bias between methodologies head on, picking up on Valcour’s theme. “For a clinician, bias in the results generated by one Vitamin D methodology versus another has consequences in his medical practice,” offered Bruce Hollis, Ph.D., Professor of Pediatrics and Neonatology at the Medical University of South Carolina in Charleston, South Carolina.

“Say a lab runs a mass spec Vitamin D assay that consistently produces a result number that is 40% above the RIA, which is the FDA-cleared predicate device/kit,” he continued. “There are some outside assessments that indicate such a 40% higher result was true of the mass spec assay at Quest Diagnostics Incorporated at the time they introduced their internally-developed lab test into general clinical use. In my example, assume this lab uses 20 as the cut-off for sufficient and reports the patient at 20.

“However, based on the RIA, the patient would test at 12 or 15, which is half of the target 30 used by most labs—and this patient is clearly insufficient!” postulated Hollis. “In this scenario, the doctor will tell the patient he or she is okay and doesn’t need treatment for Vitamin D insufficiency. To me, this is not good medicine. It’s potentially harmful across many disease spectrums for the patient. And the source of the confusion is bias between two Vitamin D methodologies, which is not explained to the physician by the lab using the mass spec assay with that higher bias.

How Patients Are Affected

“Moreover, this problem is not a concern for the upper level of results,” noted Hollis. “Doctors seldom see patients with 70 or 80. Rather, the concern is at the low levels of Vitamin D which are seen constantly, all the time. Thus, how the laboratory defines that low range is the real key to helping the physician develop the right treatment plan. It’s my view that setting a low-end level of 20 as sufficient, and then reporting results with a bias of 1.4 over the RIA is not good lab medicine.”
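Hollis’s back-of-the-envelope math can be sketched as follows. The 1.4x (40%) bias factor and the cutoffs are taken from his hypothetical example, not from any laboratory’s actual assay:

```python
# Sketch of Hollis's hypothetical: a mass-spec assay that consistently
# reads 40% (1.4x) above the RIA predicate. Both the bias factor and the
# cutoffs come from his example, not from any lab's actual assay.
MASS_SPEC_BIAS = 1.4        # assumed: mass-spec result = RIA result x 1.4
RIA_SUFFICIENT_CUTOFF = 30  # ng/mL, the target used by most labs

def ria_equivalent(mass_spec_value):
    """Back out the RIA-scale value implied by a biased mass-spec result."""
    return mass_spec_value / MASS_SPEC_BIAS

reported = 20  # ng/mL on the biased assay, right at that lab's cutoff
implied = ria_equivalent(reported)
print(f"mass-spec 20 ng/mL implies ~{implied:.0f} ng/mL on the RIA scale")
# ~14 ng/mL, less than half the 30 ng/mL target: clearly insufficient
```

The patient the biased assay calls “sufficient” at 20 ng/mL sits near 14 ng/mL on the RIA scale that most published studies and reference ranges assume, which is the harm Hollis describes.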

Recognizing the role of standardization, panelist Russell Grant, Ph.D., Strategic Director, National Office of Quality & Science at Esoterix, Inc., in Burlington, North Carolina, stated, “The need is clear. The first pass for the laboratory profession is standardization to address these issues. The second pass is harmonization—or at least a gold standard underneath that, since calibration is one element of harmonization.”

Progress In Vitamin D Testing

As the comments by the experts participating in this All-Star Vitamin D Panel demonstrate, there is plenty of evidence that physicians and patients can be confused when they interact with multiple laboratories, each using different Vitamin 25(OH) D testing methodologies. Additional intelligence briefings on the Vitamin D testing marketplace will be forthcoming.

