Posted by Sten Westgard, MS
In a recent issue of Clinical Chemistry, an editorial reviews the current state of Vitamin D testing: "There is common agreement that 25-OHD is a 'difficult' analyte."
25-Hydroxyvitamin D: A Difficult Analyte, Graham D. Carter, Clin Chem 2012;58(3):486-488.
At the same time, the editorial notes that marked progress is being made:
"Nevertheless, results submitted to the international Vitamin D External Quality Assessment (DEQAS) have shown a gradual reduction in interlaboratory imprecision (CV) in recent years - from >30% in 1995 to 15% in 2011."
The question is, is that reduction in imprecision good enough? Or is 25-OHD still too "difficult" an analyte to meet the quality required?
More after the jump...
Assessing the quality required by Vitamin D assays has been a tough problem for years. What's the CLIA proficiency testing criterion for Vitamin D? (crickets chirping). There isn't one. What's the Rilibak goal? (wind whistling). There isn't one. What about specifications derived from the biologic variation database? (paint drying). Sorry, not even there. None of the major sources of quality requirements has been able to come up with a specification for 25-OHD.
Well, until now. Last year (2011), Viljoen, Singh, Farrington and Twomey conducted a biological variation study on 25-OHD, and from that, specified analytical goals for assay performance:
Analytical Quality Goals for 25-Vitamin D Based on Biological Variation, Adie Viljoen, Dhruv K Singh, Ken Farrington, Patrick J Twomey, Journal of Clinical Laboratory Analysis 2011;25:130-133.
In this study, they identify minimum, desirable, and optimum goals for imprecision, bias, and allowable total error:
- Imprecision (CV): 9%, 6%, and 3%, respectively
- Bias: 21%, 10.5%, and 5.3%, respectively
- Allowable total error (TEa): 32.2%, 21.7%, and 16.4%, respectively
Let's focus on the last set of numbers. If assays have imprecision of 30% and the minimum quality requirement is 32.2%, that doesn't bode well for performance. No wonder the method was considered difficult back in the 1990s; the imprecision alone was consuming almost the entire error budget, leaving no room for bias, for the performance characteristics of the QC procedure, or for any margin of safety. If current method performance is closer to 15% CVs, that still gives us only about 2-sigma performance (32.2 / 15 ≈ 2.1), even against the most lenient quality requirement specified by biologic variation (the minimum goal of 32.2%). Again, that much variation leaves little room for bias, the response of the QC procedure, etc.
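For readers who want to check that arithmetic themselves, here's a minimal sketch of the sigma-metric calculation in Python, assuming zero bias (the sigma metric is simply (TEa - bias) / CV):

```python
def sigma_metric(tea, cv, bias=0.0):
    # Sigma metric: (TEa - |bias|) / CV, with all terms in percent
    return (tea - abs(bias)) / cv

print(round(sigma_metric(32.2, 30), 1))   # 1.1 -- 1995-era CVs
print(round(sigma_metric(32.2, 15), 1))   # 2.1 -- 2011-era CVs
```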
So while methods are improving, it's clear that they need to do significantly better than 15% CVs. If we are aiming for minimum performance (3-sigma) against the minimum quality requirement (32.2% TEa), we would want about a 10% CV (32.2 / 3 ≈ 10.7%).
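Turning the same formula around gives the CV a method would need to reach a given sigma level, again assuming zero bias:

```python
# CV required to hit a target sigma level within a TEa (bias assumed zero)
def required_cv(tea, sigma, bias=0.0):
    return (tea - abs(bias)) / sigma

print(round(required_cv(32.2, 3), 1))   # 10.7 -- hence "about a 10% CV"
```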
What's interesting is that a 2009 study by Stöckl, Sluss, and Thienpont had already reached this conclusion. In that study, they specified that a routine method should have a CV < 10% and a bias < 5%. If we reverse-engineer that into a 3-sigma quality requirement, we get approximately 35% (5% + 3 × 10%). That's very close to the 32.2% requirement specified by the more recent study.
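The reverse-engineering step is just the sigma-metric formula rearranged to solve for TEa; a quick sketch using those stated limits:

```python
# Stockl et al.'s routine-method limits (CV < 10%, bias < 5%),
# rearranged into the TEa a 3-sigma process with that performance implies:
bias, cv = 5.0, 10.0
tea = bias + 3 * cv
print(tea)   # 35.0 -- close to the 32.2% minimum TEa from Viljoen et al.
```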
Undoubtedly, Viljoen et al.'s study will eventually be incorporated into the biologic variation database. For now, it may be most realistic to adopt the minimum quality specification (32.2% TEa) and set method performance goals for imprecision of less than 10%, with a future goal of reducing that to below 7%. Performance at that level would then achieve 3-sigma against the desirable quality specification (21.7 / 3 ≈ 7.2%).
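One last back-of-the-envelope check, again assuming zero bias:

```python
# CV needed for 3 sigma against the desirable TEa of 21.7% (bias assumed zero)
print(round(21.7 / 3, 1))   # 7.2 -- hence the "below 7%" target
```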
It looks like D will stand for Difficult for a few more years to come.
[A footnote: I realize that this analysis doesn't take into account the significant biases and interferences that may be present. Current assays face challenges not only in their precision but also in their standardization and their lot-to-lot consistency. Carter's editorial goes into more and better detail on those subjects. I focused here on the precision issue.]