Posted by Sten Westgard, MS
An important new study on the QC practices of coagulation laboratories was published in the International Journal of Laboratory Hematology.
But here's a question for you: of all these activities, how many occur more than 90% of the time - and which one doesn't?
- Repeat the QC, and if it passes, report results
- Open [and run] new QC
- Look for trending
- Discontinue testing until controls are within limits
- Repeat all patient samples from the last acceptable QC
To find out which one of the activities occurs at a frequency not like the others, follow the jump...
Sadly, it's the last item (repeating patient samples from the last acceptable QC) that doesn't occur more than 90% of the time. It happens, in fact, only 42% of the time.
Here are the other activities and their frequencies:
- Repeat the QC, and if it passes, report results (97%)
- Open [and run] new QC (95%)
- Look for trending (93%)
- Discontinue testing until controls are within limits (90%)
The first two activities are wasteful, but control vendors and manufacturers love them. You're using more controls, more reagent, and generating more revenue for all your suppliers. It's only you in the lab who's feeling the frustration and wasting your time and money.
The waste of this behavior is even more stark when the study lists the control rules in use in most coagulation labs:
- The 1:2s rule is used by 88% of labs
- The 2:2s rule is used by 74% of labs
- The 4:1s rule is used by 53% of labs
- The 10:x rule is used by 37% of labs
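For readers less familiar with the notation, these multirules can be expressed as simple checks on a running list of control z-scores (z = (result - mean) / SD). This is a minimal sketch using the standard textbook definitions of the rules; the function names and example values are mine, not from the study:

```python
# Minimal sketch of the multirules listed above, applied to a list of
# control z-scores, newest last. Thresholds follow the standard
# textbook definitions of the Westgard rules.

def rule_1_2s(z):
    """Warning: the latest control exceeds +/-2 SD."""
    return abs(z[-1]) > 2

def rule_2_2s(z):
    """Rejection: two consecutive controls beyond 2 SD on the same side."""
    return len(z) >= 2 and (min(z[-2:]) > 2 or max(z[-2:]) < -2)

def rule_4_1s(z):
    """Rejection: four consecutive controls beyond 1 SD on the same side."""
    return len(z) >= 4 and (min(z[-4:]) > 1 or max(z[-4:]) < -1)

def rule_10_x(z):
    """Rejection: ten consecutive controls on the same side of the mean."""
    return len(z) >= 10 and (min(z[-10:]) > 0 or max(z[-10:]) < 0)

history = [0.4, -1.1, 2.3, 2.2]                # hypothetical z-scores
print(rule_1_2s(history), rule_2_2s(history))  # True True
```

Note that 1:2s by itself only flags a single excursion past 2 SD; the other rules look at runs of results and are the ones that catch systematic shifts and trends.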
We don't know enough about these labs to know if they actually need all these rules to control their methods. If their assays are performing at Six Sigma, they don't need 2:2s, 4:1s, or 10:x. Violations of those rules might occur but won't indicate medically significant errors.
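As a reminder, the Sigma metric mentioned above is conventionally computed from the allowable total error (TEa), the observed bias, and the imprecision (CV), all in percent. A quick sketch with hypothetical assay numbers (my illustration, not data from the study):

```python
def sigma_metric(tea_pct, bias_pct, cv_pct):
    """Conventional Sigma-metric formula: sigma = (TEa - |bias|) / CV."""
    return (tea_pct - abs(bias_pct)) / cv_pct

# Hypothetical assay: TEa 15%, bias 1.5%, CV 2%
print(sigma_metric(15.0, 1.5, 2.0))  # 6.75 -> Six Sigma performance
```

An assay at or above Six Sigma can be controlled adequately with wide limits and a single simple rule, which is why the full multirule battery is unnecessary there.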
We do know that the 1:2s rule generates a lot of false rejections. When a laboratory uses 2s limits with 2 controls, it can expect a false rejection rate of about 9% - and higher still with 3 controls. It's not surprising that labs subsequently find themselves repeating many controls: the QC limits generate a lot of false alarms, each of which demands some kind of action to convince the lab that nothing is truly wrong. An error in QC rules and limits is followed by an error in QC interpretation and troubleshooting, over and over again.
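Those false-rejection figures follow directly from the normal model: a single in-control result has roughly a 4.5% chance of falling outside 2 SD, and with N independent controls the chance that at least one does grows quickly. A quick sketch of the arithmetic (my own illustration, not code from the study):

```python
import math

def false_rejection(k_sd, n_controls):
    """Probability that at least one of n in-control results exceeds
    +/-k SD, assuming independent, normally distributed controls."""
    p_within = math.erf(k_sd / math.sqrt(2))  # P(|z| <= k) for one result
    return 1 - p_within ** n_controls

for n in (1, 2, 3):
    # roughly 4.6%, 8.9%, and 13% at 2s limits
    print(n, round(false_rejection(2, n), 3))
```

Widening the limits to 3s drops the single-control false rejection rate to about 0.3%, which is why properly designed QC limits produce so many fewer false alarms.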
A much better approach is to design QC based on an objective assessment of quality and to optimize the limits being implemented. That way, when an outlier occurs, it is much more likely to be a true problem - one that doesn't mean simply repeating controls until they fall back in, but actually demands genuine troubleshooting and a repeat of patient testing.
What's interesting, of course, is that these are the same problems we see in chemistry testing, even though in theory we've had more time and experience to correct these practices. Alas, I believe the bad habits of chemistry infected the coag practices. Nor do I believe these practices are confined to Canadian coagulation labs - they're probably common throughout the world.
This excellent study has many more revelations and useful recommendations. The full citation:
McFarlane A, Aslan B, Raby A, Moffat KA, Selby R, Padmore R. Internal Quality Control Practices in Coagulation Laboratories: recommendations based on a patterns-of-practice survey. International Journal of Laboratory Hematology 2015;37:729-738.