Tools, Technologies and Training for Healthcare Laboratories

From Rules and Tools to Technology and Training

At the Fourth European Conference Quality [r]evolution in Clinical Laboratories in Antwerp, Belgium, Dr. Westgard charted the course of recent developments in quality control, from the introduction of the original multirule procedure ("Westgard Rules"), to the OPSpecs chart, to software-automated QC selection, and on to future applications of embedded software in instrumentation and laboratory information systems.

Presented at the 1998 Fourth European Conference Quality [r]evolution in Clinical Laboratories, Antwerp, Belgium, October 29-30, 1998

This discussion describes the evolution of my interests in analytical quality management and presents some ideas about what is needed in the next generation quality management system and how those improvements might be accomplished.

I take the liberty of providing a somewhat personal perspective at this conference because of the inauguration of the Westgard Quality Award. It's both an honor and a concern to have an award with this name. The honor is being identified with quality in the clinical laboratory - it's certainly been the central interest of all my work throughout my career. The concern is that awards are usually named after dead people and I'm worried that I've been identified as "over the hill" or "under the ground." I will now have to conclude all my papers and letters with the words "not dead yet."

Rules for quality control

Some people may actually wish I were dead because of all the pain and suffering caused by the publication of "A multi-rule Shewhart chart for quality control in clinical chemistry" [2] - a QC procedure now commonly known as "Westgard rules." I have always regarded multirule QC as a concept and consider the "Westgard rules" an example that shows how that concept can be applied. We actually described the basis of multirule QC in two or three paragraphs of an earlier paper on the performance characteristics of QC rules [1]. Later, at the request of the editors of the series on "selected methods in clinical chemistry," we prepared the detailed example shown in Figure 1 that eventually became known as "Westgard rules."
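The multirule concept can be made concrete with a short sketch. The following is a minimal illustration, not the published implementation: it checks a series of control values, expressed as z-scores, against the commonly cited rules (1:3s, 2:2s, R:4s, 4:1s, 10:x), using the conventional rule notation.

```python
def multirule_qc(z):
    """Evaluate control z-scores (oldest first) against common multirule
    criteria and return the list of violated rules.
    z[i] = (x[i] - mean) / sd for consecutive control measurements."""
    violations = []
    if abs(z[-1]) >= 3:
        violations.append("1_3s")   # one point beyond 3 SD
    if len(z) >= 2 and z[-1] * z[-2] > 0 and all(abs(v) >= 2 for v in z[-2:]):
        violations.append("2_2s")   # two consecutive beyond 2 SD, same side
    if len(z) >= 2 and abs(z[-1] - z[-2]) >= 4:
        violations.append("R_4s")   # range of last two points spans 4 SD
    if len(z) >= 4 and (all(v >= 1 for v in z[-4:]) or all(v <= -1 for v in z[-4:])):
        violations.append("4_1s")   # four consecutive beyond 1 SD, same side
    if len(z) >= 10 and (all(v > 0 for v in z[-10:]) or all(v < 0 for v in z[-10:])):
        violations.append("10_x")   # ten consecutive on same side of the mean
    return violations

# Two consecutive controls beyond +2 SD trigger the 2_2s rule:
print(multirule_qc([0.5, -0.3, 2.1, 2.4]))   # ['2_2s']
```

In practice the rules are combined with a 1:2s warning rule that triggers inspection of the others; the sketch simply reports every violated rejection rule.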

This work on QC began during a sabbatical at Uppsala University, where I was fortunate to collaborate with Professor Calle deVerdier, Torgny Groth, and Torsten Aronsson. The intended purpose of that sabbatical was to focus on methodology for defining analytical goals, but after much discussion, we found that project too large to make significant progress in the available time. In an effort to have something to show for my time in Sweden (i.e., papers), we performed computer simulations to study the performance characteristics of different QC rules [1]. However, I like to think that our discussions stimulated the interest in analytical goals that led to the NORDKEM project on "Assessing Quality Requirements in Clinical Chemistry" [3] and also the NORDKEM project on "Quality Control in Clinical Chemistry - Efforts to find an efficient strategy" [4].
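The kind of simulation study described here can be sketched in a few lines: generate Gaussian control data under an assumed systematic shift, apply a control rule, and count rejections. This minimal Monte Carlo sketch estimates the probability of rejection for a single 1:3s rule with two controls per run; the setup and numbers are illustrative, not values from the cited papers.

```python
import random

def p_reject_13s(shift_sd, n_per_run=2, runs=20000, seed=1):
    """Monte Carlo estimate of the probability that at least one of
    n_per_run Gaussian control values falls beyond +/-3 SD when the
    process mean is shifted by shift_sd standard deviations."""
    rng = random.Random(seed)
    rejected = 0
    for _ in range(runs):
        if any(abs(rng.gauss(shift_sd, 1)) >= 3 for _ in range(n_per_run)):
            rejected += 1
    return rejected / runs

# Points on the power function: rejection probability vs size of shift.
for shift in (0, 1, 2, 3, 4):
    print(shift, round(p_reject_13s(shift), 3))
```

Plotting rejection probability against the size of the error gives the power function of the rule; repeating the exercise for different rules and numbers of control measurements reproduces the kind of comparison the simulation studies made.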

Collaborative work between Uppsala and Madison continued for several years, producing a series of studies on the performance of QC procedures [5-8], as well as attempts to develop practical applications of the theory [9-11]. As an aside, let me relate that it was difficult to get these studies accepted by the journals because of their theoretical nature. The editor of Clinical Chemistry (at that time) advised me to quit this line of work and find something that would be professionally more rewarding. However, my Scandinavian determination and stubbornness served me well, and a year or two later, in 1982, I received the Ames Award from the American Association for Clinical Chemistry for outstanding contributions to clinical chemistry.

Tools for quality planning

During the 80s, when multirule QC was beginning to be implemented in laboratories in the US and around the world, my interest was focused on relating the amount of QC to the quality required for the test. In the mid-80s, Total Quality Management was emerging in American industry, and we started applying those concepts in the laboratory to improve quality and productivity through cost-effective QC [12]. We provided general guidelines for developing cost-effective QC procedures that included the following steps:

  • defining a total error specification,
  • evaluating the performance of the measurement procedure,
  • calculating medically important errors,
  • assessing the performance characteristics of QC procedures from computer simulation studies or power function graphs, and
  • selecting control rules and numbers of control measurements to have high error detection and low false rejection.
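The "medically important error" in the third step is commonly expressed as the critical systematic error: the size of shift that, if undetected, would cause results to exceed the allowable total error. A minimal sketch of that calculation, using the conventional z = 1.65 (for a maximum defect rate of about 5%); the example numbers are illustrative:

```python
def critical_systematic_error(tea, bias, sd, z=1.65):
    """Critical systematic error, in multiples of the method SD: the
    size of shift that QC must detect before more than ~5% of results
    exceed the allowable total error TEa.
    All inputs must be in the same units (e.g., % or concentration)."""
    return (tea - abs(bias)) / sd - z

# Example: TEa = 10%, bias = 1%, SD (CV) = 2%  ->  2.85 SD
print(critical_systematic_error(10.0, 1.0, 2.0))   # 2.85
```

The result is then compared with the power functions of candidate QC procedures to find a rule and number of control measurements with high error detection and low false rejection.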

The only problem was that few people could do this in their own laboratories. They needed access to a computer simulation program [9,10], which was a very complex tool with limited availability. Or, they needed to find the appropriate power function graphs in the literature [6,12]. Even though we demonstrated the practicality of selecting QC procedures for individual tests on a multi-test analyzer [13] and the potential cost-savings of improved QC designs [14], few laboratories could follow up on this work. Better tools had to be developed to make it practical and easy for others to select appropriate QC procedures.

The advantage of graphical tools was evident from our TQM experiences with problem-solving teams. A simple graphical tool was needed to display the relationship between the quality required for the test, the imprecision and inaccuracy observed for the method, and the QC needed to detect medically important errors. A chart of operating specifications, or "OPSpecs chart" [15,16], was developed during another sabbatical which was spent in part at Hartford Hospital with Robert Burnett, George Bowers, and Robert Moore, and in part at Odense University in Denmark with Per Hyltoft Petersen and Mogens Horder.

I described the OPSpecs chart in my cornerstone address at this meeting last year [17]. The OPSpecs chart is a map that shows the imprecision and inaccuracy that are allowable when different QC procedures are used to detect medically important errors. The map is prepared for the quality required by the test. The performance of the method identifies the location on that map, permitting the control rules and numbers of control measurements to then be read from the lines on the map. These maps can be provided in the form of an atlas, i.e., a compilation of OPSpecs charts for different quality requirements [18]. Or, they can be prepared by a computer program [19,20], which can also automate the whole process of selecting a QC procedure. We also provide Internet access to "normalized" OPSpecs charts, such as the one shown in Figure 2.
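The lines on an OPSpecs chart follow from the same error budget: for a QC procedure that detects a systematic shift of a given size (in SDs) with the stated probability, allowable bias falls off linearly with imprecision. This sketch computes one normalized operating line, with both axes expressed as a percentage of the quality requirement TEa; the detectable-shift value and z = 1.65 are illustrative assumptions, and real charts take the detectable shift from the procedure's power function.

```python
def opspecs_line(delta_se_det, z=1.65, points=6):
    """One operating-specifications line: allowable inaccuracy (bias) as
    a function of imprecision (SD), for a QC procedure that detects a
    systematic shift of delta_se_det SDs with the stated probability.
    Normalized form: both axes as a percentage of TEa."""
    line = []
    for i in range(points):
        sd_pct = 100.0 * i / (2 * (points - 1))      # imprecision from 0 to 50% of TEa
        bias_pct = 100.0 - (z + delta_se_det) * sd_pct
        line.append((sd_pct, max(bias_pct, 0.0)))    # bias cannot be negative
    return line

# Illustrative line for a QC design that detects a 2.0-SD shift:
for sd_pct, bias_pct in opspecs_line(2.0):
    print(f"s = {sd_pct:4.1f}% TEa  ->  allowable bias = {bias_pct:5.1f}% TEa")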

Applications have been presented to demonstrate how the OPSpecs chart can be used with analytical quality requirements in the form of allowable total errors [21], medical quality requirements in the form of a clinical decision interval [22], and biologic goals that are converted to a biologic total error [23]. The selection of QC procedures has been illustrated in detail for immunoassays [24], blood gas measurements [25], and lipid tests [26]. OPSpecs charts have been used to evaluate the "state of the art" precision performance [27], assess the imprecision required by clinical and analytical test outcome criteria [28], and compare the precision required by European biologic goals and US CLIA total error criteria [29]. Patient data techniques, such as the Average of Normals (AoN) algorithm, can also be selected or designed using OPSpecs charts [30].

Technology for analytical quality management

With the OPSpecs tool, it takes only a minute to select the QC procedure appropriate for the quality required by a test and the performance observed for the method. The limitations in quality planning are now due to the difficulties in defining the quality requirement and obtaining the estimates of imprecision and inaccuracy. In addition, there are difficulties in implementing the selected QC procedures because of software limitations in current analytical instruments and laboratory information systems.

One approach for resolving these problems is to develop the computer technology that integrates the functions of method validation, quality planning, and quality control into a single system for analytical quality management, as illustrated in Figure 3. This system would be like one of the software "office suites" that contains several programs or functions that are tightly integrated, allowing the user to work back and forth between functions and to have access to all the performance validation data in the system. We are prototyping this technology by developing four functions, as follows:

  1. "Setup parameters" allows entry of descriptive information about the method, quality requirements for the test, and QC instructions for monitoring routine operation.
  2. "Method validation" supports the entry and analysis of data for different experiments, such as the replication experiment that is used to estimate method imprecision and the comparison of methods experiment that is used to estimate method inaccuracy.
  3. "Rule validation" supports the automatic selection of statistical QC procedures.
  4. "Run validation" supports the implementation of different QC procedures for routine monitoring of method performance.

The integrated system would function in the following way. The parameters, particularly the quality requirements and QC instructions, would drive the whole quality management process. Estimates of method imprecision and inaccuracy would be obtained from method validation experiments and become inputs for the selection of a statistical QC procedure. The rule validation module would function much like the QC Validator 2.0 program that provides automatic selection of a QC procedure on the basis of the quality required for the test and the imprecision and inaccuracy observed for the method. The selected rules and numbers of measurements would then be implemented automatically in the run validation module. Routine control data would be entered, charted, and evaluated, and problems would be identified and documented.

Ongoing estimates of method performance could also be obtained from routine QC and used to update the QC design. Closing this loop would create a "dynamic" system for managing analytical quality. As method performance improves, less QC would be needed, which should result in simpler rules and lower numbers of control measurements. If method performance deteriorates, more QC would be needed, which could result in multirule procedures with higher numbers of control measurements.
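The "dynamic" loop described above amounts to re-selecting the QC design whenever updated performance estimates arrive. A minimal sketch of that selection logic, expressing the margin between the quality requirement and observed performance as a sigma metric; the rule/threshold pairings here are examples for illustration, not a published selection table:

```python
def select_qc_design(tea, bias, sd):
    """Illustrative dynamic QC selection: as performance improves
    (larger margin between TEa and observed error), simpler rules and
    fewer control measurements suffice; as performance deteriorates,
    multirule procedures with more measurements are needed.
    Returns (rule set, number of control measurements per run)."""
    sigma = (tea - abs(bias)) / sd
    if sigma >= 6:
        return ("1_3s", 2)                        # single rule, N=2
    if sigma >= 4:
        return ("1_3s/2_2s/R_4s", 2)              # basic multirule, N=2
    return ("1_3s/2_2s/R_4s/4_1s/8_x", 4)         # full multirule, N=4

# High margin (TEa = 10%, bias = 1%, SD = 1.2%) -> simple rule:
print(select_qc_design(10.0, 1.0, 1.2))   # ('1_3s', 2)
```

Re-running the selection from routine QC estimates of bias and SD closes the loop: the design relaxes or tightens as method performance changes.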

To achieve total laboratory automation, it will be necessary to automate the management of analytical quality in this way. Otherwise, the release of test results will become the rate-limiting step in the automated laboratory. The development of new process control software may offer a better opportunity for implementing this kind of integrated system and improving analytical quality management than the development of new generations of laboratory information systems. Process control systems may even provide the basis for the next generation laboratory information systems.

Training for everyone, anywhere, anytime

The implementation of new technology will change our ways of doing things; therefore, the jobs and responsibilities of laboratory scientists and technologists will also change. Ongoing education and training will be needed by everyone if they are to adapt to and embrace these changes.

I have always been committed to education and training, so this is not a new interest. However, I have new concerns due to changes in the staffing patterns in laboratories today. Efforts to "do more with less" often lead to staffs with less experience and less education. Laboratories also have less time for in-service training, and much of that time is needed for cross-training. In addition, many laboratory tests are performed in point-of-care settings where the personnel have little training in laboratory techniques and little understanding of quality control. All these factors contribute to the increasing need for training at all levels - from basic quality control to advanced quality planning.

New technology may again provide an answer. The Internet now makes it possible to deliver training to anyone, anyplace in the world, at any time of the day or night. We began developing a website for training in analytical quality management about two years ago. My son Sten deserves the credit for the idea and also for all the work to establish the website. I focus on the contents, write a lot of it, edit some additional materials, and try to maintain a schedule for frequent updates. Our approach has been to use a newsletter format, as shown in Figure 4.

Over a series of updates, we try to develop a group of related materials or lessons that can then be organized into a training course. For example, we have courses on Basic QC Practices, Basic Method Validation, and Multirule QC, which are offered for continuing education credit through the American Society for Clinical Laboratory Science.

The power of this Internet training technology is best understood by thinking about the training materials as a "lesson-base", as illustrated in Figure 5. The analogy with a database is useful to emphasize the ability to select and organize individual items according to some request. For example, the request might be my department chairman telling me that our pathology residents need more training in analytical quality management. I can respond by selecting materials from our existing lessons, organizing them in an appropriate manner, and creating a new "course-page" that describes the course and provides access to the materials.

With this Internet training technology, it would be possible to supply a training course in analytical quality management directly to your laboratory. You could help identify the skills and training needed to customize the training for your staff. You could add specific examples and applications to emphasize important and unique aspects of your laboratory service. This could be accomplished via e-mail, with you being here in Antwerp, Belgium, me being in Madison, Wisconsin, the webmaster being in Norwalk, Connecticut, and the lesson-base residing on a computer located anywhere in the world.

Concluding comments

Computer technology and Internet training technology will be important factors in future improvements in analytical quality management. It is important for us to accept the idea that quality management needs new technology and to recognize the importance of this technology in the automation of laboratory testing processes, whether those processes are point-of-care measurement systems or totally automated laboratories. To support those improvements and routine laboratory operation, improvements are also needed in training and education, both the contents and the mechanisms for delivery.

It's a special privilege to offer these comments to the distinguished audience attending this meeting, which includes many special friends, colleagues, and collaborators. I expect that much of the progress and improvements will come from those of you in the audience.

However, let me remind you that I also expect to contribute to these improvements. I'm not dead yet - the best is yet to come!


  1. Westgard JO, Groth T, Aronsson T, Falk H, deVerdier C-H. Performance characteristics of rules for internal quality control: Probabilities for false rejection and error detection. Clin Chem 1977;23:1857-67.
  2. Westgard JO, Barry PL, Hunt MR, Groth T. A multi-rule Shewhart chart for quality control in clinical chemistry. Clin Chem 1981;27:493-501.
  3. Horder M, ed. Assessing quality requirements in clinical chemistry. Scand J Clin Lab Invest 1980;40(suppl 155).
  4. deVerdier C-H, Aronsson T, Nyberg A. Quality control in clinical chemistry - Efforts to find an efficient strategy. Scand J Clin Lab Invest 1984;44(suppl 172).
  5. Westgard JO, Falk H, Groth T. Influence of a between run component of variation, choice of control limits, and shape of error distribution on the performance characteristics of rules for internal quality control. Clin Chem 1979;25:394-400.
  6. Westgard JO, Groth T. Power functions for statistical control rules. Clin Chem 1979;25:863-69.
  7. Westgard JO, Groth T. A predictive value model for quality control: Effects of the prevalence of errors on the performance of control procedures. Am J Clin Pathol 1983;80:49-56.
  8. Westgard JO, Groth T, Aronsson T, deVerdier C-H. Combined Shewhart-cusum control chart for improved quality control in clinical chemistry. Clin Chem 1977;23:1881-87.
  9. Groth T, Falk H, Westgard JO. An interactive computer simulation program for the design of statistical control procedures in clinical chemistry. Computer Programs in Biomedicine 1981;13:73-86.
  10. Westgard JO, Groth T. Design and evaluation of statistical control procedures: Applications of a computer 'QC Simulator' program. Clin Chem 1981;27:1536-1545.
  11. deVerdier C-H, Groth T, Westgard JO. What is the quality of quality control procedures. Scand J Clin Lab Invest 1981;41:1-14.
  12. Westgard JO, Barry PL. Cost-effective quality control: Managing the quality and productivity of analytical processes. Washington, DC:AACC Press, 1986.
  13. Koch DD, Oryall JJ, Quam EF, Felbruegge DH, Dowd DE, Barry PL, Westgard JO. Selection of medically useful QC procedures for individual tests on a multi-test analytical system. Clin Chem 1990;36:230-3.
  14. Westgard JO, Oryall JJ, Koch DD. Predicting effects of QC practices on the cost-effective operation of a multitest analytical system. Clin Chem 1990;36:1760-64.
  15. Westgard JO. Charts of operational process specifications ("OPSpecs charts") for assessing the precision, accuracy, and quality control needed to satisfy proficiency testing criteria. Clin Chem 1992;38:1226-33.
  16. Westgard JO. Analytical quality assurance through process planning and quality control. Arch Pathol Lab Med 1992;116:765-769.
  17. Westgard JO. Mapping the road to analytical quality with OPSpecs charts. Appendix 2 in Basic QC Practices:Training in statistical quality control for healthcare laboratories. Madison WI:Westgard QC, 1998, 211-220.
  18. Westgard JO. OPSpecs Manual - Expanded Edition. Ogunquit ME:Westgard QC, 1996.
  19. Westgard JO, Stein B, Westgard SA, Kennedy R. QC Validator 2.0: a computer program for automatic selection of statistical QC procedures in healthcare laboratories. Comput Methods Programs Biomed 1997;53:175-186.
  20. Westgard JO, Stein B. An automatic process for selecting statistical QC procedures to assure clinical or analytical quality requirements. Clin Chem 1997;43:400-403.
  21. Westgard JO, Hyltoft Petersen P, Wiebe DA. Laboratory process specifications for assuring quality in the U.S. National Cholesterol Education Program (NCEP). Clin Chem 1991;37:656-661.
  22. Westgard JO, Wiebe DA. Cholesterol operational process specifications for assuring the quality required by CLIA proficiency testing. Clin Chem 1991;37:1938-44.
  23. Hyltoft Petersen P, Ricos C, Stockl D, Libeer JC, Baadenhuijsen H, Fraser C, Thienpont L. Proposed guidelines for the internal quality control of analytical results in the medical laboratory. Eur J Clin Chem Clin Biochem 1996;34:983-999.
  24. Mugan K, Carlson IH, Westgard JO. Planning QC procedures for immunoassays. J Clin Immunoassay 1994;17:216-22.
  25. Olafsdottir E, Westgard JO, Ehrmeyer SS, Fallon KD. Matrix effects on the performance and selection of QC procedures to monitor PO2 in blood gas measurements. Clin Chem 1996;42:392-6.
  26. Fallest-Strobl PC, Olafsdottir E, Wiebe DA, Westgard JO. Comparison of NCEP performance specifications for triglycerides, HDL, and LDL cholesterol with operating specifications based on NCEP clinical and analytical goals. Clin Chem 1997;43:2164-2168.
  27. Westgard JO, Bawa N, Ross JW, Lawson NS. Laboratory precision performance: State of the art versus operating specifications that assure the analytical quality required by proficiency testing criteria. Arch Path Lab Med 1996;120:621-625.
  28. Westgard JO, Seehafer JJ, Barry PL. Allowable imprecision for laboratory tests based on clinical and analytical test outcome criteria. Clin Chem 1994;40:1909-14.
  29. Westgard JO, Seehafer JJ, Barry PL. European specifications for imprecision and inaccuracy compared with operating specifications that assure the quality required by U.S. CLIA proficiency testing criteria. Clin Chem 1994;40:1228-32.
  30. Westgard JO, Smith FA, Mountain PJ, Boss S. Design and assessment of average of normals (AON) patient data algorithms to maximize run lengths for automatic process control. Clin Chem 1996;42:1683-1688.