JCM Accepts, published online ahead of print on 4 June 2014
J. Clin. Microbiol. doi:10.1128/jcm.01035-14
Copyright 2014, American Society for Microbiology. All Rights Reserved.

Full-Length Paper

Competency Assessment of Microbiology Medical Laboratory Technologists in Ontario, Canada

Marc Desjardins,a,b,c,# and Christine Ann Fleming d

a Division of Microbiology, Eastern Ontario Regional Laboratory Services, Ottawa, Canada; b Ottawa Hospital Research Institute, Ottawa, Canada; c University of Ottawa, Ottawa, Canada; d Quality Management Program Laboratory Services, Toronto, Canada

Running title: Competency assessments in microbiology laboratories

# Corresponding author: Marc Desjardins, Division of Microbiology, Eastern Ontario Regional Laboratories, 501 Smyth Rd, Ottawa, Ontario, Canada K1H 8L6. Phone: 613-737-8322; Fax: 613-737-8324; E-mail: madesjardins@ottawahospital.on.ca

Abstract word count: 249
Abstract

Accreditation in Ontario, Canada, requires that licensed clinical laboratories participate in external quality assessment (also known as proficiency testing) and perform competency evaluation of their staff. To assess the extent of ongoing competency assessment practices, the Quality Management Program Laboratory Services (QMP LS) Microbiology Committee surveyed all 112 licensed Ontario microbiology laboratories. The questionnaire consisted of a total of 21 questions in yes/no, multiple-choice and short-answer formats. Participants were asked to provide information about existing programs, the frequency of testing, what areas are evaluated and how results are communicated back to the staff. Of the 111 responding laboratories, six indicated they did not have a formal evaluation program because they perform only limited bacteriology testing. Of the remaining 105 respondents, 87% perform evaluations at least annually or every 2 years, and 61% include any test or task performed, whereas 16% and 10% focus only on problem areas or high-volume complex tasks, respectively. The most common methods of evaluation were review of EQA challenges, direct observation and worksheet review. All but one participant communicate results back to staff, and most will take remedial action to correct deficiencies. Although most accredited laboratories have a program to assess the ongoing competency of their staff, the methods used are not standardized or consistently applied, indicating that there is room for improvement. The survey successfully highlighted potential areas for improvement and allowed the QMP LS Microbiology Committee to provide guidance to Ontario laboratories for establishing or improving existing microbiology-specific competency assessment programs.
Introduction

In Ontario, legislation requires the licensure, external quality assessment (EQA, also known as proficiency testing) and accreditation of all clinical laboratories. The Ontario Medical Association (OMA) is the agency designated by the Ministry of Health and Long-Term Care (MOHLTC) to monitor proficiency of laboratory test performance and to accredit laboratories. QMP LS is a department of the OMA and is a mandatory EQA and accreditation program for licensed medical laboratories in Ontario, Canada. All QMP LS EQA programs are accredited to ISO/IEC 17043:2010, Conformity assessment: general requirements for proficiency testing [1], and the QMP LS accreditation division is a signatory to the Mutual Recognition Arrangement (MRA) of the International Laboratory Accreditation Cooperation (ILAC) and its regional Asia Pacific Laboratory Accreditation Cooperation (APLAC). The accreditation requirements are based on ISO 15189 [2] and are augmented with international standards for safety and point-of-care testing, national standards for blood safety, government regulation and generally accepted principles of good practice [3].

The accreditation division of QMP LS requires clinical laboratories (hospital, private and public health) to establish skills evaluation testing or competency evaluation of their staff as part of a comprehensive quality management program. During accreditation visits, laboratories are required to demonstrate that a competency program is in place and that evaluation records are kept; while there are suggested assessment methods, there are no explicit accreditation requirements for the content of the program or the frequency of testing. In the United States, the Clinical Laboratory Improvement Amendments of 1988 (CLIA '88) mandate that competency assessment be completed for all technical,
supervisory and testing personnel, for each test that an individual is approved to perform, and must include the following elements for each test evaluated: (1) direct observation of routine test performance; (2) monitoring the recording and reporting of test results; (3) review of intermediate test results, quality control records, proficiency testing results and preventive maintenance records; (4) direct observation of performance of instrument maintenance and function checks; (5) assessment of test performance through testing previously analyzed specimens, blinded samples or external proficiency samples; and (6) assessment of problem-solving skills [4,5,6,7]. An initial assessment is required within the first six months following the initial training period and annually thereafter.

For more in-depth guidance on performing competency assessments, laboratories are more likely to rely on published recommendations such as those from the Clinical and Laboratory Standards Institute (CLSI) [8] or other peer-reviewed publications [9,10,11,12,13,14,15]. Many publications are not specific to microbiology and address staff competency evaluation from the perspective of laboratory medicine in general. The most comprehensive document providing guidance to microbiology laboratories on how they may achieve compliance with CLIA in establishing a comprehensive competency evaluation program is Cumitech 39, which was updated in the review by Sharp and Elder [6,7].

For each EQA discipline, QMP LS has a volunteer scientific committee of laboratory physicians, scientists and medical laboratory technologists (MLTs) that provides technical and clinical advice to the program. The QMP LS Microbiology Committee has noted problems related to competency in laboratories with performance issues and was
concerned that Ontario microbiology laboratories may not have a comprehensive competency evaluation program beyond determining competency of staff following initial training and, if they had one, that it may not be sustained over time. Although Cumitech and CLSI documents are useful resources, the lack of specific guidance for microbiology laboratories was identified by the committee as an opportunity to help laboratories formulate a microbiology-specific competency evaluation program by gaining insights from their peers. A patterns-of-practice survey designed to gather information on current practices in Ontario for ongoing competency assessment of MLTs was provided to licensed bacteriology laboratories across Ontario in July 2011, with the goal of assessing these practices and providing feedback and guidance to laboratories.

Methods

A mandatory web-based patterns-of-practice survey was provided on a password-protected website to 112 Ontario laboratories that are licensed to perform bacteriology testing. This included laboratories holding single-test licenses, such as for Gram stain only, group A Streptococcus throat screens or Clostridium difficile antigen or toxin detection. The survey consisted of a total of 21 questions covering areas of ongoing competency assessment and did not apply to post-training assessment of new staff. The questionnaire consisted of 4 yes/no questions, 3 yes/no questions with a follow-up explanation, 5 multiple-choice questions and 9 short-answer questions. Participants were also requested to submit blank copies of their competency assessment forms and asked for their permission to share the forms in the committee comments for the benefit of other laboratories. The survey was designed to obtain information on basic
operations of a competency program and provide laboratories with the opportunity to share their approach to competency assessment.

The Who, What, When, Where and How of Competency Assessment Programs

Laboratories were asked if they have policies in place to assess staff competency on an ongoing basis. If policies are in place, laboratories were asked to provide information on who is tested, who performs the evaluation, what areas in bacteriology are tested (with examples for each section), what the frequency of testing is, when the evaluation is performed, how the staff are evaluated, how the components of the assessment are selected and how competency assessment is documented. Laboratories were also asked to review their competency assessment records for 25% of their technologists (randomly selected, up to a maximum of 25) and to determine how many were evaluated on an annual basis from 2007 to 2011.

Impact of Competency Assessment on Laboratory Operations

To determine what impact competency assessment programs have on laboratory operations and staffing resources, laboratories were asked how many full-time equivalents (FTEs) are assigned to administer, review and document competency evaluations. To adjust for the size and level of testing, laboratories were requested to include only the number of MLT FTEs in the bacteriology section, as opposed to that for the entire department, which may consist of other sections such as virology, parasitology, etc. The impact on staffing resources was defined as the ratio of FTEs
required to perform the competency evaluations to the total number of MLT FTEs performing testing in the bacteriology laboratory.

Feedback and Remedial Action

Laboratories were asked if they had established criteria to define successful completion of the assessment and, if yes, what criteria were used. If a staff member was unsuccessful, laboratories were asked whether remedial action was taken, what type of action was taken and to provide two examples of the most common issues requiring remediation. Laboratories were also requested to provide information and examples of what process, if any, was in place to provide feedback to the staff. Laboratories were asked to perform a self-assessment of the quality of competency assessment in their facility by rating their program on a scale of 1 (not well developed) to 5 (extremely well developed) and indicating whether they felt it had a positive impact on the quality of service provided.

Results

Responding laboratories

Of the 112 licensed Ontario laboratories that received the survey, 111 responded. One laboratory closed during the survey period and did not submit a response. Of the 111 respondents, seven indicated they did not have a formal policy or process for ongoing competency assessment. Of these, six performed minimal bacteriology testing (rapid testing, loading and unloading blood culture bottles, Gram stain only), and one indicated they had an informal process for ongoing competency evaluation and were in the
process of developing policies and procedures. Analysis of the responses was performed on the 105 laboratories that submitted a response, performed informal or formal evaluations and provided a comprehensive bacteriology service.

The who, what, when, and where of competency assessments

Of the 105 respondents, 99 (94%) responded that they regularly assess the ongoing competency of their MLTs in bacteriology (Table 1). The six laboratories that did not perform regular assessments indicated they would do so as required or as problems arise. Eighty-seven per cent (91/105) of participants indicated they performed ongoing assessment annually or at least once every 2 years (Table 1). In addition to annual testing, most laboratories have established criteria for when an individual's competency must be reassessed, such as after extended leave (85%), when duties (71%) or test methodologies (84%) change or when internal indicators suggest problems (90%). The majority (64/105 [61%]) of laboratories include any test or task performed in their competency assessment program, while 17 (16%) focus on problem tests/tasks and 11 (10%) focus only on high-volume or highly complex tests/tasks (Table 1). The most common tasks assessed include microscopy (83%), specimen processing (75%), culture interpretation (73%) and organism identification (72%). LIS/data management was the least frequently assessed ongoing competency (54%). Only 4% of participants indicated performing ongoing assessment for nucleic acid amplification tests (NAAT) or other molecular methods (Table 1).
How is ongoing competency monitored?

Performance on EQA challenges was the most common approach for the ongoing evaluation of specific core competencies. Direct observation of routine test performance and review of worksheets and patient reports were other major assessment tools. Approaches such as introduction of a blind/unknown sample or a written or oral quiz were not as commonly used. Some laboratories used web-based competency assessment programs such as the CAP competency assessment program (http://www.cap.org/apps/cap.portal?_nfpb=true&cntvwrptlt_actionoverride=%2fportlets%2fcontentviewer%2fshow&_windowlabel=cntvwrptlt&cntvwrptlt%7bactionform.contentreference%7d=education%2fcompetency_assessment%2fintro_new.html&_state=maximized&_pagelabel=cntvwr) and the MTS Competency Assessment (http://www.medtraining.org/labcontent.aspx).

Of the 105 participants who responded, the majority indicated that managers, senior technologists and/or supervisors are responsible for performing competency evaluations. However, one third (35/105) of laboratories also indicated that assessments by peer technologists are used. Evaluations by laboratory directors and/or medical/clinical microbiologists are uncommon (9 of 105 respondents).

Laboratories were asked to provide two examples of how competencies are assessed for specific laboratory core areas such as specimen processing, microscopy, culture interpretation, organism identification and antimicrobial susceptibility testing; responses are summarized in Table 2.
Do laboratories establish criteria for successful completion of competency assessment, communicate results and take remedial action?

Most (75%) laboratories have established thresholds, benchmarks or criteria for passing or failing an assessment. Many establish a passing grade for objective evaluations such as quizzes or unknown panels. Of the 31 laboratories that provided information on their protocols, 12 indicated they use a threshold of 100%, 14 use 80%, 2 use 85% or 90% and 3 use 70% or 75%. For more subjective evaluations such as worksheet assessments or result reporting, performance is based on a simple pass or fail.

Almost all laboratories stated that they communicate and document competency assessments (Table 3). Of the 105 laboratories, 74 (70%) require sign-off of competency assessment documents, 56 (53%) keep individual competency assessment folders for each staff member and 31 (30%) use a commercial electronic documentation system.

All but seven of the 105 respondents (93%) indicated that remedial action would be taken if an individual fails the evaluation. Most laboratories indicated that re-training, reviews, continuing education programs and re-evaluation would be part of any follow-up for staff who do not perform well on competency assessments. However, only four laboratories indicated that restricting the individual's rotations to areas of demonstrated competency, or other actions until competency has been established, would be considered.
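As a rough illustration, the threshold logic respondents described for objective evaluations can be sketched in a few lines of code. The function name, result fields and example numbers below are illustrative only; they are not drawn from any surveyed laboratory's protocol.

```python
# Hypothetical sketch of the pass/fail logic described by survey respondents:
# objective evaluations (quizzes, unknown panels) are scored against a
# laboratory-defined threshold, while subjective evaluations are recorded
# as a simple pass/fail. Names and the example threshold are illustrative.

def assess_objective(correct: int, total: int, threshold_pct: float) -> dict:
    """Score an objective evaluation against the lab's passing threshold."""
    score = 100.0 * correct / total
    passed = score >= threshold_pct
    return {
        "score_pct": round(score, 1),
        "passed": passed,
        # Respondents reported remedial action (re-training, review,
        # re-evaluation) as follow-up for unsuccessful assessments.
        "remedial_action_required": not passed,
    }

# Example: a 10-item unknown panel graded against an 80% threshold,
# one of the two thresholds most commonly reported by laboratories.
result = assess_objective(correct=7, total=10, threshold_pct=80.0)
print(result)  # 70.0% falls below the 80% threshold, so remediation is flagged
```

The threshold parameter reflects the survey finding that laboratories set different passing grades (70% to 100%), so any such logic would be configured per laboratory rather than fixed.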
Common competency issues

The most common competency issue requiring remediation was associated with Gram staining and interpretation. Additional common areas of concern included failure to understand or lack of familiarity with laboratory protocols and difficulties in performing antimicrobial susceptibility testing, such as technical issues (e.g., length of incubation for disk diffusion testing, inconsistent measurement of zone diameters), lack of familiarity with appropriate methods, antibiogram interpretation, failure to recognize unusual phenotypes and reporting errors (Table 4). Other areas that often required remedial action included LIS/data entry and biosafety.

Self-assessment of ongoing competency assessment programs

Approximately half (47%) of the respondents indicated that they would rate their laboratory's competency assessment program as moderately well developed, whereas 30% and 23% rated their programs as less than moderately well developed and better than moderately well developed, respectively. The majority (88%) felt that their competency assessment program had a positive impact, compared to 12% who indicated there was no impact on the quality of service provided.

The cost of ongoing competency assessment

Of the 105 laboratories, 79 (75%) had 10 or fewer full-time equivalent (FTE) technologists performing bacteriology tests and 10 (10%) had 20 or more FTEs. To determine the impact on staffing resources, laboratories were asked to estimate the proportion of FTEs to the total number of MLT FTEs required to review competency
assessments. Many respondents either misinterpreted the question or overestimated the actual time required to perform their competency assessments. However, among the 45 laboratories that appear to have correctly interpreted the question, the proportion of FTEs to MLT FTEs required to perform competency assessments was 2% to 6%, or one FTE for 17 to 50 MLTs, depending on the comprehensiveness of the program. The proportion of MLTs assessed for competency increased over the survey period (Figure 1). The number of laboratories assessing at least 80% of MLTs in a year more than doubled, from 35 (33%) in 2007 to 78 (74%) in 2010.

Discussion

The goal of competency assessment is to improve the laboratory's performance by identifying areas requiring education and/or training of the staff, thus ensuring patient safety. Ongoing competency assessment outside of initial training is not as universally implemented and, in some cases where it exists, may be limited in scope and intensity. Performing competency assessment as the need arises is a reactive rather than a proactive approach; the goal should be to detect problems before they happen. Ongoing assessment needs to be incorporated into a laboratory's quality management system. In Ontario, competency assessment is mandated by Ontario Laboratory Accreditation (OLA). However, there is little explicit direction as to what components or which areas should be included in a competency program or the frequency with which staff should be evaluated.

The first step in developing a competency program is to determine which staff should be assessed, which areas of the laboratory should be included, the frequency with which
staff should be evaluated and how to measure competency. Although technologists, technicians and clerks should all be assessed, the scope and frequency of assessment may be tailored to the individual's responsibilities. In smaller laboratories with fewer resources, it may be preferable to focus on key areas in the laboratory. This can be based on the importance or impact the procedure may have on a patient's clinical management and the potential for serious harm. Microscopy and Gram stain competency assessment is often cited as one of the more important areas to evaluate because Gram stain misinterpretation can lead to significant patient safety concerns [16]. In fact, survey respondents indicated that the most common competency issue requiring remediation was associated with Gram staining and interpretation. In addition, culture interpretation, isolate identification, antigen testing and antimicrobial susceptibility testing, interpretation and reporting should be included as a minimum. Most respondents in this survey indicated they assessed staff in several of these core competencies. Ongoing assessment of proficiency with LIS/data management systems, such as use of proper codes, specimen workup data entry and accessioning, was the least frequently monitored core competency. This is a key area to include in evaluation programs, since most laboratories are now paperless and entirely dependent on information technology. Only a few laboratories indicated performing competency evaluation for NAAT or other molecular tests. Although this likely reflects the number of laboratories performing these assays in routine testing or their level of services, the rate at which NAAT are being implemented in bacteriology highlights the need to include these methods as part of a competency assessment program.
The absence of proper EQA programs for many of these assays, particularly in Canada,
makes it difficult for laboratories to design effective evaluation strategies. Finally, only two laboratories indicated they include biosecurity as part of the assessment. With the pending implementation of the Human Pathogens and Toxins Act [17], which is similar in intent to the Select Agents rule [18] and is to become law in December 2015, laboratories must be aware that this legislation mandates that all laboratories develop biosecurity plans; assessment of biosecurity knowledge has therefore become an important component of competency evaluation.

The extent of assessment in each area will largely be defined by the laboratory's resources, which will also determine what types of measures are to be included in the evaluation. The frequency with which staff members should be evaluated is also limited by available resources, but laboratories should assess each individual annually in at least one area of responsibility.

Competency evaluation of staff should include measures that allow determination of what the individual knows, what they can do and whether they actually follow policies and procedures, the "know-can-do" approach [10,16]. Determining what an individual knows and what they can do is usually straightforward and assesses what they have learned and whether or not they can apply that knowledge. Measurements can include assessment of problem-solving skills through quizzes, observation of testing of patient or EQA samples or testing of simulated samples [5,7]. However, all of these measures are performed with the individual being aware of the assessment, which may encourage some to be more diligent in adhering to policies and procedures [10]. By far the most
common method used for ongoing assessment of all core competencies is review of performance on EQA samples. Despite best intentions, the individuals testing the samples are aware that the samples are being used to monitor their performance, which limits the utility of EQA challenges as an evaluation tool; EQA review should not be used alone. Other approaches such as blinded samples and quizzes are not commonly used. Quizzes test theoretical knowledge and, to a certain extent, problem-solving skills; they are intended to test whether staff can perform the test but not whether they do perform the test correctly [10,11]. Non-observational evaluation such as random worksheet audits, review of QC records, instrument printouts or second reviews can identify situations where, despite the knowledge and ability to perform the task, an individual under pressure may take the liberty of modifying protocols. Another tactic can be to covertly introduce a simulated specimen or previously tested specimen into the workflow. However, there are substantial logistical issues in implementing such a program. LIS limitations may make it difficult to assign a hospital or laboratory identification number to a non-existent patient with a staff physician in order not to raise the suspicion of the staff being assessed. Despite these limitations and the time and resource requirements, blinded samples are more likely to provide valuable insight into how staff perform their duties in real time.

The final component of a competency assessment program is communication and remedial action. Without proper communication back to the individual, opportunities to correct deficiencies will be lost and the program will lose its value. Most participants verbally communicated results back to the technologist, but a third did so as part of the individual's annual performance appraisal. Most laboratories indicated that remedial
action would be taken in the event that an individual fails the evaluation. Unless the issue arises because of deliberate negligence, remediation should not be punitive but rather educational [7]. Corrective actions should include, but not be limited to, discussion of the concerns with the individual, determining the root cause(s) of the issue, continuing education opportunities, mentoring and training. Careful consideration should be given to restricting the individual's responsibilities before they are allowed to return to the area of concern. Finally, corrective actions and, more importantly, re-evaluation must be successful before the individual is allowed to test patient samples.

Most laboratories in Ontario do have a process for ongoing competency evaluation of their staff. However, there is room for improvement. Laboratories need to define competency evaluation programs as part of their QA/QC requirements. Such programs need to be defined in terms of goals and scope. In one author's laboratory (MD), the quality management committee is responsible for oversight of competency evaluations. The committee defined as a goal that 100% of staff would be evaluated each year over a three-year cycle with one of three methods: observational test/task performance, random bench audits and anonymous simulated samples. Areas of core competencies to be tested are determined by the committee for each staff member based on performance, prior evaluations and scope of practice. Obviously, this approach may not be appropriate for all laboratories, given the availability of staff and resources, but programs should consider including some basic elements: 1) Evaluate the knowledge and the skills to perform specific tasks, and determine whether staff are applying policies.
We recommend that non-observational evaluations such as worksheet evaluations, random reviews of cultures and/or anonymous samples
integrated into routine workflow be used as means of evaluating staff. 2) Determine which core competencies need to be evaluated to best reflect the laboratory's level of service. Gram stain and interpretation of cultures were cited as the most common areas evaluated; however, other areas that require evaluation include, but are not limited to, performance of specialized testing such as typing, agglutination reactions, specialized microscopy, biosecurity and LIS management. 3) Competency evaluations should be performed soon after training and on an annual basis thereafter. 4) The program must allow for communication back to the individual on their performance. Waiting for a yearly performance appraisal is not sufficient; results of competency evaluations, whether satisfactory or not, should be communicated in real time and allow for feedback. 5) The program must provide opportunities for non-punitive remediation. Re-evaluation, re-training or re-assignment can be considered in situations when staff do not meet expectations. Performance improvement programs must establish expectations that are clearly communicated to the individuals.

Acknowledgements

We gratefully acknowledge the contribution of past and present QMP LS microbiology committee members who were involved in the planning and review of this survey and/or in the critical review of this manuscript: Dr. Susan Poutanen (chair), Dr. Kevin Katz, Dr. Fran Jamieson, Ms. Helen Meaney, Dr. Samir Patel, Dr. David Richardson, Dr. Alicia Sarabia and Ms. Deirdre Soares. We also thank Maritess Koerner for her editorial expertise.
References

1. ISO/IEC. 2010. ISO/IEC 17043:2010, Conformity assessment: general requirements for proficiency testing.
2. ISO. 2012. ISO 15189:2012(E), Medical laboratories: requirements for quality and competence.
3. Quality Management Program Laboratory Services, Ontario Laboratory Accreditation Division. 2011. Ontario laboratory accreditation requirements, version 5.1. QMP LS, Ontario, Canada.
4. Medicare, Medicaid and CLIA programs: regulations implementing the Clinical Laboratory Improvement Amendments of 1988 (CLIA). 57 Fed. Regist. 7184 (1992).
5. Centers for Medicare and Medicaid Services. November 2012. What do I need to do to assess personnel competency? CLIA brochure #10. Available from: http://www.cms.gov
6. Elder BL, Sharp SE. 2003. Cumitech 39, Competency assessment in the clinical microbiology laboratory. Coordinating ed., Sharp SE. ASM Press, Washington, DC.
7. Sharp SE, Elder BL. 2004. Competency assessment in the clinical microbiology laboratory. Clin. Microbiol. Rev. 17:681-694.
8. Clinical and Laboratory Standards Institute. 2009. Training and competency assessment; approved guideline, 3rd ed. CLSI document GP21-A3.
9. Boone JD. 2000. Assessing laboratory employee competence. Arch. Pathol. Lab. Med. 124:190-191.
10. Haun DR, Leach A. 2003. Performing poorly but testing perfectly. Clin. Lab. Manag. March/April:85-87.
11. Haun DE, Leach AP, Vivero R. 2000. Takin' care of mama: from competency assessment to competency improvement. Lab. Med. 31:106-110.
12. Haun DE, Zeringue A, Leach A, Foley A. 2000. Assessing the competency of specimen-processing personnel. Lab. Med. 31:633-637.
13. Howanitz PJ, Valenstein PN, Fine G. 2000. Employee competence and performance-based assessment. Arch. Pathol. Lab. Med. 124:195-202.
14. Schiffigens J, Bush V. 2001. A four-part approach to competency assessment. Lab. Med. 32:431-435.
15. Woods RS, Longmire W, Galloway MJM, Smellie WSA. 1999. Development of a competency based training programme to support multidisciplinary working in a combined biochemistry/haematology laboratory. J. Clin. Pathol. 53:401-404.
16. Goodyear N, Kim S, Reeves M, Astion ML. 2006. A 2-year study of Gram stain competency assessment in 40 clinical laboratories. Am. J. Clin. Pathol. 125:28-33.
17. Human Pathogens and Toxins Act, Bill C-11, 40th Parliament, 2nd Sess. (2009).
18. U.S. Department of Health and Human Services. 2005. Part III. Title 42 CFR parts 72 and 73, Office of the Inspector General 42 CFR part 1003. Possession, use and transfer of select agents and toxins; final rule. Fed. Regist. 70:13294-13325.
Table 1: The Who, What, When, Where of Competency Testing (n=105) a

                                                                    n (%)
No. of laboratories that assess medical laboratory technologists    99 (94)

What is the frequency of competency assessment?
    Annually                                                        65 (62)
    At least every 2 years                                          16 (15)
    More than once per year                                         10 (10)
    When required/ad hoc                                             8 (8)
    Prospective/retrospective review of EQA only                     5 (5)
    Last one performed more than 3 years before the survey           1 (1)

What criteria are used to determine the need for competency assessment?
    Any test/task                                                   64 (61)
    Problem-prone test/task                                         17 (16)
    High-volume/complex test/task                                   11 (10)
    Gram stain                                                       6 (6)
    No formal program                                                4 (4)
    Other b                                                          3 (3)

What areas of bacteriology are assessed?
    Specimen processing                                             79 (75)
    Microscopy (Gram, AFB)                                          87 (83)
    Culture interpretation                                          77 (73)
    Organism identification                                         76 (72)
    Antimicrobial susceptibility testing                            68 (65)
    Non-culture-based direct detection                              68 (65)
    Equipment operation                                             67 (64)
    LIS/data management                                             57 (54)
    Biosafety                                                       86 (82)
    Molecular testing                                                4 (4)
    Other c                                                         23 (22)

No. of laboratories documenting competency assessments             104 (99)
No. of laboratories providing feedback to staff                     95 (95)

a: The pattern-of-practice survey was provided to 112 laboratories; 111 responded and, of these, 105 indicated they perform competency evaluations.
b: Other includes testing of EQA samples, upon implementation of new methodology, or for low-volume/infrequent tests.
c: Individual laboratories also indicated assessments were performed in other areas including, but not limited to, biosecurity reviews, inventory controls, specimen rejection, quality management, transportation of dangerous goods, occult blood testing, privacy, waste management, and reportable disease notification.
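As a quick sanity check on Table 1, the reported percentages can be reproduced by rounding each count against the 105-laboratory denominator. A minimal Python sketch (the dictionary layout and function name are illustrative, not from the paper):

```python
# Sketch: recompute the percentage column of Table 1 from the raw counts,
# assuming the 105 laboratories with a competency program as the denominator.
N = 105  # laboratories that perform competency evaluations

frequency_counts = {
    "Annually": 65,
    "At least every 2 years": 16,
    "More than once per year": 10,
    "When required/ad hoc": 8,
    "Prospective/retrospective review of EQA only": 5,
    "Last one performed more than 3 years before the survey": 1,
}

def as_percent(count, total=N):
    """Round to the nearest whole percent, as reported in the table."""
    return round(count / total * 100)

for label, count in frequency_counts.items():
    print(f"{label}: {count} ({as_percent(count)})")
```

For example, 65 of 105 laboratories assessing annually gives 61.9%, which rounds to the 62% shown in the table.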
Table 2: Examples of approaches for the evaluation of specific core competencies

1. Specimen processing and microscopy
   o Demonstrates appropriate selection and handling of media
   o Adheres to rejection criteria
   o Performance on EQA surveys
   o Gram stain correlation with culture outcomes or on EQA samples
   o Blinded internal unknowns

2. Culture interpretation, organism identification and antimicrobial susceptibility testing
   o Demonstrates ability to recognize clinically significant organisms in mixed culture
   o Performance on EQA surveys
   o Correlation between Gram stain and culture outcomes
   o Appropriate reporting of antimicrobial agents (appropriate for the organism and/or source)
   o Worksheet reviews
   o Review of QC testing

3. Other non-culture-based assays
   o EQA samples, following written procedures

4. Operation of analyzers, equipment and LIS data management
   o Demonstrates proficiency in equipment maintenance

5. Biosafety
   o Correct use of personal protective equipment (PPE)
   o Documentation of annual WHMIS safety training
   o Demonstration of knowledge of appropriate disposal of biomedical waste

Table 3: Documentation of competency assessment (n=105)

Documentation method                                           Percent of reporting laboratories
Include assessment reports in staff personnel folders               53
Sign-off of assessment documentation by staff and assessors         70
Electronic documentation                                            30
Other format a                                                      28
No documentation                                                     1

a: Other approaches include, but are not limited to, keeping assessments in a common binder, use of a spreadsheet, and retention of EQA records or corrected reports.
[Figure 1: bar chart. Y-axis: percent of laboratories performing competency evaluations (0-100). X-axis: proportion of laboratory technologists assessed (0, 1-20, 21-40, 41-60, 61-80, 81-100%).]

Figure 1: Proportion of laboratories performing staff competency evaluations in 2007 (dark bars), 2008 (white bars), 2009 (hatched bars) and 2010 (checkered bars). Laboratories were asked to review up to 25% of the staff's records to determine how many were evaluated for ongoing competency in that year.
Table 4: Common competency issues requiring remedial action (n=105)

Competency issue                                                         Percent of reporting laboratories
Gram stain misinterpretation                                                  27
Failure to follow, understand, or lack of familiarity with protocols          23
Antimicrobial susceptibility testing: method, interpretation, reporting       20
Pre-analytical: labeling, set-up, accessioning                                 6
Post-analytical: incorrect reporting, inappropriate turnaround time,
  communication                                                                7
LIS/data entry                                                                 7
Lack of familiarity with biosafety                                             7