Carter: External quality assessment in resource-limited countries

Introduction

Medical and public health laboratory services are a critical component of national health systems and are central to disease diagnosis, treatment, prevention, surveillance and outbreak investigations (1). When used optimally, laboratory medicine generates knowledge that facilitates patient safety, improves patient outcomes and leads to more cost-effective healthcare (2). In the United States of America, laboratory testing influences 60–70% of critical decision-making in health, with community laboratories performing at least 50% of all testing (3). In the primary healthcare setting in resource-limited (RL) countries, laboratory testing may influence 45% of medical decision-making (4).

Until recently, allocation of resources to laboratory testing was of low priority for healthcare systems in many sub-Saharan African (SSA) countries. Unreliable and inaccurate laboratory diagnostic testing has promoted the perception that laboratory testing is unhelpful and may compromise patient care (5). Major challenges facing laboratory services in under-developed settings include weak infrastructure, human capacity shortages, and lack of laboratory policies, strategic plans and integrated national quality management systems (6, 7). Funding opportunities for health systems strengthening in RL countries, such as the Global Health Initiative, the US President’s Emergency Plan for AIDS Relief and the Global Fund to Fight AIDS, Tuberculosis and Malaria, have increased demand for diagnostics and provided opportunities to address this neglect and strengthen the capacity of public health laboratory networks (8).

Microscopy remains an important laboratory diagnostic procedure in RL countries and provides the basis for managing and controlling several bacterial and parasitic infections, including tuberculosis (TB) and malaria. The problem of absent or poor-quality malaria microscopy has long been recognised and is attributed to multiple factors, including the skills of the laboratory workforce, workload, the condition of microscopes and the quality of laboratory supplies (9). Reyburn et al. found that only 46.1% of 4474 patients treated for severe malaria in 10 hospitals in Tanzania had positive blood films (10); Zurovac et al. showed 68.6% sensitivity and 61.5% specificity of malaria microscopy in outpatients attending 17 government health facilities in two districts in Kenya (11); Kahama-Maro et al. found 71.4% sensitivity and 47.3% specificity of routine blood slides in 12 urban public health facilities in Dar es Salaam, Tanzania (12). Misclassification of malaria species results from a lack of skills in microscopic diagnosis (13). Improving and monitoring malaria-related test performance in peripheral laboratories on a countrywide basis is achieved through training and supervision, but programmes must be sustained by national commitment (14). However, focusing on improving diagnosis for one disease may lead to over-diagnosis of another illness (15). Perceptions and attitudes of healthcare providers on the quality of laboratory services also affect their correct use for patient management (16).

Malaria rapid diagnostic tests (RDT) were introduced to address challenges of poor malaria microscopy and promote greater accessibility to malaria confirmation, but errors in performing RDT are widespread (17, 18). Community health workers can use RDT safely and accurately for up to 12 months post-training (19). Standardised product performance evaluations that distinguish between well and poorly performing tests are also essential (20).

An innovative approach to sustainably strengthen national laboratory systems in RL countries in Africa is the World Health Organization Regional Office for Africa (WHO AFRO) Stepwise Laboratory Quality Improvement Process Towards Accreditation (SLIPTA) framework linked to the Strengthening Laboratory Management Toward Accreditation (SLMTA) training programme. SLMTA was introduced in 2009 to provide a systematic, user-friendly approach to achieving laboratory quality improvement and accreditation (21) and uses a competency-based, step-wise process addressing daily laboratory operations for immediate and measurable laboratory improvement, harmonised to International Organization for Standardization (ISO) 15189 standards (8). Accreditation is the internationally accepted framework for verifying that laboratories adhere to established quality and competence standards for accurate and reliable patient testing. In some countries, laboratory accreditation is mandatory; in others, accreditation remains voluntary and may be driven by market incentives (22).

The World Health Organization (WHO) defines External Quality Assessment (EQA) as a system for objectively checking a laboratory’s performance using an external agency or facility (23). EQA participation is associated with improved laboratory performance over time and is a requirement for accreditation (24). However, many professionals in SSA countries are unable to effectively implement quality improvement programmes and many countries remain without an accredited clinical laboratory (25). Establishing, maintaining and demonstrating the accuracy of diagnostic tests is a major challenge for most laboratories in SSA, and the complexity and cost of setting up and maintaining quality assurance systems mean that very few laboratories, mainly tertiary or privately owned, can achieve these standards (26, 27). The common perception that EQA is costly or unnecessary has hindered the widespread enrolment of laboratories into EQA programmes (28).

EQA can be applied in three main ways, each with advantages and disadvantages: 1) proficiency testing (PT), in which an external provider sends samples with undisclosed results to laboratories or individual testers and provides feedback on the returned results; 2) rechecking or retesting of samples in higher-level or peer laboratories (inter-laboratory comparison); 3) on-site assessment by approved evaluators (23).

Materials and methods

A systematic search of the published literature was conducted to identify studies addressing quality monitoring and laboratory performance in RL countries. RL countries were defined as those with low-income and lower-middle-income economies (29). Unpublished reports were sought from national laboratory authorities and personnel for further information.

Results

Proficiency testing schemes

Several laboratories in RL countries are enrolled in international or regional PT schemes operated by commercial companies. These offer a wide range of discipline-specific schemes suitable mainly for reference or larger laboratories. Advantages include a high level of participation with different analytical methods, thereby increasing statistical validity. However, government laboratories in RL countries, unless partner supported, may be unable to sustain participation due to high cost. Some institutions in developed countries operate PT schemes that also support RL countries at cost. These include the United Kingdom National EQA Scheme (UK NEQAS), which provides a wide range of PT schemes and educational support, and the African Regional External Quality Assessment Scheme implemented by the National Health Laboratory Service (NHLS), South Africa, which offers a range of PT materials including stable inactivated mycobacterial dried culture spot (DCS) material for GeneXpert instruments performing the Xpert MTB/RIF assay (Cepheid, Sunnyvale, CA) (30).

Table 1 summarises evaluations of PT schemes in developed countries supporting RL countries. The National Institute for Communicable Diseases (NICD), South Africa, has provided PT support to national public health laboratories and related facilities in RL countries in the WHO AFRO and Eastern Mediterranean regions since 2002, supported by WHO AFRO and the WHO Office in Lyon, France. Levels of participation and performance were sub-optimal, with no upward trend in performance over time across all diseases despite several scheme-associated training visits provided by NICD and WHO AFRO (31). The Quality Assessment and Standardization for Immunological Measures Relevant to HIV/AIDS (QASI) programme, Canada, was established in 1997 to provide PT for CD4 enumeration at no or low cost to RL laboratories. An impact study demonstrated that PT programmes can improve overall laboratory performance despite the diversity of technologies employed (32). An impact study of CD4 EQA provided by the NHLS, South Africa, from 2002 to 2006 demonstrated how EQA provides an opportunity for post-market surveillance, standardising protocols and switching operating systems (33). A regional PT programme for TB smear microscopy conducted between 2003 and 2010 across South Asian Association for Regional Cooperation (SAARC) member states demonstrated the ongoing quality of diagnostic support to countries’ TB control programmes (34). The East African Regional External Quality Assessment Scheme (EA-REQAS), established in 2008 by the East African ministries of health, provides integrated PT panels for basic laboratory tests. By 2015, 16 surveys had been conducted, with enrolment increasing from 195 to 559 laboratories in four countries. Materials include blood slides for malaria and other blood parasites, smears for TB and Gram stain, preserved stool parasites, blood lysate for haemoglobin measurement, blood films for peripheral blood cell morphology, and human immunodeficiency virus (HIV) and syphilis serology (35).

Table 1

Proficiency testing schemes in developed countries supporting RL countries

NICD, South Africa (31)
Participants: 39 laboratories, increasing to 78, in 48 WHO member states
Methodology: bacteriological analyses for enteric diseases, meningitis and plague; unstained blood films for Y. pestis; stained and unstained TB smears; thick and thin blood films for malaria; 3 surveys per year
Results: each survey returned by 64–93% of participants; bacterial identification acceptable in 65% of enteric disease and 69% of meningitis challenges; serotyping and antibiotic susceptibility testing frequently unacceptable; microscopy acceptable for 73%, 87% and 82% of plague, TB and malaria challenges; 51% of malaria parasite quantitation scores acceptable

QASI, Canada (32)
Participants: 115 laboratories in 47 developed and RL countries
Methodology: 5 consecutive shipments of stabilised blood for CD4 enumeration
Results: survey responses from 23–92 laboratories; coefficient of variation for CD4% decreased from 7.2% to 4.7%, and for absolute CD4 counts from 14.2% to 8.8%

NHLS, South Africa (33)
Participants: 13–195 sites in South Africa and other SSA countries
Methodology: 20 trials of stabilised blood for CD4 enumeration
Results: response rate 86–100%; inter-laboratory precision 10.8% for CD4% counts and 11.9% for absolute CD4 counts

SAARC TB Reference Laboratory, India (34)
Participants: 10 laboratories in 8 SAARC member states
Methodology: 7 rounds with a total of 778 TB slides
Results: 87.5% to 100% inter-laboratory agreement; 14 minor errors and 1 major error; sensitivity 90.9–100%; specificity 83.33–100%
NICD – National Institute for Communicable Diseases; WHO – World Health Organization; TB – tuberculosis; QASI - Quality Assessment and Standardization for Immunological Measures Relevant to HIV/AIDS; RL – resource limited; CD4 - cluster of differentiation 4; NHLS – National Health Laboratory Service; SAARC – South Asian Association for Regional Cooperation.
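
The CD4 figures in Table 1 (coefficients of variation and inter-laboratory precision) describe how widely participants' results scatter around the group mean for a single PT sample. A minimal sketch of the generic between-laboratory coefficient of variation is given below; the laboratory counts are hypothetical and the calculation is the generic statistic, not the exact one used by any particular scheme.

```python
import statistics

def interlab_cv(results):
    """Between-laboratory coefficient of variation (%) for one PT sample:
    sample standard deviation across laboratories divided by the mean."""
    return 100 * statistics.stdev(results) / statistics.mean(results)

# Hypothetical absolute CD4 counts (cells/uL) reported by eight laboratories
# for the same stabilised blood sample.
counts = [410, 432, 395, 450, 405, 470, 420, 415]
print(round(interlab_cv(counts), 1))
```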

Many RL countries operate national disease-specific PT programmes. Mukadi et al. described four studies in the Democratic Republic of Congo (DRC) between 2010 and 2014; three involved submission of single panels of blood slides for microscopy to 183, 356 and 445 laboratories respectively (36-38). Panel composition and results are presented in Table 2. In the second study, there was a lower frequency of serious errors in assessing parasite density (17.2% vs. 52.3%, P < 0.001) and reporting false-positive results (19.0% vs. 33.3%, P < 0.001) by laboratories that had participated in the first study; laboratories with a high number of sleeping sickness cases recognised trypanosomes more frequently (57.0% vs. 31.2%, P < 0.001). In the third study the proportion of correct or acceptable scores was higher among EQA-experienced participants compared to first time participants (40.9% vs. 22.4%, P = 0.001), and higher among those trained < 2 years previously compared to those not trained (42.9% vs. 26.3%, P = 0.01). The fourth study analysed interpretation of photographs of RDT results by laboratory workers; the most frequent errors were failure to recognise invalid or negative test results, overlooking faint test lines and incorrectly identifying malaria parasite species (17).

Table 2

Blood slide proficiency testing studies in the Democratic Republic of Congo

Study 1 (36): 183 participating laboratories, 174 responding (95.1%)
Materials: three thick and thin Giemsa-stained blood films (P. falciparum, 177,000/μL; P. falciparum, 86/μL; parasite negative) and a parasite-negative thin blood film containing Howell-Jolly bodies
Results: 34.3%, 21.5% and 5.8% errors in 1, 2 and 3 slides; Howell-Jolly bodies not recognised, with 16.7% reporting P. falciparum

Study 2 (37): 356 participating laboratories, 277 responding (77.8%)
Materials: four thick and thin Giemsa-stained blood films (P. falciparum gametocytes; parasite negative; Trypanosoma brucei spp.; P. falciparum, 113,530/μL)
Results: 35.0% reported all four slides correctly; 17.5% did not recognise P. falciparum gametocytes; 19.0% reported parasites in the negative slide; 50.4% did not recognise trypanosomes

Study 3 (38): 445 participating laboratories, 400 responding (89.9%)
Materials: five thick and thin Giemsa-stained blood films (parasite negative; P. malariae; P. falciparum (09 ± 107/μL) with Trypanosoma (1–3/microscopic field); Trypanosoma (0–1/4 microscopic fields); P. ovale (1,938 ± 326/μL))
Results: 30.6% reported malaria in the negative slide; 11.0% reported P. malariae correctly; 32.5%/16.5% reported P. falciparum/Trypanosoma and 6.0% reported both; 44.9% reported Trypanosoma; 6.2% reported P. ovale while 68.8% reported malaria

Study 4 (17): 1892 participating health workers, 1849 responding (97.7%)
Materials: 10 photographs of malaria RDT results
Results: 18.5% correct scores (10/10); 24.1% scores of 9–9.5; 22.2% scores < 6.5

RDT – rapid diagnostic test.

Rechecking and mixed EQA schemes

Many RL countries operate slide-rechecking schemes as part of national malaria, TB and HIV control programmes. Malaria and TB slide rechecking is performed by supervisors on-site, at next-level health facilities or at central reference laboratories; rechecking of HIV RDT results is usually performed at central level. Since the blanket approach of rechecking 100% of positive and 10% of negative acid-fast bacilli (AFB) slides was withdrawn (39), most studies reported using standard lot quality assurance sampling (LQAS), based on the annual laboratory volume of AFB smears and the proportion of positive smears, to achieve an overall sensitivity of 75–85%, together with processes for blinded rechecking. A significantly greater percentage of errors is detected on randomly selected, blinded AFB smears than on non-randomly selected, unblinded smears; knowledge of prior results influences the re-reading of TB slides (40-42).
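
The LQAS sample sizes referred to above are typically derived from a hypergeometric sampling model: the number of negative smears to recheck is chosen so that, if the laboratory's true sensitivity had fallen to the target threshold, at least one false-negative slide would very probably be included in the sample. The sketch below illustrates the general idea only; the workload figures are illustrative, an acceptance number of zero is assumed, and real programmes should use the published LQAS tables rather than this simplified calculation.

```python
from math import comb

def lqas_sample_size(annual_negatives, annual_positives,
                     target_sensitivity=0.80, confidence=0.95):
    """Smallest number of negative AFB smears to recheck per year so that,
    if true sensitivity were at the target threshold, at least one
    false-negative slide would be found with the stated confidence
    (acceptance number = 0)."""
    # False negatives expected among reported negatives if sensitivity were
    # exactly at the threshold: true positives = positives / S, so the
    # missed positives = positives * (1 - S) / S.
    defectives = max(1, round(annual_positives * (1 - target_sensitivity)
                              / target_sensitivity))
    for n in range(1, annual_negatives + 1):
        # Hypergeometric probability of drawing zero defective slides.
        p_zero = comb(annual_negatives - defectives, n) / comb(annual_negatives, n)
        if 1 - p_zero >= confidence:
            return n
    return annual_negatives

# Illustrative laboratory: 2000 negative and 220 positive smears per year.
print(lqas_sample_size(2000, 220))
```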

Table 3 summarises studies assessing laboratory performance through rechecking of AFB smears, malaria slides and HIV RDT results; some studies used mixed EQA methods. Malik et al. reported that random blinded rechecking of AFB smears is feasible on a large scale, with 40,506 slides rechecked over a two-year period at 24 sites in New Delhi, India (43). Selvakumar et al. reported that rechecking 8 slides per month per microscopy centre was sufficient to assess the performance of sputum smear microscopy in 12 centres in one district in India, based on a sensitivity of 80%, specificity of 100%, slide positivity rate of 10% and annual negative smear volume of 1000–5000 slides (44). They also noted that the discrepancy between microscopy centres and district supervisors was higher (4.7%) than with the reference laboratory (1%), suggesting a lack of training and motivation at district level (45). Shiferaw et al. noted that false negative results during rechecking were associated with unavailability of microscope lens cleaning solution and dirty smears, and false positive results with no previous EQA experience (46). Shargie et al. suggested that disagreements during rechecking between regional and national reference readers are likely due to the low positivity of discordant slides (47). Mundy et al. established excellent concordance (98.2%) on rechecking TB smears after instituting basic internal quality control in one district in Malawi (48). Manalebh et al. studied three EQA methods to assess AFB smear microscopy performance in Ethiopia: on-site evaluation, blinded rechecking and panel testing; poor staining and low positivity were major factors affecting PT performance (49). Ayana et al. employed similar mixed approaches to assess AFB smear microscopy at university practical attachment sites in Ethiopia; poor working environments were a major contributor to poor performance (50). Some studies noted that re-staining during rechecking at national level significantly improved the sensitivity of low-positive AFB smears (41, 51).

Table 3

Summary of rechecking and mixed EQA studies

India (41)
Methodology: 41,978 TB slides from 12 microscopy centres rechecked by supervisors and national level
Results: false negatives 2–7% by centres, 0–3% by supervisors; re-staining and blinded re-reading reduced false positives from 27% to 7%

India (43)
Methodology: 40,506 TB slides from 183 microscopy centres rechecked by district supervisors
Results: 446 errors (2.2%)

India (44, 45)
Methodology: 1547 TB slides from 7 microscopy centres rechecked by district supervisors; 726 TB slides from 5 microscopy centres rechecked at reference level
Results: 70 errors by district supervisors; 2 errors by reference laboratory reader

Ethiopia (46)
Methodology: 39,725 TB slides from 201 laboratories rechecked at 26 EQA centres
Results: 68 false positives in 41 (20.4%) laboratories; 67 false negatives in 29 (14.4%) laboratories

Ethiopia (47)
Methodology: 2209 TB slides rechecked at regional level; discordant slides rechecked at national level
Results: 96.8% overall agreement; 3.2% false positives; 74% agreement between regional and national readers on 95 discordant slides

Malawi (48)
Methodology: 208 TB smears from 1 hospital and 32 health centres rechecked at district hospital
Results: concordance 98.1% for positive smears, 98.2% for negative smears

Ethiopia (49)
Methodology: 37 public-private mix laboratories; 1123 TB smears rechecked at regional laboratory; PT of 370 TB panel smears
Results: 99.4% agreement; overall agreement 96%, with errors reading unstained vs. ready-stained smears 62% vs. 38%

Ethiopia (50)
Methodology: 8 health institution laboratories; 578 TB smears rechecked at university laboratory; PT of 10 TB panel smears
Results: 94.5% agreement, 3 (3.75%) major errors (high false positives); overall error 17 (25.25%), comprising 14 (17.5%) minor errors and 3 (3.75%) major errors (high false positives)

Burundi (51)
Methodology: 1014 TB slides from 72 microscopy centres rechecked at reference level
Results: 31.2%/6.9% false positives and 1.2%/4.1% false negatives before/after re-staining

Rwanda (53)
Methodology: 20 positive and 20 negative malaria slides from 3 laboratories rechecked at reference level
Results: 96.67% agreement

Pakistan (54)
Methodology: on-site rechecking of 1170 malaria slides in 4 districts by district supervisors
Results: 0.5–1% discordance

Kenya (55)
Methodology: 4514 malaria slides rechecked from 17 health centres by study microscopist
Results: average sensitivity 96%, specificity 88%

Nigeria (56)
Methodology: TB slides from 5 centres and malaria slides from 2 centres rechecked at state level
Results: from baseline to final assessment, concordance for TB microscopy increased from 81% to 91.0%; concordance for malaria microscopy increased from 69.2% to 83.3% at 1 laboratory and decreased from 100% to 83.3% in the second laboratory

Nepal (57)
Methodology: DBS from 5 centres retested using repeat HIV RDT and ELISA at national level
Results: RDT results 100% concordant; ELISA 32 samples (91.4%) concordant, 3 samples (8.6%) discordant; sensitivity 88.9%, specificity 94.0%

Ethiopia (58)
Methodology: 64 TB slides, 64 malaria slides and 64 HIV blood samples from 4 health centres rechecked at district level
Results: agreement of 98.4% (63/64), 92.2% (59/64) and 95.3% (61/64) for TB microscopy, malaria microscopy and HIV rapid testing
TB – tuberculosis; EQA – external quality assessment; PT – proficiency testing; DBS – dried blood spots; RDT – rapid diagnostic test; ELISA – enzyme-linked immunosorbent assay.

Random monthly selection of five low-density and five negative malaria slides has been proposed for routine rechecking (52). A study from Rwanda reported excellent quality of malaria slide re-reading, without grading of positive slides (53). Khan et al. studied malaria slide reading performance in Pakistan after instituting three activities: on-site rechecking of malaria slides, uninterrupted availability of laboratory reagents and supplies, and supervision by district supervisors (54); Wafula et al. instituted four concurrent activities in Kenya: re-reading of malaria films; feedback on the quality of slide preparation; preparation of monthly laboratory performance reports; and on-the-job mentorship (55). These studies demonstrated the beneficial effects of additional support on performance. Sarkinfada et al. described systematic integration of malaria slide rechecking within an existing TB microscopy-rechecking scheme in Nigeria. Similar systems were adopted for recording results, slide sampling and storage, feedback mechanisms and monitoring visits to laboratories. Use of the same microscopists and microscopes, and joint training and supervision, was feasible at laboratory level, but successfully integrating these schemes requires combined TB and malaria management support at national level (56). Thapa et al. used dried blood spots (DBS) for repeat HIV RDT and enzyme-linked immunosorbent assay (ELISA) testing at reference level in Nepal and demonstrated superior sensitivity and specificity of ELISA testing (57). Manyazewal et al. integrated TB and malaria slide rechecking with HIV retesting in Ethiopia and emphasised the role secondary laboratories can play in assuring laboratory quality at primary level, avoiding dependence on remotely located national and regional laboratories (58).
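
Most of the rechecking studies in Table 3 summarise performance as concordance, sensitivity and specificity of peripheral readings against blinded reference re-readings of the same slides. A minimal sketch of how these figures follow from the cross-tabulation of peripheral and reference results is shown below; the slide counts are hypothetical.

```python
def rechecking_summary(tp, fn, fp, tn):
    """Summarise peripheral slide reading against blinded reference re-reading.
    tp: positive at both levels; fn: peripheral negative, reference positive;
    fp: peripheral positive, reference negative; tn: negative at both levels."""
    total = tp + fn + fp + tn
    return {
        "concordance": (tp + tn) / total,  # overall agreement
        "sensitivity": tp / (tp + fn),     # reference positives detected peripherally
        "specificity": tn / (tn + fp),     # reference negatives confirmed peripherally
        "false_negatives": fn,
        "false_positives": fp,
    }

# Hypothetical rechecking round: 200 slides re-read blind at reference level.
print(rechecking_summary(tp=38, fn=4, fp=3, tn=155))
```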

Competency assessment of slide readers

Vieira et al. proposed double-blind sputum smear microscopy readings using a panel of 75 slides (36 negative, 4 inconclusive, 35 positive) to assess the proficiency of readers and supervisors in TB smear microscopy (59). Dave et al. assessed the technical performance of laboratory supervisors using testing panels comprising 5 heat-fixed unstained sputum slides (including TB positives of any grade and negatives) for staining and examination; only 5% of 295 readers reported any type of error (60). Ayalew et al. provided 80 laboratory professionals from public and private health facilities in Ethiopia with 10 panel slides containing P. falciparum, P. vivax, mixed species and negative slides. Overall agreement in malaria parasite detection and species identification was 88% and 74.3% respectively; agreement was lower for slides with low parasite density and mixed infection (61). A multi-country network to address the competency of malaria microscopists was established in Asia in 2003, with five-day courses to assess competency in parasite detection, species identification and parasite quantitation conducted by an external facilitator using reference slide panels from the regional slide bank at the Research Institute for Tropical Medicine, Manila, Philippines. By 2011, 60 competency assessment exercises had been conducted in 14 countries; microscopists from 5 countries showed significant improvements in performance scores (62). This programme has translated into globally recognised standards of competency in malaria microscopy (52).

Benefits

EQA schemes are valuable for recognising laboratory errors and identifying underlying problems facing peripheral laboratories (50), but participation in PT schemes may not by itself improve performance. Mandy et al. and Mukadi et al. noted improved performance after several rounds of participation in PT (32, 37); however, Paramasivan et al. noted no significant improvement after submitting five rounds of TB panels, even after communicating observed deficiencies to participants (63). Frean et al. noted little improvement in performance over time in the NICD-supported PT programme (31). Van Rie et al. evaluated the effects of 5-day refresher training for laboratory technicians and distribution of new microscopes on the quality of TB smear microscopy in 13 primary healthcare laboratories in Kinshasa, DRC, through blinded rechecking of slides, but no long-term effect was demonstrated, with major errors occurring in 10 (77%) clinics after 9 months (64).

Challenges

Despite the widespread use of malaria RDT in RL countries, the tools available for monitoring field performance are limited. Masanja et al. compared RDT performance with malaria reference microscopy and detection of parasite DNA by real-time quantitative polymerase chain reaction (qPCR) on DBS. Malaria RDT had a higher positivity rate (6.5%) than qPCR (4.2%) or microscopy (2.9–2.5%), with poor correlation between RDT and microscopy results. Overall, agreement among the three diagnostic approaches was limited and neither microscopy nor qPCR was suitable for RDT quality monitoring under field conditions (65). McMorrow et al. noted that quality monitoring of RDTs using poor quality blood films might undermine confidence in RDT use (66). Blood samples containing malaria antigens for RDT PT must be stored at 4 °C for a maximum of 48 hours, or for longer periods at –70 °C (67). Methods for preparing stable positive controls from cultured P. falciparum using DBS and dried tube specimens (DTS) have been demonstrated (68, 69). Tamiru et al. reported the performance of DTS PT challenges at 1000 and 500 parasites/µL in a field trial in Ethiopia; false negative DTS results at 500 parasites/µL were reported after 24 weeks’ storage due to errors in interpreting faint test lines (70). Aggett reported the stability of CD4 PT samples as only 7 days at ambient temperature and 14 days when refrigerated (33). There are no internationally agreed EQA methods for blinded rechecking of fluorescent TB smears (71).

Studies have highlighted the difficulties in RL countries of transporting panels to peripheral sites and returning results, or of submitting samples for rechecking to reference laboratories. Mukadi et al. used a combination of on-site delivery by car and private air carrier to provincial airports in DRC (38), with responses returned by short message service (SMS) (17). The NICD, South Africa, scheme used express air courier (31). Motorcycles were used to support EQA of malaria microscopy in Pakistan (54). There are obvious logistical advantages to using district-level laboratories to conduct rechecking, with reduced distances and costs and easier feedback to laboratories (58).

Costs

Published information on the costs of operating EQA programmes in RL settings is limited. Khan et al. reported a capital cost of approximately 90,000 Pakistani rupees (USD 1400) to implement a district malaria-microscopy EQA scheme, including motorcycles and training of district laboratory supervisors. This amounted to a 50% increase in the direct per-slide cost, but the authors argued that this is marginal relative to the high capital and recurrent costs of microscopy services and to the scheme’s value in rationalising anti-malarial drug use (54). Mukadi et al. reported that the cost of conducting one PT survey in DRC was USD 10,000 excluding salaries (USD 25 per participant) (38).

Discussion

Countries with developing healthcare systems mostly lie in tropical areas where common diseases, such as malaria and diarrhoeal diseases, require immediate diagnosis (within a few hours). Coupled with poor communication across large distances, this necessitates placing laboratory testing close to where patients seek care, resulting in large numbers of small laboratories working independently. Most small laboratories still perform mainly manual assays, which are particularly prone to errors during sample collection, labelling and registration; and many laboratory staff at this level lack skills in recognising pathology, and transcribing and delivering results. Combined, these errors can lead to significant variance in the accuracy of results, leading to incorrect diagnosis, inappropriate treatment or withholding of lifesaving therapy (22). Lack of adequate resources to support these laboratory networks has resulted in equipment breakdowns, interruption of supplies and variable performance. For many laboratories across RL countries, the quality of services is unknown.

Participation in EQA is ideally required for all testing procedures performed in a laboratory. Where an established PT scheme is not available, alternative EQA mechanisms should be considered. All EQA approaches depend for their effectiveness on following national or regional protocols, good communication and feedback, and instituting corrective measures. The benefits of EQA schemes rest on mandatory participation, timely return of results with practical suggestions for corrective action, and the ability of participating laboratories to address deficiencies. Nothing is gained from EQA participation unless the information received is used to drive laboratory improvement (23).

PT programmes may be organised at national, regional and international levels and may be funded through government agencies, operated as arms of corporations, or run on a cost-recovery basis. Most government-supported schemes in RL countries address single diseases, such as malaria, TB and HIV infection, using a vertical approach; developed country and commercial schemes address a range of laboratory disciplines. Some PT schemes focus on different levels of the laboratory system, such as the NICD scheme (national public health laboratories) and EA-REQAS (primary level laboratories). Commercial PT providers and developed country PT schemes are accredited to ISO 17043:2010 standards, but few PT schemes in RL countries are ISO compliant.

Rechecking schemes are commonly incorporated into national disease control programmes. WHO recommends integrating malaria microscopy rechecking with rechecking for other microscopically diagnosed communicable diseases, which is feasible with proper coordination at peripheral and national levels (52, 56). Rechecking schemes require laboratories to follow correct procedures to avoid slide selection bias. Laboratory workers may retain slides of good quality and staining regardless of instructions, and some laboratories may not submit slides for rechecking due to lack of confidence in their performance or uncertainty about the implications of unsatisfactory performance (47, 49). Few reports indicated the competency of the staff conducting the rechecking process; most relied on concordance of slide readings to determine accuracy. This can be addressed by including reference-centre rechecking that assesses both peripheral laboratory technicians and their immediate supervisors, and by conducting regular competency assessments of slide readers. Various standards have been proposed to assess competency in TB slide reading, but only malaria microscopy has an established, globally recognised competency assessment programme (52, 59, 60). Regular competency assessment of supervisors and reference-level staff urgently needs to be incorporated into national EQA programmes.

A process of rechecking can also be applied to rapid testing assays, such as RDT for HIV, where an alternative technique, such as enzyme immunoassay (EIA) or ELISA, is used on dried blood or serum spot samples (57). Rechecking of samples by peer or higher-level laboratories (inter-laboratory comparison) is appropriate for specialised tests for which no PT schemes exist, or for single unusual results; however, no published studies were identified demonstrating the use or benefits of inter-laboratory comparisons in RL countries. On-site visits to laboratories by qualified auditors using standard checklists also provide a reliable EQA mechanism, enabling practical improvements to address identified gaps.

PT schemes are limited by the availability of stable PT materials that can withstand the conditions of heat and humidity often found in RL countries, but they have the unique advantage of being able to address uncommon pathology for which staff need to retain competence. However, PT samples are constrained by being unable to provide challenges that mimic some patient samples, such as living organisms and cells (72). PT panels may be placed within a clinical context and involve other health worker cadres in responses (73, 74). Although all EQA programmes can provide inter-laboratory comparisons (benchmarking) when adequate data management systems are in place, this aspect is particularly suited to PT programmes, where data are collected and analysed centrally. PT schemes use either referee laboratories or consolidated results from participants to set target values; in RL countries, the use of participant results may lower precision when new participants join a scheme, but precision is usually restored once participants become more experienced (33).
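
Where participant consensus is used to set target values, one common approach (broadly in the spirit of ISO 13528) is to take a robust estimate of the centre and spread of participants' results and report each return as a z-score; conventionally |z| ≤ 2 is regarded as satisfactory, 2 < |z| < 3 questionable and |z| ≥ 3 unsatisfactory. The sketch below is illustrative only and does not reproduce the scoring rules of any specific scheme; the CD4 counts are hypothetical.

```python
import statistics

def score_pt_results(results):
    """Consensus target value (median), robust SD (1.4826 x the median
    absolute deviation) and a z-score for each participant result."""
    assigned = statistics.median(results)
    mad = statistics.median([abs(x - assigned) for x in results])
    robust_sd = 1.4826 * mad
    return assigned, robust_sd, [(x - assigned) / robust_sd for x in results]

# Hypothetical absolute CD4 counts (cells/uL) returned by nine laboratories
# for one PT sample; the outlying result stands out with |z| > 3.
counts = [512, 498, 530, 505, 470, 620, 515, 488, 500]
assigned, sd, z = score_pt_results(counts)
print(assigned, round(sd, 1), [round(v, 2) for v in z])
```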

Although every laboratory should treat PT samples as routine samples, this is impossible to monitor and enforce. Most laboratories in RL settings pay special attention to PT samples, especially for pathology recognition by microscopy. PT results are therefore likely to be the very best that a laboratory can produce, and poor PT results may indicate even worse performance under routine conditions. Participation in PT schemes should not be punitive, but should be regarded as an educational tool to objectively assess laboratory performance and direct improvement efforts. Regular participation is the first step to using PT as an effective tool for laboratory improvement, and benefits will accrue if laboratories review results and possible causes of errors, re-examine samples, keep records of performance and use schemes as group learning exercises. Keeping health facility managers and authorities informed of the reasons for poor performance provides the justification for allocating resources to maintain quality. At central level, PT schemes can provide ongoing post-market surveillance of commercial test kits, quantify errors associated with a particular technology, identify the best technologies to use, identify training needs and indicate the need for governments to validate and standardise equipment and methodologies (33).

Many published reports describe unacceptable laboratory performance in EQA in RL countries, indicating the vital importance of ongoing monitoring and corrective action. Implementation of corrective actions is primarily the responsibility of laboratory personnel and management and depends on established hierarchical supervisory structures. Supervisors making on-site visits can assess the pre- and post-analytical aspects of laboratory procedures and address technical performance through mentorship and by ensuring functional equipment and adequate supplies; only a field visit can convey a realistic picture of the conditions under which technicians work (63). Supervisory visits are more effective when standard checklists are used systematically (75). Visits by effective supervisors are highly motivating for laboratory workers, but are time-consuming and expensive when considered across thousands of small laboratories; regular visits may not be sustained unless they are linked with PT performance to target poorly performing laboratories. Some PT schemes offer assistance with training and corrective action or link with entities that provide this support (31, 33).

Novel applications using mobile technology may enhance the reach and benefits of EQA programmes in RL countries. Mobile camera phones can capture and transmit images directly from the eyepiece of an ordinary laboratory microscope to a central review site for feedback (76-78). The rapid expansion of mobile networks and internet coverage, and decreasing operational costs, offer opportunities to develop PT programmes that provide images of rare pathology or pathology that cannot be mass produced, such as histology specimens or organisms found in spleen or bone marrow. Several PT providers in developed countries are already implementing this approach.

Acknowledgements

The author would like to thank Professor Michael Noble, Department of Pathology and Laboratory Medicine, University of British Columbia (UBC) and Chair of the UBC Program Office for Laboratory Quality Management for reading through the manuscript and providing valuable suggestions.

Notes

Conflicts of interest: None declared.

References

1 

Nkengasong JN. Strengthening laboratory services and systems in resource-poor countries. Am J Clin Pathol. 2009;131:774. https://doi.org/10.1309/AJCP8GYX8KTKDATZ

2 

Beastall GH. Adding value to laboratory medicine: a professional responsibility. Clin Chem Lab Med. 2013;51:221–7. https://doi.org/10.1515/cclm-2012-0630

3 

Forsman RW. Why is the laboratory an afterthought for managed care organizations? Clin Chem. 1996;42:813–6.

4 

Carter JY, Lema OE, Wangai MW, Munafu CG, Rees PH, Nyamongo JA. Laboratory testing improves diagnosis and treatment outcomes in primary health care facilities. Afr J Lab Med. 2012;•••:1.

5 

Petti CA, Polage CR, Quinn TC, Ronald AR, Sande MA. Laboratory medicine in Africa: a barrier to effective health care. Clin Infect Dis. 2006;42:377–82. https://doi.org/10.1086/499363

6 

Birx D, de Souza M, Nkengasong J. Laboratory challenges in the scaling up of HIV, TB, and malaria programs. The interaction of health and laboratory systems, clinical research, and service delivery. Am J Clin Pathol. 2009;131:849–51. https://doi.org/10.1309/AJCPGH89QDSWFONS

7 

Vitoria M, Granich R, Gilks CF, Gunneberg C, Hosseini M, Were W, et al. The Global Fight Against HIV/AIDS, Tuberculosis, and Malaria. Current status and future perspectives. Am J Clin Pathol. 2009;131:844–8. https://doi.org/10.1309/AJCP5XHDB1PNAEYT

8 

Nkengasong JN, Nsubuga P, Nwanyanwu O, Gershy-Damet G-M, Roscigno G, Bulterys M, et al. Laboratory systems and services are critical in global health. Time to end the neglect? Am J Clin Pathol. 2010;134:368–73. https://doi.org/10.1309/AJCPMPSINQ9BRMU6

9 

Wongsrichanalai C, Barcus MJ, Muth S, Sutamihardja A, Wernsdorfer WH. A review of malaria diagnostic tools: microscopy and rapid diagnostic test (RDT). Am J Trop Med Hyg. 2007;77:119–27.

10 

Reyburn H, Mbatia R, Drakeley C, Carneiro I, Mwakasungula E, Mwerinde O, et al. Overdiagnosis of malaria in patients with severe febrile illness in Tanzania: a prospective study. BMJ. 2004;329:1212–7. https://doi.org/10.1136/bmj.38251.658229.55

11 

Zurovac D, Midia B, Ochola SA, English M, Snow RW. Microscopy and outpatient malaria case management among older children and adults in Kenya. Trop Med Int Health. 2006;11:432–40. https://doi.org/10.1111/j.1365-3156.2006.01587.x

12 

Kahama-Maro J, D’Acremont V, Mtasiwa D, Genton B, Lengeler C. Low quality of routine microscopy for malaria at different levels of the health system in Dar es Salaam. Malar J. 2011;10:332. https://doi.org/10.1186/1475-2875-10-332

13 

Obare P, Ogutu B, Adams M, Odera JS, Lilley K, Dosoo D, et al. Misclassification of Plasmodium infections by conventional microscopy and the impact of remedial training on the proficiency of laboratory technicians in species identification. Malar J. 2013;12:113. https://doi.org/10.1186/1475-2875-12-113

14 

Bates I, Bekoe V, Asamoa-Adu A. Improving the accuracy of malaria-related laboratory tests in Ghana. Malar J. 2004;3:38. https://doi.org/10.1186/1475-2875-3-38

15 

Mosha JF, Conteh L, Tediosi F, Gesase S, Bruce J, Chandramohan D, et al. Cost implications of improving malaria diagnosis: findings from north-eastern Tanzania. PLoS One. 2010;5:e8707. https://doi.org/10.1371/journal.pone.0008707

16 

Derua YA, Ishengoma DRS, Rwegoshora RT, Tenu F, Massaga JJ, Mboera LEG, et al. Users’ and health service providers’ perception on quality of laboratory malaria diagnosis in Tanzania. Malar J. 2011;10:78. https://doi.org/10.1186/1475-2875-10-78

17 

Mukadi P, Gillet P, Lukuka A, Mbatshi J, Otshudiema J, Muyembe JJ, et al. External Quality Assessment of reading and interpretation of malaria rapid diagnostic tests among 1849 end-users in the Democratic Republic of the Congo through Short Message Service (SMS). PLoS One. 2013;8:e71442. https://doi.org/10.1371/journal.pone.0071442

18 

Seidahmed OME, Mohamedein MMN, Elsir AA, Ali FT, Malik EFM, Ahmed ES. End-user errors in applying two malaria rapid diagnostic tests in a remote area of Sudan. Trop Med Int Health. 2008;13:406–9. https://doi.org/10.1111/j.1365-3156.2008.02015.x

19 

Counihan H, Harvey SA, Sekeseke-Chinyama M, Hamainza B, Banda R, Malambo T, et al. Community health workers use malaria rapid diagnostic tests (RDTs) safely and accurately: results of a longitudinal study in Zambia. Am J Trop Med Hyg. 2012;87:57–63. https://doi.org/10.4269/ajtmh.2012.11-0800

20 

WHO, FIND, CDC. Malaria rapid diagnostic test performance: results of WHO product testing of malaria RDTs: round 5 (2013). Geneva: World Health Organization; 2014.

21 

Alemnji GA, Zeh C, Yao K, Fonjungo PN. Strengthening national health laboratories in sub-Saharan Africa: a decade of remarkable progress. Trop Med Int Health. 2014;19:450–8. https://doi.org/10.1111/tmi.12269

22 

Peter TF, Rotz PD, Blair DH, Khine A-A, Freeman RR, Murtagh MM. Impact of laboratory accreditation on patient care and the health system. Am J Clin Pathol. 2010;134:550–5. https://doi.org/10.1309/AJCPH1SKQ1HNWGHF

23 

World Health Organization. 2011. Overview of external quality assessment (EQA): module 10, content sheet 10-1. WHO, Geneva, Switzerland. Available at: http://www.who.int/ihr/training/laboratory_quality/10_b_eqa_contents.pdf. Accessed April 3rd 2016.

24 

Noble MA. Does external evaluation of laboratories improve patient safety? Clin Chem Lab Med. 2007;45:753–5. https://doi.org/10.1515/CCLM.2007.166

25 

Mesfin EA, Taye B, Belay G, Ashenafi A. The status of medical laboratory towards of AFRO WHO accreditation process in government and private health facilities in Addis Ababa, Ethiopia. Pan Afr Med J. 2015;22:136. https://doi.org/10.11604/pamj.2015.22.136.7187

26 

Bates I, Maitland K. Are laboratory services coming of age in sub-Saharan Africa? Clin Infect Dis. 2006;42:383–4. https://doi.org/10.1086/499368

27 

Kibet E, Moloo Z, Ojwang PJ, Sayed S, Mbuthia A, Adam RD. Measurement of improvement achieved by participation in international laboratory accreditation in sub-Saharan Africa. The Aga Khan University Hospital Nairobi experience. Am J Clin Pathol. 2014;141:188–95. https://doi.org/10.1309/AJCPV8A9MRWHGXEF

28 

Peter T, Badrichani A, Wu E, Freeman R, Ncube B, Ariki F, et al. Challenges in implementing CD4 testing in resource-limited settings. Cytometry B Clin Cytom. 2008;74:S123–30. https://doi.org/10.1002/cyto.b.20416

29 

World Bank. New country classifications. Available at: http://data.worldbank.org/news/new-country-classifications. Accessed April 3rd 2016.

30 

Gous N, Isherwood LE, David A, Stevens W, Scott LE. A pilot evaluation of external quality assessment of GenoType MTBDRplus versions 1 and 2 using dried culture spot material. J Clin Microbiol. 2015;53:1365–7. https://doi.org/10.1128/JCM.03340-14

31 

Frean J, Perovic O, Fensham V, McCarthy K, von Gottberg A, de Gouveia L, et al. External quality assessment of national public health laboratories in Africa, 2002–2009. Bull World Health Organ. 2012;90:191–9A. https://doi.org/10.2471/BLT.11.091876

32 

Mandy F, Bergeron M, Houle G, Bradley J, Fahey J. Impact of the international program for Quality Assessment and Standardization for Immunological Measures Relevant to HIV/AIDS: QASI. Cytometry. 2002;50:111–6. https://doi.org/10.1002/cyto.10088

33 

Aggett H. The impact of a CD4 External Quality Assessment Programme for Southern Africa and Africa. Johannesburg: University of Witwatersrand; 2009.

34 

Jha K, Thapa B, Salhotra V, Afridi N. Panel testing of sputum smear microscopy of national tuberculosis reference laboratories in SAARC region: 2003-2010. SAARC J Tuber Lung Dis HIV/AIDS 2011;8:31-5.

35 

Munene S, Songok J, Munene D, Carter J. Implementing a regional integrated laboratory proficiency testing scheme for peripheral health facilities in East Africa. Biochem Med (Zagreb). 2017;27:110–3. https://doi.org/10.11613/BM.2017.014

36 

Mukadi P, Gillet P, Lukuka A, Atua B, Kahodi S, Lokombe J, et al. External quality assessment of malaria microscopy in the Democratic Republic of the Congo. Malar J. 2011;10:308. https://doi.org/10.1186/1475-2875-10-308

37 

Mukadi P, Gillet P, Lukuka A, Atua B, Sheshe N, Kanza A, et al. External quality assessment of Giemsa-stained blood film microscopy for the diagnosis of malaria and sleeping sickness in the Democratic Republic of the Congo. Bull World Health Organ. 2013;91:441–8. https://doi.org/10.2471/BLT.12.112706

38 

Mukadi P, Lejon V, Barbé B, Gillet P, Nyembo C, Lukuka A, et al. Performance of microscopy for the diagnosis of malaria and human African trypanosomiasis by diagnostic laboratories in the Democratic Republic of the Congo: results of a nation-wide external quality assessment. PLoS One. 2016;11:e0146450. https://doi.org/10.1371/journal.pone.0146450

39 

APHL, CDC, IUATLD, KNCV, RIT, WHO. External Quality Assessment for AFB smear microscopy. 2002. Available at: http://www.aphl.org/AboutAPHL/publications/Documents/External_Quality_Assessment_for_AFB_Smear_Microscopy.pdf. Accessed April 3rd 2016.

40 

Martinez A, Balandrano S, Parissi A, Zuniga A, Sanchez M, Ridderhof J, et al. Evaluation of new external quality assessment guidelines involving random blinded rechecking of acid-fast bacilli smears in a pilot project setting in Mexico. Int J Tuberc Lung Dis. 2005;9:301–5.

41 

Selvakumar N, Prabhakaran E, Rahman F, Chandu NA, Srinivasan S, Santha T, et al. Blinded rechecking of sputum smears for acid-fast bacilli to ensure the quality and usefulness of restaining smears to assess false positive errors. Int J Tuberc Lung Dis. 2003;7:1077–82.

42 

Nguyen TN, Wells CD, Binkin NJ, Becerra JE, Linh PD, Nyugen VC. Quality control of smear microscopy for acid-fast bacilli: the case for blinded re-reading. Int J Tuberc Lung Dis. 1999;3:55–61.

43 

Malik S, Hanif M, Chopra KK, Aggarwal N, Vashist RP. Evaluation of a new quality assessment strategy for blinded rechecking of random sputum smears for TB in Delhi, India. Southeast Asian J Trop Med Public Health. 2011;42:342–6.

44 

Selvakumar N, Murthy BN, Prabhakaran E, Sivagamasundari S, Vasanthan S, Perumal M, et al. Lot quality assurance sampling of sputum acid-fast bacillus smears for assessing sputum smear microscopy centers. J Clin Microbiol. 2005;43:913–5. https://doi.org/10.1128/JCM.43.2.913-915.2005

45 

Selvakumar N, Prabhakaran E, Murthy BN, Sivagamasundari S, Vasanthan S, Govindaraju R, et al. Application of lot sampling of sputum AFB smears for the assessment of microscopy centres. Int J Tuberc Lung Dis. 2005;9:306–9.

46 

Shiferaw MB, Hailu HA, Fola AA, Derebe MM, Kebede AT, Kebede AA, et al. Tuberculosis laboratory diagnosis quality assurance among public health facilities in West Amhara Region, Ethiopia. PLoS One. 2015;10:e0138488. https://doi.org/10.1371/journal.pone.0138488

47 

Shargie EB, Yassin MA, Lindtjorn B. Quality control of sputum microscopic examinations for acid fast bacilli in southern Ethiopia. Ethiop J Health Dev. 2005;19:104–8. https://doi.org/10.4314/ejhd.v19i2.9978

48 

Mundy CJF, Harries AD, Banerjee A, Salaniponi FM, Gilks CF, Squire SB. Quality assessment of sputum transportation, smear preparation and AFB microscopy in a rural district in Malawi. Int J Tuberc Lung Dis. 2002;6:47–54.

49 

Manalebh A, Demissie M, Mekonnen D, Abera B. The quality of sputum smear microscopy in public-private mix directly observed treatment laboratories in West Amhara Region, Ethiopia. PLoS One. 2015;10:e0123749. https://doi.org/10.1371/journal.pone.0123749

50 

Ayana DA, Kidanemariam ZT, Tesfaye HM, Milashu FW. External quality assessment for acid fast bacilli smear microscopy in eastern part of Ethiopia. BMC Res Notes. 2015;8:537. https://doi.org/10.1186/s13104-015-1478-0

51 

Buzingo T, Sanders M, Masabo JP, Nyandwi S, van Deun A. Systematic re-staining of sputum smears for quality control is useful in Burundi. Int J Tuberc Lung Dis. 2003;7:439–44.

52 

World Health Organization. Malaria microscopy: quality assurance manual, version 2. Geneva, Switzerland: WHO;2016. Available at: http://www.who.int/malaria/publications/malaria_microscopy_QA_manual.pdf?ua=1. Accessed March 3rd 2016.

53 

Nzitakera A, Ngizwenayo L, Niyonshuti G, Umubyeyi FK, Mwubahamana C, Njunwa KJ. Assessment of the inter-rater reliability of the microscopic diagnosis of malaria in three health centres of Kayonza District, Eastern Province, Rwanda. Rwanda Journal Series F: Medicine and Health Sciences 2015;242-6.

54 

Khan MA, Walley JD, Munir MA, Khan MA, Khokar NG, Tahir Z, et al. District level external quality assurance (EQA) of malaria microscopy in Pakistan: pilot implementation and feasibility. Malar J. 2011;10:45. https://doi.org/10.1186/1475-2875-10-45

55 

Wafula R, Sang E, Cheruiyot O, Aboto A, Menya D, O’Meara WP. Short report: high sensitivity and specificity of clinical microscopy in rural health facilities in western Kenya under an External Quality Assurance program. Am J Trop Med Hyg. 2014;91:481–5. https://doi.org/10.4269/ajtmh.14-0133

56 

Sarkinfada F, Aliyu Y, Chavasse C, Bates I. Impact of introducing integrated quality assessment for tuberculosis and malaria microscopy in Kano, Nigeria. J Infect Dev Ctries. 2009;3:20–7. https://doi.org/10.3855/jidc.101

57 

Thapa B, Koirala S, Upadhaya BP, Mahat K, Malla S, Shakya G. National external quality assurance scheme for HIV testing using dried blood spot: a feasibility study. SAARC J Tuber Lung Dis HIV/AIDS 2011;8:23-7.

58 

Manyazewal T, Paterniti AD, Redfield RR, Marinucci F. Role of secondary level laboratories in strengthening quality at primary level health facilities’ laboratories: an innovative approach to ensure accurate HIV, tuberculosis, and malaria test results in resource-limited settings. Diagn Microbiol Infect Dis. 2013;75:55–9. https://doi.org/10.1016/j.diagmicrobio.2012.09.020

59 

Vieira FD, Salem JI, Netto AR, Camargo SAD, Silva RRF, Moura LC, et al. Methodology for characterizing proficiency in interpreting sputum smear microscopy results in the diagnosis of tuberculosis. J Bras Pneumol. 2008;34:304–11. https://doi.org/10.1590/S1806-37132008000500010

60 

Dave PV, Patel ND, Rade K, Solanki RN, Patel PG, Patel P, et al. Proficiency panel testing - a reliable tool in external quality assessment of sputum smear microscopy services in Gujarat, India. Indian J Tuberc. 2011;58:113–9.

61 

Ayalew F, Tilahun B, Taye B. Performance evaluation of laboratory professionals on malaria microscopy in Hawassa Town, Southern Ethiopia. BMC Res Notes. 2014;7:839. https://doi.org/10.1186/1756-0500-7-839

62 

Ashraf S, Kao A, Hugo C, Christophel EM, Fatunmbi B, Luchavez J, et al. Developing standards for malaria microscopy: external competency assessment for malaria microscopists in the Asia-Pacific. Malar J. 2012;11:352. https://doi.org/10.1186/1475-2875-11-352

63 

Paramasivan CN, Venkataraman P, Vasanthan JS, Rahman F, Narayanan PR. Quality assurance studies in eight state tuberculosis laboratories in India. Int J Tuberc Lung Dis. 2003;7:522–7.

64 

Van Rie A, Fitzgerald D, Kabuya G, Van Deun A, Tabala M, Jarret N, et al. Sputum smear microscopy: evaluation of impact of training, microscopy distribution, and use of external quality assessment guidelines for resource-poor settings. J Clin Microbiol. 2008;46:897–901. https://doi.org/10.1128/JCM.01553-07

65 

Masanja IM, McMorrow ML, Maganga MB, Sumari D, Udhayakumar V, McElroy PD, et al. Quality assurance of malaria rapid diagnostic tests used for routine patient care in rural Tanzania: microscopy versus real-time polymerase chain reaction. Malar J. 2015;14:85. https://doi.org/10.1186/s12936-015-0597-3

66 

McMorrow ML, Masanja MI, Abdulla SMK, Kahigwa E, Kachur SP. Challenges in routine implementation and quality control of rapid diagnostic tests for malaria, Rufiji District, Tanzania. Am J Trop Med Hyg. 2008;79:385–90.

67 

Gillet P, Mukadi P, Vernelen K, Van Esbroek M, Muyembe JJ, Bruggeman C, et al. External Quality Assessment on the use of malaria rapid diagnostic tests in a non-endemic setting. Malar J. 2010;9:359. https://doi.org/10.1186/1475-2875-9-359

68 

Versteeg I, Mens PF. Development of a stable positive control to be used for quality assurance of rapid diagnostic tests for malaria. Diagn Microbiol Infect Dis. 2009;64:256–60. https://doi.org/10.1016/j.diagmicrobio.2009.03.012

69 

Aidoo M, Patel JC, Barnwell JW. Dried Plasmodium falciparum-infected samples as positive controls for malaria rapid diagnostic tests. Malar J. 2012;11:239. https://doi.org/10.1186/1475-2875-11-239

70 

Tamiru A, Boulanger L, Chang MA, Malone JL, Aidoo M. Field assessment of dried Plasmodium falciparum samples for malaria rapid diagnostic test quality control and proficiency testing in Ethiopia. Malar J. 2015;14:11. https://doi.org/10.1186/s12936-014-0524-z

71 

Steingart KR, Ramsay A, Pai M. Optimizing sputum smear microscopy for the diagnosis of pulmonary tuberculosis. Expert Rev Anti Infect Ther. 2007;5:327–31. https://doi.org/10.1586/14787210.5.3.327

72 

Rej R. Proficiency testing and external quality assurance: crossing borders and disciplines. Accred Qual Assur 2002;7:335–40. https://doi.org/10.1007/s00769-002-0513-8.

73 

Sciacovelli L, Secchiero S, Zardo L, Zaninotto M, Plebani M. External Quality Assessment: an effective tool for clinical governance in laboratory medicine. Clin Chem Lab Med. 2006;44:740–9. https://doi.org/10.1515/CCLM.2006.133

74 

Carter JY, Lema OE, Adhiambo CG, Materu SF. Developing external quality assessment programmes for primary health care level in resource-limited countries. Accred Qual Assur 2002;7:345-50. https://doi.org/10.1007/s00769-002-0510-y.

75 

Aziz M, Bretzel G. Use of a standardised checklist to assess peripheral sputum smear microscopy laboratories for tuberculosis diagnosis in Uganda. Int J Tuberc Lung Dis. 2002;6:340–9.

76 

Frean J. Microscopic images transmitted by mobile cameraphone. Trans R Soc Trop Med Hyg. 2007;101:1053–5. https://doi.org/10.1016/j.trstmh.2007.06.008

77 

Bellina L, Missoni E. Mobile cell-phones (M-phones) in telemicroscopy: increasing connectivity of isolated laboratories. Diagn Pathol. 2009;4:19. https://doi.org/10.1186/1746-1596-4-19

78 

Tuijn CJ, Hoefman BJ, van Beijma H, Oskam L, Chevrollier N. Data and image transfer using mobile phones to strengthen microscopy-based diagnostic services in low- and middle-income country laboratories. PLoS One. 2011;6:e28348. https://doi.org/10.1371/journal.pone.0028348