Special issue: External Quality Assessment in Laboratory Medicine

Original papers


Jane Y Carter*. External quality assessment in resource–limited countries. Biochemia Medica 2017;27(1):97-109.

Amref Health Africa Headquarters, Nairobi, Kenya
*Corresponding author: jane [dot] carter [at] amref [dot] org




Introduction: Health laboratory services are a critical component of national health systems but face major operational challenges in resource-limited (RL) settings. New funding for health systems strengthening in RL countries has increased the demand for diagnostics and provided opportunities to address these constraints. An approach to sustainably strengthen national laboratory systems in sub-Saharan African countries is the Strengthening Laboratory Management Toward Accreditation (SLMTA) programme. External Quality Assessment (EQA) is a requirement for laboratory accreditation. EQA comprises proficiency testing (PT), rechecking of samples and on-site evaluation.

Materials and methods: A systematic literature search was conducted to identify studies addressing laboratory EQA and quality monitoring in RL countries. Unpublished reports were also sought from national laboratory authorities and personnel.

Results: PT schemes in RL countries are provided by commercial companies, institutions in developed countries and national programmes. Most government-supported PT schemes address single diseases using a vertical approach. Regional approaches to delivering PT have also been implemented across RL countries. Rechecking schemes address mainly tuberculosis (TB), malaria and human immunodeficiency virus (HIV); integrated rechecking programmes have been piloted. Constraints include sample transportation, communication of results, unknown proficiency of referee staff and limited resources for corrective action. Global competency assessment standards for malaria microscopists have been established.

Conclusions: EQA is vital for monitoring laboratory performance and maintaining quality of laboratory services, and is a valuable tool for identifying and assessing technology in use, identifying gaps in laboratory performance and targeting training needs. Accreditation of PT providers and competency of EQA personnel must be ensured.

Key words: laboratory; quality assurance; proficiency testing; resource-limited countries


Received: May 03, 2016. Accepted: October 09, 2016.


Introduction

Medical and public health laboratory services are a critical component of national health systems and are central to disease diagnosis, treatment, prevention, surveillance and outbreak investigations (1). When used optimally, laboratory medicine generates knowledge that facilitates patient safety, improves patient outcomes and leads to more cost-effective healthcare (2). In the United States of America, laboratory testing influences 60–70% of critical decision-making in health, with community laboratories performing at least 50% of all testing (3). In the primary healthcare setting in resource-limited (RL) countries, laboratory testing may influence 45% of medical decision-making (4).

Until recently, allocation of resources to laboratory testing was of low priority for healthcare systems in many sub-Saharan African (SSA) countries. Unreliable and inaccurate laboratory diagnostic testing has promoted the perception that laboratory testing is unhelpful and may compromise patient care (5). Major challenges facing laboratory services in under-developed settings include weak infrastructure, human capacity shortages, and lack of laboratory policies, strategic plans and integrated national quality management systems (6,7). Funding opportunities for health systems strengthening in RL countries such as the Global Health Initiative, US President’s Emergency Plan for AIDS Relief, and Global Fund to Fight AIDS, Tuberculosis and Malaria have increased demand for diagnostics and provided opportunities to address this neglect and strengthen capacity of public health laboratory networks (8).

Microscopy remains an important laboratory diagnostic procedure in RL countries and provides the basis for managing and controlling several bacterial and parasitic infections, including tuberculosis (TB) and malaria. The absence or poor quality of malaria microscopy has long been recognised and is attributed to multiple factors, including skills of the laboratory workforce, workload, condition of microscopes and quality of laboratory supplies (9). Reyburn et al. found only 46.1% of 4474 patients treated for severe malaria in 10 hospitals in Tanzania had positive blood films (10); Zurovac et al. showed 68.6% sensitivity and 61.5% specificity of malaria microscopy in outpatients attending 17 government health facilities in two districts in Kenya (11); Kahama-Maro et al. found 71.4% sensitivity and 47.3% specificity of routine blood slides in 12 urban public health facilities in Dar es Salaam, Tanzania (12). Misclassification of malaria species occurs from lack of skills in microscopic diagnosis (13). Improving and monitoring malaria-related test performance in peripheral laboratories on a countrywide basis is achieved through training and supervision, but such programmes must be sustained by national commitment (14). However, focusing on improving diagnosis for one disease may lead to over-diagnosis of another illness (15). Perceptions and attitudes of healthcare providers regarding the quality of laboratory services also affect their correct use for patient management (16).
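The sensitivity and specificity figures quoted above follow from the standard 2 × 2 confusion-matrix definitions. A minimal illustration (the counts below are hypothetical, chosen only to show the calculation):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Return (sensitivity, specificity) from 2x2 diagnostic counts.

    sensitivity = TP / (TP + FN)  -- proportion of true cases detected
    specificity = TN / (TN + FP)  -- proportion of non-cases correctly ruled out
    """
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical malaria microscopy evaluation: 60 reference-positive slides of
# which 50 were read positive; 100 reference-negative slides of which 80 were
# read negative.
sens, spec = sensitivity_specificity(tp=50, fn=10, tn=80, fp=20)
print(f"sensitivity = {sens:.1%}, specificity = {spec:.1%}")
# prints: sensitivity = 83.3%, specificity = 80.0%
```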

Malaria rapid diagnostic tests (RDT) were introduced to address challenges of poor malaria microscopy and promote greater accessibility to malaria confirmation, but errors in performing RDT are widespread (17,18). Community health workers can use RDT safely and accurately for up to 12 months post-training (19). Standardised product performance evaluations that distinguish between well and poorly performing tests are also essential (20).

An innovative approach to sustainably strengthen national laboratory systems in RL countries in Africa is the World Health Organization Regional Office for Africa (WHO AFRO) Stepwise Laboratory Quality Improvement Process Towards Accreditation (SLIPTA) framework linked to the Strengthening Laboratory Management Toward Accreditation (SLMTA) training programme. SLMTA was introduced in 2009 to provide a systematic, user-friendly approach to achieving laboratory quality improvement and accreditation (21) and uses a competency-based, step-wise process addressing daily laboratory operations for immediate and measurable laboratory improvement, harmonised to International Organization for Standardization (ISO) 15189 standards (8). Accreditation is the internationally accepted framework for verifying that laboratories adhere to established quality and competence standards for accurate and reliable patient testing. In some countries, laboratory accreditation is mandatory; in others, accreditation remains voluntary and may be driven by market incentives (22).

The World Health Organization (WHO) defines External Quality Assessment (EQA) as a system for objectively checking a laboratory’s performance using an external agency or facility (23). EQA participation is associated with improved laboratory performance over time and is a requirement for accreditation (24). However, many professionals in SSA countries are unable to effectively implement quality improvement programmes and many countries remain without an accredited clinical laboratory (25). Establishing, maintaining and demonstrating the accuracy of diagnostic tests is a major challenge for most laboratories in SSA, and the complexity and cost of setting up and maintaining quality assurance systems mean that very few laboratories, mainly tertiary or privately owned, can achieve these standards (26,27). The common perception that EQA is costly or unnecessary has hindered the widespread enrolment of laboratories into EQA programmes (28).

EQA can be applied in three main ways, each with advantages and disadvantages: 1) proficiency testing (PT), in which an external provider sends samples with undisclosed results to laboratories or individual testers and provides feedback on results; 2) rechecking or retesting of samples in higher-level or peer laboratories (inter-laboratory comparison); 3) on-site assessment by approved evaluators (23).


Materials and methods


A systematic search of published literature was conducted to identify studies addressing quality monitoring and laboratory performance in RL countries. RL countries were defined as countries with low- and lower-middle-income economies (29). Unpublished reports were sought from national laboratory authorities and personnel for further information.


Results

Proficiency testing schemes

Several laboratories in RL countries are enrolled in international or regional PT schemes operated by commercial companies. These offer a wide range of discipline-specific schemes suitable mainly for reference or larger laboratories. Advantages include a high level of participation with different analytical methods, thereby increasing statistical validity. However, government laboratories in RL countries, unless partner supported, may be unable to sustain participation due to high cost. Some institutions in developed countries operate PT schemes that also support RL countries at cost. These include the United Kingdom National EQA Scheme (UK NEQAS) that provides a wide range of PT schemes and educational support; and the African Regional External Quality Assessment Scheme implemented by the National Health Laboratory Service (NHLS), South Africa, that offers a range of PT materials including stable inactivated mycobacterial dried culture spot (DCS) material for GeneXpert instruments performing the Xpert MTB/RIF assay (Cepheid, Sunnyvale, CA) (30).

Table 1 summarises evaluations of PT schemes in developed countries supporting RL countries. The National Institute for Communicable Diseases (NICD), South Africa, has provided PT support to national public health laboratories and related facilities in RL countries in the WHO AFRO and Eastern Mediterranean regions since 2002, supported by WHO AFRO and the WHO Office in Lyon, France. Levels of participation and performance were sub-optimal with no upward trend in performance over time across all diseases despite several scheme-associated training visits provided by NICD and WHO AFRO (31).


Table 1. Proficiency testing schemes in developed countries supporting RL countries



The Quality Assessment and Standardization for Immunological Measures Relevant to HIV/AIDS (QASI) programme, Canada, was established in 1997 to provide PT for CD4 enumeration at no or low cost to RL laboratories. An impact study demonstrated that PT programmes can improve overall laboratory performance despite the diversity of technologies employed (32). An impact study of CD4 EQA provided by the NHLS, South Africa, from 2002 to 2006 demonstrated how EQA provides an opportunity for post-market surveillance, standardising protocols and switching operating systems (33). A regional PT programme for TB smear microscopy conducted between 2003 and 2010 across South Asian Association for Regional Cooperation (SAARC) member states demonstrated the ongoing quality of diagnostic support to countries’ TB control programmes (34). The East African Regional External Quality Assessment Scheme (EA-REQAS), established in 2008 by the East African ministries of health, provides integrated PT panels for basic laboratory tests. By 2015, 16 surveys had been conducted, with enrolment increasing from 195 to 559 laboratories in four countries. Materials include blood slides for malaria and other blood parasites, smears for TB and Gram stain, preserved stool parasites, blood lysate for haemoglobin measurement, blood films for peripheral blood cell morphology, and HIV and syphilis serology (35).

Many RL countries operate national disease-specific PT programmes. Mukadi et al. described four studies in the Democratic Republic of Congo (DRC) between 2010 and 2014; three involved submission of single panels of blood slides for microscopy to 183, 356 and 445 laboratories respectively (36-38). Panel composition and results are presented in Table 2.


Table 2. Blood slide proficiency testing studies in Democratic Republic of Congo 



In the second study, there was a lower frequency of serious errors in assessing parasite density (17.2% vs. 52.3%, P < 0.001) and reporting false-positive results (19.0% vs. 33.3%, P < 0.001) among laboratories that had participated in the first study; laboratories with a high number of sleeping sickness cases recognised trypanosomes more frequently (57.0% vs. 31.2%, P < 0.001). In the third study, the proportion of correct or acceptable scores was higher among EQA-experienced participants than among first-time participants (40.9% vs. 22.4%, P = 0.001), and higher among those trained < 2 years previously than among those not trained (42.9% vs. 26.3%, P = 0.01). The fourth study analysed interpretation of photographs of RDT results by laboratory workers; the most frequent errors were failure to recognise invalid or negative test results, overlooking faint test lines and incorrectly identifying malaria parasite species (17).


Rechecking and mixed EQA schemes

Many RL countries operate slide-rechecking schemes as part of national malaria, TB and HIV control programmes. Malaria and TB slide rechecking is performed by supervisors on-site, at next-level health facilities or at central reference laboratories; rechecking of HIV RDT results is usually performed at central level. Since the blanket approach of rechecking 100% of positive and 10% of negative acid-fast bacilli (AFB) slides was withdrawn (39), most studies reported using standard lot quality assurance sampling (LQAS) based on annual laboratory volume of AFB smears and the proportion of positive smears to achieve an overall sensitivity of 75–85%, together with processes for blinded rechecking. A significantly greater percentage of errors is detected on randomly selected, blinded AFB smears than on non-randomly selected, un-blinded smears; knowledge of prior results influences re-reading of TB slides (40-42).
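The LQAS logic can be illustrated with a zero-acceptance sampling calculation. This is a simplified sketch of the statistical principle only, not the published WHO/CDC sample-size worksheet (which additionally adjusts for smear positivity rate and a critical sensitivity threshold); the lot size and error count below are hypothetical:

```python
def lqas_sample_size(population, defectives, beta):
    """Smallest sample size n for a zero-acceptance LQAS plan.

    Finds the smallest n such that the probability of drawing 0 defective
    items from a lot of `population` items containing `defectives` errors
    is <= beta (i.e. power of detection >= 1 - beta). Hypergeometric:
        P(0 errors in n draws) = C(population - defectives, n) / C(population, n)
    computed incrementally as a product of ratios.
    """
    p0 = 1.0
    for n in range(1, population - defectives + 1):
        i = n - 1
        p0 *= (population - defectives - i) / (population - i)
        if p0 <= beta:
            return n
    # If even all non-defective items can be drawn without exceeding beta,
    # one more draw is guaranteed to hit a defective.
    return population - defectives + 1

# Hypothetical lot: 1000 negative smears per year; unacceptable performance
# defined as >= 50 false negatives (5%); 80% power to detect (beta = 0.2).
print(lqas_sample_size(1000, 50, 0.2))
```

Sampling without replacement (hypergeometric) needs slightly fewer slides than the binomial approximation would suggest, which matters for the small annual volumes typical of peripheral laboratories.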

Table 3 summarises studies assessing laboratory performance through rechecking of AFB smears, malaria slides and HIV RDT results; some studies used mixed EQA methods. Malik et al. reported that random blinded rechecking of AFB smears is feasible for large-scale application, with 40,506 slides rechecked over a two-year period at 24 sites in New Delhi, India (43).


Table 3. Summary of rechecking and mixed EQA studies


Selvakumar et al. reported that rechecking 8 slides per month per microscopy centre was sufficient to assess performance of sputum smear microscopy in 12 centres in one district in India, based on a sensitivity of 80%, specificity of 100%, slide positivity rate of 10% and annual negative smear volume of 1000–5000 slides (44). They also noted that the discrepancy between microscopy centres and district supervisors was higher (4.7%) than that with the reference laboratory (1%), suggesting lack of training and motivation at district level (45). Shiferaw et al. noted that false negative results during rechecking were associated with unavailability of microscope lens cleaning solution and dirty smears, and false positive results with no previous EQA experience (46). Shargie et al. suggested that disagreements during rechecking between regional and national reference readers are likely due to the low positivity of discordant slides (47). Mundy et al. established excellent concordance (98.2%) on rechecking TB smears after instituting basic internal quality control in one district in Malawi (48). Manalebh et al. studied three EQA methods to assess AFB smear microscopy performance in Ethiopia: on-site evaluation, blinded rechecking and panel testing; poor staining and low positivity were major factors affecting PT performance (49). Ayana et al. employed similar mixed approaches to assess AFB smear microscopy at university practical attachment sites in Ethiopia; poor working environments were a major contributor to poor performance (50). Some studies noted that re-staining during rechecking at national level significantly improved sensitivity for low-positive AFB smears (41,51).

Random monthly selection of five low density and five negative malaria slides has been proposed for routine rechecking (52). A study from Rwanda reported excellent quality of malaria slide re-reading without grading of positive slides (53). Khan et al. studied malaria slide reading performance in Pakistan after instituting three activities: on-site rechecking of malaria slides, uninterrupted availability of laboratory reagents and supplies, and supervision by district supervisors (54); Wafula et al. instituted four concurrent activities in Kenya: re-reading of malaria films; feedback on quality of slide preparation; preparation of monthly laboratory performance reports; and on-the-job mentorship (55). These studies demonstrated the beneficial effects of additional support on performance. Sarkinfada et al. described systematic integration of malaria slide rechecking within an existing TB microscopy-rechecking scheme in Nigeria. Similar systems were adopted for recording results, slide sampling and storage, feedback mechanisms and monitoring visits to laboratories. Use of the same microscopists and microscopes, and joint training and supervision were feasible at laboratory level but successfully integrating these schemes requires combined TB and malaria management support at national level (56). Thapa et al. used dried blood spots (DBS) for repeat HIV RDT and enzyme-linked immunosorbent assay (ELISA) testing at reference level in Nepal and demonstrated superior sensitivity and specificity of ELISA testing (57). Manyazewal et al. integrated TB and malaria slide rechecking with HIV retesting in Ethiopia and emphasised the role secondary laboratories can play in assuring laboratory quality at primary level to avoid dependence on remotely located national and regional laboratories (58).


Competency assessment of slide readers

Vieira et al. proposed double-blind sputum smear microscopy readings using a panel of 75 slides (36 negative, 4 inconclusive, 35 positive) to assess proficiency of readers and supervisors in TB smear microscopy (59). Dave et al. assessed technical performance of laboratory supervisors using testing panels comprising 5 heat-fixed unstained sputum slides (including TB positives of any grade and negatives) for staining and examination; only 5% of 295 readers reported any type of error (60). Ayalew et al. provided 80 laboratory professionals from public and private health facilities in Ethiopia with 10 panel slides containing P. falciparum, P. vivax, mixed species and negative slides. Overall agreement in malaria parasite detection and species identification was 88% and 74.3% respectively; agreement was lower in slides with low parasite density and mixed infection (61). A multi-country network to address competency of malaria microscopists was established in Asia in 2003, with five-day courses to assess competency in parasite detection, species identification and parasite quantitation conducted by an external facilitator using reference slide panels from the regional slide bank at the Research Institute for Tropical Medicine, Manila, Philippines. By 2011, 60 competency assessment exercises had been conducted in 14 countries; microscopists from 5 countries showed significant improvements in performance scores (62). This programme has translated into globally recognised standards of competency in malaria microscopy (52).



EQA schemes are valuable for recognising laboratory errors and identifying underlying problems facing peripheral laboratories (50), but participation in PT schemes may not by itself improve performance. Mandy et al. and Mukadi et al. noted improved performance after several rounds of participation in PT (32,37); however, Paramasivan et al. noted no significant improvement after submitting five rounds of TB panels, even after communicating observed deficiencies to participants (63). Frean et al. noted little improvement in performance over time in the NICD-supported PT programme (31). Van Rie et al. evaluated the effects of 5-day refresher training for laboratory technicians and distribution of new microscopes on the quality of TB smear microscopy in 13 primary healthcare laboratories in Kinshasa, DRC, through blinded rechecking of slides; no long-term effect was demonstrated, with major errors occurring in 10 (77%) clinics after 9 months (64).



Despite the widespread use of malaria RDT in RL countries, tools available for monitoring field performance are limited. Masanja et al. compared RDT performance with malaria reference microscopy and detection of parasite DNA by real-time quantitative polymerase chain reaction (qPCR) on DBS. Malaria RDT had a higher positivity rate (6.5%) than qPCR (4.2%) or microscopy (2.5–2.9%), with poor correlation between RDT and microscopy results. Overall, agreement among the three diagnostic approaches was limited and neither microscopy nor qPCR was suitable for RDT quality monitoring under field conditions (65). McMorrow et al. noted that quality monitoring of RDTs using poor quality blood films might undermine confidence in RDT use (66). Blood samples containing malaria antigens for RDT PT must be stored at 4 °C for a maximum of 48 hours, or at –70 °C for longer periods (67). Methods for preparing stable positive controls from cultured P. falciparum using DBS and dried tube specimens (DTS) have been demonstrated (68,69). Tamiru et al. reported performance of DTS PT challenges at 1000 and 500 parasites/µL in a field trial in Ethiopia; false negative DTS at 500 parasites/µL were reported after 24 weeks’ storage due to errors in interpreting faint test lines (70). Aggett reported stability of CD4 PT samples as only 7 days at ambient temperature and 14 days when refrigerated (32). There are no internationally agreed EQA methods for blinded rechecking of fluorescent TB smears (71).

Studies have highlighted the difficulties in RL countries of transporting panels to peripheral sites and returning results, or submitting samples for rechecking to reference laboratories. Mukadi et al. used a combination of on-site delivery by car and private air carrier to provincial airports in DRC (38) with responses using short messaging services (SMS) (17). The NICD, South Africa, scheme used express air courier (31). Motorcycles were used to support EQA of malaria microscopy in Pakistan (54). There are obvious logistical advantages to using district-level laboratories to conduct rechecking with reduced distances and costs, and ease of feedback to laboratories (58).



Published information on the costs of operating EQA programmes in RL settings is limited. Khan et al. reported a capital cost of approximately 90,000 Pakistani rupees (USD 1400) to implement a district malaria-microscopy EQA scheme, including motorcycles and training of district laboratory supervisors. This amounted to a 50% increase in the direct per-slide cost, but Khan argued that this is marginal compared with the high capital and recurrent costs of microscopy services, and is justified by the scheme’s value in rationalising anti-malarial drug use (54). Mukadi et al. reported that the cost of conducting one PT survey in DRC was USD 10,000 excluding salaries (USD 25 per participant) (38).


Discussion

Countries with developing healthcare systems mostly lie in tropical areas where common diseases, such as malaria and diarrhoeal diseases, require immediate diagnosis (within a few hours). Coupled with poor communication across large distances, this necessitates placing laboratory testing close to where patients seek care, resulting in large numbers of small laboratories working independently. Most small laboratories still perform mainly manual assays, which are particularly prone to errors during sample collection, labelling and registration; and many laboratory staff at this level lack skills in recognising pathology, and transcribing and delivering results. Combined, these errors can lead to significant variance in the accuracy of results, leading to incorrect diagnosis, inappropriate treatment or withholding of lifesaving therapy (22). Lack of adequate resources to support these laboratory networks has resulted in equipment breakdowns, interruption of supplies and variable performance. For many laboratories across RL countries, the quality of services is unknown.

Participation in EQA is ideally required for all testing procedures performed in a laboratory. Where an established PT scheme is not available, alternate EQA mechanisms should be considered. All EQA approaches depend for their effectiveness on following national or regional protocols, good communication and feedback, and instituting corrective measures. The benefits of EQA schemes rest on mandatory participation, timely return of results with practical suggestions for corrective action and ability of participating laboratories to address deficiencies. Nothing is gained from EQA participation unless information received is directed to laboratory improvement (23).

PT programmes may be organised at national, regional and international levels and funded through government agencies, as arms of corporations, or operated on a cost-recovery basis. Most government-supported schemes in RL countries address single diseases using a vertical approach, such as malaria, TB and HIV infection; developed country and commercial schemes address a range of laboratory disciplines. Some PT schemes focus on different laboratory system levels, such as the NICD scheme (national public health laboratories) and EA-REQAS (primary level laboratories). Commercial PT providers and developed country PT schemes are accredited to ISO 17043:2010 standards, but few PT schemes in RL countries are ISO compliant.

Rechecking schemes are commonly incorporated into national disease control programmes. WHO recommends integrating malaria microscopy rechecking with other microscopically diagnosed communicable diseases, which is feasible with proper coordination at peripheral and national levels (52,56). Rechecking schemes require laboratories to follow correct procedures to avoid slide selection bias. Laboratory workers may retain slides of good quality and staining regardless of instructions; some laboratories may not submit slides for rechecking due to lack of confidence in their performance or uncertainty about the implications of unsatisfactory performance (47,49). Few reports indicated the competency of staff conducting the rechecking process; most relied on concordance of slide reading to determine accuracy. This can be addressed by including reference-centre rechecking that assesses both peripheral laboratory technicians and their immediate supervisors, and by conducting regular competency assessments of slide readers. Various standards have been proposed to assess competency in TB slide reading, but only malaria microscopy has an established, globally recognised competency assessment programme (52,59,60). Regular competency assessment of supervisors and reference-level staff urgently needs to be incorporated into national EQA programmes.

A process of rechecking can also be applied to rapid testing assays, such as RDT for HIV, where an alternative technique, such as enzyme immunoassay (EIA) or ELISA is used on dried blood or serum spot samples (57). Rechecking of samples by peer or higher level laboratories (inter-laboratory comparison) is appropriate for specialised tests for which no PT schemes exist, or for single unusual results; however, no published studies were identified demonstrating use or benefits of inter-laboratory comparisons in RL countries. On-site visits to laboratories by qualified auditors using standard checklists also provide a reliable EQA mechanism by implementing practical improvements to address identified gaps.

PT schemes are limited by the availability of stable PT materials that can withstand conditions of heat and humidity often found in RL countries, but have the unique advantage of being able to address uncommon pathology for which staff need to retain competence. However, PT samples are constrained by being unable to provide challenges that mimic some patient samples, such as living organisms and cells (72). PT panels may be placed within a clinical context and involve other health worker cadres in responses (73,74). Although all EQA programmes can provide inter-laboratory comparisons (benchmarking) when adequate data management systems are in place, this aspect is particularly suited to PT programmes where data are collected and analysed centrally. PT schemes use either referee laboratories or consolidated results from participants to set target values; in RL countries use of participant results may lead to lowering of precision when new participants join a scheme, but precision is usually restored once participants become more experienced (33).
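The participant-consensus approach to setting target values is typically operationalised by scoring each laboratory against a robust estimate of the consensus centre and spread. A minimal sketch, assuming median/MAD robust statistics and conventional z-score cut-offs (the results data are hypothetical):

```python
from statistics import median

def pt_z_scores(results):
    """z-score each participant result against the consensus.

    Assigned value = median of all results; robust SD = 1.4826 * median
    absolute deviation (a robust scale estimator resistant to outliers,
    so one aberrant laboratory does not distort everyone's scores).
    """
    assigned = median(results)
    mad = median([abs(r - assigned) for r in results])
    robust_sd = 1.4826 * mad
    return [(r - assigned) / robust_sd for r in results]

# Hypothetical haemoglobin PT round (g/dL) with one outlier laboratory:
results = [5.0, 5.2, 4.9, 5.1, 9.0]
zs = pt_z_scores(results)

# Conventional interpretation: |z| <= 2 satisfactory,
# 2 < |z| < 3 questionable, |z| >= 3 unsatisfactory.
flags = ["unsatisfactory" if abs(z) >= 3 else "satisfactory" for z in zs]
```

Using the median rather than the mean as the assigned value illustrates why consensus-based schemes remain usable even when inexperienced participants return outlying results.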

Although every laboratory should treat PT samples as routine samples, this is impossible to monitor and enforce. Most laboratories in RL settings pay special attention to PT samples, especially for pathology recognition by microscopy. PT results are therefore likely to be the very best that a laboratory can produce; poor results may reflect even worse performance under routine conditions. Participation in PT schemes should not be punitive, but should be regarded as an educational tool to objectively assess laboratory performance and direct improvement efforts. Regular participation is the first step to using PT as an effective tool for laboratory improvement, and benefits will accrue if laboratories review results and possible causes of errors, re-examine samples, keep records of performance and use schemes as group learning exercises. Keeping health facility managers and authorities informed of reasons for poor performance provides the justification for allocating resources to maintain quality. At central level, PT schemes can provide ongoing post-market surveillance of commercial test kits, quantify errors associated with a particular technology, identify the best technologies to use, identify training needs and indicate the need for governments to validate and standardise equipment and methodologies (33).

Many published reports present unacceptable laboratory performance in EQA in RL countries, indicating the vital importance of ongoing monitoring and corrective actions. Implementation of corrective actions is primarily the responsibility of laboratory personnel and management and is dependent on established hierarchical supervisory structures. Supervisors making on-site visits can assess pre- and post-analytical aspects of laboratory procedures, and address technical performance through mentorship and ensuring functional equipment and adequate supplies; only a field visit can convey a realistic picture of the conditions under which technicians work (63). Supervisory visits are more effective when standard checklists are used systematically (75). Visits by effective supervisors are highly motivating for laboratory workers, but are time-consuming and expensive when considered across thousands of small laboratories; regular visits may not be sustained unless linked with PT performance to target poorly performing laboratories. Some PT schemes offer assistance with training and corrective action or link with entities that provide this support (31,33).

Novel applications using mobile technology may enhance the reach and benefits of EQA programmes in RL countries. Mobile camera phones can capture and transmit images directly from the eyepiece of an ordinary laboratory microscope to a central review site for feedback (76-78). The rapid expansion of mobile networks and internet coverage, and decreasing operational costs, offer opportunities to develop PT programmes that provide images of rare pathology or pathology that cannot be mass produced, such as histology specimens or organisms found in spleen or bone marrow. Several PT providers in developed countries are already implementing this approach.




Acknowledgments

The author would like to thank Professor Michael Noble, Department of Pathology and Laboratory Medicine, University of British Columbia (UBC) and Chair of the UBC Program Office for Laboratory Quality Management for reading through the manuscript and providing valuable suggestions.


Potential conflict of interest

None declared.




References

 1. Nkengasong JN. Strengthening laboratory services and systems in resource-poor countries. Am J Clin Pathol 2009;131:774.

 2. Beastall GH. Adding value to laboratory medicine: a professional responsibility. Clin Chem Lab Med 2013;51:221-7.

 3. Forsman RW. Why is the laboratory an afterthought for managed care organizations? Clin Chem 1996;42:813-6.

 4. Carter JY, Lema OE, Wangai MW, Munafu CG, Rees PH, Nyamongo JA. Laboratory testing improves diagnosis and treatment outcomes in primary health care facilities. Afr J Lab Med 2012;1.

 5. Petti CA, Polage CR, Quinn TC, Ronald AR, Sande MA. Laboratory medicine in Africa: a barrier to effective health care. Clin Infect Dis 2006;42:377-82.

 6. Birx D, de Souza M, Nkengasong J. Laboratory challenges in the scaling up of HIV, TB, and malaria programs. The interaction of health and laboratory systems, clinical research, and service delivery. Am J Clin Pathol 2009;131:849-51.

 7. Vitoria M, Granich R, Gilks CF, Gunneberg C, Hosseini M, Were W, et al. The Global Fight Against HIV/AIDS, Tuberculosis, and Malaria. Current status and future perspectives. Am J Clin Pathol 2009;131:844-8.

 8. Nkengasong JN, Nsubuga P, Nwanyanwu O, Gershy-Damet G-M, Roscigno G, Bulterys M, et al. Laboratory systems and services are critical in global health. Time to end the neglect? Am J Clin Pathol 2010;134:368-73.

 9. Wongsrichanalai C, Barcus MJ, Muth S, Sutamihardja A, Wernsdorfer WH. A review of malaria diagnostic tools: microscopy and rapid diagnostic test (RDT). Am J Trop Med Hyg 2007;77:119-27.

10. Reyburn H, Mbatia R, Drakeley C, Carneiro I, Mwakasungula E, Mwerinde O, et al. Overdiagnosis of malaria in patients with severe febrile illness in Tanzania: a prospective study. BMJ 2004;329:1212-17.

11. Zurovac D, Midia B, Ochola SA, English M, Snow RW. Microscopy and outpatient malaria case management among older children and adults in Kenya. Trop Med Int Health 2006;11:432-40.

12. Kahama-Maro J, D’Acremont V, Mtasiwa D, Genton B, Lengeler C. Low quality of routine microscopy for malaria at different levels of the health system in Dar es Salaam. Malaria J 2011;10:332.

13. Obare P, Ogutu B, Adams M, Odera JS, Lilley K, Dosoo D, et al. Misclassification of Plasmodium infections by conventional microscopy and the impact of remedial training on the proficiency of laboratory technicians in species identification. Malaria J 2013;12:113.

14. Bates I, Bekoe V, Asamoa-Adu A. Improving the accuracy of malaria-related laboratory tests in Ghana. Malaria J 2004;3:38.

15. Mosha JF, Conteh L, Tediosi F, Gesase S, Bruce J, Chandramohan D, et al. Cost implications of improving malaria diagnosis: findings from north-eastern Tanzania. PLoS One 2010;5:e8707.

16. Derua YA, Ishengoma DRS, Rwegoshora RT, Tenu F, Massaga JJ, Mboera LEG, et al. Users’ and health service providers’ perception on quality of laboratory malaria diagnosis in Tanzania. Malaria J 2011;10:78.

17. Mukadi P, Gillet P, Lukuka A, Mbatshi J, Otshudiema J, Muyembe JJ, et al. External Quality Assessment of reading and interpretation of malaria rapid diagnostic tests among 1849 end-users in the Democratic Republic of the Congo through Short Message Service (SMS). PLoS One 2013;8:e71442.

18. Seidahmed OME, Mohamedein MMN, Elsir AA, Ali FT, Malik EFM, Ahmed ES. End-user errors in applying two malaria rapid diagnostic tests in a remote area of Sudan. Trop Med Intern Health 2008;13:406-9.

19. Counihan H, Harvey SA, Sekeseke-Chinyama M, Hamainza B, Banda R, Malambo T, et al. Community health workers use malaria rapid diagnostic tests (RDTs) safely and accurately: results of a longitudinal study in Zambia. Am J Trop Med Hyg 2012;87:57-63.

20. WHO, FIND, CDC. Malaria rapid diagnostic test performance: results of WHO product testing of malaria RDTs: round 5 (2013). Geneva: World Health Organization;2014.

21. Alemnji GA, Zeh C, Yao K, Fonjungo PN. Strengthening national health laboratories in sub-Saharan Africa: a decade of remarkable progress. Trop Med Int Health 2014;19:450-8.

22. Peter TF, Rotz PD, Blair DH, Khine A-A, Freeman RR, Murtagh MM. Impact of laboratory accreditation on patient care and the health system. Am J Clin Pathol 2010;134:550-5.

23. World Health Organization. 2011. Overview of external quality assessment (EQA): module 10, content sheet 10-1. WHO, Geneva, Switzerland. Available at: b_eqa_contents.pdf. Accessed April 3rd 2016.

24. Noble MA. Does external evaluation of laboratories improve patient safety? Clin Chem Lab Med 2007;45:753-5.

25. Mesfin EA, Taye B, Belay G, Ashenafi A. The status of medical laboratory towards of AFRO WHO accreditation process in government and private health facilities in Addis Ababa, Ethiopia. Pan African Med J 2015;22:136.

26. Bates I, Maitland K. Are laboratory services coming of age in sub-Saharan Africa? Clin Infect Dis 2006;42:383-4.

27. Kibet E, Moloo Z, Ojwang PJ, Sayed S, Mbuthia A, Adam RD. Measurement of improvement achieved by participation in international laboratory accreditation in sub-Saharan Africa. The Aga Khan University Hospital Nairobi experience. Am J Clin Pathol 2014;141:188-95.

28. Peter T, Badrichani A, Wu E, Freeman R, Ncube B, Ariki F, et al. Challenges in implementing CD4 testing in resource-limited settings. Cytometry B Clin Cytom 2008;74:S123–130.

29. World Bank. New country classifications. Available at: Accessed April 3rd 2016.

30. Gous N, Isherwood LE, David A, Stevens W, Scott LE. A pilot evaluation of external quality assessment of GenoType MTBDRplus versions 1 and 2 using dried culture spot material. J Clin Microbiol 2015;53:1365–7.

31. Frean J, Perovic O, Fensham V, McCarthy K, von Gottberg A, de Gouveia L, et al. External quality assessment of national public health laboratories in Africa, 2002–2009. Bull World Health Organ 2012;90:191–9A.

32. Mandy F, Bergeron M, Houle G, Bradley J, Fahey J. Impact of the international program for Quality Assessment and Standardization for Immunological Measures Relevant to HIV/AIDS: QASI. Cytometry 2002;50:111-6.

33. Aggett H. The impact of a CD4 External Quality Assessment Programme for Southern Africa and Africa. Johannesburg: University of Witwatersrand; 2009.

34. Jha K, Thapa B, Salhotra V, Afridi N. Panel testing of sputum smear microscopy of national tuberculosis reference laboratories in SAARC region: 2003-2010. SAARC J Tuber Lung Dis HIV/AIDS 2011;8:31-5.

35. Munene S, Songok J, Munene D, Carter J. Implementing a regional integrated laboratory proficiency testing scheme for peripheral health facilities in East Africa. Biochem Med (Zagreb) 2017;27:110-3.

36. Mukadi P, Gillet P, Lukuka A, Atua B, Kahodi S, Lokombe J, et al. External quality assessment of malaria microscopy in the Democratic Republic of the Congo. Malaria J 2011;10:308.

37. Mukadi P, Gillet P, Lukuka A, Atua B, Sheshe N, Kanza A, et al. External quality assessment of Giemsa-stained blood film microscopy for the diagnosis of malaria and sleeping sickness in the Democratic Republic of the Congo. Bull World Health Organ 2013;91:441-8.

38. Mukadi P, Lejon V, Barbé B, Gillet P, Nyembo C, Lukuka A, et al. Performance of microscopy for the diagnosis of malaria and human African trypanosomiasis by diagnostic laboratories in the Democratic Republic of the Congo: results of a nation-wide external quality assessment. PLoS One 2016;11:e0146450.

39. APHL, CDC, IUATLD, KNCV, RIT, WHO. External Quality Assessment for AFB smear microscopy. 2002. Available at: External_Quality_Assessment_for_AFB_Smear_Microscopy.pdf. Accessed April 3rd 2016.

40. Martinez A, Balandrano S, Parissi A, Zuniga A, Sanchez M, Ridderhof J, et al. Evaluation of new external quality assessment guidelines involving random blinded rechecking of acid-fast bacilli smears in a pilot project setting in Mexico. Int J Tuberc Lung Dis 2005;9:301-5.

41. Selvakumar N, Prabhakaran E, Rahman F, Chandu NA, Srinivasan S, Santha T, et al. Blinded rechecking of sputum smears for acid-fast bacilli to ensure the quality and usefulness of restaining smears to assess false positive errors. Int J Tuberc Lung Dis 2003;7:1077-82.

42. Nguyen TN, Wells CD, Binkin NJ, Becerra JE, Linh PD, Nyugen VC. Quality control of smear microscopy for acid-fast bacilli: the case for blinded re-reading. Int J Tuberc Lung Dis 1999;3:55-61.

43. Malik S, Hanif M, Chopra KK, Aggarwal N, Vashist RP. Evaluation of a new quality assessment strategy for blinded rechecking of random sputum smears for TB in Delhi, India. Southeast Asian J Trop Med Public Health 2011;42:342–6.

44. Selvakumar N, Murthy BN, Prabhakaran E, Sivagamasundari S, Vasanthan S, Perumal M, et al. Lot quality assurance sampling of sputum acid-fast bacillus smears for assessing sputum smear microscopy centers. J Clin Microbiol 2005;43:913-5.

45. Selvakumar N, Prabhakaran E, Murthy BN, Sivagamasundari S, Vasanthan S, Govindaraju R, et al. Application of lot sampling of sputum AFB smears for the assessment of microscopy centres. Int J Tuberc Lung Dis 2005;9:306-9.

46. Shiferaw MB, Hailu HA, Fola AA, Derebe MM, Kebede AT, Kebede AA, et al. Tuberculosis laboratory diagnosis quality assurance among public health facilities in West Amhara Region, Ethiopia. PLoS One 2015;10:e0138488.

47. Shargie EB, Yassin MA, Lindtjorn B. Quality control of sputum microscopic examinations for acid fast bacilli in southern Ethiopia. Ethiop J Health Dev 2005;19:104-8.

48. Mundy CJF, Harries AD, Banerjee A, Salaniponi FM, Gilks CF, Squire SB. Quality assessment of sputum transportation, smear preparation and AFB microscopy in a rural district in Malawi. Int J Tuberc Lung Dis 2002;6:47-54.

49. Manalebh A, Demissie M, Mekonnen D, Abera B. The quality of sputum smear microscopy in public-private mix directly observed treatment laboratories in West Amhara Region, Ethiopia. PLoS One 2015;10:e0123749.

50. Ayana DA, Kidanemariam ZT, Tesfaye HM, Milashu FW. External quality assessment for acid fast bacilli smear microscopy in eastern part of Ethiopia. BMC Res Notes 2015;8:537.

51. Buzingo T, Sanders M, Masabo JP, Nyandwi S, van Deun A. Systematic re-staining of sputum smears for quality control is useful in Burundi. Int J Tuberc Lung Dis 2003;7:439-44.

52. World Health Organization. Malaria microscopy: quality assurance manual, version 2. Geneva, Switzerland: WHO; 2016. Available at: Accessed March 3rd 2016.

53. Nzitakera A, Ngizwenayo L, Niyonshuti G, Umubyeyi FK, Mwubahamana C, Njunwa KJ. Assessment of the inter-rater reliability of the microscopic diagnosis of malaria in three health centres of Kayonza District, Eastern Province, Rwanda. Rwanda Journal Series F: Medicine and Health Sciences 2015;242-6.

54. Khan MA, Walley JD, Munir MA, Khan MA, Khokar NG, Tahir Z, et al. District level external quality assurance (EQA) of malaria microscopy in Pakistan: pilot implementation and feasibility. Malaria J 2011;10:45.

55. Wafula R, Sang E, Cheruiyot O, Aboto A, Menya D, O’Meara WP. Short report: high sensitivity and specificity of clinical microscopy in rural health facilities in western Kenya under an External Quality Assurance program. Am J Trop Med Hyg 2014;91;481-5.

56. Sarkinfada F, Aliyu Y, Chavasse C, Bates I. Impact of introducing integrated quality assessment for tuberculosis and malaria microscopy in Kano, Nigeria. J Infect Dev Ctries 2009;3:20-7.

57. Thapa B, Koirala S, Upadhaya BP, Mahat K, Malla S, Shakya G. National external quality assurance scheme for HIV testing using dried blood spot: a feasibility study. SAARC J Tuber Lung Dis HIV/AIDS 2011;8:23-7.

58. Manyazewal T, Paterniti AD, Redfield RR, Marinucci F. Role of secondary level laboratories in strengthening quality at primary level health facilities’ laboratories: an innovative approach to ensure accurate HIV, tuberculosis, and malaria test results in resource-limited settings. Diagn Microbiol Infect Dis 2013;75:55-9.

59. Vieira FD, Salem JI, Netto AR, Camargo SAD, Silva RRF, Moura LC et al. Methodology for characterizing proficiency in interpreting sputum smear microscopy results in the diagnosis of tuberculosis. J Bras Pneumol 2008;34:304-11.

60. Dave PV, Patel ND, Rade K, Solanki RN, Patel PG, Patel P, et al. Proficiency panel testing - a reliable tool in external quality assessment of sputum smear microscopy services in Gujarat, India. Indian J Tuberc 2011;58:113-9.

61. Ayalew F, Tilahun B, Taye B. Performance evaluation of laboratory professionals on malaria microscopy in Hawassa Town, Southern Ethiopia. BMC Research Notes 2014;7:839.

62. Ashraf S, Kao A, Hugo C, Christophel EM, Fatunmbi B, Luchavez J, et al. Developing standards for malaria microscopy: external competency assessment for malaria microscopists in the Asia-Pacific. Malaria J 2012;11:352.

63. Paramasivan CN, Venkataraman P, Vasanthan JS, Rahman F, Narayanan PR. Quality assurance studies in eight state tuberculosis laboratories in India. Int J Tuberc Lung Dis 2003;7:522–7.

64. Van Rie A, Fitzgerald D, Kabuya G, Van Deun A, Tabala M, Jarret N, et al. Sputum smear microscopy: evaluation of impact of training, microscopy distribution, and use of external quality assessment guidelines for resource-poor settings. J Clin Microbiol 2008;46:897-901.

65. Masanja IM, McMorrow ML, Maganga MB, Sumari D, Udhayakumar V, McElroy PD, et al. Quality assurance of malaria rapid diagnostic tests used for routine patient care in rural Tanzania: microscopy versus real-time polymerase chain reaction. Malaria J 2015;14:85.

66. McMorrow ML, Masanja MI, Abdulla SMK, Kahigwa E, Kachur SP. Challenges in routine implementation and quality control of rapid diagnostic tests for malaria, Rufiji District, Tanzania. Am J Trop Med Hyg 2008;79:385-90.

67. Gillet P, Mukadi P, Vernelen K, Van Esbroek M, Muyembe JJ, Bruggeman C, et al. External Quality Assessment on the use of malaria rapid diagnostic tests in a non-endemic setting. Malaria J 2010;9:359.

68. Versteeg I, Mens PF. Development of a stable positive control to be used for quality assurance of rapid diagnostic tests for malaria. Diagn Microbiol Infect Dis 2009;64:256-60.

69. Aidoo M, Patel JC, Barnwell JW. Dried Plasmodium falciparum-infected samples as positive controls for malaria rapid diagnostic tests. Malaria J 2012;11:239.

70. Tamiru A, Boulanger L, Chang MA, Malone JL, Aidoo M. Field assessment of dried Plasmodium falciparum samples for malaria rapid diagnostic test quality control and proficiency testing in Ethiopia. Malaria J 2015;14:11.

71. Steingart KR, Ramsay A, Pai M. Optimizing sputum smear microscopy for the diagnosis of pulmonary tuberculosis. Expert Rev Anti Infect Ther 2007;5:327-31.

72. Rej R. Proficiency testing and external quality assurance: crossing borders and disciplines. Accred Qual Assur 2002;7:335–40.

73. Sciacovelli L, Secchiero S, Zardo L, Zaninotto M, Plebani M. External Quality Assessment: an effective tool for clinical governance in laboratory medicine. Clin Chem Lab Med 2006;44:740-9.

74. Carter JY, Lema OE, Adhiambo CG, Materu SF. Developing external quality assessment programmes for primary health care level in resource-limited countries. Accred Qual Assur 2002;7:345-50.

75. Aziz M, Bretzel G. Use of a standardised checklist to assess peripheral sputum smear microscopy laboratories for tuberculosis diagnosis in Uganda. Int J Tuberc Lung Dis 2002;6:340-9.

76. Frean J. Microscopic images transmitted by mobile cameraphone. Trans R Soc Trop Med Hyg 2007;101:1053-5.

77. Bellina L, Missoni E. Mobile cell-phones (M-phones) in telemicroscopy: increasing connectivity of isolated laboratories. Diagn Pathol 2009;4:19.

78. Tuijn CJ, Hoefman BJ, van Beijma H, Oskam L, Chevrollier N. Data and image transfer using mobile phones to strengthen microscopy-based diagnostic services in low- and middle-income country laboratories. PLoS One 2011;6:e28348.