Final Report USAID/Malawi Data Quality Assessment Activity


Final Report
USAID/Malawi Data Quality Assessment Activity
Final: November 27, 2013

This report was produced by Social Impact at the request of the United States Agency for International Development.

USAID/MALAWI DATA QUALITY ASSESSMENT (DQA) ACTIVITY: FINAL REPORT
Final: November 27, 2013
AID-612-TO-13-00005 / IQC # AID-OAA-I-10-00013

DISCLAIMER
This document was produced for review by Social Impact for the United States Agency for International Development, under USAID SOL-612-13-000007, USAID/Malawi Data Quality Assessment (DQA) Training.

TABLE OF CONTENTS
ACRONYMS ... ii
EXECUTIVE SUMMARY ... iv
I. CONCEPTUAL FRAMEWORK AND INTRODUCTION ... 1
II. METHODOLOGY ... 2
III. FINDINGS AND RECOMMENDATIONS ... 6
   A. Data Quality Standards ... 7
      1. Validity ... 7
      2. Reliability ... 8
      3. Integrity ... 10
      4. Precision ... 11
      5. Timeliness ... 13
   B. USAID/Malawi DQA Next Steps ... 13
   C. USAID/Malawi Performance Monitoring System ... 15
   D. USAID/Malawi Data Quality Assurance Planning ... 17
   E. USAID/Malawi Communication with Implementing Partners ... 20
IV. BEST PRACTICES ... 22
ANNEXES ... 24

ACRONYMS
ADS      Automated Directives System
AM       Activity Manager
CDCS     Country Development Cooperation Strategy
COR/AOR  Contracting Officer's Representative/Agreement Officer's Representative
CRS      Catholic Relief Services
DO       Development Objective
DQA      Data Quality Assessment
FHI360   Family Health International 360
FTF      Feed the Future
FY       Fiscal Year
HPN      Health, Population, and Nutrition Office (USAID/Malawi)
KII      Key Informant Interview
IM       Implementing Mechanism
INVC     Integrating Nutrition into Value Chains (USAID/Malawi-funded activity)
IP       Implementing Partner
M&E      Monitoring and Evaluation
MIS      Management Information System
MOH      Ministry of Health
MOU      Memorandum of Understanding
NSO      National Statistical Office of Malawi
OIG      Office of the Inspector General
PDA      Program Development and Analysis Office (USAID/Malawi)
PEPFAR   President's Emergency Plan for AIDS Relief
PIRS     Performance Indicator Reference Sheet
PIH      Partners in Hope
PMPOC    Performance Monitoring Point of Contact
PPR      Performance Plan and Report
PMP      Performance Management Plan
RIG      Regional Inspector General

SEG      Sustainable Economic Growth Office (USAID/Malawi)
SI       Social Impact
SPS      Standard Program Structure
TLC      Total Land Care
USAID    U.S. Agency for International Development
UNDP     United Nations Development Program
USG      U.S. Government
WALA     Wellness and Agriculture for Life Assessment (USAID/Malawi-funded activity)

EXECUTIVE SUMMARY

INTRODUCTION AND METHODOLOGY

While the Automated Directives System (ADS) has evolved since its creation in the late 1990s, the basic data quality assessment (DQA) requirements and standards have remained constant. As such, the five data quality standards (validity, integrity, precision, reliability, and timeliness) continue to anchor assessments of performance indicator data, and the ADS maintains the requirement that data reported to USAID Headquarters for external reporting purposes must have had a DQA within the three years preceding submission. New guidance on data quality states that outside experts should not be hired to conduct DQAs. In alignment with this updated guidance, USAID/Malawi optimized the utility of these requirements by not only focusing on data quality to inform programming, direct resource allocation decisions, and enable confident results-reporting, but also by using the DQA requirement as an opportunity to build internal capacity for Mission staff and its implementing partners (IPs).

Social Impact (SI) embraced the Mission's capacity building focus and designed an iterative approach that emphasized local ownership by USAID staff and Malawian data sources through continuous teamwork and learning. There were two main phases to the SI approach:

Phase 1: This phase centered on delivering data quality workshops for both USAID/Malawi and IP staff utilizing participatory instruction methods. The team developed the workshop curriculum based on a desk review of background documents, including the Office of Inspector General (OIG)-produced Audit of USAID's Agricultural Programs in Malawi Report (July 30, 2013), and 25 key informant interviews (KIIs) with both Mission and IP staff.

Phase 2: Building on participant feedback and questions raised during the data quality workshops, this phase relied on joint USAID/Malawi-SI DQA Teams to conduct the assessments, which utilized the Agency-recommended DQA Checklist as well as an SI-developed Action Plan. The Action Plan allowed USAID and IPs to collaboratively identify needed actions to improve data quality, assign individuals responsible for next steps, and determine deadlines for when the actions should be accomplished.

In order to complete the DQAs, the team utilized a three-pronged capacity building approach during Phase 2:

Step 1: SI led the initial DQAs, with USAID/Malawi team members observing and/or participating.
Step 2: As USAID/Malawi staff became more comfortable with the assessment process, they took the lead on the DQAs with support from SI.
Step 3: USAID/Malawi team members who were able to participate in several DQAs led the DQAs while the SI team members observed and provided quality assurance.

In practice, one sector's DQA team had the consistent participation of a number of select USAID/Malawi staff, which allowed the Mission members to fully lead DQA discussions with IPs. The other sector's DQA team tended to have a greater variety of USAID/Malawi staff participation.

This allowed more Mission staff to participate in the DQAs, but also resulted in SI playing a more active supporting role throughout each of the assessments.

SI conducted in-country fieldwork in early October 2013 for Phase 1 and from late October through early November 2013 for Phase 2. In total, nearly 40 USAID and 75 IP participants attended seven data quality workshops. The joint USAID/Malawi-SI DQA Teams conducted assessments in nine locations throughout the central and southern regions of Malawi. Overall, the teams visited 30 IPs and completed 137 individual DQA sessions on 57 performance indicators. Based on requirements set by USAID/Malawi, the teams conducted DQAs only on those Foreign Assistance Framework standard indicators that the Mission planned to report for FY 2013.

FINDINGS

The findings and recommendations stated in this section are founded on analyses conducted by the SI Team, based on learning through the KIIs, training workshops, and joint USAID/Malawi-SI DQA sessions with IPs and secondary data sources. Overall, approximately 28% (or 38 out of 137) of the conducted DQAs did not identify any data quality issues; the majority of these assessments related to the Health, Population, and Nutrition (HPN) Office. While the remaining DQAs identified at least one data quality concern, when examining each of the five data quality standards individually, problems were identified less than 50% of the time. This section contains a summary of findings related to the five data quality standards. For more detailed information, refer to the main report.

Data Quality Standard I: Validity
- Critical issues were identified with validity in 35% of all of the DQAs conducted.
- Many IPs did not have the correct performance indicator reference sheets (PIRS), which resulted in inconsistent definitions, disaggregation, and data collection methodologies. Some IPs did not have any PIRSs, and others utilized abbreviated PIRSs that lacked critical information necessary to support validity, such as the indicator's relationship to the intended result.
- There was confusion among both Mission staff and its IPs on which PIRS to use: many USAID/Malawi offices were using the approved IP PIRSs or Presidential Initiative standard PIRSs instead of the Mission-level PIRSs, while some IPs were using old versions of the Foreign Assistance Framework standard indicator PIRSs. Almost none were using the Mission's own approved PIRSs.

Data Quality Standard II: Reliability
- Issues with reliability were identified during 42% of the DQAs conducted, making this the most common data quality weakness across all completed DQAs.
- Inconsistent and/or undocumented original data collection processes had a critical impact on the reliability of the data reported to USAID/Malawi.
- Many of the performance indicators relied on data collection by Government of Malawi staff, volunteers, or other data sources outside of the control of the IP (and therefore, of the Mission). As a result, verification of the original data at the source is difficult.

- Inconsistency among data collection efforts, within and among implementers, reduced the data's reliability. Standard definitions often lacked activity-level specificity, which caused confusion among data collectors.

Data Quality Standard III: Timeliness
- Only 9% of the DQAs identified issues with timeliness.
- Some IPs did not get all of the necessary data from the field in time to submit to the Mission by the deadline.
- Some indicator data reported to USAID/Malawi was incomplete for the reporting period.

Data Quality Standard IV: Precision
- The Team found data quality issues related to precision in 22% of all DQAs conducted.
- Data collection methodologies were not fine-tuned enough to collect data according to the required definition or the required disaggregation (e.g., data collected on household respondents were not in alignment with the required disaggregation, such as male-with-female).
- Some estimated calculation methods did not have accompanying written documentation and/or lacked sufficient precision.
- Often the required disaggregation was missing or not in alignment with the standard indicator PIRS.
- Sampling methodologies were not always representative of the population, usually due to inadequate sample sizes. Sampling methodologies were often not well documented.

Data Quality Standard V: Integrity
- The Team identified integrity issues in 29% of the completed DQAs.
- IP filing systems were not well ordered and/or were unprotected, which could result in loss of original data documentation or in manipulation of the data. At times, databases and e-files were unprotected.
- The lack of systematic verification of data, especially at the site level, means that manipulation of data could go undetected by the IP and/or by USAID/Malawi. In particular, spot checks by IP monitoring and evaluation (M&E) staff were not frequent or comprehensive, were not sufficient to cover all applicable indicators, and were often more related to data cleansing than to general verification of data quality.
- IPs tend to follow a hierarchical or linear data management process, wherein raw data is reported up the chain until it reaches the senior managers, who then report the data to USAID. Problems with the data are then relayed back down the chain until they reach the source of the problem. As a result, deliberate manipulation could go undetected because there is no verification that breaks the chain.
- Many IPs rely on data sources that are prone to deliberate manipulation (e.g., Ministry staff collect and report data that may also help or hinder their own promotion prospects or potential salary increases).

RECOMMENDATIONS

This section presents a summary of recommendations focusing on data quality assurance mechanisms and the Mission's performance management system. For more detailed information, please refer to the main report.

Data Quality Assurance
- The Mission should continue targeting DQAs to specific indicators, rather than conducting general examinations of an IP's M&E system (as done in the past). In order to minimize the burden of conducting DQAs on dozens of reported performance indicators, SI recommends that DQAs be completed on a rolling basis (preferably quarterly).
- The Mission should utilize DQA teams with two to four members, with, ideally, the Contracting Officer's Representative (COR), Agreement Officer's Representative (AOR), or Activity Manager (AM) taking the lead. Additional members may include staff from other technical offices, the Program Development and Analysis (PDA) Office, and other support offices.
- The PDA Office should work with the rest of USAID/Malawi, particularly the technical offices, to establish a Mission-wide acceptable level of data quality (such as "no issues identified for four out of five of the data quality standards during a DQA") and develop a response for those indicators that fall below the minimum threshold.
- For performance indicators where multiple IPs report data, and thus the Mission will need to aggregate the data across the IPs in order to report to external stakeholders, either the Project Manager, Technical Team Leader, or COR/AOR/AM (depending on which purpose statement or result the indicator is linked to) should compile a summary DQA report from all of the relevant individual IP-level DQAs.
- CORs/AORs/AMs should work with the Contracts Office to ensure that all contract and grant awards include a section on data quality.

Performance Management System
- For all standard and custom indicators, USAID/Malawi should utilize the SI-developed PIRS template, which is based on the Agency-recommended PIRS format. The Mission-level PIRSs should reference linkages to specific indicator handbook editions and numbering conventions for indicators included in Presidential Initiatives and other standard reporting.
- USAID/Malawi should add country context to standardized PIRSs (such as those for Presidential Initiatives and Foreign Assistance Framework indicators), where needed, to provide specificity and refinement that supports implementation and data quality.
- The Mission's Performance Management Plan (PMP) should be reviewed annually to determine whether any indicators need to be updated; if any revisions are made, USAID should communicate these to IPs.
- USAID/Malawi should establish a procedure to review and incorporate revised standard indicator handbooks into the Mission's annual review of its PMP.
- The Mission should convene sector-specific IP M&E/data quality meetings regularly.
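As a rough illustration of how a Mission-wide acceptable-quality threshold of this kind could be applied, the sketch below scores a set of DQA checklist results against a hypothetical "no issues on at least four of the five standards" rule. The indicator names, results, and threshold value are illustrative assumptions only, not part of the Mission's actual system.

```python
# Illustrative sketch only: scoring DQA checklist results against a
# hypothetical Mission-wide threshold ("no issues on at least 4 of the
# 5 data quality standards"). Indicator names and results are invented.

STANDARDS = ["validity", "reliability", "integrity", "precision", "timeliness"]
THRESHOLD = 4  # minimum number of standards with no issues identified

# Example DQA results: True means an issue was identified for that standard.
dqa_results = {
    "Example indicator A": {"validity": False, "reliability": True,
                            "integrity": False, "precision": False,
                            "timeliness": False},
    "Example indicator B": {"validity": True, "reliability": True,
                            "integrity": False, "precision": True,
                            "timeliness": False},
}

def meets_threshold(issues: dict) -> bool:
    """Return True if the indicator has no issues on at least THRESHOLD standards."""
    clean = sum(1 for s in STANDARDS if not issues.get(s, False))
    return clean >= THRESHOLD

for indicator, issues in dqa_results.items():
    status = "meets threshold" if meets_threshold(issues) else "needs follow-up action plan"
    print(f"{indicator}: {status}")
```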

I. CONCEPTUAL FRAMEWORK AND INTRODUCTION

While the Automated Directives System (ADS) has evolved since its creation in the late 1990s, the basic data quality assessment (DQA) requirement and standards have remained constant. Data reported to USAID headquarters for external reporting purposes requires a DQA conducted within the three years preceding submission. USAID/Malawi has wisely determined that it can optimize compliance with this requirement by not only focusing on quality data to inform programming, direct resource allocation decisions, and enable confident results-reporting, but also by using the DQA requirement as an opportunity to build internal capacity. As such, the Mission has engaged Social Impact (SI) under the Transparency, Accountability, and Performance (TAP) contract to a) train, b) assist with, and c) monitor its Contracting/Agreement Officer's Representatives (CORs/AORs), monitoring and evaluation (M&E) team, and implementing partners (IPs) to perform the fiscal year (FY) 2013 Mission-wide DQA for annual indicators reported through the annual Performance Plan and Report (PPR). SI appreciates this approach, which is practical, time-efficient, and builds sustainability into Mission and partner M&E systems (in this case, early in the life of the new Country Development Cooperation Strategy). The requirement also aligns with USAID Forward and the Agency's commitment to capacity building for long-term development impact.

This final report synthesizes the methodology, findings, emerging best practices, and analytical conclusions of six tasks. It also offers recommendations for updating the Mission's performance monitoring system, reducing CORs'/AORs' data quality management burden, providing direction to IPs so that they can improve the overall quality of their performance indicators, and ultimately ensuring that data quality requirements are fully met. The six DQA Activity Tasks are as follows (see Annex 1):

1. Preparing an Inception Report, including a detailed DQA methodological design, timeline, and implementation plan. The SI Team's Inception Report included all proposed site visits for primary and secondary sources, based on our sampling method that covered all three sectors of interest (education, health, and economic growth).

2. Developing a Training Practicum Curriculum to strengthen IPs' ability to understand USAID DQA requirements. This curriculum encompassed steps to assist partners to address DQA needs and comply with ADS guidance. SI also designed the curriculum to deliver a general overview of M&E best practices as related to DQAs.

3. Developing a Training Practicum Curriculum for USAID CORs/AORs to facilitate and strengthen staff members' ability to understand the steps and processes integral to performing a successful DQA. This curriculum incorporated DQA best practices and critical stages, and also provided an overview of the step-by-step process of conducting a DQA.

4. Delivering Training Workshops for IPs and Subs implementing the Integrating Nutrition into Value Chains (INVC) and Wellness and Agriculture for Life Assessment (WALA) activities, equipping M&E specialists, Chiefs of Party, and Deputy Chiefs of Party with USAID's operational policy contained in the Automated Directives System (ADS), DQA requirements, and best practices from the partner perspective.

5. Delivering Training Workshops for USAID Staff to understand data quality issues and perform DQAs.

6. Conducting DQA Visits with USAID/Malawi CORs/AORs for all IPs reporting on PPR indicators, and for all subcontractors/subgrantees implementing the INVC and WALA activities.

II. METHODOLOGY

A key understanding that underpinned the Social Impact methodology was the Mission's appropriate focus on building the capacity of USAID/Malawi staff, rather than bringing in a third-party firm or an external expert to conduct the necessary DQAs for its FY 2013 performance reporting. As stated in the revised ADS (November 2012), Missions should not hire an outside expert to conduct DQAs. Therefore, the SI Team designed a participatory approach to foster capacity building that engaged Mission staff as well as IPs and sub-partners at every phase of the assignment in understanding how to improve data quality.[1]

Through our approach, SI deliberately sequenced tasks so as to promote and apply learning across the assignment phases. In Phase 1, for example, the SI Team designed a semi-structured interview protocol, which allowed for investigative probing during the KIIs based on the initial curriculum design and desk review findings. In turn, findings from the KIIs with Mission and IP staff informed the workshop content through targeted revisions and adjustments that were appropriate for and reflective of the USAID/Malawi and IP context.

Phase 1: DQA Training Sessions for USAID/Malawi and Implementing Partners

Desk Review

The SI Team kicked off Phase 1 by conducting a desk review of key documents provided by the Mission, including the critical 2013 audit report, to help inform the workshop curricula design. These documents also served as references for the DQAs, and included the List of FY2013 Standard Indicators, performance indicator reference sheets (PIRSs), the Excel spreadsheet list of IPs, various Development Objective (DO)-level Excel spreadsheets of indicators and partners/projects reporting on those indicators, and other pertinent background documents.

Key Informant Interviews

Upon arriving in Lilongwe, the Team developed a semi-structured interview protocol for both groups of key informants: Mission staff and IPs.

[1] In contracting SI to build the capacity of its staff to conduct the FY 2013 DQAs, USAID/Malawi laid a strong foundation to uphold Section 203.3.11.3 of the ADS, as follows: "Missions should not hire an outside expert to assess the quality of their data. Mission staff, usually the technical offices, Monitoring and Evaluation staff, or project/activity implementers, as part of their award, can conduct the assessment, provided that mission staff review and verify DQAs conducted by implementing partners. This may entail site visits to physically inspect records maintained by implementing partners or other partners."

The purpose of the KIIs was to identify specific data quality issues of concern for the DO teams, the Program Development and Analysis Office (PDA), the Front Office, and the Contracts Office. This information helped SI in customizing the DQA workshops and in informing the DQAs conducted in Phase 2. The SI Team also conducted a limited number of KIIs with key IPs in Lilongwe in order to understand critical data quality issues and help prepare for the DQA workshops for IPs (refer to Table 1 below; for a complete list of key informants interviewed by SI, see Annex 2):

Table 1: Key Informant Interviews Conducted
Key Informants        No. of Interview Sessions    No. of Key Informants Interviewed
USAID/Malawi staff    7                            16
IP staff              4                            9
Total                 11                           25

Participatory Instruction on Data Quality and Conducting Data Quality Assessments

The SI Team refined its curricula for the training workshops based on information learned in these KIIs so as to ensure that workshop content sufficiently addressed the specific areas that had been problematic for USAID/Malawi in the recent past. In addition, SI customized the workshop material for the three sets of DQA trainings targeting a) USAID/Malawi staff (focusing on data quality); b) USAID/Malawi AORs, CORs, and future CORs/AORs (with an emphasis on conducting DQAs); and c) IPs and sub-partners (emphasizing how to prepare for a DQA).

From October 9 to October 14, 2013, SI facilitated seven participatory training workshops at Kumbali Lodge in Lilongwe for the stakeholders listed above. In total (see Annexes 3 and 4):
- 77 participants from 29 IPs and sub-partners attended three workshops on data quality and how to prepare for a DQA;
- 34 USAID staff participated in two workshops on data quality, covering topics such as the five data quality standards, as well as requirements of a data quality assessment; and
- 37 current and future USAID/Malawi CORs/AORs learned how to conduct a data quality assessment.

While SI had originally planned to deliver only one workshop specifically customized to CORs/AORs, demand from Mission staff prompted SI to change its plans and offer an additional, make-up session for CORs/AORs during the afternoon of October 10, 2013.

SI Team members used the discussion method of training, which relies on two-way communication between the facilitators and participants. This method increases learning through short lectures (20 minutes or less) that present basic information, followed by discussion among the participants and between the participants and the facilitators, and exercises to practice the learning principles (see Annex 4). Using tailored examples from USAID/Malawi (e.g., from the Performance Audit findings), the IPs' experiences, and other real-life examples, SI facilitators aimed to support, reinforce, and expand upon the baseline understanding of each participant, regardless of the participant's prior familiarity with data quality. The SI Team concluded this phase by reviewing findings from all Phase 1 tasks in order to refine preparations for the Phase 2 DQA delivery with USAID/Malawi staff.

Phase 2: Conducting the DQAs

During Phase 2, the SI Team focused on two main objectives:

1) Equipping Mission staff to carry out DQAs through joint USAID/Malawi-SI Teams and phased USAID/Malawi participation ((a) observe, (b) participate actively, (c) lead); and
2) Initiating a Mission-wide system that captures and chronicles DQA findings, determines an acceptable data quality threshold, and develops action plans to address noted IP data quality deficiencies.

SI accomplished these objectives through the following activities:
- Collaborating with the Mission to identify joint USAID-SI Teams (and finalizing a schedule and logistical plan to support DQA delivery);
- Updating a cross-walk that listed all the FY2013 standard indicators that the Mission expected to report in the PPR and the IPs responsible for those indicators, which SI revised on a rolling basis throughout Phase 2 as new information was provided by the Mission and IPs (a minimal illustration of such a cross-walk appears after the lists below);
- Sending DQA invitations to the IPs, which included the five data quality standards, the ADS requirement to conduct DQAs every three years for indicators reported externally, a list of the IP's indicators that would receive DQAs, a list of the documents and other information that the IP should have on hand during the DQAs, and a list of the USAID participants at the DQAs; this information supported the success of the DQAs (refer to Annex 5 for a sample DQA invitation);
- Delivering logistics handouts to all DQA participants to facilitate effective DQA visits;
- Conducting DQA visits with USAID/Malawi staff, utilizing the DQA Checklist and the developed Action Plan template, and visiting secondary data source institutions to review their methodologies for data collection, cleaning, and analysis;
- Assessing the five data quality standards of PPR performance indicators at all prime IPs and select sub-awardees for two of the Sustainable Economic Growth (SEG) primes (in accordance with RIG findings) through joint USAID/Malawi-SI Teams, utilizing the DQA Checklist and the Action Plan template; and
- Preparing and delivering debriefings for both USAID/Malawi (26 attendees) and IPs (nearly 60 attendees) (see Annex 6).

Data Quality Assessments

In order to deliver the practicum component of the participatory training workshops, the SI Team used on-the-job mentoring with the CORs/AORs/AMs (and other participating PDA and DO staff) during the DQAs, which took place from October 16 to November 1, 2013. The SI Team split into two sub-teams, each with a Senior and a Mid-level expert, in order to leverage the SI Team members' particular expertise. These two SI Teams, in turn, conducted the DQAs with USAID staff members deployed to the teams based on the indicator data being assessed, as well as USAID staff members' availability throughout the 2.5-week period. Using SI's DQA sampling methodology, the teams accomplished the following (see Annex 7):
- Completed 137 DQAs;
- Assessed 57 PPR Indicators;
- Visited a total of 31 IPs (additionally, the teams visited six of the partners again because these IPs implement two USAID-funded activities);
- Visited two secondary sources to discuss the quality of their data (the National Statistical Office and the United Nations Development Program) (see Annex 8); and
- Visited IPs in nine locations: Lilongwe, Blantyre, Balaka, Chikwawa, Dedza, Mulanje, Nsanje, Thyolo, and Zomba.
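To make the cross-walk referenced above concrete, the sketch below shows one minimal way such an indicator-to-IP mapping could be maintained and checked for gaps before DQA scheduling. The indicator codes, partner names, and reporting assignments are illustrative placeholders, not the Mission's actual cross-walk.

```python
# Illustrative sketch only: a minimal indicator-to-IP cross-walk with a gap
# check. Indicator codes and partner names below are invented placeholders.
from collections import defaultdict

# Each row: (standard indicator code, IP expected to report on it)
crosswalk_rows = [
    ("EXAMPLE-1.1", "Partner A"),
    ("EXAMPLE-1.1", "Partner B"),
    ("EXAMPLE-2.3", "Partner C"),
]
expected_ppr_indicators = ["EXAMPLE-1.1", "EXAMPLE-2.3", "EXAMPLE-4.5"]

# Build the mapping of indicator -> reporting IPs.
indicator_to_ips = defaultdict(list)
for indicator, ip in crosswalk_rows:
    indicator_to_ips[indicator].append(ip)

# Flag PPR indicators with no IP currently assigned to report on them,
# so they can be followed up before DQA visits are scheduled.
missing = [i for i in expected_ppr_indicators if i not in indicator_to_ips]
print("Indicators with no reporting IP identified:", missing)
```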

After the initial Lilongwe-based DQAs during the week of October 14, one SI sub-team traveled to Zomba, Mulanje, and Balaka with USAID staff in order to conduct DQAs with key secondary sources (namely the National Statistical Office) and other IPs and sub-partners. The other SI sub-team and relevant USAID colleagues traveled to Blantyre, Nsanje, Chikwawa, and Thyolo in order to conduct the DQAs for partners and sub-partners located in those areas. Upon returning to Lilongwe, the SI Team split up to conduct the remaining DQAs for key partners located in the capital and other field-based locations (e.g., Dedza), again with appropriate USAID staff partnering with the SI consultants.

To build the capacity of each COR/AOR/AM participating in the joint USAID/Malawi-SI Teams, SI employed the following process:

Step 1: First DQA with the USAID staff member(s)
A. The COR/AOR/AM or the SI Team introduced the DQA, explained the purpose of the DQA Checklist, and set the overall tone for the DQA.
B. The SI Team led the assessment by utilizing the Data Quality Checklist while acknowledging the leadership position of the USAID/Malawi team members.
C. The SI Team focused on the mechanics of the DQA and led the questioning process.
D. SI provided immediate feedback to the IP in concert with the COR/AOR/AM.
E. The DQA team and IP/sub created an immediate Action Plan for follow-up.

Step 2: Second DQA with the same USAID staff member(s)
A. The COR/AOR/AM introduced the DQA, explained the purpose of the DQA Checklist, and set the overall tone for the DQA.
B. The COR/AOR/AM conducted the assessment by utilizing the Data Quality Checklist, with the SI Team members playing a supporting role (as needed).
C. The SI Team focused on ensuring that the questions were fully explored and followed up on additional data quality issues as necessary.
D. The COR/AOR/AM provided any immediate feedback to the IP with support from the SI Team.
E. The COR/AOR/AM and IP/sub created an immediate Action Plan for follow-up.

Step 3: Third DQA with the same USAID staff member(s)
A. The COR/AOR/AM introduced the DQA, explained the purpose of the DQA Checklist, and took the lead on conducting the DQA.
B. The COR/AOR/AM utilized the Data Quality Checklist and asked probing questions.
C. The SI Team members played a supporting role (as needed).
D. The COR/AOR/AM provided any immediate feedback to the IP.
E. The COR/AOR/AM and IP/sub created an immediate Action Plan for follow-up.

In cases where the USAID staff member only participated in one DQA, or where the COR/AOR/AM was absent, the joint SI-USAID Team utilized Step 1 to ensure that the USAID staff understood the processes of the DQA.

Documentation of DQA Findings

During the DQAs, the SI Team had envisioned that the COR/AOR/AM would enter information directly into the Data Quality Checklist tool on a USAID laptop. However, in many instances, USAID/Malawi staff did not bring a laptop, and thus filled out the Checklist by hand. Upon return to the Mission, the USAID staff were expected to complete the soft-copy DQA Checklist, including incorporation of the Action Plan tasks. (Refer to Section IIIB below for further USAID action steps.)

Limitations

During the implementation of the DQA trainings and visits, SI faced several challenges that limited the Team's ability to roll out the methodology as initially envisioned. Through collaboration with the PDA office, as well as creative problem solving, the Team overcame many of these challenges, as follows:

- Due to the previously scheduled USAID Performance Monitoring training for USAID/Malawi during the week of October 28, 2013, the PDA office requested that SI reschedule the DQA visits to the South for the week of October 21, 2013. As a result, the SI Team planned for most of the DQA site visits in Lilongwe to be scheduled the week of October 28. USAID/Malawi and SI then found out that the USAID Performance Monitoring training had unfortunately been postponed due to the government shutdown. This change made scheduling the DQA site visits during the week of October 28 with USAID CORs/AORs a bit easier, particularly with SEG staff. However, most health, population, and nutrition (HPN) staff members selected dates during the week of October 28 when HPN IPs were not available, leading to scheduling difficulties.
- Several IPs proved to be unavailable during the timeframes when the USAID CORs/AORs/AMs or substitute staff members were available. By intensifying its logistical coordination, the SI Team managed to substitute alternative site visits into time slots that various IPs had declined.
- The SI Team identified several USAID/Malawi partners that were located in different cities or districts than the areas listed in these IPs' addresses in the USAID Partners spreadsheet. This includes partners that have moved their headquarters offices, partners that have different offices for the projects requiring DQAs, and multiple changes in key staff contact information. As an added deliverable, USAID/Malawi's PDA Office will have an updated partner list by the end of the assignment (see Annex 9).
- SI received continuous updates and changes to the list of indicators that required DQAs until October 30, 2013. These included revisions to the standard indicators, the discovery that some IPs expected to report on PPR indicators were not actually collecting data on those indicators, and the discovery that primes, rather than sub-awardees, were reporting on certain expected indicators.
- Many of USAID's partners report on a large number of performance indicators that require data quality assessments. For example, Partners in Hope (PIH) reports on 17 performance indicators. On average, it takes a minimum of one to two hours to conduct a DQA on just one performance indicator, meaning that it would typically take more than 17 hours to complete the DQAs on PIH indicators. This required SI either to schedule at least a 1.5-day visit for PIH or to split the joint USAID/Malawi-SI Team into two, which was the approach that SI took to assess as many indicators as possible.

III. FINDINGS AND RECOMMENDATIONS

The findings and recommendations stated in this section derive from analyses conducted by the SI Team based on learning through the KIIs, training sessions, and joint USAID/Malawi-SI DQA sessions with IPs and secondary data sources (see Annex 10 for the list of indicators and IPs covered through this assignment). Overall, approximately 28% (or 38 out of 137) of the conducted DQAs did not identify any data quality issues.

The majority of these assessments related to the Health, Population, and Nutrition (HPN) Office. While the remaining DQAs identified at least one data quality concern, when examining each of the five data quality standards individually, the team identified problems less than 50% of the time.

The recommendations below use the word "should" to denote the SI Team's recommendations, in contrast with the specific requirements of the ADS, which are identified as such in the report.

A. Data Quality Standards

1. Validity

Of the DQAs completed, approximately 35% demonstrated small to significant issues with data validity. Performance indicator data with validity weaknesses can cascade into problems with other data quality standards, most critically reliability. The most commonly identified validity issues involved differences between the requirements of the standard definition and the IPs' actual definitions and data collection methodologies.

Many USAID/Malawi technical staff relied on PIRSs that were not from the Mission's approved PMP. There was confusion among Mission staff on which PIRS to use: many USAID/Malawi staff used the Presidential Initiative standard PIRSs (such as the PEPFAR or Feed the Future Indicator Handbooks), other Mission staff used the approved but potentially outdated Mission-level PIRSs, and still other USAID staff were using the IPs' PIRSs.

Recommendation: As required and standard indicators are modified, deleted, or added for Presidential Initiatives or the standard indicators listed in the Foreign Assistance Framework, as project-level M&E plans are approved, and as activity-level M&E plans are approved, the Mission's own PMP and PIRSs should be reviewed and updated as needed. The Mission's own PMP and PIRSs should then serve as the definitive PIRSs for all staff.

IPs did not have definitive PIRSs from USAID/Malawi, resulting in inconsistent definitions, data collection, and calculation methods. The confusion within the Mission staff on which PIRSs to use, and which indicators were to be reported, was mirrored in the USAID/Malawi IP community. Many of the Mission's CORs/AORs/AMs had directed IPs to various sources for PIRSs, including outdated Mission-level PMPs, the standard indicators listed in the Foreign Assistance Framework, and the PEPFAR and FTF Indicator Handbooks. Certain IPs were not using PIRSs at all (typically sub-awardees), and others were working off of abbreviated PIRSs (with landscape page orientation). As a result, IPs reporting on the same indicators were using different definitions, data collection methodologies, and even different disaggregation and calculation methods.

Recommendation: The Mission's own PMP and PIRSs should serve as the definitive PIRSs for all IPs, and any changes approved by USAID/Malawi should be communicated to the IPs as soon as possible, including clear directions on when and how to incorporate the changes into the IPs' reporting.

The Mission's PIRSs that were based on standard indicators were not refined or customized to the development context and Agency strategy in Malawi. Many USAID/Malawi staff believed that they were not allowed to change or modify the contents of standard indicators. As a result, broad definitions in many standard indicators, which listed all of the options relevant to the use of a specific standard indicator, were retained by the Mission in its PMP and PIRSs.
For example, one standard indicator included long-term and short-term training, including academic and non-academic programs. The Mission's program was focused only on short-term, non-academic training. By not customizing the PIRS to the Mission's own program and activities, confusion arose among the IPs on what data should be included in their reporting on this indicator.

Recommendation: USAID/Malawi should develop Mission-level PIRSs that add country context where the standard reference sheets lack specificity. This customization would be a refinement of the standard indicators, rather than a change to the standard indicators themselves. The areas that should be considered for refinement include: the definition; adding disaggregation (or removing disaggregates that are not relevant); choosing among, or adding further specificity to, data collection methods and/or sources; and increasing the frequency of reporting. Refer to Section IIIC, USAID/Malawi Performance Management System, for more details.

Verification of a performance indicator's relationship to the intended result was difficult. The PIRSs do provide linkages to the F Bureau's Standard Program Structure (SPS); however, linkages to the Mission's intended results as documented in the PMP are often not included.

Recommendation: The Mission should utilize the full Agency-recommended PIRS format (vertical, portrait orientation), including all of the required sections and country-specific information. Refer to Section IIIC, USAID/Malawi Performance Management System, for more details.

The abbreviated PIRS format used by some IPs and certain Mission staff excluded key information. At times, USAID/Malawi and various IPs have utilized abbreviated PIRSs, which results in excluding important information such as the full standard definition, a section detailing data quality concerns, responsible parties for receiving/submitting data, and so forth.

Recommendation: Refer to the finding above and Section IIIC, USAID/Malawi Performance Management System, for more details.

Continual revisions of the standard indicator handbooks (including the F Bureau, PEPFAR, and FTF Indicator Handbooks) caused serious version control problems and confusion among the Mission and IPs. The two most recent editions of the FTF Indicator Handbook were dated April 9, 2012, and September 4, 2013. The majority of IPs were unaware of this most recent update to the standard FTF indicators, while others were not aware of the standard indicator handbook at all. As a result, many IPs were collecting and reporting FTF indicators using the wrong definitions (including new or modified disaggregation) and/or the wrong data collection methodology. Some IPs were able to modify their reported data to adhere to the new definitions/methodologies, but most have not been able to make these changes for FY13 reporting by the Mission.

Recommendation: USAID/Malawi should make a Mission-wide decision on how to respond to changes to standard indicators that affect the current fiscal year's reporting (on an indicator-by-indicator basis). For example, as a result of the September 2013 changes to FTF indicators, the USAID/Malawi SEG Team and PDA Office needed to decide whether to report on the PPR indicators using the older PIRSs, so that their FY13 reporting would include all of the FY13 data collected, or to move immediately to the revised indicators and not report certain FY13 data that had become obsolete. The Mission should then communicate its decision(s) to all affected IPs, including recommendations for how to (or whether to) update their approved PMPs. As part of this communication, the Mission should send updated Mission-level PIRSs to the IPs. Prime awardees should then communicate Mission decisions to their relevant sub-awardees or sub-contractors.
Recommendation: Standard indicator definitions, disaggregation, and data collection methodologies included in PIRSs should be shared and inserted directly into activity-level M&E plans and/or PMPs. All other Mission-level PIRSs should be shared with relevant IPs as well.

2. Reliability

Among the five data quality standards, the SI Team discovered reliability issues most frequently, in 42% of the completed DQAs.

In addition to the difficulty in verifying data reliability (given that IPs do not always maintain source document records), the majority of reliability weaknesses traced back to two causes.

Several IPs do not maintain source documents in their records. Original source documents (or copies of them) were often not available during the DQAs due to IPs' inconsistent maintenance of these documents in their records. This occurred in cases where IPs are directly responsible for data collection, such as food security activities where beneficiary volunteers were collecting data but did not share their original sources with the IP staff. In other cases, IPs rely on primary data collection by individuals who do not receive compensation or primary benefits through USG funding (such as health activities where hospital and health clinic staff collect data). For example, volunteers collected data on the number of people in a household receiving nutrition-related training and documented this information in hardcover notebooks provided by the IP; the volunteers then reported the data up through promoters to the IP staff. The promoters and/or the IP staff would then document the reported data in the relevant summary format, but the original data on the number of people trained was neither provided to nor filed at the IP's office to support the data in the summary format. In both cases, aggregated data is collected in summary format (monthly or quarterly) without retaining the originals or copies of the primary data that would typically be used to verify data collection at the site level. Therefore, systematic verification of the site-level data is not possible, nor can verification take place during DQA exercises or performance audits. For data on public-private partnerships, proprietary information on the private entities required that the original MOUs be retained outside of the IP offices. As a result, IPs could not collect original documentation at the source of the activity for some indicators.

Recommendation: Since it is unclear whether original data documentation is feasible for performance indicators where USAID/Malawi IPs are collecting data, SI suggests the following: when data collection depends on a host government's national system (e.g., in Malawi, health activities working with hospitals and health clinics that utilize MOH data collection procedures), or on individuals who are not compensated through USG funds and are not primary beneficiaries, USAID/Malawi should seek guidance from Washington to better understand whether original data sources should be sought, since current guidance is unclear on this issue.

Recommendation: USAID/Malawi should strongly encourage IPs to retain the originals or copies of the source documents in their records (e.g., participant sign-in sheets and input acknowledgement receipt forms). When originals are required to be maintained outside of the IPs' offices, copies should be stored at the IP with a detailed explanation of why the originals are not there. Finally, because a critical amount of initial data collection is outside of both the Mission's and the IPs' control, site visits and verification of the original source information must be frequent and comprehensive in order to determine the accuracy and reliability of this data.

Lack of complete documentation for indicator-specific data collection and calculation methodologies.
While general M&E procedures were often spelled out and documented in flow charts or maps, standardized data collection formats, and MIS guides, most IPs and sub-awardees did not have detailed documentation on indicator-specific data collection methodologies. In addition, many PIRSs for standard indicators did not have sufficient information on the process or methodology of how the data was collected, reviewed, analyzed, maintained, and/or reported.

Recommendation: USAID/Malawi IPs should refine and expand written documentation of indicator-specific data collection methods, including all of the data collection forms and tools, with procedures and methods detailed and explained. This more detailed process and methodology should also be included in an updated PIRS for each indicator. A best practice could be to develop indicator-specific files housed in the M&E office of each IP, with a complete set of processes and methodologies for each indicator.

Inconsistency of source documentation. The joint USAID/Malawi-SI Teams found that some IPs are not using consistent data collection methodologies over time within the same IP (e.g., where different staff were using different forms to collect the same data) or among IPs reporting on the same indicator (e.g., where one IP would collect disaggregated data using one definition of head of household, and another IP would collect data using a different definition of head of household). Inconsistent use of standard sign-in sheets, confusion over what could or should be included during data collection, irregular collection of required disaggregation, and the use of undocumented estimations in performance reporting all plagued IP reporting.

Recommendation: While specific recommendations were documented during the DQAs in the Action Plans for applicable performance indicators, both USAID/Malawi and IPs should ensure that data collection instruments and calculation methodologies are aligned and congruent with the indicator definition and other components of the approved PIRS.

3. Integrity

Approximately 29% of the DQAs of IP-reported indicators had overall integrity issues. While IPs expressed concerns regarding the capacity of third-party data sources (such as volunteers, Government of Malawi officials, or other data collection entities outside of the control of the IPs or even USAID) to manipulate the data for personal or other gain, most IPs reviewed had established at least a minimum verification system.

Filing systems were disorderly and/or, on occasion, unprotected. While several IPs had filing systems ranging from relatively well organized to very well organized, most IPs' hard copy filing systems were in a state of disarray. Specifically, the deficiencies included: documents within a file were poorly organized or had missing or erroneous data; documents had unsigned and undated corrections; files were often missing key required documents; files were overstuffed so that they did not close properly; and files were unlabeled or poorly labeled, so that the exact contents of files could be unknown. Many IPs kept the files in a locked room or cabinet, but some IPs had no protective controls in place.

Recommendation: Filing systems are often an indication of data quality: well-ordered and well-maintained files often reflect carefully collected and documented performance indicators. USAID/Malawi IPs should ensure that their systems are organized so that information can be found easily. All hard copy data sources should be filed in locked cabinets or rooms. Additionally, a document log should be attached to each file indicating who is in possession of any removed files, the dates they were removed, and the dates they need to be returned.

IP databases and e-files were usually, but not always, protected. Most IPs' databases had controls in place, such as password protection and, occasionally, read/write access for different users. In addition, databases typically were backed up on a regular basis. In limited instances, controls were not in place.

Recommendation: USAID/Malawi IPs should continue to emphasize the importance of data security and ensure a balance between data protection and teams' ability to use data for activity updates. The Mission may periodically assess IPs' ability to protect data, but the SI Team did not find this to be a major concern.

IPs' data verification procedures and crosscheck methods typically follow the data chain.
While roles and responsibilities are clear and documented by IPs and consortia members, data collection and reporting was often a linear process from the data source to various intermediaries (e.g., volunteers and promoters), to IP field coordinators, M&E staff, and Program Managers for review, and then on to reporting to a prime IP and/or USAID. Data reviews and checks were also linear, moving down the data chain to the next person in line.

Therefore, errors and data manipulation could easily be missed because of the lack of independent checks to break the chain.

Recommendation: The data chain should be broken so that, rather than confirming data with the next person down the line (who may have reasons for not correcting errors), M&E personnel or other managers jump over the intermediary personnel and go directly to the data source to verify and confirm performance data.

Reviewing and crosschecking of original or other hard copy source documentation needs to be done as a routine part of data entry. The joint USAID/Malawi-SI Teams reviewed available source documentation (including some originals) and identified several errors. These included, but were not limited to: white-out edits and corrections of data; lack of explanation or signature for changes; mistakes in totals that were not corrected; missing dates and signatures of the data entry clerks that would indicate that the data had been entered into the MIS and/or database; and additional data added later with no explanation or signature.

Recommendation: Source documentation should include signatures and dates by all reviewers (including field coordinators, M&E staff, and data entry clerks) to indicate that they have confirmed that the data is correct. Additionally, any corrections to the data should be documented with an explanation for the changes, with signature(s) and date(s). These explanations (as well as review confirmations) could be in the form of a checklist or note stapled to each original source document for ease of review and audit.

Crosschecks of data entered into the MIS and/or other databases should be done routinely. The joint USAID/Malawi-SI Teams identified many cases where the data entered into the MIS or other databases by data entry clerks was not reviewed by other IP personnel, or was not reviewed in a sufficiently systematic manner. As a result, transcription errors are not identified. For example, some IPs reported that M&E staff would periodically take the raw data form and review the data entered into the MIS for accuracy (and sometimes would also state that the same M&E staff would conduct a site visit to check that the raw data was correct). This demonstrates an effort to clean the data entered. However, there was no systematic requirement for reviewing the data (all of the data entered into the MIS or other database should be crosschecked for accuracy), nor was there a plan for how many site visits needed to be conducted to ensure that the raw data was accurate (e.g., 20% of a random or stratified selection of data entered into the MIS should receive site visits by the M&E staff to check for accuracy).

Recommendation: Where possible, at least two data entry clerks should enter the same data, compare the datasets, and identify and correct any variances. For IPs without those particular resources, sufficient double-checking of the data entered into the MIS and other databases should be conducted by personnel other than the data entry clerk(s). In most cases, random or stratified sampling of the data and comparison of the data in the database to source documents should be the responsibility of the M&E Manager or Officer. The sampling methodology should cover a significant volume of the data being entered, or at least establish a minimum threshold (e.g., sampling 10% of the source documents).
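As a rough illustration of the double-entry comparison and source-document sampling described in the recommendation above, the sketch below flags variances between two data entry passes and draws a simple random sample of records for verification against source documents. The record identifiers, values, and 10% sampling rate are invented for illustration and are not drawn from any IP's system.

```python
# Illustrative sketch only: comparing two data entry passes and sampling
# records for verification against source documents. Record IDs, values,
# and the 10% sampling rate are invented examples.
import random

entry_pass_1 = {"REC001": 24, "REC002": 31, "REC003": 17}   # clerk 1
entry_pass_2 = {"REC001": 24, "REC002": 13, "REC003": 17}   # clerk 2

# 1. Flag variances between the two data entry passes for reconciliation.
variances = {rec: (v1, entry_pass_2.get(rec))
             for rec, v1 in entry_pass_1.items()
             if entry_pass_2.get(rec) != v1}
print("Records needing reconciliation:", variances)

# 2. Draw a simple random sample (here 10%, minimum 1 record) of all
#    entered records for crosschecking against the original source documents.
all_records = sorted(entry_pass_1)
sample_size = max(1, round(0.10 * len(all_records)))
to_verify = random.sample(all_records, sample_size)
print("Records selected for source-document verification:", to_verify)
```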
4. Precision

Nearly a quarter (22%) of the completed DQAs of indicators reported by IPs identified small to significant issues with the precision of their data. Most of the issues identified relate to whether the data collection method or tool is fine-tuned or exact enough to register the expected change. Fewer precision issues were identified for surveys or other sampling efforts, largely because most IPs did not use surveys for data collection.

Data collection instruments insufficiently captured the performance data. The joint USAID/Malawi-SI DQA Team found that incomplete data collected by IPs was due to deficiencies in the data collection instruments. Often, these deficiencies stem from misunderstandings among the indicator definition, the data collection tool, and/or the data calculation method.

Recommendation: During site visits, DQAs, and other M&E-related inquiries, USAID/Malawi should regularly focus on the harmonization of IPs' M&E systems. A correct and comprehensive understanding of a performance indicator (i.e., all aspects of its reference sheet) is necessary to develop an appropriate data collection instrument. In turn, an appropriate tool capturing the required data is needed to feed an appropriate data calculation system (often implemented through an IP's database).

Recommendation: IPs need to ensure that they fully understand their performance indicators and that this understanding informs their data collection and analysis systems. Primes should actively hold M&E meetings with their subs and conduct in-depth examinations.

Disaggregation not collected and not aligned with standard indicator requirements. Recent updates to the standard indicator definitions, data collection methods, and disaggregation requirements resulted in many of the disaggregation failures identified during the joint USAID/Malawi-SI DQAs. However, many instances of improper disaggregation can be traced to other causes, including different definitions of age, lack of understanding of the importance of sex disaggregation, and even confusion over terminology. In many cases, even when data collection formats required disaggregation by sex, the forms themselves (including participant sign-in sheets) did not capture easily collected information on the numbers of male and female participants. A more challenging cause of unaligned disaggregation is differences from national-level data collection conventions (for example, definitions of heads of household).

Recommendation: USAID/Malawi should communicate revisions to all standard and custom indicators through amended PIRSs to the relevant IPs as soon as possible. In turn, IPs with sub-awardees should communicate these new requirements to ensure that the data collected from that point onward aligns with USAID's needs. The Mission needs to understand how these changes will affect IPs' ability to collect the required data and should roll out the expected changes on an accordingly realistic timeframe. For other disaggregation problems, the Mission should encourage more training and sensitization of the IPs on the importance of collecting sex-disaggregated data, as well as on adhering to and understanding the definitions of other required disaggregation.

Insufficient verification of data results in reporting of estimates or outdated information. Many IPs reported that, on an annual basis, they verified performance indicator data through site visits, including random sampling of targeted beneficiaries. This is a good practice among those IPs collecting data at the household level, but it is not shared by all of the Mission's IPs, even among members of the same consortia. In one case, the sample size for verification did not represent the targeted beneficiaries and therefore could not support the assertion that the numbers reported were anything other than estimates based on the receipt of inputs. Lack of sufficient verification could also result in the under-reporting of several beneficiary groups.
This can result from data collection processes and instruments that focus on collecting data at the beginning of an activity or sub-activity rather than on changes that occur among the beneficiary population during implementation.

Recommendation: IPs should work with their CORs/AORs, or with data collection experts, to develop and apply a verification sampling methodology that captures a significant, representative share of the activity or sub-activity beneficiary population. This sampling methodology would then serve to confirm assumptions and estimations while providing USAID and IPs with the level of detail that decision-makers can utilize.
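One common way to build such a verification sample is proportional stratified sampling, drawing the same fraction of beneficiaries from each stratum (for example, each district or site). The sketch below is illustrative only; the file name, the "district" column, and the 10% rate are assumptions that an IP would replace with its own register structure and agreed sample size.

```python
# Minimal sketch of a proportional stratified verification sample, assuming a
# beneficiary register with a "district" column; names and rate are assumptions.
import pandas as pd

register = pd.read_csv("beneficiary_register.csv")
SAMPLE_RATE = 0.10  # verify 10% of beneficiaries in every stratum

# Draw the same fraction from each district so the verification visits
# represent the whole beneficiary population, not just convenient sites.
verification_sample = (
    register.groupby("district", group_keys=False)
            .apply(lambda g: g.sample(frac=SAMPLE_RATE, random_state=1))
)

print(verification_sample["district"].value_counts())
verification_sample.to_csv("verification_visit_list.csv", index=False)
```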

5. Timeliness

For the most part, the joint USAID/Malawi-SI Teams did not uncover serious concerns with the timeliness of reported performance data (timeliness issues were identified in only 9% of completed DQAs). However, the greatest risk that SI identified is that USAID/Malawi may not receive adjusted, actual figures that reflect 100% of the performance data actually collected by its IPs.

Not all IPs obtained the necessary data from the field on time. Typically, a USAID/Malawi IP works with thousands or tens of thousands of beneficiaries in any given quarter and, at that scale, it was not surprising that IPs experienced difficulty capturing all data from the field for the reporting period within the Mission's deadline.

Recommendation: For indicators where data collection occurs more frequently than reporting to USAID/Malawi, IPs could relax the required field deadline in those periods where reporting to the Mission does not immediately follow. This will give site-level data collectors more time to submit data to the IP and, at the same time, give the IP more leverage to insist that the stricter deadline is adhered to in the periods immediately preceding USAID/Malawi reporting. For example, for an indicator that requires monthly data collection but only quarterly reporting to the Mission, an IP needs to enforce strict field data submission deadlines only in the last month of each quarter (i.e., four times per year) rather than every month.

USAID/Malawi deadlines may prompt IPs to report only partial results, due to limited data available. Given the relatively stringent timeframe in which data is required by the Agency, and the fact that the reporting chain (i.e., Washington, field operating unit, field operating units' IPs) must accommodate many layers, the IPs (the base level in the reporting chain) have had difficulty reporting complete data to USAID/Malawi by the reporting deadline. Often, this was due to a time lag in collecting the information from the field, entering it into an IP's database, and including it in reporting to the Mission. As a result, the joint USAID/Malawi-SI DQA Team confirmed that there have been occasions when the Mission never received updated data reflecting the full results.

Recommendation: USAID/Malawi needs to add a step to its existing IP data submission procedures whereby IPs are required to submit revised data, as applicable, for the previous reporting period. This will allow IPs to provide the Mission with complete and accurate data and will reduce under-reporting in the future.

B. USAID/Malawi DQA Next Steps

The following section focuses on the priority next steps that the Mission should undertake to complete the 2013 DQA process started in September 2013 with the assistance of Social Impact. These steps can also serve as a blueprint for future DQAs to ensure that the data quality requirements are fully met.

SEG and HPN teams will need to complete DQAs for indicators reported by certain IPs not covered by the joint USAID/Malawi-SI team's efforts. The sampling methodology used by the SI DQA team ensured that nearly 100% of the indicators reported by USAID/Malawi in the 2013 PPR received at least one DQA (two environmental indicators reported by one IP did not receive a DQA) and that, when possible and practical, indicators reported by multiple IPs received more than one DQA.

However, indicators remain that require DQAs by USAID/Malawi before being reported to external stakeholders. The USAID/Malawi/SEG team needs to complete seven DQAs before FY 2013 PPR reporting, including:

1. Four indicators reported by CRS (the WALA Prime);
2. Two environment-related indicators reported by TLC Kulera; and
3. One indicator reported by FHI 360 MMAP.

Because the joint USAID/Malawi-SI Teams focused on the HPN indicators and, in most cases, were able to visit more than 50% of the IPs reporting on the same indicator, the SI Team recommends that the USAID/Malawi/HPN team conduct additional DQAs on the five indicators reported by multiple IPs for which the joint USAID/Malawi-SI teams were able to visit only 50% of the relevant IPs.

DQAs conducted by joint USAID/Malawi-SI Teams need to be completed by the COR/AOR/AM and then signed by the Team Leader. The joint USAID/Malawi-SI Teams conducted 137 DQAs across three sectors (Health, Population, and Nutrition; Sustainable Economic Growth; and Education), including indicator-specific action plans. However, the summary section of the DQA checklist, including the overall conclusion regarding the quality of the data and the significance of the data limitations, will need to be completed by the relevant COR/AOR/AM. These sections can only be completed by the COR/AOR/AM because they have the closest management responsibility for the activity-level data sources (the IPs). Once this section is completed, the DQA Checklist, including the action plan, should be reviewed and signed by the DO Team Leader.

Send final DQA reports to IPs and, as necessary, meet with IPs to discuss findings. In order for the IPs to confirm their understanding of the strengths and weaknesses of the data they report to USAID, and to confirm the action tasks agreed upon during the DQA itself, the CORs/AORs/AMs will need to send final copies of the DQAs (with the Action Plans attached) to the IP. Once the Action Plan tasks have been accomplished, each task should be marked as completed, with any further follow-up or documentation of the completed actions inserted into the comment sections. These completed DQAs will also serve as a model for the IPs when they conduct their own internal DQAs.

DQAs of the same indicator collected and reported by multiple data sources must be compared and summarized. According to recent USAID guidance, Project Managers or their designees are responsible for ensuring the comparability of data for the same indicator collected by different mechanisms. 2 For indicators where multiple IPs report data, individual DQAs should be compiled into a summary report. While the format and methodology for this comparison exercise are not specified by USAID guidance, the SI Team suggests that the Project Managers work with the PMPOC to analyze the data quality issues across IPs together. PMs should summarize the aggregate data quality issues either in a summary DQA Checklist for the indicator or, at a minimum, in a narrative on the indicator (a minimal illustrative sketch of one way to tabulate such a roll-up follows below). To support this effort, the SI Team has prepared a summary report of all of the DQAs completed to date, by indicator (see Annex 10). This should provide a basis for summarizing data quality issues related to those performance indicators reported by USAID/Malawi in the PPR.

2 See the Standardized Mission Order on Performance Monitoring (page 10).
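For Missions that record each completed DQA in a simple spreadsheet, the roll-up by indicator referenced above can be produced with a few lines of analysis code. This is only a sketch under assumed column names (indicator, ip, standard, rating); it is not a format prescribed by USAID guidance or by the SI Team.

```python
# Minimal sketch of rolling individual DQA results up to one summary row per
# indicator; the input file and column names are illustrative assumptions.
import pandas as pd

# One row per (indicator, IP, data quality standard) with a numeric rating.
dqas = pd.read_csv("completed_dqas.csv")

summary = (
    dqas.pivot_table(index="indicator", columns="standard",
                     values="rating", aggfunc="min")   # worst rating across IPs
        .assign(ips_assessed=dqas.groupby("indicator")["ip"].nunique())
)
summary.to_csv("dqa_summary_by_indicator.csv")
print(summary.head())
```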

Share the final, approved DQAs of the FY 2013 PPR indicators, with their action plans, with the PDA Office. The PDA is charged with ensuring that the completed DQAs and the completed Action Plans are stored with the Mission-wide PMP.

CORs/AORs/AMs should follow through with the action plans. While some of the actions are assigned to USAID staff (such as confirming data collection methodologies with experts in USAID/Washington bureaus), other tasks were assigned to the relevant IPs. The USAID staff will need to follow up with the IPs to determine whether the tasks have been completed and, if necessary, adjust the deadlines and even the tasks.

USAID/Malawi prime partners (CRS and DAI) should complete DQAs for PPR indicators reported to them by their sub-partners. While the joint USAID/Malawi-SI team focused on visiting all of the CRS and DAI sub-partners, the team was not able to conduct DQAs for all of the indicators reported by the sub-partners. According to the ADS, project/activity implementers, as part of their award, can conduct the assessment, provided that mission staff review and verify DQAs conducted by implementing partners (ADS 203.3.11.3). Once the Primes complete the remaining DQAs, they should submit them to their respective COR/AOR/AM for review and approval.

CORs/AORs/AMs and Primes should jointly review sub-partners' DQA reports and action plans completed by the joint USAID/Malawi-SI Teams and by the Prime. This review can help strategize on how to build the capacity of the subs, identify lessons learned and best practices, and understand general trends in data quality across the consortia or activity team. Additionally, it may prove cost-effective or otherwise beneficial for the Mission to support a cross-IP training to address common data quality weaknesses, and to share best practices, among all of the Mission's IPs.

Review indicators with data quality weaknesses to determine whether their data should be reported to Washington. Because of significant revisions to PPR indicators in the updated FTF Handbook (September 2013 version), some indicators may no longer be relevant to the Mission's activities. The USAID/SEG Team, in particular, will need to discuss these indicators with PDA to make a final determination on whether to report on them for FY 2013. While few of the other indicators that received DQAs had fatal data quality issues, other indicators across the Mission's portfolio may not meet the Mission's credibility standard. These indicators should also be discussed with PDA before a final determination is made.

C. USAID/Malawi Performance Monitoring System

Understanding the quality of the Mission's performance data is critical to making sound decisions, whether at the activity level, the project level, the DO level, or even the goal level of the CDCS. The data quality of performance indicators therefore depends in large part on the quality of the Mission's performance monitoring system, including:

1) The performance management plan (PMP), including comprehensive performance indicator reference sheets (PIRSs);
2) Good communications with data sources (particularly implementing partners);
3) Support by the PDA in providing guidance, formats, and quality assurance and quality control; and
4) The capacity and capability of USAID CORs/AORs/AMs and of the Mission's implementing partners.

The findings and recommendations below focus on supporting the Mission's efforts to update its performance monitoring system as it moves into developing the PMP as part of the new CDCS.

Mission staff are confused about which indicator reference sheets they should be using. USAID/Malawi staff do not consistently use the USAID/Malawi-approved PIRSs for standard or customized indicators within the DOs. As a result, the Mission's own approved PIRSs include out-of-date targets and actuals, have been inconsistently updated to align with changes to required indicators (including changes in definitions, disaggregation, and data collection methodologies), and carry indicator numbers that differ from the standard, PEPFAR, or FTF numbering and/or from the numbering in IP M&E Plans. This has caused considerable confusion among Mission and IP staff about which indicators are currently in use and need to be reported. As a result, the data quality of performance indicators has been negatively affected, particularly validity and reliability (see the sections on VIPRT above).

Recommendation: As USAID/Malawi initiates its new CDCS-related, Mission-wide PMP, it should utilize the Agency-recommended PIRS template so that the Mission's PIRSs will include content that supports better quality performance data. This includes listing the CDCS results statement and/or project purpose in the PIRS, the Foreign Assistance Framework and Presidential Initiative indicator language for standard indicators, and other content that was missing from the old USAID/Malawi DO-level PMP PIRSs. (See Annex 11 for a recommended PIRS model.)

Recommendation: Likewise, the CDCS-related PIRSs should include additional content for standard and Presidential Initiative indicators that contextualizes these indicators to USAID/Malawi's CDCS strategy and project logframes, as well as to the development environment of Malawi. The objective is not to change standard definitions, but rather to operationalize and refine the reference sheets to explain specifically what the Mission intends to do, the relevant disaggregation, appropriate data collection methods, who is responsible for collecting data within the Mission, who is responsible for providing the data to the Mission, and so forth. For example, if the standard definition includes both long-term and short-term training and the Mission only intends to fund short-term training, this distinction should be noted in the operational definition, including how the Mission defines short-term training in number of hours. Linkages to the specific indicator handbooks, including the version being used (for example, the FTF Handbook from September 2013), also need to be included in the PIRS. Implementing these suggestions will ensure that PIRSs are both practical and comprehensive. See Annex 11 for an SI-recommended PIRS template for USAID/Malawi, which is based on the Agency-recommended template and includes additional fields for customization by field missions.

Recommendation: USAID/Malawi and its IPs should use the same indicator numbers for standard and Presidential Initiative indicators as the standard indicator source (e.g., for FTF indicators, use the FTF numbering system). This will reduce confusion among the Mission and its partners and ease the aggregation and reporting process.

Recommendation: PIRSs should be reviewed at least annually to determine whether changes are required or, if not required, useful.
If the Mission identifies a need to revise, drop, or add indicators (e.g., if a new version of the PEPFAR Indicator Handbook has directed changes to a Mission indicator), these changes should be reflected in the PIRS. Once a change has been made and approved, the revised PIRSs should be communicated to the IPs, who should then revise their own PMP PIRSs to align with the Mission's and have these revised PIRSs approved by the relevant COR/AOR/AM. USAID/Malawi should, ideally, not only maintain the current PIRSs in its files but also retain PIRSs from previous fiscal years in order to trace any historical changes that have been made.

USAID/Malawi has sometimes incorporated revised standard PIRSs into the Mission's PMP and IPs' M&E Plans without a careful review. USAID/Malawi, like other field missions trying to comply with new Washington policies, has adopted revised standard PIRSs quickly; however, this may not be in the best interest of the Mission (or the Agency). Often the name of the indicator changes only slightly, suggesting that the original PIRS in use by the Mission can simply be swapped out for the revised version. However, a careful read of the revised PIRS can expose changes in the definition, disaggregation, and even data collection methodology that rule out a simple swap of the old for the new. In addition, it is not clear whether the Mission has been comparing the revised PIRSs against the relevant Results Statement (or Purpose Statement) to ensure that the new indicators still reflect USAID/Malawi's intended results.

Recommendation: Unless Washington mandates immediate adoption of revised PIRSs, it is advisable for USAID/Malawi to review the effect of each revision on its standard performance indicators and how they would be modified, which performance indicators best reflect Mission (and DO, and even project-level) intended results, and when it is appropriate to incorporate new indicators into the Mission's PMP. Questions to ask include:

1) Is the revised definition still applicable to the country focus in Malawi?
2) Are the changes to data collection and analysis methods incompatible with current M&E systems, such that the current systems could not produce the new data?

These are factors that USAID/Malawi should review with the DO/technical offices and, possibly, PDA when determining how and when to incorporate revised performance indicators into the Mission PMP and IP M&E Plans. Proper review and planning will ensure that revisions to the PMP can be absorbed without adversely affecting data quality. Further, because Washington may publish revised performance indicator handbooks in the middle of a fiscal year, the Mission should avoid, if at all possible, incorporating mid-year any new performance indicators and/or new PIRSs that vary significantly from the previous iteration. The Mission should convene sector-specific IP data quality meetings regularly and retain a focus on data quality at activity start-up.

D. USAID/Malawi Data Quality Assurance Planning

Both the ADS and the new Standardized Mission Order on Performance Monitoring require that Missions develop a Data Quality Assurance Plan. 3 The format and the process of developing the Data Quality Assurance Plan are not specified; however, the minimum requirements include:

1) conducting data quality assessments;
2) comparing data quality for the same indicator collected by different mechanisms; and
3) ensuring that the Mission-wide PMP includes a section on data quality assessment procedures that uses a common Mission format for DQAs, identifies a common location for approved DQAs, and identifies Mission-specific procedures and best practices for conducting DQAs.

The PMPOC has the responsibility to ensure that the Mission tracks important findings and follow-up actions from DQAs. The findings and recommendations below are focused on helping USAID/Malawi develop its own data quality assurance system.

Previous DQAs were not focused on individual indicators as required by the ADS.
Many of the previously completed DQAs conducted by USAID/Malawi staff did not assess a specific performance indicator; instead, they assessed an IP's monitoring and evaluation or data management system. These older DQAs did not align with current USAID guidance. As ADS 203.3.11 on Data Quality clearly states: "Once the Mission has selected its indicators for monitoring various levels of program performance, the next step is to verify the quality of the indicator data collected. The goal of the data quality assessment is to ensure that decision makers are fully aware of data strengths and weaknesses and the extent to which data can be trusted when making management decisions and reporting."

3 See ADS 203.3.11 and page 10 of the Mission Order on Performance Monitoring.

Recommendation: As implemented during the joint USAID/Malawi-SI DQAs conducted in October-November 2013 covering 57 indicators, each USAID/Malawi-led DQA should continue to be conducted on a single performance indicator to better understand the extent to which data are defined, collected, and analyzed in accordance with the standard PIRS (or, for a custom indicator, the Mission-wide PIRS). During this assessment, the IP's (or other data source's) broader performance management system will also be reviewed as it relates to that specific performance indicator. However, because the SI team received feedback from both IPs and USAID staff on the utility of reviewing IPs' M&E and data management systems, SI has included in Annex 11 two models to help the Mission assess the data management systems of the IPs. The optimal time to conduct these data management (or M&E) systems assessments is soon after start-up of an activity by an IP, and before a DQA has been conducted.

Improper documentation of the completion of actions or follow-up of DQA findings. A review of previously completed DQAs (FY 2011 and FY 2012) found that most had identified follow-up actions for either the IP or the Mission. However, the paper (or audit) trail of how and when those actions were completed, and by whom, was not included in the documentation shared with the SI Team. As stated in the recent Standardized Mission Order on Performance Monitoring, Project Managers or their designees should ensure that the COR/AOR/AM is on track for conducting DQAs and following up on corrective actions. 4 Additionally, the USAID/Malawi PMPOC has the responsibility to ensure that the Mission tracks important findings and follow-up actions from DQAs. 5

Recommendation: Although the recommended DQA Checklist format includes a section on actions needed to address limitations prior to the next DQA, SI's experience is that the follow-up actions taken by the IPs and USAID are frequently not documented, resulting in a potential compliance risk. Therefore, the SI Team introduced, as part of the DQA methodology developed for USAID/Malawi, a separate action plan for use by both the IP and the Mission. The USAID COR/AOR/AM and the highest-ranking manager of the IP should sign the indicator-specific action plan at the end of the DQA. This not only increases transparency and ownership of the agreed follow-up tasks; the signed form itself also helps the Mission document that actions have been completed. USAID/Malawi should continue to use these separate action plans, documenting the completion of each action and maintaining a copy of the completed action plan with the relevant completed DQA checklist. These action plans (with the relevant completed DQA checklists) should then be reviewed by the PMPOC at least annually to confirm that the action tasks have been completed, and so that the Office of Program Development and Analysis (PDA) can assess common data quality issues and actions across the Mission's portfolio.

Potential weaknesses of DQAs conducted in FY 2012.
Many of USAID/Malawi's previous DQAs were not based on the strengths and weaknesses of data pertaining to specific performance indicators. As a result, there is a compliance risk for those DQAs conducted in FY 2012, as these indicators were not included in the joint USAID/Malawi-SI DQAs conducted in October and November 2013.

4 See the USAID Standardized Mission Order on Performance Monitoring (page 10).
5 Ibid.

Recommendation: The PDA staff (potentially the Performance Monitoring Point of Contact, or PMPOC), working in collaboration with the technical offices, should review the FY 2012 DQAs to identify which completed assessments were not indicator-specific. For those indicators, USAID/Malawi should determine whether follow-up DQAs are needed and should schedule any needed assessments accordingly to reduce compliance risks.

The schedules for conducting previous FY DQAs resulted in a management burden during critical times in the Mission's workplan. While the Mission explained to the SI Team that DQAs have been conducted on a rolling basis in recent years, 6 a review of the previous annual DQA Summary Reports indicates that the majority of assessments took place during limited time periods in the year. Most of the previous DQAs were conducted during the first quarter of the fiscal year, when Missions are typically busy with portfolio reviews, preparation for PPR reporting, and other planning and review processes.

Recommendation: Given the increasing number of performance indicators that USAID/Malawi reports to Washington through the PPR and Presidential Initiatives (notably PEPFAR and FTF), it is recommended that the Mission conduct DQAs on a quarterly basis. Additionally, DQAs should be conducted for new awards soon after their initial performance reporting has begun. This will help both the IP and the COR/AOR/AM make immediate adjustments to improve the data quality of the reported indicators. Multiple periods for conducting these assessments would distribute the work more evenly throughout the year, making it more manageable and reducing the burden on Mission staff.

Past DQA Teams did not always leverage the most relevant staff, such as the COR/AOR/AMs. In previously reviewed DQA documentation, the USAID/Malawi staff teams conducting the assessments did not always include the COR/AOR/AM. The COR/AOR/AM is responsible for conducting the DQAs for indicators in their activity's or implementing mechanism's (IM's) M&E plan (or activity-level PMP) that will be reported externally; the COR/AOR/AM should therefore be a required member of the DQA team. 7 During the joint USAID/Malawi-SI DQAs, the COR/AOR/AMs' expert knowledge of the activities/IMs was invaluable in adding substance and context to the DQA findings and the identification of actions. Additionally, the contributions of USAID staff members from the PDA, other Technical Offices, and Support Offices (particularly Finance) were of consistently high value.

Recommendation: USAID/Malawi should augment the COR/AOR/AM-led DQA Teams with two to three additional Mission team members to leverage skills and knowledge from throughout the Mission, add additional points of view and ideas, and continue to build staff skills in performance management, raising the overarching capacity of the Mission.

Criteria for acceptable performance indicator data quality need to be established. The Mission does not have criteria for what constitutes acceptable data quality based on the five data quality standards and, therefore, does not have an established procedure to review performance indicators when data quality issues arise. The revised ADS states that the major decision point in conducting a data quality assessment is to determine what level of data quality is acceptable (see 203.3.11.3).
6 SI Team discussions with USAID/Malawi staff during KIIs and while conducting DQAs, October 2013.
7 See the Standardized Mission Order on Performance Monitoring (page 10).

Recommendation: The acceptable level should be customized to the Mission's understanding of the availability of data within Malawi, the capacity of local and other organizations to collect good quality data, and the costs and time needed to collect better quality data. USAID/Malawi should ideally determine one set of Mission-wide criteria that defines acceptable data quality (versus varying criteria by DO or technical office). When a performance indicator does not meet these criteria, this should trigger a review of the indicator, its data, and the identified data quality concerns to determine whether the data should be:

a) included in reporting to USAID/Washington (or other external parties) in their entirety, with a narrative explanation noting data quality weaknesses;
b) withheld from reporting to USAID/Washington in their entirety, due to the significance of the data quality weaknesses; or
c) partially included in reporting to USAID/Washington, with a narrative explanation of what data has been reported, what has been withheld, and why.

These triggered reviews should include the COR/AOR/AM together with the DO/technical office team leader and/or Project Manager in reviewing and approving the decision on how, or whether, to report the data in question. PDA, as the office in charge of M&E for the Mission, should periodically review the DO/technical office and project team determinations. See Annex 11 for examples of rating and threshold systems that could serve as part of the Mission's Data Quality Assurance Plan, including the acceptable level of data quality.
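To make the idea of a threshold-triggered review concrete, the sketch below applies a per-standard floor to an indicator's DQA ratings and maps the result to the three reporting options above. The five-point scale, the floor of 3, and the rule for withholding are illustrative assumptions only; the Mission would substitute its own agreed criteria.

```python
# Minimal sketch of a Mission-wide acceptability check; the rating scale,
# the per-standard floor, and the withholding rule are assumptions.
STANDARDS = ("validity", "integrity", "precision", "reliability", "timeliness")
ACCEPTABLE_MINIMUM = 3  # floor on a 1 (poor) to 5 (strong) scale

def reporting_decision(ratings):
    """Return the review outcome for one performance indicator."""
    failing = [s for s in STANDARDS if ratings.get(s, 0) < ACCEPTABLE_MINIMUM]
    if not failing:
        return "report fully; no data quality caveat required"
    if len(failing) >= 3 or min(ratings[s] for s in failing) == 1:
        return "withhold pending review: " + ", ".join(failing)
    return "report with a narrative explanation of weaknesses in " + ", ".join(failing)

print(reporting_decision(
    {"validity": 4, "integrity": 5, "precision": 2,
     "reliability": 4, "timeliness": 3}))
```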
E. USAID/Malawi Communication with Implementing Partners

The following findings and recommendations focus on the instructions that USAID/Malawi should provide to its implementing partners. Such direction will not only improve the overall data quality of the IPs' performance indicators, but may also help reduce the management burden on CORs/AORs/AMs for data quality-related tasks.

USAID/Malawi IPs have typically not been provided with Mission-level PIRSs. The SI Team could not determine the extent to which USAID/Malawi has conducted meetings for the development and refinement of an IP's M&E Plan, or for ensuring that this plan aligns with the Mission's broader PMP. While the Mission often provides standard PIRSs to IPs (especially those supporting Presidential Initiatives), these PIRSs have not been operationalized at the Mission level.

Recommendation: USAID/Malawi should continue to focus and expand on IPs' M&E Plans and overall data quality starting at activity inception (i.e., during the kickoff meeting) and continuing throughout the life of the activity. At the activity kickoff meeting (or even before), the Mission should provide the IP with Mission-level PIRSs for all standard and custom indicators (i.e., operationalized PIRSs that have more defined, USAID/Malawi-relevant descriptions, noting which USAID/Malawi and IP staff have data collection/reporting responsibilities). This will help ensure that the IP understands how these indicators fit into the Mission's PMP, providing a larger perspective that will contribute to more integrated and informed performance management systems.

Recommendation: USAID/Malawi should provide IPs with timely, updated PIRSs to reflect new performance indicators, regardless of whether these updates originate from the Mission or Washington. However, it is not advisable to provide updated PIRSs with significant changes to definitions, data collection/analysis methods, or other elements mid-course in the fiscal year, as this could compromise data consistency and reliability.

Recommendation: USAID/Malawi CORs/AORs/AMs, as well as IP COPs, should be sensitized to the requirement to regularly review and modify IPs' M&E plans, and to do so in alignment with revisions to the Mission's PMP.

The Mission and several IPs worked from outdated PMP/M&E Plans, or were reporting data that did not adhere to their approved PMP/M&E Plans. To streamline requests for Mission approval, it is recommended that IPs submit only the revised sections of their M&E Plans (typically the revised PIRS) for approval. This reduces the management burden on both the IP and the USAID/Malawi COR/AOR/AM, while still ensuring that practical and important changes to the performance indicators are acknowledged and approved.

USAID/Malawi IPs are confused about how to conduct DQAs. In addition to the above discussion of DQAs being used to assess an IP's broad performance management system rather than targeting specific performance indicators, some IPs share this confusion. During KIIs and joint USAID/Malawi-SI DQA sessions, IPs mentioned that they were confused by these prior DQAs because their indicators were not the focus of the assessments. Unfortunately, some IPs have internalized the structure of these DQAs and have been conducting internal reviews of their own and their sub-awardees' systems as part of their quality control procedures. While these system-wide reviews can be useful and lead to improvements in data quality, they do not qualify as DQAs.

Recommendation: Since some of these IP-led DQAs have not focused on specific indicators, IPs need to institute more rigorous assessments of the quality of the data for the indicators that are reported to USAID. In addition, the Mission would benefit from regular IP meetings to review data quality, continue to train and sensitize IP staff on the data quality standards and on how to prepare for and conduct DQAs, and identify best practices for improving data quality.

USAID/Malawi and IPs are unaware that all of the indicators in their approved PMPs are auditable. Because the PMPs developed by the IPs are contract or award deliverables, and because the IP's respective COR/AOR/AM approves the PMPs, all of the indicators included in an IP PMP can be audited. Likewise, all of the indicators in the Mission-wide approved PMP can be subject to a performance audit, whether or not these indicators are reported externally. This finding was confirmed by a review with USAID/Malawi staff of the OIG audit findings on the Mission's agricultural programs completed in July 2013. 8

Recommendation: USAID/Malawi and all of its IPs should conduct DQAs of all of their approved PMP indicators, including those indicators not reported by USAID to external stakeholders. Even if the Mission conducts DQAs of the IP indicators that are reported externally, it is a best practice for the IPs to conduct their own DQAs prior to the Mission's efforts. Because these DQAs must be reviewed by the COR/AOR/AM (see the next step on page 26 above, "USAID/Malawi prime partners (CRS and DAI) should complete DQAs for PPR indicators reported to them by their sub-partners"), the Mission can potentially accept the IP's own DQA in lieu of a Mission-managed DQA. This reduces the Mission's management burden. For the same reason, Prime IPs should conduct DQAs of their sub-partners' indicator data that is reported up to the Prime, and then reported by the Prime to USAID. If the COR/AOR/AM does not conduct the DQA, they are responsible for certifying the DQA, once done, and for addressing findings with the implementer and ensuring that corrective actions are taken. 9

8 See the Audit of USAID's Agricultural Programs in Malawi, Audit Report No. 4-612-13-010-P, July 30, 2013 (Office of Inspector General: Pretoria, South Africa).
9 See the Standardized Mission Order on Performance Monitoring (page 10).

USAID/Malawi's IPs stated that they need more resources in order to improve the verification of performance data and to improve their M&E systems related to data quality. Whether this is true is beyond the scope of Social Impact's assignment. However, the DQA process conducted in October and November 2013 by USAID/Malawi has sent a clear message to IPs about the importance of data quality when reporting performance information to USAID.

Recommendation: Reinforcing the message to IPs on the importance of data quality, and building systems, procedures, and expertise within the IPs for data quality, is key to sustaining the focus on improving performance information. ADS 203.11.1 states that data quality should be included in the scope of work of any solicitation for project/activity implementation. This suggests that the Mission could include requests for creative approaches to improving data quality around the five standards, as well as making data quality a key part of proposed M&E systems. This would include a deliberate requirement to address data quality in the awardee's PMP or activity M&E plan deliverable.

IV. BEST PRACTICES

USAID/Malawi DQA team verifies data against the data source. The SI Team identified a potential best practice for conducting DQAs in the Report on Data Quality Assessment for Malawi Teacher Professional Development Support Activity (March 2012). This DQA, conducted by the COR/AOR and the M&E point of contact in the Education Office at USAID/Malawi, not only reviewed the performance data reported to USAID against the five data quality standards, but also included several site visits along the performance indicator data chain. As a result, the Mission's DQA team identified several issues and action steps related to data collection. When interviewed about the utility and effectiveness of the DQA, the sub-awardee (now a prime under the new activity) stated that they were not only able to improve the old activity's reporting, but were also incorporating lessons learned into their current M&E systems in response to the prior data quality weaknesses. The SI Team is not identifying this 2012 DQA as a definitive best practice because the DQA checklists were not available for review; the SI Team therefore cannot be sure that the DQA focused on performance indicators rather than data management systems.

Double-entry database systems. HPN IP Dignitas International has developed a double-entry database system that greatly increases confidence in data quality, particularly with regard to data reliability and precision. Double-entry systems are redundant, since they require entering all of the data twice, but for those IPs with sufficient resources they provide a strong boost to data quality. Dignitas International's system requires double entry and then automatically identifies discrepancies that the appropriate personnel can review and correct.

Summary performance dashboards provide a visual overview at a glance. HPN implementer Partners in Hope (PIH) has created site-level performance dashboards (see Table 2). For the activity's 57 sites (comprising health clinics and hospitals in six districts), PIH has found it helpful to review these dashboards on a quarterly basis. The dashboards are composed of the USAID/Malawi performance indicators (as well as other, internal management indicators) that are most relevant for assessing site performance, with actual values included by quarter for the fiscal year. Actuals are color-coded to provide a visual overview of whether performance is below, at, or above planned targets (a minimal sketch of this color-coding logic appears at the end of this section). A PIH team member noted that the dashboards have been the basis for rich, results-oriented discussions with site staff, often lasting one to two hours.

Table 2: Nkhoma Hospital Dashboard 2013

Presently, PIH is using dashboards only at the site level; however, the Mission's IPs could apply similar dashboards at both the site and activity levels. In addition, USAID/Malawi could incorporate performance dashboards for CORs/AORs/AMs to track IP performance, as well as for DO and technical teams (and broader groups) to track project and program performance (especially useful during Mission portfolio reviews). While the concept of performance dashboards is a best practice, several improvements could be incorporated into the PIH format, including: 1) inserting a header that clearly states the site name and the date of the report; and 2) including targets so that the viewer can make better comparisons with the reported actuals.

Data documents for field-level data collectors, with triplicate carbon copies. SEG/WALA implementing partner TLC identified a problem with obtaining copies of the signatures of participants in trainings, meetings, field days, and other events. Often, because the participants received a small stipend, the Finance team needed the original signatures, and additional copies were frequently mislaid. As a solution, the TLC staff developed a data collection Notebook for all field-level data collectors. This Notebook uses a form that includes all of the required individual disaggregation related to their performance indicators and documents participation in triplicate: the original copy goes to Finance, a second copy goes to M&E, and a third copy goes to the sub-activity manager. This type of data collection tool is easy to use, cost-effective, and could easily be replicated by other USAID/Malawi IPs.

Documenting commodity distribution. Another SEG/WALA implementing partner, World Vision, has developed a system for documenting the warehousing, transportation, distribution, and impact of commodity programs. Every step of the process is documented and maintained in well-organized, labeled binders that contain original registrations and other key source documents. Indeed, this sub-activity has been the subject of multiple clean audits, including audits by the RIG. This documentation system for source data is a best practice that should be shared with other USAID/Malawi IPs managing commodity distribution programs.
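As referenced in the dashboard discussion above, the status flag behind such color-coding reduces to a simple comparison of each actual against its target. The sketch below is illustrative only: the indicator names, values, and 90% tolerance band are assumptions, not figures drawn from the PIH dashboards.

```python
# Minimal sketch of dashboard color-coding: classify each quarter's actual
# against its target. Thresholds and example figures are assumptions.
def status(actual, target, tolerance=0.9):
    """Classify performance as below, near, or at/above the planned target."""
    if actual >= target:
        return "GREEN (at/above target)"
    if actual >= tolerance * target:
        return "YELLOW (near target)"
    return "RED (below target)"

site_quarter = {
    "Clients newly initiated on treatment": (78, 100),
    "Clients retained at 12 months": (95, 90),
}
for indicator, (actual, target) in site_quarter.items():
    print(f"{indicator}: {actual}/{target} -> {status(actual, target)}")
```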

ANNEXES

1. Scope of Work
2. List of Key Informant Interviewees (KIIs)
3. List of training participants (both IP and USAID participants)
4. DQA Training Slides, Debriefing Slides (PowerPoint)
5. Sample DQA Training Invitation
6. List of exit debriefing participants
7. DQA participants
8. List of Secondary Data Source findings
9. Updated list of USAID/Malawi partners, with contact information
10. Performance Indicator-IP Coverage Tables (by Sector/Technical Office)
11. Updated Crosswalk of Indicators and Partners/Data Quality Issues Summary Matrix (by Performance Indicator and IP/Project)
12. Recommended Tools (Summary DQA Rating Models, PIRS Template, etc.)