Page 1: Data Quality Assessment Wednesday, June 10, 2015 USAID/Jordan.

Data Quality Assessment

Wednesday, June 10, 2015

USAID/Jordan

Page 2: Data Quality Assessment Wednesday, June 10, 2015 USAID/Jordan.

SESSION GOALS

• By the end of this session, participants should:

– Understand the reasons for conducting a Data Quality Assessment (DQA)

– Be familiar with the processes and best practices for conducting a DQA

Page 3: Data Quality Assessment Wednesday, June 10, 2015 USAID/Jordan.

DATA QUALITY ASSURANCE

• Data informs decisions across the Program Cycle

• High quality data is the cornerstone for evidence-based decision-making

• USAID’s credibility when communicating and reporting requires a realistic understanding of the limitations of data

Page 4: Data Quality Assessment Wednesday, June 10, 2015 USAID/Jordan.

TRUE OR FALSE?

• Data Quality – Key Trade-Offs

– Quality vs. Cost

– Between dimensions of quality

The goal of data quality assessment is to eliminate all data limitations

FALSE

Performance data should be as complete and consistent as management needs and resources permit

Page 5: Data Quality Assessment Wednesday, June 10, 2015 USAID/Jordan.

DATA QUALITY ASSESSMENT (DQA) (ADS 203.3.11)

What is it? What is its purpose?

• A process to ensure that the Mission, DO team, and IPs are aware of the:

– Strengths and weaknesses of data (vs. the five data quality standards), and

– Extent to which data integrity can be trusted to influence management decisions

Page 6: Data Quality Assessment Wednesday, June 10, 2015 USAID/Jordan.

STANDARDS FOR DATA QUALITY (VIPRT) (ADS 203.3.11.1)

• Validity: data (and the indicator) clearly and adequately represent the intended result

• Integrity: data have safeguards to minimize the risk of transcription error or data manipulation

• Precision: data have a sufficient level of detail to permit management decision-making

• Reliability: data reflect stable and consistent data collection processes and analysis methods over time and across sites/partners

• Timeliness: data are available at a useful frequency, are current, and are timely enough to influence management decision-making
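Purely as an illustration (this is not the ADS 203 checklist format), findings against the five standards can be tracked per indicator in a simple record so that issues and follow-up items stay organized; every name below is hypothetical.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# The five VIPRT standards named in the deck.
STANDARDS = ["validity", "integrity", "precision", "reliability", "timeliness"]

@dataclass
class IndicatorDQARecord:
    """Hypothetical per-indicator record of DQA findings against the VIPRT standards."""
    indicator: str                        # e.g. "Number of participants trained" (illustrative)
    implementing_partner: str             # IP whose data were assessed
    ratings: Dict[str, bool] = field(default_factory=dict)    # standard -> True if met
    comments: List[str] = field(default_factory=list)         # narrative notes on issues found
    recommendations: List[str] = field(default_factory=list)  # agreed follow-up actions

    def standards_needing_follow_up(self) -> List[str]:
        """Return the standards not yet confirmed as met."""
        return [s for s in STANDARDS if not self.ratings.get(s, False)]
```

A record like this is only a convenience for tracking findings; the documentation requirement itself is the completed DQA checklist described later in the deck.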

Page 7: Data Quality Assessment Wednesday, June 10, 2015 USAID/Jordan.

DATA QUALITY ASSESSMENT (DQA)

What is it not?

• An audit

– although reported data are auditable and are routinely audited

• An exercise that seeks to find fault and/or place blame

• Something to fear

DQA is an opportunity for USAID and IPs to understand results better and make improvements

Page 8: Data Quality Assessment Wednesday, June 10, 2015 USAID/Jordan.

WHO IS RESPONSIBLE?

COR/AOR/AM is responsible for conducting DQAs with the participation of the PRO M&E Specialists, for indicators in their AMEPs.

WHICH INDICATORS?

Those reported externally to Washington.

However, “while managers are not required to conduct DQAs on all performance data, they should be aware of the strengths and weaknesses of all indicators they collect to monitor performance.”

ADS 203.3.11.2

Page 9: Data Quality Assessment Wednesday, June 10, 2015 USAID/Jordan.

WHEN?

• For externally reported indicators, a DQA must occur within six months before initial results are reported, and at least every three years thereafter

• In addition, a number of circumstances might prompt a manager to conduct a DQA, such as:

– Critically or strategically important data

– Potential indicator data issues

– Previous DQA follow up

Page 10: Data Quality Assessment Wednesday, June 10, 2015 USAID/Jordan.

COMMON DATA QUALITY ISSUES ENCOUNTERED

• Lack of documentation (particularly PIRS)

• IP-collected data do not match the USAID definition

• Double-counting

• Untimely data

• Inconsistent/missing data

• Biased data collection procedures

• Data procedures are not clear to all staff

• Data procedures/tools are not standardized/consistent across activities, sites, and/or over years

• Data transcription/calculation errors

• Unclear/inconsistent inclusion/exclusion criteria
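Several of these issues (double-counting, missing data, untimely data, simple calculation errors) can be screened for programmatically before or during a DQA visit. A minimal sketch, assuming a hypothetical beneficiary-level dataset; the file name, column names, reporting period, and reported total are all illustrative, not from the deck.

```python
import pandas as pd

# Hypothetical IP dataset: one row per trained beneficiary (columns are illustrative).
records = pd.read_csv("ip_beneficiary_records.csv", parse_dates=["training_date"])

# Double-counting: the same person reported more than once for the same indicator.
duplicates = records[records.duplicated(subset=["national_id"], keep=False)]

# Missing data: key fields left blank.
missing = records[records[["national_id", "site", "training_date"]].isna().any(axis=1)]

# Untimely data: activity dates falling outside the reporting period.
period_start, period_end = pd.Timestamp("2014-10-01"), pd.Timestamp("2015-09-30")
untimely = records[~records["training_date"].between(period_start, period_end)]

# Calculation check: does a recount of source records match the total the IP reported?
reported_total = 1250  # value taken from the IP's quarterly report (illustrative)
recounted_total = len(records.drop_duplicates(subset=["national_id"]))

print(f"Duplicate rows: {len(duplicates)}")
print(f"Rows with missing key fields: {len(missing)}")
print(f"Rows outside the reporting period: {len(untimely)}")
print(f"Reported {reported_total} vs recounted {recounted_total}")
```

Checks like these supplement, rather than replace, the document review and interviews described in the following slides.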

Page 11: Data Quality Assessment Wednesday, June 10, 2015 USAID/Jordan.

STEPS FOR CONDUCTING DQAS

Plan

• Communicate with IPs

• Request/review Mission and IP documents

• Identify specific areas of focus to discuss with IP

Conduct

• Visit IP/field offices

• Examine data collection process

• Verify data compliance against VIPRT

Analyze, Document, and Follow Up

• Complete DQA checklist

• Share

• Implement and monitor recommendations

Page 12: Data Quality Assessment Wednesday, June 10, 2015 USAID/Jordan.

PLAN

• Communicate with partners/stakeholders

– What are the indicator(s)?

– When will the DQA visit be?

– Who needs to be there?

– What background documents are needed?

– How can/should IP prepare?

• Review the DQA checklist and prepare questions for how you will get the answers

Remember: the DQA checklist is not an interview guide!

Page 13: Data Quality Assessment Wednesday, June 10, 2015 USAID/Jordan.

PLAN

• Review relevant Mission and IP documents ahead of time

– PMP and AMEP, including PIRS (Mission and IP); for new indicators, have the IP complete the PIRS before the DQA

– All reports to USAID in which performance data were reported (quarterly, annual, others)

– IP M&E guidelines/SOPs related to different stages of data handling (collecting, monitoring, assessing, sampling, etc.)

– Previously conducted DQA reports (Mission and IP)

• Identify particular areas of focus

Page 14: Data Quality Assessment Wednesday, June 10, 2015 USAID/Jordan.

EXERCISE (20 minutes)

Review the PIRS to plan for the DQA and identify areas of specific focus during the DQA:

• Definitions

• Calculation challenges

• Availability of data

• Potential bias

• Other issues …

Page 15: Data Quality Assessment Wednesday, June 10, 2015 USAID/Jordan.

CONDUCT

• Conduct IP office visits (and to field offices when possible)

• Include IP staff involved in data handling at all stages

• Ask questions! Lots of them (system-level vs. indicator-specific)

• Review lots of documents! Data verification materials (original sign-in sheets, score cards, photos, inventory records, activity reports, etc.)

• Assess compliance of reported data with the documented definition

• Assess the data collection processes and obtain a clear picture of how data are managed from source to report

• Track a reported result/value back through the system to replicate it (see the sketch after this list)

• Compare data: in different reports, field vs reported, and hard copy vs electronic copy

• Assess the data storage system

• Obtain copies of documents/data you review or spot-check

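One way to make the "track a reported result back through the system" and "field vs. reported" comparisons concrete is sketched below; the file names, column names, and aggregation rule are hypothetical, not part of the deck.

```python
import pandas as pd

# Hypothetical inputs: the IP's electronic database export and the totals USAID received.
source = pd.read_csv("ip_database_export.csv")          # row-level source records
reported = pd.read_csv("quarterly_report_totals.csv")   # indicator, site, reported_value

# Replicate the reported value from source records, by indicator and site.
recomputed = (
    source.groupby(["indicator", "site"], as_index=False)
          .agg(recomputed_value=("beneficiary_id", "nunique"))
)

# Compare the field-level recomputation against what was reported to USAID.
comparison = reported.merge(recomputed, on=["indicator", "site"], how="outer")
comparison["difference"] = comparison["reported_value"] - comparison["recomputed_value"]

# Flag anything that cannot be replicated exactly, for discussion during the DQA visit.
discrepancies = comparison[comparison["difference"].fillna(0) != 0]
print(discrepancies)
```

Any discrepancy flagged this way is a conversation starter with the IP, not a finding on its own.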

Page 16: Data Quality Assessment Wednesday, June 10, 2015 USAID/Jordan.

EXERCISE (20 minutes)

Develop questions to address the DQA checklist or other items.

Using the same PIRS from the previous exercise and the DQA checklist, identify and practice specific questions that you might want to ask your IP during the DQA to get more information.

Remember: the DQA checklist is not an interview guide.

Page 17: Data Quality Assessment Wednesday, June 10, 2015 USAID/Jordan.

If you want to know: Does the indicator actually measure what you want it to measure?
Ask the partner: Is this indicator useful for monitoring your activity’s intended results? Why or why not?

If you want to know: What procedures are used for data collection and analysis?
Ask the partner: Describe how data are managed from the point of collection to final reporting, and show me the different tools along the way.

If you want to know: Are there systems to ensure consistent collection, analysis, and reporting of data?
Ask the partner: Who are all of the people involved in data collection and reporting? Are there different field offices or sub-contractors involved? How do you ensure consistency among different staff/offices?

If you want to know: What is the quality of the data entry system used to record and maintain data?
Ask the partner: How are data checked for entry, transcription, or math errors?

If you want to know: Do data reflect the current situation?
Ask the partner: How long does it take to go from data collection to reporting?

Page 18: Data Quality Assessment Wednesday, June 10, 2015 USAID/Jordan.

ANALYZE, DOCUMENT, AND FOLLOW UP

• Complete the DQA checklist for each indicator and IP

– Not just to check the boxes. Write lots of comments!

– Assess the findings and determine if improvement is needed

– Write up recommendations and action items for improving the quality of data where applicable

– Solicit and incorporate IP responses into the completed checklist

– Attach all necessary reviewed supporting documents

• Debrief the IP and agree on an action plan, schedule, and responsibilities

• Share the completed checklist and report with PRO

• Implement and monitor recommendations

Page 19: Data Quality Assessment Wednesday, June 10, 2015 USAID/Jordan.

COMMON DQA QUESTIONS

• How do I document DQA for indicators reported by multiple activities?

• How many pieces of data or documents should be reviewed?

• Should reported data be corrected if issues are found, and how?

• What if I don’t know which DQ standard an identified issue falls under?

• What if I don’t know as much about M&E methods as the IP?

• Should IPs do their own DQAs, and how should they be used in USAID DQAs?

Page 20: Data Quality Assessment Wednesday, June 10, 2015 USAID/Jordan.

DQA TOOLS AND REFERENCES

• ADS 203

• PMP Toolkit (Part 2, Module 7)

• DQA checklist

• A Step by Step DQA Planning and Implementation Guide

Page 21: Data Quality Assessment Wednesday, June 10, 2015 USAID/Jordan.

Thank You

