Date posted: 20-Jan-2016
Uploaded by: byron-carter
Data Processing &
Data Quality
A virtuous cycle

The implicit assumptions underlying information systems are twofold: first, that good data, once available, will be transformed into useful information which, in turn, will influence decisions; second, that such information-based decisions will lead to a more effective and appropriate use of scarce resources through better procedures, programmes, and policies, the execution of which will lead to a new set of data which will then stimulate further decisions, and so forth in a spiral fashion.

(Sauerborn 2000, in Lippeveld et al., Design and Implementation of Health Information Systems)
Information Cycle

[Diagram: the information cycle, showing stages, tools, and outputs – What do we collect? (data sources & tools, yielding timely quality data); How do we process it? (data quality checks & analysis, yielding information); How do we present it?; How do we use it? – leading to reliable information.]
Ensuring data accuracy

Once data has been collected, it should be checked for any inaccuracies and obvious errors. Ideally this should be done as close to the point of data collection as possible:
– Identify the cause
– Prevent future errors
Remember Johan’s little investigation
Why is checking data vital?

Use of inaccurate data leads to:
– Wrong priorities (focus on the wrong problems)
– Wrong decisions (not applying the right actions)
– Garbage in = garbage out

Producing data is EXPENSIVE:
– Collecting poor data wastes resources and time
Data, in order to be useful, should be:
RELIABLE: correct, complete, consistent
TIMELY: fixed deadlines for reporting
AVAILABLE: who reports to whom? feedback mechanisms
ACTIONABLE: no action = throw the data away
COMPARABLE: same numerator and denominator definitions used by all (e.g. geography vs org. unit function)
Complete data?

Geography: submissions from all (or most) reporting facilities
Time: can you do analysis over time? Is reporting consistent across periods?
Coverage: do your services cover the full population? Many indicators depend on population figures as denominators.
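Completeness along the geographic dimension is usually summarised as a reporting rate. As a minimal sketch (the function name and the example counts are illustrative, not from the slides):

```python
def completeness_rate(reports_received, facilities_expected):
    """Share of expected facility reports actually received, as a fraction.

    Completeness = reports received / facilities expected to report.
    """
    if facilities_expected <= 0:
        raise ValueError("expected facility count must be positive")
    return reports_received / facilities_expected

# Hypothetical district: 46 of 50 facilities submitted this month.
print(f"{completeness_rate(46, 50):.0%}")  # 92%
```

The same ratio, computed per period, also supports the time dimension: a sudden drop in the reporting rate points to missing submissions rather than a real change in the data.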
Correct data?

Are we even collecting the right data?
Does the data seem sensible/plausible?
Is the same definition applied uniformly?
Is the handwriting legible?
Are preferential end digits being used?
PREFERENTIAL END-DIGITS

JAN   FEB   MAR   APR   MAY   JUN   JUL
250   230   240   220   230   240   250

(Every value ends in 0 – a sign of rounding or guessing rather than actual counting.)
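End-digit preference is easy to screen for mechanically: tally the final digit of each reported value and look for a skew toward 0 and 5. A minimal sketch, using the monthly figures from the table above:

```python
from collections import Counter

def end_digit_counts(values):
    """Count the final digit of each reported value.

    A heavy skew toward 0 (and 5) suggests rounded or invented
    figures rather than real counts.
    """
    return Counter(v % 10 for v in values)

monthly = [250, 230, 240, 220, 230, 240, 250]  # values from the slide
print(end_digit_counts(monthly))  # all seven values end in 0
```

With genuinely counted data, the end digits 0-9 should appear roughly uniformly; a distribution like the one above is a strong prompt to visit the facility and check the registers.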
Consistent data?

Data in a similar range to this time last year, or similar to comparable reporting organisation units
No large gaps or missing data
No multiplicity of data (same data from multiple sources – which one to trust?)
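The "similar range to last year" check can be sketched as a simple tolerance rule. The 25% threshold below is an illustrative assumption, not a value from the slides:

```python
def is_consistent(value, same_month_last_year, tolerance=0.25):
    """Return True if the value is within +/- tolerance (default 25%)
    of the same month last year; False means: investigate."""
    if same_month_last_year == 0:
        return value == 0
    change = abs(value - same_month_last_year) / same_month_last_year
    return change <= tolerance

print(is_consistent(240, 250))  # True: a 4% change is plausible
print(is_consistent(480, 250))  # False: a 92% jump needs checking
```

A failed check does not prove the value is wrong – an outbreak or a campaign can double a figure legitimately – it only flags the value for follow-up.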
Timely data?

Some data needs to be acted upon immediately.
Late reports weaken the potential for comparison, and action can come too late – though late data is still useful for documenting trends.
Accuracy-enhancing principles

Capacity building through training (90% of HISP activities)
User-friendly collection/collation tools
Feedback on data errors (but not only errors!)
Feedback of analysed information
Local use of information
Controlling quality with DHIS2

Maximum / minimum values
Validation rules
Validation checks/reminders
Completeness and timeliness reports (input for a league table?)

– Will be covered in the lab session –
Minimum and Maximum Values

[Chart: monthly reported numbers, Jan–July, plotted between a minimum and a maximum line (y-axis: number, 0–3000).]
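The min/max check in the chart amounts to flagging any value that falls outside a configured band. A minimal sketch of the idea (the thresholds and data are hypothetical, not DHIS2's actual API):

```python
def min_max_violations(series, minimum, maximum):
    """Return (period, value) pairs falling outside [minimum, maximum].

    Mirrors the idea behind DHIS2's min/max value check: values
    outside the expected band are flagged for review, not rejected.
    """
    return [(period, v) for period, v in series
            if v < minimum or v > maximum]

data = [("Jan", 250), ("Feb", 230), ("Mar", 2400), ("Apr", 220)]
print(min_max_violations(data, minimum=100, maximum=500))
# [('Mar', 2400)] - likely a data-entry error (an extra zero)
```

In DHIS2 the min/max bounds can be set per data element and organisation unit, so the band reflects what is plausible for each facility rather than one national threshold.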
Good data quality: 10 steps to achieve it

1. Small, essential data set (EDS)
2. Use of data locally by the collectors
3. Clear definitions – standards
4. Careful collection and collation of data – good tools
5. Sharing of information
6. Regular feedback
7. Supportive supervision – at all levels
8. Ongoing capacity building through training and support
9. Regular discussion of information at facility team meetings
10. Monitoring & rewarding good information (league table)
…or else, a vicious cycle:

Limited capacity to manage or analyse data
Using evidence not perceived as a winning strategy
Data not trusted
Weak demand for data
Weak HIS
Poor data quality
Limited investment in HIS
Decisions not evidence-based
Donors get their own systems – fragmentation