1
LFA On-site Data Verification (OSDV)
Presentation at the SWA Regional Meeting
Hyderabad, India, 12 October 2009
John Puvimanasinghe
2
OSDV During Grant Life Cycle
[Timeline diagram spanning the period before grant signing through grant Years 1-5: M&E system strengthening cycle implementation, M&E Self-Assessment, Performance Framework (Years 1-2 and Years 3-5), M&E Plan, LFA PR Assessment, Phase 2 Assessment, and an OSDV in each grant year.]
3
Objective of the OSDV
  • To assess and ensure the quality of data reported for important programmatic results, for a grant or an entire portfolio
  • A routine exercise, conducted at least once a year per grant
  • A flexible approach, based on the individual grant situation and the judgment of the LFA/Regional Team
  • Complementary to other data-quality tools in use (e.g. MESS, DQA)

4
Roles and Methodology
Example: Eritrea
  • Roles: FPM with the LFA
  • Level of effort: 5 days
  • Indicators: 3 Top-10 indicators (number of people on ART, number of women on PMTCT, number of service deliverers trained)
  • Sites: 4 sites in 2 districts
  • Source documents: medical records, training log-sheets
  • Methodology: verifications, cross-checks, spot-checks
  • LFA outputs: indicator rating, summary of key findings, recommended measures
5
Step 1: Determine Level of Effort (LoE)
The OSDV follows six steps: (1) determine the Level of Effort (LoE), (2) select indicator results, (3) select sites, (4) select source documents, (5) perform verifications, (6) produce the report.
The LoE is based on two variables:
  • Primary variable: magnitude of the grant and perceived data-quality risk
  • Secondary variable: difficulty of performing the verifications
[Chart: grants plotted by importance (low to high) against complexity (low to high) to gauge the required LoE.]
6
Step 2: Select Indicator Results
Indicators to verify are both pre-determined and grant-specific:
  • Pre-determined indicators (to be verified in all cases):
  • people on ARV treatment
  • ITNs distributed
  • new TB cases detected
  • Grant-specific indicators (to be verified on a case-by-case basis)
7
Steps 3 & 4: Select Sites and Source Documents
Sample sites and source documents should be selected as follows:
  • Selection of sites: concentrate on the most important regions and/or districts. Typical sampling by annual grant amount (see the sketch after this slide):
  • less than USD 4 million: 2 sites in 1 region/district
  • USD 4-9 million: 4 sites in 2 regions/districts
  • more than USD 9 million: 6 sites in 2 regions/districts
  • Identification of source documents
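The typical site-selection rule above reduces to a simple threshold lookup. The sketch below is illustrative only; the function name is mine, and the handling of amounts exactly at USD 4 million or USD 9 million is an assumption, since the slide does not specify boundary behaviour.

```python
def recommended_sample(annual_grant_usd: float) -> tuple[int, int]:
    """Return (sites, regions/districts) to sample for an OSDV,
    following the typical thresholds listed on the slide above."""
    millions = annual_grant_usd / 1_000_000
    if millions < 4:
        return 2, 1      # under USD 4M: 2 sites in 1 region/district
    if millions <= 9:
        return 4, 2      # USD 4-9M: 4 sites in 2 regions/districts
    return 6, 2          # above USD 9M: 6 sites in 2 regions/districts


# Example: a grant of USD 6.5M per year would typically get 4 sites in 2 districts.
print(recommended_sample(6_500_000))  # -> (4, 2)
```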
8
Step 5: Perform Verifications
Three types of verification can be performed by the LFA:
  • Bottom-up audit trail from primary records to aggregated reports (all grants); a minimal sketch follows this slide
  • Cross-verifications with other data sources (all grants)
  • Spot-checks of actual service delivery (at least for grants above USD 9 million)
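As a rough illustration of the bottom-up audit trail, the sketch below recounts results from primary records (site registers) and compares the total with the figure in the aggregated report. The site names and numbers are invented for illustration and do not come from the presentation.

```python
# Hypothetical recounts from site registers (primary records).
register_recounts = {"Site 1": 180, "Site 2": 240, "Site 3": 95}

# Figure reported upward in the aggregated report (also hypothetical).
reported_aggregate = 530

recounted_total = sum(register_recounts.values())
discrepancy = reported_aggregate - recounted_total
print(f"Recounted {recounted_total}, reported {reported_aggregate}, "
      f"discrepancy {discrepancy} ({discrepancy / reported_aggregate:.0%})")
```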
9
Step 6: Produce Report
Outputs of the verifications should be summarized in a report:
  • Data-verification rating for each indicator (rating bands sketched after this slide):
  • A: less than 10% error margin
  • B1: between 10% and 20% error margin
  • B2: above 20% error margin
  • C: no systems in place
  • Material and non-material findings:
  • data accuracy issues
  • documentation completeness issues
  • discrepancies in cross-checks
  • issues from interviews of sample beneficiaries (if applicable)
  • Recommended strengthening measures
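The rating bands above can be read as a function of the relative error margin between the reported and the verified result. The sketch below is one possible reading: how the error margin is computed and how results exactly on the 10% or 20% boundary are classified are assumptions, and rating C (no systems in place) is a qualitative judgement that falls outside this calculation.

```python
def data_verification_rating(reported: float, verified: float) -> str:
    """Assign the A / B1 / B2 rating bands from the error margin
    between the reported and the verified result (assumed definition)."""
    error_margin = abs(reported - verified) / reported  # relative error
    if error_margin < 0.10:
        return "A"    # less than 10% error margin
    if error_margin <= 0.20:
        return "B1"   # between 10% and 20% error margin
    return "B2"       # above 20% error margin


print(data_verification_rating(100, 93))   # -> A  (7% error margin)
print(data_verification_rating(100, 70))   # -> B2 (30% error margin)
```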
10
Distribution of Indicator Ratings (n = 300 indicators)
11
Distribution of Indicator Ratings (n = 300 indicators)
12
Status of OSDV Implementation for 2009, SWA Region
13
Issues in OSDV Implementation
  • Reporting is not standardized
  • Some confusion about the indicator rating
  • No rating is given for the overall grant
  • Some recommendations are not specific enough to enable action and follow-up
  • Information on the actual implementation of the action items recommended in the report is not tracked
  • The current OSDV does not assess the underlying M&E system
14
RDQA Data Verification
  • Indicator: patients currently on ART
  • Reporting period: February 2005
  • Reported result: 100; recount from the ART register: 70
  • Verification factor (VF) for the site: 70/100 = 70% (worked through in the sketch below)
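A worked version of the slide's example, assuming (as the slide layout suggests) that 100 is the reported figure and 70 is the recount from the ART register; the variable names are mine.

```python
reported = 100    # patients currently on ART, as reported for February 2005
recounted = 70    # patients found in the ART register during the recount

verification_factor = recounted / reported
print(f"VF for site: {recounted}/{reported} = {verification_factor:.0%}")
# -> VF for site: 70/100 = 70%
```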
15
RDQA Tool for Conducting OSDV?
16
RDQA Summary Statistics: Level-Specific Dashboard
17
  • Dhanyawadamulu (Thank you)
  • Dhanyawad (Thank you)
  • Thank you