Transcript and Presenter's Notes

Title: Data Quality Feedback


1
Data Quality Feedback
  • Linda Montanari, Warsaw, 13-14 May 2004

2
Quality feedback
  • Has the objective to improve the quality of
    NFP and EMCDDA products
  • Was requested by NFPs and the MB in 1999
  • Is based on a process of mutually agreed
    (NFPs and EMCDDA) evaluation of outputs
    (reports and data)
  • Is done through a co-ordination of
    contributions from EMCDDA staff and other
    experts when possible

3
Main steps: a continuous process for improvement
  • 1999 proposal to have quality feedback on
    National reports and other products
  • 2000 definition of quality criteria for
    evaluation of National Reports and 1st quality
    feedback on NRs
  • 2001 first broad assessment of Standard Tables
  • 2003 Quality status of standard tables through
    EISDD
  • 2003 Historical overview of quality feedback on
    NRs
  • 2004 4th quality feedback, quality status of
    standard tables, first feedback on other
    products (EDDRA, dissemination, financial
    reports)

4
Expected Outputs from the 2003 Grant Agreements
  • Collection and analysis of information at
    national level in 2003
  • Dissemination at national level
  • Progress reports
  • Financial implementation reports 

5
Collection and analysis of information at
national level in 2003
  • Annual national report (old and new MS
    separately)
  • Statistical standard tables (old/new MS together)
  • Data input into EMCDDA and REITOX information
    systems (e.g. EDDRA, REITOX extranet) (new MS
    together)
  • Press clippings covering major national
    developments (e.g. launch of the Annual Report)
  • .

6
Dissemination at national level
  • Language checking and proof-reading
  • ..
  • ..

7
Financial implementation reports
  • Interim financial implementation report
  • Final financial implementation report

8
Data Quality Feedback on 2003 National Reporting
  • Linda Montanari, Warsaw, 13-14 May 2004

9
General Quality of the National Reports from
2000 to 2003
15 old Member States and Norway
10
Respect of deadlines
11
Average improvement over the last 4 years of NRs
Scale: 1 = Not Sufficient, 2 = Sufficient,
3 = Rather Good, 4 = Good, 5 = Very Good
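As a minimal illustration of how such an average relates to the scale (the yearly scores below are made up for the example, not the real assessments):

```python
# Illustrative only: hypothetical yearly scores on the 1-5 scale above.
SCALE = {1: "Not Sufficient", 2: "Sufficient", 3: "Rather Good",
         4: "Good", 5: "Very Good"}

yearly_scores = {2000: 3, 2001: 3, 2002: 4, 2003: 4}  # made-up values

average = sum(yearly_scores.values()) / len(yearly_scores)
nearest = min(SCALE, key=lambda grade: abs(grade - average))
print(f"average {average:.1f}, closest to '{SCALE[nearest]}'")
# -> average 3.5, closest to 'Rather Good'
```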
12
Quality enhancement of the National Reports from
2000 to 2003
13
National reports improved from.
14
National Reports with the same level of quality
15
Areas where there was improvement
  • Adherence to guidelines
  • Better layout and presentation
  • Way to report epidemiological and demand
    reduction information
  • Methodological quality
  • Insight and use of qualitative information

16
Most common weaknesses
  • Delay in providing the report
  • Scarce use of qualitative information
  • Generic description of information, esp.
    on policy, legislation and interventions
  • Repeated information within the report and
    with the previous year
  • Scarce insight and level of interpretation

17
Where it is most difficult to change…
  • Respect of deadlines
  • Unavailability of epidemiological data sources,
    esp. routine data sources
  • Difficult to involve national networks/experts

What can NFPs and EMCDDA do?
18
Most common strong points
  • Adherence to Guidelines
  • Comprehensiveness of the report, according to
    the existing information
  • Clear way to report methodological details
  • Presence of information sources
  • Good use of routine information, in either
    epidemiological fields or demand reduction
    (when available)
  • Good use of Key Indicators data

19
Strong Points in 2003 National Reports: some
examples by country
  • Prevention in recreational settings (NL)
  • Co-morbidity chapter (AU)
  • Strategies on demand reduction   (PT)
  • Treatment demand section (FI)
  • Treatment interventions (SW)
  • Trends by drug (UK)
  • Demand reduction section (NO)
  •  Data from the HBSC WHO survey (BE)
  • Treatment evaluation (DK)
  • Prevention among young people (GE)
  • Cannabis chapter (GR)
  • Description of laws (SP)
  • Intervention on drug related harm (FR) 
  • Prevention in school (IR)
  • Infectious diseases (IT)
  • Problem drug use section (LU)

20
Recommendations
  • Follow the recommendations
  • Provide the report on time
  • Provide the report in a simple format
  • Try to be concise
  • Look at the previous EU Annual Report to see
    peculiarities for your country
  • Exchange experiences with other National Focal
    Points

21
Data Quality Feedback on 2003 Standard Tables
  • Norbert Frost, Linda Montanari, Warsaw, 13-14
    May 2004

22
Standard Tables to be submitted in 2003
  • Drug Seizures
  • Purity of illicit drugs
  • Composition of drugs
  • Drug prices
  • Leading edge indicator
  • Mortality cohorts
  • School prevention programmes
  • Assistance to drug users in prison
  • Prevention in recreational settings
  • TDI
  • Population Survey
  • School surveys
  • Treatment Demand
  • Treatment Demand Evolution
  • Acute Drug Related Deaths
  • Evolution drug related deaths
  • National Prevalence
  • Local Prevalence
  • Prevalence Hepatitis B and C
  • Syringe exchange/distribution
  • Arrests/Reports
  • Prison drug use

23
N. of standard tables submitted in 2003 (by topic,
concerning old/new MS)
Source: EISDD
Update: 05/05/2004
24
N. of countries that did not submit standard tables
in 2003, by table
Source: EISDD
Update: 05/05/2004
25
Deadline observance (15 September 2003)
26
Some additional elements…
  • 414 out of 557 tables (74%) were sent on time or
    with at most one month's delay
  • 68 out of 557 tables (12%) were deleted and
    uploaded again
  • 143 out of 557 tables (26%) were sent more than
    45 days after the deadline
  • 142 out of 557 tables (25%) were not uploaded
    through the REITOX web site
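As a minimal sketch of the arithmetic behind these shares (the deadline constant and helper names below are illustrative, not part of the EISDD), each percentage is simply the count divided by the 557 submitted tables:

```python
from datetime import date

# Hypothetical helper: the 15 September 2003 deadline quoted on the
# previous slide, used here only to illustrate the 45-day threshold.
DEADLINE = date(2003, 9, 15)

def share(count, total=557):
    """Percentage share of the 557 submitted tables, rounded as on the slide."""
    return round(100 * count / total)

def days_late(upload_date):
    return (upload_date - DEADLINE).days

# Reproducing the figures quoted above:
print(share(414))  # 74 -> on time or within about one month of the deadline
print(share(68))   # 12 -> deleted and uploaded again
print(share(143))  # 26 -> more than 45 days late, i.e. days_late(...) > 45
print(share(142))  # 25 -> not uploaded through the REITOX web site
```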

27
To take into account that…
  • Tables were sometimes deleted and uploaded again
  • Tables were sent directly to project managers, so
    there was no control over the date of reception
    through the REITOX web site
  • Quality in some cases was very low
  • After reception there is a long period of
    validation, internal and with NFPs
  • Through the EISDD, in the next assessment it will
    be possible to track the whole process in order
    to check respect of the guidelines

28
Quality criteria defined in the EISDD
  • 1 = invalid (e.g. invalid sample, wrong
    definitions, etc.)
  • 2 = no new data (old data present)
  • 3 = empty table provided
  • 4 = questions/doubts to be clarified when the
    tables were provided
  • 5 = used in AR: data used for the Annual Report
  • 6 = data entered in the EISDD, whether used for
    the AR or not
  • De process (data entry): ongoing process of data
    entry
  • Ne info complete: do not enter data; an
    informatics category to indicate that something
    can still be done informatically (tables treated
    as neither positive nor negative)
  • Data archived in EISDD: data file archived but
    not entered and not analysed
  • Other qualitative assessment: good, complete,
    empty
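These status codes amount to a small classification scheme; the sketch below shows one way to represent them, purely as an illustration (the names, numeric values and the tally helper are assumptions, not the actual EISDD schema):

```python
from enum import Enum
from collections import Counter

class TableStatus(Enum):
    """Illustrative quality-status codes, loosely following the list above."""
    INVALID = 1            # e.g. invalid sample, wrong definitions
    NO_NEW_DATA = 2        # old data present
    EMPTY_TABLE = 3        # an empty table was provided
    QUESTIONS = 4          # doubts to be clarified with the NFP
    USED_IN_AR = 5         # data used for the Annual Report
    ENTERED_IN_EISDD = 6   # data entered, whether used for the AR or not

def summarise(statuses):
    """Tally submitted tables by quality status (hypothetical helper)."""
    return Counter(statuses)

# Example: three tables assessed during validation.
assessed = [TableStatus.USED_IN_AR, TableStatus.USED_IN_AR,
            TableStatus.EMPTY_TABLE]
print(summarise(assessed))  # USED_IN_AR: 2, EMPTY_TABLE: 1
```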

29
Summary of tables meeting other quality criteria
according to EISDD
30
N. of standard tables used for Annual Report 
by country
Source: EISDD
31
Common problems
  • Delay in providing the tables
  • Empty Tables
  • Tables with old data
  • Tables partially filled in (in particular,
    methodological details left empty)
  • Changes in table names/file names
  • Lack of indication of information sources
  • Different definitions used
  • Changing the structure of tables
    (adding/deleting lines)

32
Recommendations
  • Follow the recommendations
  • Provide the tables on time
  • Fill in the tables precisely, indicating when
    data are not available
  • Indicate clearly the problems
  • Indicate methodological details
  • Do not change table names or file names
  • Use the upload function
  • Follow the guidelines
  • Do not change the table structures
  • Exchange experiences with other National Focal
    Points

33
For more detailed information on the EISDD and
the quality index calculation:
norbert.frost_at_emcdda.eu.int
34
Data Quality Feedback on 2003 EDDRA
  • Abigail David, Linda Montanari, Warsaw, 13-14 May
    2004

35
EDDRA progress report
36
Useful points in the reports (1)
  • Report structure
  • Use of a clear report structure
  • Short paragraphs
  • Use of simple lists (tables) e.g. list of
    projects, promotional activities, problems etc.

37
Useful points in the reports (2)
  • Reviews of projects entered during the course of
    the year
  • Updates and pending projects
  • Analysis of entries by area and theme
  • Follow the guidelines
  • Proposed projects for 2004
  • Clear 2004 work programme

38
Useful points in the reports (3)
  • Description of any constraints experienced
    during the course of the year, e.g. issues with
    questionnaire completion, difficulties in
    collecting information, internal personnel
    changes, etc.
  • Description of any problems experienced with the
    EDDRA database, e.g. software and technical
    problems, access to the website
  • Clear review of promotional and networking
    activities
  • Clear review of resources invested during the
    course of the year, e.g. personnel

39
For more detailed information on EDDRA feedback:
abigail.david_at_emcdda.eu.int
40
Data Quality Feedback on 2003 dissemination
  • Joelle V.D.Auwera, Rosemary de Sousa, Kathryn
    Robertson,
  • Linda Montanari, Warsaw, 13-14 May 2004

41
1) Consultation process on the Annual Report (under
the responsibility of the MB; only old MS and Norway)
  • Comments received from all MS (15 and Norway)
  • 1 country was very late in providing feedback
  • 1 country requested to check the full text in
    its language before publication
  • Comments were clear and marked in a way that
    made it easy to find the page and reference
  • Very efficient and good collaboration
  • This year timeliness is even more important,
    since deadlines are very tight

42
2) Annual report press launch (only old MS and Norway)
  • Linguistic revision of news releases was positive
    and timely (some NFPs requested to verify the
    content; to be decided)
  • Dissemination of news releases: positive
    collaboration; feedback was offered when problems
    were encountered
  • Press clippings: positive and timely (see the
    Annual report press review)

43
3) Policy briefing Drugs in Focus (including
new MS)
  • Tasks requested of NFPs:
  • to proofread the text and revise the
    translation, which often changes the meaning of
    the text
  • to make a list of relevant policy makers to whom
    the publications should be sent
  • With the new NFPs, 21 languages are included
  • The quality of proofreading was very good from
    all countries, although translations were often
    of very poor quality
  • Problems derive from the countries that did not
    send feedback (4 on the last proofreading)
  • Delay in sending feedback: 4 countries
44
4) New presentation brochure (including new MS)
  • Most countries collaborated positively; 2
    countries did not reply
  • Very bad translations were received in two
    languages: 1 country retranslated, the other
    returned the file

45
EMCDDA staff working on dissemination thank the
NFPs for the very good collaboration!
  • For further information on:
  • Annual report consultation:
    Rosemary.de.Sousa_at_emcdda.eu.int
  • Annual report press launch and new brochure:
    Kathryn.Robertson_at_emcdda.eu.int
  • Policy briefings and Annual report presentation
    to policy makers:
    Joelle.Vanderauwera_at_emcdda.eu.int

46
Data Quality Feedback: Brainstorming for future
perspectives
  • Linda Montanari, Warsaw, 13-14 May 2004

47
Criteria used for feedback
  • Adherence to guidelines and deadlines
  • Layout and presentation
  • Methodological quality
  • Content by section
  • Global evaluation with strong and weak points
  • Final recommendations, including examples of best
    practices in other countries

48
Assessment process
  • Collection of contributions from EMCDDA staff on
    each project
  • Scientific committee contributions on NRs (not in
    2003)
  • Reading and assessment in the REITOX co-ordination

49
General points for discussion
  • Different/new criteria for evaluation? (which
    ones?)
  • Different/new process? (e.g. mutual NFP review,
    proposed and already refused!)
  • Who else should be involved? (but the time
    required should be considered, e.g. the SC
    refused to do the exercise this year)
  • What is the impact on the NFPs? (process after
    feedback)
  • What impact on the EMCDDA? (feedback on feedback)
  • How should the evaluation change with the new
    guidelines?

50
Specific points for discussion
  • Draft/final report: which date should be
    considered, and which report should be used for
    the analysis?
  • What is the relation with the key indicators?
  • Exchanges between NFPs? (e.g. horizontal
    co-operation)
  • In the next guidelines nothing is binding except
    the main chapters: what is the relation with the
    checklist?

51
Proposals for improvement, taking into account…
  • Resource problems
  • Feedback objectives…
  • Feedback could be positive/negative/both
  • EMCDDA and NFP impact…