Improving the Toxic Substance Risk Assessment / Risk Management Process

1
Improving the Toxic Substance Risk Assessment /
Risk Management Process
CEPA ICG Meeting
  • Peter Forristal
  • May 17, 2006

2
Recommendation
  • To use a performance measurement tool to identify
    the priority areas for improvement of the Toxic
    Substance Risk Assessment / Risk Management
    (RA/RM) process.

3
Vision
  • The quality of all toxic substance risk
    assessments meets or exceeds expectations.

4
Goal and Objective
  • Increase the effectiveness of CEPA in addressing
    significant risks to health and the environment.
  • Improve the application of the Federal Government
    frameworks and policies.
  • Restore stakeholder confidence in the toxic
    substance RA / RM process.
  • Provide focus for future Industry advocacy work.

5
Today's Situation
  • Industry is concerned that the RA/RM process is
    inconsistent due to systemic problems:
  • RA: inconsistent information gathering (hazard
    and risk)
  • PFOS, PBDE, PFOA and window of data gathering
  • RA/RM linkage: the context of the risk needs to
    be clearer.
  • Varying application of precaution.
  • Uncertainty factors for P and B (PFOS, PBDE)
  • Using hazard and exposure data with significant
    limitations (CP)
  • Variations in peer review
  • There will be more Screening Level Risk
    Assessments requiring specific review and
    advocacy
  • Government resources stretched
  • Different industry stakeholders for each risk
    assessment

6
RA/RM Performance Measures Initiative Update
  • September 2005
  • Industry proposed broad stakeholder involvement
    to identify performance measures for RA / RM
    process.
  • A common set of performance measures would:
  • provide a clear set of expectations for all
    stakeholders
  • identify common issues facing all risk
    assessments
  • provide a compelling argument for improvement.
  • December 2005
  • EC/HC identified Q2 2006 plans to develop a
    Quality Management System for the RA/RM process.
  • May 2006
  • Industry has prepared an initial list of
    performance measures for development.

7
Performance Measurement
  • Performance requirements have been drawn from
    existing cabinet-approved documents:
  • Framework for Science and Technology Advice
  • Framework for the Application of Precaution in
    Risk-based Decision-making
  • Proposed Performance Measures
  • Gap Identification
  • Inclusiveness
  • Peer Review
  • Quality Assurance
  • Use of Science
  • Information Accuracy
  • Uncertainty Identification
  • Stakeholder Consultation
  • Risk Characterization
  • Reconsideration
  • These measures will be quantified, where
    possible, according to the degree that they meet
    the expectations of the government frameworks,
    policies and procedures.

8
A Sample Evaluation of the Risk Assessment Process
[The slide presents a matrix rating a sample risk assessment against each
proposed performance measure; the chart itself is not reproduced in this
transcript.]
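To make the idea of the matrix concrete, the following is a minimal sketch
(not from the presentation) of how one screening level risk assessment might
be scored against the proposed measures. The measure names come from the
slide above; the three-point numeric scale and all ratings are assumptions
made purely for illustration.

# Illustrative sketch only: hypothetical ratings for one screening level
# risk assessment against the proposed performance measures.
# The measure names come from the presentation; the numeric scale
# (Below = 1, Meets = 2, Exceeds = 3) and the ratings are assumptions.

RATING_SCORES = {
    "Below Expectations": 1,
    "Meets Expectations": 2,
    "Exceeds Expectations": 3,
}

sample_evaluation = {
    "Gap Identification": "Meets Expectations",
    "Inclusiveness": "Below Expectations",
    "Peer Review": "Meets Expectations",
    "Quality Assurance": "Exceeds Expectations",
    "Use of Science": "Meets Expectations",
    "Information Accuracy": "Meets Expectations",
    "Uncertainty Identification": "Below Expectations",
    "Stakeholder Consultation": "Meets Expectations",
    "Risk Characterization": "Meets Expectations",
    "Reconsideration": "Below Expectations",
}

# Print one matrix row per measure and an overall score for this assessment.
for measure, rating in sample_evaluation.items():
    print(f"{measure:28} {rating} ({RATING_SCORES[rating]})")

total = sum(RATING_SCORES[r] for r in sample_evaluation.values())
print(f"Overall score: {total} / {3 * len(sample_evaluation)}")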
9
Next Steps
  • Develop a clear set of expectations for each
    proposed performance measure.
  • Propose a workshop before summer.
  • Encourage broad stakeholder participation
    (Industry, Government, ENGO?)
  • Use the measures and expectations to evaluate and
    provide feedback on current screening level risk
    assessments.
  • Modify the measures based on feedback/changes in
    expectations.
  • After the measures have been used for more than
    10 risk assessments, analyze for priority
    improvement opportunities (see the illustrative
    sketch after this list).
  • Share improvement areas with Substance Management
    Group.
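The aggregation step above can be pictured with a minimal sketch, not taken
from the presentation: hypothetical ratings from several completed
assessments are tallied to show which measures most often fall below
expectations. All assessment names, the measures chosen, and the ratings in
the snippet are invented for illustration.

from collections import Counter

# Illustrative sketch only: hypothetical ratings collected for several
# completed screening level risk assessments (names and ratings invented).
evaluations = {
    "Assessment A": {"Gap Identification": "Below Expectations",
                     "Peer Review": "Meets Expectations",
                     "Uncertainty Identification": "Below Expectations"},
    "Assessment B": {"Gap Identification": "Below Expectations",
                     "Peer Review": "Exceeds Expectations",
                     "Uncertainty Identification": "Meets Expectations"},
    "Assessment C": {"Gap Identification": "Meets Expectations",
                     "Peer Review": "Below Expectations",
                     "Uncertainty Identification": "Below Expectations"},
}

# Count how often each measure falls below expectations; the most frequent
# shortfalls would be candidate priority areas for process improvement.
shortfalls = Counter(
    measure
    for ratings in evaluations.values()
    for measure, rating in ratings.items()
    if rating == "Below Expectations"
)

for measure, count in shortfalls.most_common():
    print(f"{measure}: below expectations in {count} of {len(evaluations)} assessments")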

10
Background Material
  • Risk Assessment / Risk Management Performance
    Measures Initiative

11
What's required to improve any work process?
  • Three essential elements for process improvement
  • A clearly defined process owner
  • A systematic approach
  • Performance measurement

12
Work Process Improvement Roadmap
Define Products and Services
Identify Customers
Identify the work process
Understand Requirements
Measure Performance
Identify Gaps
Understand Why
Innovate and Test
Improve Process
Evaluate and Do It Again
13
Risk Assessment Work Process Roadmap
Substance Risk Assessment
Government, NGOs, Industry
Screening Level Risk Assessment
Framework for Science and Technology Advice
Framework for Application of Precaution
Measure Performance
Identify Gaps
Understand Why
You can't progress without performance measurement
Innovate and Test
Improve Process
Evaluate and Do It Again
14
What is the Framework for Science and Technology
Advice?
  • Adopted by all science departments in the federal
    government in 2001 - 2002.
  • Acknowledges sound science as a key input to
    policy formulation.
  • Leads to sound government decisions, minimizes
    crises and capitalizes on opportunities.
  • Ensures Ministers can be confident that advice is
    based on rigorous and objective assessment of all
    available science.

15
Framework for Science and Technology Advice
Principle I - Early Issue Identification
  • Anticipate opportunities for which science advice
    will be required based upon interdisciplinary,
    interdepartmental and international cooperation
    issues.
  • Performance Measure - Gap Identification
  • Exceeds Expectations
  • Strong effort to fill gaps and incorporate into
    risk assessment
  • Meets Expectations
  • Effort made to fill gaps to provide direction for
    risk management options.
  • Below Expectations
  • No effort to fill identified gaps; left to risk
    management.

16
Framework for Science and Technology Advice
Principle II - Inclusiveness
  • Draw advice from a variety of scientific sources
    and from experts in relevant disciplines to
    capture the full diversity of scientific thought
    and opinion.
  • Performance Measure - Inclusiveness
  • Exceeds Expectations
  • An unbiased external advisory panel was used.
  • Meets Expectations
  • Many references and key evidence come from
    internationally acclaimed journals.
  • Below Expectations
  • Missing references to key science identified by
    stakeholders.

17
Framework for Science and Technology Advice
Principle III - Sound Science and Science Advice
  • Adopt due diligence procedures for assuring
    quality, reliability, integrity and objectivity
    (including scientific peer review) of science and
    science advice.
  • Performance Measures
  • Peer Review
  • Quality Assurance
  • Use of Science
  • Information Accuracy

18
Framework for Science and Technology Advice
Principle III - Sound Science and Science Advice
  • Performance Measure - Peer Review
  • Exceeds Expectations
  • Opinions of reviewers are acknowledged and areas
    of disagreement identified.
  • Meets Expectations
  • Qualified reviewers are used from government,
    NGOs, academia and industry.
  • Below Expectations
  • Missing reviews from key stakeholders.

19
Framework for Science and Technology Advice
Principle III - Sound Science and Science Advice
  • Performance Measure - Quality Assurance
  • Exceeds Expectations
  • QA done on the hazard/effects and exposure
    characterizations, plus a review of the
    environmental or health sections.
  • Meets Expectations
  • QA done on the hazard/effects and exposure
    characterizations.
  • Below Expectations
  • QA missing on the hazard/effects or exposure
    characterization.

20
Framework for Science and Technology Advice
Principle III - Sound Science and Science Advice
  • Performance Measure - Use of Science
  • Exceeds Expectations
  • Rationale provided for new risk assessment
    methodologies.
  • Meets Expectations
  • Open publication of scientific information and
    using generally accepted risk assessment
    methodologies.
  • Below Expectations
  • Key science is not publicly available.
  • Scientific logic is faulty or not sound.

21
Framework for Science and Technology Advice
Principle III - Sound Science and Science Advice
  • Performance Measure - Information accuracy
  • Exceeds Expectations
  • Current data used for exposure and effects
    assessment.
  • Meets Expectations
  • Current data used for exposure assessment.
  • Below Expectations
  • Materially significant exposure or effects data
    not used.

22
Framework for Science and Technology Advice
Principle IV - Uncertainty and Risk
  • Use a risk management approach (which will have
    the goal of scientifically sound, cost effective
    integrated actions that reduce risks while taking
    into account social, cultural, ethical, political
    and legal considerations) to assess, manage and
    communicate the high degree of uncertainty
    inherent in the science on which policy advice is
    based.
  • Performance Measure
  • Uncertainty Identification

23
Framework for Science and Technology Advice
Principle IV - Uncertainty and Risk
  • Performance Measure - Uncertainty Identification
  • Exceeds Expectations
  • Comprehensive discussion of uncertainty and
    confidence level for all sections of the
    assessment.
  • Meets Expectations
  • Full discussion of uncertainty in the
    effects-exposure section.
  • Below Expectations
  • Poor explanation of uncertainty.

24
Framework for Science and Technology Advice
Principle V - Transparency and Openness
  • Provide a clear articulation of how policy
    decisions are arrived at to those who are
    affected, including providing access to the
    underlying science as soon as possible.
  • Performance Measures
  • Stakeholder Consultation
  • Risk Characterization

25
Framework for Science and Technology Advice
Principle V - Transparency and Openness
  • Performance Measure - Stakeholder Consultation
  • Exceeds Expectations
  • Many comments received and incorporated into the
    final risk assessment.
  • Meets Expectations
  • Key stakeholders consulted and received feedback.
  • Below Expectations
  • No evidence of stakeholder consultation.

26
Framework for Science and Technology Advice
Principle V - Transparency and Openness
  • Performance Measure - Risk Characterization
  • Exceeds Expectations
  • Meets Expectations
  • Exposure and effects evidence is solid and risk
    management direction is clear.
  • Below Expectations
  • Assessment did not give clear direction for risk
    management.

27
Framework for Science and Technology Advice
Principle VI - Review
  • Subsequent review of decisions to determine
    whether recent advances in scientific knowledge
    have an impact on the advice and decision.
  • Potential Measure - Reconsideration
  • Exceeds Expectations
  • Meets Expectations
  • Precautionary measures should be implemented on a
    provisional basis and consistent with measures
    taken in similar circumstances.
  • Below Expectations
  • Precaution has been used and there is no timing
    set for reconsideration of risk assessment
    conclusions.