Transcript and Presenter's Notes

Title: Human Reliability


1
Human Reliability
  • Presented by Patricia A. Dunavold

2
Human Reliability
  • Human error identification techniques for risk
    assessment of high risk systems. Part 1: review
    and evaluation of techniques. Barry Kirwan (1998)
  • Human error identification techniques for risk
    assessment of high risk systems. Part 2: towards
    a framework approach. Barry Kirwan (1998)
  • Human error identification techniques applied to
    public technology: predictions compared with
    observed use. C. Baber & N.A. Stanton (1996)

3
Human Reliability
  • Kirwan, Part 1 outlines 38 error identification
    techniques and categorizes them into types of
    error identification approaches in high risk,
    complex systems
  • Kirwan, Part 2 describes a framework approach
    and applies it to a high risk, complex system: a
    nuclear power plant
  • Baber & Stanton apply HEI techniques to predict
    errors in the use of a ticket vending machine as
    an alternative to observation studies

4
Human Reliability
  • Probabilistic Safety Assessment (PSA)
  • Used to determine the risk of complex systems,
    such as chemical plants and nuclear power plants,
    from all potential risk causes
  • Human Reliability Assessment (HRA)
  • Used to determine the impact of human error and
    error recovery on a system
  • Probability of the error occurring is calculated
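  A minimal worked illustration of that quantification step (the numbers are
  hypothetical, not from the papers): a human error probability (HEP) is
  commonly estimated as
      HEP = number of errors observed / number of opportunities for error
  so, for example, 3 mis-set valves in 10,000 valve line-ups gives
  HEP = 3 / 10,000 = 3 x 10^-4.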

5
Human Reliability
  • Error Reduction Analysis (ERA)
  • Ways of reducing the likelihood of the error and
    its impact on the system
  • Human Error Assessment (HEA)
  • Error identification
  • Error recovery consideration
  • Consequence determination
  • Human Error Identification (HEI)

6
Human Reliability
  • PSA
  • HRA
  • HEA & ERA
  • HEI

7
Human Reliability
  • Human Error Identification (HEI)
  • Thirty-eight different techniques
  • Categorized into five broad categories
  • Two different approaches
  • Toolkit approach - using a mixture of independent
    tools which together deal with all the required
    error types
  • Framework approach - develop an integrated
    framework-based set of tools/taxonomies
  • HEI techniques applied in a usability study

8
Human Reliability
  • Process of Error Identification
  • Step 1: Decide the scope of the analysis
  • Which operator involvements or critical tasks do
    you want to consider?
  • Only emergency events
  • Misdiagnoses
  • Maintenance errors
  • Rule violation errors
  • Step 2: Do a task analysis
  • Determine how the operations should proceed

9
Human Reliability
  • Process of Error Identification, cont.
  • Step 3: Consider what can go wrong
  • Three major components to an error
  • External Error Mode (EEM) - the external
    manifestation of the error (e.g. closed wrong
    valve) - necessary component
  • Performance Shaping Factors (PSF) - factors that
    influence the likelihood of an error occurring
    (e.g. quality of the operator interface, time
    pressure, training) - desirable component
  • Psychological Error Mechanism (PEM) - the
    internal manifestation of the error (e.g.
    memory failure, pattern recognition failure) -
    desirable component

10
Human Reliability
  • Process of Error Identification, cont.
  • Step 4: Consider the implications
  • Error recovery potential can be considered
  • Consequences of the identified error can be
    determined (see the sketch below)
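  The record below is a minimal sketch of how one error identified through
  Steps 1-4 might be captured. The field names and example values are
  illustrative assumptions, not taken from Kirwan's papers.

      # Hypothetical record structure for one identified error (illustrative only).
      from dataclasses import dataclass, field
      from typing import List

      @dataclass
      class IdentifiedError:
          task_step: str                    # from the task analysis (Step 2)
          eem: str                          # External Error Mode - necessary component
          psfs: List[str] = field(default_factory=list)  # Performance Shaping Factors - desirable
          pem: str = ""                     # Psychological Error Mechanism - desirable
          recovery_potential: str = ""      # Step 4: can the error be recovered?
          consequence: str = ""             # Step 4: effect on the system

      error = IdentifiedError(
          task_step="Isolate cooling loop before maintenance",
          eem="Closed wrong valve",
          psfs=["poor valve labelling", "time pressure"],
          pem="Pattern recognition failure",
          recovery_potential="Detected on flow indication within minutes",
          consequence="Temporary loss of cooling to the standby train",
      )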

11
Human Reliability
  • Seven major error types of interest
  • Slips and lapses (action execution errors)
  • Cognitive errors (diagnostic and decision-making)
  • Maintenance errors and latent failures
  • Errors of commission
  • Rule violations
  • Routine
  • Extreme
  • Idiosyncratic errors
  • Software programming errors

12
Human Reliability
  • Slips and lapses (action execution errors)
  • Most predictable
  • Simple errors of quality of performance (too much
    or too little force applied)
  • Omissions (leaving out a step in a sequence)
  • Sequence errors (task steps carried out in wrong
    sequence)
  • Cognitive errors (diagnostic and decision-making)
  • Misunderstanding by the operator of what is
    happening in the system leads to insufficient
    operator support
  • Due to design, procedures, training (submersible)

13
Human Reliability
  • Cognitive errors (diagnostic and decision-making),
    cont.
  • This type of error includes
  • Misdiagnosis
  • Partial diagnosis
  • Diagnostic failure
  • Can alter accident progression sequences
  • Can lead to system failures
  • Can cause dependencies between redundant
    back-up systems

14
Human Reliability
  • Maintenance errors and latent failures
  • Maintenance errors - mostly due to slips and
    lapses, but in maintenance and testing activities
  • Can lead to immediate system failures or latent
    failures
  • Latent failures
  • Can lie dormant until a system is used
  • Errors of commission (EOC)
  • The operator does something that is incorrect and
    also unrequired
  • Valve should be locked open but is locked closed

15
Human Reliability
  • Errors of commission (EOC), cont.
  • Errors can be due to
  • Carrying out actions on the wrong components
  • A misconception
  • Risk recognition failure
  • EOCs are an increasing concern for 3 reasons
  • They do happen, although rarely
  • They can have a large impact on system risk
  • They are difficult to identify and therefore
  • Difficult to anticipate and
  • Defend against

16
Human Reliability
  • Rule violations
  • Relatively unexpected and can lead to failure of
    multiple safety systems and barriers
  • Routine - negligible risk and is viewed as an
    acceptable part of the job
  • Replacing light bulbs in a nuclear power plant
  • Extreme - seen as a real risk and is a serious
    violation
  • Union Carbide Chemical plant disaster in Bhopal,
    India

17
Human Reliability
  • Idiosyncratic errors
  • Errors due to social variables and the operator's
    emotional state
  • Result of a combination of personal factors in an
    unprotected and vulnerable system
  • Of greatest concern in systems where the operator
    has the potential to kill a large number of
    people (as in a transportation system)
  • Is intent a necessary component? (e.g. terrorist
    attacks vs. a ferry disaster or the Exxon Valdez
    oil spill)

18
Human Reliability
  • Software programming errors
  • Errors of increasing importance due to the
    prevalence of software-based control systems
    required to economically control large, complex
    systems
  • Also important in other areas such as safety
    critical software and navigational applications
  • Few techniques to predict human errors in
    software programming
  • Effort is spent on verifying and validating
    software to ensure it is error-free

19
Human Reliability
  • Five broad theoretical classifications of HEI
    techniques (Table 2, p. 161)
  • Taxonomies - basically error checklists
  • Psychologically based tools - rely on an
    understanding of factors affecting performance
  • Cognitive modeling tools - least mature of the
    human error analysis approaches
  • Cognitive simulations - the most sophisticated
    human error identification area
  • Reliability-oriented tools - derived from
    reliability approaches used with non-human
    reliability problems

20
Human Reliability
  • Taxonomies
  • Consist of error mode checklists
  • Rely on the interpretation of the analyst for the
    context of interest
  • 1) Technique for Human Error Rate Prediction
    (THERP), 1981
  • Early THERP, 1970s
  • 2) INTENT, 1991
  • 3) Savannah River Site HRA (SRS-HRA), 1994

21
Human Reliability
  • Psychologically based tools
  • Tools that rely on an understanding of the
    factors affecting performance
  • Particularly characterized by tools that consider
    error causes (PSF) and/or error mechanisms (PEMs)
  • 4) Murphy diagrams, 1981
  • 5) Skill, Rule and Knowledge-based behavior model
    (SRK), 1981

22
Human Reliability
  • Psychologically based tools, cont.
  • 6) Systematic Human Error Reduction and
    Prediction Approach (SHERPA), 1986
  • 7) Systematic Critical Human Error Management
    Approach (SCHEMA), 1992
  • 8) Technique for Evaluating and Assessing the
    Contribution of Human Error to Risk, which uses
    the Systems Induced Error Approach
    (TEACHER/SIERRA), 1993
  • 9) Predictive Human Error Analysis technique
    (PHEA), 1993

23
Human Reliability
  • Psychologically based tools, cont.
  • 10) Generic Error Modeling System (GEMS), 1987
  • 11) Critical Action and Decision Approach (CADA),
    1988
  • 12) Potential Human Error Causes Analysis
    (PHECA), 1988
  • 13) Human Reliability Management System (HRMS),
    1990
  • 14) Task Analysis for Error Identification
    (TAFEI), 1991

24
Human Reliability
  • Cognitive modeling tools
  • Tools that attempt to model cognitive aspects of
    performance
  • In terms of relationships between knowledge items
    relating to symptoms of events
  • In terms of how various factors will affect
    cognitive performance aspects of the task
  • 15) Influence Modeling and Assessment System
    (IMAS), 1986
  • 16) Cognitive Reliability and Error Analysis
    Method (CREAM), 1994

25
Human Reliability
  • Cognitive simulations
  • Computer simulations of operator performance
  • 17) Dynamic Logical Analyzing Methodology
    (DYLAM), 1985
  • 18) Cognitive Environment Simulation (CES), 1990
  • 19) INTEgrated Reactor OPerator System
    (INTEROPS), 1991
  • 20) Cognitive Simulation Model (COSIMO), 1992
  • 21) CREW SIMulation (CREWSIM), 1993
  • 22) CREW PROblem solving simulation (CREWPRO),
    1994

26
Human Reliability
  • Cognitive simulations, cont.
  • 23) Accident Dynamic Sequence Analysis (ADSA),
    1994
  • 24) Cognitive Action Modeling of Erring
    Operator/Task Analysis Tool (CAMEO/TAT), 1994
  • 25) System for the Behavior of the Operating
    Group (SYBORG), 1995
  • 26) Simulation-based Evaluation and Analysis
    support system for MAn-machine Interface Design
    (SEAMAID), 1996

27
Human Reliability
  • Reliability-oriented tools
  • HAZOP
  • 27) Confusion Matrix Analysis (CMA), 1981
  • 28) Human HAZard and Operability Study technique
    (Human HAZOP), 1985
  • 29) Team Operations Performance and Procedure
    Evaluation (TOPPE), 1991
  • 30) Procedure to Review and Evaluate Dependency
    In Complex Technologies (PREDICT), 1992
  • 31) Error of Commission Analysis (EOCA), 1994
  • 32) A Technique for Human Error ANAlysis
    (ATHEANA), 1996

28
Human Reliability
  • Reliability-oriented tools, cont.
  • FMEA
  • 33) Task Analysis-Linked EvaluatioN Technique
    (TALENT), 1988
  • 34) Human Error Mode, Effect and Criticality
    Analysis (HEMECA), 1989
  • 35) SNEAK, 1991
  • 36) Procedure Response Matrix Approach (PRMA),
    1994
  • Event Trees
  • 37) COMmission Event Trees (COMET), 1991
  • 38) COGnitive EveNt Tree (COGENT), 1993

29
Human Reliability
  • Ten analytical approaches (which are not mutually
    exclusive)
  • Checklist-based approaches (SRS-HRA, THERP,
    INTENT, GEMS)
  • Easy to use
  • Rely on the skill and understanding of the
    analyst
  • Flowchart-based approaches (SHERPA, SCHEMA,
    TEACHER/SIERRA, PHEA)
  • Very structured and tend to produce consistent
    results between different analysts
  • Only assess straightforward behaviors (skill- and
    rule-based behaviors)

30
Human Reliability
  • Ten analytical approaches, cont.
  • Group-based approaches (HAZOP, PREDICT, EOCA,
    CMA)
  • Use of experienced groups can identify less
    obvious error forms and can predict events for
    novel systems
  • Reliability cannot be guaranteed and they are
    costly
  • Cognitive psychological approaches (GEMS, SRK,
    CREAM)
  • Provides a psychologically based model for
    identifying higher-level cognitive behaviors
    and errors
  • Limited in scope and still under development

31
Human Reliability
  • Ten analytical approaches, cont.
  • Representation techniques (COMET, COGENT)
  • Maintains the direction of the study well and
    highlights risk assessment needs from the HRA
  • Does not identify lower level errors
  • Cognitive simulations (CES, COSIMO)
  • When mature, can provide powerful insight into
    how human operators will respond in emergency
    situations (esp. complex environments)
  • Only as good as the cognitive models they are
    based on; very costly and time-consuming to
    develop

32
Human Reliability
  • Ten analytical approaches, cont.
  • Task analysis linked techniques (TALENT, PRMA,
    CAMEO-TAT, SHERPA)
  • Very documentable approach which provides an
    audit trail for the assessment and can identify
    some of the more complex error forms within the
    context of the operations
  • Relies on a detailed task analysis which can be
    costly and time-consuming

33
Human Reliability
  • Ten analytical approaches, cont.
  • Affordance-based techniques (TAFEI, SNEAK,
    PREDICT)
  • Very theoretical in nature and attempts to
    predict any human actions that could occur given
    the system architecture and its operational
    environment, and then tries to establish reasons
    for the actions. This approach is particularly
    relevant to errors of commission and rule
    violations.
  • Might be too theoretical as it may be basing
    reasons for human actions on erroneous intentions

34
Human Reliability
  • Ten analytical approaches, cont.
  • Error of Commission (EOC) Identification
    techniques (SNEAK, EOCA, ATHEANA)
  • Can be helpful in identifying the interaction
    between poor design elements and errors and can
    be useful for a large range of industries
  • It can be a very resource-intensive method (some
    techniques rely on table-top simulations or large
    databases of ergonomics guidance)

35
Human Reliability
  • Ten analytical approaches, cont.
  • Crew interactions and communications (CREWSIM,
    CREWPRO, TOPPE)
  • Most psychologically ambitious method (to date)
    and most realistic in terms of actual crew
    coordination and communication tasks
  • Still developmental in nature and requires
    validation

36
Human Reliability
  • Criteria for evaluation of HEI techniques
  • Two related criteria sets
  • HEI comparative validation exercises
  • Used in Kirwan's original papers (1992a, b)
  • Outside the scope of the current paper
  • Qualitative evaluations
  • Ten separate criteria for qualitatively
    evaluating HEI techniques

37
Human Reliability
  • Ten criteria for qualitative evaluation of HEI
    techniques
  • Comprehensiveness of human behavior
  • Degree to which the technique addresses
  • Skill-based behavior (S)
  • Rule-based behavior (R)
  • Knowledge-based behavior (K)
  • Rule violations (RVa)
  • Errors of commission (EOC)

38
Human Reliability
  • Ten criteria for qualitative evaluation of HEI
    techniques, cont.
  • Consistency
  • Degree to which the technique is structured and
    thereby likely to yield consistency of results
  • Low (relatively open-ended technique)
  • Moderate (some flexibility within a detailed
    framework)
  • High (highly structured - different analysts will
    come to the same conclusions)

39
Human Reliability
  • Ten criteria for qualitative evaluation of HEI
    techniques, cont.
  • Theoretical validity 1
  • Technique is based on a model of human
    performance
  • Low (simple classification-based system)
  • Moderate (technique makes reference to a model of
    human performance)
  • High (technique is a direct interpretation of a
    human performance model)

40
Human Reliability
  • Ten criteria for qualitative evaluation of HEI
    techniques, cont.
  • Theoretical validity 2
  • Whether the technique addresses one or more of
    the three main error components
  • External Error Modes (EEMs)
  • Psychological Error Mechanisms (PEMs)
  • Performance Shaping Factors (PSFs)

41
Human Reliability
  • Ten criteria for qualitative evaluation of HEI
    techniques, cont.
  • Usefulness
  • Degree to which the technique can generate error
    reduction mechanisms
  • Low (little concern with error reduction)
  • Moderate (technique is capable of error
    reduction)
  • High (error reduction is primary focus of the
    technique)
  • Resources 1
  • Amount of time it would take the analyst to
    apply the technique
  • Low
  • Moderate
  • High

42
Human Reliability
  • Ten criteria for qualitative evaluation of HEI
    techniques, cont.
  • Resources 2
  • Training required to use the technique
  • Yes
  • Low, Moderate, High
  • No
  • Resources 3
  • Requirement for an expert panel or task-domain
    experts
  • Yes
  • No

43
Human Reliability
  • Ten criteria for qualitative evaluation of HEI
    techniques, cont.
  • Documentability
  • Degree to which the technique supports auditable
    documentation
  • Low (technique is difficult to document)
  • Moderate (technique provides sufficient
    documentation to be repeatable)
  • High (all findings are recorded so that the
    documentation can be used for system operations
    in the future and the documentation can be used
    to facilitate future periodic assessments)

44
Human Reliability
  • Ten criteria for qualitative evaluation of HEI
    techniques, cont.
  • Acceptability 1: PSA usage to date
  • Very difficult to determine because little has
    been published on these techniques
  • Low (prototype only)
  • Moderate (used in a small number of assessments)
  • High (has been used extensively)

45
Human Reliability
  • Ten criteria for qualitative evaluation of HEI
    techniques, cont.
  • Acceptability 2: Availability of technique
  • Technique is either available or not (it may have
    been discontinued, e.g. PHECA; be proprietary and
    not generally available, e.g. HRMS; or be a
    prototype and not generally available, e.g. CES)
  • Yes
  • No

46
Human Reliability
  • Three additional criteria for evaluation of HEI
    techniques (for this paper)
  • HEI output quantifiability
  • Whether an HRA quantification technique - HEI
    technique partnership exists or whether error
    forms are beyond quantification at this time
    (see the SLIM sketch below)
  • Success Likelihood Index Method (SLIM)
  • Absolute Probability Judgement (APJ)
  • Paired Comparisons (PC)
  • THERP
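  As a concrete illustration of one such quantification partner, here is a
  minimal sketch of the SLIM calculation. The PSF weights, ratings and
  anchor-task HEPs below are hypothetical assumptions, not values from the
  papers.

      # Illustrative SLIM (Success Likelihood Index Method) calculation.
      import math

      def sli(weights, ratings):
          # Success Likelihood Index: weighted sum of PSF ratings (0-1 scale).
          return sum(w * r for w, r in zip(weights, ratings))

      # PSF weights (sum to 1) for, say, time pressure, interface quality, training.
      weights = [0.5, 0.3, 0.2]

      # Calibration: two anchor tasks with known HEPs fix a and b in
      #   log10(HEP) = a * SLI + b
      sli_good, hep_good = sli(weights, [0.9, 0.8, 0.9]), 1e-4  # well-supported task
      sli_bad, hep_bad = sli(weights, [0.2, 0.3, 0.1]), 1e-1    # poorly supported task
      a = (math.log10(hep_good) - math.log10(hep_bad)) / (sli_good - sli_bad)
      b = math.log10(hep_good) - a * sli_good

      # HEP for a new task rated on the same PSFs.
      sli_new = sli(weights, [0.6, 0.5, 0.7])
      hep_new = 10 ** (a * sli_new + b)
      print(f"SLI = {sli_new:.2f}, estimated HEP = {hep_new:.1e}")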

47
Human Reliability
  • Three additional criteria for evaluation of HEI
    techniques (for this paper), cont.
  • Life cycle stage applicability
  • Earliest life cycle stage when the technique can
    be applied
  • Concept
  • Detailed design
  • Commissioning
  • Existing/operational life cycle phases

48
Human Reliability
  • Three additional criteria for evaluation of HEI
    techniques (for this paper), cont.
  • Primary objective of the technique
  • Original purpose or objective of the technique
  • Evaluation of HEI techniques
  • Each of the 38 HEI techniques was evaluated by
    the author according to the 13 criteria just
    outlined
  • See Table 3 on pp. 170-172 in the first article

49
Human Reliability
  • Conclusions
  • No single HEI technique has the capability to
    predict/identify all possible errors in all
    environments/systems
  • Two possible approaches
  • Use a mixture of individual HEI tools to address
    all the required error types (toolkit approach)
  • Develop an integrated framework-based set of HEI
    tools (framework approach)
  • This has the advantage of allowing the analyst to
    custom design a set of HEI techniques to each
    specific environment/system

50
Human Reliability
  • HEI techniques applied
  • The second Kirwan paper describes a framework
    approach and applies it to a high risk, complex
    system: a nuclear power plant
  • Framework approach - best approach may be to
    combine several techniques into a viable
    framework
  • Framework developed for the UK Nuclear Power and
    Reprocessing industry

51
Human Reliability
  • Customized Framework Approach
  • Error assessment range must address the
    following error types
  • Skill and rule-based error forms
  • Cognitive errors
  • Errors of commission
  • Rule violations
  • Teamwork and communication errors
  • Framework Philosophy
  • Should include a range of techniques that the
    analyst can choose from to address all error
    types

52
Human Reliability
  • Customized Framework Approach, cont.
  • Integration and coherence - there must be
    commonalities between the various error type
    assessment modules, which should have the
    following components
  • External Error Modes (EEMs)
  • Performance Shaping Factors (PSFs)
  • Psychological Error Mechanisms (PEMs)
  • Question-and-answer checklists
  • Similar tabular formats for recording errors,
    factors, etc.
  • Procedures for error identification and selection
    of techniques
  • Worked examples

53
Human Reliability
  • Customized Framework Approach, cont.
  • Best technique usage - framework should build on
    the best techniques available and the useful
    functions of currently unavailable techniques
  • Resource flexibility - framework should try to
    provide both basic and more detailed approaches
  • Computerization - the system should be
    computerized

54
Human Reliability
  • Customized Framework Approach, cont.
  • Comprehensiveness vs. exclusivity - the system
    should aim to be comprehensive and open-ended,
    and most of the techniques should be
    checklist-based

55
Human Reliability
  • The Human Error and Recovery Assessment system
    (HERA)
  • Specifically designed for error assessment
    associated with Nuclear Power Plants and
    Reprocessing plants
  • System consists of
  • A document
  • A prototype software package
  • Several appendices
  • A software manual
  • Consists of eight functional modules
  • Scoping the analysis and critical task
    identification
  • Task analysis

56
Human Reliability
  • The Human Error and Recovery Assessment system
    (HERA), cont
  • Consists of eight functional modules, cont.
  • Skill- and rule-based error identification
    (the only module discussed in this paper)
  • Diagnostic and decision-making error
    identification
  • Error of commission analysis
  • Rule violation error identification
  • Teamwork and communication error identification

57
Human Reliability
  • The Human Error and Recovery Assessment system
    (HERA), cont
  • Consists of eight functional modules, cont.
  • Integration issues
  • Representation
  • Quantification
  • Error reduction
  • Documentation
  • Quality assurance

58
Human Reliability
  • The Human Error and Recovery Assessment system
    (HERA), cont
  • The scope of this system is aimed at safety and
    not operational problems
  • It does not address the following potential human
    errors
  • Safety culture - poor attitudes leading to safety
    problems
  • Software reliability - programming errors
  • Social or idiosyncratic errors - personal
    problems of the operators

59
Human Reliability
  • Practical aspects of the framework approach
  • Several potential problems with the framework
    approach have been identified
  • How to decide which modules to use?
  • Analyst selects different sub-modules depending
    on resources, degree of novelty of the system,
    degree of human involvement, and dependence of
    system safety on human reliability

60
Human Reliability
  • Practical aspects of the framework approach
  • Is there a risk of over-identifying or
    re-identifying the same errors?
  • There is some overlap; however, this approach
    allows the analyst more than one attempt to
    identify errors and to see the errors from
    different perspectives
  • What is the level of resources required?
  • This approach typically uses fewer resources than
    some techniques but more resources than other
    techniques
  • Can novices use this approach?
  • Yes, if supervised and effectively trained

61
Human Reliability
  • Conclusions: Advantages vs. Disadvantages of the
    Framework and Toolkit approaches
  • Framework
  • Advantages
  • Results in an integrated analysis
  • May use fewer resources
  • Possibly requires less training on the part of
    the analyst
  • Analysis tends to be very comprehensive and
    robust
  • Disadvantages
  • May not be the best method when a specific error
    type needs to be addressed

62
Human Reliability
  • Conclusions: Advantages vs. Disadvantages of the
    Framework and Toolkit approaches
  • Toolkit
  • Advantages
  • More divergent thinking on the part of the
    analyst may result in more comprehensive error
    identification
  • May be less prone to analyst bias since it may
    require more than one analyst
  • Analysis tends to be very comprehensive and
    robust
  • Disadvantages
  • May not be the best method when a specific error
    type needs to be addressed

63
Human Reliability
  • Human Error Identification (HEI) (Baber & Stanton)
  • The purpose of HEI techniques is the definition
    of points in the interaction between humans and
    artifacts or systems which are likely to give
    rise to errors. Typically this is achieved
    through four related practices
  • Representing the full range of operations that
    people can perform using the artifact or system
  • Determining the types of error that are likely to
    occur
  • Assessing the consequence of errors for system
    performance
  • Generating strategies to prevent, or reduce, the
    impact of errors

64
Human Reliability
  • There are three uses of HEI techniques
  • In the design of new artifacts, so that potential
    errors can be identified and rectified before
    production
  • In risk assessment, so that the impact of safety
    critical errors in system operation can be
    reduced
  • In accident investigation, so that the cause of
    errors can be established

65
Human Reliability
  • Criteria for comparing HEI techniques (see Table
    1, pg. 120)
  • Comprehensiveness - concurrent validity
  • Consistency - inter-analyst reliability
  • Theoretical validity - construct validity
  • Usefulness - face validity
  • Resource usage - factors such as time, etc.
  • Auditability - traceability from analyst to
    client
  • Acceptability - content validity

66
Human Reliability
  • HEI techniques applied: usability study
  • Toolkit approach - using a mixture of independent
    tools which together deal with all the required
    error types
  • Two HEI techniques were applied to the use of a
    ticket vending machine and compared with the more
    traditional method of observation
  • Task Analysis for Error Identification (TAFEI)
  • Predictive Human Error Analysis (PHEA)

67
Human Reliability
  • TAFEI
  • Psychologically-based tool
  • Relies on an understanding of factors affecting
    performance
  • Particularly characterized by tools that consider
    error causes (PSF) and/or error mechanisms (PEMs)
  • Affordance-based technique
  • Attempts to predict any human actions that could
    occur given the system architecture and its
    operational environment, and then tries to
    establish reasons for the actions
  • This approach is particularly relevant to errors
    of commission and rule violations
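  A rough sketch of the TAFEI idea applied to a ticket vending machine, the
  device studied by Baber & Stanton. The states and transitions below are
  invented for illustration; the published analysis will differ.

      # Transitions the design intends (the error-free path through machine states).
      legal = {
          ("waiting", "select ticket"): "ticket selected",
          ("ticket selected", "insert money"): "paid",
          ("paid", "collect ticket"): "ticket issued",
      }

      # Transitions the device physically affords but which depart from the
      # intended path; in TAFEI these "illegal" transitions flag potential errors.
      afforded = list(legal) + [
          ("waiting", "insert money"),       # paying before selecting a ticket
          ("ticket selected", "walk away"),  # abandoning the transaction midway
          ("paid", "select ticket"),         # re-selecting after payment
      ]

      for state, action in afforded:
          if (state, action) not in legal:
              print(f"Potential error: '{action}' while machine is in state '{state}'")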

68
Human Reliability
  • PHEA
  • Psychologically-based tool
  • Relies on an understanding of factors affecting
    performance
  • Particularly characterized by tools that consider
    error causes (PSF) and/or error mechanisms (PEMs)
  • Flowchart-based technique
  • Very structured and tends to produce consistent
    results between different analysts
  • Only assesses straightforward behaviors (skill-
    and rule-based behaviors)

69
Human Reliability - TAFEI vs. PHEA
  Criterion               TAFEI             PHEA
  • Comprehensiveness     S, R, Rv, EOC     S, R
  • Structuredness        Moderate          Moderate
  • Model-based?          High              Moderate
  • EEM/PEM/PSF           EEM, PSF          EEM, PSF
  • Usefulness (ERA)      High              Moderate
  • Resources usage       High              Moderate
  • Experts tool          Yes/Moderate      No
  • Experts required      Yes               No
  • Documentability       High              High
  • Usage in PSA?         Low               Low
  • Availability          Yes               Yes
  • Quantifiable?         N/A               SLIM/other
  • Life cycle stage?     Detailed          Detailed/Comm.
  • Primary objective     Error Modes       Errors leading to

70
Human Reliability
  • Procedure
  • Observations
  • Were made over three days at different times of
    the day and at different stations
  • Where possible, observers selected users who fell
    into certain specified categories of age, gender,
    and nationality

71
Human Reliability
  • Procedure, cont.
  • Analyst 1
  • TAFEI
  • Two-month waiting period
  • PHEA
  • Analyst 2
  • PHEA
  • Two-month waiting period
  • TAFEI

72
Human Reliability
  • Conclusions
  • Concurrent validity - little difference between
    the two HEI techniques (0.80); the HEI techniques
    produced an 80% level of agreement with observed
    data (see the worked example below)
  • Reliability - both HEI techniques produced a
    reliability rate of between 90% and 100% between
    the two analysts
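  A hedged illustration of how such an agreement figure can be derived (the
  counts are hypothetical and the papers' exact scoring scheme may differ):
  if observation logged 20 distinct error types and an HEI technique
  predicted 16 of them, agreement = 16 / 20 = 0.80, i.e. 80%.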

73
Human Reliability
  • Conclusions, cont.
  • Resource usage
  • Observation study: 30 hours 30 minutes
  • HEI study: 8 hours
  • Generality - the HEI techniques were able to
    capture all the principal error types; however,
    both HEI techniques failed to predict some of the
    observed errors

74
Human Reliability
  • Conclusions, cont.
  • Utility of HEI - Acceptable
  • Produced an 80% level of agreement with observed
    data
  • Reliable inter-rater measures
  • More cost-effective
  • Only one area of concern
  • HEI did not predict four error types that
    actually occurred
  • Criticality of unpredicted errors must be
    considered