Title: Psychological Aspects of Risk Management and Technology
1. Psychological Aspects of Risk Management and Technology: Overview
2- "The correct functioning of the train control
system and the automatic traffic control system
is to be monitored by the signaller. If
necessary, he/she has to intervene manually.
During normal operation, no monitoring is
necessary as long as the operational requirements
are met. In the case of disturbances or
incidents, the notification of the required
services and the required alarm procedures must
be guaranteed." -
- (Excerpt from the rule book of a European
railway company)
3. Supervisory control (Sheridan, 1987)
- Planning off-line what task to do and how to do it
- Teaching the computer what was planned
- Monitoring the automatic action on-line to make sure all is going as planned and to detect failures
- Intervening when deviations occur or plans change
- Learning from experience in order to improve performance (a rough code sketch of this loop follows)
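The hypothetical Python sketch below is only an illustration of these five supervisory-control functions gathered in one controller object; the class, method and field names and the toy logic are assumptions, not Sheridan's formalism or any real control API.

```python
# Hedged sketch of the five supervisory control functions (plan, teach,
# monitor, intervene, learn); all names and values are illustrative only.


class SupervisoryController:
    def __init__(self):
        self.experience = []  # records of past runs, used for learning

    def plan(self, goal):
        """Plan off-line what task to do and how to do it."""
        return {"goal": goal, "steps": ["prepare", "execute", "shut down"]}

    def teach(self, automation, plan):
        """Hand the plan over to the technical system."""
        automation["program"] = plan["steps"]

    def monitor(self, automation):
        """Check on-line that execution follows the plan and detect failures."""
        return automation.get("status", "unknown") == "nominal"

    def intervene(self, automation):
        """Take over when deviations occur or the plan changes."""
        automation["mode"] = "manual"
        self.experience.append("intervention needed")

    def learn(self):
        """Use recorded experience to improve future planning."""
        return len(self.experience)
```

In this reading, the plan-teach-monitor-intervene-learn loop runs continuously, which is exactly the monitoring burden the next slide questions.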
4. Ironies of automation (Bainbridge, 1983)
- "If decisions can be fully specified then a computer can make them more quickly than a human operator can. There is therefore no way in which the human operator can check in real-time that the computer is following its rules correctly. If the computer is being used to make the decisions because human judgement and intuitive reasoning are not adequate in this context, then which of the decisions is to be accepted? The human monitor has been given an impossible task."
- Attempts at removing humans from automated systems increase their importance as system back-up, while at the same time reducing human capabilities and motivation for adequate judgement of technical functioning and adequate intervention.
5. Design challenges in automated systems
- Avoiding a mix of qualitative overload and quantitative underload
- Avoiding leftover activities in automation gaps
- Securing implicit knowledge
- Providing a fit between accountability and control
6. Strategies for distributing tasks between human and technology
7. Relative capabilities of humans and machines
- Humans: coping with ill-defined problems
- Machines: handling complex, but well-defined tasks
8. Critique of the comparison strategy
- Humans and technology cannot be quantitatively compared
- The same function is carried out in qualitatively different ways by humans and technology
- Humans and technology do not substitute, but complement each other
- Instead of either-or decisions, the interaction between humans and technology needs to be designed
- Task requirements are determined by interactions between different functions
- Automating one function will influence the handling of another function by the human
9. Overcoming the ironies of automation through the design principle of complementarity
- Designing the interaction between human and technical system based on complementary differences, creating a new performance quality
- Technology not as a competitor and not as an imitation of the human operator aiming at replacing him, but as
- complementary support of human strengths (e.g. solving ill-defined problems) and compensation of human shortcomings (e.g. unreliable repetition of monotonous operations)
- Supporting human control over technical systems and the human's role as system manager
10. Propositions for safe design of automation
- Every automated system is a socio-technical system, irrespective of its degree of automation.
- People are accountable for correct system functioning even in fully automated systems.
- Accountability requires control over the technical system in terms of system transparency and predictability and proper means for influencing the system.
- Instead of either-or decisions based on quantitative comparisons of capabilities and performance, human-technology interaction is to be designed based on complementary use of qualitatively different performance potentials.
- System control is determined by the distribution of tasks between human and technology and between people.
11. Objectives of the design method KOMPASS (Grote et al., 2000; Wäfler et al., 2003)
- Embedding function allocation decisions in job and organizational design
- Supporting task design that considers cognitive as well as motivational preconditions and outcomes
- Supporting prospective design in interdisciplinary teams
- Balancing user participation and reliance on expert criteria
- Implementing a complementary design philosophy
12. KOMPASS: Global design objectives
- Work system: local regulation of system variances and disturbances
- Individual work task: competence development and intrinsic motivation
- Human-technology interaction: human control over technology
13. KOMPASS: Design criteria
14. Example of KOMPASS criteria: Information and execution authority
- Manual
- Manual, technically supported
- Manual, technically constrained
- Manual, technically supported and constrained
- Automatic, manually confirmed
- Automatic
Design objective: fit between the level of information and execution authority and between overall authority and responsibility (one possible operationalisation is sketched below)
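As an illustration only, the Python sketch below encodes the six levels as an ordered enum and checks one assumed reading of the "fit" objective; the enum names, the numeric ordering and the tolerance of one level are assumptions made for this sketch, not part of the published KOMPASS instrument.

```python
# Hedged sketch: ordered scale of information/execution authority levels and
# an assumed "fit" check between them. Names and the one-level tolerance are
# illustrative assumptions only.

from enum import IntEnum


class AuthorityLevel(IntEnum):
    MANUAL = 0
    MANUAL_TECHNICALLY_SUPPORTED = 1
    MANUAL_TECHNICALLY_CONSTRAINED = 2
    MANUAL_SUPPORTED_AND_CONSTRAINED = 3
    AUTOMATIC_MANUALLY_CONFIRMED = 4
    AUTOMATIC = 5


def authority_fit(information: AuthorityLevel, execution: AuthorityLevel) -> bool:
    """Assumed reading of the design objective: the automation level of
    execution should not drift far from the automation level of the
    information the operator receives (here: at most one step apart)."""
    return abs(int(information) - int(execution)) <= 1


# Example: highly automated execution combined with purely manual information
# gathering would violate the fit criterion under this reading.
print(authority_fit(AuthorityLevel.MANUAL, AuthorityLevel.AUTOMATIC))  # False
```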
15. KOMPASS: Design heuristic
- Phase 1: Expert analysis of existing work systems.
- Phase 2: Discussion of design philosophy.
  - Step 2.1: Definition of the primary task and the functions of the planned work system.
  - Step 2.2: Definition of a shared evaluation concept to differentiate between successful and unsuccessful work systems.
  - Step 2.3: Identification of the main potentials for improvement and design objectives.
  - Step 2.4: Identification of the potential contributions to a successful work system by human operator, technical system and organizational conditions.
  - Step 2.5: Specification of the working conditions required for human operators to make their specific contributions.
  - Step 2.6: Decision on the usefulness of the KOMPASS criteria for the analysis, evaluation and design of work systems.
- Phase 3: Derivation of concrete design requirements.
  - Step 3.1: Derivation of requirements for system design.
  - Step 3.2: Definition of work packages.
16. KOMPASS application
- Analysis of different scenarios for task distribution and work process design regarding the KOMPASS criteria (a rough comparison sketch follows below)
- Applying the KOMPASS design heuristic to gain an understanding of (implicit) assumptions in scenarios regarding e.g.
  - competence for handling and power for transferring uncertainties by the different actors
  - the role of technology vs. different human operators as back-ups
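Purely as an illustration of such a scenario analysis, the sketch below rates two hypothetical task-distribution scenarios on a few placeholder criteria and ranks them; the criteria names, scenarios and scores are invented for this example and do not reproduce the published KOMPASS instrument.

```python
# Hypothetical scenario comparison against placeholder KOMPASS-style criteria;
# all names and ratings are invented for illustration.

from statistics import mean

CRITERIA = ["process transparency", "decision authority", "flexibility"]

SCENARIOS = {
    "operator as back-up for technology": {
        "process transparency": 2, "decision authority": 2, "flexibility": 3},
    "technology as support for operator": {
        "process transparency": 4, "decision authority": 4, "flexibility": 3},
}


def rank_scenarios(scenarios):
    """Rank scenarios by their mean rating across the criteria."""
    scored = ((name, mean(ratings[c] for c in CRITERIA))
              for name, ratings in scenarios.items())
    return sorted(scored, key=lambda item: item[1], reverse=True)


print(rank_scenarios(SCENARIOS))
```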
17. But ...
- Are we approaching the limits of human control over technology?
- Increasing complexity and uncertainty in system networks and learning systems
- Example: Air Traffic Management
- Example: Pervasive computing
18. Design for (partial) non-controllability
- Giving human operators information about the limits of control (illustrated in the sketch below)
- Shifting accountability to system designers and operating organizations for handling the limitations of operator control
- Gaining control by giving up control?
19. Designing driverless cars
- Develop a general scenario for introducing driverless cars
- Describe the distribution of tasks in the scenario
- Describe the distribution of control and accountability in the scenario