1
Seven Principles of Sociotechnical Systems
Engineering
  • Martyn Thomas

2
SUMMARY
  • Preserve the real world requirements
  • Keep the humans in the loop
  • Training is a first-class system component
  • Human behaviour must be made dependable
  • Don't set traps
  • Plan for deviant behaviour
  • Development methods must support formal analysis
    for dependability

3
Sociotechnical Systems: a perspective that
recognises that
  • requirements are actions in the real world, not
    within the technology
  • the computer systems are almost always there to
    support the actions of humans, not the reverse
  • the humans whose behaviour can affect the
    achievement of the requirements are inside the
    system, not outside
  • ... and their behaviour is affected by their
    training, culture, laws, physical environment and
    ethics

4
STS Dependability
  • For a sociotechnical system to be dependable, it
    must create or preserve the required properties
    in the real world considering the probable
    actions of these humans.
  • This has profound implications for the systems
    engineer, including .....

5
Component specifications must preserve the
necessary real world requirements
  • What happened?
  • Airbus A320, Warsaw 1993
  • aircraft landed in a crosswind on a wet runway
  • aquaplaned, so brakes didn't work
  • pilot applied reverse thrust, but it was disabled
  • why?
  • airborne ⇒ reverse thrust disabled
  • "airborne" inferred from no WoW / no wheel pulses
    ⇒ disabled
  • but, in a cross-wind landing on a wet runway ...
  • simplified; for a full analysis see Mellor 94,
    Ladkin 96 (the interlock is sketched below)
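
To make the interlock logic above concrete, here is a minimal Python sketch; it is illustrative only, and the sensor names and the 72 kt threshold are assumptions, not the actual A320 logic. The point is that the component is specified in terms of sensor signals, while the real-world requirement concerns the aircraft actually being on the ground.

    # Illustrative sketch only: a simplified reverse-thrust interlock in the
    # spirit of the Warsaw discussion. Names and the 72 kt threshold are
    # assumptions, not the real A320 logic.

    def considered_airborne(weight_on_wheels: bool, wheel_speed_kts: float) -> bool:
        # Component spec: "airborne" is inferred from the absence of
        # weight-on-wheels and wheel spin-up signals.
        return not weight_on_wheels and wheel_speed_kts < 72.0

    def reverse_thrust_enabled(weight_on_wheels: bool, wheel_speed_kts: float) -> bool:
        # Real-world requirement: allow deceleration once the aircraft has landed.
        # Component spec: inhibit reverse thrust whenever "airborne".
        return not considered_airborne(weight_on_wheels, wheel_speed_kts)

    # Crosswind landing on a wet runway: one gear is lightly loaded and the
    # wheels aquaplane, so neither signal appears and reverse thrust stays
    # inhibited even though the aircraft is on the ground.
    print(reverse_thrust_enabled(weight_on_wheels=False, wheel_speed_kts=10.0))  # False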

6
Keep the humans in the loop: "why is it doing
that?" All humans within the STS must understand
the system's behaviour adequately at all times
  • The system designer should ensure that the users
    understand what the system is doing
  • 14 February 1990, Indian Airlines A320,
    Bangalore, India: controlled flight into terrain
    during approach. The aircraft hit about 400 metres
    short of the runway. Four of the seven crew
    members and 88 of the 139 passengers were killed.
    The pilot had accidentally caused the A320 to
    enter Open Idle descent; this delayed the
    alpha-floor activation that the PIC probably
    thought would save them. See Mellor 1994.

7
Training may be safety critical: training
materials and simulators should be considered
part of the STS
  • A fatal aircraft crash in 1987 was partly caused
    by inaccuracies in a flight training simulator.
  • The crash occurred when the crew attempted to
    take off without deploying flaps and slats
  • an unexplained electrical failure caused the
    take-off warning system to fail to warn them.
  • the simulator shows a warning-system failure if
    power to the warning system fails, whereas the
    equivalent aircraft power system fails silently
    (the contrast is sketched below).
  • the crew selected go-around on the flight
    director during the take-off, and the result was
    an additional pitch-up command of 6 degrees,
    whereas the simulator inhibits go-around during
    take-off. The aircraft stalled into the ground.
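
A minimal Python sketch of the power-failure divergence described above (illustrative only; the class names, signals and messages are assumptions, not the real warning system): the simulator model announces the failure, while the aircraft model simply falls silent.

    # Illustrative sketch: a training simulator whose failure behaviour
    # diverges from the aircraft it is supposed to represent.

    class AircraftTakeoffWarning:
        def __init__(self, powered: bool = True) -> None:
            self.powered = powered

        def annunciation(self, flaps_and_slats_set: bool) -> str:
            if not self.powered:
                return ""                          # fails silently: no warning, no indication
            return "" if flaps_and_slats_set else "TAKEOFF CONFIG WARNING"

    class SimulatorTakeoffWarning(AircraftTakeoffWarning):
        def annunciation(self, flaps_and_slats_set: bool) -> str:
            if not self.powered:
                return "WARNING SYSTEM INOP"       # simulator announces the failure
            return "" if flaps_and_slats_set else "TAKEOFF CONFIG WARNING"

    # Crews trained on the simulator learn to expect an explicit failure
    # indication that the real aircraft never gives.
    print(repr(AircraftTakeoffWarning(powered=False).annunciation(False)))   # ''
    print(repr(SimulatorTakeoffWarning(powered=False).annunciation(False)))  # 'WARNING SYSTEM INOP'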

8
Base all assumptions about human behaviour on
strong evidence
  • the actions of human actors must be predictable,
    whether as a result of their training or through
    psychological studies, observation, reasoning and
    analysis
  • the required human actions must not obstruct the
    humans' desire to get their job done effectively,
    or appear to do so (cf. NPfIT role-based access
    control)
  • test assumptions about behaviour (DIRC
    mammography study)
  • the system should support work-arounds, not
    obstruct them (ethnographic pattern book?)

9
Don't set traps!
  • Descent angle/rate on the A320
  • 20 January 1992, Air Inter A320, near Strasbourg,
    France: the aircraft had a controlled flight into
    terrain after the flight crew incorrectly set the
    flight management system, probably confusing the
    Flight Path Angle and Vertical Speed settings
    (illustrated by the sketch below). Five of the six
    crew and 82 of the 87 passengers perished.
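
A minimal Python sketch of that trap (illustrative only; the function, the "33" entry and the numbers are assumptions drawn from published analyses of the accident, not from the slide): the same panel entry commands very different descents depending on the selected mode.

    import math

    # Illustrative sketch of the Flight Path Angle / Vertical Speed trap:
    # the same two-digit entry commands very different descent profiles
    # depending on the selected mode. All values are illustrative.

    def commanded_descent_fpm(entry: str, mode: str, ground_speed_kts: float = 250.0) -> float:
        if mode == "FPA":
            angle_deg = float(entry) / 10.0                     # "33" read as a 3.3 degree path
            horizontal_fpm = ground_speed_kts * 6076.1 / 60.0   # knots converted to ft/min
            return horizontal_fpm * math.tan(math.radians(angle_deg))
        if mode == "VS":
            return float(entry) * 100.0                         # "33" read as 3300 ft/min
        raise ValueError(f"unknown mode: {mode}")

    print(round(commanded_descent_fpm("33", "FPA")))  # ~1460 ft/min: a normal approach profile
    print(round(commanded_descent_fpm("33", "VS")))   # 3300 ft/min: far steeper than intended
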
10
Plan for deviant behaviour
  • System designers should anticipate actions
    motivated by laziness, curiosity, avarice or
    malice
  • For example, hospitals have experienced hundreds
    of insider attempts to read the medical records
    of celebrities (a design response is sketched
    below).
  • Security is much harder to assure than safety,
    because malice undermines assumptions about the
    independence of failures.
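
A minimal Python sketch of one design response (illustrative only; the data structures and the flag-and-review rule are assumptions, not NPfIT or any real hospital system): reads outside a clinician's caseload remain possible, so legitimate work is not obstructed, but they are audited and flagged rather than silently allowed.

    # Illustrative sketch: every read is audited; reads outside the user's
    # caseload are allowed but flagged for later review.
    from dataclasses import dataclass, field
    from typing import Dict, List, Set, Tuple

    @dataclass
    class AuditedRecordStore:
        caseloads: Dict[str, Set[str]]                                 # user -> patients they treat
        audit_log: List[Tuple[str, str, bool]] = field(default_factory=list)

        def read_record(self, user: str, patient: str) -> str:
            flagged = patient not in self.caseloads.get(user, set())
            self.audit_log.append((user, patient, flagged))
            if flagged:
                # access is still possible (don't obstruct the job), but it is
                # visibly escalated instead of silently permitted
                print(f"REVIEW: {user} read {patient} outside their caseload")
            return f"record for {patient}"

    store = AuditedRecordStore(caseloads={"dr_a": {"p1", "p2"}})
    store.read_record("dr_a", "p1")          # routine read, logged quietly
    store.read_record("dr_a", "celebrity")   # logged and flagged for review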

11
Development methods must support formal analysis
for dependability
  • It is impractical or impossible to gain adequate
    confidence in any significant STS through testing
    alone
  • Formal analysis must therefore be at the core of
    the dependability case (a toy contrast with
    testing is sketched below)
  • The necessary science is incomplete. The
    engineering methods that exploit the science are
    immature or have not yet been developed
  • Current industry standards for developing
    critical STS are inadequate
  • This is a grand challenge for researchers and for
    the systems industry.
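
As a toy contrast between testing a few cases and analysing all of them, here is a minimal Python sketch that exhaustively checks a stated real-world property over every state of the simplified interlock from the Warsaw example; it illustrates exhaustive enumeration of a tiny finite model, not a real verification method or standard.

    # Illustrative sketch: check a stated property over every state of a
    # tiny finite model instead of sampling a few test cases.
    from itertools import product

    def considered_airborne(weight_on_wheels: bool, wheels_spun_up: bool) -> bool:
        return not weight_on_wheels and not wheels_spun_up

    def reverse_thrust_enabled(weight_on_wheels: bool, wheels_spun_up: bool) -> bool:
        return not considered_airborne(weight_on_wheels, wheels_spun_up)

    # Desired real-world property: whenever the aircraft is actually on the
    # ground, reverse thrust is available. "on_ground" is a real-world fact,
    # not a sensor, so the model includes states where the sensors disagree.
    violations = []
    for on_ground, wow, spun_up in product([False, True], repeat=3):
        if not on_ground and (wow or spun_up):
            continue                              # these sensors cannot fire in the air
        if on_ground and not reverse_thrust_enabled(wow, spun_up):
            violations.append((on_ground, wow, spun_up))

    print(violations)   # [(True, False, False)]: on the ground, but no WoW or spin-up seen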

12
SUMMARY
  • Preserve the real world requirements
  • Keep the humans in the loop
  • Training is a first-class system component
  • Human behaviour must be made dependable
  • Don't set traps
  • Plan for deviant behaviour
  • Development methods must support formal analysis
    for dependability