1
CSI / ELG / SEG 2911 Professional Practice /
Pratique professionnelle
  • TOPICS 13-15
  • Society, The Environment and The Future
  • Some of the material in these slides is derived
    from slides produced by Sara Baase, the author of
    the textbook A Gift of Fire, and also from other
    professors who have taught this course, including
    Stan Matwin and Liam Peyton

2
System Failures
  • System failures have caused
  • Much death and destruction
  • Hundreds of billions of dollars in economic loss
  • An estimated $70B/year in avoidable loss due to
    poor project management alone
  • Several individual systems have had multi-billion
    dollar losses
  • Much general inconvenience
  • It is the job of the profession and professionals
    to work to reduce this loss

3
An Excellent Website: The Risks Digest
  • http://catless.ncl.ac.uk/Risks/
  • We will look at a couple of situations today

4
Failures and Errors in Systems
  • Most common high-level causes of system design
    failures
  • Lack of clear, well thought out goals and
    specifications
  • Poor management and poor communication among
    customers, designers, programmers, etc.
  • Pressures that encourage unrealistically low
    bids, low budget requests, and underestimates of
    time requirements
  • Use of very new technology, with unknown
    reliability and problems
  • Refusal to recognize or admit a project is in
    trouble
  • Lack of education or qualifications of critical
    personnel

5
Failures and Errors in Systems 2
  • Most computer applications are so complex that it
    is virtually impossible to produce programs with
    no errors
  • A failure usually has more than one cause
  • Professionals must study failures in order to
  • Learn how to avoid them
  • Understand the impacts of poor work

6
Denver Airport Fiasco
  • The baggage system failed due to real-world
    problems, problems in other systems, and software
    errors
  • Main causes
  • Time allowed for development was insufficient
  • Denver made significant changes in specifications
    after the project began

7
The Therac-25
  • Therac-25 Radiation Overdoses
  • Massive overdoses of radiation were given
  • The machine said no dose had been administered at
    all
  • Caused severe and painful injuries and the death
    of three patients
  • Important to study this to avoid repeating errors
  • Manufacturer, computer programmer, and
    hospitals/clinics all have some responsibility

8
The Therac-25 (cont.)
  • Software and Design problems
  • Re-used software from older systems, unaware of
    bugs in previous software
  • Weaknesses in design of operator interface
  • Inadequate test plan
  • Bugs in software (one documented example is
    sketched below)
  • Allowed the beam to deploy when the turntable was
    not in the proper position
  • Ignored changes and corrections operators made at
    the console

9
The Therac-25 (cont.)
  • Why So Many Incidents?
  • Hospitals had never seen such massive overdoses
    before, were unsure of the cause
  • The manufacturer said the machine could not have
    caused the overdoses and that no other incidents
    had been reported (which was untrue)
  • The manufacturer made changes to the turntable
    and claimed they had improved safety after the
    second accident.
  • The changes did not correct any of the causes
    identified later

10
The Therac-25 (cont.)
  • Why So Many Incidents? (cont.)
  • Recommendations were made for further changes to
    enhance safety
  • The manufacturer did not implement them
  • The FDA declared the machine defective after the
    fifth accident
  • The sixth accident occurred while the FDA was
    negotiating with the manufacturer on what changes
    were needed

11
The Therac-25 (cont.)
  • Observations and Perspective
  • Minor design and implementation errors usually
    occur in complex systems; they are to be expected
  • The problems in the Therac-25 case were not minor
    and suggest irresponsibility
  • Accidents occurred on other radiation treatment
    equipment without computer controls when the
    technicians
  • Left a patient unattended, after treatment had
    started, in order to attend a party
  • Did not properly measure the radioactive drugs
  • Confused microcuries and millicuries (a 1000x
    error; see the sketch below)
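
Confusing microcuries and millicuries is a silent
factor-of-1000 dosing error. A minimal sketch
(hypothetical, in Python) of the standard defence:
carry every activity in an explicit unit type with one
internal base unit, so the two units can never be
interchanged unnoticed.

    # Minimal sketch of unit-safe dose handling: all activities are
    # converted to one internal base unit (microcuries) on entry, so
    # mCi and uCi cannot be silently confused later.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Activity:
        microcuries: float   # single internal base unit

        @staticmethod
        def from_millicuries(mci: float) -> "Activity":
            return Activity(mci * 1000.0)   # 1 mCi = 1000 uCi

        @staticmethod
        def from_microcuries(uci: float) -> "Activity":
            return Activity(uci)

    prescribed = Activity.from_millicuries(0.5)    # 0.5 mCi
    measured = Activity.from_microcuries(500.0)    # 500 uCi
    assert prescribed == measured   # same dose; no 1000x surprise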

12
Case Study: The Therac-25 Discussion Question
  • If you were a judge who had to assign
    responsibility in this case, how much
    responsibility would you assign to the
    programmer, the manufacturer, and the hospital or
    clinic using the machine?
  • Top Hat Monocle Question

13
The Environment 1
  • Hardware should be made in the greenest way
    possible
  • Lowest possible energy input to manufacturing
    process
  • Avoidance or reduced use of dangerous or depleted
    substances
  • Arsenic - used in displays to prevent defects
  • Mercury - used in fluorescent backlights for
    displays
  • Lead - formerly used in CRTs, still used in some
    solder
  • Hexavalent Chromium, Cadmium and other heavy
    metals
  • Banned by RoHS rules (Restriction of Hazardous
    Substances)

14
The Environment 2
  • Standardized and replaceable components to avoid
    wastage
  • E.g. Universal power adapter / charger, standard
    batteries
  • EU will be mandating USB connectors to charge all
    cell phones
  • Recyclable materials and design for recyclability
  • Avoidance of design for obsolescence
  • Responsible waste disposal
  • Take-it-back policies and campaigns
  • Bounties
  • Cash for clunkers
  • Refund of deposits when an item reaches the end
    of its life
  • Avoiding shipping e-waste to developing countries
    for disposal

15
The Environment 3
  • Hardware and software that economizes on energy
    use
  • Avoidance of DC-power adapters that are always
    using phantom power
  • It is possible to preserve state with a battery
    and switch the transformer on only when needed
  • Switching off and slowing down circuits,
    displays, etc. that are not in use
  • Adaptive, efficient, power-aware algorithms
  • Power-aware distributed computing
  • Run compute-intensive tasks where power is
    cheaper and/or where wind and solar power are
    currently being generated (see the sketch below)
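
As a concrete illustration of the last point, here is a
minimal sketch of power-aware placement (Python; the
sites, prices and renewable fractions are invented for
the example): pick the data centre whose electricity is
currently cheapest after penalizing non-renewable
supply.

    # Hypothetical power-aware placement: effective cost combines the
    # current electricity price with a penalty for the non-renewable
    # share of the local supply.

    sites = {
        # site: (price per kWh, fraction of supply currently renewable)
        "site_a": (0.05, 0.95),
        "site_b": (0.08, 0.30),
        "site_c": (0.12, 0.60),
    }

    def placement_cost(price: float, renewable: float,
                       carbon_weight: float = 0.10) -> float:
        return price + carbon_weight * (1.0 - renewable)

    def choose_site() -> str:
        return min(sites, key=lambda s: placement_cost(*sites[s]))

    print(choose_site())   # "site_a" with the numbers above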

16
The Environment 4
  • Focusing on the web's use of power
  • YouTube is expected to be losing $470m per year,
    largely due to the huge costs of storing and
    delivering massive amounts of video
  • http://www.guardian.co.uk/technology/2009/may/03/internet-carbon-footprint
  • In 2011, data centres and the Internet were
    estimated to be using 2% of all electricity in
    the world
  • http://www.newscientist.com/blogs/onepercent/2011/10/307-gw-the-maximum-energy-the.html

17
The Environment 5
  • Green / social accounting
  • Accounting for environmental costs, not just
    financial costs
  • Inventing computing technology to support other
    green initiatives
  • Smart grid
  • Monitoring and distributing power better, so
    solar, wind and other local green power sources
    can be more effectively used
  • http://tims-ideas.blogspot.ca/2013/02/solar-power-has-bright-future-provided.html
  • Software to improve energy efficiency of
    vehicles and other energy-consuming devices
  • Software for environmental modelling to help
    improve scientific understanding of climate change

18
Social responsibility
  • The theory that any entity has a responsibility
    to society at large
  • Many groups of professionals have formed social
    responsibility groups
  • For our field, one example is Computer
    Professionals for Social Responsibility
  • http://cpsr.org
  • Another worthy organization is Engineers Without
    Borders

19
Social responsibility issues 1
  • Corporate social responsibility
  • Beyond just obeying the law
  • Examples
  • Avoiding creating products or services whose main
    intended use will lead to social harm, and
    avoiding selling potentially harmful products to
    questionable entities
  • E.g. Tools for violating privacy
  • Avoiding exploitation at manufacturing plants and
    software development sites in developing
    countries
  • Involvement of corporations in community-based
    projects

20
Social responsibility issues 2
  • Availability of technology
  • To developing countries and the poor (The Digital
    Divide)
  • Helping train local people
  • Providing them with basic technology and internet
    access
  • This used to be a much greater problem before the
    recent rapid uptake of cellular phones into
    developing countries
  • But many landlocked countries in Africa still
    lack good Internet access
  • Access to computers is still low (as is access to
    books)
  • To rural areas
  • E.g. ensuring there is broadband and cellular
    coverage
  • To schools
  • E.g. the One Laptop per Child program
  • Education can help bring children out of poverty
  • To the disabled
  • Ensuring software designs follow accessibility
    guidelines

21
Social responsibility issues 3
  • Internet and computer addiction and isolation
  • The more people use the Internet or spend time
    gaming
  • The more they lose contact with their real
    social environment
  • Or is it a different form of contact?
  • People who might be considered socially awkward
    can often have personally fulfilling interaction
    through the Internet
  • Second Life / Facebook
  • The less they use traditional media
  • The more time they spend working (at the office
    and home)
  • The more at risk they are of becoming addicted
  • E.g. Internet Addiction Disorder
  • See http://www.netaddiction.com/
  • Is working on a computer more isolating than
    reading a book (an activity that is usually
    applauded)?

22
Social responsibility issues 4
  • Computers and children
  • How much should children be exposed to computers
    and the Internet? At what ages?
  • Bad effects
  • Kids can learn many bad things from the open
    internet
  • They can become addicted to the web and/or games
  • Good effects
  • Higher test scores, especially for
    underprivileged children
  • See http://www.apa.org/news/press/releases/2006/04/internet-use.aspx

23
Social responsibility issues 5
  • Free and open-source software
  • Availability of this has stimulated for-profit
    enterprises to lower prices and improve quality
  • Encourages availability for the disadvantaged
  • Reduces monopoly by companies and countries

24
Social responsibility issues 6
  • Pro bono donation of time by engineers and
    computer experts to the disadvantaged and to
    charities
  • Developing for local charities
  • International development, e.g. Engineers Without
    Borders

25
Social responsibility issues 7
  • Women in engineering and computing
  • Computing is one of the industries with the
    lowest fraction of women
  • Use of engineering and computing for peaceful
    means only
  • Voting technology and promotion of democracy and
    civil society
  • Internet and spectrum governance
  • ICANN - still under US government control

26
Social responsibility issues 8
  • Promotion of freedom of speech and related rights
  • Opposition to censorship in certain countries
  • Net neutrality
  • Cryptome and Wikileaks
  • Revealing questionable information
  • Electronic rights
  • Electronic Frontier Foundation (EFF)
  • http://www.eff.org/
  • Main issues
  • Bloggers' and coders' rights
  • Opposition to digital rights management, software
    patents
  • Promotion of privacy and transparency

27
Risks of Catastrophic Failures: Electrical
Engineers must beware! (1)
  • Electromagnetic disturbances
  • Interference (in many guises)
  • Pulses from nuclear explosions
  • Space weather
  • Gamma ray bursts
  • Ripple-effect grid failure
  • Infrastructure degeneration
  • Theft of copper

28
Risks of Catastrophic Failures: Electrical
Engineers must beware! (2)
  • Weather
  • Hurricanes, tornadoes, ice storms
  • Droughts and heatwaves
  • Increasing demand, drying up reservoirs
  • Toxins in materials
  • E-waste
  • Mercury in fluorescent lighting

29
Risks of catastrophic failures: Computer
professionals must beware! (1)
  • Major industries may be brought down for short or
    long periods by IT failures
  • Food distribution, energy, transportation,
    communications, finance and markets
  • In other words, everything society depends on
  • IT failures causing this may result from
  • Natural or man-made disasters taking out
    computing infrastructure we have come to depend
    on
  • Design flaws
  • Hacking and cyber warfare
  • A combination of the above

30
Risks of catastrophic failures: Computer
professionals must beware! (2)
  • The risks of large-scale catastrophe are small on
    a day-to-day basis, but large in the long run
  • Dependency on IT and computing is growing
  • Complexity is growing
  • Some types of threats (e.g. hacking) are growing
  • There is a risk of cascading effects
  • Some failures (e.g. energy) lead to others (e.g.
    telecom and food distribution) leading to
    isolated or more widespread social breakdown
  • Low short-term risk but tremendously high costs
    mean that vigilance and action are imperative

31
Single point of failure: GPS
  • The GPS System may become unavailable or
    dramatically less reliable
  • In one area or around the world
  • Causes
  • Jamming, solar flares
  • Failure of satellites from various causes
  • US government withdrawal of service in a crisis
  • What can fail
  • Military and civilian navigation, emergency
    response, delivery of products and services
  • Remediation
  • Backups such as inertial navigation with dead
    reckoning (sketched below) and visual
    identification
  • Use of the Russian (GLONASS), EU (Galileo) and
    Chinese (BeiDou) systems
  • Ongoing use of LORAN (which the US no longer
    supports)
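
Dead reckoning propagates the last known fix using only
speed and heading, so navigation degrades gracefully
while GPS is out. A minimal sketch (Python, flat-earth
approximation, invented sensor readings); note that
errors accumulate, so it is only a stopgap until a new
fix arrives.

    import math

    def dead_reckon(x: float, y: float, speed: float,
                    heading_deg: float, dt: float) -> tuple:
        """Advance a position (metres) by speed (m/s) along a
        heading (degrees clockwise from north) for dt seconds."""
        theta = math.radians(heading_deg)
        return (x + speed * math.sin(theta) * dt,   # east
                y + speed * math.cos(theta) * dt)   # north

    # last GPS fix at the origin, then three logged 10-second legs
    x, y = 0.0, 0.0
    for speed, heading in [(5.0, 90.0), (5.0, 90.0), (4.0, 45.0)]:
        x, y = dead_reckon(x, y, speed, heading, dt=10.0)
    print(f"estimate: {x:.1f} m east, {y:.1f} m north")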

32
Single point of failure: Electricity systems
controlled by computers
  • Increased software control could lead to
    cascading failure
  • Causes
  • Design errors and hacking
  • Magnetic storms, ice storms, heat waves etc.
    leading to cascading overloads
  • Breakdown in markets, perhaps caused by fuel
    shortages or price increases
  • What can fail
  • All of industrial and domestic power supply
  • Hence computers, telecom, etc. once backup
    sources run out
  • This has happened (e.g. the August 2003 Northeast
    blackout)
  • Remediation
  • Fail-safe islanding of the grid (see the sketch
    below)
  • Secondary independent control system
  • Backup power sources for critical infrastructure
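
The cascading dynamic is easy to see in miniature. A
toy sketch (Python, invented loads and capacities):
when one element trips, its load shifts to its live
neighbours, and any neighbour pushed past capacity
trips in turn. Islanding works by cutting exactly these
propagation paths.

    from collections import deque

    capacity = {"A": 100, "B": 100, "C": 100, "D": 100}
    load = {"A": 90, "B": 60, "C": 85, "D": 40}
    neighbours = {"A": ["B", "C"], "B": ["A", "D"],
                  "C": ["A", "D"], "D": ["B", "C"]}

    def cascade(first_failure: str) -> set:
        failed = {first_failure}
        queue = deque([first_failure])
        while queue:
            down = queue.popleft()
            live = [n for n in neighbours[down] if n not in failed]
            if not live:
                continue
            share = load[down] / len(live)   # load shifts evenly
            for n in live:
                load[n] += share
                if load[n] > capacity[n]:    # overload trips it too
                    failed.add(n)
                    queue.append(n)
        return failed

    print(cascade("A"))   # here one failure takes down the whole grid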

33
Single point of failure: Grounding of all
vehicles of a given type due to a software glitch
  • As vehicles become more software-driven, life
    threatening vulnerabilities may be discovered
  • E.g. the Toyota acceleration problem
  • E.g. fly-by-wire in airplanes
  • E.g. millions of vehicles becoming prone to
    hacking
  • Consequences
  • Millions of people or businesses being forced off
    the road
  • Chaos in airlines
  • Causes
  • Design errors, hacking
  • Time to fix may be lengthy
  • Remediation
  • Fail-safe backup systems

34
Single point of failure: Electronic banking,
finance and market system failures
  • Banks, credit card networks, stock trading, and
    similar systems go down or suffer data breaches
  • Consequences
  • Temporary interruption of many types of business
  • Market crashes
  • Loss of private information
  • Loss of records of transactions
  • Remediation
  • Alternative markets
  • Diversification
  • Accounts in different institutions

35
Single point of failure: Air traffic control
failures
  • Many small-scale examples of this have occurred
  • Causes
  • Bugs, power outages, hacking, upgrade failures,
    network failures, radar jamming, etc.
  • Remediation
  • Protocols for scaling back flights
  • A backup system that works and is regularly
    tested and used
  • In the long run we may have control systems for
    road vehicles subject to similar modes of failure

36
Single point of failure: Zero-day vulnerabilities
in major OSs, websites etc.
  • For example, a new vulnerability is found and
    exploited by a virulent worm
  • Causes
  • Latent design flaws coupled with hacking or
    cyber-warfare
  • Consequences
  • Systems of many kinds go down
  • Remediation
  • Avoid consumer operating systems in critical
    infrastructure
  • Use heterogeneous tools
  • Have backup tools, and use them regularly
  • Back up data, and test the backups (see the
    sketch below)
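
On the last point: untested backups tend to fail
exactly when they are needed. A minimal sketch (Python,
hypothetical paths) of one routine test, confirming
that every file in the source tree has a backup copy
with a matching SHA-256 digest.

    import hashlib
    from pathlib import Path

    def sha256_of(path: Path) -> str:
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    def verify_backup(source: Path, backup: Path) -> list:
        """Return relative paths whose backup is missing or corrupt."""
        bad = []
        for src in source.rglob("*"):
            if src.is_file():
                rel = src.relative_to(source)
                dst = backup / rel
                if not dst.is_file() or sha256_of(src) != sha256_of(dst):
                    bad.append(rel)
        return bad

    # e.g. verify_backup(Path("/data"), Path("/mnt/backup/data"))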

37
Single point of failure: Cellular and general
telecom system failures
  • Communications we rely on for many aspects of
    business fail
  • We have seen many small-scale examples
  • Causes
  • Hacking, design flaws, cable cuts
  • Consequences
  • Emergency response fails, businesses shut down,
    Internet shuts down or becomes degraded
  • Remediation
  • Maintain landline and mobile as alternatives
  • Interconnects between providers
  • Diversity of underlying technologies

38
Single point of failure: Robots or AI systems run
amok
  • A favourite scenario in sci-fi
  • A realistic possibility in the more-distant
    future
  • With advances in technology it seems certain that
    within 50, 100, or at most 200 years, computers
    and robots will be more intelligent than we are
  • What will this mean for society?
  • Can we and should we do anything in preparation?
  • Engineers are working hard to enable robots to
    interact appropriately with humans
  • E.g. not too much force when in physical contact

39
Top Hat Monocle Questions
  • Which scenario do you fear the most occurring?
  • Which scenario do you think is the most likely
    within your lifetime?

40
Asimov's Laws of Robotics: Fiction, yet a good
basis for discussion of risks
  • 1. A robot may not injure a human being or,
    through inaction, allow a human being to come to
    harm.
  • But how is a robot to know what will necessarily
    harm a human?
  • 2. A robot must obey any orders given to it by
    human beings, except where such orders would
    conflict with the First Law.
  • How is a robot to know whether there would be any
    conflict?
  • 3. A robot must protect its own existence as
    long as such protection does not conflict with
    the First or Second Law.