Design for Human Capabilities

1
Design for Human Capabilities
  • Task Analysis

2
Agenda
  • Task Analysis
  • Evaluation
  • Predictive evaluation
  • Heuristic evaluation
  • Discount usability testing
  • Cognitive walkthrough

3
Task Conformance
  • Task coverage
  • Can system do all tasks of interest?
  • Task adequacy
  • Can user do tasks?
  • Does system match real-world tasks?

4
Task Analysis
  • Analyzing how people do their jobs
  • Go to their environment
  • Examine users' tasks to better understand what
    they need from the interface and how they will use it

5
Task Analysis
  • Broad Focus
  • Observe users of current system(s)
  • Generate requirements
  • Hierarchical task analysis
  • Knowledge-based task analysis
  • Entity-Relationship model

6
Existing System
  • Usually task analysis involves an examination of
    an existing system, process or practice
  • Watch what they do and how they do it

7
No Existing System
  • Gather documents, talk with knowledgeable people,
    etc.
  • Can still be useful to help generate requirements

8
Broad Focus
  • Don't just focus on computer system artifacts and
    interactions
  • Study related processes and objects in the
    environment that people may use or involve in the task
  • Example: an office environment (papers,
    whiteboards, etc.)

9
Task Analysis Focus
  • Not on internal cognitive state of user (more on
    that in the near future)
  • Focus on observable behaviors
  • Observe users, what they do, and how they do it
  • What practices, methods, steps, and objects are
    used?

10
Key Component
  • Requirements: usually involves developing a set of
    requirements for what the interface should provide
  • Example: get food in a cafeteria
  • A. User must be able to see all food items and
    costs
  • B. Tray and utensils must be easily accessible
  • C. ...

11
Types of Task Analysis
  • 1. Hierarchical task decomposition
  • 2. Knowledge-based analysis
  • 3. Entity-relationship methods

12
1. Hierarchical Task Decomposition
  • Decompose task into
  • Subtasks
  • Multiple levels
  • Plans describing ordering and conditions

13
Common Plans
  • Fixed sequence
  • Optional tasks
  • Waiting for events
  • Cycles
  • Time-sharing (parallel)
  • Discretionary
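
Not part of the original deck: a minimal Python sketch of how a hierarchical task decomposition with a plan per task might be represented. The Task class and the cafeteria decomposition are illustrative assumptions, not the slides' notation.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Task:
        name: str
        plan: str = "fixed sequence"  # or "optional", "cycle", "time-sharing", "discretionary"
        subtasks: List["Task"] = field(default_factory=list)

    # Hypothetical decomposition of a cafeteria task.
    get_food = Task("get food in cafeteria", subtasks=[
        Task("pick up tray and utensils"),
        Task("select food items", plan="cycle"),  # repeated until done
        Task("pay at register"),
    ])

    def show(task: Task, depth: int = 0) -> None:
        # Print one level of indentation per level of the hierarchy.
        print("  " * depth + f"{task.name} [{task.plan}]")
        for sub in task.subtasks:
            show(sub, depth + 1)

    show(get_food)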

14
2. Knowledge-based
  • List all objects and actions involved in a task,
    then build a taxonomy of them
  • Often, work with a domain expert to get help

15
Methodology
  • Sample
  • Get 3x5 cards
  • Put a different object/action on each
  • Don't worry about repetition at this point!
  • Group into piles, subpiles, etc.
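
A small sketch of the same card-sorting idea in Python (hypothetical cards and piles, not from the slides): each card holds one object or action, and grouping cards into piles and sub-piles yields a taxonomy.

    # Hypothetical cards for a cafeteria task, one object or action per card.
    cards = ["tray", "fork", "pizza", "salad", "pay", "carry", "choose"]

    # Piles and sub-piles become a nested taxonomy.
    taxonomy = {
        "objects": {
            "utensils": ["tray", "fork"],
            "food": ["pizza", "salad"],
        },
        "actions": ["pay", "carry", "choose"],
    }

    # Every card should end up in exactly one pile.
    sorted_cards = [c for pile in taxonomy["objects"].values() for c in pile]
    assert sorted(sorted_cards + taxonomy["actions"]) == sorted(cards)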

16
Utility
  • This type of task analysis can be very useful
    when you're writing a manual or other
    documentation
  • Taxonomy maps onto document sections

17
3. Entity-Relationship
  • Object-based methodology, with strong emphasis on
    the relationships between objects and actions
  • Involves
  • Concrete objects
  • Actors
  • Composite objects

18
Example
  • Task: develop a design for the final project
  • Objects - Pens, paper, drawing tools, etc.
  • Actors - Mary, Bob, Sally
  • Composite objects - The team

19
Methodology
  • Often list the attributes and actions of objects

Object: pen (simple)
  Attributes: color = red; writing = on/off
Object: Mary (actor)
  Actions: M1 = make a sketch; M2 = organize meeting
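
Not from the slides: one possible Python rendering of the entity-relationship notation above, with illustrative class names.

    from dataclasses import dataclass, field
    from typing import Dict, List

    @dataclass
    class SimpleObject:
        name: str
        attributes: Dict[str, str] = field(default_factory=dict)

    @dataclass
    class Actor:
        name: str
        actions: List[str] = field(default_factory=list)

    # The slide's example, transcribed.
    pen = SimpleObject("pen", attributes={"color": "red", "writing": "on/off"})
    mary = Actor("Mary", actions=["M1: make a sketch", "M2: organize meeting"])
    team = [mary, Actor("Bob"), Actor("Sally")]  # composite object: the 'team'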
20
Sources of Information
  • Documentation, training manuals
  • Beware: they often say what is supposed to
    happen, not what happens in real life
  • Observation
  • Make sure to just watch, get a feel for what the
    task involves
  • Interviews
  • Why do you do that?
  • What happens if something goes wrong?

21
Use
  • Produce documentation
  • Training, manuals, tutorials
  • Requirements capture and system design
  • Helps you define requirements document
  • Helps decide what should be included
  • Helps interface design
  • Hierarchical breakdown might feed menu design
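
As a quick illustration of that last point (hypothetical hierarchy, not from the slides), a task hierarchy maps almost directly onto menus: each top-level subtask becomes a menu, each child a menu item.

    # Hypothetical hierarchical breakdown of a mail client's tasks.
    hta = {
        "manage messages": ["read message", "delete message", "file message"],
        "compose": ["new message", "reply", "forward"],
    }

    # Flatten the hierarchy into a menu structure.
    for subtask, children in hta.items():
        print(subtask.title())           # menu name
        for child in children:
            print("   ", child.title())  # menu items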

22
Evaluation
  • Gathering data about the usability of a design by a
    specified group of users for a particular
    activity within a specified environment

23
Goals
  • 1. Assess the extent of the system's functionality
  • 2. Assess effect of interface on user
  • 3. Identify specific problems with system

24
Forms
  • Summative
  • After a system has been finished; makes judgments
    about the final product
  • Formative
  • As the project is forming; runs all through the
    lifecycle, early and continuous

25
Approaches
  • Experimental (lab studies)
  • Typically in a closed lab setting
  • Manipulate independent variables to see their
    effect on dependent variables
  • Naturalistic (field studies)
  • Observation occurs in a real-life setting
  • Watch the process over time
  • Ecologically valid

26
Tradeoffs
  • Experimental: may be able to isolate cause and
    effect, but is it ecologically valid?
  • Naturalistic: ecologically valid, but no
    experimental control

27
Evaluation Methods
  • 1. Experimental/Observational Evaluation
  • a. Collecting user opinions
  • b. Observing usage
  • c. Experiments (usability specifications)
  • 2. Predictive Evaluation
  • 3. Interpretive Evaluation

28
Predictive Evaluation
  • Basis
  • Observing users can be time-consuming and
    expensive.
  • Try to predict usage rather than observing it
    directly.
  • Conserve resources (quick, low cost)

29
Approach
  • Expert reviews
  • HCI experts interact with system and try to find
    potential problems and give prescriptive feedback
  • Best if
  • Haven't used an earlier prototype
  • Familiar with domain or task
  • Understand user perspectives

30
Methods
  • 1. Heuristic Evaluation
  • 2. Discount usability testing
  • 3. Cognitive Walkthrough
  • 4. User Modeling

31
Heuristic Evaluation
  • Developed by Jakob Nielsen
  • Several evaluators assess system based on simple
    and general heuristics (principles or rules of
    thumb)

32
Procedure
  • 1. Gather inputs
  • 2. Evaluate system
  • 3. Debriefing and collection
  • 4. Severity rating

33
Gather Inputs
  • Who are evaluators?
  • Need to learn about domain, its practices
  • Get the prototype to be studied
  • May vary from mock-ups and storyboards to a
    working system

34
Evaluation Method
  • Reviewers evaluate system based on high-level
    heuristics

  • Use simple and natural dialog
  • Speak the users' language
  • Minimize memory load
  • Be consistent
  • Provide feedback
  • Provide clearly marked exits
  • Provide shortcuts
  • Provide good error messages
  • Prevent errors
35
Updated Heuristics
  • Stresses

  • Visibility of system status
  • Match between system and real world
  • User control and freedom
  • Consistency and standards
  • Error prevention
  • Recognition rather than recall
  • Flexibility and efficiency of use
  • Aesthetic and minimalist design
  • Recognition, diagnosis, and recovery from errors
  • Help and documentation
36
Process
  • Perform two or more passes through the system,
    inspecting
  • Flow from screen to screen
  • Each screen
  • Evaluate against heuristics
  • Find problems
  • Subjective
  • Don't dwell on whether it is or isn't a "real"
    problem

37
Debriefing
  • Organize all problems found by different
    reviewers
  • At this point, decide what is and isn't a
    problem
  • Group, structure

38
Severity Rating
  • 0-4 rating scale
  • Based on
  • frequency
  • impact
  • persistence
  • market impact
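
The slides don't prescribe how to combine ratings; one plausible approach, sketched in Python, is to average each evaluator's 0-4 rating per problem and fix the worst first. The problem names and scores here are made up.

    from statistics import mean

    # Hypothetical ratings: problem -> one 0-4 severity rating per evaluator.
    ratings = {
        "unlabeled icon": [3, 4, 3],
        "no undo":        [4, 4, 4],
        "slow feedback":  [1, 2, 1],
    }

    # Rank problems by mean severity, worst first.
    for problem, scores in sorted(ratings.items(), key=lambda kv: -mean(kv[1])):
        print(f"{problem}: mean severity {mean(scores):.1f}")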

39
Advantage
  • Cheap; good for small companies that can't afford
    more
  • Getting someone practiced in the method is valuable

40
Application
  • Nielsen found that about 5 evaluators find roughly
    75% of the problems (a sketch of this curve follows
    below)
  • Above that you get more, but at decreasing
    efficiency
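
Not on the original slide: Nielsen and Landauer modeled this curve as found(i) = N(1 - (1 - λ)^i), where λ is the fraction of problems a single evaluator finds. A minimal Python check, assuming λ = 0.24 (a value consistent with the 75%-at-five figure; reported values vary by study):

    # Proportion of problems found by i evaluators, each independently
    # finding a fraction lam of the problems (Nielsen-Landauer model).
    def proportion_found(i: int, lam: float = 0.24) -> float:
        return 1 - (1 - lam) ** i

    for i in (1, 3, 5, 10, 15):
        print(f"{i:2d} evaluators: {proportion_found(i):.0%}")
    # Five evaluators find roughly 75%; gains beyond that shrink quickly.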

41
Discount Usability Testing
  • Hybrid of empirical usability testing and
    heuristic evaluation
  • Have 2 or 3 think-aloud user sessions with paper
    mock-ups or mock-ups from a prototyping tool

42
Cognitive Walkthrough
  • From Polson, Lewis, and colleagues at the
    University of Colorado, Boulder
  • Like a code walkthrough in software engineering
  • Assess learnability and usability through
    simulation of way users explore and become
    familiar with interactive system

43
CW Process
  • Construct carefully designed tasks from system
    spec or screen mock-up
  • Walk through the cognitive and operational activities
    required to go from one screen to another
  • Review actions needed for task, attempt to
    predict how users would behave and what problems
    theyll encounter

44
Requirements
  • Description of users and their backgrounds
  • Description of task user is to perform
  • Complete list of the actions required to complete
    task
  • Prototype or description of system

45
Assumptions
  • User has a rough plan
  • User explores the system, looking for actions that
    contribute to performing the task
  • User selects the action that seems best for the
    desired goal
  • User interprets the response and assesses whether
    progress has been made toward completing the task

46
Methodology
  • Step through action sequence
  • Action 1
  • Response A, B, ..
  • Action 2
  • Response A
  • ...
  • For each one, ask four questions

47
CW Questions
  • 1. Will users be trying to produce whatever
    effect the action has?
  • 2. Will users be able to notice that the correct
    action is available?
  • 3. Once found, will they know it's the right one
    for the desired effect?
  • 4. Will users understand the feedback after the
    action?
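
A minimal sketch (hypothetical structure, not a tool the slides describe) of stepping through an action sequence and posing the four questions at each step; the analyst's judgments would be recorded alongside each question during a real session.

    QUESTIONS = [
        "Will users be trying to produce whatever effect the action has?",
        "Will users notice that the correct action is available?",
        "Will users know the action is the right one for the desired effect?",
        "Will users understand the feedback after the action?",
    ]

    # Hypothetical action sequence for the 'program a VCR' example.
    actions = ["press 'Program'", "enter start time", "enter end time", "enter channel"]

    for step, action in enumerate(actions, 1):
        print(f"Action {step}: {action}")
        for q in QUESTIONS:
            print("  -", q)  # record yes/no plus evidence or a failure scenario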

48
Answering the Questions
  • 1. Will the user be trying to produce the effect?
  • Typical supporting evidence
  • It is part of their original task
  • They have experience using the system
  • The system tells them to do it
  • No evidence?
  • Construct a failure scenario
  • Explain, back up opinion

49
Another Question
  • 2. Will the user notice the action is available?
  • Typical supporting evidence
  • Experience
  • Visible device, such as a button
  • Perceivable representation of an action such as a
    menu item

50
Example
  • Task: program a VCR
  • List actions
  • Ask questions