1
Usability--and Beyond!
Understanding Usefulness, Usability & Use
CHI 2004 Tutorial, April 2004
Diane J. Schiano <Abbreviated for PARC>
2
Table of Contents
                                                Page
Introduction                                      15
A Primer on User Experience Research Design       30
--- Core UE Research Design Principles            37
--- Key Pragmatic Issues                          56
Methods, Measures & More                          67
--- Key Research Design Decisions                 68
--- Methods: Self-Report & Observational          69
--- Measures & More: Quantitative, Qualitative   128
More On Communicating Results                    148
Appendix                                         155
3
Course Learning Objectives
  • This tutorial will provide a general understanding of:
  • Usefulness, usability and use studies: what they are, how and why they are done, and the kinds of information they yield, with extended examples and resources available in the tutorial notes.
  • User research design principles, and the pragmatic challenges of attaining validity, reliability, generalizability and robustness in user studies.
  • The major self-report and observational methods available for user studies, and how to choose and apply them appropriately.
  • Approaches to dealing with qualitative, quantitative and hybrid data, and to summarizing results and making design recommendations from them.
  • Pragmatic considerations in applying procedures, discussed from logistical, organizational, and personal/privacy perspectives.
  • Also provided:
  • Guidelines, readings, and other resources for conducting studies and evaluating findings, both in class and in the tutorial notes.
  • Opportunities for expert and peer feedback on practical problems participants are invited to bring in for consideration during the User Research Design Clinic.

4
Abstract
Digital products are growing increasingly
complex, encompassing interactions among humans
as well as between humans and technology. Careful
methods are required to understand user
experience of these products, and to use this
understanding to inform iterative design. The
goal of this tutorial is to provide a practical
understanding of the principles and procedures
used to assess product usefulness, usability and
use. Participants are given guidance and
grounding in general user research design
principles and procedures to aid them in choosing
methods, conducting studies, evaluating results,
and making recommendations effectively, even
under constrained conditions. A principled yet
pragmatic approach is advocated, consolidating
the best of classic usability engineering and
ethnographic methods--applied appropriately
for the research question and context at hand.
Useful exercises, extended examples and extensive
references and other resources are provided.
Finally, attendees are invited to bring actual
problems to discuss in the User Research Design
Clinic for peer and expert feedback.
5
Usability--and Beyond!
Understanding Usefulness, Usability & Use
6
Usability--and Beyond!
Understanding User Experience! Focusing on the Principles Underlying the Procedures
7
Introduction
8
Creating Useful, Usable & Used Products
  • Depends critically on effective User Research.
  • "If we build it, they will come" -- NOT!!!
  • What's the use of designing products that aren't:
  • Useful,
  • Usable,
  • Used???!!!

9
Core User Experience Research Concepts
  • Usefulness
  • Why--and how--could the product be useful to people? Design (& marketing) implications from current practice?
  • Usability
  • How easily--and well--can the product be learned and used? Implications for re-design?
  • Use
  • How do people actually use the product? Implications for re-design?


10
User-Centered Product Research Design
Usefulness --> Use <----------------------> Usability
From UsabilityNet
11
Usefulness, Usability, Use in Product Cycle
Usefulness
Usability
Requirements
Implementation
Test Measure
Planning Feasibility
Design
Post-Release
Use
12
Usefulness, Usability, Use in Product Cycle
Usability
Use
Usefulness
From UsabilityNet
13
We'll Focus on Research Design Principles
  • Primarily the WHYs behind choosing and implementing user research methods.
  • For more on HOW-TOs, see:
  • UsabilityNet (http://www.usabilitynet.org)
  • Appendix references & recommendations


14
UsabilityNet: An Excellent Resource!
  • High Quality, Free
  • How-Tos, Mini-Tutorials
  • http://www.usabilitynet.org

15
Core User Experience Research Concepts
  • Usefulness
  • Why--and how--could the product be useful to people? Design (& marketing) implications from current practice?
  • Usability
  • How easily--and well--can the product be learned and used? Implications for re-design?
  • Use
  • How do people actually use the product? Implications for re-design?


16
Two Classic Approaches to UE Research
Ethnography (Usefulness & Use)
Human Factors Engineering (Usability)
17
The Ethnographic Approach
  • Traditional Emphases
  • Usefulness & Use (motivations, practice)
  • Self-report w/ contextualized observation
  • Naturalistic context, no (or low) control
  • "Why?" & "How?" questions
  • Qualitative data & deliverables

18
The Human Factors Engineering Approach
  • Traditional Emphases
  • Usability
  • Observation (task performance)
  • Lab context, high control
  • "How often/fast/much?" questions
  • Quantitative data & deliverables

19
These Approaches are Now Converging
  • Self-report & observation are complementary
  • Naturalistic observations are becoming
    increasingly common (esp. on the Internet)
  • Using converging methods is more informative and
    cost-effective.

20
These Approaches are Now Converging
  • So it is becoming increasingly important to understand the core principles underlying ALL user experience research.
  • And that's why we're here!

21
A Primer on User Research Design
22
A Primer on User Research Design
  • "Science is the elucidation of common sense."
  •     -- Francis Bacon (attrib.)
  • "There are the hard sciences, and then there are the difficult sciences."
  •     -- Gregory Bateson

23
The Art & Science of UE Research Design
  • The creative use of ...
  • research principles & pragmatics
  • to construct, conduct & communicate
  • research to effectively inform product design.

24
Overview of the Research Process
Prioritize. Focus on what you want to learn.
Design your research project using appropriate methods, based on:
  - Research principles & pragmatic considerations, taken together.
Conduct the research appropriately.
Analyze & interpret findings responsibly.
  - Use caution & qualify as needed.
Communicate your findings effectively.
25
Your Key Research Design Decisions
Methods (What you can do):
    Ask (Self-Report) <-----------> Observe (Behavior)
Context (How--& where--you do it):
    Naturalistic <-----------> Controlled
Data, analyses & deliverables:
    Qualitative <-----------> Quantitative
26
My Advice
  • KISS: Keep it simple, s'il vous plaît!
  • >> Focus on what you really need to learn.
  • Your questions, goals, deliverables
  • >> Prioritize!
  • Design principles & pragmatic considerations
  • >> What evidence would convince YOU?
  • Use common sense
  • Be your own best critic. Challenge yourself!

27
Core Research Design Principles
  • Validity
  • Reliability
  • Generalizability

28
Core Research Design Principles
  • Validity (Am I really studying what I think I
    am?)
  • aka internal validity
  • Reliability (Will my findings be repeatable?)
  • aka statistical significance
  • Generalizability (Do my findings apply
    appropriately?)
  • aka external validity

29
Validity & Reliability
30
Validity & Reliability
31
Validity & Reliability
High Reliability, High Validity: Consistent & ON-Target
32
Validity & Reliability
33
Validity & Reliability
High Reliability, Low Validity: Consistent but OFF-Target
34
Validity & Reliability
35
Validity & Reliability
Low Reliability, Low Validity: NOT Consistent & OFF-Target
36
Generalizability
37
Generalizability, aka External Validity
External Applicability: Throughput to Related Target(s)
38
Core Research Design Principles
  • Validity (Am I really studying what I think I am?)
  • Confounds & controls, errors & biases
  • Look for disconfirming evidence
  • Reliability (Will my findings be repeatable?)
  • Statistical significance v. chance
  • Sample size (# participants, observations)
  • Generalizability (Will my findings apply appropriately?)
  • Representativeness (of sample, context)

39
Core Research Design Principles
  • Validity
  • Reliability
  • Generalizability
  • These 3 principles may seem simple at first glance, but they are profound--and the issues involved can become quite complex.
  • They provide a foundation for evaluating ALL research--qualitative or quantitative, naturalistic or controlled, market research or usability testing.

40
Example: IRC's LambdaMOO Project
41
LambdaMOO Project Overview
  • Goals: Assess "Hype-otheses", Characterize Community
  • Ask (Self-Report) <-----------> Observe Behavior
  •   Survey, Interviews <-----------> Logfile Analysis
  • Naturalistic <-----------> Controlled Context
  •   LambdaMOO <------> Lab?
  • Qualitative <-----------> Quantitative Data
  •   E.g., Transcripts <-----------> Hours Logged On
  • Convergent Methods Approach

42
LambdaMOO Project Methods
  • Survey (Self-Report)
  • 1-Week "Call" upon Login; 581 Respondents
  • 30 Questions, Various Formats, Online
  • Interviews (Self-Report)
  • 12 Real-Life, Long-Term Participants (Many IVR, etc.)
  • 1.5-2 hr In-Depth, Semi-Structured Interviews; Maps & Follow-ups
  • Logging Studies (Naturalistic Observed Behavior)
  • Who/Where/When @ 1-min Intervals, 24 hr/day, 2 wks
  • Privacy Respected
  • Data on 4,000 Users Obtained (Twice, 6 Mo. Interval)
  • And Much More...
  • Participant observations, attending BayMOO mtgs, comparison studies, etc.

43
Q: Addiction to LambdaMOO?
  • Self-Report Use Estimates Very High
  • Previous research papers, popular books & press
  • 80 hrs/wk not uncommon
  • Our interview findings not inconsistent
  • Our Logfile Observations Differed Greatly
  • Mean 8 hrs/wk (with multi-tasking & idle time!)
  • Less than 5% of users on for 20 or more hrs/wk

44
LambdaMOO Use Data from Logfiles
Mean = 1.13 hrs/day (~8 hrs/wk)
45
Q: Addiction to LambdaMOO?
  • Q: Are you convinced by the logfile data? Why or why not? Can you explain the divergent results?
  • >> Validity (Am I really studying what I think I am?)
  • >> Reliability (Will my findings be repeatable?)
  • >> Generalizability (Will my findings apply appropriately?)

46
Key Pragmatic Issues
  • Robustness (How strong is this effect?)
  • Impact (How important is this finding?)
  • Convergence (More is better!)

47
Key Pragmatic Issues
  • Robustness (How strong is this effect?)
  • Large in magnitude (effect size)
  • Impact (How important is this finding?)
  • High in priority, severity
  • Convergence (More is better!)
  • Multiple modest methods can be most informative and cost-effective.

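The bullets above can be made concrete with an effect-size calculation. Cohen's d (a standard effect-size measure, not named on the slide itself) expresses the difference between two group means in pooled-standard-deviation units; the data below are hypothetical, merely echoing the self-report-vs.-logfile gap discussed in this tutorial:

```python
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Cohen's d: standardized difference between two group means."""
    na, nb = len(group_a), len(group_b)
    va, vb = stdev(group_a) ** 2, stdev(group_b) ** 2
    # Pooled standard deviation across the two samples
    pooled_sd = (((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)) ** 0.5
    return (mean(group_a) - mean(group_b)) / pooled_sd

# Hypothetical hours-per-week estimates: self-report vs. logged use
self_report = [70, 80, 85, 90, 75]
logged = [6, 8, 9, 10, 7]
d = cohens_d(self_report, logged)
```

By convention a d around 0.8 already counts as "large," so a gap on the order of 80 v. 8 hrs/wk is enormous by any standard.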
48
Q: Addiction to LambdaMOO?
  • Robustness (How strong is this effect?)
  • Huge effect size (difference): 80 v. 8 hrs/wk
  • Impact (How important is this finding?)
  • High potential social import, impact
  • Convergence (More is better!)
  • Self-Report & Observation; Qualitative & Quantitative
  • Triangulating "why", "how", & "how often/fast/much" questions from various perspectives.

49
Other Pragmatic Considerations
  • Complexity of Product/Design/System
  • Your Deliverables
  • Design recommendations? Presentation? Paper?
  • Time Frame
  • Product deadlines, readiness, design cycle
  • Cost
  • Time, money, personnel
  • Other Resources & Constraints
  • Availability of participants, prototypes, tools
  • Your skills, expertise & interests
  • Organizational & political priorities
  • Etc.

50
Example: LambdaMOO Project
  • Extremely Complex System, User Community
  • Our Deliverables
  • Business & design recommendations
  • Time-Frame
  • System complete but evolving; cohort effects
  • Cost
  • High
  • Other Resources & Constraints
  • Research ideal: Many resources, few constraints

51
Pragmatic Suggestions from UsabilityNet.
52
1) Given Limited Time & Resources
53
2) Given No Direct Access to Users
54
3) Given Limited Skill/Expertise
55
Methods, Measures & More
56
Key Research Design Decisions
Methods (What you can do):
    Ask (Self-Report) <-----------> Observe (Behavior)
Context (How--& where--you do it):
    Naturalistic <-----------> Controlled
Data, analyses & deliverables:
    Qualitative <-----------> Quantitative
57
Methods
58
Methods Overview
  • Self-Report (e.g., Surveys, Interviews)
  • Explanations: Meaning, salience, satisfaction
  • Feelings, opinions, preferences, priorities
  • Other otherwise-unobservables (w/ caution!)
  • Observations (e.g., Logfile & Clickstream Analyses, Task Performance)
  • Naturalistic behavior
  • Task performance

59
Taxonomy of Common Methods (after Nielsen, 1993)
  • Heuristic Evaluation: Self-Report (of Experts)
  • Performance Measures: Observed Behavior
  • Verbal Protocols: Self-Report
  • Observation: Observed Behavior
  • Questionnaires: Self-Report
  • Interviews: Self-Report
  • Focus groups: Self-Report (in Groups)
  • Logging actual use: Observed Behavior
  • User feedback: Self-Report

60
Key Research Design Decisions
Methods (What you can do):
    Ask (Self-Report) <-----------> Observe (Behavior)
Context (How--& where--you do it):
    Naturalistic <-----------> Controlled
Data, analyses & deliverables:
    Qualitative <-----------> Quantitative
61
Self-Report Methods
62
Self-Report Methods
  • Surveys, Questionnaires
  • Interviews
  • And More

63
Surveys, Questionnaires
64
Surveys
What you can do:
    Ask (for Self-Report) <-----------> Observe Behavior
How (& where) you do it:
    Naturalistic <-----------> Controlled Context
Data, analyses & deliverables:
    Qualitative <-----------> Quantitative
65
Surveys
What you can do:
    Ask (for Self-Report) <-----------> Observe Behavior
How (& where) you do it:
    Naturalistic <-----------> Controlled Context
Data, analyses & deliverables:
    Qualitative <-----------> Quantitative
66
Surveys
What's wrong with this picture? -->
67
Survey

01. How often do you use <Feature X>?
02. How good a user of MS Word do you consider yourself?
03. Which version of Word do you usually use?
04. When you need to find out how to do something using <Feature X>,
    what do you usually do first?
    a) Run through the menus        c) Read the manual
    b) Run through the toolbars     d) Access the easy-to-use help index
10. How do you feel about <Feature X>?
    a) It's one of the best things Microsoft ever did
    b) I like it                    d) I don't like it
    c) Indifferent                  e) I hate it so much I turned it off
11. Why?
68
Surveys
  • + Broad demographics, bkgnd information
  • + Specific questions, often quantitative
  • + Fairly cheap & easy to conduct & analyze
  • + Useful pre- & post-task comparisons
  • - Limited, not natural context
  • - Subject to biases, memory effects

69
Surveys: General Issues & Guidelines
  • Be Brief, Clear, Specific, Consistent & Easy
  • Question Design
  • Specific > General
  • Forced-Choice > Agree/Disagree
  • Offer "No Opinion" or "N/A" & "Comment" options
  • Use rating scales for measuring intensity
  • Ease of Using Surveys Comes At a Price
  • Measurement errors & biases. Sampling issues.
  • Open-ended Qs harder to analyze, but more valid?
  • ALWAYS Pilot Test & Iterate on Questions!

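One reason structured response formats pay off is that coding becomes trivial. A minimal sketch (scale labels and responses are hypothetical) of mapping rating-scale answers to numbers while keeping "N/A" responses out of the average rather than miscoding them:

```python
from statistics import mean

# Hypothetical 5-point rating scale; "N/A" is excluded from the mean
# rather than coded as 0, which would bias the result downward.
SCALE = {"Strongly disagree": 1, "Disagree": 2, "Neutral": 3,
         "Agree": 4, "Strongly agree": 5}

def mean_rating(responses):
    """Average the coded ratings, skipping N/A; None if all are N/A."""
    coded = [SCALE[r] for r in responses if r != "N/A"]
    return mean(coded) if coded else None

responses = ["Agree", "N/A", "Strongly agree", "Neutral", "Agree"]
avg = mean_rating(responses)  # 4.0
```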
70
Example: LambdaMOO Survey
  • Survey
  • 1-Week "Call" upon Login; 581 Respondents
  • 30 Questions, Various Formats, Online
  • A relatively quick and easy way to ask for a fairly large amount of self-report information from a large population. The survey was structured, and response formats were designed for ease of coding, analysis and comparison.
  • Note: Quantitative survey results are often more valid, useful, and interesting when considered comparatively rather than absolutely.
71
Example LambdaMOO Survey
  • Ask (Self-Report) <-----------> Observe Behavior
  • Naturalistic <-----------> Controlled Context
  • Qualitative <-----------> Quantitative Data

72
Example: Sociality in LambdaMOO?
  • Survey Responses

73
User Surveys for Design I (From UsabilityNet)
Usefulness
74
User Surveys for Design II
75
User Surveys for Design III
76
Subjective Assessment Surveys I
Usability Use
77
Subjective Assessment Surveys II
78
Subjective Assessment Surveys III
79
Interviews
80
Interviews
What you can do:
    Ask (for Self-Report) <-----------> Observe Behavior
How (& where) you do it:
    Naturalistic <-----------> Controlled Context
Data, analyses & deliverables:
    Qualitative <-----------> Quantitative
81
Interviews
What you can do:
    Ask (for Self-Report) <-----------> Observe Behavior
How (& where) you do it:
    Naturalistic <-----------> Controlled Context
Data, analyses & deliverables:
    Qualitative <-----------> Quantitative
82
Interviews
  • + Less limited, more naturalistic
  • + Greater depth of understanding
  • - Difficult to collect & analyze results
  • - Subject to biases, memory effects

83
Interviews: General Issues & Guidelines
  • Context choices
  • Conversational style or structured interview?
  • At your facility or theirs?
  • ALWAYS use a guide sheet.
  • Be prepared!
  • Consider logistics, how to record data & take notes, AND how to analyze & present results.

84
Example: Sociality in LambdaMOO?
  • Previous reports emphasized the importance of sociality in LambdaMOO.
  • "Great good place" >>> Club/pub analogy
  • Our survey and logfile data were mixed:
  • Survey suggested most time spent socializing.
  • Logfile analysis showed most time spent alone!
  • Modal # of characters in rooms together = 1.
85
Q: Sociality in LambdaMOO?
  • Interviews explained how BOTH were correct:
  • Most people discussed spending most time SOCIALIZING ALONE!
  • Using remote messaging
  • MOOmail, paging (presaging IM)
  • Typically (& increasingly) from home
  • For security & multi-tasking
86
Interviews I (From UsabilityNet)
Usefulness, Usability, Use
87
Interviews II
88
More Self-Report Methods
  • See UsabilityNet for a variety of other self-report--and hybrid--methods.
  • If you're interested--and if there's time--we can discuss some of these methods later in the tutorial.

89
Observational Methods
90
Observational Methods
  • "You can learn a lot from looking."
  •     -- Yogi Berra (attrib.)

91
Observations
What you can do:
    Ask (for Self-Report) <-----------> Observe Behavior
How (& where) you do it:
    Naturalistic <-----------> Controlled Context
Data, analyses & deliverables:
    Qualitative <-----------> Quantitative
92
Observations
What you can do:
    Ask (for Self-Report) <-----------> Observe Behavior
How (& where) you do it:
    Naturalistic <-----------> Controlled Context
Data, analyses & deliverables:
    Qualitative <-----------> Quantitative
93
Naturalistic Observations
  • Direct
  • Outside observer in context, or
  • Participant Observer
  • Indirect (Recorded)
  • Video, audio recordings
  • Clickstream and logfile data
  • ---> Notes & Quotes

94
Ethnographic Naturalistic Observations
(not hypothesis driven)
Fernando's filing system (from Nardi)
95
Ethnographic Observations
  • Typically, detailed and extended observation of behavior and artifacts in context. Often in conjunction with interviews & participant observation.
  • Ethnographic core concepts:
  • Holism
  • Natives' point(s) of view
  • Natural context
  • History

96
Classic Example Ethnographic Research
  • Schiano, Nardi, Gumbrecht & Swartz (CHI 2004 short paper submission). "Blogging by the Rest of Us."

97
Example: LambdaMOO Log Observations
Ask (for Self-Report) <-----------> Observe Behavior
Naturalistic <-----------> Controlled Context
Qualitative <-----------> Quantitative Data
98
LambdaMOO Logfile Studies
  • Logged information
  • State of each object (who/where/when) in system
  • Recorded @ 1-min intervals, 24 hr/day, 2 wks
  • Data on 4,000 users obtained
  • Twice, with 6-month interval btwn studies
  • Huge, rich database of objective information on use patterns in naturalistic context.
  • Extensive data re-coding and analysis was required.
  • The findings are very compelling.

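Because presence was sampled at 1-minute intervals, each logfile record amounts to one connected minute, and use estimates reduce to counting. A minimal sketch with a hypothetical record layout (the project's actual format is not shown in the slides):

```python
from collections import Counter

def hours_per_week(records, weeks=2):
    """Estimate hrs/wk per user from 1-minute presence samples."""
    # Each hypothetical (timestamp, user, room) record = 1 connected minute
    minutes = Counter(user for _ts, user, _room in records)
    return {user: mins / 60 / weeks for user, mins in minutes.items()}

# Hypothetical data: one user present in 960 one-minute samples (16 hrs)
records = [(t, "guest", "living_room") for t in range(960)]
usage = hours_per_week(records)  # {"guest": 8.0} hrs/wk over 2 weeks
```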
99
Q: Addiction to LambdaMOO?
  • Self-Report Use Estimates Very High
  • Previous research papers, popular books & press
  • 80 hrs/wk not uncommon
  • Our interview findings not inconsistent
  • Our Logfile Observations Differed Greatly
  • Mean 8 hrs/wk (with multi-tasking & idle time!)
  • Less than 5% of users on for 20 or more hrs/wk

100
Q Addiction to LambdaMOO?
Mean = 1.13 hrs/day (~8 hrs/wk)
101
LambdaMOO Project Epilog
  • Convergent methods were very effective; we felt we gained a good understanding of the community.
  • Our company decided NOT to invest in MUDs or other online communities--but by interpreting the findings narrowly, it missed a major opportunity: IM & chat!
  • Schiano (1998). "Lessons from LambdaMOO." Presence.

102
User Observations I (From UsabilityNet)
Usefulness, Use
103
User Observations II
104
Usability Tests as Controlled Observations
  • Task performance is observed behavior
  • Exploratory, assessment, validation & comparison tests (Rubin)
  • Highly controlled task, lab context
  • Aids direct comparisons, ease of analysis, reliability
  • May hinder validity, generalizability

105
Classic Example Human Factors Research
  • Schiano, Ehrlich, Raharja & Sheridan (2000). "Face to Interface: Facial Affect in (Hu)Man and Machine." CHI 2000.

106
Brief Example: Online Web Usability Tests
Data: Speed, accuracy & clickstreams!
107
Performance Testing I (From UsabilityNet)
Usability
108
Performance Testing II
109
Observational Methods: Issues & Guidelines
  • Naturalistic v. Controlled Context Issues
  • Major Data Coding & Analysis Issues
  • Responsible data reduction required
  • Validity & Generalizability Issues
  • Inferring users' attention, intentions
  • Remember, correlation <> causation!
  • Never Underestimate:
  • Privacy/Permissions Issues
  • Preparedness & Pragmatics

110
Observation Special Topic: Clickstreams
111
Clickstream Information
  • Who is Visiting Your Site
  • How Many, How Long, How Often
  • Paths Taken Through Your Pages
  • Page Analyses
  • Frequency of Use, Time Spent on Each Page
  • Entries & Exits
  • Where Are Users Coming From? Leaving?
  • Success?
  • Transactions, Downloads, Info Viewed

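"Time spent on each page" is usually inferred from the gap between successive click events, which is one reason clickstream dwell times deserve caution. A toy sketch (hypothetical event format) that also shows why the final page's dwell time cannot be recovered from clicks alone:

```python
def time_on_page(events):
    """Sum per-page dwell times from (seconds, page) click events."""
    dwell = {}
    for (t, page), (t_next, _next_page) in zip(events, events[1:]):
        # Dwell on a page = time until the next click; the last page
        # in the stream gets no estimate at all.
        dwell[page] = dwell.get(page, 0) + (t_next - t)
    return dwell

events = [(0, "home"), (30, "search"), (90, "product"), (300, "checkout")]
dwell = time_on_page(events)  # "checkout" dwell time is unknowable here
```

The entry page is simply the first event's page and the exit page the last; everything subtler (caching, back-button use) complicates this picture, as the caveats slide later notes.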
112
Example: Netraker Clickstream Data
113
Example: Netraker Clickstream Data
Features:
  • Start page, target page, dead-ends
  • Link color: Black = link, Red = back-button
  • Page color: Red = long time, Orange = medium, Yellow = short
  • Multiple paths to the same page, and search terms
Example Observations:
  • Majority of participants traveled this path
  • "Pogo-sticking" with Back-button
  • Preferred strategy = browsing
  • Participants used many navigation strategies
  • Backtracking from target page: everyone resorted to search
114
Web Clickstream Data Useful Information
  • Who is Visiting Your Site
  • How Many, How Long, How Often
  • Paths Taken Through Your Pages
  • Page Analyses
  • Frequency of Use, Time Spent on Page
  • Entries & Exits
  • Where Are Users Coming From? Leaving?
  • Success?
  • Transactions, Downloads, Info Viewed

Rosenbloom, 2000?
115
Caveats Re. Clickstream Observations
  • Much missing or potentially misleading data
  • Same cookie, diff. user? Diff. address, same user?
  • Cached content (& back button) issues
  • Different behaviors of diff. browsers, portals, ISPs
  • Much is inferred
  • Especially users' intention, attention
  • Validity & generalizability issues
  • Incredibly rich, readily available data
  • Best in convergence with other methods
  • Netraker, Vividence, Enviz, etc.

116
Measures & More
  • Quantitative & Quantified Data

117
Quantitative Summary Statistics
  • Data --> "How often/fast/much"
  • Measures of Central Tendency
  • Mean (Average value)
  • Median (Middle value)
  • Mode (Most common value)
  • Measures of Variability
  • Range (Interval btwn lowest & highest values)
  • Variance (Average squared deviation from the mean)
  • Standard Deviation (Square root of variance)

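The measures above map directly onto Python's standard-library statistics module; the task-completion times below are hypothetical:

```python
import statistics as st

# Hypothetical task-completion times (seconds) for nine participants
times = [12, 14, 14, 15, 16, 18, 21, 25, 60]

mean_t = st.mean(times)        # average value; pulled up by the outlier
median_t = st.median(times)    # middle value: 16
mode_t = st.mode(times)        # most common value: 14
rng = max(times) - min(times)  # interval between lowest & highest: 48
var = st.variance(times)       # sample variance
sd = st.stdev(times)           # standard deviation = sqrt of variance
```

The single 60-second outlier illustrates why reporting the median alongside the mean is often wise for timing data.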
118
Quantifying Qualitative Data
  • Recoding "why" & "how" data to assess "how often/fast/much"
  • Often compelling, but can be difficult to do, and easily subject to bias.
  • Use with caution!

119
Quantitative Analyses
  • Match measures & analyses to your questions and what you want to communicate.
  • Collapsing, transforming, summarizing data
  • Graphical representations
  • Significance tests
  • T-tests, ANOVAs
  • Correlations, etc.
  • Etc.

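As one concrete instance of a significance test from the list above, here is a sketch of the Welch two-sample t statistic (the t-test variant that tolerates unequal group variances). The timing data are hypothetical, and in practice the statistic would be compared against a t distribution to obtain a p-value:

```python
from statistics import mean, stdev

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    var_a = stdev(a) ** 2 / len(a)  # variance of the mean, group a
    var_b = stdev(b) ** 2 / len(b)  # variance of the mean, group b
    return (mean(a) - mean(b)) / (var_a + var_b) ** 0.5

# Hypothetical task times (seconds) under an old and a new UI design
old_ui = [48, 52, 50, 55, 49]
new_ui = [40, 42, 39, 44, 41]
t = welch_t(old_ui, new_ui)  # large positive t: old UI looks slower
```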
120
Brief Example: IM Logfile Study


121
IM Data Coding & Analysis Issues
  • Units of Analysis?
  • Chunk: Chat activity separated by > 5 min idle time
  • Pairs: Hi v. Lo IM-Familiarity Pair Interaction
  • Measures
  • Objective Data: Who, Where, When, How Often/Fast/Much?
  • Interpretive Coding: How? Why?
  • Inter-Coder Reliability Issues
  • Analyses
  • Primarily Quantitative

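The chunking rule above ("chat activity separated by > 5 min idle time") can be sketched directly; the message timestamps are hypothetical, in seconds:

```python
IDLE_LIMIT = 5 * 60  # the "> 5 min idle time" boundary, in seconds

def chunk_messages(timestamps):
    """Split a sorted timestamp list into chunks at long idle gaps."""
    chunks, current = [], [timestamps[0]]
    for prev, t in zip(timestamps, timestamps[1:]):
        if t - prev > IDLE_LIMIT:
            chunks.append(current)  # idle gap exceeded: close the chunk
            current = []
        current.append(t)
    chunks.append(current)
    return chunks

ts = [0, 60, 120, 700, 760, 2000]  # two long gaps => three chunks
chunks = chunk_messages(ts)
```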
122
Objective Measures
Interpretive Coding: Human Judgment
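Interpretive coding brings in the inter-coder reliability issue flagged on the previous slide. Cohen's kappa (a common chance-corrected agreement statistic, not named in the slides) is sketched here on hypothetical category codes from two coders:

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Agreement between two coders, corrected for chance agreement."""
    n = len(codes_a)
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    counts_a, counts_b = Counter(codes_a), Counter(codes_b)
    # Chance agreement: probability both coders pick the same category
    expected = sum(counts_a[k] * counts_b[k] for k in counts_a) / n ** 2
    return (observed - expected) / (1 - expected)

coder_a = ["social", "social", "work", "work", "social", "work"]
coder_b = ["social", "social", "work", "social", "social", "work"]
kappa = cohens_kappa(coder_a, coder_b)
```

Kappa of 1 means perfect agreement and 0 means agreement no better than chance; values in between are what coding disputes are made of.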
123
Initial Results


124
Initial Results --> Further Coding & Analyses


125
For Further Information
  • Schiano, Ehrlich, Raharja & Sheridan (2000). "Face to Interface: Facial Affect in (Hu)Man and Machine." CHI 2000.

126
Match Your Measures to Your Questions
  • Qualitative (Why? How?)
  • Quantitative (How often/fast/much?)
  • Convergent (Bits of both)
  • Trying to translate general hypotheses into specific quantitative questions can be very useful. Not all issues lend themselves to quantification. But for those that do, the precision and ability to make direct comparisons that quantification provides can prove highly useful.

127
But Let's Be Clear
  • Quantification is not a magic bullet--even with so-called "objective" measures. Human judgment--your judgment--is always required in conducting research, analyzing & interpreting data, and communicating your findings!
  • Numbers can be impressive--or intimidating, depending on your audience--and they are easily misused, even unintentionally. Think carefully about what you're trying to find out in your research, why, and for whom. Remember that the apparent precision of numerical results may be misleading. In short, be responsible!

128
Measures & More
  • Qualitative Data

129
Qualitative Measures & More
  • Notes & Quotes
  • Photos, Video clips
  • Narratives, Summaries
  • Structured Products to Inform Design
  • Personas / User Profiles
  • Use Cases / Scenarios of Use
  • User Advocacy in Design

130
Personas/ User Profiles
131
Example: Personas/User Profiles I
http://ccm.redhat.com/user-centered/personas.html
132
Personas/User Profiles II
133
Personas/User Profiles III
134
Use Cases/ Scenarios of Use
135
Example: Use Cases / Scenarios of Use
136
More On Communicating Results
137
More On Communicating Results
  • "There are lies, damn lies and statistics."
  •     -- Mark Twain

138
Medium as Message
  • Your products are communicative, persuasive:
  • Presentations
  • Papers/reports
  • Structured products for design (e.g., personas, use cases)
  • User advocacy in design meetings
  • You are making a case with every selection, summary or presentation of results:
  • Notes & Quotes
  • Photos, Video clips
  • Narratives, summaries
  • Statistical summaries, graphs & tables

139
Bottom Line: Be Responsible, Credible & Useful!
  • In Designing & Conducting Research
  • As we've discussed at length
  • In Reporting Results
  • Refer to research principles in summarizing and presenting results
  • Provide access to methods & raw data
  • In Making Recommendations
  • Be judicious and pragmatic
  • Prioritize by robustness and impact.

140
User-Centered Product Research Design
Usefulness --> Use <----------------------> Usability
From UsabilityNet
141
Special Discussion Topics
142
User Research Design Clinic
143
Appendix