1 Usability and Beyond!
Understanding Usefulness, Usability & Use
CHI 2004 Tutorial, April 2004
Diane J. Schiano <Abbreviated for PARC>
2 Table of Contents
Introduction ................................... 15
A Primer on User Exp Research Design ........... 30
  - Core UE Research Design Principles ......... 37
  - Key Pragmatic Issues ....................... 56
Methods, Measures & More ....................... 67
  - Key Research Design Decisions .............. 68
  - Methods: Self-Report & Observational ....... 69
  - Measures & More: Quantitative, Qualitative . 128
More On Communicating Results .................. 148
Appendix ....................................... 155
3 Course Learning Objectives
- This tutorial will provide a general understanding of:
  - Usefulness, usability and use studies: what they are, how and why they are done, and the kinds of information they yield, with extended examples and resources available in the tutorial notes.
  - User research design principles, and the pragmatic challenges of attaining validity, reliability, generalizability and robustness in user studies.
  - The major self-report and observational methods available for user studies, and how to choose and apply them appropriately.
  - Approaches to dealing with qualitative, quantitative and hybrid data, and to summarizing results and making design recommendations from them.
  - Pragmatic considerations in applying procedures, discussed from logistical, organizational, and personal/privacy perspectives.
- Also provided:
  - Guidelines, readings and other resources for conducting studies and evaluating findings, both in class and in the tutorial notes.
  - Opportunities for expert and peer feedback on practical problems participants are invited to bring in for consideration during the User Research Design Clinic.
4 Abstract
Digital products are growing increasingly
complex, encompassing interactions among humans
as well as between humans and technology. Careful
methods are required to understand user
experience of these products, and to use this
understanding to inform iterative design. The
goal of this tutorial is to provide a practical
understanding of the principles and procedures
used to assess product usefulness, usability and
use. Participants are given guidance and
grounding in general user research design
principles and procedures to aid them in choosing
methods, conducting studies, evaluating results,
and making recommendations effectively, even
under constrained conditions. A principled yet
pragmatic approach is advocated, consolidating
the best of classic usability engineering and
ethnographic methods, applied appropriately
to the research question and context at hand.
Useful exercises, extended examples and extensive
references and other resources are provided.
Finally, attendees are invited to bring actual
problems to discuss in the User Research Design
Clinic for peer and expert feedback.
5 Usability and Beyond!
Understanding Usefulness, Usability & Use
6 Usability and Beyond!
Understanding User Experience! Focusing on the Principles Underlying the Procedures
7 Introduction
8 Creating Useful, Usable & Used Products
- Depends critically on effective User Research.
- "If we build it, they will come"? NOT!!!
- What's the use of designing products that aren't:
  - Useful,
  - Usable,
  - Used???!!!
9 Core User Experience Research Concepts
- Usefulness
  - Why--and how--could the product be useful to people? Design (& marketing) implications from current practice?
- Usability
  - How easily--and well--can the product be learned and used? Implications for re-design?
- Use
  - How do people actually use the product? Implications for re-design?
10 User-Centered Product Research Design
Usefulness --> Use <----------------------> Usability
From UsabilityNet
11 Usefulness, Usability, Use in Product Cycle
Usefulness
Usability
Requirements
Implementation
Test & Measure
Planning & Feasibility
Design
Post-Release
Use
12 Usefulness, Usability, Use in Product Cycle
Usability
Use
Usefulness
From UsabilityNet
13 We'll Focus on Research Design Principles
- Primarily the WHYs behind choosing and implementing user research methods.
- For more on HOW-TOs, see:
  - UsabilityNet (http://www.usabilitynet.org)
  - Appendix references & recommendations
14 UsabilityNet: An Excellent Resource!
- High Quality, Free
- How-Tos, Mini-Tutorials
- http://www.usabilitynet.org
15 Core User Experience Research Concepts
- Usefulness
  - Why--and how--could the product be useful to people? Design (& marketing) implications from current practice?
- Usability
  - How easily--and well--can the product be learned and used? Implications for re-design?
- Use
  - How do people actually use the product? Implications for re-design?
16 Two Classic Approaches to UE Research
Ethnography (Usefulness & Use) vs. Human Factors Engineering (Usability)
17 The Ethnographic Approach
- Traditional Emphases:
  - Usefulness & Use (motivations, practice)
  - Self-report w/ contextualized observation
  - Naturalistic context, no (or low) control
  - "Why?" & "How?" questions
  - Qualitative data & deliverables
18 The Human Factors Engineering Approach
- Traditional Emphases:
  - Usability
  - Observation (task performance)
  - Lab context, high control
  - "How often/fast/much?" questions
  - Quantitative data & deliverables
19 These Approaches are Now Converging
- Self-report & observation are complementary
- Naturalistic observations are becoming increasingly common (esp. on the Internet)
- Using converging methods is more informative and cost-effective.
20 These Approaches are Now Converging
- So it is becoming increasingly important to understand the core principles underlying ALL user experience research.
- And that's why we're here!
21 A Primer on User Research Design
22 A Primer on User Research Design
- "Science is the elucidation of common sense."
  - Francis Bacon (attrib.)
- "There are the hard sciences, and then there are the difficult sciences."
  - Gregory Bateson
23 The Art & Science of UE Research Design
- The creative use of research principles & pragmatics to construct, conduct & communicate research to effectively inform product design.
24 Overview of the Research Process
- Prioritize. Focus on what you want to learn.
- Design your research project using appropriate methods based on research principles & pragmatic considerations, taken together.
- Conduct the research appropriately.
- Analyze & interpret findings responsibly. Use caution & qualify as needed.
- Communicate your findings effectively.
25 Your Key Research Design Decisions
Methods (What you can do):
  Ask (Self-Report) < ----------- > Observe (Behavior)
Context (How--& where--you do it):
  Naturalistic < ----------- > Controlled
Data analyses & deliverables:
  Qualitative < ----------- > Quantitative
26 My Advice
- KISS: Keep it simple, s'il vous plaît!
  >> Focus on what you really need to learn.
  - Your questions, goals, deliverables
  >> Prioritize!
  - Design principles & pragmatic considerations
  >> What evidence would convince YOU?
  - Use common sense
  - Be your own best critic. Challenge yourself!
27 Core Research Design Principles
- Validity
- Reliability
- Generalizability
28 Core Research Design Principles
- Validity (Am I really studying what I think I am?)
  - aka internal validity
- Reliability (Will my findings be repeatable?)
  - aka statistical significance
- Generalizability (Do my findings apply appropriately?)
  - aka external validity
29 Validity & Reliability
30 Validity & Reliability
31 Validity & Reliability
High Reliability, High Validity: Consistent & ON-Target
32 Validity & Reliability
33 Validity & Reliability
High Reliability, Low Validity: Consistent but OFF-Target
34 Validity & Reliability
35 Validity & Reliability
Low Reliability, Low Validity: NOT Consistent & OFF-Target
36 Generalizability
37 Generalizability, aka External Validity
External Applicability: "Throughput" to Related Target(s)
38 Core Research Design Principles
- Validity (Am I really studying what I think I am?)
  - Confounds & controls, errors & biases
  - Look for disconfirming evidence
- Reliability (Will my findings be repeatable?)
  - Statistical significance v. chance
  - Sample size (# participants, # observations)
- Generalizability (Will my findings apply appropriately?)
  - Representativeness (of sample, context)
39 Core Research Design Principles
- Validity
- Reliability
- Generalizability
- These 3 principles may seem simple at first glance, but they are profound, and the issues involved can become quite complex.
- They provide a foundation for evaluating ALL research: qualitative or quantitative, naturalistic or controlled, market research or usability testing.
40 Example: IRC's LambdaMOO Project
41 LambdaMOO Project Overview
- Goals: Assess "Hype-otheses," Characterize Community
- Ask (Self-Report) < ----------- > Observe Behavior
  - Survey, Interviews < ----------- > Logfile Analysis
- Naturalistic < ----------- > Controlled Context
  - LambdaMOO < ------ > Lab?
- Qualitative < ----------- > Quantitative Data
  - E.g., Transcripts < ----------- > Hours Logged On
- Convergent Methods Approach
42 LambdaMOO Project Methods
- Survey (Self-Report)
  - 1 Week "Call" upon Login; 581 Respondents
  - 30 Questions, Various Formats, Online
- Interviews (Self-Report)
  - 12 Real-Life, Long-Term Participants (Many IVR, etc.)
  - 1.5-2 hr In-Depth, Semi-Structured Interviews; Maps & Follow-ups
- Logging Studies (Naturalistic Observed Behavior)
  - Who/Where/When @ 1-min Intervals, 24 hr/day, 2 wks
  - Privacy Respected
  - Data on 4,000 Users Obtained (Twice, 6 Mo. Interval)
- And Much More...
  - Participant observations, attending BayMOO mtgs, comparison studies, etc.
43 Q: Addiction to LambdaMOO?
- Self-Report Use Estimates: Very High
  - Previous research papers, popular books & press
  - 80 hrs/wk not uncommon
  - Our interview findings not inconsistent
- Our Logfile Observations Differed Greatly
  - Mean: 8 hrs/wk (with multi-tasking & idle time!)
  - Less than 5% of users on for 20 or more hrs/wk
44 LambdaMOO Use Data from Logfiles
Mean = 1.13 hrs/day ≈ 8 hrs/wk
45 Q: Addiction to LambdaMOO?
- Q: Are you convinced by the logfile data? Why or why not? Can you explain the divergent results?
  >> Validity (Am I really studying what I think I am?)
  >> Reliability (Will my findings be repeatable?)
  >> Generalizability (Will my findings apply appropriately?)
46 Key Pragmatic Issues
- Robustness (How strong is this effect?)
- Impact (How important is this finding?)
- Convergence (More is better!)
47 Key Pragmatic Issues
- Robustness (How strong is this effect?)
  - Large in magnitude (effect size)
- Impact (How important is this finding?)
  - High in priority, severity
- Convergence (More is better!)
  - Multiple modest methods can be most informative and cost-effective.
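The "effect size" bullet above can be made concrete. A minimal sketch computing Cohen's d (a standard standardized-mean-difference measure; the tutorial itself does not name a specific statistic) on hypothetical hrs/wk figures, not the actual study data:

```python
import math

def cohens_d(group_a, group_b):
    """Standardized mean difference between two samples, using pooled SD."""
    na, nb = len(group_a), len(group_b)
    ma, mb = sum(group_a) / na, sum(group_b) / nb
    va = sum((x - ma) ** 2 for x in group_a) / (na - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in group_b) / (nb - 1)
    pooled_sd = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / pooled_sd

# Hypothetical hrs/wk: self-reported estimates vs. logged use
self_report = [70, 85, 80, 90, 75]
logged = [7, 9, 8, 10, 6]
print(round(cohens_d(self_report, logged), 1))  # -> 12.6 (d > 0.8 is conventionally "large")
```

An 80-vs-8 hrs/wk gap dwarfs the conventional "large effect" threshold, which is what makes the logfile finding so robust.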
48 Q: Addiction to LambdaMOO?
- Robustness (How strong is this effect?)
  - Huge effect size (difference): 80 v. 8 hrs/wk
- Impact (How important is this finding?)
  - High potential social import, impact
- Convergence (More is better!)
  - Self-Report & Observation; Qualitative & Quantitative
  - Triangulating "why, how, how often/fast/much" questions from various perspectives.
49 Other Pragmatic Considerations
- Complexity of Product/Design/System
- Your Deliverables
  - Design recommendations? Presentation? Paper?
- Time Frame
  - Product deadlines, readiness, design cycle
- Cost
  - Time, money, personnel
- Other Resources & Constraints
  - Availability of participants, prototypes, tools
  - Your skills, expertise & interests
  - Organizational & political priorities
  - Etc.
50 Example: LambdaMOO Project
- Extremely Complex System, User Community
- Our Deliverables
  - Business & design recommendations
- Time-Frame
  - System complete but evolving; cohort effects
- Cost
  - High
- Other Resources & Constraints
  - Research ideal: Many resources, few constraints
51 Pragmatic Suggestions from UsabilityNet
52 1) Given Limited Time & Resources
53 2) Given No Direct Access to Users
54 3) Given Limited Skill/Expertise
55 Methods, Measures & More
56 Key Research Design Decisions
Methods (What you can do):
  Ask (Self-Report) < ----------- > Observe (Behavior)
Context (How--& where--you do it):
  Naturalistic < ----------- > Controlled
Data analyses & deliverables:
  Qualitative < ----------- > Quantitative
57 Methods
58 Methods Overview
- Self-Report (e.g., Surveys, Interviews)
  - Explanations: Meaning, salience, satisfaction
  - Feelings, opinions, preferences, priorities
  - Other otherwise-unobservables (w/ caution!)
- Observations (e.g., Logfile & Clickstream Analyses, Task Performance)
  - Naturalistic behavior
  - Task performance
59 Taxonomy of Common Methods (after Nielsen, 1993)
- Heuristic Evaluation: Self-Report (of Experts)
- Performance Measures: Observed Behavior
- Verbal Protocols: Self-Report
- Observation: Observed Behavior
- Questionnaires: Self-Report
- Interviews: Self-Report
- Focus groups: Self-Report (in Groups)
- Logging actual use: Observed Behavior
- User feedback: Self-Report
60 Key Research Design Decisions
Methods (What you can do):
  Ask (Self-Report) < ----------- > Observe (Behavior)
Context (How--& where--you do it):
  Naturalistic < ----------- > Controlled
Data analyses & deliverables:
  Qualitative < ----------- > Quantitative
61 Self-Report Methods
62 Self-Report Methods
- Surveys, Questionnaires
- Interviews
- And More
63 Surveys, Questionnaires
64 Surveys
What you can do:
  Ask (for Self-Report) < ----------- > Observe Behavior
How (& where) you do it:
  Naturalistic < ----------- > Controlled Context
Data analyses & deliverables:
  Qualitative < ----------- > Quantitative
65 Surveys
What you can do:
  Ask (for Self-Report) < ----------- > Observe Behavior
How (& where) you do it:
  Naturalistic < ----------- > Controlled Context
Data analyses & deliverables:
  Qualitative < ----------- > Quantitative
66 Surveys
What's wrong with this picture? -->
67 Survey
01. How often do you use <Feature X>?
02. How good a user of MS Word do you consider yourself?
03. Which version of Word do you usually use?
04. When you need to find out how to do something using <Feature X>, what do you usually do first?
    a) Run through the menus
    b) Run through the toolbars
    c) Read the manual
    d) Access the easy-to-use help index
10. How do you feel about <Feature X>?
    a) It's one of the best things Microsoft ever did
    b) I like it
    c) Indifferent
    d) I don't like it
    e) I hate it so much I turned it off
11. Why?
68 Surveys
+ Broad demographics, background information
+ Specific questions, often quantitative
+ Fairly cheap & easy to conduct & analyze
+ Useful pre-/post-task comparisons
- Limited, not natural context
- Subject to biases, memory effects
69 Surveys: General Issues & Guidelines
- Be Brief, Clear, Specific, Consistent & Easy
- Question Design
  - Specific > General
  - Forced-Choice > Agree/Disagree
  - Offer "No Opinion" or "N/A" & Comment options
  - Use rating scales for measuring intensity
- Ease of Using Surveys Comes At a Price
  - Measurement errors & biases. Sampling issues.
  - Open-ended Qs harder to analyze, but more valid?
- ALWAYS Pilot Test & Iterate on Questions!
70 Example: LambdaMOO Survey
- Survey
  - 1 Week "Call" upon Login; 581 Respondents
  - 30 Questions, Various Formats, Online
- A relatively quick and easy way to ask for a fairly large amount of self-report information from a large population. The survey was structured, and response formats were designed for ease of coding, analysis and comparison.
- Note: Quantitative survey results are often more valid, useful, and interesting when considered comparatively rather than absolutely.
71 Example: LambdaMOO Survey
- Ask (Self-Report) < ----------- > Observe Behavior
- Naturalistic < ----------- > Controlled Context
- Qualitative < ----------- > Quantitative Data
72 Example: Sociality in LambdaMOO?
73 User Surveys for Design I (From UsabilityNet)
Usefulness
74 User Surveys for Design II
75 User Surveys for Design III
76 Subjective Assessment Surveys I
Usability & Use
77 Subjective Assessment Surveys II
78 Subjective Assessment Surveys III
79 Interviews
80 Interviews
What you can do:
  Ask (for Self-Report) < ----------- > Observe Behavior
How (& where) you do it:
  Naturalistic < ----------- > Controlled Context
Data analyses & deliverables:
  Qualitative < ----------- > Quantitative
81 Interviews
What you can do:
  Ask (for Self-Report) < ----------- > Observe Behavior
How (& where) you do it:
  Naturalistic < ----------- > Controlled Context
Data analyses & deliverables:
  Qualitative < ----------- > Quantitative
82 Interviews
+ Less limited, more naturalistic
+ Greater depth of understanding
- Difficult to collect & analyze results
- Subject to biases, memory effects
83 Interviews: General Issues & Guidelines
- Context choices
  - Conversational style or structured interview?
  - At your facility or theirs?
- ALWAYS use a guide sheet.
- Be prepared!
  - Consider logistics, how to record data & take notes, AND how to analyze & present results.
84 Example: Sociality in LambdaMOO?
- Previous reports emphasized the importance of sociality in LambdaMOO.
  - "Great good place" >>> Club/pub analogy
- Our survey and logfile data were mixed:
  - Survey suggested most time spent socializing.
  - Logfile analysis showed most time spent alone!
  - Modal # of characters in rooms together: 1.
85 Q: Sociality in LambdaMOO?
- Interviews explained how BOTH were correct:
  - Most people discussed spending most time SOCIALIZING ALONE!
  - Using remote messaging: MOOmail, paging (presaging IM)
  - Typically (& increasingly) from home
  - For security & multi-tasking
86 Interviews I (From UsabilityNet)
Usefulness, Usability, Use
87 Interviews II
88 More Self-Report Methods
- See UsabilityNet for a variety of other self-report (and hybrid) methods.
- If you're interested--and if there's time--we can discuss some of these methods later in the tutorial.
89 Observational Methods
90 Observational Methods
- "You can learn a lot from looking."
  - Yogi Berra (attrib.)
91 Observations
What you can do:
  Ask (for Self-Report) < ----------- > Observe Behavior
How (& where) you do it:
  Naturalistic < ----------- > Controlled Context
Data analyses & deliverables:
  Qualitative < ----------- > Quantitative
92 Observations
What you can do:
  Ask (for Self-Report) < ----------- > Observe Behavior
How (& where) you do it:
  Naturalistic < ----------- > Controlled Context
Data analyses & deliverables:
  Qualitative < ----------- > Quantitative
93 Naturalistic Observations
- Direct
  - Outside observer in context, or
  - Participant Observer
- Indirect (Recorded)
  - Video, audio recordings
  - Clickstream and logfile data
---> "Notes & Quotes"
94 Ethnographic Naturalistic Observations
Not hypothesis-driven
"Fernando's filing system" (from Nardi)
95 Ethnographic Observations
- Typically, detailed and extended observation of behavior and artifacts in context. Often in conjunction with interviews & participant observation.
- Ethnographic core concepts:
  - Holism
  - Natives' point(s) of view
  - Natural context
  - History
96 Classic Example: Ethnographic Research
- Schiano, Nardi, Gumbrecht & Swartz (CHI 2004 short paper submission). "Blogging by the Rest of Us."
97 Example: LambdaMOO Log Observations
Ask (for Self-Report) < ----------- > Observe Behavior
Naturalistic < ----------- > Controlled Context
Qualitative < ----------- > Quantitative Data
98 LambdaMOO Logfile Studies
- Logged information
  - State of each object (who/where/when) in system
  - Recorded @ 1-min intervals, 24 hr/day, 2 wks
  - Data on 4,000 users obtained
  - Twice, with 6-month interval btwn studies
- Huge, rich database of objective information on use patterns in naturalistic context.
- Extensive data re-coding and analysis was required.
- The findings are very compelling.
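The per-minute logging scheme above lends itself to simple aggregation: each sample in which a user's character appears counts as one connected minute. A minimal sketch with invented records (the actual logfiles captured full object state, not this format):

```python
from collections import defaultdict

# Hypothetical records: (user_id, day, minute_of_day), one per minute
# that the user's character was observed connected.
samples = (
    [("ava", 1, m) for m in range(0, 90)]        # 90 min on day 1
    + [("ava", 2, m) for m in range(600, 645)]   # 45 min on day 2
    + [("bo", 1, m) for m in range(0, 30)]       # 30 min on day 1
)

minutes = defaultdict(int)
for user, day, _minute in samples:
    minutes[user] += 1          # one sample observed = one minute connected

days_observed = 2
for user, total in sorted(minutes.items()):
    print(user, total / 60 / days_observed, "hrs/day")
```

Aggregating this way over two weeks and 4,000 users is what yields summary figures like "mean = 1.13 hrs/day".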
99 Q: Addiction to LambdaMOO?
- Self-Report Use Estimates: Very High
  - Previous research papers, popular books & press
  - 80 hrs/wk not uncommon
  - Our interview findings not inconsistent
- Our Logfile Observations Differed Greatly
  - Mean: 8 hrs/wk (with multi-tasking & idle time!)
  - Less than 5% of users on for 20 or more hrs/wk
100 Q: Addiction to LambdaMOO?
Mean = 1.13 hrs/day ≈ 8 hrs/wk
101 LambdaMOO Project Epilog
- Convergent methods were very effective; we felt we gained a good understanding of the community.
- Our company decided NOT to invest in MUDs or other online communities. But by interpreting the findings narrowly, it missed a major opportunity: IM & chat!
- Schiano (1998). "Lessons from LambdaMOO." Presence.
102 User Observations I (From UsabilityNet)
Usefulness, Use
103 User Observations II
104 Usability Tests as Controlled Observations
- Task performance is observed behavior
- Exploratory, assessment, validation & comparison tests (Rubin)
- Highly controlled task, lab context
  + Aids direct comparisons, ease of analysis, reliability
  - May hinder validity, generalizability
105 Classic Example: Human Factors Research
- Schiano, Ehrlich, Raharja & Sheridan (2000). "Face to Interface: Facial Affect in (Hu)Man and Machine." CHI 2000.
106 Brief Example: Online Web Usability Tests
Data: Speed, accuracy & clickstreams!
107 Performance Testing I (From UsabilityNet)
Usability
108 Performance Testing II
109 Observational Methods: Issues & Guidelines
- Naturalistic v. Controlled Context Issues
- Major Data Coding & Analysis Issues
  - Responsible data reduction required
- Validity, Generalizability Issues
  - Inferring users' attention, intentions
  - Remember, correlation <> causation!
- Never Underestimate
  - Privacy/Permissions Issues
  - Preparedness & Pragmatics
110 Observation Special Topic: Clickstreams
111 Clickstream Information
- Who is Visiting Your Site
  - How Many, How Long, How Often
  - Paths Taken Through Your Pages
- Page Analyses
  - Frequency of Use, Time Spent on Each Page
- Entries & Exits
  - Where Are Users Coming From? Leaving?
- Success?
  - Transactions, Downloads, Info Viewed
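The page-level analyses listed above (frequency, entries, exits) reduce to simple counting once clickstreams are represented as per-visitor page sequences. A minimal sketch with hypothetical visitors and page names:

```python
from collections import Counter

# Hypothetical per-visitor clickstreams: ordered lists of pages viewed.
clickstreams = {
    "visitor1": ["/home", "/products", "/products/x", "/checkout"],
    "visitor2": ["/home", "/search", "/products/x"],
    "visitor3": ["/products/x", "/home"],
}

# Frequency of use: how often each page was viewed overall
page_views = Counter(p for path in clickstreams.values() for p in path)
# Entries & exits: first and last page of each visit
entries = Counter(path[0] for path in clickstreams.values())
exits = Counter(path[-1] for path in clickstreams.values())

print(page_views)
print(entries)
print(exits)
```

Real logs add the caveats discussed on a later slide (caching, shared cookies, browser differences), so counts like these are best treated as approximations.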
112 Example: Netraker Clickstream Data
113 Example: Netraker Clickstream Data
Features:
- Start page
- Link color: Black = link, Red = back-button
- Page color: Red = long time, Orange = medium, Yellow = short
- Multiple paths to the same page (and search terms)
- Dead-end
- Target page
Example Observations:
- Majority of participants traveled this path
- "Pogo-sticking" with Back-button
- Preferred strategy: browsing
- Participants used many navigation strategies
- Backtracking from target page; everyone resorted to search
114 Web Clickstream Data: Useful Information
- Who is Visiting Your Site
  - How Many, How Long, How Often
  - Paths Taken Through Your Pages
- Page Analyses
  - Frequency of Use, Time Spent on Page
- Entries & Exits
  - Where Are Users Coming From? Leaving?
- Success?
  - Transactions, Downloads, Info Viewed
(Rosenbloom, 2000?)
115 Caveats Re: Clickstream Observations
- Much missing or potentially misleading data
  - Same cookie, diff. user? Diff. address, same user?
  - Cached content (& back button) issues
  - Different behaviors of diff. browsers, portals, ISPs
- Much is inferred
  - Especially users' intention, attention
  - Validity & generalizability issues
- Incredibly rich, readily available data
  - Best in convergence with other methods
  - Netraker, Vividence, Enviz, etc.
116 Measures & More
- Quantitative & Quantified Data
117 Quantitative Summary Statistics
- Data --> How often/fast/much
- Measures of Central Tendency
  - Mean (Average value)
  - Median (Middle value)
  - Mode (Most common value)
- Measures of Variability
  - Range (Interval btwn lowest & highest values)
  - Variance (Mean of squared deviations from the mean)
  - Standard Deviation (Square root of variance)
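The summary statistics above, computed with Python's standard `statistics` module on hypothetical task-completion times. Note how one outlier pulls the mean well above the median, which is why reporting several measures of central tendency can matter:

```python
import statistics

# Hypothetical task-completion times (seconds) from a usability test.
times = [12, 15, 15, 18, 20, 22, 95]   # note the 95 s outlier

print("mean:  ", statistics.mean(times))              # pulled up by the outlier
print("median:", statistics.median(times))            # robust to the outlier
print("mode:  ", statistics.mode(times))              # most common value
print("range: ", max(times) - min(times))
print("var:   ", statistics.variance(times))          # sample variance (n - 1)
print("stdev: ", statistics.stdev(times))             # square root of variance
```

(`statistics.variance` uses the sample, n - 1, denominator; divide the sum of squared deviations by n instead for the population variance.)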
118 Quantifying Qualitative Data
- Recoding "why, how" data to assess "how often/fast/much"
- Often compelling, but can be difficult to do, and easily subject to bias.
- Use with caution!
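One common guard against the coding bias warned about above is to have two coders label the same data independently and measure their chance-corrected agreement. A sketch of Cohen's kappa (one standard agreement statistic; the tutorial does not prescribe a specific one) on hypothetical category labels:

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement between two coders' category labels."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    ca, cb = Counter(coder_a), Counter(coder_b)
    # Agreement expected by chance, from each coder's marginal label frequencies
    expected = sum(ca[c] * cb[c] for c in ca) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical: two coders labeling 10 chat chunks
a = ["social", "social", "task", "task", "other",
     "social", "task", "social", "other", "task"]
b = ["social", "social", "task", "other", "other",
     "social", "task", "task", "other", "task"]
print(round(cohens_kappa(a, b), 2))  # -> 0.7
```

Kappa near 1 indicates strong agreement; kappa near 0 means the coders agree no more than chance would predict, a signal that the coding scheme needs tightening.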
119 Quantitative Analyses
- Match measures & analyses to your questions and what you want to communicate.
- Collapsing, transforming, summarizing data
- Graphical representations
- Significance tests
  - t-tests, ANOVAs
  - Correlations, etc.
- Etc.
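Of the analyses listed above, a correlation is the easiest to show from first principles. A minimal sketch of the Pearson correlation coefficient on hypothetical self-report vs. logged-use data (invented numbers, not the study's):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical: self-reported hrs/wk vs. logged hrs/wk for 6 users
reported = [10, 20, 40, 60, 80, 90]
logged = [2, 4, 6, 9, 12, 15]
print(round(pearson_r(reported, logged), 2))  # -> 0.99
```

A high r here would mean self-reports, though inflated in absolute terms, still rank users correctly, exactly the kind of "comparative rather than absolute" validity noted on the survey slides.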
120 Brief Example: IM Logfile Study
121 IM Data Coding & Analysis Issues
- Units of Analysis?
  - Chunk: Chat activity separated by > 5 min idle time
  - Pairs: Hi v. Lo IM Familiarity Pair Interaction
- Measures
  - Objective Data: Who, Where, When, How Often/Fast/Much?
  - Interpretive Coding: How? Why?
  - Inter-Coder Reliability Issues
- Analyses
  - Primarily Quantitative
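The "chunk" unit of analysis above (chat activity separated by > 5 min idle time) can be sketched as a simple pass over timestamps. Hypothetical message times for one user pair:

```python
from datetime import datetime, timedelta

# Hypothetical IM message timestamps for one pair of users.
stamps = [
    "2004-04-01 09:00", "2004-04-01 09:02", "2004-04-01 09:04",
    "2004-04-01 09:30", "2004-04-01 09:31",
    "2004-04-01 11:00",
]
times = [datetime.strptime(s, "%Y-%m-%d %H:%M") for s in stamps]

gap = timedelta(minutes=5)
chunks = [[times[0]]]
for prev, cur in zip(times, times[1:]):
    if cur - prev > gap:
        chunks.append([cur])      # idle > 5 min starts a new chunk
    else:
        chunks[-1].append(cur)

print(len(chunks))  # -> 3 chunks of chat activity
```

The 5-minute cutoff is the study's own segmentation rule; choosing a different idle threshold would change chunk counts, which is why the unit-of-analysis decision is flagged as an issue.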
122 Objective Measures
Interpretive Coding: Human Judgment
123 Initial Results
124 Initial Results --> Further Coding, Analyses
125 For Further Information
- Schiano, Ehrlich, Raharja & Sheridan (2000). "Face to Interface: Facial Affect in (Hu)Man and Machine." CHI 2000.
126 Match Your Measures to Your Questions
- Qualitative (Why? How?)
- Quantitative (How often/fast/much?)
- Convergent (Bits of both)
- Trying to translate general hypotheses into specific quantitative questions can be very useful. Not all issues lend themselves to quantification. But for those that do, the precision and ability to make direct comparisons that quantification provides can prove highly useful.
127 But Let's Be Clear
- Quantification is not a magic bullet, even with so-called "objective" measures. Human judgment--your judgment--is always required in conducting research, analyzing & interpreting data, and communicating your findings!
- Numbers can be impressive--or intimidating--depending on your audience, and they are easily misused, even unintentionally. Think carefully about what you're trying to find out in your research, why, and for whom. Remember that the apparent precision of numerical results may be misleading. In short, be responsible!
128 Measures & More
129 Qualitative Measures & More
- "Notes & Quotes"
- Photos, Video clips
- Narratives, Summaries
- Structured Products to Inform Design
  - Personas/User Profiles
  - Use Cases/Scenarios of Use
  - User Advocacy in Design
130 Personas/User Profiles
131 Example: Personas/User Profiles I
http://ccm.redhat.com/user-centered/personas.html
132 Personas/User Profiles II
133 Personas/User Profiles III
134 Use Cases/Scenarios of Use
135 Example: Use Cases/Scenarios of Use
136 More On Communicating Results
137 More On Communicating Results
- "There are lies, damned lies and statistics."
  - Mark Twain
138 Medium as Message
- Your products are communicative, persuasive:
  - Presentations
  - Papers/reports
  - Structured products for design (e.g., personas, use cases)
  - User advocacy in design meetings
- You are making a case with every selection, summary or presentation of results:
  - "Notes & Quotes"
  - Photos, Video clips
  - Narratives, summaries
  - Statistical summaries, graphs & tables
139 Bottom Line: Be Responsible, Credible & Useful!
- In Designing & Conducting Research
  - As we've discussed at length
- In Reporting Results
  - Refer to research principles in summarizing and presenting results
  - Provide access to methods & raw data
- In Making Recommendations
  - Be judicious and pragmatic
  - Prioritize by robustness and impact.
140 User-Centered Product Research Design
Usefulness --> Use <----------------------> Usability
From UsabilityNet
141 Special Discussion Topics
142 User Research Design Clinic
143 Appendix