Title: Martin Bazley www.ICT4Learning.com
1 User testing and evaluation: why, how and when to do it
- Evaluating and user testing
- Weston Park
- 15 Sep 2008
- Martin Bazley
- ICT4Learning.com
2 Intro: Martin Bazley
- Consultancy/websites/training/user testing: ICT4Learning.com (10 yrs)
- Chair of E-Learning Group for Museums
- Previously:
  - E-Learning Officer, MLA South East (3 yrs)
  - Science Museum, London, Internet Projects (7 yrs)
  - Taught science in secondary schools (8 yrs)
3 Why evaluate websites?
- Why do evaluation and user testing? Isn't it really expensive and time-consuming?
- Save money: avoid substantial, hurried redevelopment later in the project
- Audience feedback improves the resource in various ways: new activity ideas, etc.
- Demonstrate involvement of key stakeholders throughout the project
4 Making websites effective
- 3 key success factors:
  - Understanding the audience
  - Learning experience and learning outcomes right for the audience, and clearly stated
  - Evaluation, esp. in classroom or home (observe in natural habitat wherever possible)
5 Who for, what for ...
- Who for? (audience)
  - Need to be clear from the start, e.g. for teachers of Yr 5/6 in the local area with whiteboards
- What real-world outcomes? (learning outcomes)
  - What will they learn or do as a result? e.g. plan a visit to the museum, learn that Romans wore funny clothes, discover that they enjoy using a digital camera
- How will they use it? (learning experiences)
  - What do they actually do with the site? e.g. work online or need to print it? in pairs or alone? with or without teacher help?
- Where, when and why will they use it?
  - Context is important
15 Website evaluation and testing
- Need to think ahead a bit:
  - What are you trying to find out?
  - How do you intend to test it?
  - Why? What will you do as a result?
- The 'Why?' should drive this process
16 Test early
- Testing one user early on in the project is better than testing 50 near the end
17 When to evaluate or test, and why
- Before funding approval: project planning
- Post-funding: project development
- Post-project: summative evaluation
18 Testing is an iterative process
- Testing isn't something you do once:
  - Make something
  - > test it
  - > refine it
  - > test it again
19 Before funding: project planning
- Evaluation of other websites
  - Who for? What for? How will they use it? etc.
  - Awareness raising: issues, opportunities
  - Contributes to market research
  - Possible elements, graphic feel, etc.
- Concept testing
  - Check the idea makes sense with the audience
  - Reshape the project based on user feedback
21 Post-funding: project development
- Concept testing
  - Refine project outcomes based on feedback from intended users
- Refine website structure
  - Does it work for users?
- Evaluate initial look and feel
  - Graphics, navigation, etc.
26 Post-funding: project development 2
- Full evaluation of a draft working version
  - Usability AND content: do the activities work, how engaging is it, what else could be offered, etc.
- Observation of actual use of the website: by intended users, using it for the intended purpose, in the intended context (classroom, workplace, library, home, etc.)
32 Video clip: Moving Here (key ideas, not lesson plans)
37 Post-funding: project development 3
- Acceptance testing of the finished website
  - Last-minute check, minor corrections only
  - Often offered by web developers
- Summative evaluation
  - Report for funders, etc.
  - Learn lessons at project level for next time
38 Two usability testing techniques
- 'Get it' testing
  - Do they understand the purpose, how it works, etc.?
- Key task testing
  - Ask the user to do something, watch how well they do
- Ideally, do a bit of each, in that order
40 User testing: who should do it?
- The worst person to conduct (or interpret) user testing of your own site is you!
- Beware of hearing what you want to hear
- Useful to have an external viewpoint
- The first 5 minutes in a genuine setting tells you 80% of what's wrong with the site
41 User testing: more info
- User testing can be done cheaply; tips on how to do it are available (MLA SE guide): www.ICT4Learning.com/onlineguide
42 Strengths and weaknesses of different data-gathering techniques
43 Data gathering techniques
- User testing: early in development and again near the end
- Online questionnaires: emailed to people or linked from the website
- Focus groups: best near the beginning of a project, or at the redevelopment stage
- Visitor surveys: link online and real visits
- Web stats: useful for long-term trends/events etc.
44 Need to distinguish between:
- Diagnostics: making a project or service better
- Reporting: to funders, or for advocacy
45 Online questionnaires
- (+) Once set up, they gather numerical and qualitative data with no further effort; given time, can build up large datasets
- (+) The datasets can be easily exported and manipulated, can be sampled at various times, and structured queries can yield useful results
- (-) Respondents are self-selected and this will skew results; best to compare with similar data from other sources, like visitor surveys
- (-) The number and nature of responses may depend on how the online questionnaire is displayed and promoted on the website
46 Focus groups
- (+) Can explore specific issues in more depth, yielding rich feedback
- (+) Possible to control participant composition to ensure it is representative
- (-) Comparatively time-consuming (expensive) to organise and analyse
- (-) Yield qualitative data only; small numbers mean numerical comparisons are unreliable
47 Visitor surveys
- (+) Possible to control participant composition to ensure it is representative
- (-) Comparatively time-consuming (expensive) to organise and analyse
- (-) Responses can be affected by various factors including the interviewer, weather on the day, day of the week, etc., reducing the validity of numerical comparisons between museums
48 Web stats
- (+) Easy to gather data; can decide what to do with it later
- (+) Person-independent data generated: it is the interpretation, rather than the data themselves, which is subjective. This means others can review the same data and verify or amend the initial conclusions reached
49 Web stats 2
- (-) Different systems generate different data for the same web activity: for example, the number of unique visits measured via Google Analytics is generally lower than that derived via server log files
- (-) Metrics are complicated and require specialist knowledge to appreciate them fully
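The gap between log-derived and analytics-derived visitor counts can be illustrated with a minimal sketch (not from the talk): server logs record every request, including crawlers and clients that never execute a JavaScript analytics tag, so log-based unique-visitor counts come out higher. The log format, bot heuristic, and sample data below are all illustrative assumptions.

```python
# Illustrative sketch: why server log files report more "unique visitors"
# than JavaScript-based analytics. Log format and bot markers are assumptions.
import re

# Rough pattern for an Apache-style combined log line (assumed format)
LOG_LINE = re.compile(
    r'^(\S+) \S+ \S+ \[.*?\] "(?:GET|POST) (\S+)[^"]*" \d+ \S+ "[^"]*" "([^"]*)"'
)
BOT_MARKERS = ("bot", "spider", "crawler")  # crude heuristic, illustration only


def unique_visitors(log_lines):
    """Return (log_count, analytics_like_count) of unique client IPs."""
    all_ips, human_ips = set(), set()
    for line in log_lines:
        m = LOG_LINE.match(line)
        if not m:
            continue
        ip, _path, agent = m.groups()
        all_ips.add(ip)  # the server log counts every client
        if not any(b in agent.lower() for b in BOT_MARKERS):
            human_ips.add(ip)  # analytics only sees script-running browsers
    return len(all_ips), len(human_ips)


# Hypothetical sample: two browsers and one crawler
sample = [
    '1.2.3.4 - - [15/Sep/2008:10:00:00 +0100] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0"',
    '5.6.7.8 - - [15/Sep/2008:10:01:00 +0100] "GET /visit HTTP/1.1" 200 900 "-" "Mozilla/5.0"',
    '9.9.9.9 - - [15/Sep/2008:10:02:00 +0100] "GET / HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
]
log_count, analytics_count = unique_visitors(sample)
print(log_count, analytics_count)  # log count is higher: 3 vs 2
```

Real analytics products also differ from logs for other reasons (cookies vs IP addresses, caching, session definitions); this sketch captures only the bot/no-script effect.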
50 Web stats 3
- (-) As the amount of off-website web activity increases (e.g. Web 2.0-style interactions), the validity of website stats decreases, especially for reporting purposes, but also for diagnostics
- (-) Agreeing a common format for presentation of data and analysis requires collaborative working to be meaningful
51 Who for, what for ...
- Who for? (audience)
  - Need to be clear from the start, e.g. for teachers of Yr 5/6 in the local area with whiteboards
- What real-world outcomes? (learning outcomes)
  - What will they learn or do as a result? e.g. plan a visit to the museum, learn that Romans wore funny clothes, discover that they enjoy using a digital camera
- How will they use it? (learning experiences)
  - What do they actually do with the site? e.g. work online or need to print it? in pairs or alone? with or without teacher help?
52 Who for, what for ...
- How can you ensure you do get these right?
  - Build questions into the planning process
  - Evaluate/test regularly
  - Get informal feedback whenever possible, and act on it
- Who is it for?
- What are the real-world outcomes?
- How will they use it?
- Also: when, where, why?
53 More information
- Martin Bazley
- 0780 3580 737
- www.martinbazley.com