1
User testing and evaluation - why, how and when
to do it
  • Evaluating and user testing
  • Weston Park
  • 15 Sep 2008
  • Martin Bazley
  • ICT4Learning.com

2
Intro - Martin Bazley
  • Consultancy/websites/training/user testing
    ICT4Learning.com (10 yrs)
  • Chair of E-Learning Group for Museums
  • Previously
  • E-Learning Officer, MLA South East (3yrs)
  • Science Museum, London, Internet Projects (7yrs)
  • Taught Science in secondary schools (8yrs)

3
Why evaluate websites?
  • Why do evaluation and user testing?
  • Isn't it really expensive and time consuming?
  • Save money - avoid substantial, hurried
    redevelopment later in the project
  • Audience feedback improves the resource in various
    ways - new activity ideas, etc.
  • Demonstrate involvement of key stakeholders
    throughout project

4
Making websites effective
  • 3 key success factors
  • Understanding audience
  • Learning experience and learning outcomes right
    for audience and clearly stated
  • Evaluation, esp. in classroom or home (observe in
    'natural habitat' wherever possible)

5
Who for, what for ...
  • Who for? (audience)
  • Need to be clear from the start, e.g. for teachers
    of Yr 5/6 in the local area with whiteboards
  • What real-world outcomes? (learning outcomes)
  • What will they learn or do as a result? e.g.
    plan a visit to museum, learn that Romans wore
    funny clothes, discover that they enjoy using a
    digital camera
  • How will they use it? (learning experiences)
  • What do they actually do with the site? e.g.
    work online or need to print it? - in pairs or
    alone? - with or without teacher help?
  • Where, when and why will they use it?
  • context is important

6
(No Transcript)
7
(No Transcript)
8
(No Transcript)
9
(No Transcript)
10
(No Transcript)
11
(No Transcript)
12
(No Transcript)
13
(No Transcript)
14
(No Transcript)
15
Website evaluation and testing
  • Need to think ahead a bit
  • what are you trying to find out?
  • how do you intend to test it?
  • why? what will you do as a result?
  • The 'Why?' should drive this process

16
Test early
  • Testing one user early on in the project
  • is better than testing 50 near the end

17
When to evaluate or test and why
  • Before funding approval - project planning
  • Post-funding - project development
  • Post-project summative evaluation

18
Testing is an iterative process
  • Testing isn't something you do once
  • Make something
  • > test it
  • > refine it
  • > test it again

19
Before funding - project planning
  • Evaluation of other websites
  • Who for? What for? How will they use it? etc.
  • awareness raising - issues, opportunities
  • contributes to market research
  • possible elements, graphic feel etc
  • Concept testing
  • check idea makes sense with audience
  • reshape project based on user feedback

20
(No Transcript)
21
Post-funding - project development
  • Concept testing
  • refine project outcomes based on feedback from
    intended users
  • Refine website structure
  • does it work for users?
  • Evaluate initial look and feel
  • graphics, navigation, etc.

22
(No Transcript)
23
(No Transcript)
24
(No Transcript)
25
(No Transcript)
26
Post-funding - project development 2
  • Full evaluation of a draft working version
  • usability AND content: do activities work, how
    engaging is it, what else could be offered, etc.

Observation of actual use of website
by intended users,
using it for intended purpose,
in intended context: classroom, workplace,
library, home, etc.
27
(No Transcript)
28
(No Transcript)
29
(No Transcript)
30
(No Transcript)
31
(No Transcript)
32
  • Video clip: Moving Here - key ideas, not lesson plans

33
(No Transcript)
34
(No Transcript)
35
(No Transcript)
36
(No Transcript)
37
Post-funding - project development 3
  • Acceptance testing of finished website
  • last minute check, minor corrections only
  • often offered by web developers
  • Summative evaluation
  • report for funders, etc
  • learn lessons at project level for next time

38
Two usability testing techniques
  • 'Get it' testing
  • - do they understand the purpose, how it works,
    etc.
  • Key task testing
  • ask the user to do something, watch how well they
    do (see the sketch below)
  • Ideally, do a bit of each, in that order
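
A minimal sketch (hypothetical data, not from the presentation) of how key
task results might be recorded and summarised, assuming each observation
notes the task, whether the user completed it, and how long it took:

  # Record one row per user per key task, then summarise completion rates.
  from statistics import mean

  results = [  # hypothetical observations from one testing session
      {"task": "find opening times", "completed": True, "seconds": 35},
      {"task": "find opening times", "completed": True, "seconds": 50},
      {"task": "book a school visit", "completed": False, "seconds": 120},
      {"task": "book a school visit", "completed": True, "seconds": 95},
  ]

  for task in sorted({r["task"] for r in results}):
      rows = [r for r in results if r["task"] == task]
      rate = sum(r["completed"] for r in rows) / len(rows)
      secs = mean(r["seconds"] for r in rows)
      print(f"{task}: {rate:.0%} completed, avg {secs:.0f}s")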

39
(No Transcript)
40
User testing who should do it?
  • The worst person to conduct (or interpret) user
    testing of your own site is
  • you!
  • Beware of hearing what you want to hear
  • Useful to have an external viewpoint
  • First 5 mins in a genuine setting tells you 80% of
    what's wrong with the site
  • etc

41
User testing more info
  • User testing can be done cheaply - tips on how to
    do it are available (MLA SE guide):
    www.ICT4Learning.com/onlineguide

42
  • Strengths and weaknesses of different data
    gathering techniques

43
Data gathering techniques
  • User testing - early in development and again
    near end
  • Online questionnaires - emailed to people or
    linked from website
  • Focus groups - best near beginning of project,
    or at redevelopment stage
  • Visitor surveys - link online and real visits
  • Web stats - useful for long-term trends/events,
    etc.

44
  • Need to distinguish between
  • Diagnostics - making a project or service
    better
  • Reporting - to funders, or for advocacy

45
Online questionnaires
  • (+) once set up, they gather numerical and
    qualitative data with no further effort - given
    time, can build up large datasets
  • (+) the datasets can be easily exported and
    manipulated, can be sampled at various times, and
    structured queries can yield useful results (see
    the sketch below)
  • (-) respondents are self-selected and this will
    skew results - best to compare with similar data
    from other sources, like visitor surveys
  • (-) the number and nature of responses may depend
    on how the online questionnaire is displayed and
    promoted on the website
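
A minimal sketch of the export-and-query step mentioned above, assuming the
responses have been exported to a CSV file with hypothetical columns 'date',
'audience' and 'rating' (pandas used for illustration):

  # Load the exported responses and run a simple structured query.
  import pandas as pd

  responses = pd.read_csv("questionnaire_export.csv", parse_dates=["date"])

  # Sample a particular period, e.g. the autumn term
  autumn = responses[(responses["date"] >= "2008-09-01") &
                     (responses["date"] <= "2008-12-19")]

  # Response count and average rating broken down by audience group
  print(autumn.groupby("audience")["rating"].agg(["count", "mean"]))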

46
Focus groups
  • (+) can explore specific issues in more depth,
    yielding rich feedback
  • (+) possible to control participant composition
    to ensure it is representative
  • (-) comparatively time-consuming (expensive) to
    organise and analyse
  • (-) yield qualitative data only - small numbers
    mean numerical comparisons are unreliable

47
Visitor surveys
  • (+) possible to control participant composition
    to ensure it is representative
  • (-) comparatively time-consuming (expensive) to
    organise and analyse
  • (-) responses can be affected by various factors
    including interviewer, weather on the day, day of
    the week, etc., reducing validity of numerical
    comparisons between museums

48
Web stats
  • (+) easy to gather data - can decide what to do
    with it later
  • (+) person-independent data generated - it is the
    interpretation, rather than the data themselves,
    which is subjective. This means others can
    review the same data and verify or amend initial
    conclusions reached

49
Web stats
  • (-) different systems generate different data for
    the same web activity - for example, the number of
    unique visits measured via Google Analytics is
    generally lower than that derived from server log
    files (see the sketch below)
  • (-) metrics are complicated and require
    specialist knowledge to appreciate them fully
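
A minimal sketch of one reason for the difference, assuming a standard
combined-format access log (hypothetical filename): counting distinct IP
addresses per day in the raw log includes robots and cookie-less visits that
a JavaScript-based tool such as Google Analytics would not record.

  # Count 'unique visitors' per day as distinct IPs in the server log -
  # a cruder measure than Google Analytics' cookie-based unique visits.
  import re
  from collections import defaultdict

  LOG_LINE = re.compile(r'^(\S+) \S+ \S+ \[(\d{2}/\w{3}/\d{4})')  # IP + date

  unique_ips = defaultdict(set)
  with open("access.log") as log:
      for line in log:
          m = LOG_LINE.match(line)
          if m:
              ip, day = m.groups()
              unique_ips[day].add(ip)

  for day, ips in unique_ips.items():
      print(day, len(ips))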

50
Web stats
  • (-) as the amount of off-website web activity
    increases (e.g. Web 2.0-style interactions), the
    validity of website stats decreases, especially
    for reporting purposes, but also for diagnostics
  • (-) agreeing a common format for presentation of
    data and analysis requires collaborative working
    to be meaningful

51
Who for, what for ...
  • Who for? (audience)
  • Need to be clear from the start, e.g. for teachers
    of Yr 5/6 in the local area with whiteboards
  • What real-world outcomes? (learning outcomes)
  • What will they learn or do as a result? e.g.
    plan a visit to museum, learn that Romans wore
    funny clothes, discover that they enjoy using a
    digital camera
  • How will they use it? (learning experiences)
  • What do they actually do with the site? e.g.
    work online or need to print it? - in pairs or
    alone? - with or without teacher help?

52
Who for, what for ...
  • How can you ensure you do get these right?
  • Build questions into the planning process
  • Evaluate/test regularly
  • Get informal feedback whenever possible and act
    on it
  • Who is it for?
  • What are the real world outcomes?
  • How will they use it?
  • Also When, Where, Why?

53
More information
  • Martin Bazley
  • 0780 3580 737
  • www.martinbazley.com