1
EVALUATING VIDEO TUTORIALS: Measuring Excellence And Outcomes
  • Sally Ziph
  • Instruction Coordinator
  • Kresge Business Administration Library
  • Stephen M. Ross School of Business
  • University of Michigan
  • Elizabeth Beers
  • Web Developer, MPublishing
  • University of Michigan Library
  • University of Michigan

2
Our Videos
3
Rationale
  • New MBA Consortium students needed company and
    industry information to prepare for interviews
    with recruiters.
  • Videos were intended to help complete a specific
    task: build lists in OneSource, find career
    information in Vault, etc.

4
The Process
  • Librarians created scripts based on frequently
    asked questions.
  • Technologies included Audacity for audio and
    Camtasia for screen capture.
  • Videos were integrated into the library website
    and advertised through email.

5
A Challenge
  • How do we evaluate the effectiveness of
    non-interactive, task-based, point-of-use video
    tutorials?

6
Evaluating our Videos
7
Best Practices
  • Pre/Post tests
  • A/B tests
  • Focus groups and user interviews
  • Surveys
  • Statistics

8
Pre/Post Tests
  • From the literature
  • Used to measure learning that takes place when
    students view a tutorial
  • Most suitable for assessing tutorials that seek
    to teach a specific skill
  • Can be difficult to write questions that will
    give good comparison data (a scoring sketch
    follows this slide)

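The pre/post approach above comes from the literature rather than from this project. Purely as an illustration, the sketch below (Python, with hypothetical scores and a hypothetical 5-point quiz) shows how paired pre/post scores could be summarized as a raw and a normalized learning gain.

    # Minimal sketch: summarize paired pre/post quiz scores as learning gains.
    # The scores and the 5-point maximum are hypothetical placeholders.
    pre_scores  = [3, 4, 2, 5, 3]   # points before viewing the tutorial
    post_scores = [5, 5, 4, 5, 4]   # points after viewing the tutorial
    MAX_SCORE = 5

    def normalized_gain(pre, post, max_score=MAX_SCORE):
        """Fraction of the available improvement actually achieved."""
        return 0.0 if pre == max_score else (post - pre) / (max_score - pre)

    raw_gain = sum(b - a for a, b in zip(pre_scores, post_scores)) / len(pre_scores)
    norm_gain = sum(normalized_gain(a, b)
                    for a, b in zip(pre_scores, post_scores)) / len(pre_scores)
    print(f"Mean raw gain: {raw_gain:.2f} points; mean normalized gain: {norm_gain:.2f}")
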
9
Pre/Post Tests
  • For our purposes
  • Our videos were intended to help with a task
    rather than teach a skill, so pre/post testing
    was not a good fit

10
A/B Tests
  • From the literature
  • Used to compare outcomes from versions of the
    same tutorial
  • In person vs. online
  • Different instructional techniques
  • Also useful for gathering subjective feedback
  • Difficult to tease out whether variance in
    outcomes is due to content or presentation (a
    comparison sketch follows this slide)

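If an A/B comparison like the one described above had been possible, its outcomes could be summarized with a simple two-proportion comparison. The sketch below is illustrative only: the completion counts for the two tutorial versions are invented, and scipy is assumed to be available.

    # Minimal sketch: compare task-completion rates for two tutorial versions.
    # The counts are invented placeholders for an A/B comparison.
    from scipy.stats import chi2_contingency

    # Rows: version A, version B; columns: completed task, did not complete.
    observed = [[18, 7],
                [12, 13]]

    chi2, p_value, dof, expected = chi2_contingency(observed)

    rate_a = observed[0][0] / sum(observed[0])
    rate_b = observed[1][0] / sum(observed[1])
    print(f"Completion rate A: {rate_a:.0%}, B: {rate_b:.0%}")
    print(f"Chi-square p-value: {p_value:.3f} (a small value suggests the versions differ)")
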
11
A/B Tests
  • For our purposes
  • Abbreviated project timeline made comparison
    testing impossible.
  • Useful for refining tutorials

12
Focus Groups and User Interviews
  • From the literature
  • Easy ways to get feedback from a variety of
    viewpoints
  • Can be conducted iteratively and at any stage of
    the development cycle
  • Require an experienced facilitator

13
Focus Groups and User Interviews
  • For our purposes
  • Abbreviated project timeline made iterative
    assessment impossible
  • Useful for future assessment

14
Surveys
  • From the literature
  • Easy method for data collection
  • Easy to formulate questions but can be
    difficult to formulate good questions
  • Results often reflect subjective preferences

15
Surveys
  • For our purposes
  • Quick method for getting feedback from the target
    user group
  • Not the best data, but better than none.

16
Statistics
  • From the literature
  • Raw numbers from Google Analytics or built-in
    utilities measure use (a tally sketch follows
    this slide).
  • Can be used to validate data from other
    methodologies
  • Does not measure usefulness or quality of
    tutorial content

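As a rough illustration of working with the raw usage numbers mentioned above, the sketch below tallies views per video from an exported CSV. The file name and column headers ("video_title", "pageviews") are assumptions; a real Google Analytics export would need to be mapped onto them.

    # Minimal sketch: tally views per video from an exported analytics CSV.
    # The file name and column names are assumed, not taken from the project.
    import csv
    from collections import Counter

    views = Counter()
    with open("video_pageviews.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            views[row["video_title"]] += int(row["pageviews"])

    # Most-viewed tutorials are listed first.
    for title, count in views.most_common():
        print(f"{count:6d}  {title}")
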
17
Statistics
  • For our purposes
  • Data we were collecting anyway may be useful for
    future videos.
  • Most popular topics
  • Career resources
  • Company information
  • Industry reports

18
Our Evaluation
19
Our Survey
  • Managed using Qualtrics
  • Distributed by targeted email to 45 students
  • Questions included
  • Did you watch one or more of the Kresge database
    videos?
  • How useful were the instructional videos for the
    following situations?
  • What changes would you make to the Kresge videos
    to make them more useful?

20
Results
  • 13 out of 45 students responded (28%); a
    computation sketch follows this slide
  • 72% of respondents did not use the videos
  • Videos were perceived as inconvenient or
    irrelevant
  • Students wanted other topics

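Purely as an illustration of how figures like those above could be computed from a survey export, the sketch below calculates a response rate and a non-use percentage. The file name, the "watched_videos" column, and its yes/no coding are hypothetical; they are not the actual Qualtrics export.

    # Minimal sketch: response rate and video usage from a survey export.
    # The CSV layout and the "watched_videos" column are hypothetical.
    import csv

    INVITED = 45  # students who received the targeted email

    with open("survey_responses.csv", newline="", encoding="utf-8") as f:
        responses = list(csv.DictReader(f))

    watched = sum(1 for r in responses
                  if r["watched_videos"].strip().lower() == "yes")

    print(f"Response rate: {len(responses)}/{INVITED} ({len(responses) / INVITED:.0%})")
    print(f"Did not watch the videos: "
          f"{(len(responses) - watched) / len(responses):.0%} of respondents")
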
21
Conclusion
  • We need to clarify
  • What the Kresge Library can do to help Consortium
    students
  • Why these videos are worth their time
  • We need to send reminders at strategic times
    throughout the program.

22
Next Time?
23
In the Future
  • Continue gathering statistics
  • Use focus groups to refine tutorials and identify
    new topics
  • Consider testing methodologies to measure
    learning and usefulness

24
Questions?