1
Beyond Usability Evaluation Aspects of Visual
Analytic Environments
  • Jean Scholtz
  • Pacific Northwest National Laboratory
  • jean.scholtz_at_pnl.gov

2
Why do Evaluations?
  • Show progress to program managers and funding
    agencies
  • Facilitate technology transfer
  • Provide feedback to researchers

3
Typical Evaluations
  • Performance evaluations (a scoring sketch follows)
    - Measure different algorithms with respect to
      accuracy and speed
    - Accuracy measures depend on having ground truth
  • Usability evaluations
    - Measure the effectiveness, efficiency, and user
      satisfaction of human-computer interaction
    - Based on empirical studies of representative
      users performing representative tasks
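As a minimal sketch of the performance side, accuracy and speed can be scored together against a ground-truth test set. Everything below is hypothetical (the toy classifier, the test cases); it only illustrates the shape of such a harness.

```python
import time

def evaluate_performance(algorithm, test_cases):
    """Score an algorithm against ground truth: accuracy and per-case speed.

    `algorithm` is any callable mapping an input to a predicted label;
    `test_cases` is a list of (input, ground_truth_label) pairs.
    """
    correct = 0
    start = time.perf_counter()
    for item, truth in test_cases:
        if algorithm(item) == truth:
            correct += 1
    elapsed = time.perf_counter() - start
    return {
        "accuracy": correct / len(test_cases),         # requires ground truth
        "seconds_per_case": elapsed / len(test_cases), # speed
    }

# Hypothetical usage: a toy classifier and two labeled cases.
toy = lambda text: "threat" if "attack" in text else "benign"
cases = [("planned attack at dawn", "threat"), ("weather report", "benign")]
print(evaluate_performance(toy, cases))
```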

4
Going Beyond
  • Performance and usability evaluations are
    certainly needed
  • In addition, we want to determine the overall
    value of the visual analytic environment to the
    user
  • Proposed categories for assessing the value of
    visual analytic environments:
    - Situation awareness
    - Collaboration
    - Interaction
    - Creativity
    - Utility

5
Situation Awareness
  • Does the visual analytic environment help the
    user obtain situation awareness (SA)?
  • As measured by perception, comprehension and
    projection (Endsley)
  • Assessment methods of SA
  • Performance based
  • Knowledge based
  • Subjective measures
  • Situation awareness global assessment tool
    (SAGAT)
  • Measures correctness, time
  • May also be concerned with SA of others work
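A minimal sketch of SAGAT-style scoring, assuming freeze-probe queries tagged with Endsley's three SA levels; the response records and field names below are invented for illustration.

```python
from statistics import mean

# Each probe response: SA level (1 = perception, 2 = comprehension,
# 3 = projection), correctness, and response time in seconds.
responses = [
    {"level": 1, "correct": True,  "seconds": 4.2},
    {"level": 2, "correct": False, "seconds": 9.8},
    {"level": 3, "correct": True,  "seconds": 14.1},
    {"level": 1, "correct": True,  "seconds": 3.5},
]

def sagat_scores(responses):
    """Aggregate correctness and response time per SA level."""
    by_level = {}
    for r in responses:
        by_level.setdefault(r["level"], []).append(r)
    return {
        level: {
            "pct_correct": mean(1.0 if r["correct"] else 0.0 for r in group),
            "mean_seconds": mean(r["seconds"] for r in group),
        }
        for level, group in sorted(by_level.items())
    }

print(sagat_scores(responses))
```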

6
Collaboration
  • To work effectively in collaborative
    environments, users need to understand what others
    are doing
    - Who, what, when, where
  • Measures (for visual analytic environments) could
    include knowing what sources have been used, what
    queries have been run, where individuals are in
    the sense-making process, and what the outstanding
    questions are (an instrumentation sketch follows)
  • Intelligent systems should be considered
    collaborators as well, so the same information
    should be available when working with them
  • Ability to share information with others in
    different domains of expertise
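One way the who/what/when/where measures could be instrumented is a shared activity log from which awareness questions can be answered. The event schema below is an assumption for illustration, not anything prescribed in the talk.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AnalysisEvent:
    """One collaborator action: who did what, when, and where in the process."""
    analyst: str   # who (a human or an intelligent-system collaborator)
    action: str    # what: e.g. "query", "open_source", "flag_question"
    detail: str    # the query text, source name, or open question
    stage: str     # where in the sense-making process
    when: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

log = []
log.append(AnalysisEvent("alice", "open_source", "field report #12", "foraging"))
log.append(AnalysisEvent("nlp-agent", "query", "entities near the border", "foraging"))

# Awareness measures: which sources and queries collaborators have used.
sources_used = {e.detail for e in log if e.action == "open_source"}
queries_run = [e.detail for e in log if e.action == "query"]
print(sources_used, queries_run)
```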

7
Interaction
  • Visual analytic environments will not be static
    but interactive
  • Measures include:
    - Usability of interactions
    - Capabilities provided according to principles of
      dialogue (a checklist sketch follows):
      - Suitability for the task
      - Self-descriptiveness of the action
      - Controllability
      - Conformity with user expectations (consistency)
      - Error tolerance
      - Suitability for individualization
        (customization)
      - Suitability for learning
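The dialogue principles can be turned into a scored checklist. A minimal sketch, assuming Likert-style 1-5 ratings per principle from each evaluator; the principle keys and the scale are assumptions.

```python
from statistics import mean

DIALOGUE_PRINCIPLES = [
    "suitability_for_task", "self_descriptiveness", "controllability",
    "conformity_with_expectations", "error_tolerance",
    "individualization", "suitability_for_learning",
]

def score_interaction(ratings_per_user):
    """Average each principle's 1-5 rating across evaluators."""
    return {
        p: round(mean(r[p] for r in ratings_per_user), 2)
        for p in DIALOGUE_PRINCIPLES
    }

# Hypothetical ratings from two evaluators.
u1 = dict.fromkeys(DIALOGUE_PRINCIPLES, 4); u1["error_tolerance"] = 2
u2 = dict.fromkeys(DIALOGUE_PRINCIPLES, 3); u2["error_tolerance"] = 3
print(score_interaction([u1, u2]))
```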

8
Creativity
  • A workshop on creativity funded by NSF proposed
    the following measures for creativity support
    tools (a computation sketch follows):
    - Quality of solutions
    - Number of unique alternatives considered
    - Degree of radicalism/conservatism of
      alternatives considered
    - Serendipitous solutions
    - Time to come up with solutions
    - Satisfaction with solutions
    - Cost (person-time) to come up with solutions
    - Cost of the solution versus the utility of the
      solution
    - Ease of use of the support tool
    - Buy-in to the use of the support tool
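A sketch of how a few of these measures might be computed from a session record; the record format and every field in it are hypothetical.

```python
# Hypothetical record of one analysis session with a creativity support tool.
session = {
    "alternatives": ["route A", "route B", "route A", "decoy plan"],
    "minutes_to_first_solution": 42,
    "person_minutes_total": 95,   # cost in person-time
    "satisfaction_rating": 4,     # analyst's 1-5 self-report
}

def creativity_measures(session):
    """Derive a few workshop-style creativity measures from one session."""
    return {
        "unique_alternatives": len(set(session["alternatives"])),
        "minutes_to_first_solution": session["minutes_to_first_solution"],
        "person_minutes_total": session["person_minutes_total"],
        "satisfaction_rating": session["satisfaction_rating"],
    }

print(creativity_measures(session))
```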

9
Utility
  • Added value to the user's process or product
  • Possible measures:
    - A better product in the same or less time
      - "Better" could equate to more citations,
        increased confidence in recommendations, or
        more hypotheses investigated
    - Process measures
      - Invert the "bathtub curve": shift time away
        from data gathering and report generation and
        into analysis (a sketch follows the diagram)

[Diagram: two curves of time spent across data gathering, analysis, and report generation; the goal is for analysis, rather than data gathering and report generation, to take the largest share of time]
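A sketch of this process measure, assuming stage-tagged time logs: compute each stage's share of session time and check whether analysis dominates. The stage names and minute counts are hypothetical.

```python
# Minutes logged per process stage for one analysis session (hypothetical).
stage_minutes = {"data_gathering": 50, "analysis": 130, "report_generation": 40}

def stage_shares(stage_minutes):
    """Fraction of total session time spent in each stage of the process."""
    total = sum(stage_minutes.values())
    return {stage: m / total for stage, m in stage_minutes.items()}

shares = stage_shares(stage_minutes)
# The bathtub curve is "inverted" when analysis takes the largest share.
inverted = max(shares, key=shares.get) == "analysis"
print(shares, "inverted:", inverted)
```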
10
Development of Metrics
  • One way is to formulate hypotheses about why we
    are developing visual analytic environments
  • For each hypothesis, think about which of the
    categories it should support
  • Based on this, determine which measures to collect
    to support or disconfirm the hypothesis (a sketch
    of the resulting structure follows)
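A minimal sketch of this workflow as a data structure, mirroring the examples on the next slide; the class and field names are assumptions.

```python
from dataclasses import dataclass

@dataclass
class EvaluationHypothesis:
    """One hypothesis about why a VAE helps, with categories and measures."""
    statement: str
    categories: list[str]  # e.g. ["Interaction"], ["Collaboration", ...]
    measures: list[str]    # what to collect to support/disconfirm it

h = EvaluationHypothesis(
    statement="VAE should increase the amount of data analysts can incorporate",
    categories=["Interaction"],
    measures=["number of documents looked at", "amount of evidence extracted"],
)
print(h.statement, "->", h.measures)
```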

11
Examples
Hypothesis: VAE should increase the amount of information/data that analysts can incorporate into their analysis
Possible measures: number of documents looked at; number of entities considered (some notion of relationships, attributes); amount of evidence extracted and used
Categories: Interaction

Hypothesis: Visualizations (created by the analyst) should increase the comprehension of analytic products
Possible measures: accuracy of customer comprehension of analytic products; accuracy of comprehension of analytic products by other analysts
Categories: Interaction, Creativity

Hypothesis: VAE should increase the efficiency and effectiveness of analytic collaboration
Possible measures: time for each party in a collaboration to comprehend a situation; accuracy of shared understanding
Categories: Collaboration, Situation Awareness

Hypothesis: VAE should allow analysts to view information at different levels of abstraction efficiently and effectively
Possible measures: time to answer questions about different abstraction levels; accuracy in answering questions
Categories: Interaction, Creativity

Hypothesis: VAE should allow analysts to efficiently and effectively view data from multiple intelligence sources
Possible measures: time/accuracy in answering questions about different intelligence sources
Categories: Interaction, Situation Awareness

Hypothesis: VAE should improve the analysts' process
Possible measures: time spent in each step of the process; time spent in overhead (tool use); productivity of information-seeking tasks
Categories: Interaction, Usability, Utility
12
Issues
  • Validation of metrics
  • Correlation to product quality:
    - Need a repeatable and acceptable way to assess
      the quality of analytic products
    - Look at measures that correlate with increased
      quality (a correlation sketch follows)
  • Correlation to process improvements:
    - Amount of time spent in various stages of
      analysis
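A sketch of the validation step, assuming paired observations of one candidate metric and an independently rated product-quality score; the Pearson correlation below is computed over entirely made-up numbers.

```python
from math import sqrt

# Candidate metric (e.g. hypotheses investigated) per analyst, paired with
# an independent quality rating of that analyst's product (hypothetical).
metric_values = [3, 7, 5, 9, 4, 8]
quality_scores = [2.5, 4.0, 3.0, 4.5, 2.8, 4.2]

def pearson(xs, ys):
    """Pearson correlation between a candidate metric and quality ratings."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# A strong positive r would suggest the metric tracks product quality.
print(f"Pearson r = {pearson(metric_values, quality_scores):.2f}")
```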

13
Current Work
  • Construction of data sets, and corresponding
    tasks, using the PNNL Threat Stream Generator
  • Developing metrics and running experiments with
    different NVAC¹/RVAC² software projects
  • Developing metrics for VAST contests and
    analyzing the results

  ¹ National Visualization and Analytics Center
  ² Regional Visualization and Analytics Center