JUSP - The JISC Journal Usage Statistics Portal
Transcript and Presenter's Notes


1
JUSP - The JISC Journal Usage Statistics Portal
  • Ross MacIntyre, Mimas
  • The University of Manchester
  • ross.macintyre@manchester.ac.uk

2
Timeline
JISC Collections USAGE STATISTICS PORTAL SCOPING STUDY PHASE II: TECHNICAL DESIGN AND PROTOTYPING INVITATION TO TENDER. Summary: 1. This invitation to tender invites bidders to submit proposals to undertake the technical design and prototyping for a Usage Statistics Portal. 2. The deadline for proposals is 12.00 noon on Monday 14 July 2008. The work should start no later than the end of July 2008. The work should deliver a detailed technical specification and design for the Usage Statistics Portal and a scoping of the costs required to bring it to production; the final report should be complete by 1st March 2009.
  • 1998 Nesli
  • 2000 UKSG workshop
  • 2002 COUNTER
  • 2003 JDbR1
  • 2004 Evidence Base report
  • Nesli2 Analysis of Usage Statistics
  • 2005 JDbR2
  • 2006 Key Perspectives report
  • Usage Statistics Service Feasibility Study
  • 2007 Content Complete report; JUSP Scoping Study
  • 2008 JISC ITT; JUSP Scoping Study 2; JDbR3
  • 2009 JUSP Report
  • 2010 April JISC funds JUSP to move to service
  • Areas for Discussion
  • Different perspectives
  • Publishers, Aggregators, Learning Institutions,
    Commercial Organisations, Product Vendors...
  • What do you want to monitor, and why?
  • What is usage?
  • Are you getting enough?
  • What are you supplying/gathering?
  • What do you do with it?
  • What is your holy grail?

3
Mission
  • to assist and support libraries in the analysis
    of NESLi2 usage statistics and the management of
    their e-journals collections.
  • 20 NESLi2 e-journal deals/Publishers
  • 130 HEIs taking up NESLi2 deals
  • 3 Intermediaries (gateway/host)

4
JISC JUSP service
  • Refine user requirements for usage portal
  • Develop portal in line with requirements
  • To be based on COUNTER usage reports
  • JR1 (total number of full-text article requests)
  • JR1A (requests from archives or backfiles)
  • Harvest usage statistics via SUSHI (a harvesting sketch follows the COUNTER report list below)

  • JR1: Number of Successful Full-Text Article Requests by Month and Journal
  • JR1a: Number of Successful Full-Text Article Requests by Month and Journal for a Journal Archive
  • JR2: Turnaways by Month and Journal
  • JR3: Number of Successful Item Requests and Turnaways by Month, Journal and Page Type
  • JR4: Total Searches Run by Month and Service
  • JR5: Number of Successful Full-Text Article Requests by Year of Publication and Journal
  • DB1: Total Searches and Sessions by Month and Database
  • DB2: Turnaways by Month and Database
  • DB3: Total Searches and Sessions by Month and Service
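The SUSHI harvesting mentioned above (NISO Z39.93, SOAP over HTTP) can be illustrated with a minimal, hedged sketch. The endpoint URL, requestor ID, customer ID and date range below are placeholders, and real SUSHI servers differ in the exact envelope and namespaces they accept, so treat this as an outline rather than JUSP's actual harvester.

```python
# Hypothetical sketch: request a COUNTER JR1 report via SUSHI (SOAP over HTTP POST).
# Endpoint, IDs and date range are placeholders; element names follow the NISO SUSHI
# schema but exact namespaces and required fields vary, so check the server's WSDL.
import requests  # third-party HTTP library, assumed installed

SUSHI_ENDPOINT = "https://publisher.example.org/sushi"  # placeholder URL

ENVELOPE = """<?xml version="1.0" encoding="UTF-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <ReportRequest xmlns="http://www.niso.org/schemas/sushi">
      <Requestor><ID>jusp-harvester</ID></Requestor>
      <CustomerReference><ID>library-customer-id</ID></CustomerReference>
      <ReportDefinition Name="JR1" Release="3">
        <Filters>
          <UsageDateRange><Begin>2009-01-01</Begin><End>2009-12-31</End></UsageDateRange>
        </Filters>
      </ReportDefinition>
    </ReportRequest>
  </soap:Body>
</soap:Envelope>"""

response = requests.post(
    SUSHI_ENDPOINT,
    data=ENVELOPE.encode("utf-8"),
    headers={"Content-Type": "text/xml; charset=utf-8"},
    timeout=60,
)
response.raise_for_status()
# The response body carries the COUNTER XML report inside a SOAP ReportResponse;
# a real harvester would parse it and load the monthly figures into the portal database.
print(response.text[:500])
```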
5
(No Transcript)
6
(No Transcript)
7
Technical conversion from .xls
8
Technical conversion to .xml
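Slides 7 and 8 only name the conversion steps. As a rough illustration (not JUSP's actual conversion code), the sketch below reads a JR1 spreadsheet and re-emits each title's monthly counts as simple XML; the assumed column layout (Title, Publisher, ISSN, then one column per month) is a guess at a typical JR1 sheet.

```python
# Illustrative sketch only: convert a COUNTER JR1 spreadsheet (.xls) to simple XML.
# Row 0 is assumed to hold headers; monthly columns are assumed to start at index 3.
import xlrd                              # reads legacy .xls workbooks
import xml.etree.ElementTree as ET

def jr1_xls_to_xml(xls_path: str, xml_path: str) -> None:
    book = xlrd.open_workbook(xls_path)
    sheet = book.sheet_by_index(0)
    headers = [str(v) for v in sheet.row_values(0)]
    months = headers[3:]                 # assumed: Title, Publisher, ISSN, then months

    root = ET.Element("JR1Report")
    for r in range(1, sheet.nrows):
        row = sheet.row_values(r)
        journal = ET.SubElement(root, "Journal",
                                title=str(row[0]), publisher=str(row[1]), issn=str(row[2]))
        for month, count in zip(months, row[3:]):
            cell = ET.SubElement(journal, "Month", name=month)
            cell.text = str(int(count or 0))

    ET.ElementTree(root).write(xml_path, encoding="utf-8", xml_declaration=True)

# Example use (paths are placeholders):
# jr1_xls_to_xml("oup_jr1_2009.xls", "oup_jr1_2009.xml")
```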
9
DEMO of JUSP
10
1. Single point of access to all JR1 and JR1A
usage statistics as currently downloaded
individually from publisher websites
  • User informational text
  • From this page, you can download JR1 and JR1A
    (archive) reports.
  • You can select data from [month/year] to [month/year]
  • Interface shows
  • Report drop down list (JR1 (all), JR1A (archive only))
  • Publisher drop down list
  • Date Span from Month Year to Month Year

11
(No Transcript)
12
(No Transcript)
13
2. Addition of host/gateway JR1 statistics where
relevant
  • User informational text
  • To get a full picture of usage you may need to
    add usage statistics provided by other services
    such as Swetswise. This will depend on the
    publisher.
  • Select publisher and date range to download JR1
    reports with Ingenta, Swetswise, Ebsco EJS etc
    included where appropriate.
  • Interface shows
  • Report drop down list (JR1 (all))
  • Publisher drop down list
  • Date From (m/y) To (m/y)

14
(No Transcript)
15
3. Excluding usage of backfile collections
  • User informational text
  • JR1 reports include all usage. Some publishers
    also produce JR1A reports which give only usage
    of their archive or backfile collections. If you
    have access to these, you can download here
    reports that exclude backfile use and show only
    usage of current titles.
  • Interface shows
  • Publisher drop down list
  • Date From (m/y) To (m/y)
  • Data processing notes
  • Titles in JR1 and JR1A matched by ISSN.
  • JR1A usage subtracted from JR1.
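The data processing notes above describe a simple match-and-subtract. A minimal sketch, assuming the JR1 and JR1A totals have already been reduced to ISSN-keyed dictionaries:

```python
# Sketch of the matching/subtraction described above.
# jr1 and jr1a map ISSN -> total full-text requests for the chosen date range.
def current_usage(jr1: dict[str, int], jr1a: dict[str, int]) -> dict[str, int]:
    """Return per-ISSN usage excluding backfile (JR1A) requests."""
    result = {}
    for issn, total in jr1.items():
        archive = jr1a.get(issn, 0)             # titles with no JR1A report keep the full JR1 count
        result[issn] = max(total - archive, 0)  # guard against inconsistent reports
    return result

# Example:
# current_usage({"1111-1111": 120, "2222-2222": 30}, {"1111-1111": 20})
# -> {"1111-1111": 100, "2222-2222": 30}
```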

16
(No Transcript)
17
4. SCONUL Return (Society of College, National
and University Libraries)
  • User informational text
  • Use this data for the SCONUL return, which requires total use by publisher by academic year.
  • These tables are used to look at usage trends
    over time, and to compare usage of the various
    publisher deals.
  • Interface shows
  • Publisher drop down list
  • Academic year
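As a rough illustration of the aggregation behind this view (not JUSP's actual code), the sketch below rolls monthly JR1 totals up into academic years, assumed here to run August to July:

```python
# Sketch: roll monthly JR1 totals up to academic years (assumed to run Aug-Jul).
from collections import defaultdict

def academic_year(year: int, month: int) -> str:
    """Map a calendar month to an academic-year label, e.g. (2009, 9) -> '2009/10'."""
    start = year if month >= 8 else year - 1
    return f"{start}/{str(start + 1)[-2:]}"

def sconul_totals(monthly: dict[tuple[str, int, int], int]) -> dict[tuple[str, str], int]:
    """monthly maps (publisher, year, month) -> requests; returns (publisher, academic year) -> total."""
    totals: dict[tuple[str, str], int] = defaultdict(int)
    for (publisher, year, month), requests in monthly.items():
        totals[(publisher, academic_year(year, month))] += requests
    return dict(totals)

# Example:
# sconul_totals({("OUP", 2009, 7): 40, ("OUP", 2009, 8): 60})
# -> {("OUP", "2008/09"): 40, ("OUP", "2009/10"): 60}
```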

18
(No Transcript)
19
5. Summary table to show use of host/gateways
  • User informational text
  • Use this table to see how much of your total
    usage goes through intermediaries, e.g. Ingenta
    and Swetswise
  • Interface shows
  • Publisher drop down list
  • Calendar Year(s)
  • Data processing notes
  • Separate columns for publisher, gateway, host and
    total.
  • JR1 usage shown in each.
  • Percentage use from each source calculated.
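The percentage calculation in the data processing notes can be sketched as follows; the source labels are placeholders:

```python
# Sketch: share of total JR1 usage coming from each source (publisher, gateway, host).
def source_breakdown(usage: dict[str, int]) -> dict[str, tuple[int, float]]:
    """usage maps a source label -> JR1 requests; returns source -> (requests, % of total)."""
    total = sum(usage.values())
    return {src: (n, round(100 * n / total, 1) if total else 0.0)
            for src, n in usage.items()}

# Example:
# source_breakdown({"publisher": 800, "swetswise": 150, "ingenta": 50})
# -> {"publisher": (800, 80.0), "swetswise": (150, 15.0), "ingenta": (50, 5.0)}
```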

20
(No Transcript)
21
6. Summary table to show use of backfiles
  • User informational text
  • Use this table to see how much of your total
    usage comes from backfiles
  • Interface shows
  • Publisher drop down list
  • Calendar Year(s)
  • Data processing notes
  • JR1 total including intermediaries.
  • Shows percentage of total JR1 usage that comes
    from JR1A.

22
(No Transcript)
23
7. Some more figures [sic]
  • User informational text
  • Find the average, median, (monthly) maximum
    number of requests, standard deviation and
    variance.
  • Interface shows
  • Publisher drop down list
  • Calendar year(s)
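The figures listed above can be computed straight from the twelve monthly totals. A minimal sketch using Python's standard library; whether JUSP reports sample or population variance is not stated, so sample statistics are assumed here:

```python
# Sketch: descriptive statistics over a year of monthly JR1 totals.
import statistics

def monthly_figures(counts: list[int]) -> dict[str, float]:
    return {
        "average": statistics.mean(counts),
        "median": statistics.median(counts),
        "maximum": max(counts),                   # highest monthly total
        "std_dev": statistics.stdev(counts),      # sample standard deviation (assumed)
        "variance": statistics.variance(counts),  # sample variance (assumed)
    }

# Example:
# monthly_figures([120, 95, 140, 80, 60, 30, 25, 20, 110, 130, 150, 90])
```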

24
(No Transcript)
25
8. Which titles have the highest use?
  • User informational text
  • Find the (20) titles which have the highest use
  • Interface shows
  • Publisher drop down list
  • Calendar year(s)
  • Display (20) titles with the highest usage,
    including publisher, title, issn, no. of requests
    (descending order).
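A sketch of the ranking step described above, with an assumed record layout:

```python
# Sketch: the 20 highest-used titles, in descending order of requests.
def top_titles(records: list[dict], n: int = 20) -> list[dict]:
    """records are dicts with 'publisher', 'title', 'issn' and 'requests' keys (assumed layout)."""
    return sorted(records, key=lambda r: r["requests"], reverse=True)[:n]

# Example (placeholder values):
# top_titles([{"publisher": "P1", "title": "Journal A", "issn": "1111-1111", "requests": 412},
#             {"publisher": "P1", "title": "Journal B", "issn": "2222-2222", "requests": 980}])
```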

26
(No Transcript)
27
9. Tables and graphs
  • User informational text
  • See your monthly or annual usage over time as a
    chart
  • Interface shows
  • Publisher drop down list
  • Calendar years
  • Data processing notes
  • Show table of monthly totals for each year
  • Draw line graph

28
(No Transcript)
29
10. Benchmarking
  • User informational text
  • Compare usage with others in the same JISC band
  • Interface shows
  • Publisher drop down list
  • Calendar year(s)
  • JISC Band (A-J All)
  • Data processing notes
  • Give total for all libraries in the JISC band and
    average.

30
(No Transcript)
31
JISC Collections Benchmarking Survey, March 2010: Usage Statistics Portal benchmarking functionality
76 institutions responded to our short survey on the usage statistics portal's benchmarking functionality. Our findings are as detailed below.
Question 1: How useful would it be for you to benchmark your institution's journal usage for each individual NESLi2 publisher against that of other HE institutions? (76 responses)
  • 38 / 76 (50%) Very useful
  • 36 / 76 (47.4%) Somewhat useful
  • 2 / 76 (2.6%) Not useful
32
Question 5: Regarding questions 2-4 above, please indicate which would be your preferred choice regarding benchmarking (74 responses)
  • 37 / 74 (50%) Named institution
  • 23 / 74 (31.1%) Listed anonymously (same JISC band)
  • 14 / 74 (18.9%) Average usage by institutions in the same JISC Band
33
Question 10: Regarding questions 7-9 above, which would be your preferred choice? (74 responses)
  • 37 / 74 (50%) Being anonymised within my JISC Band
  • 30 / 74 (40.5%) Other institutions being able to see my institution's name
  • 7 / 74 (9.5%) Being part of an average figure for the Band I am in
34
Question 6: Are there any other benchmarking criteria you would like to see?
  • Same mission group; select our own particular subset of named institutions
  • Similar size and structure
  • Usage, spend and budget for resources
  • Cost per download; cost per FTE (student and staff) at department / subject level
  • SCONUL divisions (RLUK, old, new, collHE); by area (Scotland / Wales) would also be useful
  • Trend over a period of years

35
Question 11 Please add any additional comments
you would like to make
  • If OK with the licence then comparing named institutions would be best. Happy to be named if all institutions are named
  • Averages are not helpful unless accompanied by
    other institutional data. Anonymised usage
    figures would be more useful
  • Institutions within the same JISC Band can vary
    widely (e.g. do they have a medical school, do
    they still have a chemistry dept) so you really
    need the institution name to give any sort of
    useful benchmarking.
  • Pulling data like FTE and RAE would save us all
    from having to do that ourselves.
  • Would be useful for NESLi2, however the majority
    of our deals are outside NESLi2

36
Participation Agreement - Library
  • 3. PERMITTED USES/ACTIVITIES
  • 3.1 The Institution hereby agrees to
  • 3.1.1 permit the Consortium to include its
    COUNTER-compliant Usage Statistics in the
    database created for the Journal Usage Statistics
    Portal Service
  • 3.1.2 permit the Consortium to display the
    COUNTER-compliant Usage Statistics via the
    Journal Statistics Portal Service
  • 3.1.2 permit the Consortium to show the
    COUNTER-compliant Usage Statistics to other
    participating libraries in the Journal Usage
    Statistics Portal Service for benchmarking
    purposes and
  • 3.1.3 be identified in the Journal Usage
    Statistics Portal Service by (1) institutional
    name (2) JISC Band and (3) institutional group.

37
Participation Agreement - Library
  • 4. RESPONSIBILITIES OF THE CONSORTIUM
  • 4.1 The Consortium agrees to
  • 4.1.1 only provide access to any
    COUNTER-compliant Usage Statistics collected by
    the Consortium to authorized users from other
    participating institutions in the Journal Usage
    Statistics Portal Service and the Consortium
    partners
  • 4.1.2 use authentication for access to the
    Journal Usage Statistics Portal Service and
  • 4.1.3 permit JISC Collections to use the
    COUNTER-compliant Usage Statistics in the Journal
    Usage Statistics Portal Service database for
    negotiation purposes with publishers within the
    framework of NESLi2.

38
Participation Agreement Publisher/Intermediary
  • 3. PERMITTED USES/ACTIVITIES
  • 3.1 The Publisher hereby agrees to
  • 3.1.1 provide the Consortium with the COUNTER
    Usage Statistics of the Institutions, including
    by using the SUSHI Protocol
  • 3.1.2 permit the Consortium to include the
    collected COUNTER-compliant Usage Statistics in
    the database created for the JISC Journals
    Statistics Portal Project
  • 3.1.4 permit the Consortium to show all
    COUNTER-compliant Usage Statistics to any
    NESLi2-eligible Institutions for their own usage
    assessment and for benchmarking their own usage
    against that of other Institutions
  • 3.1.5 permit the Institutions to use the
    information in the JISC Journals Statistics
    Portal for their SCONUL returns and any other
    uses agreed between the Publisher and the
    Consortium
  • 3.1.6 provide the Consortium with usage
    statistics which are in compliance with the
    latest COUNTER guidelines and
  • 3.1.7 implement the SUSHI Protocol.

39
Some slight modifications
40
(No Transcript)
41
SUSHI Processing
  • OUP
  • Of the 51 sites now signed up, 24 had 2010 data from OUP but no 2009 data. SUSHI was used to collect 12 months' worth of JR1 and JR1a data for these sites.
  • (24 sites x 12 months x 2 files per month = 576 files.)
  • Total time to collect files from OUP - 25 minutes
  • Total time to quality check them - 15 minutes
  • Total time to load them - 20 minutes
  • Total processing time for 2009 data for OUP for
    24 sites - 1 hour
  • Publishing Technology
  • 2009 data collected, processed and loaded for 25
    institutions.
  • Total time required: 17 minutes
  • AIP
  • 15 sites now have complete 2009 data loads for AIP. The collection, processing and loading of 180 SUSHI files took just under 25 minutes.

42
Observations
  • SUSHI rare indeed!
  • NIL
  • Upload of publisher price lists: lack of machine-readable sources (why not ONIX for Serials SPS?)
  • Authority files to populate the Journal and
    Supplier tables
  • Subject categorisation of journals

43
Authentication/Authorisation
  • UK Access Management Federation
  • eduPersonScopedAffiliation
  • member@institution.ac.uk or staff@institution.ac.uk
  • eduPersonEntitlement
  • http://jisc-collections.ac.uk/entitlements/representative
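As an illustration of how these attributes might gate access: a Shibboleth SP typically hands released attributes to the application, and the portal then checks the affiliation's scope or the entitlement value. The attribute delivery mechanism and the authorisation rule below are assumptions about a typical deployment, not JUSP's actual implementation.

```python
# Hypothetical sketch: authorise a request from attributes released by a Shibboleth SP.
# Attribute names follow the slide; how they reach the application (headers, environment
# variables) and how multiple values are joined are deployment details assumed here.
ALLOWED_AFFILIATIONS = {"member", "staff"}
REPRESENTATIVE_ENTITLEMENT = "http://jisc-collections.ac.uk/entitlements/representative"

def authorise(attrs: dict[str, str]) -> tuple[bool, str]:
    """attrs maps attribute names to the raw (possibly multi-valued) strings passed by the SP."""
    scoped = attrs.get("eduPersonScopedAffiliation", "")
    entitlements = attrs.get("eduPersonEntitlement", "").split(";")

    for value in scoped.split(";"):
        affiliation, _, scope = value.partition("@")   # e.g. "staff@institution.ac.uk"
        if affiliation in ALLOWED_AFFILIATIONS and scope.endswith(".ac.uk"):
            return True, scope                         # institution derived from the scope
    if REPRESENTATIVE_ENTITLEMENT in entitlements:
        return True, "jisc-collections-representative"
    return False, ""

# Example:
# authorise({"eduPersonScopedAffiliation": "staff@institution.ac.uk"}) -> (True, "institution.ac.uk")
```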

44
Final Observations
  • Open source, available to institutions or other consortia
  • Complementary to, not in competition with, licensed software offerings

45
Q&A
This artwork by ADANeagoe, originally published
in Omagiu Magazine.
46
Raptor
  • Athens -> Shibboleth: loss of stats
  • Stats important for making budgetary decisions about eResources
  • Raptor is a Java-based AuthN system log file parser
  • Shibboleth, EZproxy and OpenAthens
  • A future release may see some integration directly in Shibboleth
  • Designed for non-technical users, but will have technical components.
  • Statistics per institution as well as aggregated
    to higher levels e.g. UK federation

47
(No Transcript)
48
(No Transcript)
49
The end