Effectively Applying Usage Statistics in E-Resource Collection Development
Using Evidence and Outreach in Decision-Making
ACRL-MD – New Identities: Adapting the Academic Library
November 14, 2014
Randy Lowe – Collection Development, Acquisition & Serials Librarian, Frostburg State University
Overview
• Why E-Resources Assessment?
• Usage Statistics – Types, Reports, Collection
• Assessment: Evidence & Outreach
◦ Applying usage statistics to collection management decision-making
◦ Engaging librarians, faculty and administrators in the process
Why E-Resource Assessment?
• Libraries have historically measured use of services (circulation statistics, re-shelving counts, gate counts, etc.)
• The technology upon which e-resources reside inherently allows for extensive collection of usage data – and assessment of that use
• Assessment of use data supports evidence-based collection management
• Libraries operate in a challenging fiscal environment – demonstrating e-resource value and fiscal responsibility is a must
Effective E-Resources Assessment
• Two essential elements in conducting effective e-resource assessments:
◦ Efficient and Accurate Data Collection
◦ Clear and Succinct Analysis
• E-resource assessment is more than just collecting usage statistics – it is applying them to make sound management decisions about library resources
• Usage statistics measure the volume, not the value, of resources
What Can You Do with E-Resources Usage Statistics?
• Track usage / Assess overall collection use
• Track expenditures / Calculate cost-per-use (see the worked sketch after this list)
• Track turnaways
• Assess title, subject, publisher and other usage elements
• Identify user behavior trends
• Assist in making collection development decisions, including acquisition model selection
• Effectively advocate for resources – especially if assessment is tied to institutional goals/strategic plan, curricular initiatives, or student learning goals
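
A minimal cost-per-use sketch in Python. The subscription cost and use count below are hypothetical figures for illustration, not data from the presentation:

# Cost-per-use: annual expenditure divided by a use measure such as
# COUNTER full-text retrievals. All figures below are hypothetical.
annual_cost = 4500.00          # subscription cost for one database
full_text_retrievals = 1800    # use count for the same period

cost_per_use = annual_cost / full_text_retrievals
print(f"Cost per use: ${cost_per_use:.2f}")   # Cost per use: $2.50

The same arithmetic applies at the title, package, or platform level; only the use measure and the matching expenditure change.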
Types of Usage Statistics Reports and When to Use Them
• Vendor-Defined
◦ Analyzing usage data from a single vendor
◦ Obtaining cost information
◦ Comprehensive data files make it easy to analyze combinations of various data elements [Example]
◦ When COUNTER reports do not provide adequate detail
• COUNTER-Compliant
◦ Analyzing usage data across multiple vendors
◦ Ensuring data integrity through adherence to recognized standards
Collecting Usage Data
• Define Objectives
◦ What you need to know or are trying to find out should drive your data collection decisions
◦ Collecting usage statistics can be a major time commitment
• Use your assessment objectives not only to determine what data to collect, but also to judge when you have collected enough data to analyze
• Properly balancing the time and resources dedicated to data collection and analysis is vital
Collecting Usage Data
• Vendors present data differently – this can be a challenge not only across vendors, but even when combining data elements from a single vendor
• Manipulation/formatting of raw data will likely be necessary
• Example – COUNTER BR1 Report + Acquisition Type Data + Cost Data Compiled Manually = Data for Assessment (a sketch of this kind of merge follows below)
• Schedule time(s) to collect data
• Vendors’ archival policies for maintaining usage statistics vary
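
A sketch of that kind of merge in Python/pandas. The file names, column names, and the assumption that ISBN is the shared key are hypothetical; a real COUNTER BR1 export and local acquisitions data will need their own cleanup first:

import pandas as pd

# Hypothetical inputs: a COUNTER BR1 export (title-level full-text use)
# and a locally maintained file with cost and acquisition type.
br1 = pd.read_csv("counter_br1_2014.csv")        # columns: ISBN, Title, monthly use
local = pd.read_csv("acquisitions_costs.csv")    # columns: ISBN, Acquisition_Type, Cost

# Collapse the monthly use columns into one annual figure.
month_cols = [c for c in br1.columns if c not in ("ISBN", "Title")]
br1["Annual_Use"] = br1[month_cols].sum(axis=1)

# Join use to cost and acquisition type, then compute cost per use.
merged = br1.merge(local, on="ISBN", how="left")
merged["Cost_Per_Use"] = merged["Cost"] / merged["Annual_Use"].replace(0, pd.NA)

print(merged[["Title", "Acquisition_Type", "Annual_Use", "Cost", "Cost_Per_Use"]].head(10))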
Assessing Usage Data
You have usage data – what do you do with it?
• It is easy to get overwhelmed by usage data – analysis should be guided by your assessment objectives
◦ What do you want/need to assess?
◦ What questions are you trying to answer?
◦ Who is your audience?
• Have a purpose for using your data
Assessing Usage Data
• Assessment is most powerful when it is tied to an action or potential action (including requests)
• There is no single method for assessing usage statistics in every case – the “right data” to analyze and include in your report is whatever supports your assessment objectives
Usage Data Analysis
• Data analysis should be thorough, but presented succinctly
• Conclusions, trends, etc. should be clear and verifiable
• Beware of preconceived notions, perceptions and opinions – be prepared for hypotheses to be proven or refuted
• State the known limitations of the data you have collected and how they may affect your analysis
Using/Applying Evidence: Writing Your Report
• Know your audience
• Include a brief purpose/introduction
• Write clearly and succinctly
• Reported usage data should support the purpose of the assessment
◦ Only include data that supports your stated objectives – don’t include everything you collected; administrators won’t read it
Using/Applying Evidence: Writing Your Report
• Reported usage data should support the purpose of the assessment (continued)
◦ Include data within the text of your report where it is necessary and provides clear evidence for the points you are making
◦ It is usually more effective to include visual representations of the data (charts, graphs) rather than just figures within the text of reports (see the chart sketch after this list)
◦ Larger tables and data sets, if they must be included, are best placed in appendices
• Conclusions and recommendations should be easily identified and based on the evidence presented
• State the requested action and/or desired response clearly
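
A minimal sketch of such a visual in Python/matplotlib. The database names and cost-per-use figures are purely illustrative; in practice they would come from the merged usage and cost data described earlier:

import matplotlib.pyplot as plt

# Illustrative summary figures, not real data.
databases = ["Database A", "Database B", "Database C", "Database D"]
cost_per_use = [1.25, 3.80, 0.45, 7.10]

fig, ax = plt.subplots(figsize=(6, 3))
ax.bar(databases, cost_per_use)
ax.set_ylabel("Cost per use (USD)")
ax.set_title("Cost per use by database (illustrative)")
fig.tight_layout()
fig.savefig("cost_per_use.png", dpi=150)   # embed the image in the report text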
Using/Applying Evidence: The Frostburg Experience
• Effectively applying e-resources data to collection management has been an evolution
• The lay of the land – 2007
◦ We had data (searches & link resolver)
◦ Study to compare journal costs by format
◦ Data sat in a vacuum outside of annual database budgeting
• Needed to establish a frame of reference to begin applying usage statistics in engaging faculty and administrators
Evidence & Outreach Example 1: Faculty Survey – 2007-2008
• Faculty had not previously been engaged systematically in collection development efforts
• User behavior as demonstrated in link resolver statistics indicated that users preferred online full text
• The library determined that periodicals and standing orders should be migrated to online format – but which ones?
• Fall 2007: Faculty surveyed regarding the value (content) and usefulness (format) of journals, standing orders, and databases
• Spring 2008: Survey results matched link resolver usage statistics
• Subscription cancellations, additions, and format migrations made over the next 5 years
Evidence & Outreach Example 2: Underutilized Journals
• The library began collecting full-text article retrieval counts in 2009-2010 (and re-shelving counts in 2011-2012)
• All journal subscriptions are reviewed by librarians annually
• Faculty are involved in a second level of review for underutilized subscriptions
• The objective is to use the process as a means for continued dialogue with faculty in collection development
Evidence & Outreach Example 3: Collaboration with Academic Depts
• Academic departments are becoming increasingly engaged in e-resource subscription discussions, including funding
◦ Chemistry – CAS SciFinder
◦ Visual Arts – Artstor
• Current collaboration is with Biology
◦ Department not satisfied with current e-resources
◦ No funds available for additional resources
◦ Reviewed use of current journal subscriptions and content of requested databases
◦ Department suggested journal cancellations to fund databases
◦ New e-resource scenarios developed
Evidence & Outreach Example 4: E-Book Assessment
• Frostburg State University: Report overall use of and expenditures on e-books over time; implement the most cost-effective DDA acquisition model(s) [Report] (see the summary sketch below)
• USMAI Consortial E-Book Pilot: Assess the effectiveness of a specific DDA acquisition model for the consortium; use and expenditures by consortium members and user types; identification of possible future program funding models [Report]
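
A sketch of how such a DDA assessment summary might be tabulated in Python/pandas. The transaction-log file name, column names, and trigger categories are hypothetical, not the actual Frostburg or USMAI data:

import pandas as pd

# Hypothetical DDA transaction log exported from a vendor platform.
# Columns assumed: Title, Trigger (e.g. short-term loan vs. purchase),
# Charge, User_Type.
dda = pd.read_csv("dda_transactions.csv")

summary = (dda.groupby(["Trigger", "User_Type"])
              .agg(events=("Title", "count"), spend=("Charge", "sum")))
summary["avg_cost_per_event"] = summary["spend"] / summary["events"]
print(summary)   # compare acquisition models and user groups by volume and expenditure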
Thank You
• Questions?
• Contact Information:
Randy Lowe
Frostburg State University
rlowe@frostburg.edu
