The NISO Altmetrics Initiative aims to develop standards around altmetrics through a consensus-based process. NISO is coordinating a two-phase project to define the key issues in altmetrics and establish working groups to address them. NISO held initial community meetings to identify priorities such as consistent measurement, data validity, and issues like gaming. The goal is a set of recommended practices and standards that support assessment practices and infrastructure for new forms of scholarly communication.
1. The NISO Altmetrics Initiative
NFAIS Virtual Seminar: Assessing Contribution, Assessing
Usage: Metrics in a New Context
March 28, 2014
Nettie Lagace (@abugseye)
NISO Associate Director for Programs
2. What’s NISO?
• Non-profit industry trade association accredited by ANSI
with 150+ members
• Mission of developing and maintaining standards related to
information, documentation, discovery and distribution of
published materials and media
• Represent US interests to ISO TC46 (Information and
Documentation) and also serve as Secretariat for ISO
TC46/SC 9 (Identification and Description)
• Responsible for standards like ISSN, DOI, Dublin Core
metadata, DAISY digital talking books, OpenURL, SIP, NCIP,
MARC records and ISBN (indirectly)
• Volunteer-driven organization: 400+ volunteers spread
across the world
3. Premise of “Standards”
• Consensus standards created by a community with
various stakeholders
• Trust
• Leading to broader acceptance
• Standards as plumbing
• Standards facilitate trade, commerce and innovation
• Standards reduce costs
• Standards support better communication and
interoperability across systems
6. Why worth funding?
• Scholarly assessment is critical to the overall
process
– Which projects get funded
– Who gets promoted and tenure
– Which publications are prominent
• Assessment has been based on citations since the
1960s
• The many ways today’s scholars interact with
scholarly content are not reflected
– Is “non-traditional” scholarly output important too?
7. Why worth funding?
• In order to move out of “pilot” and “proof-of-concept”
phases …
• Altmetrics must coalesce around commonly
understood definitions, calculations and data
sharing practices
• Altmetrics must be able to be audited
• Organizations that want to apply metrics will
need to understand them and ensure consistent
application and meaning across the community
8. What can be measured &
how do we measure it?
Image: Flickr user karindalziel
13. • How do we ensure the validity of
those measurements?
• How to (or should we) deal with
other issues such as gaming?
• What infrastructure do we need for
alternative metrics?
• Basic definitions (so we are all
talking about the same thing)
• Open exchange of component data
• Research on the use and
integration of new scholarship
forms
"Q is for Question Mark" by b4b2 is licensed under CC BY 2.0
14. 2 Phases
• Phase 1: Hold meetings of stakeholders to define a
high-level list of issues
– October 2013, San Francisco
– December 2013, Washington, DC
– January 2014, Philadelphia
– Public Webinars
– White paper output, public presentations, public feedback
• Phase 2: Create Working Group within NISO structure,
to create recommended practice(s) and/or standard(s)
– Education/training efforts to ensure implementation
• Final report to Sloan due November 2015
15. Steering Committee
• Euan Adie, Altmetric
• Amy Brand, Harvard University
• Mike Buschman, Plum Analytics
• Todd Carpenter, NISO
• Martin Fenner, Public Library of Science (PLoS) (Chair)
• Michael Habib, Reed Elsevier
• Gregg Gordon, Social Science Research Network (SSRN)
• William Gunn, Mendeley
• Nettie Lagace, NISO
• Jamie Liu, American Chemical Society (ACS)
• Heather Piwowar, ImpactStory
• John Sack, HighWire Press
• Peter Shepherd, Project Counter
• Christine Stohn, Ex Libris
• Greg Tananbaum, SPARC (Scholarly Publishing & Academic Resources Coalition)
16. Community Meetings Oct 2013-Jan 2014
1. San Francisco – PLOS ALM
2. Washington DC – CNI
3. Philadelphia – ALA Midwinter
17. Meetings’ General Format
• Held in conjunction with another
(industry/community) meeting
• Morning: lightning talks, post-it brainstorming
• Afternoon: discussion groups
– X
– Y
– Z
– Report back/react
• Livestreamed (video recordings now available)
18. Meeting Lightning Talks
• Expectations of researchers
• Exploring disciplinary differences in the use of social media in
scholarly communication
• Altmetrics as part of the services of a large university library
system
• Deriving altmetrics from annotation activity
• Altmetrics for Institutional Repositories: Are the metadata
ready?
• Snowball Metrics: Global Standards for Institutional
Benchmarking
• International Standard Name Identifier
• Altmetric.com, Plum Analytics, Mendeley reader survey
• Twitter Inconsistency
“Lightning" by snowpeak is licensed under CC BY 2.0
20. SF Meeting – General outputs
• The importance of best practices for media
coverage of science (using DOIs, etc.)
• More Altmetrics research is needed and could
be promoted through this group
• Providing a standard set of research outputs
that we can use to compare different services
• The importance of use cases for specific
stakeholder groups in driving the discussion
forward
21. Discussions
San Francisco                  Washington DC                      Philadelphia
Business & Use Cases           Business & Use Cases               Use Cases (3X)
Quality & Data Science         Qualitative vs. Quantitative       Data Integrity
Definitions                    Definitions/Defining Impact        Definitions
Development &                  Identifying Stakeholders           Standards
  Infrastructure                 and their Values                 Future Proofing