Carpenter Library Assessment Conference Presentation
DESCRIPTION
Todd Carpenter's presentation at the Library Assessment Conference 2014, held in August 2014 at the University of Washington in Seattle, WA. During this presentation, Todd covered the output of Phase One of NISO's alternative metrics assessment initiative.
TRANSCRIPT
When assessing impact, be sure you're comparing digital apples to digital apples: Report of Phase 1 of the NISO New Assessment Initiative
Todd Carpenter, Executive Director, NISO
August 4, 2014
About
• Non-profit industry trade association accredited by ANSI
• Mission of developing and maintaining technical standards related to information, documentation, discovery and distribution of published materials and media
• Volunteer-driven organization: 400+ contributors spread out across the world
• Responsible (directly and indirectly) for standards like ISSN, DOI, Dublin Core metadata, DAISY digital talking books, OpenURL, MARC records, and ISBN
What are the infrastructure elements of alternative assessments?
Basic Definitions
(So we are all talking about the same thing)
Element Identification
At what granularity?
How long do we measure?
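The sketch below illustrates, in Python, what these infrastructure questions look like in practice: before any counting can happen, the same article must resolve to one canonical identifier, and a measurement window must be chosen. The function names, the DOI prefixes handled, and the one-year default window are illustrative assumptions, not anything specified by the initiative.

```python
# Illustrative sketch only: one way to normalize identifiers and bound a
# measurement window so events from different sources count the same item.
# The function names and defaults are assumptions, not a NISO specification.
from datetime import datetime, timedelta

def normalize_doi(raw: str) -> str:
    """Reduce common DOI spellings to one canonical key (lowercase, no resolver prefix)."""
    doi = raw.strip().lower()
    for prefix in ("https://doi.org/", "http://dx.doi.org/", "doi:"):
        if doi.startswith(prefix):
            doi = doi[len(prefix):]
    return doi

def in_window(event_time: datetime, published: datetime, days: int = 365) -> bool:
    """Count only events that fall inside a fixed window after publication."""
    return published <= event_time <= published + timedelta(days=days)

# Two spellings of the same (hypothetical) article resolve to one key:
assert (normalize_doi("doi:10.1234/ABC")
        == normalize_doi("https://doi.org/10.1234/abc")
        == "10.1234/abc")
```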
Consistency across providers
Source: Scott Chamberlain, "Consuming Article-Level Metrics: Observations and Lessons from Comparing Aggregator Provider Data," Information Standards Quarterly, Summer 2013, Vol. 25, Issue 2.
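Chamberlain's comparison found that aggregators can report different numbers for the same article. A minimal sketch of that kind of consistency check follows; the provider names and counts are made-up placeholders, since real aggregators expose different APIs and source lists.

```python
# Minimal sketch of a cross-provider consistency check, in the spirit of
# Chamberlain's ISQ comparison. Provider names and counts are hypothetical.

def flag_disagreements(counts_by_provider, tolerance=0.1):
    """counts_by_provider: {provider: {source: count}} for one article.
    Returns sources where providers differ by more than `tolerance` (relative)."""
    sources = set().union(*(c.keys() for c in counts_by_provider.values()))
    flags = []
    for source in sources:
        values = [c.get(source, 0) for c in counts_by_provider.values()]
        if max(values) and (max(values) - min(values)) / max(values) > tolerance:
            flags.append((source, values))
    return flags

# Hypothetical counts for one article from two aggregators:
sample = {
    "aggregator_a": {"tweets": 120, "bookmarks": 40},
    "aggregator_b": {"tweets": 85, "bookmarks": 41},
}
print(flag_disagreements(sample))  # tweets differ by ~29% and get flagged
```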
TRUST =
What is NISO working toward?
Steering Committee
• Euan Adie, Altmetric
• Amy Brand, Harvard University
• Mike Buschman, Plum Analytics
• Todd Carpenter, NISO
• Martin Fenner, Public Library of Science (PLoS) (Chair)
• Michael Habib, Reed Elsevier
• Gregg Gordon, Social Science Research Network (SSRN)
• William Gunn, Mendeley
• Nettie Lagace, NISO
• Jamie Liu, American Chemical Society (ACS)
• Heather Piwowar, ImpactStory
• John Sack, HighWire Press
• Peter Shepherd, Project COUNTER
• Christine Stohn, Ex Libris
• Greg Tananbaum, SPARC (Scholarly Publishing & Academic Resources Coalition)
Alternative Assessment Initiative
Phase 1 Meetings
• October 9, 2013 – San Francisco, CA
• December 11, 2013 – Washington, DC
• January 23-24, 2014 – Philadelphia, PA
• Round of 1-on-1 interviews – March/April 2014
• Phase 1 report published in June 2014
Meetings' General Format
• Collocated with other industry meetings
• Morning: lightning talks, post-it brainstorming
• Afternoon: discussion groups (X, Y, Z), then report back/react
• Live streamed (video recordings are available)
Meeting Lightning Talks
• Expectations of researchers
• Exploring disciplinary differences in the use of social media in scholarly communication
• Altmetrics as part of the services of a large university library system
• Deriving altmetrics from annotation activity
• Altmetrics for Institutional Repositories: Are the metadata ready?
• Snowball Metrics: Global Standards for Institutional Benchmarking
• International Standard Name Identifier
• Altmetric.com, Plum Analytics, Mendeley reader survey
• Twitter Inconsistency
“Lightning" by snowpeak is licensed under CC BY 2.0
SF Meeting Discussions
• Business & Use cases
– Publishers want to serve authors, make money
– People don't value a standard, they value something that helps them
– … Couldn't identify a logical standard need that actors in the space would value, and best practices are of interest
• Quality & Data science
– Themes: context, validation, provenance, quality, description & metadata
– We'll never get to the point where assessment can be done without a human in the loop, but discovery and recommendation can
• Definitions
– Define "ALM" and "Altmetrics"
– Map the landscape
DC Meeting Discussions
• Business and Use Cases
• Discovery
– Metrics only get generated if material is discovered
• Qualitative vs. Quantitative
• Identifying Stakeholders and their Values
– Stakeholders in outcomes / stakeholders in process of creating metrics
– Shared values but tensions
– Branding
• Definitions/Defining Impact
– Metrics and analyses
– What led to success of citation?
– How to be certain we are measuring the right things
• Future Proofing
– What won't change
– Impact: hard to establish across disciplines
Philly Meeting Discussions
• Definitions
– Define life cycle of scholarly output and associated metrics
– Qualitative versus quantitative aspects: what is possible to define here
– Consider other aspects of these data collections
• Standards
– Develop definitions (what is a download? what is a view?); a counting sketch follows this list
– Differentiate between scholarly impact versus popular/social use
– Define sources/characteristics for metrics (social, commercial, scholarly)
• Data Integrity
– Counter biases/gaming
– Association with credible entities, e.g. an ORCID iD vs. a Gmail account
– Reproducibility is key
– Everyone needs to be at the table to establish overall credibility
• Use cases (3X)
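As a concrete illustration of the Standards discussion above, the sketch below shows one possible answer to "what is a download?": collapse rapid repeat requests by the same user for the same item into a single count, in the spirit of COUNTER's double-click filtering. The 30-second window and the event format are assumptions for illustration, not a NISO recommendation.

```python
# Sketch of one answer to "what is a download?": deduplicate rapid repeat
# requests from the same user for the same item. The 30-second window and
# the event tuple format are illustrative assumptions.
def count_downloads(events, window_seconds=30):
    """events: iterable of (user_id, item_id, unix_timestamp), any order."""
    last_seen = {}   # (user_id, item_id) -> timestamp of last observed request
    total = 0
    for user, item, ts in sorted(events, key=lambda e: e[2]):
        key = (user, item)
        if key not in last_seen or ts - last_seen[key] > window_seconds:
            total += 1   # far enough from the previous request: a new download
        last_seen[key] = ts
    return total

# Two clicks 5 seconds apart collapse to one download; a later one counts again.
assert count_downloads([("u1", "doi:10.1234/x", 0),
                        ("u1", "doi:10.1234/x", 5),
                        ("u1", "doi:10.1234/x", 100)]) == 2
```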
30 One-on-One Interviews
White Paper Released
Takeaways:
• Problems with the term "Altmetrics"
• Don't conflate discovery, social interactions, and assessment
• Be broad in what counts as an output
• Be clear in item-level identification
• Consistency in methodologies to calculate metrics
• Linking
• Alignment with identifying/naming conventions for new research output forms
Potential work themes
• Definitions
• Application to types of research outputs
• Discovery implications
• Research evaluation
• Data quality and gaming
• Grouping, aggregating, and granularity
• Context
• Adoption & Promotion
Alternative Assessment Initiative
Phase 2
• Presentations of Phase 1 report (June 2014)
• Prioritization effort (June – Aug 2014)
• Project approval (Sept 2014)
• Working group formation (Oct 2014)
• Consensus development (Nov 2014 – Dec 2015)
• Trial use period (Dec 2015 – Mar 2016)
• Publication of final recommendations (Jun 2016)
Alternative Assessments of our Assessment Initiative
• White paper downloaded 3,599 times in 60 days
• 21 substantive comments received
• 120 in-person and virtual participants at the meetings
• These three meetings attracted more than 400 RSVPs
• Goal was to generate about 40 ideas; in total, more than 250 were generated
• Recordings of project work downloaded more than 10,000 times
• More than 440 direct tweets using the #NISOALMI hashtag
• Five articles in traditional news publications
• 15 blog posts about the initiative
We all want our own
For more
Project Site: www.niso.org/topics/tl/altmetrics_initiative/
White Paper:
http://www.niso.org/apps/group_public/download.php/13295/niso_altmetrics_white_paper_draft_v4.pdf
Questions?
Todd Carpenter
Executive Director
[email protected]
National Information Standards Organization (NISO)
3600 Clipper Mill Road, Suite 302
Baltimore, MD 21211 USA
+1 (301) 654-2512
www.niso.org