Carpenter Library Assessment Conference Presentation

When assessing impact, be sure you’re comparing digital apples to digital apples: Report of Phase 1 of NISO New Assessment Initiative
Todd Carpenter, Executive Director, NISO
August 4, 2014


DESCRIPTION

Todd Carpenter's presentation at the Library Assessment Conference 2014, held in August 2014 at the University of Washington in Seattle, WA. In this presentation, Todd covered the output of Phase One of NISO's alternative metrics assessment initiative.

TRANSCRIPT

Page 1: Carpenter Library Assessment Conference Presentation

When assessing impact, be sure you’re comparing digital apples to digital apples: Report of Phase 1 of NISO New Assessment Initiative

Todd Carpenter, Executive Director, NISO
August 4, 2014

Page 2

About

•  Non-profit industry trade association accredited by ANSI
•  Mission of developing and maintaining technical standards related to information, documentation, discovery and distribution of published materials and media
•  Volunteer-driven organization: 400+ contributors spread out across the world
•  Responsible (directly and indirectly) for standards like ISSN, DOI, Dublin Core metadata, DAISY digital talking books, OpenURL, MARC records, and ISBN

Page 3

Page 4

What are the infrastructure elements of alternative assessments?

Page 5

Basic Definitions
(So we are all talking about the same thing)

Page 6

Page 7

Element Identification

Page 8

At what granularity?

Page 9

How long do we measure?

Page 10

Page 11

Consistency across providers

Source: Scott Chamberlain, "Consuming Article-Level Metrics: Observations and Lessons from Comparing Aggregator Provider Data," Information Standards Quarterly, Summer 2013, Vol. 25, Issue 2.

Page 12

TRUST =

Page 13

What is NISO working toward?

Page 14

Page 15

Page 16

Steering Committee
•  Euan Adie, Altmetric
•  Amy Brand, Harvard University
•  Mike Buschman, Plum Analytics
•  Todd Carpenter, NISO
•  Martin Fenner, Public Library of Science (PLoS) (Chair)
•  Michael Habib, Reed Elsevier
•  Gregg Gordon, Social Science Research Network (SSRN)
•  William Gunn, Mendeley
•  Nettie Lagace, NISO
•  Jamie Liu, American Chemical Society (ACS)
•  Heather Piwowar, ImpactStory
•  John Sack, HighWire Press
•  Peter Shepherd, Project COUNTER
•  Christine Stohn, Ex Libris
•  Greg Tananbaum, SPARC (Scholarly Publishing & Academic Resources Coalition)

Page 17

Alternative Assessment Initiative

Phase 1 Meetings
•  October 9, 2013 - San Francisco, CA
•  December 11, 2013 - Washington, DC
•  January 23-24, 2014 - Philadelphia, PA
•  Round of 1-on-1 interviews - March/April 2014

Phase 1 report published in June 2014

Page 18

Meetings' General Format
•  Co-located with other industry meetings
•  Morning: lightning talks, post-it brainstorming
•  Afternoon: discussion groups
   – X
   – Y
   – Z
   – Report back/react
•  Live streamed (video recordings are available)

Page 19

Meeting Lightning Talks
•  Expectations of researchers
•  Exploring disciplinary differences in the use of social media in scholarly communication
•  Altmetrics as part of the services of a large university library system
•  Deriving altmetrics from annotation activity
•  Altmetrics for Institutional Repositories: Are the metadata ready?
•  Snowball Metrics: Global Standards for Institutional Benchmarking
•  International Standard Name Identifier
•  Altmetric.com, Plum Analytics, Mendeley reader survey
•  Twitter Inconsistency

"Lightning" by snowpeak is licensed under CC BY 2.0

Page 20

Page 21

Page 22

SF Meeting Discussions
•  Business & Use cases
   – Publishers want to serve authors, make money
   – People don't value a standard, they value something that helps them
   – … Couldn't identify a logical standard need that actors in the space would value, and best practices are of interest
•  Quality & Data science
   – Themes: context, validation, provenance, quality, description & metadata
   – We'll never get to the point where assessment can be done without a human in the loop, but discovery and recommendation can
•  Definitions
   – Define "ALM" and "Altmetrics"
   – Map the landscape
   – We'll never get to the point where assessment can be done without a human in the loop, but discovery and recommendation can

Page 23

DC Meeting Discussions
•  Business and Use Cases
•  Discovery
   – Metrics only get generated if material is discovered
•  Qualitative vs. Quantitative
•  Identifying Stakeholders and their Values
   – Stakeholders in outcomes / stakeholders in process of creating metrics
   – Shared values but tensions
   – Branding
•  Definitions/Defining Impact
   – Metrics and analyses
   – What led to success of citation?
   – How to be certain we are measuring the right things
•  Future Proofing
   – What won't change
   – Impact: hard to establish across disciplines

Page 24

Philly Meeting Discussions
•  Definitions
   – Define life cycle of scholarly output and associated metrics
   – Qualitative versus Quantitative aspects: what is possible to define here
   – Consider other aspects of these data collections
•  Standards
   – Develop definitions (what is a download? what is a view?)
   – Differentiate between scholarly impact versus popular/social use
   – Define sources/characteristics for metrics (social, commercial, scholarly)
•  Data Integrity
   – Counter biases/gaming
   – Association with credible entities, e.g. ORCID iD vs. Gmail account
   – Reproducibility is key
   – Everyone needs to be at the table to establish overall credibility
•  Use cases (3x)

Page 25

30 One-on-One Interviews

Page 26

White Paper Released

Page 27

30 One-on-One Interviews

Takeaways:
•  Problems with the term "Altmetrics"
•  Don't conflate discovery, social interactions, and assessment
•  Be broad in what is an output
•  Clear in the item-level identification
•  Consistency in methodologies to calculate metrics
•  Linking
•  Alignment with identifying/naming conventions for new research output forms

Page 28

Potential work themes
•  Definitions
•  Application to types of research outputs
•  Discovery implications
•  Research evaluation
•  Data quality and gaming
•  Grouping, aggregating, and granularity
•  Context
•  Adoption

Pages 29-36

(Build slides repeating the list of potential work themes, highlighting each in turn: Definitions; Application to types of research outputs; Discovery implications; Research evaluation; Data quality and gaming; Grouping, aggregating, and granularity; Context; Adoption & Promotion.)

Page 37

Alternative Assessment Initiative

Phase 2
•  Presentations of Phase 1 report (June 2014)
•  Prioritization effort (June - Aug 2014)
•  Project approval (Sept 2014)
•  Working group formation (Oct 2014)
•  Consensus development (Nov 2014 - Dec 2015)
•  Trial use period (Dec 2015 - Mar 2016)
•  Publication of final recommendations (Jun 2016)

Page 38

Alternative Assessments of our Assessment Initiative
•  White paper downloaded 3,599 times in 60 days
•  21 substantive comments received
•  120 in-person and virtual participants at the meetings
•  The three meetings attracted more than 400 RSVPs
•  Goal was to generate about 40 ideas; in total, more than 250 were generated
•  Recordings of project work downloaded more than 10,000 times
•  More than 440 direct tweets using the #NISOALMI hashtag
•  Five articles in traditional news publications
•  15 blog posts about the initiative

Page 39

We all want our own

Page 40

For more

Project Site: www.niso.org/topics/tl/altmetrics_initiative/

White Paper: http://www.niso.org/apps/group_public/download.php/13295/niso_altmetrics_white_paper_draft_v4.pdf

Page 41

Questions?

Todd Carpenter
Executive Director
[email protected]

National Information Standards Organization (NISO)
3600 Clipper Mill Road, Suite 302
Baltimore, MD 21211 USA
+1 (301) 654-2512
www.niso.org