
Page 1: Building a Scientific Basis for Research Evaluation

Copyright © 2012 American Institutes for Research. All rights reserved.

Building a Scientific Basis for Research Evaluation

Rebecca F. Rosen, PhD
Senior Researcher
Research Trends Seminar
October 17, 2012

Page 2: Building a Scientific Basis for Research Evaluation

2

Outline

• Science of science policy
• A proposed conceptual framework
• Empirical approaches:
  NSF Engineering Dashboard
  ASTRA – Australia
  HELIOS – France
• Final thoughts

Page 3: Building a Scientific Basis for Research Evaluation

3

Outline

• Science of science policy
• A proposed conceptual framework
• Empirical approaches:
  NSF Engineering Dashboard
  ASTRA – Australia
  HELIOS – France
• Final thoughts

Page 4: Building a Scientific Basis for Research Evaluation

4

The emergence of a science of science policy

• Jack Marburger’s challenge (2005)
• Science of Science & Innovation Policy Program at the National Science Foundation (2007): an emerging, highly interdisciplinary research field
• Science of Science Policy Interagency Task Group publishes a “Federal Research Roadmap” (2008): the data infrastructure is inadequate for decision-making
• STAR METRICS (2010)

Page 5: Building a Scientific Basis for Research Evaluation

5

Why a science of science policy?

• Evidence-based investments
• Good metrics = good incentives
• Science is networked and global
• Build a bridge between researchers and policymakers
• Researchers ask the right questions
• The adjacent possible: leverage existing and new research and expertise
• New tools to describe & measure communication

Page 7: Building a Scientific Basis for Research Evaluation

7

A conceptual framework for a science of science policy

Page 8: Building a Scientific Basis for Research Evaluation

8

Getting the right framework matters

• What you measure is what you get
  Poor incentives
  Falsification
• Usefulness
• Effectiveness

Page 9: Building a Scientific Basis for Research Evaluation

9

A proposed conceptual framework

Adapted from Ian Foster, University of Chicago

Page 10: Building a Scientific Basis for Research Evaluation

10

A framework to drive person-centric data collection

WHO is doing the research

WHAT is the topic of their research

HOW are the researchers funded

WHERE do they work

With WHOM do they work

What are their PRODUCTS
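The six questions above suggest a person-centric record type. Below is a minimal sketch in Python; the class and field names are illustrative, not an official schema.

```python
# A minimal sketch of a person-centric record for the six framework
# questions. Class and field names are illustrative, not an official schema.
from dataclasses import dataclass, field

@dataclass
class ResearcherRecord:
    who: str                                           # WHO is doing the research
    topics: list = field(default_factory=list)         # WHAT is the topic of their research
    funding: list = field(default_factory=list)        # HOW are the researchers funded
    institution: str = ""                              # WHERE do they work
    collaborators: list = field(default_factory=list)  # with WHOM do they work
    products: list = field(default_factory=list)       # what are their PRODUCTS

# Hypothetical example record
rec = ResearcherRecord(who="A. Researcher", institution="Example University")
rec.products.append("doi:10.0000/example")  # a made-up product identifier
```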

Page 11: Building a Scientific Basis for Research Evaluation

11

Challenge – The data infrastructure didn’t exist

However, some of the data do exist

Page 12: Building a Scientific Basis for Research Evaluation

12

Empirical Approaches

Leveraging existing data to begin describing results of the scientific enterprise

Page 13: Building a Scientific Basis for Research Evaluation

13

An empirical approach

• Enhance the utility of enterprise data
• Identify authoritative “core” data elements
• Develop an Application Programming Interface (API): a data platform that provides programmatic access to public (or private) agency information
• Develop a tool to demonstrate value of API
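The API step can be pictured from the client side. Here is a hypothetical sketch of building a programmatic query; the base URL, endpoint path, and parameters are invented for illustration and are not a real agency API.

```python
# A hypothetical sketch of the client side of such an API: building a
# programmatic query against an agency awards endpoint. The base URL,
# path, and parameters are invented for illustration only.
from urllib.parse import urlencode

def award_query_url(base: str, keyword: str, year: int) -> str:
    """Build a query URL for a hypothetical awards endpoint."""
    params = urlencode({"keyword": keyword, "year": year, "format": "json"})
    return f"{base}/awards?{params}"

url = award_query_url("https://api.example.gov/v1", "engineering", 2012)
```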

Page 14: Building a Scientific Basis for Research Evaluation

14

Topic modeling: Enhancing the value of existing data

David Newman - UC Irvine

NSF proposals → Topic Model:
- Use words from (all) text
- Learn T topics

Topic tags for each and every proposal (e.g., t49, t18, t114, t305)

Automatically learned topics (e.g.):

t6. conflict violence war international military …

t7. model method data estimation variables …

t8. parameter method point local estimates …

t9. optimization uncertainty optimal stochastic …

t10. surface surfaces interfaces interface …

t11. speech sound acoustic recognition human …

t12. museum public exhibit center informal outreach

t13. particles particle colloidal granular material …

t14. ocean marine scientist oceanography …
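The workflow on this slide (use all words, learn T topics, tag each proposal) can be sketched with an off-the-shelf topic model. Below is a toy example using scikit-learn's LatentDirichletAllocation; the four mini-proposals and T=2 are illustrative, not the actual NSF corpus or model.

```python
# A toy sketch of the workflow above: vectorize all proposal text,
# learn T topics, and tag each proposal with its dominant topic.
# The four mini-proposals and T=2 are illustrative, not the NSF corpus.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

proposals = [
    "stochastic optimization under uncertainty for optimal control",
    "acoustic speech recognition and human sound perception",
    "colloidal particles and granular material dynamics",
    "marine scientists studying ocean oceanography and ecosystems",
]

# Use words from (all) text
vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(proposals)

# Learn T topics
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(X)   # one topic distribution per proposal

# Topic tag for each and every proposal: the dominant topic index
tags = [int(row.argmax()) for row in doc_topics]
```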

Page 15: Building a Scientific Basis for Research Evaluation

15

Stepwise empirical approach

• Enhance the utility of enterprise data
• Identify authoritative “core” data elements
• Develop an Application Programming Interface (API): a data platform that provides flexible, programmatic access to public (or private) agency information
• Develop a tool to demonstrate value of API

Page 16: Building a Scientific Basis for Research Evaluation

16

Page 17: Building a Scientific Basis for Research Evaluation

17

Stepwise empirical approach

• Enhance the utility of enterprise data
• Identify authoritative “core” data elements
• Develop an Application Programming Interface (API): a data platform that provides programmatic access to public (or private) agency information
• Develop a tool to demonstrate value of API

Page 18: Building a Scientific Basis for Research Evaluation

18

Outline

• Science of science policy
• A proposed conceptual framework
• Empirical approaches:
  NSF Engineering Dashboard
  ASTRA – Australia
  HELIOS – France
• Final thoughts

Page 19: Building a Scientific Basis for Research Evaluation

19

Page 20: Building a Scientific Basis for Research Evaluation

20

Page 21: Building a Scientific Basis for Research Evaluation

21

Page 22: Building a Scientific Basis for Research Evaluation

22

Page 23: Building a Scientific Basis for Research Evaluation

23

Page 24: Building a Scientific Basis for Research Evaluation

24

Outline

• Science of science policy
• A proposed conceptual framework
• Empirical approaches:
  NSF Engineering Dashboard
  ASTRA – Australia
  HELIOS – France
• Final thoughts

Page 25: Building a Scientific Basis for Research Evaluation

25

Linking administrative and grant funding data in Australia

Page 26: Building a Scientific Basis for Research Evaluation

26

Outline

• Science of science policy
• A proposed conceptual framework
• Empirical approaches:
  NSF Engineering Dashboard
  ASTRA – Australia
  HELIOS – France
• Final thoughts

Page 27: Building a Scientific Basis for Research Evaluation

27


Describing public-private partnerships in France

Page 28: Building a Scientific Basis for Research Evaluation

28

What does getting it right mean?

• A community-driven empirical data framework should be:
  Timely
  Generalizable and replicable
  Low cost, high quality
• The utility of “Big Data”:
  Disambiguated data on individuals
  - Comparison groups
  New text mining approaches to describe and measure communication

Page 29: Building a Scientific Basis for Research Evaluation

29

Final thoughts

Page 30: Building a Scientific Basis for Research Evaluation

30

Policy makers can engage SciSIP communities:

• Patent Network Dataverse; Fleming at Harvard and Berkeley

• Medline-Patent Disambiguation; Torvik & Smalheiser at U Illinois

• COMETS (Connecting Outcome Measures in Entrepreneurship Technology and Science); Zucker & Darby at UCLA

Page 31: Building a Scientific Basis for Research Evaluation

31

The power of open research communities

• Internet and data technology can transform the effectiveness of science:
  Informing policy
  Communicating science to the public
  Enabling scientific collaborations

• Interoperability is key

• Publishers are an important part of the community

Page 32: Building a Scientific Basis for Research Evaluation

32

Rebecca F. Rosen, PhD
E-Mail: [email protected]

1000 Thomas Jefferson Street NW
Washington, DC 20007
General Information: 202-403-5000
TTY: 887-334-3499
Website: www.air.org

THANK YOU!

Page 33: Building a Scientific Basis for Research Evaluation

33