Discussion From Republic of Science to Audit Society, Irwin Feller

S. Charlot, ASIRPA, Paris, France; June 13, 2012


Page 1: Discussion From Republic of Science to Audit Society, Irwin Feller

Discussion From Republic of Science to Audit Society, Irwin Feller

S. Charlot

ASIRPA, Paris, France; June 13, 2012

Page 2: Discussion From Republic of Science to Audit Society, Irwin Feller

Outline

• New Questions/Issues & What’s at Stake; How Are They Answered?

• Validity of Performance Metrics and Methodological Choice(s)

– Econometrics

• Use, Non-Use and Misuse of Research Assessments

Page 3: Discussion From Republic of Science to Audit Society, Irwin Feller

Pre-New Public Management Assessment Paradigm

• Republic of Science (M. Polanyi)
• Peer (Expert) Review
• Social Contract

Page 4: Discussion From Republic of Science to Audit Society, Irwin Feller

New Public Management Paradigm

• Accountability

• Deregulation

• Competition (among different uses of public funds)

• Performance Measurement (for evaluating research uses)

Page 5: Discussion From Republic of Science to Audit Society, Irwin Feller

Promises of Research Performance Assessment

• Objectives provide useful baseline for assessing performance.

• Performance measurement focuses attention on the end objectives of public policy, on what’s happened or happening outside rather than inside the black box.

• Well defined objectives and documentation of results facilitate communication with funders, performers, users, and others.

Page 6: Discussion From Republic of Science to Audit Society, Irwin Feller

Limitations of Research Performance Measurement

• Returns/impacts to research are uncertain, long-term, and circuitous
• Specious precision in the selection of measures
• Impacts are typically dependent on complementary actions by agents outside of Federal agency control
• Limited (public) evidence of contributions to improved decision making
• Benefits from “failure” are underestimated
• Distortion of incentives: opportunistic behavior (junior researchers seeking employment, senior researchers chasing future funding)

First comment/issue: the role of creativity and highly innovative ideas in scientific progress (i.e., “scientific revolutions”)

Page 7: Discussion From Republic of Science to Audit Society, Irwin Feller

Overview of Evaluation Methodologies

Method: Analytical conceptual modeling of underlying theory
Brief description: Investigating underlying concepts and developing models to advance understanding of some aspect of a program, project, or phenomenon.
Example of use: To describe conceptually the paths through which spillover effects may occur.

Method: Survey
Brief description: Asking multiple parties a uniform set of questions about activities, plans, relationships, accomplishments, value, or other topics, which can be statistically analyzed.
Example of use: To find out how many companies have licensed their newly developed technology to others.

Method: Case study – descriptive
Brief description: Investigating in depth a program or project, a technology, or a facility, describing and explaining how and why developments of interest have occurred.
Example of use: To recount how a particular joint venture was formed, how its participants shared research tasks, and why the collaboration was successful or unsuccessful.

Method: Case study – economic estimation
Brief description: Adding to a descriptive case study quantification of economic effects, such as through benefit-cost analysis.
Example of use: To estimate whether, and by how much, benefits of a project exceed its costs.

Method: Econometric and statistical analysis
Brief description: Using tools of statistics, mathematical economics, and econometrics to analyze functional relationships between economic and social phenomena and to forecast economic effects.
Example of use: To determine how public funding affects private funding of research.

Method: Sociometric and social network analysis
Brief description: Identifying and studying the structure of relationships by direct observation, survey, and statistical analysis of secondary databases to increase understanding of social organizational behavior and related economic outcomes.
Example of use: To learn how projects can be structured to increase the diffusion of resulting knowledge.

Page 8: Discussion From Republic of Science to Audit Society, Irwin Feller

Second comment/question: complementarities between methodologies

Econometric modeling needs analytical conceptual modeling of the underlying theory to be pertinent.

Econometric analysis also needs to take the policy design, the context, etc. into account to be pertinent (surveys, case studies, ...).

No econometric identification of impacts without these components in the evaluation model.

Page 9: Discussion From Republic of Science to Audit Society, Irwin Feller

Complementarity, second example: benefit-cost analysis can be carried out with an econometric model.

Conduct Technical Analysis → Identify Next Best Alternative → Estimate Program Costs → Estimate Economic Benefits → Determine Agency Attribution → Estimate Benefits of Economic Return

(RTI 2010)
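
A minimal numerical sketch of that chain of steps is given below; the figures, the attribution share, and the counterfactual scaling are purely illustrative assumptions, not values from the slides or from RTI (2010).

```python
# Illustrative benefit-cost sketch: benefits are measured relative to the next
# best alternative, scaled by the share attributable to the agency, discounted,
# and compared with program costs. All numbers are made up for illustration.
import numpy as np

discount_rate = 0.07
years = np.arange(10)

program_costs  = np.array([5.0, 5.0, 2.0] + [0.5] * 7)      # $M per year (assumed)
gross_benefits = np.array([0.0, 0.0, 1.0, 3.0, 5.0, 6.0, 6.0, 6.0, 5.0, 4.0])
counterfactual = 0.4 * gross_benefits                        # next best alternative (assumed)
agency_share   = 0.6                                         # attribution to the agency (assumed)

# Net benefits attributable to the program, relative to the counterfactual
net_benefits = agency_share * (gross_benefits - counterfactual)

discount = 1.0 / (1.0 + discount_rate) ** years
npv_benefits = (net_benefits * discount).sum()
npv_costs    = (program_costs * discount).sum()

print(f"benefit-cost ratio: {npv_benefits / npv_costs:.2f}")
print(f"net present value:  {npv_benefits - npv_costs:.2f} $M")
```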

Page 10: Discussion From Republic of Science to Audit Society, Irwin Feller

Group                        Before (τ)      After (τ + 1)
Treatment group              S_T(τ)          S_T(τ + 1)
Comparison/control group     S_C(τ)          S_C(τ + 1)

Issue: the before/after design shows changes “related” to the policy intervention, but does not adjust for “intervening” factors (threats to internal validity).

Reframed analysis: did the policy “cause” change(s) in the treatment group different from those observable in a comparison/control group?

Microeconometrics of policy evaluation
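
The reframed before/after vs. treatment/control comparison is the standard difference-in-differences setup. The sketch below (not from the slides; all numbers are illustrative, numpy only) shows how a naive before/after change on the treated group confounds the policy effect with a common trend, while differencing against the comparison group nets the trend out.

```python
# Difference-in-differences sketch for the 2x2 design on the slide:
# S_T(tau), S_T(tau+1) for the treated group and S_C(tau), S_C(tau+1)
# for the comparison group. All parameters below are illustrative.
import numpy as np

rng = np.random.default_rng(0)

n = 500                 # units per group (assumed)
common_trend = 0.8      # change affecting both groups ("intervening" factor)
true_effect = 0.5       # causal effect of the policy on the treated

# Outcomes S at time tau (before) and tau + 1 (after)
s_control_before = rng.normal(2.0, 1.0, n)
s_control_after  = s_control_before + common_trend + rng.normal(0, 0.3, n)
s_treat_before   = rng.normal(2.2, 1.0, n)
s_treat_after    = s_treat_before + common_trend + true_effect + rng.normal(0, 0.3, n)

# Naive before/after comparison on the treated mixes the policy effect
# with the common trend (threat to internal validity)
before_after = s_treat_after.mean() - s_treat_before.mean()

# Difference-in-differences nets out the common trend
did = ((s_treat_after.mean() - s_treat_before.mean())
       - (s_control_after.mean() - s_control_before.mean()))

print(f"naive before/after estimate: {before_after:.2f}")   # ~ trend + effect
print(f"difference-in-differences:   {did:.2f}")            # ~ true effect
```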

Page 11: Discussion From Republic of Science to Audit Society, Irwin Feller

Third comment/question. Econometric enhancements: non-parametric analysis places no a priori constraint on the relationship between the outcome (whatever outcome is chosen) and R&D spending or funding; no knowledge production function is assumed a priori.

Taking into account the effect of unobservable or time-varying characteristics on outcomes: context, context, context.
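
As an illustration of the non-parametric point, the sketch below (illustrative data, not from the slides) fits a Nadaraya-Watson kernel regression of an outcome on R&D spending: the shape of the relationship is estimated from the data rather than imposed through a parametric knowledge production function.

```python
# Non-parametric (Nadaraya-Watson) regression of an outcome on R&D spending,
# numpy only. The data-generating process below is purely illustrative.
import numpy as np

rng = np.random.default_rng(1)

rd_spending = rng.uniform(0, 10, 300)                        # hypothetical R&D spending
outcome = np.log1p(rd_spending) + rng.normal(0, 0.2, 300)    # unknown true relationship

def kernel_regression(x_grid, x, y, bandwidth=0.8):
    """Nadaraya-Watson estimator with a Gaussian kernel."""
    # weights[i, j] = K((x_grid[i] - x[j]) / h)
    weights = np.exp(-0.5 * ((x_grid[:, None] - x[None, :]) / bandwidth) ** 2)
    return (weights * y).sum(axis=1) / weights.sum(axis=1)

grid = np.linspace(0.5, 9.5, 50)
fitted = kernel_regression(grid, rd_spending, outcome)

# 'fitted' traces the estimated outcome-R&D relationship without assuming
# any parametric functional form a priori.
print(fitted[:5])
```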

Page 12: Discussion From Republic of Science to Audit Society, Irwin Feller

Fourth comment/question. The “dominant” methodology in the U.S., but also in the European Union, is expert panels.

Problem of network effects.

The same issue arises with peer evaluation and bibliometrics. Is it only an issue for “low impacts” (publications, ...) but not for high impacts???

Page 13: Discussion From Republic of Science to Audit Society, Irwin Feller

Is Anyone Listening? My small experience (one evaluation report): no one is listening.

As a researcher, I agree that “Doing good may not make you happy, but doing wrong will certainly make you unhappy.”

But for a novice at evaluating policy, what are the arguments for not giving up this kind of intellectual exercise (apart from publishing or funding research)?

What type of advice?

For ASIRPA?

Page 14: Discussion From Republic of Science to Audit Society, Irwin Feller

Thank you