Reinhilde Veugelers - CBS · Science policy based on short-term bibliometric indicators and journal Impact Factor has a bias against novelty

Reinhilde Veugelers Prof@KULeuven-MSI; ERC Scientific Council Member; Senior Fellow at Bruegel;


TRANSCRIPT

Page 1:

Reinhilde Veugelers

Prof@KULeuven-MSI; ERC Scientific Council Member; Senior Fellow at Bruegel;

Page 2:
Page 3:

EU science is only slowly catching up with the US top; China is catching up fast, also to the top.

The EU’s low growth & austerity are leading to shrinking public (research) budgets (in contrast to China’s expanding budget).

Monitoring and evaluation of public research budgets:
◦ More emphasis on (measuring) the impact of (public) research (funding) on society
◦ More emphasis on the contribution of public research to (local) economic & societal development
◦ Public funders more short-term impact oriented, …

Page 4:

Excellence
Big impact, wide impact
Risky (big breakthroughs, high failure probability)
Creates new research fields
Novel: new recombinations of know-how (existing pieces of know-how in new applications)
Crossing disciplines
…

Page 5:

Frontier research is (like basic research in general) a public good, which is “undersupplied” and therefore motivates public funding

Frontier research is especially important for advancement in science:
◦ Instigates a multitude of incremental improvements

Frontier research is disproportionately important for linking to technology and innovations

Frontier research is more likely to be undersupplied than basic research in general:
◦ Lower incentives for scientists to do more risky research

Fundamental reason for government support of public research is to promote risk taking (Arrow 1962)

Page 6:

Competitive selection procedures are increasingly accused of favoring “safe” projects which exploit existing knowledge at the expense of novel projects that explore untested waters.

Increased reliance in evaluations on bibliometric measures, particularly short-term bibliometric measures (Leiden Manifesto; Martin editorial):
◦ “Instant bibliometrics for reviewers”: JIF, three-year citation window

NB: this goes beyond funding decisions:
◦ Hiring and promotion decisions based in part on short-term bibliometric measures (e.g. Italy, …)
◦ Allocation of research funds to universities and departments within some countries based on such measures (e.g. Netherlands, UK, Flanders, …)

Why are these trends related?

Page 7:

◦ Develop a bibliometric measure of novelty: papers making new combinations of journal references, taking into account the difficulty of making such new combinations through the distance between the journals

◦ Study relationship between novelty and citations, using 2001 WoS journal articles.

Stephan, Veugelers, Wang, 2017, Evaluators blinkered by bibliometrics, Nature, 544, 411-412.
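The novelty measure can be sketched in code. The following is a toy illustration, not the authors’ exact implementation: score each paper by the journal pairs in its reference list that have never appeared together before, weighting each new pair by the distance between the two journals’ co-citation profiles (all journal names and profiles below are hypothetical).

```python
import math
from itertools import combinations

def cosine_distance(profile_a, profile_b):
    """1 - cosine similarity of two journals' co-citation profiles
    (journal -> co-citation count); larger = more distant fields."""
    dot = sum(profile_a.get(j, 0) * profile_b.get(j, 0) for j in profile_a)
    norm_a = math.sqrt(sum(v * v for v in profile_a.values()))
    norm_b = math.sqrt(sum(v * v for v in profile_b.values()))
    if norm_a == 0 or norm_b == 0:
        return 1.0
    return 1.0 - dot / (norm_a * norm_b)

def novelty_score(referenced_journals, known_pairs, profiles):
    """Sum distances over journal pairs that appear together for the
    first time in this paper's reference list (new combinations)."""
    score = 0.0
    for a, b in combinations(sorted(set(referenced_journals)), 2):
        if (a, b) not in known_pairs:  # a combination never made before
            score += cosine_distance(profiles[a], profiles[b])
    return score
```

Distant new combinations contribute more to the score, matching the idea that harder-to-make recombinations signal more novelty.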

Page 8:

Find a “high risk/high gain” profile of novel research:
◦ Higher average citations, but also higher variance in citations;
◦ More likely to become top cited (top 1%), but only when using a long enough time window (at least 4 years);
◦ More likely to stimulate follow-on breakthroughs;
◦ Appreciation of novel research comes from outside its own field, not from within its field.

These are the characteristics one expects if novelty is correlated with breakthrough research.

Also find a bias against novelty in standard bibliometric indicators:
◦ Less likely to be highly cited in the typical short-term citation window;
◦ More likely to be published in journals with a lower Journal Impact Factor.

Findings in a Nutshell

Page 9:

Findings in a Nutshell

Page 10:

Novel papers are less likely to be published in high JIF journals

And even if they do get into high-JIF journals, they face delayed recognition

Findings in a Nutshell

Page 11:

Science policy based on short term bibliometric indicators and journal Impact Factor has a bias against novelty

Over-reliance on such measures:
◦ Directly discourages novel research that might be of great value.
◦ Indirectly misses follow-on breakthroughs built on novel research.

These findings may help explain why funding agencies that are increasingly relying on bibliometric indicators are at the same time perceived as being increasingly risk averse.

The results also point to the importance of having interdisciplinary panels evaluate research.

Page 12:

Funders should not provide (or ask applicants to provide) short-term bibliometric measures, and should prevent them from being used as decisive in reviews of grant proposals.
They should insist on multiple ways to assess applicants’ and institutions’ publications.
They should resist evaluating success based on short-term citation counts and journal impact factors.
They should also include experts with outside-field expertise.
Panel members should resist seeking out and relying too much on metrics, especially when calculated over less than a three-year window.

Page 13:
Page 14:

Excellence as the only criterion
Support for the individual scientist (no networks!)
Global peer review
No predetermined subjects (bottom-up)
Support of frontier research in all fields of science and the humanities



Scientific governance: independent Scientific Council with 22 members; full authority over funding strategy

Budget: €13 billion (2014-2020), ~€1.9 billion/year; €7.5 billion (2007-2013), ~€1.1 billion/year. The ERC represents 17% of the total EC H2020 budget.

Page 15:

“the ERC aims at reinforcing excellence, dynamism and creativity in European research by funding investigator-driven projects of the highest quality at the frontiers of knowledge”.

“its grants will help to bring about new and unpredictable scientific and technological discoveries - the kind that can form the basis of new industries, markets, and broader social innovations of the future”.

“Scientific excellence is the sole selection criterion. In particular, high risk/high gain pioneering proposals which go beyond the state of the art, address new and emerging fields of research, introduce unconventional, innovative approaches are encouraged”.

Page 16:

ERC’s subsidiarity over Member States’ public funding:
◦ Scale advantages from larger pooling of projects and selection expertise
◦ Scale advantages especially important for risky frontier research

The ERC should be able to leverage its scale, quality and reputation to overcome the risk aversion trap of its panels

Page 17:

The evaluation of ERC grant applications is conducted by peer-review panels composed of scholars selected by the ERC Scientific Council from all over the world, assisted by remote referees. Typically 375 panel members and 2,000 remote referees per call; about 15% of panel members are from outside the EU.

Reviewers are asked to evaluate the proposals on their ground-breaking nature and their level of ambition to go beyond the state of the art and push the frontier.

Panels decide on the ranking/who-gets-funded

The ERC does not provide or ask for bibliometric indicators (JIF, citations, …). The ERC instructs its panels to consider only the submitted material (i.e. not to look up or use other information). Nevertheless, PIs often self-report such indicators in their applications (often advised by their host institutions/peers), and panel members are often found to search for bibliometric indicators on their own.


Page 18:


ERC’s performance since 2007

• Headline KPI: share of publications from ERC funds in the top 1% highly cited. Target in H2020: 1.6%; realised since 2007: 7% of ERC-acknowledging publications were among the top 1% (i.e. > 5,500 pubs).
• International prizes/awards of ERC grantees, e.g. 6 Nobel prize winners as grantees.
• Qualitative assessment (pilot project): 71% of the first 200 completed ERC-funded projects made scientific breakthroughs and major advances in science (as judged by panels of peer reviewers).

Page 19:

KPIs for the ERC should be on whether it is the best mode to support frontier research:
◦ Does its peer-review system select and support frontier research?
◦ Do its funded teams deliver frontier research?
◦ Does it do this better than the counterfactuals?

Beyond excellence/highly cited papers, assessing frontier research: the top-1% highly cited share (= headline KPI) only captures the “high gain” part of frontier research, and it typically uses a short-term window to calculate citations (< 3 years).

◦ Develop new KPIs for monitoring
◦ Proper quantitative and qualitative evaluation

Page 20:

What does the ERC select? Check big impact, novelty, and interdisciplinarity of grantees on pre-grant publications:
◦ Comparing granted vs rejected ERC applicants
◦ Comparing marginally accepted vs marginally rejected ERC applicants
◦ Comparing ERC applicants with non-applicants

What is the impact of ERC funding? Check big impact, novelty, and interdisciplinarity of grantees on post-grant publications:
◦ Compared to the counterfactual: similar grantees without ERC funding
◦ Various techniques to assess causality, each with their own problems: Difference-in-Differences, Regression Discontinuity Design

Note: the ERC is only just starting to have finished grants (2007, 2008, 2009 calls).

Page 21:

Difference-in-Differences

Compare the before/after of the grantees (treatment group) with the before/after of the control (matched) group:
◦ Compare, for the grantees (treatment group), output before and after the funding
◦ Compare, for the non-successful applicants (control group), output before and after the application
◦ Compare the first difference of the grantees with the first difference of the non-successful applicants

◦ Eliminates fixed individual effects (e.g. talent)
◦ Eliminates common trends

The key “identifying” assumption is that the development of the potential for scientific output is similar for those that eventually receive funding and those who don’t (common trend), so that the DiD is due to the funding.

If the program’s aim is to select exactly those PIs with the most potential for impact research (which is not a fixed effect), then the fundamental assumption behind DiD is violated (the common-trend assumption fails). NB: even if this jeopardizes the assessment of the causality of the funding, it is still nice to know for the ERC that it managed to select the PIs with the most potential for impact research, i.e. the DiD effect may be due not to the funding but to the selection of high potentials.
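The comparison described above reduces to a difference of two before/after differences of group means. A minimal pure-Python sketch on hypothetical publication counts (the actual analysis uses regressions with controls):

```python
def did_estimate(records):
    """Difference-in-differences on group means.
    records: (funded, after, outcome) tuples, e.g. publication counts."""
    def group_mean(funded, after):
        vals = [y for f, a, y in records if f == funded and a == after]
        return sum(vals) / len(vals)
    delta_treated = group_mean(True, True) - group_mean(True, False)
    delta_control = group_mean(False, True) - group_mean(False, False)
    # treated group's change, net of the control group's common trend
    return delta_treated - delta_control

# hypothetical data: (funded?, after grant?, # publications)
records = [(True, False, 2.0), (True, True, 5.0),
           (False, False, 1.0), (False, True, 2.0)]
effect = did_estimate(records)  # grantees gain 3, controls gain 1
```

If the common-trend assumption fails (e.g. because the ERC selects exactly the high-potential PIs), this number mixes the funding effect with the selection effect, as noted above.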

Page 22:

Regression Discontinuity Design

Match the marginally funded with the marginally rejected. The idea is that the decision to marginally fund or reject is close to “random”.

Only examines the treatment effect at the threshold. But what about the treatment effect at the top of the quality distribution?
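A minimal sketch of a sharp RDD estimator along these lines: fit a separate line on each side of the cutoff within a bandwidth, and take the jump in the fitted values at the cutoff. Scores, outcomes, cutoff, and bandwidth below are all hypothetical.

```python
def rdd_estimate(scores, outcomes, cutoff, bandwidth):
    """Sharp RDD: local linear fit on each side of the cutoff,
    treatment effect = jump of the fitted lines at the cutoff."""
    def fitted_value_at_cutoff(pairs):
        # least-squares line y = a + b*(x - cutoff); returns a
        n = len(pairs)
        xs = [x - cutoff for x, _ in pairs]
        ys = [y for _, y in pairs]
        mx = sum(xs) / n
        my = sum(ys) / n
        sxx = sum((x - mx) ** 2 for x in xs)
        sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        b = sxy / sxx if sxx > 0 else 0.0
        return my - b * mx
    window = [(x, y) for x, y in zip(scores, outcomes)
              if abs(x - cutoff) <= bandwidth]
    above = [(x, y) for x, y in window if x >= cutoff]   # marginally funded
    below = [(x, y) for x, y in window if x < cutoff]    # marginally rejected
    return fitted_value_at_cutoff(above) - fitted_value_at_cutoff(below)
```

As the slide notes, this identifies the effect only at the threshold; it says nothing about the treatment effect for top-of-distribution applicants.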

Page 23:

                               BEFORE               AFTER
                               Funded   Unfunded    Funded   Unfunded
Ratio of TOP 1% papers (C5)    6%       3% ***      5%       3% ***
Ratio of TOP 1% NOVEL (HIGH)   2%       2% ns       2%       2% ns

Insignificant diff-in-diffs

Page 24:

BEFORE                         Funded   Rejected Step 1   Rejected Step 2
Ratio of TOP 1% papers (C5)    6%       2%                4%
Ratio of TOP 1% NOVEL (HIGH)   2%       2%                2%

Page 25:

                 # pubs      Ratio of papers    Ratio of pubs top    Ratio of novel
                 (Poisson)   in JIF top 1%      1% cited, 5y         papers
                             (OLS)              (OLS)                (OLS)
FUNDED           0.509***    0.012***           0.032***             -0.011
                 (0.065)     (0.003)            (0.006)              (0.009)
AFTER            0.445***    0.002              -0.008               0.007
                 (0.067)     (0.003)            (0.005)              (0.010)
FUNDED * AFTER   -0.193**    0.000              -0.012               0.001
                 (0.087)     (0.004)            (0.008)              (0.012)

• Sample: call year = 2007 or 2008; similar results for all call years
• 5y before/after the call year; similar results for 3y before/after
• Controls for gender, age, nationality, year and panel fixed effects

Page 26:

All call years. Controls: call schemes, panels, and call year fixed effects.

All projects
                 # pubs      # top cited      # novel pubs
                 (Poisson)   pubs, 3y         (Poisson)
                             (Poisson)
FUNDED           0.301***    0.663***         0.216***
                 (0.029)     (0.068)          (0.046)
AFTER            -0.191***   -0.160*          -0.164***
                 (0.035)     (0.086)          (0.049)
FUNDED * AFTER   -0.001      -0.083           0.002
                 (0.045)     (0.107)          (0.067)
N                9942        9942             9942
R2               0.335       0.165            0.195
Log lik          -88128      -15923           -25765
chi2             4968***     1451***          2015***

Borderline projects
                 # pubs      # top cited      # novel pubs
                 (Poisson)   pubs, 3y         (Poisson)
                             (Poisson)
FUNDED           -0.101      -0.017           -0.178
                 (0.089)     (0.264)          (0.137)
AFTER            -0.243***   -0.312           -0.263*
                 (0.093)     (0.245)          (0.151)
FUNDED * AFTER   0.001       0.141            0.109
                 (0.145)     (0.448)          (0.215)
N                688         688              688
R2               0.391       0.249            0.233
Log lik          -5538       -1223            -1702
chi2             599***      3370***          3948***

Page 27:

The ERC is good at selecting “high gain”, less so at selecting “high risk”?

It is too soon yet for post-grant impact analysis, but the ERC’s evaluation procedure should be able to pick up early warning signs and adjust.

Page 28:

◦ Differential treatment effects: M/F, AdG/StG, and fields; along the quality distribution
◦ Timing of the treatment effects: an initial (selection) effect which later disappears, or… a long time to effect (for novelty)
◦ Other dimensions of frontier research/impact: cross-field and outside-field effects
◦ Impact on the careers of PIs & their hires, and their position in the scientific community
◦ Impact on technology: scientific publications as references in patent applications, patents, spin-offs
◦ International orientation: extra-EU co-authorship
◦ …

Page 29:

Novelty is only one measure of frontier research; others are needed. Not all frontier research is “novel”.

It is important for public agencies to have a portfolio that includes risk; not all funded research should be risky. There is a real role for “ditch diggers”.