
  • Challenges and Opportunities in Peer Review

    A Vision for Ensuring Its Strategic National Value

    Toni Scarpa, scarpat@csr.nih.gov, 301-435-1109

    Federal Demonstration Partnership, Washington, DC, January 25, 2011

    National Institutes of Health, U.S. Department of Health and Human Services

  • Peer Review at CSR

  • CSR Activities

    Fiscal Year   Applications Received   Applications Reviewed   Study Sections   Reviewers   SROs
    2008          73,000                  51,000                  1,600            16,000      240
    2009          113,000                 78,000                  2,200            39,000      240
    2010          88,000                  64,000                  1,700            17,000      240

  • This is CSR

    September 2009

  • The Drivers for Change

  • 1st Driver: The NIH Budget

    [Chart: NIH budget in $ billions by year]

  • 2nd Driver: Number of Applications

    [Chart: applications received, in thousands, 1996-2010]

  • R01 and R21 Received for CSR Review

    [Chart: R01 and R21 applications received for CSR review by submission round (Feb/Jun/Oct), 2007-2011]

  • Applications Received by Month of Fiscal Year

    [Chart: count of applications received in each month (Oct-Sep), FY 2001 through FY 2011; horizontal scale 0-110,000]

  • 3rd Driver: Reviewer's Load

    [Chart: applications per reviewer by year]

  • 4th Driver: CSR Budget

    [Chart: CSR reviewer cost in constant $ millions, 2004-2009]

    The cost of peer review, including travel and a small honorarium for 20,000 reviewers, is 0.4-0.6% of the funds requested.
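    A rough plausibility check on that percentage. Both dollar figures below are illustrative assumptions chosen only to show the arithmetic; they are not values taken from the slides.

    ```python
    # Back-of-the-envelope check of the "0.4-0.6% of funds requested" figure.
    # Both inputs are illustrative assumptions, not values from the slides.
    review_cost = 50e6        # assumed annual cost of CSR peer review, in dollars
    funds_requested = 10e9    # assumed total funds requested in the applications reviewed

    share = review_cost / funds_requested
    print(f"Review cost as a share of funds requested: {share:.2%}")  # prints 0.50%
    ```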

  • Annual Savings in Reviewers' Expense Budget

    • Non-refundable tickets with one possible change: $17 million

    • 3,000 fewer reviewers: $3 million

    • 20% of reviews using electronic platforms: $6 million

    • One meeting a year on the West Coast: $1.8 million

    • Replacing CDs with zApp: $1 million
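    Summing the itemized amounts above gives the rough annual total; a minimal tally using the figures as listed on the slide:

    ```python
    # Tally of the itemized annual savings listed above (amounts in millions of dollars).
    savings = {
        "Non-refundable tickets with one possible change": 17.0,
        "3,000 fewer reviewers": 3.0,
        "20% of reviews using electronic platforms": 6.0,
        "One meeting a year on the West Coast": 1.8,
        "Replacing CDs with zApp": 1.0,
    }
    total = sum(savings.values())
    print(f"Estimated total annual savings: ${total:.1f} million")  # -> $28.8 million
    ```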

  • Enhancing Peer Review

  • Major Complaints About NIH Peer Review

    • The process is too slow

    • There are not enough senior/experienced reviewers

    • The process favors predictable research instead of significant, innovative, or transformative research

    • The time and effort required to write and review are a heavy burden on applicants and reviewers

  • Enhancing Peer Review

    1. Addressing Review and Funding for New Investigators

    2. Reviewing Transformative Research

    3. Funding Promising Research Earlier

    4. Shortening the Review Time

    5. Improving Study Section Alignment with Science

    6. Recruiting and Retaining the Best Reviewers

    7. Advancing Additional Review Platforms

    8. Focusing More on Impact and Significance

    9. Saving Reviewers' Time and Effort

    10. Enhancing Peer Review Training

    11. Continuously Reviewing the Changes

  • Enhancing Peer Review

    1. Addressing Review and Funding for New Investigators

    • Use different paylines for New Investigators and Early Stage Investigators (R01 only)

    • Cluster the reviews of New Investigator R01 applications so they are discussed together

  • Funding New Investigators

    New and Experienced Investigators on R01 Equivalent Grants and New Investigators as a Percentage of All Competing R01 Awardees (FY 1962-2010, preliminary)

    [Chart: number of established and new R01 equivalent principal investigators per fiscal year, 1962-2010 (est.), with new investigators shown as a percentage of all competing awardees]

  • Funding Longevity of NIH Investigators

    [Chart: half-life of NIH R01 support, in years, by year of initial R01 support]

  • Enhancing Peer Review

    2. Reviewing Transformative Research

    • Editorial Board Review (3 stages):
      Initial scoring based on innovation and potential for scientific transformation by a small study section of distinguished, broad-science reviewers (the editors)
      Specific science reviewed by appropriate reviewers (subject experts, the editorial board)
      Final ranking by the editors in a face-to-face meeting

  • Enhancing Peer Review

    3. Shortening the Review Time

    [Chart: months from submission to posting of summary statements, for 2005, 2007, and 2009]

  • Enhancing Peer Review

    4. Funding the most promising research earlier

    [Chart: percent of total awards by fiscal year, 1998-2008]

  • Funding the most promising research earlier

    [Chart: percent of total awards by fiscal year, 1998-2010]

  • Enhancing Peer Review

    5. Improving Study Section Alignment

    • Input from the community

    • Internal IRG reviews

    • Open houses

    • Advisory Committee

  • Positional Map of Musculoskeletal Tissue Engineering Study Section

  • Positional Map of Membrane Biology and Protein Processing Study Section

  • Enhancing Peer Review

    6. Recruiting the Best Reviewers

    [Chart: number of chartered and temporary reviewers per year, 2000-2010]

  • 6. Recruiting the Best Reviewers: Academic Rank of All CSR Reviewers

    [Chart: percentage of all CSR reviewers who are Professors, Associate Professors, and Assistant Professors, 1998-2008]

  • Recruiting the Best Reviewers: Some Successful Strategies

    • Move one meeting a year to the West Coast

    • Additional review platforms

    • Develop a national registry of volunteer reviewers: a searchable database with 5,000 reviewers

    • Provide tangible rewards for reviewers: no submission deadlines for chartered members of study sections (effective February 2008)

    • Provide flexible time for reviewers: choice of 3 times/year for 4 years or 2 times/year for 6 years

  • Enhancing Peer Review

    7. Advancing Additional Review Platforms

    • Additional review platforms help in recruiting reviewers

    • Electronic review modes reduce travel

    • Electronic reviews: telephone assisted meeting, video assisted meeting, internet assisted meeting (previously AED)

  • Advancing Additional Review Platforms, What It Looks Like: Video Assisted Meeting

  • Telepresence Study Sections

  • Enhancing Peer Review

    8. Focusing More on Impact and Significance and Less on Approach

    • Shorten applications (13 or 7 pages instead of 25 or 12)

    • Scoring significance

    • Discussed applications receive an additional overall impact score

    • Training of reviewers and chairs

  • Enhancing Peer Review

    9. Saving Reviewers' Time

    • Shorter applications

    • Bullet critiques

    • Additional review platforms

  • Template-Based Critiques

    • The objective is to write evaluative statements and to avoid summarizing the application

    • Comments should be in the form of bullet points or, if necessary, short narratives

    • The entire template is uploaded to IAR to become part of the summary statement

    1. Significance (please limit text to ¼ page)
       Strengths
       •
       Weaknesses
       •
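    A minimal sketch of how a bullet-style critique section like the one above might be assembled. The section name and the page-limit note follow the template on this slide; the function and variable names are hypothetical illustrations, not part of the IAR system.

    ```python
    # Hypothetical sketch of a bullet-point critique section like the template above.
    # Section names mirror the slide; everything else is illustrative.
    def critique_section(name: str, strengths: list[str], weaknesses: list[str],
                         limit_note: str = "Please limit text to 1/4 page") -> str:
        lines = [f"{name} ({limit_note})", "Strengths"]
        lines += [f"  • {s}" for s in strengths]
        lines += ["Weaknesses"]
        lines += [f"  • {w}" for w in weaknesses]
        return "\n".join(lines)

    print(critique_section(
        "1. Significance",
        strengths=["Addresses an important gap", "Strong preliminary data"],
        weaknesses=["Impact on the field is modest"],
    ))
    ```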

  • Scoring

    Impact Band       Score   Descriptor
    High Impact       1       Exceptional
                      2       Outstanding
                      3       Excellent
    Moderate Impact   4       Very Good
                      5       Good
                      6       Satisfactory
    Low Impact        7       Fair
                      8       Marginal
                      9       Poor
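    The table above maps each 1-9 score to a descriptor and an impact band. A small lookup capturing exactly that mapping; the final line assumes the post-2009 convention that the final overall impact score is the mean of the panel's 1-9 scores multiplied by 10, which is consistent with the 10-90 scale in the charts that follow.

    ```python
    # Lookup table reproducing the 9-point impact scoring scale from the slide.
    DESCRIPTORS = {
        1: ("Exceptional", "High Impact"),
        2: ("Outstanding", "High Impact"),
        3: ("Excellent", "High Impact"),
        4: ("Very Good", "Moderate Impact"),
        5: ("Good", "Moderate Impact"),
        6: ("Satisfactory", "Moderate Impact"),
        7: ("Fair", "Low Impact"),
        8: ("Marginal", "Low Impact"),
        9: ("Poor", "Low Impact"),
    }

    def describe(score: int) -> str:
        descriptor, band = DESCRIPTORS[score]
        return f"{score}: {descriptor} ({band})"

    # Assumed convention: final overall impact score = mean of panel scores x 10,
    # giving the 10-90 range shown on the later slides.
    panel_scores = [2, 3, 2, 4]   # hypothetical reviewer scores
    final_score = round(10 * sum(panel_scores) / len(panel_scores))
    print(describe(2), "| final overall impact score:", final_score)  # -> 28
    ```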

  • Scoring: Priority Scores of R01 and R21 Applications Reviewed by CSR

    [Chart: cumulative percent of applications by priority score, June 2008 (priority scores 100-400)]

    [Chart: cumulative percent of applications by priority score, June 2009 (priority scores 10-90)]

  • Scores of R01 and R21 Reviewed by CSR

    [Chart: cumulative percent of applications by overall impact score (10-90) for the Oct/Nov 2009, Feb/March 2010, and June/July 2010 meetings]

  • Order of Review

    Why?
    • Concern about variation of scores at different times during the meeting. The original plan was to recalibrate scores at the end of the meeting.

    Solution:
    • Recalibrate dynamically by discussing applications in order of the average preliminary scores from the assigned reviewers.

    Requirement:
    • Reviewers must participate in the entire meeting.
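    A minimal sketch of the "discuss in order of average preliminary score" rule described above; the application identifiers and score values are hypothetical.

    ```python
    # Hypothetical sketch: order applications for discussion by the average of the
    # assigned reviewers' preliminary overall impact scores (1 = best, 9 = worst).
    applications = {
        "R01-A": [2, 3, 2],
        "R01-B": [5, 4, 6],
        "R21-C": [3, 3, 4],
    }

    def mean(scores):
        return sum(scores) / len(scores)

    discussion_order = sorted(applications, key=lambda app: mean(applications[app]))
    for app in discussion_order:
        print(f"{app}: preliminary average {mean(applications[app]):.2f}")
    # Best-scored applications are discussed first, so scoring stays calibrated
    # throughout the meeting instead of being recalibrated at the end.
    ```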

  • Enhancing Peer Review

    10. Enhancing Peer Review Training

    • CSR and NIH review staff: 6 face-to-face training sessions in January 2009, 6 face-to-face training sessions in April 2009, and continuous updating

    • Chairs: 17 sessions in 2009; for chairs appointed in 2010, 9 sessions so far and 4 more planned

    • Reviewers: training material (PowerPoint slides, interactive training, frequently asked questions, mock study section video); senior CSR staff attended the first meetings in May-July 2009

  • Enhancing Peer Review

    11. Continuously Reviewing the Changes

    • 12/09 Applicant and Reviewers Survey (64% response)

    • 1/10 Advisory Council Survey (291 responses)

    • 5/11 Planned Survey on Shorter Applications

  • Key Findings from Reviewers and Councils

    • Councils have the necessary information to make decisions

    • Reviewers like the 9-point scoring scale

    • The overall impact score is NOT the average of the criterion scores

    • Approach is the most influential criterion score (it is the easiest to assess)

    • Clustering works: there is no difference in scoring between ESI and established investigators

    • New change: have reviewers write an overall impact paragraph

  • Changes in Peer Review

    If we want things to stay as they are, things will have to change

    The Leopard, Giuseppe Tomasi, Prince of Lampedusa