Systematic Review Processes, Teams, & Experiences
PF Anderson, University of Michigan. March 29, 2012


DESCRIPTION

A presentation given for the School of Information course SI 653: Evidence-Based Health Information Practice, taught by Tiffany Veinot.

TRANSCRIPT

Page 1: Systematic review: teams, processes, experiences

Systematic Review Processes, Teams,& Experiences

PF Anderson, University of Michigan, March 29, 2012

Page 2: Systematic review: teams, processes, experiences

● About Systematic Reviews
  ● Purpose
  ● Uses
  ● Role of the Librarian

● Process & Methodology
  ● FRIAR/SECT
  ● MEMORABLE
  ● Sentinels & MeSH
  ● Publishing
  ● Discovering/Sharing Strategies
  ● Data Abstraction / Extraction
  ● Reporting

Overview

Page 3: Systematic review: teams, processes, experiences

About Systematic Reviews

Page 4: Systematic review: teams, processes, experiences

●Evidence-based -> clinically integrated

●Systematic review -> research methodology

http://www.flickr.com/photos/rosefirerising/1175879764/in/set-72157604660150389/

Evidence-based or Systematic Review, What's the Difference?

Page 5: Systematic review: teams, processes, experiences

Clinical

Page 6: Systematic review: teams, processes, experiences

● Scientific & Unbiased:
  ● "A systematic review involves the application of scientific strategies, in ways that limit bias, to the assembly, critical appraisal, and synthesis of all relevant studies that address a specific clinical question."

● Summary:
  ● "A meta-analysis is a type of systematic review that uses statistical methods to combine and summarize the results of several primary studies."

● Clearly Reported:
  ● "Because the review process itself (like any other type of research) is subject to bias, a useful review requires clear reporting of information obtained using rigorous methods."

● Cook DJ, Mulrow CD, Haynes RB. Systematic Reviews: Synthesis of Best Evidence for Clinical Decisions. Annals of Internal Medicine 1997;126(5):376-380.

What is a Systematic Review?

Page 7: Systematic review: teams, processes, experiences

http://cochrane.org/consumers/docs/01Cochrane5min.ppt

Cochrane Collaboration

Page 8: Systematic review: teams, processes, experiences

● Clinical expert
  ● Initiates, defines, and selects the topic.

● Second clinical expert
  ● Partners in the above process and collaborates in the review to prevent bias.

● Statistician
  ● Provides methodological oversight; ensures process quality for the entire project.

● Librarian
  ● Provides methodological oversight; ensures process quality for the information search process.

● Healthcare consumer
  ● Provides insight into the priorities for research; acts as an information conduit for relating priorities and findings between consumers and clinicians.

Cochrane Review Teams

Page 9: Systematic review: teams, processes, experiences

●According to the ADA policy statement on EBD, the term "best evidence" "refers to information obtained from randomized controlled clinical trials, nonrandomized controlled clinical trials, cohort studies, case-control studies, crossover studies, cross-sectional studies, case studies or, in the absence of scientific evidence, the consensus opinion of experts in the appropriate fields of research or clinical practice. The strength of the evidence follows the order of the studies or opinions listed above.”

● Ismail AI, Bader JD. Evidence-based dentistry in clinical practice. JADA 2004;135(1):78-83.

What is Evidence-Based Healthcare?

Page 10: Systematic review: teams, processes, experiences

● Short version:
  ● "Make your [clinical] decisions based on the best evidence available, integrated with your clinical judgment. That's all it means. The best evidence, whatever that is."

● Paraphrased from Dr. Ismail in conversation, circa 2003.

What is Evidence-Based Healthcare?

Page 11: Systematic review: teams, processes, experiences

Levels of Evidence, in Context

Page 12: Systematic review: teams, processes, experiences

Process & Methodology

Page 13: Systematic review: teams, processes, experiences

● Team meets
  ● Define the topic, overview the literature base, suggest inclusion/exclusion criteria, discuss methodology & timeline.

● Librarian
  ● Generates data for the team (FRIAR/MEMORABLE/SECT); topic experts collaborate.

● Topic experts
  ● Review data at 3-4 levels (title, abstract, article, [request additional information]) and achieve consensus (see the sketch below).
  ● Handsearching (librarian generates the list, experts implement).
  ● Determine the level of evidence for the remaining research.
  ● Generate review tables.

● Share findings (publication)
  ● Strength of the evidence available (strong, weak, inadequate); suggest directions for future research to fill gaps in the research base.

Process & Methodology, Overview
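For the screening step, many teams track each reviewer's include/exclude calls per citation and flag disagreements for consensus discussion. Below is a minimal, illustrative Python sketch of that bookkeeping; the data, names, and layout are hypothetical, not a prescribed tool.

# Minimal sketch: flag title/abstract screening disagreements for consensus discussion.
# Each reviewer's calls are recorded per PMID as "include" or "exclude" (hypothetical data).

def needs_consensus(reviewer_a, reviewer_b):
    """Return the PMIDs on which the two independent reviewers disagree."""
    return sorted(pmid for pmid in reviewer_a
                  if pmid in reviewer_b and reviewer_a[pmid] != reviewer_b[pmid])

reviewer_a = {"17936128": "include", "10000001": "exclude", "10000002": "include"}
reviewer_b = {"17936128": "include", "10000001": "include", "10000002": "include"}

print(needs_consensus(reviewer_a, reviewer_b))   # ['10000001'] -> discuss at the next team meeting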

Page 14: Systematic review: teams, processes, experiences

Expectations of the Librarian Role

● Search strategy
  ● Background research on already-published similar search strategies and systematic reviews on related topics
  ● Suggest appropriate terms & concepts for review by the clinical experts
  ● Clear communication with the team about the relationship of the search to the question and methodology
  ● Sensitivity / specificity
  ● Validated, revised, adhering to standards / guidelines / best practices
  ● Publication-ready copy of the strategy
  ● Variant strategies for other databases

● Data set
  ● In an appropriate format
  ● Data set management support

● Methodology oversight
  ● Write / revise methodology & results as appropriate
  ● Assure replicability of the methods

Page 15: Systematic review: teams, processes, experiences

Review Search Terms

● Disease / clinical topic
  ● Anatomical area
  ● Earlier terms
  ● Alternate terms
  ● Related concepts

● Methodologies
● Investigative terms
● Publication types

Page 16: Systematic review: teams, processes, experiences

Special Issues

● Authorship
  ● Justifying co-authorship
  ● Criteria
    ● Avoiding plagiarism, self-plagiarism, and other questionable writing practices: A guide to ethical writing: http://ori.dhhs.gov/education/products/plagiarism/index.shtml
    ● Establishing authorship: http://ori.dhhs.gov/education/products/plagiarism/33.shtml
  ● Co-authorship versus fee-for-services-rendered (or both)
  ● When do you NOT want to be a co-author?

Page 17: Systematic review: teams, processes, experiences

● F – Frame
● R – Rank by Relevance
● I – Irrelevant Search Concepts
● A – Alternates/Aliases (Term Generation)
● R – Review, Revise, Repeat

● S – Search
● E – Evaluate
● C – Cite
● T – Test/Try Again

FRIAR/SECT

Page 18: Systematic review: teams, processes, experiences

● P = Patient
● I = Intervention
● C = Control group or comparison
  ● NOTE: In very small research domains, this portion may not be included. Such a systematic review would not reach clinical significance, but would focus on the levels of evidence available and directions for future research.
● O = Outcome desired
● T = Time factor

Frame = Question (PIO/PICO/PICOT)
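One way to see how the frame drives the search: each PICO(T) element becomes a concept group of synonyms, the synonyms are OR'd within a group, and the groups are AND'd together. A minimal Python sketch follows; the terms shown are illustrative placeholders, not a validated strategy.

# Illustrative only: assemble a boolean search string from a PICO-style frame.
pico = {
    "patient":      ['"temporomandibular joint disorders"', "TMJD", "TMD"],
    "intervention": ['"occlusal splints"', "splint*"],
    "outcome":      ['"pain measurement"', '"treatment outcome"'],
}

def to_boolean(frame):
    """OR the synonyms within each concept group, then AND the groups together."""
    groups = ["(" + " OR ".join(terms) + ")" for terms in frame.values()]
    return " AND ".join(groups)

print(to_boolean(pico))
# ("temporomandibular joint disorders" OR TMJD OR TMD) AND ("occlusal splints" OR splint*) AND ...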

Page 19: Systematic review: teams, processes, experiences

● Searches usually have 2-4 topics that are relevant and searchable.
● Rank the relevant, searchable topics by importance.
● TIP: Structure the search to place the greatest attention, focus, and richness of term generation on the most important topic.

R = Relevance

Page 20: Systematic review: teams, processes, experiences

● Too broad
● Too specific
● Unsearchable
● NOTE: "Irrelevant" topics are not irrelevant to the question or clinical decision, and are important to making the final decisions; they are irrelevant only to the search process.

I = Irrelevant

Page 21: Systematic review: teams, processes, experiences

● The term generation process might include:
  ● Alternate terms, spellings (UK), archaic terms
  ● Acronyms & what they stand for
  ● Anatomical area, symptoms, diagnostic criteria
  ● Products, chemicals, microorganisms, registry numbers, etc.
● NOTE: After asking the question, this is the most important part of the process.
● TIP: Have the team brainstorm terms, then search for more, then have the team review the added terms.

A = Alternates/Aliases

Page 22: Systematic review: teams, processes, experiences

● Did the search retrieve what it should? (True positives)
  ● If not, why not? (See the sketch below.)

● Did the search retrieve what it shouldn't? (False positives)
  ● Are there patterns that would allow you to exclude the false positives without losing the ones you want?

R = Review, Revise, Repeat
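A quick way to run the true-positive check is to compare the PMIDs retrieved by the draft strategy against the team's sentinel PMIDs. A minimal sketch, assuming both lists have already been exported; all PMIDs and names below are hypothetical.

# Minimal sketch: did the draft search retrieve every sentinel article?
def check_sentinels(retrieved_pmids, sentinel_pmids):
    retrieved = set(retrieved_pmids)
    missed = [pmid for pmid in sentinel_pmids if pmid not in retrieved]
    recall = 1 - len(missed) / len(sentinel_pmids)
    return recall, missed

retrieved = ["17936128", "10000001", "10000002"]   # export from the draft strategy
sentinels = ["17936128", "10000003"]               # team-selected sentinel articles

recall, missed = check_sentinels(retrieved, sentinels)
print(f"sentinel recall: {recall:.0%}; missed: {missed}")
# Any missed sentinel -> examine its indexing, revise the strategy, and repeat.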

Page 23: Systematic review: teams, processes, experiences

MEMORABLE, A Medline Search Strategy Development Tool

Page 24: Systematic review: teams, processes, experiences

● Number of sentinels desired: 3-5. You can have more or fewer, but this range tends to work best. Verify the appropriateness of the selected sentinels.
● Neither very recent (current year) nor old (before 1985)
  ● Articles old enough to have MeSH assigned, new enough to have complete indexing
  ● On topic, not broader or narrower
  ● Well indexed with appropriate terms
● Representative of the citations that would be retrieved by a well-done search
● Remember: the purpose is validating the search, not proving you know the best articles on the topic
● Each sentinel article must represent ALL desired concepts in the search
  ● Articles selected must meet all inclusion and exclusion criteria.

Sentinel Articles

Page 25: Systematic review: teams, processes, experiences

● Earlier term mappings, prior to the assignment of a MeSH term, are often:
  ● the presenting symptom or diagnosis
  ● the anatomical area

● TMJD example:
  ● TMJD = temporomandibular joint disorder
  ● = (Temporomandibular Joint [anatomical area]) + ("myofascial pain" OR "Bone Diseases") [presenting symptom OR diagnosis]

MeSH Tip

Image: Frank Gaillard. Normal anatomy of the Temporomandibular joint. 14 Jan 2009. http://commons.wikimedia.org/wiki/File:Temporomandibular_joint.png

Page 26: Systematic review: teams, processes, experiences

● Verify the sentinel citations in MEDLINE
● Save a file with the full citations (abstract, MeSH headings, everything)
● Make a duplicate file to process

MeSH & Sentinels 1 & 2

● Process the citations (see the sketch below):
  ● strip out everything except the MeSH terms
  ● sort alphabetically
  ● strip irrelevant MeSH terms
  ● dedupe
  ● note the frequency of repetition of terms, to aid in determining the weight of each term
  ● note the distribution and frequency of subheadings
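These processing steps can be done by hand in a word processor, or with a short script. A minimal Python sketch, assuming the sentinel citations were saved in MEDLINE format, where each MeSH heading sits on a line beginning with "MH  - "; the file name is hypothetical.

# Minimal sketch: tally MeSH headings across sentinel citations saved in MEDLINE format.
from collections import Counter

def mesh_frequencies(medline_text):
    """Count how often each MeSH heading (MH lines) appears across the sentinel set."""
    counts = Counter()
    for line in medline_text.splitlines():
        if line.startswith("MH  - "):
            heading = line[6:].strip().lstrip("*")   # '*' marks a major topic
            term = heading.split("/")[0]             # set subheadings aside for the core tally
            counts[term] += 1
    return counts

with open("sentinels.txt") as f:                     # hypothetical MEDLINE-format export
    for term, n in mesh_frequencies(f.read()).most_common():
        print(n, term)   # frequently repeated terms carry more weight in the draft strategy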

Page 27: Systematic review: teams, processes, experiences

● Analyse the MeSH terms:
  ● retain topical terms
  ● retain methodology terms
  ● retain non-MeSH terms such as publication types and registry numbers (but keep them separate from the core concept terms)

MeSH & Sentinels 3

Page 28: Systematic review: teams, processes, experiences

●Torabinejad M, Anderson P, Bader J, Brown LJ, Chen LH, Goodacre CJ, Kattadiyil MT, Kutsenko D, Lozada J, Patel R, Petersen F, Puterman I, White SN. Outcomes of root canal treatment and restoration, implant-supported single crowns, fixed partial dentures, and extraction without replacement: a systematic review. J Prosthet Dent. 2007 Oct;98(4):285-311. PMID: 17936128

Systematic Review Search Strategy Example

Page 29: Systematic review: teams, processes, experiences

● Lingo: filters vs. hedges
● Search the Methods sections of existing systematic reviews.
● Warning:
  ● Many articles published as systematic reviews may have modified the process.
  ● Many articles published as systematic reviews may not include a replicable search methodology.
  ● Some articles published as systematic reviews may not actually be systematic reviews.

Sources of Search Strategies

Page 30: Systematic review: teams, processes, experiences

● NIH Consensus Development Conference on Dental Caries:
  ● http://www.lib.umich.edu/health-sciences-libraries/nih-consensus-development-conference-diagnosis-and-management-dental-carie
● EBHC Strategies Wiki:
  ● http://ebhcstrategies.wetpaint.com/

Sources of Search Strategies

Page 31: Systematic review: teams, processes, experiences

Data Abstraction / Extraction

● "Symbolic expression always arises through abstraction and simplification, and can only reflect a small part of reality."
  ● Cajal, Advice to a Young Investigator, p. 76

Page 32: Systematic review: teams, processes, experiences

Cochrane: Data Extraction

Page 33: Systematic review: teams, processes, experiences

http://www.aota.org/DocumentVault/AJOT/Template.aspx?FT=.pdf

Evidence Table Example

● Levels of evidence
● Participant characteristics
● Study characteristics
● Intervention and outcome measurements
● Results
● Study limitations
● Inclusion/Exclusion criteria
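If the evidence table is built in a script or database rather than a word-processor template, the same elements can be captured as one record per study. A minimal Python sketch; the field names and sample values are illustrative, not a Cochrane or AOTA form.

# Illustrative evidence-table row; the fields mirror the elements listed above.
from dataclasses import dataclass

@dataclass
class EvidenceRow:
    citation: str
    level_of_evidence: str        # e.g. "I" ... "V", per the scale the team adopts
    participants: str             # participant characteristics
    study_design: str             # study characteristics
    intervention_outcomes: str    # intervention and outcome measurements
    results: str
    limitations: str
    meets_criteria: bool          # passed the review's inclusion/exclusion criteria

row = EvidenceRow(
    citation="Author A, et al. Journal 2010;1(1):1-10",   # placeholder citation
    level_of_evidence="II",
    participants="adults, n=120",
    study_design="prospective cohort, 5-year follow-up",
    intervention_outcomes="intervention X; primary outcome Y",
    results="summary of findings goes here",
    limitations="small sample; single site",
    meets_criteria=True,
)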

Page 34: Systematic review: teams, processes, experiences

Data abstraction / extraction samples

● Cochrane:
  ● Forms: http://endoc.cochrane.org/data-extraction-sheets
  ● Elements: http://www.cochrane.org/handbook/table-73a-checklist-items-consider-data-collection-or-data-extraction
  ● Cochrane CFGD November 2004 * (DOC): http://cfgd.cochrane.org/sites/cfgd.cochrane.org/files/uploads/Study%20selection%20and%20%20extraction%20form.doc
  ● Data Extraction Template, 2011 (XLS): http://www.latrobe.edu.au/chcp/assets/downloads/DET_2011.xls
  ● Overview: http://www.cochrane-pro-mg.org/Documents/Reviewsheet-1-4-08.pdf

● Other
  ● CDC Data Abstraction Form *: http://www.nccmt.ca/uploads/registry/CDC%20Tool.pdf
  ● Social science example (suicide): http://www.cmaj.ca/content/suppl/2009/01/29/180.3.291.DC2/ssri-barbui-3-at.pdf

Page 35: Systematic review: teams, processes, experiences

http://www.prisma-statement.org/2.1.4%20-%20PRISMA%20Flow%202009%20Diagram.pdf

Clearly Stating the Evidence (PRISMA)

NOTE: PRISMA-P coming later this year
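The PRISMA flow diagram is largely bookkeeping: records identified, duplicates removed, records screened and excluded, full texts assessed and excluded (with reasons), and studies included. A minimal Python sketch of that arithmetic; every number below is invented for illustration.

# Illustrative PRISMA 2009 flow arithmetic; all counts are made up.
identified         = 1250    # records from all database searches plus handsearching
after_dedup        = 980     # after duplicates removed
excluded_title_abs = 850     # excluded at title/abstract screening

fulltext_assessed  = after_dedup - excluded_title_abs                      # 130
fulltext_excluded  = {"wrong population": 40, "not an RCT": 55, "no usable outcome": 20}
included           = fulltext_assessed - sum(fulltext_excluded.values())   # 15

print(f"screened: {after_dedup}, full text assessed: {fulltext_assessed}, included: {included}")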

Page 36: Systematic review: teams, processes, experiences

● CONSORT (Consolidated Standards of Reporting Trials) 2001, 2010
● ASSERT (A Standard for the Scientific & Ethical Review of Trials) 2001
  ○ EQUATOR (Enhancing the QUAlity & Transparency of health Research) 2008
  ○ SPIRIT (Standard Protocol Items for Randomized Trials) 2008
● QUOROM (Quality of Reporting of Meta-analyses) 1999, 2009
● MOOSE (Meta-analysis of Observational Studies in Epidemiology) 2000
● STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) 2007; STROBE-ME 2011
● Others: CENT, COREQ, GNOSIS, ORION, RECORD, REFLECT, REMARK, REMINDE, SQUIRE, STARD, STREGA, TREND, WIDER, ...

Guidelines for Assessing/Reporting Results

Page 37: Systematic review: teams, processes, experiences

● CONSORT 2001 → 2010 → PRISMA
● QUOROM 1999 → 2009 → PRISMA
● ASSERT 2001 → SPIRIT 2008 → EQUATOR 2008
● STROBE 2007 → STROBE-ME 2011

● More on fragmentation & history:
  ● Vandenbroucke JP. Journal of Clinical Epidemiology 2009;62:594-596. http://www.veteditors.org/Publication%20Guidelines/JOURNAL%20VARIATION%20on%20Guidelines%202009.pdf

● CURRENT LIST: http://www.equator-network.org/resource-centre/library-of-health-research-reporting/

Evolution of Guidelines for Assessing/Reporting Results

Page 38: Systematic review: teams, processes, experiences

● Most common?
  ● Underestimating time/labor requirements
  ● First-time findings: "insufficient evidence"

Standard Team/Process vs. Reality: Case Studies

● Elements
  ● Team: size, configuration
  ● Domain size
  ● Librarian role, #
  ● Clerical support
  ● Software / tech support
  ● Handsearching / experts
  ● Outcomes

● Projects & Examples
  ● NIH Consensus Development Conference on Dental Caries
  ● Torabinejad: Root Canal Treatment
  ● Cochrane: Herpes Simplex (Oral Health, Skin)
  ● Delta Dental
  ● Diabetes
  ● Articaine vs. Lidocaine
  ● Healthcare Social Media

Page 39: Systematic review: teams, processes, experiences

● Slides at:
  ● http://slideshare.net/umhealthscienceslibraries/

● Contact:
  ● [email protected]