
Missing Links in Evidence-Based Practice for Macro Social Work

Richard Hoefer
Catheleen Jordan

ABSTRACT. The paradigm of evidence-based practice includes a process for searching, appraising, and synthesizing evidence to answer a question. Two important elements related to macro social work practice must be addressed in this process. One missing link relates to the imperative to foster client self-determination and empowerment by allowing clients to choose among equally salient interventions. The second missing link is to ensure that the intervention is implemented with fidelity to the original model. This article describes the need for incorporating these elements in evidence-based macro practice. Guidelines are provided for implementing the elements and their implications for practice, research, and policy.

KEYWORDS. Client self-determination, evidence-based practice, fidelity, implementation, macro social work practice

Richard Hoefer, PhD, and Catheleen Jordan, PhD, are professors in the School of Social Work, University of Texas at Arlington.

Address correspondence to Richard Hoefer, School of Social Work, University of Texas at Arlington, Arlington, TX 76019 (E-mail: [email protected]).

Journal of Evidence-Based Social Work, Vol. 5(3-4) 2008
http://www.haworthpress.com/web/JEBSW
© 2008 by The Haworth Press, Inc. All rights reserved.
doi: 10.1080/15433710802084292

INTRODUCTION

Attempting to incorporate the paradigm of evidence-based practice (EBP) in macro social work practice is a challenge. Since the concept of EBP is fairly new in the social work profession, it has not been fully accepted in macro practice (or even in micro practice, where the idea was first imported from medicine) and, problematically, the process of appraising and using evidence has not been adjusted for the unique elements of macro practice. In fact, as recently as 2004, McNeece and Thyer stated, "Relatively little has been written about evidence-based macro practice" (p. 15).

This article defines EBP and refines the approach by incorporating two important elements that can become missing links. The first link relates to client self-determination and empowerment. The second concerns a significant but often ignored process of assessing the fidelity of the intervention in its implementation, which must be conducted before evaluating outcomes. The process of EBP is amended for application to macro social work practice. Techniques for assessing implementation fidelity are described. Implications for practice, research, and policy are also discussed.

RE-DEFINING THE PROCESS OF EVIDENCE-BASED PRACTICE

Gambrill has written extensively defining evidence-based practice in social work (e.g., see Gambrill, 1990, 1999; Gibbs & Gambrill, 2002). According to her formulation, EBP is not an end state where one can be an evidence-based practitioner simply based on information one has amassed. Sackett, Richardson, Rosenberg, and Haynes (1997) describe EBP as a problem-solving process consisting of five steps:

1. Convert information needs into answerable questions.
2. Track down, with maximum efficiency, the best evidence with which to answer these questions.
3. Critically appraise the evidence for its validity and usefulness.
4. Apply the results of this appraisal to policy/practice decisions.
5. Evaluate the outcome (p. 3).

Missing Link Number 1: Efficacy of Intervention

According to McNeece and Thyer (2004), EBP also involves providing the client (whether a person, a community, or an organization) with appropriate information about the efficacy of different interventions and allowing the client to make the final decision. This step becomes Step 4 in the process of evidence-based macro practice as depicted in Table 1. This insight is important for macro social workers to consider in using this practice paradigm. Allowing client communities and organizations to choose from among the best options available ensures that client self-determination is respected. Because self-determination and empowerment are core values within the National Association of Social Workers' (1999) Code of Ethics, promoting them within macro practice is essential. When social workers are able to address these ethical hallmarks by providing information about macro interventions that are not only well suited for the situation but also have been shown to be efficacious, many benefits may accrue. Among these are procedural benefits (greater community and organizational backing for the proposed program or practice, more support for overcoming any initial difficulties that may arise, and a greater willingness to take responsibility for institutionalizing the approach) and substantive benefits (a greater likelihood of positive results for the community or organization and the people who participate in them from the use of an empirically supported intervention).

TABLE 1. Evidence-Based Macro Practice Process

Step 1 Convert information needs into a relevant question for practice in a community and/or organizational context.
Step 2 Track down with maximum efficiency the best evidence to answer the question.
Step 3 Critically appraise the evidence for its validity and usefulness.
Step 4 Provide clients with appropriate information about the efficacy of different interventions and collaborate with them in making the final decision in selecting the best practice.
Step 5 Apply the results of this appraisal in making policy/practice decisions that affect organizational and/or community change.
Step 6 Assess the fidelity of implementation of the macro practice intervention.
Step 7 Evaluate service outcomes from implementing the best practice.

Note. Adapted from Sackett et al. (1997).
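To make the sequencing concrete, the following minimal sketch (ours, not the authors'; all names are hypothetical) represents the Table 1 process in Python as an ordered checklist, so that a team can verify that earlier steps, such as fidelity assessment (Step 6), are complete before outcome evaluation (Step 7) begins.

```python
# A minimal sketch (not from the article) of the Table 1 process as an
# ordered checklist. It reports the next uncompleted step, enforcing the
# order of the seven steps. Step names paraphrase Table 1.

EBP_MACRO_STEPS = [
    "Convert information needs into a relevant practice question",
    "Track down the best evidence to answer the question",
    "Critically appraise the evidence for validity and usefulness",
    "Give clients efficacy information and collaborate on selection",
    "Apply the appraisal to policy/practice decisions",
    "Assess the fidelity of implementation",
    "Evaluate service outcomes",
]

def next_step(completed: set[int]) -> str:
    """Return the first uncompleted step (1-indexed), or a done message."""
    for number, name in enumerate(EBP_MACRO_STEPS, start=1):
        if number not in completed:
            return f"Step {number}: {name}"
    return "All steps complete"

print(next_step({1, 2, 3}))  # -> Step 4: Give clients efficacy information ...
```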

Missing Link Number 2: Fidelity of Implementation

Gibbs and Gambrill (2002) make a heroic assumption that the intervention (whether at a micro or macro level) is implemented completely and properly. Once knowledge has been applied, they believe that it will stay applied, completely, properly, and consistently, so that one may then easily evaluate the intervention for the outcomes that are produced. This, however, is not necessarily the case. The problem of a lack of implementation, or of implementing an evidence-based practice only partially, while unfortunate in micro social work, is perhaps even more problematic in macro practice. The second missing link becomes Step 6 in the process of implementing an evidence-based macro practice intervention, as shown in Table 1.

Complete and accurate implementation is not usually the case in real life. We want to avoid what has been called a Type III error: evaluating a program for its outcomes when that program has not been implemented properly (Hoefer, 1994). Pressman and Wildavsky (1973) noted over three decades ago that programs launched with the best of intentions frequently fail to live up to expectations because they were not implemented as planned. Choosing an empirically validated program because it has been shown to work well and produce positive effects on clients is consistent with EBP at the macro level. But expecting the program to achieve those effects after it has been only partially or poorly implemented is overly optimistic and may produce negative outcomes suggesting that the program does not really work. Verifying that an intervention has been implemented as designed is crucial to evaluating the effectiveness of a macro practice intervention.

WHAT ARE EVIDENCE-BASED INTERVENTIONS?

Evidence-based practice consists of defining a problem, finding the best possible answer to that problem (based on research and client preferences), operationalizing that answer, assessing whether the answer was actually put into effect, and then evaluating how well that answer worked in fixing the problem it was supposed to fix.

Evidence-based interventions are those "answers to problems" that have been shown to work through, whenever possible, some scientific process. When scientific research is not available or has not been conducted, other sorts of evidence to support the intervention as being efficacious may be used.

In this paradigm, not all evidence is equal, so the use of EBP implies being able to evaluate the research base put forward to support each possible intervention. McNeece and Thyer (2004, p. 10) indicate the following forms of evidence and their strength, from strongest to weakest (a short sketch applying this ordering follows the list):

• Systematic reviews/meta-analyses
• Randomized controlled trials
• Quasi-experimental studies
• Case-control and cohort studies
• Pre-experimental group studies
• Surveys
• Qualitative studies
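As one illustration of how this hierarchy might be applied in Step 3, the short Python sketch below (our own, not a tool from the article; the study descriptions are hypothetical) ranks a set of retrieved sources so the strongest evidence is appraised first.

```python
# A minimal sketch of using the McNeece and Thyer (2004) hierarchy to
# order evidence found in Step 2 so that the strongest designs are
# appraised first. Lower rank = stronger evidence.

EVIDENCE_RANK = {
    "systematic review/meta-analysis": 1,
    "randomized controlled trial": 2,
    "quasi-experimental study": 3,
    "case-control/cohort study": 4,
    "pre-experimental group study": 5,
    "survey": 6,
    "qualitative study": 7,
}

# Hypothetical search results: (short description, study design).
found = [
    ("community coalition member survey", "survey"),
    ("meta-analysis of case management models", "systematic review/meta-analysis"),
    ("agency-matched comparison of two programs", "quasi-experimental study"),
]

for description, design in sorted(found, key=lambda s: EVIDENCE_RANK[s[1]]):
    print(f"rank {EVIDENCE_RANK[design]}: {description} ({design})")
```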

Still, even if an intervention has credible scientific evidence to support its efficacy (at whatever level of social work we wish to examine), if the intervention is not actually implemented completely and correctly, we cannot suggest that the intervention as designed is being implemented. Carrilio (2006) argues that examining program implementation is important in understanding program effectiveness. Thus, choosing an "evidence-based intervention," in and of itself, should not be enough for any practitioner. The intervention model must be put through implementation fidelity testing so that the practitioner can be assured that the techniques of the model are being applied correctly.

Intervention planning for a particular locale may include modifications of non-essential elements of the program to have it better fit the situation in that locale. The Centers for Disease Control (CDC), for example, have information on a number of different programs regarding HIV/AIDS prevention efforts. Each program described has core elements that are required and may not be altered. Core elements are based on behavioral theory and "are thought to be responsible for the intervention's effectiveness" (Centers for Disease Control, 2006, April, p. 15). The CDC (p. 16) suggests that "key characteristics" of an intervention can be altered to fit the organization and clients. CDC guidelines include procedures to follow during implementation as well as resource requirements. The guidelines promulgated by the CDC are helpful in showing the importance of having a clear understanding of what a community-based intervention is, what is vital to replicate, and what is adaptable.

The Substance Abuse and Mental Health Services Administration (SAMHSA) within the federal government has taken a few additional steps in specifying the need for implementation monitoring. It uses the term "fidelity assessment" to describe this process. According to SAMHSA (2006):


Fidelity is the degree to which a specific implementation of a program or practice resembles, adheres to, or is faithful to the evidence-based model on which it is based. Fidelity is formally assessed using rating scales of the major elements of the evidence-based model.
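A minimal sketch of what such a formal assessment could look like follows; the model elements, ratings, and corrective-action cutoff are hypothetical illustrations of the idea, not SAMHSA's instrument.

```python
# A minimal sketch of a formal fidelity assessment in the SAMHSA sense:
# each major element of the evidence-based model is rated on a 5-point
# scale and the ratings are summarized. Elements, scores, and the
# follow-up threshold are hypothetical.

ratings = {  # 1 = element absent ... 5 = fully faithful to the model
    "core curriculum delivered as specified": 5,
    "required staffing ratio maintained": 3,
    "weekly model-specific supervision held": 4,
    "intended target population reached": 2,
}

mean_rating = sum(ratings.values()) / len(ratings)
print(f"Mean fidelity rating: {mean_rating:.2f} / 5")

for element, score in ratings.items():
    if score <= 2:  # hypothetical threshold for corrective action
        print(f"Flag for corrective action: {element} (rated {score})")
```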

CAN MACRO PRACTICE BE EVIDENCE-BASED?

The most vexing issue for EBP in the macro arena is the extent to which community and administrative practice can be theoretically based, tested, generalized, manualized, implemented, and declared successful. We suggest the most pressing need for social work macro practice is to more clearly specify what its interventions actually are. This specification process requires developing a model, either deductively or inductively, to address a macro issue, propose a course of action, and specify tasks to be completed in implementing the model.

Netting, Kettner, and McMurtry (2004) define macro practice as "professionally guided intervention designed to bring about planned change in organizations and communities" (p. 4). In order for macro social work practice to be evidence-based, then, interventions must be systematically developed from theory, tested in practice, revised as needed for greater effectiveness and for different conditions, standardized as much as possible, and communicated to others.

In theory, macro social work practice can be evidence-based as much as micro social work can be (Thyer, 2001), simply by following the steps illustrated in Table 1. Still, many arguments have been advanced as to why evidence-based macro practice is not possible or is unwise. Hewing closely to the logic of Gibbs and Gambrill (2002), let us look at these objections and how they may be countered.

ARGUMENTS AGAINST AND COUNTER-ARGUMENTS FOR EVIDENCE-BASED MACRO PRACTICE

EBP Ideas Only Apply to Clinical Practice, Not Macro Practice

Some may argue that because EBP was initially implemented in clinical medical settings, it is difficult or impossible to have the concepts apply to macro social work with a focus on communities and organizations. The argument is that clinical social work concepts just don't apply to administration and community practice. We consider this a misunderstanding of what EBP is. To say that one cannot apply the process involved in EBP because of its origin in a clinical setting is akin to saying that one cannot become a community organizer or administrator if one has begun a career in clinical social work. With some re-orientation and effort, we believe the concepts of EBP can successfully be transferred to macro social work practice.

There is Not Enough Credible or Useful Evidence to Support the Use of EBP in Macro Practice

This may be the most popular critique of EBP in general, not just in the macro social work arena. We must recall an important point about macro practice, though: "In today's world, macro practice is rarely the domain of one profession. Rather, it involves the skills of many disciplines and professionals in interaction" (Netting et al., 2004, p. 7). We must therefore remember to search for evidence beyond the social work literature. Considerable evidence is available in related disciplines and fields such as sociology, political science, community psychology, public health, nonprofit administration, business administration, public administration, and so on, in both American and other sources. Of course, this evidence has to be evaluated and perhaps adapted to fit the social work context, but that is all part of the process of EBP. Even if there is little or no research-based evidence with higher levels of credibility, evidence that is based on practice wisdom or anecdote is still considered better than a random guess. Also, literature exists in other countries where social work takes on a more macro orientation; see, for example, the Swedish Institute for Evidence-Based Social Work Practice (English version home page at http://www.socialstyrelsen.se/en/about/IMS/index.htm).

EBP Ignores the Practice Wisdom of Macro Practice

Because so much of what happens in macro practice seems idiosyncratic to the situation, generalizations are few and far between. This argument is similar to the previous one. Practice wisdom is not ignored, but it is relegated to a lower status of evidence when assessing what intervention should be chosen.


EBP Ignores the Values of the Client Community or Organization by Being One Size Fits All

This argument completely ignores the process used to choose an intervention as described above. The practice principle emphasized by McNeece and Thyer (2004) allows the client in the context of a community or organization to choose the intervention when all the information is collected, appraised, and synthesized. EBP clearly does not ignore the values of the client system in macro practice.

DO ANY MACRO INTERVENTIONS FIT THE EBP CRITERIA?

Given macro social work's need for clearly delineated interventions, we would like to assure those interested in using EBP in the macro arena that numerous examples of clearly specified community and administrative interventions exist. Assessing how well other settings have implemented these programs is made much easier because of the programs' detailed specifications.

Community Practice Examples

Ohmer and Korr (2006) reviewed research to assess the level of evidence available to guide community practice. Their review, covering 269 articles published over a 16-year period, included nine that used some type of experimental controls in the research. The Centers for Disease Control (2006) provide a dozen examples of programs that have been implemented to diffuse effective behavioral interventions to prevent additional HIV infections. Each of these programs is presented with a clear differentiation between core elements, key characteristics, procedures, resource requirements, policies and standards, quality assurance plans, and monitoring and evaluating efforts.

This same document by the CDC discusses other activities, services, and strategies that are supported by research. Examples of these strategies include comprehensive risk counseling and services; HIV counseling, testing, and referral; and incorporating HIV prevention into the medical care of persons living with HIV.

The Substance Abuse and Mental Health Services Administration (2006) also has considerable information available describing programs and practices that have been researched and found effective. It disseminates this information through its National Registry of Evidence-based Programs and Practices (NREPP), available at www.nrepp.samhsa.gov. All programs in this registry are considered "evidence-based," meaning that they are conceptually sound and internally consistent, have program activities that are related to conceptualization, and have been reasonably well implemented and evaluated. The programs on the registry are divided into three categories, depending on the strength of the research that supports them: Promising (some positive outcomes), Effective (consistently positive outcomes, strongly implemented and evaluated), and Model ("effective" programs available for dissemination and with technical assistance available from the program developers).

Administrative Practice Examples

One of the more useful articles in the area of evidence-based interventions is an analytic literature review of the topic. Many of the social work texts related to administrative practice do not appear to be evidence-based. These authority-based information products are primarily descriptive and prescriptive, a failure on the part of their authors.

The lack of evidence-based practices in social work texts is not due to a paucity of research. For example, a considerable amount of interesting research on nonprofit organization effectiveness and leadership was conducted in the mid to late 1990s by Herman and Renz (1998, 1999) and others, such as Rojas (2000) and Smith (1999). Little of this has filtered into social work management texts but could be incorporated into research and practice with only minor translation (at most) into social work language.

In another area of administrative practice, that of the organization of case management services, Ziguras and Stuart (2000) conducted a meta-analysis of the effectiveness of different types of case management. Based on their analysis of 44 studies, they argue that Assertive Case Management has some demonstrable advantages over clinical case management in terms of positive outcomes for clients, although both types of case management were stronger than the "usual" treatment approach. This evidence, if disseminated properly within social work, should encourage macro social workers to abandon "usual" treatment approaches for evidence-based practices and interventions.


Policy Practice Examples

In this case, an evidence-based macro worker would be looking less for details on the program being implemented and more for evidence on the process of trying to affect policy. Political science books authored by Birkland (2005), Gerston (1997), and Sabatier (1999) present analyses of how best to affect public policy. The leading proponent of policy as a practice field in social work is Jansson (2002), who has consistently incorporated recent research from political science and other fields into his policy practice-related textbooks. Hoefer (2006), in explaining how to conduct advocacy practice, relies heavily on empirical research to make recommendations on how best to conduct advocacy. One particular empirically based area of focus within the book is how to be influential, a topic that draws heavily on the work of Cialdini (2000). Hoefer explores communications theory and other empirically supported research as they relate to advocacy and policy practice.

Hoefer also has a series of journal articles in which he addresses questions regarding the effectiveness of interest groups trying to affect social policy (Hoefer, 2000, 2001, 2002, 2005; Hoefer & Ferguson, 2007). His accumulated findings provide empirical support for interest in the paradigm of evidence-based practice.

Beyond what is described here, several web-based sources can also be consulted:

• The Campbell Collaboration, particularly what they call their C2-RIPE (Campbell Collaboration Reviews of Interventions and Policy Evaluations) social welfare database, which had 41 listings as of October 13, 2006, at www.campbellcollaboration.org.
• The Centers for Disease Control Diffusion of Effective Behavioral Interventions (DEBI) project, which can be found at www.effectiveinterventions.org.
• The Centers for Disease Control Replicating Effective Programs (REP) project, available at www.cdc.gov/hiv/projects/rep/default.htm.
• The Coalition for Evidence-Based Social Policy (social programs that work), which can be found at www.evidencebasedprograms.org.
• The Cochrane Collaboration, which focuses on health care, including mental health, available at www.cochrane.org.
• Columbia University's Evidence-Based Practice and Policy Online Resource Training Center, available at http://www.columbia.edu/cu/musher/Website/Website/EBP_Resources_WebEBPP.htm.
• The Swedish Institute for Evidence-Based Social Work Practice compilation of effective practices in the areas of substance abuse, child and adolescent welfare, economic aid (social assistance), ethnicity, migration, and social work theory and practice of evaluation, available at http://www.socialstyrelsen.se/en/about/IMS/index.htm.

PROCESSES FOR EVALUATING IMPLEMENTATION

Recognizing that there is a need for assessing or evaluating the implementation process still leaves us with the question of how to conduct that assessment. Werner (2004) describes some of the challenges in designing and conducting a study to document the implementation of a program:

1. Developing an initial idea of what to observe, whom to interview, and what to ask;
2. Sorting through conflicting or even contradictory descriptions and assessments of a program;
3. Dealing with variations over time and across program sites; and
4. Combining quantitative and qualitative data in cogent and useful ways (p. 8).

Qualitative data are important to the implementation fidelity process and may be collected by interviewing program participants: staff, consumers, and other stakeholders. Questions that are appropriate for implementation research include the following:

• Are program goals, concepts, and design based on sound theory and practice? If not, in what respects are they not?
• What types and levels of resources are needed to implement the program as planned?
• Are the resources needed in place? If not, how and why not?
• Are program processes and systems operating as planned? If not, how and why?
• Is the program reaching the intended target population with the appropriate services at the planned rate and dosage? If not, in what respects is it not, and why? (adapted from Werner, 2004, pp. 15-19)

Page 12: Missing Links in Evidence-Based Practice for Macro Social Work Richard Hoefer Catheleen Jordan

560 JOURNAL OF EVIDENCE-BASED SOCIAL WORK

Qualitative data from such questions may then be verified by collecting and analyzing quantitative data, as described in the next section.

TECHNIQUES OF ASSESSING IMPLEMENTATION FIDELITY

Whenever possible, it is preferable to use a pre-developed implementation fidelity measure that might accompany a particular evidence-based intervention. Using a measure that has already been developed will ensure that you are assessing the implementation of the most important elements of the program and will allow you to compare the state of implementation to that of other program users. This will also allow your data to be added to the literature because of its comparability with other implementation fidelity studies.

If this is not possible, however, you can develop your own version of an implementation fidelity instrument. Bond et al. (2000) provide a 14-step approach, as depicted in Table 2, to develop an implementation fidelity scale for a program that does not have a ready-made or standardized fidelity measure. We will briefly describe this approach. In some organizations, many of the steps may require outside consultant assistance.

TABLE 2. Steps for Developing a Fidelity Measure

Stage 1
Step 1 Define the purpose of the fidelity scale
Step 2 Assess the degree of model development
Step 3 Identify model dimensions
Step 4 Determine if appropriate fidelity scales already exist

Stage 2
Step 5 Formulate fidelity scale plan
Step 6 Develop items
Step 7 Develop response scale points
Step 8 Choose data collection sources and methods
Step 9 Determine item order
Step 10 Develop data collection protocol
Step 11 Train interviewers/raters

Stage 3
Step 12 Pilot the scale
Step 13 Assess psychometric properties
Step 14 Determine scoring and weighting of items

Source. Bond et al. (2000).

Stage 1: Preparing for Scale Development

Step 1: Define the Purpose of the Fidelity Scale. Depending on the purpose of the research effort, the scale will be either more detailed or less, more expensive to use or less, and so on.

Step 2: Assess the Degree of Model Development. If the model is well developed (that is, it is clear what is supposed to be done, when, and with whom), one can use more detailed and quantitative research techniques. If the model is not well developed, it will be more important to assess the implementation of the program more heuristically and qualitatively.

Step 3: Identify Model Dimensions. Assuming a well-specified model, one must then identify which elements of the model are the most important and differentiate it from other approaches to helping clients. These elements can be determined by experts in the use of the model or, more loosely, by people on your staff who are well versed or trained in the model.

Step 4: Determine if Appropriate Fidelity Scales Already Exist. This step consists of answering three questions: How closely is the staff trying to replicate an existing program model? How well defined is that model? And how adequate are existing fidelity instruments?

If there is a well-defined model with an adequate instrument, one should use the pre-existing measure. If the situation includes not replicating an existing model, if the model being replicated is not well defined, or if the existing instrument is not well constructed, proceed to Step 5.

Stage 2: Scale Development

Step 5: Formulate Fidelity Scale Plan. According to Bond et al. (2000, p. 48), "A scale plan states the model dimensions, gives definitions, and outlines the number and possible content of items required to tap into those definitions." The plan is considered the road map to guide the instrument's creation; it ensures that items will be consistent with the model and that no elements will be left out, and it serves as a way for others to check whether the fidelity instrument is appropriate for them.


Step 6: Develop Items. Items must be constructed that follow these principles (Bond et al., 2000, p. 51):

• Items should refer to the structure and activities of the program and behaviors of the staff.
• Items should refer to things under the control of the program staff and program administration.
• Items should be written to fit with the sociocultural context.
• Items should be clear and specific.

Additional information on developing good items can be found in Bond et al. (2000, pp. 52-54) and most research textbooks.

Step 7: Develop Response Scale Points. Bond et al. (2000) recommend the following attributes for response scales (the item answers); a sketch of an item built this way follows the list:

• A standard number of scale points for every item (e.g., a 5-point scale).
• Ordinal scale points approximating equal intervals between each point.
• Points that are behaviorally anchored.
• No gaps in the response alternatives.
• No overlap in the scale points.
• Scale points based on the empirical literature.
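The following sketch shows one hypothetical fidelity item built to these specifications; the item wording and anchors are our illustration, not an instrument from Bond et al. (2000).

```python
# A minimal sketch of a fidelity item with a standard 5-point ordinal
# scale whose points are behaviorally anchored, with no gaps or overlaps.
# The item and its anchors are hypothetical.

supervision_item = {
    "text": "How often do program staff receive model-specific supervision?",
    "anchors": {
        1: "Never",
        2: "Less than once a month",
        3: "1-3 times a month",
        4: "Weekly",
        5: "More than once a week",
    },
}

def rate(item: dict, observed: str) -> int:
    """Map an observed behavior description onto the anchored scale."""
    for point, anchor in item["anchors"].items():
        if anchor.lower() == observed.lower():
            return point
    raise ValueError(f"No anchor matches: {observed!r}")

print(rate(supervision_item, "Weekly"))  # -> 4
```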

Step 8: Choose Data Collection Sources and Methods. This step entails choosing who or what will provide the information desired (staff, consumers, and administrators are possible "whos," while agency records are a possible "what"). It also means deciding how the information will be collected (orally, surveys, observations, etc.). Each source and each method has pluses and minuses and should be chosen to maximize the benefit for the overall process. Table 3 shows additional tips for developing the data collection strategy.

TABLE 3. Additional Tips for Developing the Data Collection Strategy

Choose data sources and methods that are congruent with the information needed.
Sources and methods may vary from item to item.
Use multiple sources and methods for each item whenever possible.
Train interviewers and observers.
Build in methods to check data quality.
Assess fidelity at more than one timepoint.
Assess reliability of ratings.

Source. Bond et al. (2000).

Step 9: Determine Item Order. Four principles of item ordering increase the chances of getting full and accurate information from respondents (Bond et al., 2000, pp. 62-63):

• Ask easy items at the beginning. This helps the respondent feel at ease and willing to answer the questions.
• Design the questions in a logical order. This lends coherence to the interview.
• Group related questions together. This maximizes coherence and helps respondents recall information accurately.
• Begin with general questions and move to more specific ones within each section.

Step 10: Develop Data Collection Protocol. A data collection protocol tells the evaluator exactly how to collect data. Having such protocols improves inter-rater reliability relating to the presence or absence of intervention elements.

Step 11: Train Interviewers/Raters. While it is obvious that interviewers need to be trained, it is not always so clear what to train the interviewers to be able to do. Bond et al. (2000, p. 64) suggest that interviewers should learn:

• How to contact respondents and introduce the scale to them.
• How to lay out the questionnaire and how the interviewer should progress through it.
• How to probe or follow up if initial responses are not on track.
• How to code responses and place them on the response scale.
• How to interact with the respondent.

Stage 3: Piloting the Scale

Step 12: Pilot the Scale. Piloting the scale is used to identify problems that were not noticed in the previous stage. Piloting is extremely useful in pinpointing where content may be difficult to collect and where different methods might need to be used.

Step 13: Assess Psychometric Properties. Analysis of information relating to the instrument's reliability and validity is done in this step. As this type of analysis can be very technical, experts should be called in for consultation. The important information from this step, however, tells us whether we can trust the information coming from the fidelity scale.
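As one example of the kind of reliability analysis involved, the sketch below computes Cronbach's alpha, a common internal-consistency statistic, for hypothetical pilot ratings. Bond et al. (2000) do not tie this step to any single statistic here, so treat the choice of alpha, and the data, as our illustration only.

```python
# A minimal sketch of one common Step 13 reliability check: Cronbach's
# alpha for the internal consistency of fidelity items, computed from
# hypothetical pilot ratings. Values near 1 indicate consistent items.

from statistics import pvariance

def cronbach_alpha(item_scores: list[list[int]]) -> float:
    """item_scores[i] holds all respondents' ratings for item i."""
    k = len(item_scores)
    item_variances = sum(pvariance(scores) for scores in item_scores)
    totals = [sum(ratings) for ratings in zip(*item_scores)]  # per respondent
    return (k / (k - 1)) * (1 - item_variances / pvariance(totals))

pilot = [  # hypothetical pilot data: 3 items each rated by 5 raters
    [4, 5, 3, 4, 4],
    [3, 5, 3, 4, 5],
    [4, 4, 2, 4, 5],
]
print(f"alpha = {cronbach_alpha(pilot):.2f}")  # -> alpha = 0.85
```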

Step 14: Determine Scoring and Weighting of Items. At times, information from one source or item may be considered to have more importance, or weight, than information from other sources. If this is true in your situation, this is the step in which you take the different levels of importance into account.
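A minimal sketch of such weighted scoring follows; the items and weights are hypothetical, and the particular rule (core elements counting more than adaptable characteristics) is our illustration of the idea, not a rule prescribed by Bond et al. (2000).

```python
# A minimal sketch of Step 14: combining item ratings into one fidelity
# score when some items are judged more important than others. Items,
# ratings, and weights are hypothetical.

items = [  # (item, rating on a 1-5 scale, weight)
    ("core curriculum delivered", 5, 3.0),  # core element: weighted heavily
    ("weekly supervision held", 4, 2.0),
    ("record-keeping format", 2, 1.0),      # adaptable characteristic
]

weighted_sum = sum(rating * weight for _, rating, weight in items)
total_weight = sum(weight for _, _, weight in items)
print(f"Weighted fidelity score: {weighted_sum / total_weight:.2f} / 5")
```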

Measuring implementation fidelity is a vital part of evidence-based practice for macro social workers. Without knowing the extent to which an evidence-based program has been implemented, it will be difficult, if not impossible, to say whether the program has been successfully transported to a different practice setting. Following the many steps mentioned in this section may take a long time, but the results will be compelling.

IMPLICATIONS FOR THE FUTURE

Several implications emerge from the analysis in this article. They fall into three areas: practice, research, and policy.

Practice

If we truly believe that evidence-based practice is important for macro social workers, it follows that we must begin to adapt the current EBP paradigm in ways that show its vitality and usefulness, such as incorporating crucial elements that might be missing links. We added a step in the problem-solving process by including the opportunity for clients to choose between equally salient interventions. We also added another step in this process related to assessing the implementation fidelity of the evidence-based intervention. One should not attempt to assess an intervention's outcomes without first determining that the intervention was properly implemented. Macro practitioners should document the steps they take to implement the intervention's model, differentiating between the theoretically derived program components that are essential and program components that are useful but may vary from one context to another.

In addition, as pointed out by the Center for Substance Abuse Treatment (2006), "most EBPs are not universally applicable to all communities, treatment settings, and clients" (p. 4). Several macro-level issues identified by this center need to be addressed in using the paradigm of evidence-based practice:

• Client population characteristics.
• Staff attitudes and skills required by the EBP.
• Facilities and resources required by the EBP.
• Agency policies and administrative procedures needed to support the EBP.
• Inter-agency linkages or networks to provide needed additional services.
• State and local regulations.
• Reimbursement for the specific services to be provided under the EBP (p. 4).

Research

Social workers in the macro arena who are searching, appraising, and selecting evidence-based interventions want choices. Researchers must partner with front-line workers and managers in designing, implementing, and evaluating these interventions. Research on macro interventions (both community and administrative) has not been sufficiently conducted by social work researchers, and the research from other disciplines has not been sufficiently well integrated into the social work knowledge base to provide additional guidance. All types of research on macro interventions need additional research support, ranging from more-or-less pure research that can be translated into practice terminology to more-or-less applied research with enough theoretical basis to be generalizable to additional settings. Research from international sources should be another important resource for macro social workers in the United States.

Policy

Social policy is often determined for reasons that are not empirically based. Shifts in political party control of legislatures and executive offices bring policy changes regardless of the evidence base to support the change. In recent years, social workers have been increasingly marginalized as an anti-government philosophy has been used in political campaigns. Financial resources for human services have decreased relative to need. Spending priorities have changed to support faith-based organizations rather than secular agencies. Greater calls for accountability by social welfare agencies have been made by the public and its elected officials.

The marginalization of social work-supported policy has been possible, in part, due to the lack of strong evidence to support the continuation of various programs and interventions. One of the important reasons to support EBP at all levels of social work is the hope that, at least in some political battles, facts and scientific information will play an important role in making decisions. In addition, information to improve advocacy and policy practice is available but is often missing from the social work curriculum.

CONCLUSION

Thyer and Myers (2003) suggest that:

If social work is to continue to enjoy substantial amounts of financial support from local, state, and federal sources, it is imperative that we be able to demonstrate, with legitimate data that are credible to others, that we are genuinely capable of helping the clients we serve. (p. 268)

There are crucial elements that are missing in the EBP process. First, clients must be allowed to choose between equally salient interventions. This practice principle adheres to the ethical hallmark of the social work profession related to client self-determination. Second, assessment of the fidelity with which an intervention, as designed, is implemented is a crucial element for macro practitioners to analyze. Outcome research studies related to both efficacy and effectiveness need to be conducted in order to build a knowledge base about evidence-based macro practices and interventions.

REFERENCES

Birkland, T. (2005). An introduction to the policy process: Theories, concepts, and models of public policy making. Armonk, NY: M. E. Sharpe.

Bond, G., Williams, J., Evans, L., Salyers, M., Kim, H., & Sharpe, H. (2000). Psychiatric rehabilitation fidelity toolkit. Cambridge, MA: Human Services Research Institute. Retrieved July 31, 2006, from http://www.tecathsri.org/materials.asp

Carrilio, T. (2006). Looking inside the "black box": A methodology for measuring program implementation and informing social services policy decisions. In R. Hoefer (Ed.), Cutting edge social policy research (pp. 1-17). New York: Haworth Press.

Center for Substance Abuse Treatment. (2006). Treatment, Volume 1: Understanding evidence-based practices for co-occurring disorders. COCE Overview Paper 6. DHHS Publication No. (SMA). Rockville, MD: Substance Abuse and Mental Health Services Administration and Center for Mental Health Services. Posted on the website June 8, 2006. Retrieved October 31, 2006, from http://www.coce.samhsa.gov/cod_resources/PDF/Evidence-basedPractices(OP6).pdf

Centers for Disease Control (CDC). (2006, April). Provisional procedural guidance for community-based organizations. Retrieved October 9, 2006, from http://www.cdc.gov/hiv/topics/prev_prog/AHP/resources/guidelines/pro_guidance.htm

Cialdini, R. (2000). Influence: Science and practice (4th ed.). Boston, MA: Allyn & Bacon.

Gambrill, E. (1990). Critical thinking in clinical practice: Improving the accuracy of judgments and decisions about clients. San Francisco: Jossey-Bass.

Gambrill, E. (1999). Evidence-based practice: An alternative to authority-based practice. Families in Society, 80, 341-350.

Gerston, L. (1997). Public policy making: Processes and principles. Armonk, NY: M. E. Sharpe.

Gibbs, L., & Gambrill, E. (2002). Evidence-based practice: Counterarguments to objections. Research on Social Work Practice, 12(3), 452-476.

Herman, R., & Renz, D. (1998). Nonprofit organizational effectiveness: Contrasts between especially effective and less effective organizations. Nonprofit Management and Leadership, 9(1), 23-38.

Herman, R., & Renz, D. (1999). Theses on nonprofit organizational effectiveness. Nonprofit and Voluntary Sector Quarterly, 28(2), 107-126.

Hoefer, R. (1994). A good story, well told: Rules for evaluating human services programs. Social Work, 39(2), 233-236.

Hoefer, R. (2000). Making a difference: Human service interest group influence on social welfare program regulations. Journal of Sociology and Social Welfare, 27(3), 21-38.

Hoefer, R. (2001). Highly effective human services interest groups: Seven key practices. Journal of Community Practice, 9(3), 1-13.

Hoefer, R. (2002). Political advocacy in the 1980s: Comparing human services and defense interest groups. Social Policy Journal, 1(1), 99-112.

Hoefer, R. (2005). Altering state policy: Interest group effectiveness among state-level advocacy groups. Social Work, 50(3), 219-227.

Hoefer, R. (2006). Advocacy practice for social justice. Chicago: Lyceum Books.

Hoefer, R., & Ferguson, K. (2007). Moving the levers of power: How advocacy organizations affect the regulation-writing process. Journal of Sociology and Social Welfare, 34(1), 83-108.

Jansson, B. (2002). Becoming an effective policy advocate: From policy practice to social justice (4th ed.). Pacific Grove, CA: Brooks/Cole/Thomson Learning.

McNeece, C. A., & Thyer, B. (2004). Evidence-based practice and social work. Journal of Evidence-Based Social Work, 1(1), 7-24.

National Association of Social Workers (NASW). (1999). Code of ethics. Retrieved November 1, 2006, from http://www.naswdc.org/pubs/code/default.asp


Netting, F. E., Kettner, P., & McMurtry, S. (2004). Social work macro practice (3rd ed.). Boston: Allyn & Bacon.

Ohmer, M., & Korr, W. (2006). The effectiveness of community-based interventions. Research on Social Work Practice, 16(2), 132-145.

Pressman, J., & Wildavsky, A. (1973). Implementation: How great expectations in Washington are dashed in Oakland. Berkeley, CA: University of California Press.

Rojas, R. (2000). A review of models for measuring organizational effectiveness among for-profit and nonprofit organizations. Nonprofit Management and Leadership, 11(1), 97-104.

Sabatier, P. (Ed.). (1999). Theories of the policy process. Denver, CO: Westview Press.

Sackett, D., Richardson, W., Rosenberg, W., & Haynes, R. (1997). Evidence-based medicine: How to practice and teach EBM. New York: Churchill Livingstone.

Smith, D. H. (1999). The effective grassroots association II: Organizational factors that produce external impact. Nonprofit Management and Leadership, 10(1), 103-116.

Substance Abuse and Mental Health Services Administration (SAMHSA). (2006). SAMHSA model programs. Retrieved October 9, 2006, from http://modelprograms.samhsa.gov/template_cf.cfm?page=model_list

Thyer, B. (2001). Evidence-based approaches to community practice. In H. Briggs & K. Corcoran (Eds.), Social work practice: Treating common client problems (pp. 54-65). Chicago: Lyceum Books.

Thyer, B., & Myers, L. (2003). Linking assessment to outcome evaluation using single system and group research designs. In C. Jordan & C. Franklin (Eds.), Clinical assessment for social workers: Quantitative and qualitative methods (2nd ed., pp. 385-405). Chicago: Lyceum Books.

Werner, A. (2004). A guide to implementation research. Washington, DC: The Urban Institute.

Ziguras, S., & Stuart, G. (2000). A meta-analysis of the effectiveness of mental health case management over 20 years. Psychiatric Services, 51, 1410-1421.
