
Research Seminar

How are Professional Interventions with Vulnerable Young People Informed?

An analysis of how evidence-based practice influences the delivery of professional interventions in the third sector.

Justin Dunne – [email protected]

A Journey from Practice

2006 Home Office Drugs Team of the Year (South-West)

Focus of Research

To understand ‘what works’ and ‘why’.

To give careful consideration to the ‘evidence base’.

To see in what ways a ‘typical’ service is evidence based.

To try to identify a model for effective practice for work with vulnerable young people.

Stage in Research

First draft literature review – critique of EBP; consideration of the evidence base.

Observation Stage (CCP) – completion of multiple observations at Bramah House/Arkells, including Foyer, Education Centre, Mentoring Programme, Peer Education.

Other Primary Research (to do) – semi-structured interviews; document analysis.

Research approach

My position is that an intervention is an interaction of a method, a recipient, often a practitioner, as well as the values, culture and society in which this takes place.

Strauss and Corbin (1998) describe this as a constructivist view, as it is about constantly evolving interactions.

‘..human practices [are]…constructed in and out of interaction between human beings and their world, and developed and transmitted within an essentially social context’ (Bryman, 2008, pp.12-13).

What is Evidence-based practice?

Epistemological debate around definitions: Classical Definition; Integrative Model; Common Factors (Mitchell, 2011).

Core belief - policy and practice should always be informed by ‘research’ evidence.

Classical Definition

Origins of the EBP movement lie within medicine in the early 1990s (Hammersley, 2001; Biesta, 2007; Marks, 2002).

Cochrane’s criticism – the medical profession had not organised critical summaries of controlled trials relating to health care (Walker, 2003).

Influenced by the idea of New Public Management and embraced by New Labour (Osborne, 1995; Hammersley, 2012).

Evidence is based on Randomised Controlled Trials (Empirically Supported Treatments). (Mitchell,2011)

Positivist Approach – ‘What works?’ (Biesta, 2007; Walker, 2003)

For this approach

Example from several meta-analytical studies regarding prevention and treatment for young people with or at risk of mental health problems. (Weisz, et al.,2005)

Results - The average treated child across these studies was likely to be better off than 75% of young people in control groups.

The same studies conclude that usual practice, where practitioners simply use their judgement unconstrained by EBP interventions or manuals, has an effect size of around zero, indicating no treatment benefit.
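As a rough illustration of what these two results imply – a sketch that assumes approximately normal outcome distributions with equal variance, an assumption of this illustration rather than something stated in the slide – the proportion of the control group that the average treated young person outperforms is Cohen’s U3:

\[
U_3 = \Phi(d), \qquad \Phi^{-1}(0.75) \approx 0.67,
\]

where \(\Phi\) is the standard normal cumulative distribution function and \(d\) is the standardised mean difference. On this reading, ‘better off than 75% of controls’ corresponds to a medium effect of roughly \(d \approx 0.67\), while an effect size of around zero corresponds to \(U_3 \approx 50\%\), i.e. no better than the control group.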

For this approach

Fidelity refers to the degree to which practitioners implement programmes as intended by the programme developers (McGrath, et al., 2006).

ESTs are usually accompanied by manuals explaining how programmes should be implemented.

Henggeler et al. (1997) show that Multisystemic Therapy for youth behaviour problems decreases in effectiveness the less closely the treatment model is adhered to.

For this approach

EBP may also be good for practitioners.

EST ‘Safecare’ designed to reduce child neglect in the home.

Research found that practitioners who implemented this programme experienced lower levels of emotional exhaustion, especially when compared with what the authors describe as ‘usual care services’.

Suggested explanations for effectiveness in this case are that the programme is a better fit for client needs and provides a more useful structure for organising services (Aarons et al., 2009).

Classical Definition

A slogan designed by proponents of a certain approach to try to discredit others. Who is going to argue that practice should not be based on some kind of evidence? (Hammersley, 2001)

‘Naïve realism’ (Walker, 2003, p.148) – controlled trials are insufficient in constructing our understanding of reality and do not address the gap between practice experience and research.

The Health Development Agency agrees, describing the positivist approach as ‘naïve and counterproductive’ (Marks, 2002, p.4).

The evidence base must be broadened to use more inclusive methods than those found in randomised controlled trials.

Against the classical definition

Research evidence is not necessarily the only kind of knowledge. EBP does not take seriously the notion of practice experience and tacit knowledge.

Research-based knowledge is fallible in itself. An intervention may be evidence based in a lab, but: What about the skilled practitioner? Who decides what a good outcome is? What about the context outside a lab and the multiple needs that may exist?

There is also the problem of how research findings are used by policy makers and practitioners (Hammersley, 2001).

Practice Wisdom

Criticism - The EBP movement does not take seriously the notion of practice experience and tacit knowledge.

Mitchell (2011, p.208) draws attention to the idea of ‘practice wisdom’ and defines this concept, ’…as practice-based knowledge that has emerged and evolved primarily on the basis of practical experience rather than from empirical research.’

This is a different kind of knowledge. It may also be subject to research. It may reveal ‘why’ and ‘how’ interventions work.

Evidence Based Practice - Three models

1. Empirically Supported Treatments (ESTs) based on randomised controlled trials.

2. Integrative Approach – ESTs & Practice Wisdom. The Institute of Medicine in America defines EBP as the ‘integration of best research evidence with clinical expertise and patient values’.

ESTs provide specific psychological treatments, but EBP in Psychology (according to the APA) should encompass broader ideas around assessment and therapeutic relationships. EBP is a decision-making process for integrating multiple streams of research evidence into the intervention process (Levant and Hasan, 2008).

Evidence Based Practice - Three models

3. Common factors and characteristics of effective programmes – from meta-reviews, normally of ESTs but sometimes other research. Meta-analyses draw out characteristics to indicate what they see as EBP for those working with juvenile and adult offenders.

These include the use of particular interventions like Cognitive Behaviour Therapy, whilst drawing out features like the use of sanctions and incentives, family involvement and assessment processes (Henderson et al., 2008).

Misleading research claims and political rhetoric based on dubious findings can cause credibility issues for EBP.

Marks (2002, p.24) discusses an EBP myth: decisions are based not on the latest and best evidence but on out-of-date ideas and evidence. He terms this Opinion-Based Practice.

Myths emerge based on poor or discredited research.

E.g. ‘Pygmalion in the Classroom’ (Rosenthal and Jacobson, 1968 in Hammersley, 2003 – see also Elashoff and Snow, 1971 and Rogers, 1982).

Evidence Based Practice Myths

‘Family Intervention Projects: A classic case of policy-based evidence’ (Gregg, 2010)

The Department for Communities and Local Government claimed an 84% success rate in reducing anti-social behaviour in a pilot.

A year later such behaviour had returned in 53% of tracked families (Gregg, 2008).
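A back-of-envelope sketch of why the headline figure misleads – assuming, purely for illustration, that the 53% relapse applies to families originally counted within the 84% ‘success’ group (an assumption of this sketch, not a figure Gregg reports):

\[
0.84 \times (1 - 0.53) \approx 0.39,
\]

i.e. on that assumption fewer than four in ten tracked families would show a sustained reduction in anti-social behaviour a year on, well below the promoted 84%.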

Evidence Based Practice Myths

Despite these findings, IIP and FIP were delivered by the Department for Children, Schools and Families (DCSF) between 2008 and 2010 (Youth Justice Board, no date), with massive financial investment, and were promoted with the 84% success headline.

Evidence Based Practice Myths

Research has shown a problem known as ‘Peer Deviancy Training’ which suggests such group work is more likely to cause harm than do good amongst young people with problematic behaviour (Dishion, et al., 1996; Poulin, et al., 2001).

Despite this, IIP have endorsed the use of such approaches. ‘Occasionally group interventions may be delivered particularly around reparation/restorative justice, gang awareness and the effects of ASB but these programmes will be tried, tested and measurable’ (Birch, 2009).

Evidence Based Practice Myths

The DCSF was also questioned about how the Dundee pilot project would be replicated as it was rolled out in other areas of the country. EBP is structured – usually manual-based – and fidelity is important to EBP.

DCSF said it was up to practitioners to decide what interventions to use as long as they worked towards the stated outcomes of the project.

Evidence Based Practice Myths

Problem of Practice Wisdom

Rodd & Stewart (2009, p.6), ‘For youth workers to be able to do their job, the relationship is often seen as central, foundational and a prerequisite to making other things happen.’

Dishion et al.’s (1999, p.760) research found that, ‘… when comparisons were restricted to those with whom a counsellor had particularly good rapport, or those whom the staff believed they had helped most, the objective evidence failed to show the program had been beneficial.’

Evidence of Harm

Those carrying out research need to make it clear where interventions or services are failing (Johnson-Reid, 2011).

Littell (2008) draws attention to the idea of confirmation bias where evidence contrary to a hypothesis can be ignored.

Positive results are likely to be made available for publication in a way that null or negative results are not (Dishion et al., 1999; Hopewell et al., 2009).

Unfortunately, when ineffective or harmful practices are ignored, this detracts from the evidence base for practice, leaving it incomplete.

Evidence of Harm

Hundreds of controlled intervention studies have focused on adolescent problem behaviour, and an estimated 29% show negative effects (Lipsey, 1992).

McCord (1978) was able to successfully follow up 253 men, and their matched partners assigned to a control group, 30 years after a study aimed at preventing young people from engaging in criminal activity.

A number of interventions took place, including a focus on family problems, tutoring in academic subjects, medical or psychiatric attention, attendance at summer camps, and regular involvement with organisations like the Scouts and other community programmes.

Evidence of Harm

The program seems not only to have failed to prevent its clients from committing crimes but also to have produced negative side effects.

As compared with the control group:

1. Men who had been in the treatment program were more likely to commit (at least) a second crime.

2. Men who had been in the treatment program were more likely to evidence signs of alcoholism.

3. Men from the treatment group more commonly manifested signs of serious mental illness.

4. Among men who had died, those from the treatment group died younger.

Evidence of Harm

5. Men from the treatment group were more likely to report having had at least one stress-related disease; in particular, they were more likely to have experienced high blood pressure or heart trouble.

6. Men from the treatment group tended to have occupations with lower prestige.

7. Men from the treatment group tended more often to report their work as not satisfying.

Questionable Evidence?

ARNOLD, M. E. & HUGHES, J. N. (1999) First Do No Harm: Adverse Effects of Grouping Deviant Youth for Skills Training. Journal of School Psychology, 37, 99-115.

DISHION, T. J., MCCORD, J. & POULIN, F. (1999) When Interventions Harm: Peer Groups and Problem Behavior, American Psychologist, 54, 755-764.

FLORSHEIM, P., BEHLING, S., SOUTH, M., FOWLES, T. R. & DEWITT, J. (2004) Does the Youth Corrections System Work? Tracking the Effectiveness of Intervention Efforts With Delinquent Boys in State Custody. Psychological Services, 1, 126-139.

HANDWERK, M. L., FIELD, C. E. & FRIMAN, P. C. (2000) The Iatrogenic Effects of Group Intervention for Antisocial Youth: Premature Extrapolations? Journal of Behavioral Education, 10, 223-238.

MCCORD, J. (1978) A Thirty-Year Follow-up of Treatment Effects. American Psychologist, 33, 284-289.

MCCORD, J. (2002) Counterproductive Juvenile Justice. Australian and New Zealand Journal of Criminology, 35, 230-237.

MOOS, R. H. (2005) Iatrogenic effects of psychosocial interventions for substance use disorders: prevalence, predictors, prevention. Addiction, 100, 595-604.

POULIN, F., DISHION, T. J. & BURRASTON, B. (2001) Three-Year Iatrogenic Effects Associated with Aggregating High-Risk Adolescents in Cognitive-Behavioral Preventive Interventions. Applied Developmental Science, 5, 214-224.

Why do interventions harm?

McCord (1978) posits certain suggestions:

Mixing with adults who deliver interventions, and whose values are different from those of the family concerned, may lead to internal conflicts that manifest themselves as disease and/or dissatisfaction.

Interventions may create dependence which, when the support is no longer available, may result in resentment.

High expectations generated by intervention programmes mean that subsequent experiences tend to produce symptoms of deprivation.

Why do interventions harm?

McCord (1978) posits certain suggestions:

By being involved in a project that delivered services for a person’s welfare, such a person may justify the help received by perceiving themselves as requiring help.

Dishion et al. (1999) suggest that teenagers whose deviant behaviour is reinforced through laughter and attention are more likely to escalate such behaviour and that high risk young people develop a cognitive basis for motivation to behave delinquently because they derive meaning and values through deviancy training.

My research

Desk based – NICE draw attention to 6 categories of evidence that they accept for EBP:

1. evidence from meta-analysis of randomised controlled trials;
2. evidence from at least one randomised controlled trial;
3. evidence from at least one controlled study without randomisation;
4. evidence from at least one other type of quasi-experimental study;
5. evidence from non-experimental descriptive studies, such as comparative studies, correlation studies and case-control studies;
6. evidence from expert committee reports or opinions and/or clinical experience of respected authorities.

(Marks, 2002, p.7)

My research

Review of meta-analyses/systematic reviews. Categories include: crime, anti-social behaviour, substance misuse, mental health, family interventions, violence, education, multiple interventions, other.

Examples:

Robinson et al. (2011) ‘Preventing suicide in young people: systematic review.’ The evidence regarding effective interventions for adolescents and young adults with suicide attempt, deliberate self-harm or suicidal ideation is extremely limited, but Cognitive Behavioural Therapy shows promise.

Waldron & Turner (2008) 'Evidence-Based Psychosocial Treatments for Adolescent Substance Abuse.’ Three treatment approaches – multidimensional family therapy, functional family therapy, and CBT – emerged as well-established models for substance abuse treatment.

My research

Review of evidence from those organisations concerned with the dissemination of EBP:

National Institute for Clinical Excellence (NICE)
Cochrane Collaboration and Campbell Collaboration
Centre for Excellence and Outcomes (C4EO)
The Family and Parenting Institute
Research in Practice (RIP)
The National Children's Bureau (NCB)
The Social Care Institute for Excellence (SCIE)
The Centre for Evidence-Informed Policy and Practice Information (EPPI)
The Economic and Social Research Council (ESRC)

Document analysis examples (columns: Document / Source / Focus / Findings / Implications)

Document: Youth Crime Briefing: Effective practice with children and young people who offend.
Source: NACRO (2006) http://www.nacro.org.uk/data/files/nacro-2007061800-57.pdf
Focus: Youth Crime
Findings: Certain approaches do not work – unstructured psychotherapy, intervention based upon medical models, and measures intended to punish or deter. 7 ‘McGuire’ Principles for what does work include: 1. Risk classification to allow intensive targeting; 2. Dosage, which refers to intensity and duration of intervention; 3. Criminogenic need, which refers to factors that directly contribute to offending behaviour; 4. Intervention modality – cognitive and problem-solving approaches seem to work; 5. Responsivity, where interventions impact upon behaviour because the input matches the young person’s preferences for understanding and interpreting new information; 6. Programme integrity, where a programme has a clear theoretical rationale and staff are resourced, trained and committed to the intervention approach; 7. Community base is important – the intervention is delivered close to the home environment and makes use of local resources.
Implications: “Incarceration can exacerbate underlying difficulties through removal from the community, interrupting education, reducing employment prospects and confirming a criminal identity.” (p.2) Therefore this approach should be a last resort. “McGuire principles are extremely broad, providing a framework for intervention rather than a blueprint for working with young people in trouble.” (p.4) These may be useful principles to adhere to.

Document: Drug use prevention among young people: a review of reviews. Evidence briefing update.
Source: NICE (2006) http://www.nice.org.uk/aboutnice/whoweare/aboutthehda/hdapublications/drug_use_prevention_among_young_people_a_review_of_reviews_evidence_briefing_update.jsp
Focus: Substance Misuse
Findings: What works in prevention – some evidence (not strong) for social influence approaches; competence enhancement/broad skills training; cognitive behavioural approaches in targeted populations; prevention aimed at 11-14 year olds was more successful; family interventions, especially those aimed at the whole family rather than the child or the parent. Interactive methods are important. The use of incentives in family programmes aids retention, and collaborative relationships are needed. Peer education can increase the effectiveness of a programme but has relatively short-lived effects. What does not work in prevention – information dissemination; affective education. Length and intensity of programmes make little difference. Poor evidence may be the result of poor application of theory to practice and poor fidelity of implementation.
Implications: Generally the evidence is weak in concluding what works; implications are suggested rather than based on strong evidence. The social influence approach indicates a student’s commitment not to use drugs is important. Standard education approaches to teaching young people about drugs do not work and may increase use. Cognitive behavioural approaches aimed at target populations and the 11-14 age bracket could be a promising way forward in prevention. Multi-component, family-focused programmes are effective but need to be age and development sensitive. Studies in this area also show the importance of facilitators with regard to effectiveness (engagement). Poor implementation may account for the lack of evidence base for prevention approaches. Lack of fidelity to programmes makes evaluation difficult. A lack of UK evaluation studies makes generalising these findings difficult.

My Research

Primary Observations

No definite sense of EBP – usual practice.

‘It’s about enthusiasm.’
‘Every child is different.’
‘We’ve not helped him.’
‘I’ll do this for another year and then do teacher training.’

References

Aarons, G.A., Fettes, L., Luis Jr, E.F. and Sommerfeld, D.H. (2009) ‘Evidence-based practice implementation and staff emotional exhaustion in children’s services’, Behaviour Research and Therapy, 47, pp. 954-960.

Biesta, G. (2007) ‘Why “What Works” Won’t Work: Evidence-Based Practice and the Democratic Deficit in Education Research’, Educational Theory, 57(1) pp. 1-22.

Birch, T. (2009) Email to Justin Dunne, 26th January.

Bryman, A. (2008) Social Research Methods, 3rd edn, Oxford: Oxford University Press.

Dishion, T.J., Spracklen, K., Andrews, D.W. and Patterson, G.R. (1996) ‘Deviancy training in male adolescent friendships’, Behavior Therapy, 27, pp. 373-390.

Dishion, T.J., McCord, J. and Poulin, F. (1999) ‘When Interventions Harm: Peer Groups and Problem Behavior’, American Psychologist, 54(9), pp. 755-764.

Elashoff, J. D. and Snow, R. E. (eds.) (1971) Pygmalion Reconsidered, Worthington, Ohio: Charles A. Jones.

Gregg, D. P. (2008) Review of Claimed ‘Longer Term Outcomes’ From Six ASB Family Intervention Projects. Available from the author at: [email protected]

Gregg, D. P. (2010) Family intervention projects: a classic case of policy-based evidence [Online]. Available at: http://www.crimeandjustice.org.uk/opus1786/Family_intervention_projects.pdf (Accessed: 27 June 2012).

Hammersley, M. (2001) 'Some Questions about Evidence-based Practice in Education’, Annual Conference of the British Educational Research Association: Evidence-based practice in education. University of Leeds, 13-15 September. Leeds: University of Leeds, pp. 1-13.

Hammersley, M. (2003) ‘Too good to be false? The ethics of belief and its implications for the evidence-based character of educational research, policymaking and practice’, Conference of the British Educational Research Association. Edinburgh: Heriot-Watt University, pp.1-14. Available at: http://www.leeds.ac.uk/educol/documents/00003156.htm (Accessed: 30th Sept 2012).

Hammersley, M. (2012) To be published in The Myth of research-based policymaking and practice, [Preprint].

Henderson, C.E., Taxman, F.S. and Young, D.W. (2008) ‘A Rasch model analysis of evidence-based treatment practices used in the criminal justice system’, Drug and Alcohol Dependence, 93(1-2), pp. 163-175. Available at: http://www.sciencedirect.com/science/article/pii/S0376871607003699 (Accessed: 2nd Oct 2012).

Henggeler, S.W., Melton, G.B., Brondino, M.J., Scherer, D.G. and Hanley, J.H. (1997) ‘Multisystemic therapy with violent and chronic juvenile offenders and their families: the role of treatment fidelity in successful dissemination’, Journal of Consulting and Clinical Psychology, 65(5), pp. 821-833.

Hopewell, S., Loudon, K., Clarke, M.J., Oxman, A.D. and Dickersin, K. (2009) Publication bias in clinical trials due to statistical significance or direction of trial results (Review), John Wiley and Sons. Available at: http://www.thecochranelibrary.com/userfiles/ccoch/file/INternational%20Clinical%20Trials%20Day/MR000006.pdf (Accessed: 30th Sept 2012).

Johnson-Reid, M. (2011) ‘Disentangling system contact and services: A key pathway to evidence-based children’s policy’, Children and Youth Services Review, 33, pp. 598-604.

Levant, R.F. and Hasan, N.T. (2008) ‘Evidence-Based Practice in Psychology’, Professional Psychology: Research and Practice, 39(6), pp. 658–662. Available at: http://web.ebscohost.com/ehost/pdfviewer/pdfviewer?nobk=y&sid=325abbf9-c049-4ebe-8ec1-3554d8e8c6f7@sessionmgr104&vid=7&hid=106 (Accessed: 2nd Oct 2012).

Lipsey, M.W. (1992) ‘Juvenile delinquency treatment: A meta-analytic inquiry into the variability of effect’, in Cook, T.D., Hooper, H., Corday, D.S., Hartmann, H., Hedges, L.V., Light, R.J., Louis, T.A. and Mosteller, F. (eds.) Meta-analysis for explanation: A casebook. New York: Sage, pp. 83-125.

Littell, J.H. (2008) ‘Evidence-based or biased? The quality of published reviews of evidence-based practices’, Children and Youth Services Review, 30, pp.1299-1317.

Marks, D.F. (2002) Perspectives on evidence-based practice, London: Health Development Agency Public Health Evidence Steering Group. Available at: http://www.nice.org.uk/niceMedia/pdf/persp_evid_dmarks.pdf (Accessed: 13th November 2012).

McCord, J. (1978) ‘A Thirty-Year Follow-up of Treatment Effects’, American Psychologist, 33, pp. 284-289.

Mitchell, P.F. (2011) ‘Evidence-based practice in real-world services for young people with complex needs: New opportunities suggested by recent implementation science’, Children and Youth Services Review, 33, pp.207-216.

Osborne, S.P., Bovaird, T., Martin, S., Tricker, M. and Waterston, P. (1995) ‘Performance Management and Accountability in Complex Public Programmes’, Financial Accountability & Management, 11(1), pp. 19-37.

Poulin, F., Dishion, T.J. and Burraston, B. (2001) ‘Three-year Iatrogenic Effects Associated with Aggregating High-Risk Adolescents in Cognitive-Behavioral Preventive Interventions’, Applied Developmental Science, 5, pp. 214-224.

Rodd, H., and Stewart, H. (2009) ‘The glue that holds our work together: The role and nature of relationships in youth work’, Youth Studies Australia, 28(4), pp. 4−10.

Robinson, J., Hetrick, S.E. and Martin, C. (2011) ‘Preventing suicide in young people: systematic review’, Australian & New Zealand Journal of Psychiatry, 45(1), pp. 3-26. Available at: http://web.ebscohost.com/ehost/pdfviewer/pdfviewer?vid=14&hid=112&sid=2a4cf8bb-d87a-4452-aa9c-56799af2a98c%40sessionmgr15 (Accessed: 27 Nov 2012).

Rogers, C. (1982) A Social Psychology of Schooling: the expectancy process, London: Routledge and Kegan Paul.

Rosenthal, R. and Jacobson, L. (1968) Pygmalion in the Classroom, New York: Holt, Rinehart and Winston.

Strauss, A. and Corbin, J. (1998) Basics of qualitative research: techniques and procedures for developing grounded theory. 2nd edn. London: Sage.

Waldron, H. and Turner, C. (2008) 'Evidence-Based Psychosocial Treatments for Adolescent Substance Abuse', Journal of Clinical Child and Adolescent Psychology, 37(1), pp. 238-261. Available at: http://web.ebscohost.com/ehost/detail?vid=24&hid=112&sid=2a4cf8bb-d87a-4452-aa9c-56799af2a98c%40sessionmgr15&bdata=JnNpdGU9ZWhvc3QtbGl2ZQ%3d%3d#db=eoah&AN=14101256 (Accessed: 27 November 2012).

Walker, K. (2003) ‘Why evidence-based practice now?: A polemic’, Nursing Inquiry, 10(3), pp.145-155. Available at: http://web.ebscohost.com/ehost/pdfviewer/pdfviewer?sid=95e24fa9-ca2e-40e7-93c3-0e2f6764bef6%40sessionmgr111&vid=4&hid=106 (Accessed: 1st Oct 2012).

Weisz, J.R., Sandler, I.N., Durlak, J.A. and Anton, B.S. (2005) ‘Promoting and Protecting Youth Mental Health Through Evidence-Based Prevention and Treatment’, American Psychologist, 60(6), pp. 628-648.

Youth Justice Board (no date) YOTs and third sector invited to bid for funding for Intensive Intervention Projects. London: Youth Justice Board. Available at: http://www.yjb.gov.uk/en-gb/News/IntensiveInterventionProjectsFunding.htm (Accessed: 11 March 2010).