
Regional Partnership Grants Cross-Site Evaluation and Evaluation-Related Technical Assistance

May 2016

RPG Cross-Site Evaluation and Technical Assistance: Third Annual Report


RPG Cross-Site Evaluation and Technical Assistance: Third Annual Report

Contract Number: HSP233201250024A

Mathematica Reference Number: 40170.104

Submitted to: Office on Child Abuse and Neglect Children’s Bureau, ACYF, ACF, HHS 8th Fl. No. 8111, 1250 Maryland Ave., SW Washington, DC 20024 Project Officer: Dori Sneddon

Submitted by: Mathematica Policy Research P.O. Box 2393 Princeton, NJ 08543-2393 Telephone: (609) 799-3535 Facsimile: (609) 799-0005 Project Director: Debra A. Strong

Suggested citation: Jacqueline M. Crowley, Alyson Burnett, Caroline Massad Francis, and Debra A. Strong. “RPG Cross-Site Evaluation and Technical Assistance: Third Annual Report.” Children’s Bureau, Administration for Children and Families, U.S. Department of Health and Human Services. May 2016. Contract No.: HSP233201250024A. Available from Mathematica Policy Research, Princeton, N.J.

May 2016

Prepared by: Jacqueline M. Crowley, Alyson Burnett, Caroline Massad Francis, and Debra A. Strong

Regional Partnership Grants and Cross-Site Evaluation


RPG2 THIRD ANNUAL REPORT MATHEMATICA POLICY RESEARCH

ACKNOWLEDGMENTS

We are grateful for the contributions of many people to this report and during the third year of the Regional Partnership Grants (RPG) National Cross-Site Evaluation and Evaluation-Related Technical Assistance project. First and foremost, we thank Dori Sneddon, the contracting officer’s representative (COR) for the evaluation. Her leadership and thoughtful guidance are appreciated by all the members of the evaluation team. We would also like to thank Elaine Voces Stedt, Director of the Office on Child Abuse and Neglect, and Jean Blankenship, Child Welfare Program Specialist in the Office on Child Abuse and Neglect, for providing valuable feedback and support as we worked closely with the RPG grantees and with Children and Family Futures, which provides program-related technical assistance (TA) to the grantees through the National Center on Substance Abuse and Child Welfare. At Children and Family Futures, Ken DeCerchio, Nancy Hansen, and Melissa Lujan have generously taken time to provide thoughtful input on many of the cross-site evaluation activities and products. We thank them, along with Marianna Corona, Jane Pfeifer, and Bonnie Washeck of Children and Family Futures who, in the course of providing program-related TA to the grantees, worked closely and supportively with our staff members who were providing evaluation-related TA.

We would especially like to express our appreciation to the RPG grantees and their evaluators. During year three, the grantees collected a significant amount of participant, service, and outcomes data and responded to staff and partner surveys while implementing evidence-based programs and practices. We can only say that those of us at Mathematica, WRMA, Inc., and Synergy Enterprises, Inc. have the utmost respect for those who do what we only study.

The authors look forward to working with all the people mentioned above and their organizations during the final two years of program implementation and evaluation. We thank our editing staff and Ms. Felita Buckner for help producing the report, and retain sole responsibility for any errors or omissions.


CONTENTS

I INTRODUCTION: THE THIRD ANNUAL REPORT ........................................................................ 1

A. The third annual report .............................................................................................................. 2

B. An overview of the work conducted under the contract during year 3 ...................................... 2

II PROVIDING TECHNICAL ASSISTANCE AND SUPPORTING GRANTEES ................................. 9

A. Technical assistance ................................................................................................................. 9

1. TA requests ......................................................................................................................... 9

2. The RPG help desk ........................................................................................................... 10

3. Calls with grantees ............................................................................................................ 11

B. Peer learning ........................................................................................................................... 12

C. Collaborating with NCSACW ................................................................................................... 13

III COLLECTING DATA ...................................................................................................................... 15

A. Participant characteristics, service, and outcome data ........................................................... 15

1. Participant characteristics and services ............................................................................ 16

2. Outcomes .......................................................................................................................... 17

B. Surveying partners and EBP staff ........................................................................................... 19

1. Survey recruitment and outreach ...................................................................................... 19

2. Response rates ................................................................................................................. 20

C. Planning site visits ................................................................................................................... 21

IV ANALYZING PARTICIPANT-LEVEL DATA ................................................................................... 23

A. Participant characteristics ........................................................................................................ 23

1. Analysis methods .............................................................................................................. 23

2. Results highlighted in the Third Report to Congress ........................................................ 25

B. EBP enrollment ........................................................................................................................ 26

1. Analysis methods .............................................................................................................. 26

2. Results highlighted in the Third Report to Congress ........................................................ 26

C. Well-being of children and families at baseline ....................................................................... 26

1. Analysis methods .............................................................................................................. 26

2. Results highlighted in the Third Report to Congress ........................................................ 27

D. Data quality .............................................................................................................................. 30

1. ESL-related data ............................................................................................................... 30

2. OAISIS-related data .......................................................................................................... 31


V INTEGRATING THE RPG3 COHORT ........................................................................................... 35

A. Grantee local evaluations ........................................................................................................ 35

B. RPG3 activities ........................................................................................................................ 37

1. Finalizing program plans ................................................................................................... 37

2. Finalizing evaluation plans ................................................................................................ 40

C. Challenges and next steps ...................................................................................................... 40

VI NEXT STEPS: PLANNED ACTIVITIES IN YEAR 4 ...................................................................... 43

A. Ensuring the quality and completeness of evaluation data ..................................................... 43

B. Remaining reports to Congress ............................................................................................... 43

C. Cost study ................................................................................................................................ 43

D. Remaining data collection and analysis .................................................................................. 44

1. Site visits ........................................................................................................................... 44

2. Participant characteristics ................................................................................................. 44

3. Outcomes .......................................................................................................................... 45

E. Analysis and reporting ............................................................................................................. 45

REFERENCES ............................................................................................................................................ 47


TABLES

I.1 Tasks and activities during year 3 of the RPG cross-site evaluation ............................................... 3

II.1 TA Request tickets opened during year 3 of the cross-site evaluation .......................................... 10

II.2 Help desk tickets received from October 1, 2014 to September 30, 2015 (RPG2 and RPG3 grantees) ............................................................................................................................. 11

II.3 Call tickets from October 1, 2014 to September 30, 2015 (RPG2 and RPG3 grantees) .............. 12

III.1 Types of records collected in the ESL and cumulative number of records collected through September 30, 2015 ......................................................................................................... 17

III.2 Outcome data sources ................................................................................................................... 18

III.3 Percentage of expected standardized instrument and administrative data elements submitted in April 2015................................................................................................................... 19

III.4 Final response rates by grantee for the staff survey ...................................................................... 20

III.5 Final response rates by grantee for partner survey ....................................................................... 21

IV.1 Number of cases analyzed in year 3, by grantee .......................................................................... 24

IV.2 Substance use disorder among adults prior to RPG enrollment ................................................... 28

V.1 RPG3 projects and planned target population and program focus ................................................ 36

V.2 Characteristics of RPG3 grantees’ local outcome evaluations ...................................................... 38

FIGURE

IV.1 Distribution of scores on the PSI-SF for RPG adults compared with the national mean ............... 29


I. INTRODUCTION: THE THIRD ANNUAL REPORT

This report describes activities of the RPG Cross-Site Evaluation and Technical Assistance project during the third year of our contract. Before examining the details of that progress, it is helpful to review the contents of the two earlier annual reports and to outline the focus of our activities during year 3.

During the first year of the 2012 Regional Partnership Grant program (RPG2), Mathematica and its subcontractors worked with the Children’s Bureau (CB), the National Center on Substance Abuse and Child Welfare (NCSACW), and the grantees and their evaluators to establish evaluation designs and prepare for the remaining four years of the grant. We (1) designed the cross-site evaluation, (2) provided evaluation-related technical assistance (TA) to grantees, (3) assessed the program plans and evaluation designs that grantees initially proposed, (4) selected or developed measures and instruments for use in the cross-site evaluation, and (5) explored whether a data collection system created during RPG1 could be updated to obtain evaluation and performance indicator data from grantees. The first annual report on the RPG national cross-site evaluation and evaluation-related TA (RPG cross-site evaluation) project described progress in these areas (Strong et al. 2014).

In year 2, Mathematica, WRMA, Inc., and Synergy Enterprises (Synergy) built on the year 1 activities by (1) preparing to obtain data from grantees, including providing grantees with the standardized instruments for use in the cross-site evaluation, obtaining Office of Management and Budget (OMB) clearance, and designing and developing the RPG2 data collection system; (2) providing evaluation-related TA to grantees and monitoring their progress; and (3) recruiting a subset of grantees for participation in the impact study. We described progress in these areas in a second annual report (Strong et al. 2015).

Building a strong foundation for the cross-site evaluation in years 1 and 2 allowed us to focus on four areas in year 3:

1. Providing TA and supporting grantees

a. Conducting evaluation-related TA

b. Coordinating peer learning opportunities

c. Collaborating with NCSACW

2. Collecting data

a. Gathering information on enrollment, services, and outcomes

b. Surveying grantee partners and frontline staff and supervisors

c. Planning site visits to local RPG programs

3. Analyzing participant-level data

a. Evaluating enrollment and service data

b. Examining outcome data


4. Integrating into the cross-site evaluation a new cohort of four grantees (RPG3 grantees, funded in October 2014)

A. The third annual report

This third report describes activities and progress in each of these areas. The report is intended for CB, which sponsors the RPG program and the cross-site evaluation, as well as other stakeholders with whom CB may wish to share the report. We have organized the report as follows:

Section B of this chapter provides a detailed list of the activities conducted under Mathematica’s contract during the period covered by this report. Chapter II discusses our provision of evaluation-related TA and peer learning opportunities, as well as collaboration with NCSACW. In Chapter III, we discuss the systems and processes in place for collecting cross-site evaluation data, including surveying grantee partners and program staff. Chapter IV explores how we have begun to analyze participant-level data, and Chapter V outlines how we have integrated the RPG3 cohort of grantees into the project. Chapter VI details next steps for the cross-site evaluation, including the remaining data collection activities and analysis and reporting plans.

B. An overview of the work conducted under the contract during year 3

Work on the RPG cross-site evaluation is organized into 12 tasks. Mathematica and WRMA completed task 4 (develop, refine, and finalize performance indicators) during year 1. Task 11 (final evaluation report) will not begin until year 5. Table I.1 summarizes the activities we conducted under the remaining tasks between October 1, 2014 and September 30, 2015, and identifies the contractual deliverables completed.


Table I.1. Tasks and activities during year 3 of the RPG cross-site evaluation

Task and Subtask Number

Task and Subtask Title

Contractual Deliverables Relevant for the Period

Activities

Task 1 Participate in project orientation

1.2 Prepare and update project work plans

Updated work plan
None (the year 3 work plan was submitted at the end of year 2; the year 4 work plan was submitted early in year 4).

1.3 Facilitate and coordinate bi-weekly or monthly teleconference

Agenda for call

Summary of call and follow-up action plan

2014 call dates: October 2; November 11 and 20; December 4

2015 call dates: January 13; February 27; March 27; April 24; May 29; June 26; July 24; August 27; September 25

1.4 In-person meetings

Agenda for meeting

Summary of meeting and follow-up action plan

Ms. Strong met with Ms. Stedt during the RPG Annual Conference in July 2015 in Washington, DC, to discuss the status of the work.

1.5 Expert consultation

List of proposed experts; agenda for meeting; summary of meeting

No expert meetings were requested.

1.6 Ad hoc briefings
Agenda for meeting; summary of meeting and follow-up action plan

No ad hoc briefings were requested.

Task 2 Conduct program strategy confirmation process and evaluability/readiness assessment and develop grantee evaluation profiles

2.2 Collaborate with NCSACW on the development and updating of grantee evaluation profiles

Draft grantee evaluation profile; final initial grantee evaluation profile; updated grantee profile

Completed profiles of all grantees except Rockingham Memorial Hospital in Virginia. This profile remained on hold due to the evolving nature of the grantee’s evaluation plans, but was completed early in year 4.


Task 5 Review existing data collection system, develop and pilot test data collection system for performance indicators and evaluation data

5.6 Implement and maintain performance indicators data collection system

Implementation schedule/plan; quality control plan

Submitted the RPG Data Quality Control Plan to CB on October 21, 2014.

Outcome and Impact Study Information System (OAISIS) activities: WRMA maintained OAISIS. Grantees submitted data to OAISIS in October 2014 and April 2015. WRMA implemented changes to the Safety and Permanency Informational Database (SPID) to address technical assistance and grantee data issues. Mathematica notified grantees of the changes, and an updated version of the OAISIS user guide was made available to grantees on March 18, 2015. Created a custom Access database for Alternate Opportunities to read their Excel files with Safety and Permanency data to ease reporting. Released a fourth version of the OAISIS user guide in July 2015.

Enrollment and Service Log (ESL) activities: Along with Synergy Enterprises, Mathematica maintained the ESL and addressed grantee TA needs. Mathematica and Synergy Enterprises deployed ESL release 1.7 to production on September 17 and 18, 2015. We resolved ESL issues that required taking the system offline on November 21, 2014; the system was brought back online Monday, December 8, 2014. Several ESL data integrity issues required investigation, including the creation of duplicate service logs and a vanishing save button. The ESL was taken offline to prevent grantees from experiencing additional problems while the system was under examination. In December, we notified grantees of a potential data loss and asked them to retain ESL written records as backup. On March 13, 2015, we deployed version 1.6 of the ESL to address errors in the ESL extract files, eliminate the possibility of data duplication due to inadvertent activity (such as a screen refresh), and correct the text of lead-in questions in the ESL.


Task 7 Data collection for cross-site evaluation

7.3 Timely execution of data collection activities

Collect data
Programmed and tested staff and partner surveys in spring 2015. Held calls with grantees to provide guidance on nominating participants for the surveys and to obtain the number of potential survey respondents. Launched the surveys on April 7, 2015, and conducted reminder calls and emails between May 5 and June 2, 2015. Closed the surveys on June 5, 2015. Final response rates were 97/111 (87 percent) for staff and 154/205 (75 percent) for partners. Began cleaning and preparing the data for analysis. This task was completed in July 2015.

7.4 Site visits to grantees

Site visits
Finalized draft site visit protocols for review by CB in August 2015. Conducted a pilot site visit to TN-DMHSAS in September 2015. Held a 10-hour training for the site visitors in September 2015.

7.5 Coordination and monitoring of data collection activities

Data quality and monitoring

OAISIS activities: WRMA monitored grantee uploads to the OAISIS system from September 29, 2014 through December 1, 2014. Mathematica provided weekly updates of grantee progress and sent regular feedback to grantees to ensure that the data uploaded to the system were accurate and complete. Cleaned OAISIS data and generated scale scores from the standardized instruments and indicator variables from the administrative data. Identified some submitted standardized instruments with out-of-range items and resolved an issue with the two PDFs (used by grantees to collect and submit these instruments) that was causing this error. Calculated scores for the standardized instruments, compiled them into grantee-specific data sets, and shared them with grantees on September 1, 2015.

ESL activities: Worked with grantees to re-enter ESL data lost due to system error; completed by May 2015. Updated the ESL paper forms, user guide, and data dictionary and circulated them in June 2015.

Other activities: Received the grantees’ semi-annual progress reports in October 2014 and April 2015; reviewed and extracted data for use in the third report to Congress. Compiled information from the grantees’ second round of trauma request forms and sent it to CB on May 29, 2015. Prepared analyses for the third report to Congress and the July RPG conference.


Task 8 Provide evaluation technical assistance (TA) to grantees

8.1 Provide TA on evaluation, performance measurement, and continuous quality improvement

TA request process; technical assistance

CSLs conducted a TA site visit to Iowa-Seasons in October 2014. CSLs continued to participate in calls with grantees and respond to TA requests. Received and responded to questions through the RPG help desk related to the ESL and OAISIS systems and to the standardized instruments and administrative data. These activities were described in TA reports submitted each month. Worked with a grantee whose informed consent statement contained a potential inaccuracy to develop a new statement and to clarify that data collected under the initial statement were still approved by the grantee’s institutional review board.

8.2 Coordinate, facilitate, and support an evaluation peer learning network across the grantees

Peer learning network meetings monthly, as needed

Held sessions with local evaluators at the annual grantee conference in July 2015 (see Chapter II section B).

8.3 Provide technical assistance activities tools and material for knowledge management

TA tools
On December 18, 2014, circulated a memo to grantees via the RPG listserv with suggestions and tools for obtaining child well-being data from adults not part of the RPG program. On January 12, 2015, circulated a memo reminding grantees of the importance of collecting baseline data during, or as soon as possible after, enrollment, as well as collecting follow-up data in a timely way.

Task 9 Coordinate with the National Center on Substance Abuse and Child Welfare and other TA providers

9.2 Coordinate and provide support for annual RPG grantee meetings

Jointly planned and implemented annual grantee meetings

Met with NCSACW to discuss the agenda for the planned annual grantee conference to be held in Washington, DC, in July 2015. Planned a survey of grantees and evaluators on their interests, which NCSACW conducted. Developed sessions for evaluators. Attended the RPG conference on July 7 and 8, 2015, and held multiple sessions, including presenting updates on the cross-site evaluation and holding technical assistance office hours to provide assistance with the ESL and OAISIS data collection systems.


9.3 Coordinate with other related TA providers to support the grantees

Collaboration and coordination activities

Participated in national partner calls.
2014 call dates: December 4
2015 call dates: January 15; February 27; May 29; August 27; September 29

Task 10 Prepare reports

10.1 Prepare TA reports

Report of TA provided
Submitted 12 TA reports, one with each monthly progress report.

10.2 Prepare annual reports to Congress

Draft report; final report

Submitted the final first report to Congress on February 9, 2015; it was cleared, and Mathematica circulated it to grantees on April 14, 2015. Submitted the draft second report to Congress on October 28, 2014 and a revised second report to Congress on February 3, 2015; it was cleared, and Mathematica circulated it to grantees on September 14, 2015. Submitted the draft third report to Congress to an internal Mathematica QA reviewer in September 2015.

10.3 Prepare monthly progress reports

Monthly report
Submitted 12 monthly progress reports.

10.4 Prepare annual reports

Draft annual report outline; annual report

The first annual report was sent to CB on January 5, 2015. Upon CB’s approval and request, we circulated it to RPG grantees on March 30, 2015. The second annual report was submitted on March 27, 2015. After CB approval on April 14, 2015, we finalized it and circulated it to the RPG grantees via the RPG2 and RPG3 listservs on April 29, 2015.

10.5 Prepare ad hoc reports and/or special topics research briefs

Draft outline of ad hoc reports; ad hoc reports

Extracted additional information from the trauma forms received from the grantees with their October 2014 semi-annual progress reports. In January 2015, we drafted a short report to CB and the grantees summarizing information collected for CB on how grantees are addressing child and adult trauma.

Task 12 Conduct strategic knowledge dissemination and knowledge transfer activities

12.1 Strategic dissemination

Coordinated knowledge dissemination activities

No activity.

Optional Task C

Cost Study


1 Cost analysis
Identify common cost data elements; develop draft cost data collection measure; pilot test; prepare summary report

Circulated a memo to grantee project directors and local evaluators on September 22, 2015, about the cost study and opportunities to participate in it. Responded to expressions of interest from grantees and local evaluators and began preparing for the first meeting of the cost study work group in October.
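Several of the figures reported in Table I.1 are simple proportions. For example, the final survey response rates under subtask 7.3 can be reproduced directly from the completed/fielded counts stated there; the short Python sketch below shows the calculation (only the counts from the table are used):

```python
# Reproduce the survey response rates reported under subtask 7.3.
# Completed/fielded counts (97/111 staff, 154/205 partner) come from the
# report; results are rounded to whole percentages, matching how the
# report states them.
def response_rate_percent(completed: int, fielded: int) -> int:
    """Response rate as a whole-number percentage."""
    return round(100 * completed / fielded)

staff_rate = response_rate_percent(97, 111)     # staff survey
partner_rate = response_rate_percent(154, 205)  # partner survey
print(staff_rate, partner_rate)  # 87 75
```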


II. PROVIDING TECHNICAL ASSISTANCE AND SUPPORTING GRANTEES

Supporting grantee teams in their efforts to conduct evaluations continued to be a key feature of the cross-site evaluation in year 3. Along with the other activities and resources supported by the cross-site evaluation contract, TA is intended to enhance grantees’ ability to contribute RPG program and evaluation data to the cross-site evaluation and for use in reporting to Congress on grantee performance. To accomplish these goals, Mathematica provided TA in response to specific requests from grantees or their Federal Project Officers (FPOs), but also monitored grantees’ progress in implementing their programs and evaluations, mostly through teleconferences with grantee teams, to identify other needs. This chapter describes the type and quantity of TA-related activities during the third year of the contract.

A. Technical assistance

A team of Mathematica staff called cross-site liaisons (CSLs) provides TA on request and through ongoing phone communication. In addition to TA support from the CSLs, grantees can email or call a “help desk” that Mathematica established in year 2 to address questions related to the data that grantees provide for the cross-site evaluation. We also facilitate peer learning activities, so that grantees may capitalize on each other’s knowledge and experience.

1. TA requests

The volume of formal requests for TA was slightly lower for RPG2 grantees than in the second year of the project.1 As the second annual report describes, during the second year of the project we received 16 requests for TA. These requests focused on data collection, because most grantees began their evaluations during the year. During year 3, we received 11 requests from RPG2 grantees (Table II.1), mostly related to data collection and analytic methods. The four RPG3 grantees, in their first year of receiving RPG grants, made 10 formal requests for TA as they prepared to begin their evaluations; most of these requests focused on data collection and institutional review board (IRB) approvals. The lower volume of requests overall is misleading as an indicator of the quantity of assistance requested by grantees, because grantees submitted many additional questions through the RPG help desk. However, TA requests typically required more resources and tools than questions submitted through the help desk. For example, one grantee asked Mathematica to consult on its proposed consent process as well as review its consent documents prior to IRB submission. Another grantee asked for TA related to conducting propensity score matching.

1 “TA requests” have been defined for the project as requests that include or require (1) materials and tools (such as examples of consent forms or tools to calculate statistical power); (2) review of grantee or external reference documents; (3) specialized TA from a member of the cross-site evaluation team other than the CSL, such as a survey researcher; or (4) expertise from someone outside the team, such as another expert at Mathematica. Requests were made by the RPG grantees or local evaluators, or sometimes by the FPOs.


RPG2 THIRD ANNUAL REPORT MATHEMATICA POLICY RESEARCH

Table II.1. TA request tickets opened during year 3 of the cross-site evaluation

                                                               Number of Requests
                                                            RPG2    RPG3*    Total
Number of TA requests received October 2014–September 2015    11      10       21
Topics of TA requests:
  Data collection                                              4       5        9
  Analytic methods                                             3       1        4
  IRB                                                          1       3        4
  Research design                                              2       0        2
  Recruitment                                                  1       0        1
  Outcomes                                                     0       1        1

* RPG3 grantees were in their first year of the grant during this time. For more information on RPG3, see Chapter V.

2. The RPG help desk

Grantees or evaluators can also submit questions or requests for assistance via the RPG help desk, as an alternative to a more formal TA request made through the CSL or FPO. During the year, the help desk received 383 requests from RPG2 and RPG3 grantees. Most questions (55 percent) involved the use of the Enrollment and Service Log (ESL) system,2 a web-based, real-time data collection system that Mathematica and Synergy Enterprises developed so that grantees could provide implementation data for the cross-site evaluation (Table II.2). However, grantees also submitted questions on the use of the Outcome and Impact Study Information System (OAISIS)3 and the standardized instruments.4 Grantees have posed a range of questions to the help desk, including queries about case composition, such as whether to enter children not receiving RPG services into the ESL or how to add an unborn child to an existing case. Other questions have focused on how to obtain administrative data from child welfare agencies and how to select the most appropriate respondent for the standardized instruments.

2 The ESL collects information about (1) enrollment into RPG, including demographic information on each RPG case and individual case members, as well as the exit date and reason for exit; (2) enrollment and exit dates for evidence-based programs (EBPs); (3) detailed session information for focal EBPs; and (4) selected case and demographic information for comparison group members from RPG grantees participating in the impact study.
3 Grantees or their evaluators will use OAISIS to submit data from standardized instruments and administrative records (including records on child maltreatment, foster care removal and placement, and substance abuse treatment) for their RPG participants. Grantees participating in the impact study will also submit selected data elements for their comparison group members.
4 Primary data will be based on self-administered standardized instruments that CB has asked all grantees and their evaluators to administer to RPG participants to measure outcomes.


Table II.2. Help desk tickets received from October 1, 2014 to September 30, 2015 (RPG2 and RPG3 grantees)

Topic                                                             In Process   Completed/Closed   Total Requests
Administrative data                                                    0              15                15
Appropriate respondent in RPG case for standardized instruments        0               7                 7
ESL: use of paper forms                                                0               4                 4
ESL: use of web-based system                                           2             206a              208
IRB                                                                    0               1                 1
OAISIS                                                                 0              88                88
Use of standardized instruments                                        0              45                45
General data collection                                                0              15                15
Total                                                                  2             381               383

a The two cases in process as of September 30, 2015 have since been resolved.

3. Calls with grantees

Liaisons from NCSACW, called program management liaisons (PMLs), work with Mathematica's CSLs to support grantees and their local evaluators. The CSLs and PMLs participated in recurring teleconferences with each grantee's team, local evaluator, and FPO. During the calls, CSLs provided assistance by discussing evaluation topics. They also monitored the status of local evaluations by asking grantees, for example, when enrollment in the evaluation began, the level of enrollment to date, and the extent of baseline or follow-up data collection. Grantees also discussed other topics during the calls, such as progress implementing their programs, so CSLs could learn about challenges in enrolling and serving RPG cases, potential changes in evidence-based programs (EBPs) or other program elements, or other implementation issues that could affect grantees' local evaluations or participation in the cross-site evaluation.

Planned calls took place every one or two months, but CSLs and the cross-site evaluation team often participated in additional calls related to TA and grantee monitoring. For example, the CSLs often talked with PMLs and FPOs to prepare for upcoming grantee calls or discuss issues that arose during the calls, or they held additional calls to deliver TA. During year 3, CSLs participated in 264 calls of various types with RPG2 and RPG3 grantees (Table II.3), for an average of 12.6 calls per grantee.


Table II.3. Call tickets from October 1, 2014 to September 30, 2015 (RPG2 and RPG3 grantees)

                                                                      Number of Calls
Total number of calls conducted by CSLs for RPG2 and RPG3 grantees          264

Call type
  Regularly scheduled teleconference with grantee, CSL, PML, and FPOa       159
  Check-in with CSL, PML, and FPO to discuss grantee-related issues          63
  Provision of TA requested by FPO                                           16
  Discussion of RPG programmatic issues (initiated by PML)                   11
  Evaluation-focused (requested by grantee)                                  10
  Evaluation-focused (initiated by CSL)                                       5

Main topics discussedb
  Implementation/programmatic issues                                        128
  Intake, enrollment, and consent                                           104
  Grantee-collected data (training, processes, questions)                    95
  Sample size                                                                91
  Administrative data (agreements, processes, questions)                     84
  Staff and staffing issues                                                  57
  IRB                                                                        47
  Fidelity                                                                   44
  Tracking sample members                                                    43
  Systems or collaboration outcomes                                          38
  Treatment and comparison group formation                                   37
  Sample attrition                                                           36
  Random assignment                                                          23
  Baseline equivalence                                                       18
  Outcomes                                                                   15
  Analysis methods/technical questions                                       14
  Cross-over/contamination                                                    2

a Regularly scheduled calls typically addressed evaluation- and program-related topics.
b Calls could include multiple topics.
FPO = federal project officer; CSL = cross-site liaison; PML = program management liaison; IRB = institutional review board.

Calls could cover a number of areas but usually focused on one or two topics (Table II.3). Discussions most often involved study or program intake, enrollment, and consent, as well as grantee-collected data. Issues involving ongoing data collection were also frequent, including sample size (91 calls), administrative data (84 calls), IRB approvals (47 calls), and tracking sample members (43 calls). CSLs also characterized the main topics of some calls as program implementation issues (128 calls) or staffing (57 calls).

B. Peer learning

At the request of grantees and their evaluators, peer learning activities in year 3 took place during sessions at the RPG annual conference, held in July 2015 in Washington, DC. During year 2 of the grant, grantees and evaluators had expressed a desire for peer learning activities at the annual conference, because webinars and work groups kept them busy at other times, especially as grantees rolled out their programs and began data collection. The format of the conference provided two full days for participants in RPG2 and RPG3 to meet. Mathematica and its subcontractor WRMA provided one-on-one assistance and information on all aspects of the ESL and OAISIS during an "office hour" that took place each day during three sessions.

In addition, Mathematica coordinated three breakout sessions on the following evaluation topics:

1. Partnering for the cost study, trauma-focused research, and other research opportunities. Debra Strong, the cross-site evaluation project director, and Lareina LaFlaire, an expert in trauma and a member of Mathematica's cost study team, described a working paper on how RPG grantees are addressing trauma in their programs (Strong et al. 2015). They then presented information about a cost study of trauma-specific EBPs and opportunities for evaluators and grantees to participate. Mathematica then led a discussion of other opportunities for RPG evaluators to partner on research related to the RPG program, including publishing, presentations, or shared data and analyses.

2. Evaluation-focused peer-to-peer sharing. Elizabeth Cavadel of Mathematica, a CSL who works with three grantees, moderated a discussion in which evaluators shared their evaluation progress, highlighting how their current activities differed from their original evaluation plans. RPG2 and RPG3 evaluators each discussed what they most hope to learn from their evaluation, one challenge that they have overcome, and one challenge that they have not yet solved.

3. Adapting evaluations to real-world challenges. Sarah Avellar of Mathematica, a principal investigator for the cross-site evaluation who leads its evaluation TA component, moderated this session and provided an overview of common evaluation challenges for RPG projects, such as forming and recruiting a comparison group, low enrollment, attrition, and unmatched comparison groups. Russ Cole of Mathematica, an expert on evaluation design, presented information on propensity score matching methods for assessing comparability. Grantees then discussed ways to approach common challenges.
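The basic idea behind propensity score matching can be illustrated with a minimal sketch. The code below is not the method presented at the session; it assumes a simple logistic model fit by gradient ascent and one-to-one nearest-neighbor matching without replacement, and all function names are hypothetical.

```python
import numpy as np

def estimate_propensity(X, treated, lr=0.1, steps=2000):
    """Estimate P(treatment | X) with a simple logistic regression
    fit by gradient ascent on the log-likelihood (illustrative only)."""
    X1 = np.column_stack([np.ones(len(X)), X])  # add an intercept column
    w = np.zeros(X1.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X1 @ w))
        w += lr * X1.T @ (treated - p) / len(X)
    return 1.0 / (1.0 + np.exp(-X1 @ w))

def nearest_neighbor_match(scores, treated):
    """Pair each treated case with the comparison case whose propensity
    score is closest, without replacement; returns (treated, comparison)
    index pairs."""
    comparison_pool = list(np.where(treated == 0)[0])
    pairs = []
    for i in np.where(treated == 1)[0]:
        j = min(comparison_pool, key=lambda c: abs(scores[i] - scores[c]))
        pairs.append((int(i), int(j)))
        comparison_pool.remove(j)
    return pairs
```

In practice, analysts would also check baseline equivalence of the matched groups and might apply a caliper to discard poor matches; this sketch omits those steps.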

C. Collaborating with NCSACW

NCSACW provided programmatic TA to grantees and was an important partner with Mathematica in supporting grantees in the cross-site evaluation. As described earlier, the FPOs and liaisons held joint calls with grantees (typically monthly) to monitor progress, identify concerns, and troubleshoot with the teams. FPOs and the liaisons often met separately before or after the monitoring calls to strategize or debrief, respectively. As Table II.3 shows, the teams held 63 check-in calls during year 3.

In addition, Mathematica, NCSACW, and CB leadership consulted with each other regularly via email and scheduled calls. These calls provided an opportunity to draw on each team's strengths in sharing information, developing plans to support grantees, and coordinating schedules for webinars and other overlapping activities. At CB's request, NCSACW also reviewed materials for the cross-site evaluation. During year 3, NCSACW provided comments, input, or text on the following cross-site evaluation products:

• Frontline staff survey instrument

• Partner survey instrument

• Working paper on how RPG grantees are addressing trauma in their programs

• Third Report to Congress


III. COLLECTING DATA

By the end of year 3, RPG grantees and the cross-site evaluation team had collected participant, service, and outcome data on more than 1,400 cases, along with survey responses from nearly 100 RPG staff and more than 150 partner representatives, to inform the cross-site implementation and outcome studies. This accomplishment was significant not only for the cross-site evaluation team, but also for the grantees, who both collected data and responded to staff and partner surveys. The cross-site evaluation team will use these data to answer research questions about the people RPG projects served; the EBPs RPG participants received, how projects implemented those EBPs, and the supports in place to facilitate implementation; participant outcomes; and how partners worked together.

Mathematica also began planning site visits to all 17 grantees during year 3. The visits, which will take place in year 4, will seek information on RPG partnerships, infrastructure, service delivery, and sustainability from the perspectives of multiple grantee staff and partners.

This chapter summarizes cross-site evaluation data collection activities during year 3. Section A discusses the participant characteristics, service, and outcome data that grantees collected and submitted to the cross-site evaluation team through two linked data collection systems created for the project. Year 3 represented a midpoint in data collection, which began in year 2 and will continue through the end of year 5. Section B discusses the surveys that grantees and their partners completed during spring 2015. Section C discusses planning for the site visits that Mathematica began conducting in fall 2015.

A. Participant characteristics, service, and outcome data

The RPG cross-site implementation and outcome studies rely on detailed data about RPG participants and services. The studies will report on the characteristics and outcomes of the individual adults and children RPG projects serve, as well as the EBPs each case receives. By the end of year 3, grantees had submitted detailed data on more than 1,400 cases comprising thousands of participants. The cross-site evaluation used these data in the Third Report to Congress (U.S. Department of Health and Human Services, forthcoming) to describe the people RPG projects served, the EBPs they received, and their outcomes in five domains.

Grantees collected the data using data collection systems and processes that the cross-site evaluation team developed during years 1 and 2. As cases enrolled in their RPG projects and in individual EBPs, grantees entered information on participant characteristics into the ESL. Grantees collected outcome data in two ways. They used standardized instruments at RPG entry and exit to collect measures of child well-being, family functioning, and recovery. They also worked with state and local agencies to obtain administrative data on safety, permanency, and recovery. After entering cases into the ESL, grantees uploaded outcome data for those cases into OAISIS twice, in October 2014 and April 2015. The Second Annual Report (Strong et al. 2015) offers more detail about the instruments, the ESL, and OAISIS.

The remainder of this section discusses the participant, service, and outcome data collected in year 3.


1. Participant characteristics and services

By the end of year 3 (September 30, 2015), the ESL had been open for 19 months, and grantees had entered data on more than 1,400 cases. The cross-site evaluation team had received ESL data from all 17 grantees.

Grantees entered into the ESL several types of data on participant characteristics and services. For each case—defined as a family, a household, or a group of individuals enrolling together in the RPG program—grantees created a record that linked case members and the services they received. Grantees recorded each case’s entry into and exit from its RPG project, characteristics of the adults and children within those cases, case members’ enrollment into specific EBPs, and information about individual sessions in which certain EBPs were delivered. Grantees entered these data throughout the course of cases’ involvement in their RPG projects:

• Case enrollment and exit. When cases enrolled in RPG, grantees recorded the date of entry, all case members (identified by grantee-generated ID numbers to protect privacy), and the familial or non-familial relationships between or among case members. When cases exited their RPG projects, grantees recorded the date of exit and whether the case was closing due to successful completion or for another reason.

• Individual characteristics. Grantees also recorded each case member’s demographic and socioeconomic characteristics at the time of enrollment, including birth date, gender, race/ethnicity, and current residence of all case members, and the education, employment, income, and relationship status of all adult case members.

• Receipt of EBPs. As case members began receiving EBPs, grantees recorded the enrollment and exit dates for all EBPs offered as part of their RPG projects, which case members enrolled in the EBP, and which staff provided the service.

• Session data for "focal" EBPs.5 For a subset of focal EBPs that the cross-site evaluation will study in greater depth, grantees recorded the details of each session in which participants received the EBP. They also recorded a measure of participant engagement throughout service delivery. Session details included duration of the session, case members in attendance, topics covered, and a staff member's assessment of the session quality.
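The record types above can be pictured as a small data model linking cases, case members, and EBP enrollments. This is only an illustrative sketch; the class and field names are hypothetical and do not reflect the actual ESL schema.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

@dataclass
class CaseMember:
    """One adult or child in an RPG case (IDs are grantee-generated)."""
    member_id: str
    birth_date: date
    gender: str
    race_ethnicity: str

@dataclass
class EBPEnrollment:
    """A case member's enrollment in one evidence-based program."""
    ebp_name: str
    member_id: str
    enroll_date: date
    exit_date: Optional[date] = None  # None while still enrolled

@dataclass
class Case:
    """A family or household enrolled together in an RPG project."""
    case_id: str
    entry_date: date
    members: List[CaseMember] = field(default_factory=list)
    ebp_enrollments: List[EBPEnrollment] = field(default_factory=list)
    exit_date: Optional[date] = None
    exit_reason: Optional[str] = None
```

The key design point, as in the ESL itself, is that the case record links members to the services they receive, so service receipt can be analyzed at either the case or the individual level.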

In addition to collecting data on cases receiving RPG services, grantees participating in the cross-site impact study also collected data on comparison group cases.

Table III.1 shows the types of records collected in the ESL and the cumulative number of each type of record collected by the end of year 3.

5 Grantees proposed a total of 51 EBPs—more than the cross-site evaluation could feasibly study in detail without imposing an enormous burden on grantees. Therefore, CB selected a subset of 10 "focal" EBPs for collection of in-depth data, using four criteria: (1) the EBPs should represent to the extent feasible a range of programs the grantees are implementing; (2) each EBP should be session-based, offering information about the sessions; (3) each EBP should be implemented by at least two grantees as a primary service of their project; and (4) all grantees should be implementing at least one focal EBP.


Table III.1. Types of records collected in the ESL and cumulative number of records collected through September 30, 2015

Type of Record                                       Number of Records

Case enrollment into RPG                                   1,411
Individual participant                                     4,237
Enrollment of a case into an EBP                           2,106
Session in which a focal EBP was delivered                 9,817

Source: RPG Enrollment and Service Log Data.

2. Outcomes

To account for the complexity of children's family contexts, CB selected measures for the cross-site evaluation to paint a broad portrait of adult substance use disorder and treatment, caregiver and adult functioning (referred to as "family functioning" in the cross-site evaluation), and child well-being in RPG cases. In year 3 of the cross-site evaluation, grantees submitted data describing the well-being, permanency, and safety risks of children and the recovery and family functioning risks of adults at baseline. This information included data from 12 valid and reliable standardized instruments administered to adults before they received RPG services, as well as administrative records on participants in the year prior to programming (Table III.2). Understanding the starting point of RPG families will be critical to evaluating their growth through participation in RPG.

Grantees submitted outcome data every six months; in year 3, the submission windows occurred in October 2014 and April 2015. During these periods, the OAISIS system was open to accept grantee submissions of outcome data (both standardized instruments and administrative data). We instructed grantees to submit cumulative records at each upload period, so that each grantee's upload for a given data source represented all of the data that grantee had collected to date. The October 2014 upload was the first time grantees submitted data to the OAISIS system. Although grantees had few records to upload at that time, every grantee was able to submit some data.

At the end of the October 2014 data submission window, we had received at least one data element from each of the 16 grantees that had collected data by that point. More specifically, we received 75 percent of the standardized instrument data elements we expected based on the populations grantees were serving. (Some grantees do not serve all age groups of children, and each instrument specifies an age range, so we would not expect those grantees to administer all of the standardized instruments.) We also received administrative data from 12 grantees. This result was very positive for the initial upload period, the primary goal of which was to give grantees familiarity and practice with the data submission process.
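The notion of "expected" standardized instrument data elements can be sketched as a simple age-eligibility check: a focal child's age determines which child-focused instruments from Table III.2 a grantee would be expected to administer. The month boundaries below are one possible reading of the ranges in Table III.2, not the evaluation's actual rules.

```python
# Age eligibility in months for the child-focused instruments listed in
# Table III.2 (the inclusive boundary conventions are an assumption).
CHILD_INSTRUMENTS = {
    "TSCYC": (36, 144),            # 3-12 years
    "BRIEF-P": (24, 60),           # 2-5 years
    "BRIEF": (60, 216),            # 5-18 years
    "CBCL-Preschool": (18, 60),    # 18-60 months
    "CBCL-School Age": (72, 216),  # 6-18 years
    "ITSP": (0, 36),               # birth to 36 months
    "Vineland-II": (0, 216),       # all focal children
}

def expected_instruments(age_months):
    """Instruments a grantee would be expected to administer for a
    focal child of the given age (in months)."""
    return sorted(name for name, (lo, hi) in CHILD_INSTRUMENTS.items()
                  if lo <= age_months <= hi)
```

For example, a 30-month-old would fall inside the ranges for the BRIEF-P, CBCL-Preschool Form, ITSP, and Vineland-II, but not the TSCYC; a grantee missing those four records for that child would be counted as having missing expected elements.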


Table III.2. Outcome data sources

Construct                        Standardized Instrument or Administrative Data                                           Target Sample

Child well-being
  Trauma symptoms                Trauma Symptom Checklist for Young Children (TSCYC; Briere et al. 2001)                  3–12 years
  Executive functioning          Behavior Rating Inventory of Executive Function–Preschool (BRIEF-P;
                                 Gioia et al. 2000)                                                                      2–5 years
                                 Behavior Rating Inventory of Executive Function (BRIEF)                                 5–18 years
  Child behavior                 Child Behavior Checklist (CBCL)–Preschool Form (Achenbach and
                                 Rescorla 2000, 2001)                                                                    18–60 months
                                 CBCL–School Age Form                                                                    6–18 years
  Sensory processing             Infant-Toddler Sensory Profile (ITSP; Dunn 2002)                                        Birth to 36 months
  Social and adaptive behavior   Socialization Subscale, Vineland Adaptive Behavior Scales, Second
                                 Edition, Parent/Caregiver Rating Form (Vineland-II; Sparrow et al. 2005)                All focal children

Child permanency
  Removals and placements        Administrative records indicating the date and types of placements
                                 outside of the home and subsequent reunification with the family                        All focal children

Child safety
  Maltreatment                   Administrative records indicating the date of incident, type of
                                 maltreatment, and whether the reported maltreatment was substantiated                   All focal children

Adult recovery
  Substance use severity         Addiction Severity Index, Self-Report Form (ASI-SR; McLellan et al. 1992)               All recovery domain adults
  Trauma symptoms                Trauma Symptom Checklist-40 (TSC-40; Briere and Runtz 1989)                             All recovery domain adults
  Enrollment in treatment        Administrative records indicating the date of enrollment, termination,
                                 and completion (if applicable) of publicly funded treatment programs                    All recovery domain adults

Family functioning/stability
  Depressive symptoms            Center for Epidemiologic Studies Depression Scale (CES-D), 12-Item
                                 Short Form (Radloff 1977)                                                               All family functioning domain adults
  Parenting skills               Adult-Adolescent Parenting Inventory (AAPI-2; Bavolek and Keene 1999)                   All family functioning domain adults
  Parental stress                Parenting Stress Index–Short Form (PSI-SF; Abidin 1995)                                 All family functioning domain adults

The April 2015 data submission window was the more important one from which to obtain complete data, as this cumulative upload was intended to inform the Third Report to Congress. By the end of the window, we had received 95 percent of expected data elements from grantees, enabling the cross-site team to present a nearly complete picture of participant outcomes in the Third Report to Congress (Table III.3).


Table III.3. Percentage of expected standardized instrument and administrative data elements submitted in April 2015

Grantee                                                               Percentage

Center Point, California                                                  100
Georgia State University Research Foundation                              100
Judicial Branch, State of Iowa                                            100
Northwest Iowa Mental Health Center/Seasons Center                         88
Children's Research Triangle, Illinois                                    100
Kentucky Department for Community Based Services                          100
Commonwealth of Massachusetts                                             100
Families and Children Together, Maine                                     100
Alternative Opportunities, Missouri                                       100
The Center for Children and Families, Montana                              94
State of Nevada Division of Child and Family Services                      64
Summit County Children Services, Ohio                                      71
Oklahoma Department of Mental Health and Substance Abuse Services         100
Health Federation of Philadelphia                                          93
Helen Ross McNabb Center, Tennessee                                       100
Tennessee Department of Mental Health and Substance Abuse Services        100
Rockingham Memorial Hospital, Virginia                                    100

Total                                                                      95

Source: RPG OAISIS data uploaded in April 2015.

To track grantee upload progress and provide formative feedback to grantee data administrators, we assessed the upload status of the system weekly. We sent each grantee weekly status report emails that summarized data elements that had been successfully uploaded, data elements that the system rejected (for example, invalid birth date information), and data elements that had not been uploaded. By providing grantees with regular feedback on their upload status and one-on-one TA as needed to support them in their uploads, we were able to help grantees achieve very high data submission rates.
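The weekly feedback loop described above can be sketched as a small status check. The birth-date rule and record layout below are hypothetical stand-ins for OAISIS's actual validation logic, and the function names are illustrative.

```python
from datetime import date, datetime

def valid_birth_date(value, as_of=date(2015, 4, 30)):
    """Hypothetical validation rule: a well-formed ISO date that is
    neither in the future nor implausibly old."""
    try:
        d = datetime.strptime(value, "%Y-%m-%d").date()
    except (TypeError, ValueError):
        return False
    return date(1900, 1, 1) <= d <= as_of

def upload_status(expected_ids, received):
    """Summarize one grantee's upload: elements accepted, rejected by
    validation (e.g., an invalid birth date), and never uploaded."""
    accepted, rejected = [], []
    for element_id, birth_date in received.items():
        (accepted if valid_birth_date(birth_date) else rejected).append(element_id)
    missing = [e for e in expected_ids if e not in received]
    return {"accepted": sorted(accepted),
            "rejected": sorted(rejected),
            "missing": sorted(missing)}
```

A weekly report built from such a summary tells a grantee's data administrator exactly which records to fix or supply before the window closes, which is the feedback pattern the cross-site team used.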

B. Surveying partners and EBP staff

To obtain a systematic picture of RPG-funded program implementation directly from those involved in RPG, we conducted two surveys in April 2015. One was administered to representatives of RPG partner organizations, and the other to frontline and supervisory staff delivering EBPs to RPG participants. The information provided by EBP staff contributes significantly to our analysis of staff characteristics, attitudes toward and experiences with implementing EBPs, organizational characteristics, and staff supports. Partner survey data are essential to understanding several key aspects of the role of partnerships in RPG implementation, such as the quality of collaboration and the goals of RPG partnerships. Below, we describe sample member recruitment and outreach, and grantee response rates.

1. Survey recruitment and outreach

As a first step, we worked with grantees to identify appropriate survey participants. During this process, we encountered some challenges and strategized about how to address them before launching the surveys. For example, some grantees expressed concerns that representatives of some partnering organizations listed on their Semi-Annual Progress Reports (SAPRs) would not be knowledgeable enough about the RPG partnership to participate in the survey, particularly organizations that were not considered "primary" or "close" partners. We responded by emphasizing our interest in the perspectives of all partners (including those considered "secondary" or informal) and requesting that FPOs and PMLs review the lists of partnerships and confirm which organizations should remain on the list. The overall sample that resulted from this process was 117 staff and 208 partner representatives.

Throughout the field period, we regularly encouraged sample members to complete the surveys using several approaches (for example, reminder calls to grantees whose partner survey response rates remained below 70 percent after the final reminder email).

2. Response rates

We used several strategies to maximize response rates. For example, we engaged grantee contacts and tailored our communications with respondents to stress the importance and value of the surveys. As a result of these efforts and the willingness of respondents, many grantees achieved high response rates for both surveys. Nearly half of grantees achieved a response rate of 100 percent for the partner survey. In some cases, however, response rates were low for several reasons, including incorrect email addresses and contacts who had left their organizations (the latter particularly affected the partner survey). In total, grantees achieved average response rates of 87 percent for the staff survey and 75 percent for the partner survey (Tables III.4 and III.5).

Table III.4. Final response rates by grantee for the staff survey

Grantee                                                             Number Completed   Sample Size   Response Rate (%)

Center Point, California                                                    5               6               83
Georgia State University Research Foundation                                2               5               40
Judicial Branch, State of Iowa                                              7               8               88
Northwest Iowa Mental Health Center/Seasons Center                          6               7              100
Children's Research Triangle, Illinois                                      5               5              100
Kentucky Department for Community Based Services                            4               5               80
Commonwealth of Massachusetts                                               5               6              100
Families and Children Together, Maine                                       2               2              100
Alternative Opportunities, Missouri                                        11              11              100
The Center for Children and Families, Montana                               2               4               50
State of Nevada Division of Child and Family Services                       3               3              100
Summit County Children Services, Ohio                                       9              12               75
Oklahoma Department of Mental Health and Substance Abuse Services           9              12               90
Health Federation of Philadelphia                                           3               4               75
Helen Ross McNabb Center, Tennessee                                        14              17               93
Tennessee Department of Mental Health and Substance Abuse Services          5               5              100
Rockingham Memorial Hospital, Virginia                                      5               5              100

Total                                                                      97             117               87

Note: Final grantee and overall response rates for the staff survey were calculated based on the total sample size minus staff deemed to be ineligible (for example, no longer working for the organization or identified as delivering an EBP in error), which left 111 respondents.


Table III.5. Final response rates by grantee for partner survey

Grantee                                                             Number Completed   Sample Size   Response Rate (%)

Center Point, California                                                    6               8               75
Georgia State University Research Foundation                                7               8               88
Judicial Branch, State of Iowa                                             11              14               79
Northwest Iowa Mental Health Center/Seasons Center                          5               7               71
Children's Research Triangle, Illinois                                      3               4               75
Kentucky Department for Community Based Services                            6               7               86
Commonwealth of Massachusetts                                              17              23               74
Families and Children Together, Maine                                      17              22               77
Alternative Opportunities, Missouri                                        18              24               75
The Center for Children and Families, Montana                              10              12               83
State of Nevada Division of Child and Family Services                       9              19               47
Summit County Children Services, Ohio                                       7               7              100
Oklahoma Department of Mental Health and Substance Abuse Services           4               5               80
Health Federation of Philadelphia                                           7               7              100
Helen Ross McNabb Center, Tennessee                                         8              18               44
Tennessee Department of Mental Health and Substance Abuse Services          9               9              100
Rockingham Memorial Hospital, Virginia                                     10              11               91

Total                                                                     154             205               75

C. Planning site visits

In-person discussion with staff about their experiences provides an invaluable opportunity to understand the complexities and context of grant projects. In year 3, Mathematica began preparing to visit all 17 RPG2 grantees. The goal of the site visits is to understand each grantee’s RPG planning process, how and why particular EBPs were selected, the implementation system’s ability to support quality implementation of the focal EBPs, and the implementation experiences of grantees and their partners. To prepare for the visits, Mathematica (1) defined the scope of the visits, (2) drafted interview discussion guides, (3) conducted a pilot visit, and (4) trained site visitors.

Setting the scope of visits. The structure and complexity of grantees' projects varied with the number of EBPs being implemented, the number of settings in which the EBPs are delivered, and the number of agencies delivering them. Mathematica had to balance the desire to be comprehensive with the need to make the visits manageable. For example, most grantees are implementing more than one focal EBP, and some offer six. Within a grantee, the implementation experiences of staff may vary by implementing agency, and by setting within an agency delivering an EBP in multiple locations. Although we sought to learn about staff experiences implementing the focal EBPs within varying organizational structures, it was not feasible to interview staff from all settings about each focal EBP. We therefore used information on the structure of each grantee's project to set the parameters of the visits, deciding to focus on a maximum of two focal EBPs per grantee and to visit no more than two settings within an agency, or two agencies, implementing focal EBPs. For grantees offering more than two focal EBPs, Mathematica strategically selected which to target, balancing the number of interviews for each focal EBP across grantees. The CSLs provided feedback on the proposed site visit plans based on their understanding of their sites.

Developing discussion guides. To ensure site visitors collect consistent data from all sites, Mathematica developed detailed discussion guides based on the topics approved by OMB. The site visits will explore two overarching topics: (1) the structure of the RPG projects, and (2) the processes grantees used to implement their focal EBPs. The RPG design report focused the site visits primarily on EBP implementation. However, as we learned more about the variation in RPG projects, we realized we needed to explore these design differences during the site visits. Thus, we added the following research question: What factors contributed to the design of each grantee’s RPG project?

Each grantee’s project director and key staff, along with those involved in the design and implementation of the RPG project, are key sources for information about the first topic. These interviews capture information on (1) each project’s structure and reasons the projects vary across grantees; (2) the process involved in working across systems on behalf of families; and (3) how the local context affected projects. Individual interviews with the three levels of staff (managers, supervisors, and frontline staff) responsible for implementing the selected focal EBPs will inform the second topic. The objective of these interviews is to learn about the support projects provided for the implementation of their focal EBPs. The topics covered are based on the National Implementation Research Network’s (NIRN) implementation drivers, which identify implementation best practices.

Piloting site visit protocols. Developing quality site visit protocols is an iterative process greatly informed by the opportunity to pilot protocols before their use with all sites. A senior Mathematica researcher conducted the initial RPG site visit in September 2015, with a focus on gauging how well the discussion guides captured the desired information. Based on the pilot experience, we clarified items and eliminated non-essential questions to allow the interviews to be completed within the allotted time.

Training site visitors. We recruited site visitors who were familiar with RPG through their various roles on the RPG cross-site evaluation, so they would have enough context to facilitate the interviews effectively. However, we chose not to assign CSLs to their own grantees, to encourage fresh dialogue about grantee experiences. Following the pilot visit, we held a day and a half of training on the discussion guides and site visit best practices to prepare the visitors to conduct consistent, high-quality visits. The training included an overview of each of the RPG partners—child welfare, substance use disorder treatment, and the courts—to provide context for the projects, as well as discussion of site visit etiquette and strategies for facilitating effective interviews. We reviewed each discussion guide in depth, explaining the purpose of specific questions and providing example responses to ensure consistent data collection. The site visitors asked valuable clarifying questions that we used to refine the discussion guides. The site visits were conducted in fall 2015 and will be described in the next annual report.


IV. ANALYZING PARTICIPANT-LEVEL DATA

In the first two years of RPG, grantees, Mathematica, and CB worked intensively to lay the foundation for data collection and meaningful analysis. Thanks to this preparation, in year 3, Mathematica began analyzing data that grantees had collected and provided to us on their participants. Mathematica analyzed the composition of RPG cases, the demographic and socioeconomic characteristics of participants, enrollment into EBPs, and the well-being of children and adults at baseline for cases enrolled between February 2014 and March 2015.6 These analyses were the source of initial findings for the Third Report to Congress. That report presents a picture of the RPG population at program entry and the initial services that participants received. It also lays the groundwork for more detailed analyses of service delivery and participant outcomes that the cross-site evaluation team will undertake in the future.

This chapter briefly describes how Mathematica analyzed participant-level data, and summarizes the results. Detailed findings are in the RPG Third Report to Congress (HHS, forthcoming). Sections A and B describe analyses of participant characteristics and service data collected through the ESL. Section C describes the analysis of baseline child well-being, family functioning, and recovery data collected through OAISIS. Finally, Section D briefly discusses data quality issues.

A. Participant characteristics

Although most grantees had enrolled small numbers of cases by the cutoff date for analysis, we were able to combine cases to create a picture of RPG cases, children, and adults at program entry (Table IV.1). In all, we examined the characteristics of 625 cases and their members.

1. Analysis methods

Because RPG participants are involved with, or at risk of involvement with, the child welfare system, members of an RPG case may not be members of a traditional family or household. That is, case members are not necessarily biologically or legally related, nor do they necessarily live together. For this reason, Mathematica sought to describe the composition of RPG cases and the relationships between members. After examining cases as a whole, Mathematica analyzed the individual characteristics of the adults and children in those cases, with a focus on the subset of case members on whom outcome data were collected.

To conduct the case-level analysis, Mathematica first categorized cases by the number of members and the relative proportions of children and adults in the case. The analysis also examined how case members were related to each other—for example, whether they were biologically related.
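As an illustration, the case-level categorization step might look like the following sketch. The function name and category labels are hypothetical, not the report's actual taxonomy; the sketch relies on the report's definition that every RPG case includes at least one adult and one child.

```python
def case_category(n_adults, n_children):
    """Categorize an RPG case by size. Every case has at least one adult
    and one child, so a two-member case is always one adult plus one child."""
    total = n_adults + n_children
    if total == 2:
        return "one adult, one child"
    if total <= 4:
        return "three or four members"
    return "more than four members"
```

A relationship analysis (for example, flagging whether members are biologically related) would follow the same per-case pattern.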

6 Forms to collect enrollment and services information for the ESL system first became available in February 2014; the enrollment cutoff for analysis for the Third Report to Congress was March 2015.


Table IV.1. Number of cases analyzed in year 3, by grantee

Grantee                                                              Number of ESL Cases
Center Point, California                                                              37
Georgia State University Research Foundation                                          21
Judicial Branch, State of Iowa                                                        41
Northwest Iowa Mental Health Center/Seasons Center                                    14
Children's Research Triangle, Illinois                                                 9
Kentucky Department for Community Based Services                                      28
Commonwealth of Massachusetts                                                         21
Families and Children Together, Maine                                                 78
Alternative Opportunities, Missouri                                                  142
The Center for Children and Families, Montana                                         23
State of Nevada Division of Child and Family Services                                  8
Summit County Children Services, Ohio                                                 25
Oklahoma Department of Mental Health and Substance Abuse Services                     44
Health Federation of Philadelphia                                                     28
Helen Ross McNabb Center, Tennessee                                                   66
Tennessee Department of Mental Health and Substance Abuse Services                    40
Total                                                                                625

Source: RPG enrollment and service log data.
Note: The ESL includes data collected starting on January 1, 2014. Data from Rockingham Memorial Hospital (Virginia) could not be included but will be provided in future reports to Congress.

Mathematica's next step was to describe the individual characteristics of the members of those cases. For each RPG case, the grantee had selected one "focal child" on whom to collect outcome data, so the analysis describes focal and non-focal children separately. The sample included 1,080 total children, 625 of whom were focal children. Among adults, Mathematica analyzed the characteristics of three overlapping groups: those who reported on outcomes in the family functioning and child well-being domains of the cross-site evaluation, those who reported on outcomes in the recovery domain, and biological parents.7

• Adults who reported on family functioning. One adult per case reported on family functioning and child well-being outcomes through the standardized instruments selected for the cross-site evaluation. This person was usually the focal child’s primary caregiver. In some cases, grantees did not have contact with an adult who could report on family functioning, so we were able to include in this analysis a subset of 583 individuals out of the 625 cases.

• Adults who reported on substance use and recovery. One adult in each case reported on substance use through the instruments described below. In a few cases, grantees did not have contact with an adult who met the criteria for inclusion in the recovery domain, so we were able to include in this analysis 617 individuals from the 625 cases. Within one case, the same adult could report on both family functioning and substance use, and in 87 percent of the 583 cases that included both a family functioning and a substance use reporter, the same person filled both roles.

7 Adults reporting on family functioning were asked to report on the well-being of the focal child only if the focal child had been in their care for the past 30 days.

• Biological parents. Most cases (604) included at least one of the focal child's biological parents, and information about these parents' circumstances and characteristics sheds light on the situations of the focal children. Mathematica therefore analyzed the characteristics of one biological parent per case, for each case that included one. If a case included both biological parents, the analysis included the parent who reported on family functioning outcomes or, if neither did, the parent who reported on substance use. This group overlapped substantially with the first two: 94 percent of adults who reported on family functioning and 97 percent of adults who reported on substance use were also the focal child's biological parent.

2. Results highlighted in the Third Report to Congress

Because RPG addresses needs of children at risk due to potential or actual substance use disorder by an adult close to them, by definition each case included at least one adult and one child.8 Most case members were biologically related. Analysis showed that most children in RPG cases were age eight or younger, and many adults served by RPG also faced economic challenges.

Case composition. Half of the cases had just two members (one adult and one child), and only 15 percent had more than four members. Most case members were biologically related. For example, 94 percent of cases with two members consisted of a child and his or her biological parent, and in 97 percent of cases with more than one child, the children were biological siblings.

Children. Eighty-one percent of focal children were age 8 or younger, and most were white, non-Hispanic, and English-speaking. These demographic characteristics reflected the fact that the composition of the sample was heavily influenced by grantees serving areas with mostly white, non-Hispanic populations. Half of the children lived in the case members' primary residence at the time of enrollment into RPG, while 28 percent lived in a foster parent's residence. Non-focal children were similar to focal children but slightly older, with an average age of six years, compared with five years among focal children.

Adults. Information on biological parents' income and employment showed that many parents in RPG faced financial hardship. For example, 73 percent of biological parents reported earning less than $10,000 in the year prior to enrolling, and 19 percent reported no income from any source. Fifty percent of parents reported being unemployed. In addition, 59 percent of parents were neither married nor cohabiting with a partner at the time of RPG enrollment.

8 Some cases included foster parents, because some children served by RPG were in foster care. In such cases, the foster parents were part of the case only because of their relationship with one or more children in the case, not because they had, or were suspected of having, substance use disorder. In one percent of cases, a foster parent was the only adult in the case, because one grantee worked with children in an alternative foster care system but did not provide services to their families of origin.


B. EBP enrollment

Compared with the first round of RPG grants made in 2007, the 2012 RPG program placed greater emphasis on using evidence-based or evidence-informed programs and practices in the RPG projects. Although detailed analysis of services received through these EBPs will begin later in the evaluation, in year 3 Mathematica examined cases' enrollment into EBPs.

1. Analysis methods

To understand enrollment into EBPs, Mathematica first calculated the percentage of RPG cases in which members had also been enrolled into at least one EBP by the end of the reporting period. For those who had been enrolled in at least one, we calculated the number and types of EBPs in which they were enrolled.

2. Results highlighted in the Third Report to Congress

Enrollment into EBPs. Most commonly, cases had enrolled into one or two EBPs by March 2015. Among grantees offering more than one EBP, cases were more likely to enroll in a subset of the EBPs offered than in all of them.

Types of EBPs. Most grantees (12) implemented EBPs focused on strengthening families and improving parenting skills, and about half of the cases they served (52 percent) had enrolled in this type of EBP. EBPs intended to respond to trauma were the next most common: 10 grantees had enrolled cases in them, and 18 percent of their cases were enrolled in a trauma-response EBP.

C. Well-being of children and families at baseline

The objective of RPG is to safeguard and improve the well-being of children and families. To measure these outcomes, grantees collect data at enrollment (baseline) and again after program completion. For the Third Report to Congress, we used baseline data to provide an overview of the initial status of children and adults enrolled in RPG programming in the five outcome domains of interest to CB. The report addressed two research questions:

1. What was the well-being, permanency, and safety status of children prior to the start of RPG?

2. What was the recovery and family functioning status of adults prior to the start of RPG?

1. Analysis methods

We examined two key sources of data to provide information about the risk and protective factors of our sample prior to receiving services: (1) standardized instruments administered at baseline, used to measure well-being, family functioning, and adult recovery; and (2) administrative data, used to measure child safety and permanency and adult participation in substance use disorder treatment.

To analyze data from standardized instruments, we calculated scale scores following the guidelines provided in the scoring manuals or instructions published for each instrument. We then calculated mean scores and their standard deviations, and compared them with scores for nationally representative samples. We also examined the percentage of our sample with scores that would place them in a high "risk category" for potential negative outcomes. For example, the Parenting Stress Index–Short Form (PSI–SF) scoring manual describes parents with total scores above 90 as experiencing "clinically significant levels of stress." Therefore, we categorized respondents with scores of 90 or higher in the high-risk category for parenting stress.
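The categorization rule for the PSI–SF can be sketched as follows. The function and constant names are hypothetical, and the sketch follows the "90 or higher" rule used in the analysis, not any publisher code.

```python
PSI_SF_CUTOFF = 90  # scores at or above this were placed in the high-risk category

def psi_risk_category(total_score, cutoff=PSI_SF_CUTOFF):
    """Assign a PSI-SF total score to a risk category; None marks a missing score."""
    if total_score is None:
        return "missing"
    return "high risk" if total_score >= cutoff else "not high risk"
```

The same pattern — score the instrument per its manual, then compare against a published cutoff — applies to the other standardized instruments.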

For administrative data, we examined a one-year window prior to each child's or adult's enrollment in an RPG project and analyzed relevant events (such as removal of a child from the home, or an adult's completion of substance use disorder treatment) reported in the data during that window. We then described prevalence rates for each event.
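The one-year look-back might be sketched as below. The record layout, field names, and date logic are illustrative assumptions, not the actual administrative data format.

```python
from datetime import date, timedelta

def in_lookback(event_date, enrollment_date, days=365):
    """True if an event falls within the year before enrollment."""
    return enrollment_date - timedelta(days=days) <= event_date < enrollment_date

def prevalence(cases, event_key="removal_dates"):
    """Share of cases with at least one qualifying event in the window."""
    hits = sum(
        1 for c in cases
        if any(in_lookback(d, c["enrolled"]) for d in c.get(event_key, []))
    )
    return hits / len(cases)
```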

2. Results highlighted in the Third Report to Congress

The report highlights that focal children and their families had high risks in some areas at program entry but lower risks than might be expected in other key areas. Although many children had documented maltreatment and previous experience in the child welfare system, children had relatively low levels of risk in terms of behavioral problems (children ages 1.5–5) and trauma symptoms (children ages 3–12). Adults demonstrated relatively low levels of substance use but struggled with high levels of parenting stress.

Children’s safety and permanency at baseline. Projects enrolled some children with documented maltreatment and previous experience in the child welfare system.

• Thirty-one percent of the focal children in the sample had one or more substantiated episodes of maltreatment in the year prior to enrollment in RPG.

• In addition, 26 percent of focal children were removed from their homes at some point during the year prior to RPG enrollment (not including those who may already have been living outside of the home by the beginning of the period).

• Of the 148 children who were removed from their homes during the year prior to RPG enrollment, 15 percent (22 children) were subsequently reunited with their families during the time period examined, and one other child was placed in a permanent setting.

Children’s well-being at baseline. At enrollment, RPG children were at higher risk than national samples of children in some, but not all, areas of well-being.

• Thirty-one percent of focal children aged 0 to 6 months, and 43 percent aged 7 to 36 months, fell into the high-risk category for sensory processing. Their scores indicated that they over-responded or under-responded to their environments, indicating possible risks for atypical child development.

• Compared with national samples of children, focal children in RPG exhibited limitations in their executive functioning. These children had greater difficulties with tasks such as controlling their impulses, solving problems, and planning. However, many RPG children looked similar to the national population in terms of executive functioning, and many scored as well as or better than the national mean.

• The levels of emotional, behavioral, and associated problems among RPG children were mixed. School-aged children's levels, on average, were consistently but only slightly above the national means on behavioral and emotional problems. Preschoolers scored close to or below the national mean.

• RPG focal children scored slightly lower on socialization skills (defined as the performance of daily activities required for personal and social sufficiency) relative to peers nationally. The mean socialization score for RPG focal children was 89 compared with a national mean of 100. Eighty-three percent of children scored within the normal range of socialization scores for their age group, which indicates that many children were developing positive socialization skills.

• Children between ages 3 and 12 years in the RPG sample exhibited few signs of post-traumatic stress disorder (PTSD), an anxiety condition brought on by experiencing one or more traumatic events such as abuse. On a scale measuring trauma symptoms associated with PTSD, the mean score for RPG focal children was 37 compared with a national mean of 50. However, 37 percent of RPG focal children had scores indicating a high level of risk for PTSD.

Adult recovery at baseline. A total of 37 percent of RPG adults exhibited high severity of either drug or alcohol use or of both in the past 30 days (Table IV.2). Severe drug use was much more prevalent than severe alcohol use, with 36 percent reporting severe drug use, and 7 percent severe alcohol use. Four percent reported severe use of both drugs and alcohol.9

Table IV.2. Substance use among adults prior to RPG enrollment

Baseline Scale Sample Sizea Percentage of Adults in High-Severity Category

Drug use 349 36b

Alcohol use 329 7b

Use of drugs or alcohol or bothb 380 37

Source: RPG baseline administration of Addiction Severity Index-Self Report. a Sample sizes vary by measure due to instrument or item nonresponse. b A total of 380 adults completed the alcohol use scale, the drug use scale, or both. The percentage of adults in the high-severity category is calculated relative to the number with complete data for a given type of substance use.

At least 20 percent of adult RPG participants had been in one or more publicly funded substance use disorder treatment programs during the year prior to their enrollment in RPG. Of the 112 adults in this category, 30 (27 percent) completed at least one treatment program during that period.

Family functioning at baseline. Families in RPG face challenges related to adult stress, depression, and parenting attitudes.

• Nearly all primary caregivers of RPG children experienced elevated levels of parenting stress, and their mean score for parenting stress substantially exceeded the national mean as measured by the PSI–SF (Figure IV.1). Parenting stress contributes to dysfunctional parenting and is associated with potential for child maltreatment (Belsky 1984; Berger 2004). There is also a significant association between stress and substance use disorders: studies have shown that people exposed to stress are more likely to misuse substances, or to relapse after treatment (Najavits et al. 1997; Substance Abuse and Mental Health Services Administration 2013).

9 We used the mean for people in substance use disorder treatment settings from the nationally representative sample described in McLellan et al. (2006) as the cutoff score indicating a high level of severity of substance use, because a score above this mean reflects heavier use than the average among people already in treatment.

• On average, RPG adults also reported levels of depressive symptoms that are higher than observed in the general population. Among the adult respondents, 38 percent exhibited symptoms of severe depression. Compared with a national mean of 9.3, RPG adults scored a mean of 12.2 on a scale measuring depressive symptoms.

• Most RPG adults reported at least one parenting attitude or behavior, such as lack of empathy or use of corporal punishment, that placed their children at risk for maltreatment. However, most adults also held one or more positive parenting attitudes, a potential protective factor for their children's safety.

Figure IV.1. Distribution of scores on the PSI-SF for RPG adults compared with the national mean

Note: In this figure, the distribution of parenting stress scores for the RPG sample is shown by the yellow histogram on the right-hand side, which is centered on the RPG sample mean score of 141. The height of each bar represents the proportion of the sample with scores in a given range. For example, about 16 percent of the sample has scores between 143 and 151. A red bell curve is overlaid on the histogram. The curve is centered on the national mean score of 69 and represents the distribution of normalized scores for a general population.


D. Data quality

Hallmarks of the cross-site evaluation for RPG2 are the collection of detailed, real-time enrollment and services data and the grantees' use of a common set of instruments to measure outcomes for people in their programs. Grantees provide these data through two web-based data collection systems developed for RPG2: the ESL and OAISIS. Over the year, we worked to improve the ESL to better match how grantees were using the system. In examining the data, we also learned that some grantees were still unclear about what data to submit and were having difficulty obtaining complete versions of selected instruments. One challenge for grantees who wanted to use baseline data was the limited availability of copyrighted scoring information and, in a few cases, a lack of expertise in scoring and using selected instruments. Mathematica worked with grantees and with WRMA and Synergy Enterprises to identify and address these problems. As a result of these experiences, we also plan more outreach and individualized TA during the coming year to help ensure the completeness and quality of data.

1. ESL-related data

System features. Grantees began using the ESL forms in February 2014, and the system came online that June. However, year 3 was a period of far heavier use of the system, as the pace of enrollment for some grantees increased and their implementation of EBPs advanced. As grantees used the system more regularly and submitted questions or concerns via the RPG help desk, some remaining glitches became apparent. In addition, some features of the system were inconvenient, given the way some grantees entered data. For example:

• A feature of the system intended to ensure that case identification numbers were unique within each grantee’s data was programmed incorrectly, so that it produced an error message if a user entered a case identification number that another grantee had already entered. As a result, grantees with similar case-numbering systems had difficulty entering cases.

• The ESL was designed to collect detailed information on each session of the focal EBPs. To associate these data with the individual caseworkers or therapists who delivered the session, we designed the ESL assuming that those individuals would enter the data themselves—and requiring them to do so. That is, only users logged in to a caseworker's ESL account could enter session data. However, some grantees found that it was not practical for their participant-serving staff to enter session data directly into the ESL. Instead, staff recorded data on paper forms, and another staff member later entered the data into the ESL. Data entry staff would have to log in and out of different user accounts, a cumbersome process.
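The scoping error in the case-ID check can be illustrated with a small sketch; the data layout and function names are hypothetical, not the ESL's actual implementation.

```python
# Cases from all grantees stored in one shared table (illustrative layout).
cases = [
    {"grantee_id": 1, "case_id": "A-001"},
    {"grantee_id": 2, "case_id": "B-001"},
]

def is_unique_buggy(case_id, grantee_id, existing=cases):
    # Bug: compares the new ID against every grantee's cases.
    return all(c["case_id"] != case_id for c in existing)

def is_unique_fixed(case_id, grantee_id, existing=cases):
    # Fix: restrict the uniqueness check to the entering grantee's own cases.
    return all(
        c["case_id"] != case_id
        for c in existing
        if c["grantee_id"] == grantee_id
    )
```

With the buggy check, a grantee reusing an ID pattern another grantee had already used would see a spurious error; the fixed check accepts it.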

In response to these and other issues, Mathematica and Synergy implemented five system upgrades throughout year 3. These upgrades resolved glitches and implemented system improvements. For example, Mathematica and Synergy added a feature to the ESL in September 2015 that allowed the user to select which caseworker to associate with session data. The final upgrade, to version 1.7, occurred in September 2015.

Grantees and Mathematica staff identified other upgrades that could have made the system more convenient. For example, for focal EBPs that included group sessions, grantees asked for the ability to enter session data for all of the group participants at once. These and other desirable upgrades were technically feasible but not possible within the budget for this system.


Grantee practices. Once we began analyzing ESL data, we identified several practices that limited the quality and completeness of the data.

• To limit the burden data collection imposed on grantees, the cross-site implementation study was designed so that grantees would provide information on enrollment into and exit from all EBPs participants received, and detailed data on each EBP session only for the focal EBPs they implemented. However, some grantees were providing enrollment and exit data only on their focal EBPs, not other EBPs they implemented.

• For each session in which grantees provided a focal EBP, they were asked to indicate whether the session covered any of several topic areas, such as substance use disorder treatment, parenting skills, or youth therapy and development. Staff from several grantees unintentionally omitted these data for around 400 service logs (approximately 6 percent of all service logs completed during the reporting period); we believe staff forgot to complete the additional question modules in the system.

Mathematica provided feedback to grantees on the issues we identified. We met with several grantees to clarify the EBPs on which they needed to enter data for the cross-site evaluation. In addition, we communicated with grantees who omitted topic data from service logs to discuss why the data were not recorded and the importance of including them in the future. (In January 2016, we provided a refresher training for grantees on updates to the ESL and challenging aspects of data entry.)

2. OAISIS-related data

System features. To ensure consistent data entry while providing some flexibility for grantees, WRMA developed OAISIS to accept data in either of two formats. Grantees could provide each standardized instrument in a fillable PDF, then upload the PDFs into OAISIS. Alternatively, they could administer instruments in their chosen format, store the data on all respondents in their own management information or data systems, then enter the data into Excel spreadsheets for submission to OAISIS. We experienced a few challenges with the two formats for data submission.

• In some cases, the values assigned to response options in the Excel files did not match those in the fillable PDF versions of an instrument. For example, in the fillable PDF version of the ASI-SR, a response of “no” was coded as 0, and “yes” was coded as 1, but some grantees instead coded “no” as 2 in their Excel files.

• The fillable PDF for one instrument, the Vineland II socialization scale, initially had errors in the skip patterns assigned to children of different age groups. These errors prevented children in some age groups from being asked all of the questions they should have received as part of the scale. Under the original scoring procedures, most children in the sample would not receive a score.

We replaced the PDF for the Vineland II with a correct version for future use. When we identified responses outside of the expected range (such as the "2" in the example above), we first contacted the grantee submitting the data to ask how it coded the responses in its data set. We then modified our scoring procedures to convert the responses into the same ranges used in the fillable PDF versions. We also explained what values to use for those items in future uploads.
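The recoding step might look like the following sketch; the mapping is illustrative and assumes the grantee confirmed that 2 meant "no" in its Excel files.

```python
# Map out-of-range Excel codes back to the fillable-PDF scheme
# (0 = "no", 1 = "yes"); {2: 0} is a hypothetical, grantee-confirmed mapping.
YES_NO_RECODE = {2: 0}

def harmonize_response(value, recode=YES_NO_RECODE):
    """Return the PDF-scheme code for a response value; in-range values pass through."""
    return recode.get(value, value)
```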


We adapted the scoring instructions of the Vineland II so that we could still score children with a high level of missing data due to the faulty skip patterns of the fillable PDF. The items are ordered beginning with the most basic socialization skills that the youngest children can master and ending with the most advanced socialization skills that typically only older children, teens, and adults can master. For each item, the caregiver rates the child on a scale of 0–2 on whether the child never (0), sometimes (1), or usually (2) demonstrates the described skill. We assumed that older children had mastered the most basic skills at the beginning of the instrument and converted any of those missing items to 2s. Similarly, we assumed that young children had not mastered the most advanced skills and converted those missing items to 0s. This approach allowed us to provide reasonable score estimates for children of all age groups even with a large percentage of items missing.
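The age-based imputation described above can be sketched as follows. The age cutoffs and the basic/advanced split at the midpoint of the item list are illustrative assumptions, not the Vineland II's actual structure.

```python
def impute_socialization_items(items, age_years, young_age=3, old_age=10):
    """items: 0/1/2 ratings ordered from most basic to most advanced skill,
    with None marking items skipped by the faulty PDF."""
    n = len(items)
    imputed = []
    for i, score in enumerate(items):
        if score is not None:
            imputed.append(score)
        elif age_years >= old_age and i < n // 2:
            imputed.append(2)  # assume older children mastered basic skills
        elif age_years <= young_age and i >= n // 2:
            imputed.append(0)  # assume young children lack advanced skills
        else:
            imputed.append(None)  # genuinely missing; cannot impute
    return imputed
```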

Grantee practices and needs. During the year, we began to clean and check data submitted through OAISIS. During this process, and once analysis began, we identified several issues that affected data from the standardized instruments.

• The Addiction Severity Index-Self Report (ASI-SR) provides composite scores on seven scales: (1) medical status, (2) employment status, (3) legal status, (4) family/social status, (5) psychiatric status, (6) alcohol use, and (7) drug use. Most of these scales open with a question on how many days the respondent has experienced a problem related to the topic area, followed by additional questions to assess the severity of problems in that area. Often, when a respondent reported experiencing the problem for 0 days, he or she did not respond to the remaining questions in that section. Based on the scoring instructions, a score cannot be generated when these items are left blank.

• Once a year, Mathematica sends grantees scores on each standardized instrument for all individuals in their data. However, some grantees requested instructions for scoring their own data.

We developed an alternative scoring procedure for the ASI-SR so that we could use instruments with missing data. If a respondent reported experiencing a problem for 0 days, we assumed that the remaining responses on the same scale should indicate no other problems in that area. For example, if a person indicated that he or she experienced medical problems for 0 days in the past 30 days, we assumed that the respondent would also say that he or she was not troubled by medical problems in the last 30 days and was not seeking treatment for medical problems. We changed blank responses to these questions to “not at all.” This approach allowed us to generate scores for a much higher percentage of respondents than the original scoring rules would have.
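A minimal sketch of this rule follows, using hypothetical field names rather than the actual ASI-SR variable names: when the opening “days” item on a scale is 0, blank follow-up items on that scale are converted to “not at all” (coded 0).

```python
# Illustrative sketch (hypothetical field names) of the alternative ASI-SR
# scoring rule: a report of 0 problem days implies "not at all" for blank
# follow-up items on the same scale.

NOT_AT_ALL = 0

def fill_asi_scale(record, days_field, followup_fields):
    """If the opening 'days' item is 0, convert blank follow-ups to 0."""
    if record.get(days_field) == 0:
        for field in followup_fields:
            if record.get(field) is None:
                record[field] = NOT_AT_ALL
    return record

medical = {"med_days": 0, "med_troubled": None, "med_tx_important": None}
fill_asi_scale(medical, "med_days", ["med_troubled", "med_tx_important"])
# medical -> {"med_days": 0, "med_troubled": 0, "med_tx_important": 0}
```

Note the rule only fires when the opening item is exactly 0; scales with one or more reported problem days are left for the standard scoring procedure.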

In response to requests for information on scoring data, we provided detailed scoring instructions for three instruments that are publicly available and can be distributed without permission from the publisher (the Center for Epidemiologic Studies Depression Scale, the Trauma Symptoms Checklist-40, and the ASI-SR). Unfortunately, the licenses Mathematica purchased from publishers for the grantees’ use of copyrighted instruments do not allow us to circulate publishers’ manuals and scoring instructions. In October, we circulated a memo to grantees with information on how to purchase available manuals for the remaining instruments, including costs.


The cross-site evaluation team will continue to provide targeted and general technical assistance on the ESL and OAISIS into year 4, through one-on-one communication with grantees and a webinar on common issues. In November and December, we will follow up with grantees that have experienced challenges in data collection areas key to our analysis plans—for example, those who have not yet closed any cases. We will also determine how data quality issues may affect our future analysis plans. By sharing information about challenges and questions that some grantees have raised, we hope to reduce the need for future TA on those issues. After successfully resolving the OAISIS challenges discussed above in year 3, the OAISIS team was able to reduce the amount of time and effort needed to correct errors and score the instruments in the most recent OAISIS data submission. We used the same programming that corrected many of the errors in previous uploads, and few grantees had additional out-of-range responses requiring follow-up questions.


V. INTEGRATING THE RPG3 COHORT

On September 29, 2014, a few days before RPG2 grantees began year 3 of the project, the U.S. Department of Health and Human Services (HHS) made awards to a third cohort of five-year RPG projects (RPG3). The new partnerships were expected to participate in the national cross-site evaluation, including the implementation, partnership, and outcomes studies, as well as an impact study, if appropriate given the design of their local evaluations. Integrating these four new grantees into the cross-site evaluation, and providing support for their local evaluations, were thus key activities for the RPG team during FY 2014. Although this work was supported through a different contract, we describe our activities in this report to present a comprehensive picture of our RPG activities.10 After a brief overview of the grantee projects, we describe their local evaluations (Section A), our main activities with the grantees (Section B), and challenges and next steps in our work with them (Section C).

With their partners, the new RPG3 grantees planned to provide a variety of services to children and caregivers in their identified target groups (Table V.1). One grantee (the University of Kansas Center for Research) primarily targeted children in out-of-home care; the other three targeted families of children at risk of an out-of-home placement. Grantees planned to work with families in which parents were in, or had completed, substance use disorder treatment programs.

The RPG3 projects focused on child well-being, though grantees were each taking a different approach to service provision. Three planned to provide a suite of services to all participants; the fourth (Volunteers of America—Oregon) expected to offer a range of customized services from a menu of options, depending on family needs. Planned services included, for example, parenting education or skills training programs, referral to substance use disorder treatment or other needed services, counseling, support from a peer specialist, and trauma interventions or screening. One project planned to offer a drop-in center as a hub for all services.

A. Grantee local evaluations

HHS required that every RPG3 grantee evaluate its project; CB encouraged grantees to propose evaluation designs comparing participants with nonparticipants whenever possible (Table V.2; ACF 2014). CB preferred comparison group designs, because a well-designed approach could identify the influence of project services and activities on participant outcomes.11 The program and local evaluation plans described in many of the grantees’ applications were brief, and some grantees in the initial months of the grant were still planning specifics of their programs and evaluations—though others had already formulated detailed plans. A CSL from Mathematica explored grantees’ proposed evaluation plans as part of initial monthly calls.

10 Mathematica was awarded a separate contract in September 2014 to include the RPG3 grantees in the cross-site evaluation and provide them with evaluation-related TA.

11 Other evaluation designs, such as pre-post designs that compare participants before and after a program rather than with a separate comparison group, are unable to attribute changes to the program being evaluated.


Table V.1. RPG3 projects and planned target population and program focus

Florida: Our Kids of Miami–Dade/Monroe
Organization type: Child and family services provider
Federal grant amount: $600,000
Planned target population and program focus: Our Kids will provide a suite of services to families with children aged 0 to 11 who are referred through the Child Protective Investigation Process for diversion/prevention. Eligible families have either suspected or verified substance use disorder indicators but do not have an open dependency court case. The services include (1) the Engaging Moms/Parent Program, which provides additional support for engagement in substance use disorder treatment, family therapy interventions, and supports to improve parenting skills; (2) engagement with a peer specialist; (3) Intensive Family Preservation Services; and (4) referral to the Motivational Support Program, a voluntary program that is the gateway to substance use disorder treatment for families with child welfare involvement.

Kansas: University of Kansas Center for Research
Organization type: Public university
Federal grant amount: $564,914
Planned target population and program focus: The University of Kansas Center for Research will provide the Strengthening Families Program: Birth to Three (SFP B–3) to families with substance use disorders and children up to 47 months old in foster care or at risk of out-of-home placement. SFP B–3 is a family skills training program that focuses on resilience and risk factors in behavioral, emotional, cognitive, and social domains. The scripted curriculum is delivered in 14 consecutive weekly sessions, with booster sessions occurring after 6 and 12 months. The program will also provide caregiver substance use disorder assessment, child and parent trauma assessment, and referrals.

New York: Montefiore Medical Center
Organization type: Medical center, substance use disorder treatment provider, child and family services provider
Federal grant amount: $600,000
Planned target population and program focus: Montefiore will provide the Family Treatment/Rehabilitation (FT/R) program and three program enhancements (Seeking Safety, Incredible Years, and contingency reinforcement) to families with substance use disorders and open and indicated child welfare cases with children at risk for removal. Through FT/R, families will receive comprehensive clinical assessment of substance use disorder and other clinical and service needs, referrals to treatment and other services, home visits to monitor safety, and case management. Seeking Safety is a trauma-informed treatment to reduce the risk of substance use disorder. Incredible Years is a parenting education program. Contingency reinforcement provides financial incentives to keep substance use disorder treatment appointments and maintain abstinence.


Oregon: Volunteers of America—Oregon (VOAOR)
Organization type: Child and family services provider, substance use disorder treatment provider
Federal grant amount: $600,000
Planned target population and program focus: VOAOR will provide a Recovery Oriented System of Care (ROSC) for parents recovering from substance use disorders who are either engaged with or at risk of engagement with child welfare. In eligible families, the adult in recovery will have already completed substance use disorder treatment. Services will be available through the Family Recovery Support program, which serves as a drop-in center for families. As part of the ROSC, families will receive a recovery support plan that includes services aligned with that family’s particular needs, selected from a menu of options. Families will be matched with a certified peer recovery mentor, if requested, and may also work with a resource specialist and/or therapist.

Source: Grantees’ RPG applications, ongoing conversations with grantees, and other grantee materials.

During these conversations, the CSL helped grantees develop more detailed evaluation designs and plans as needed; responded to questions from grantees, their evaluators, or their FPO about proposed or potential designs; and offered suggestions to bring some designs into closer alignment with goals articulated in the funding announcement.

B. RPG3 activities

During the first six months of RPG3, grantees focused on finalizing their program and evaluation plans. Through this process, grantees also developed strong relationships with the federal TA team, which included the CSL and a PML provided by NCSACW, and became familiar with procedures for requesting TA.

1. Finalizing program plans

Work to finalize program plans fell into two overarching categories: (1) staffing and (2) establishing processes for service delivery. In terms of staffing, all four of the grantees identified current staff and/or began hiring new staff to deliver RPG services. Two of the grantees began training staff, and all four identified and developed schedules for future training needs. Grantees also developed procedures for supervising staff who will be delivering services.

In establishing procedures for service delivery, grantees reviewed their program plans and worked with internal staff and partners to develop a shared understanding of how RPG clients would be recruited and how services should be delivered. They also documented procedures to be shared among staff and partners. For example, one grantee began weekly meetings among staff (at the grantee organization and at a partner agency), and another began service delivery to pilot cases to help staff prepare for full-scale implementation. As necessary, grantees also worked with EBP developers to determine how best to implement EBPs in their local contexts.


Table V.2. Characteristics of RPG3 grantees’ local outcome evaluations

University of Kansas Center for Research, Kansas
Evaluation design: Randomized controlled trial
Expected sample size: 720 to 864 families (360 to 432 program, 360 to 432 comparison)
Contrast in services the program and comparison groups will receive: Program group: Strengthening Families Program: Birth to Three, a family skills training program that focuses on resilience and risk factors in behavioral, emotional, cognitive, and social domains; caregiver substance use disorder assessment; child and parent trauma assessment; and referral. Comparison group: An array of business-as-usual services that likely include substance use disorder assessment and referral.
Outcome domains: Child well-being, permanency, safety, recovery, family functioning
Data sources: Direct assessments, administrative records
Additional analyses: The grantee will also conduct a process evaluation and a cost study.

Montefiore Medical Center, New York
Evaluation design: Matched comparison group design
Expected sample size: 280 families (80 program, 200 comparison)
Contrast in services the program and comparison groups will receive: Program group: The Family Treatment/Rehabilitation program, which provides families with comprehensive clinical assessment of substance use disorder and other clinical and service needs, referrals to treatment and other services, home visits to monitor safety, and case management; Seeking Safety (a trauma-informed treatment to reduce the risk of substance use disorder); Incredible Years (a parenting education program); and contingency reinforcement (financial incentives to keep substance use disorder treatment appointments and maintain abstinence). Comparison group: Substance use disorder treatment from Montefiore and business-as-usual services from New York’s Administration for Children’s Services.
Outcome domains: Child well-being, permanency, safety, recovery, family functioning
Data sources: Direct assessments, administrative records
Additional analyses: The grantee will also conduct an implementation evaluation and a partnership study.


Our Kids of Miami–Dade/Monroe, Florida
Evaluation design: Randomized controlled trial
Expected sample size: 288 families (144 program, 144 comparison)
Contrast in services the program and comparison groups will receive: Program group: The Engaging Moms/Parent Program, which provides additional support for engagement in substance use disorder treatment, family therapy interventions, and supports to improve parenting skills; engagement with a peer specialist; Intensive Family Preservation Services (IFPS); and referral to the Motivational Support Program (MSP), a voluntary program that is the gateway to substance use disorder treatment for families with child welfare involvement. Comparison group: IFPS and MSP.
Outcome domains: Child well-being, permanency, safety, recovery, family functioning
Data sources: Direct assessments, administrative records
Additional analyses: The grantee will also conduct an implementation evaluation and a partnership study.

Volunteers of America—Oregon (VOAOR), Oregon
Evaluation design: Matched comparison group design
Expected sample size: 400 families (200 program, 200 comparison)
Contrast in services the program and comparison groups will receive: Program group: Recovery Oriented System of Care, which includes development of a recovery support plan with services aligned with that family’s particular needs, selected from a menu of options; access to the Family Recovery Support program, which serves as a drop-in center for families; and access to a certified peer recovery mentor, a resource specialist, and/or a therapist. Comparison group: Any available business-as-usual services, though services are likely to be limited because substance use disorder treatment will have ended.
Outcome domains: Child well-being, permanency, safety, recovery, family functioning
Data sources: Direct assessments, administrative records
Additional analyses: The grantee will also conduct an implementation evaluation and a partnership study.


2. Finalizing evaluation plans

A key step in finalizing local evaluation plans was participating in the evaluability assessment process, which involved further developing the evaluation plans included in their applications, considering approaches to improve the rigor of their outcome evaluations, finalizing which direct assessments to use in their local evaluations, and adjusting their evaluations based on implementation decisions (if needed). All four grantees maintained close communication between program and implementation team members throughout this process and, by the end of the first six months of the grant, were able to either submit or finalize their IRB applications.12 Grantees also worked on establishing agreements with state and local partners for obtaining administrative data related to the safety, permanency, and recovery domains of the cross-site evaluation, and participated in trainings supporting participation in the cross-site evaluation.

C. Challenges and next steps

Each grantee experienced challenges in finalizing program and evaluation plans or starting operations. Most challenges were specific to each grantee; in each case, the grantee worked with the federal team and local partners to develop a solution and move forward. For example, one grantee faced unexpected staff turnover at a key partner organization. The partner organization moved quickly to replace staff who were critical to the RPG3 project, and those staff were successfully integrated into the RPG team. All four of the grantees learned from their early challenges and used these lessons to finalize their program and evaluation plans.

RPG grantees made substantial progress during their first year; by September, all had received IRB clearance and could thus begin their programs and evaluations. Key upcoming activities include the following:

• Grantees will conduct outreach activities to recruit eligible families for their RPG services. They will also continue staff training and provide supervision to staff who are providing services.

• To support cross-system collaborations, the grantees will continue regular meetings with their partner agencies.

• For the local and cross-site outcome evaluations, grantees will begin data collection with families as they enter and exit the RPG program. They will also conduct outreach activities to begin enrolling comparison group members. As necessary, grantees will continue working to establish data-sharing agreements with state agencies for accessing safety, permanency, and recovery data.

• Grantees conducting process evaluations and partner studies will begin collecting data from RPG program staff and partners.

• To meet the requirements of the cross-site evaluation, grantees will begin providing the evaluation with data, including the enrollment, services, and baseline data that were used in this report to describe RPG2 cases and participants.

12 HHS required that grantees obtain IRB review for their planned evaluations.


• Finally, with program and evaluation procedures established and implementation of both underway, grantees will begin developing plans and strategies to sustain their partnerships and programs after the end of their five-year RPG grants.


VI. NEXT STEPS: PLANNED ACTIVITIES IN YEAR 4

Ensuring the quality and completeness of evaluation data and planning for the legislatively mandated report to Congress due by December 2017 are key priorities for Mathematica during the coming year. We will conduct site visits to the RPG2 grantees and conduct the staff and partner surveys among the RPG3 cohort. We will continue working with CB, NCSACW, and a group of interested grantees and local evaluators to develop and pilot test instruments for the cost analysis of trauma-specific EBPs.

A. Ensuring the quality and completeness of evaluation data

Mathematica will continue to support grantees in year 4 to improve their ESL and OAISIS data. To further improve data collection, CB exercised optional task C through the RPG3 contract at the end of year 3. This option will include targeted TA to RPG2 and RPG3 grantees as well as a data collection tool kit informed by lessons learned during the cross-site evaluation to date; CB can also use the tool kit in the future with non-RPG grantees. The tool kit will feature common data collection challenges as well as strategies to address them. For example, we will suggest refusal conversion strategies, because the RPG grantees have at times struggled to enroll participants into their evaluations. Other topics include tracking enrollment, participation, and services, as well as how to use incentives effectively for data collection.

B. Remaining reports to Congress

Mathematica has submitted a legislatively mandated report to Congress after each year of the grant and in the coming year will begin to draft the fourth report to Congress. The report will assess the extent to which the grants have been successful in addressing the needs of families with substance use disorders who come to the attention of the child welfare system, using data collected in years 1 through 3 of the grant. More information on the fourth and fifth reports to Congress is available in Section E below.

C. Cost study

Understanding the cost of interventions to address child welfare needs has in recent years gained greater priority at CB.13 Consistent with this emerging interest, CB included optional tasks under the cross-site evaluation for a possible cost study of RPG programs. The intent of the option was to develop an approach and data collection instruments, pilot-test the instruments, and provide a report.

CB exercised the first option in year 3 through the RPG2 contract. Selecting a focus for the study was a goal during year 2, and CB, in consultation with Mathematica, elected to focus on the cost of providing trauma-specific EBPs (TS-EBPs). CB chose this focus because studies indicate that most children in the child welfare system have been exposed to trauma (for example, Kisiel et al. 2009), and ACF is encouraging or requiring its grantees to adopt and implement services that are trauma-informed (ACF 2012). According to reports from RPG2 grantees, as of October 2014, 15 of 17 grantees were screening or assessing adults and/or children for trauma, and 11 grantees were providing trauma-specific services (Strong et al. 2015).

13 For example, Calculating the Costs of Child Welfare Services Workgroup. “Cost Analysis in Program Evaluation: A Guide for Child Welfare Researchers and Service Providers.” Washington, DC: Children’s Bureau, Administration for Children and Families, U.S. Department of Health and Human Services, 2013; and James Bell Associates. “Waiver Demonstration Cost Evaluation Toolkit.” Presentation at the 17th Annual Child Welfare Demonstration Projects Meeting, Washington, DC, September 1, 2015.

In the coming year, work will shift to developing and testing the instruments, including engaging grantees in the work. To facilitate collaboration with RPG grantees and provide an opportunity for building grantees’ capacity related to cost analysis, Mathematica formed a cost-study work group comprising grantee representatives, local evaluators, and NCSACW. Work group members will provide input and feedback on (1) the status of TS-EBP implementation in grantee sites, (2) which of the nine TS-EBPs being implemented in RPG to study, (3) research questions for a cost study, (4) data collection instruments, and (5) findings from the pilot test of instruments. In addition, we will seek grantees to participate in the pilot tests.

Generally, a cost study requires data on the types, quantity, and value of resources that providers use to deliver an intervention. We will develop a spreadsheet and instructions for pilot sites to use to provide this information. To accurately estimate the costs of delivering TS-EBP services, information on how much time provider staff devote to the program may also be necessary. Instruments for obtaining this type of data include interviews, surveys, and time diaries, and we will work with CB and grantees to decide whether to develop and pilot one of these types of instruments, as well.

Based on the pilot tests, Mathematica would revise the instruments for use in future cost studies of the relevant TS-EBP models. If resources allow and grantees are interested, we could then use data on enrollment and services provided to individual participants in one or more of the pilot sites to estimate costs per participant. Thus, the pilot cost analysis would produce estimates of the total cost incurred by an implementing agency in providing TS-EBP services during one year, and the cost per participant. During the coming year, we will work closely with CB, the work group, and pilot sites through these possible phases of work.
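The arithmetic the pilot spreadsheet would support can be sketched as follows. The resource categories, quantities, and prices here are entirely hypothetical and serve only to illustrate the structure: total annual cost is the sum over resources of quantity times unit value, and cost per participant divides that total by the number served.

```python
# Minimal sketch (hypothetical figures) of a TS-EBP cost tabulation:
# total cost = sum of (quantity x unit value) across resources,
# then divided by annual participants served.

resources = [
    {"item": "clinician time (hours)",  "quantity": 1200, "unit_value": 45.00},
    {"item": "supervisor time (hours)", "quantity": 150,  "unit_value": 60.00},
    {"item": "training and materials",  "quantity": 1,    "unit_value": 5000.00},
]

total_cost = sum(r["quantity"] * r["unit_value"] for r in resources)
participants_served = 80  # hypothetical annual enrollment at one pilot site
cost_per_participant = total_cost / participants_served

print(f"Total annual cost: ${total_cost:,.2f}")        # $68,000.00
print(f"Cost per participant: ${cost_per_participant:,.2f}")  # $850.00
```

In practice, the staff-time quantities in such a table are exactly what the interviews, surveys, or time diaries mentioned above would supply.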

D. Remaining data collection and analysis

1. Site visits

The site visit teams worked with the grantees to arrange the visits beginning in October and scheduled more than 130 interviews to occur during the visits. The visits lasted 1.5 to 2.5 days, depending upon the structure of each site, and were completed by December 15, 2015. During years 4 and 5, the teams will complete each discussion guide with the interviewees’ responses, code them for analysis, and analyze the data.

2. Participant characteristics

In the coming year, Mathematica will build on its previous analyses of participant characteristics and service data and study EBP receipt in greater detail. First, Mathematica will update its analysis of participant characteristics using data collected through March 2016. Mathematica will then extend its analysis of EBP receipt; potential new topics include which case members enroll in EBPs, the length of time during which they receive EBPs, and whether there are patterns of concurrent enrollment in certain types of EBPs. Mathematica will also assess the quality of data on individual sessions of focal EBPs, for analysis in 2017. While undertaking this analytical work, Mathematica will continue to provide grantees with TA on using the ESL, to ensure the quality of data collected.

3. Outcomes

The outcomes data will continue to play an important role in our understanding of children’s permanency, safety, and well-being and adults’ recovery and family functioning. As in previous years, grantees will upload outcomes data in April and October, including all standardized instruments and administrative data. In addition, RPG3 grantees will submit administrative data for the first time in April 2016. We will support both RPG2 and RPG3 grantees in this effort by monitoring data submissions and troubleshooting with grantees. We will also continue to provide scored grantee-level data sets to RPG2 and RPG3 grantees in summer 2016, so that grantees may use the results for their own purposes.

In the future, we will analyze baseline data and compare baseline and follow-up data to measure how children’s permanency, safety, and well-being and adults’ recovery and family functioning change after participation in RPG programming. We will conduct an impact analysis using data from grantees who complete comparison group local evaluations meeting criteria established for the analysis (see the RPG cross-site evaluation design report, Strong et al. 2014).

E. Analysis and reporting

The fourth report to Congress will focus primarily on implementation findings, guided in large part by a research question: What procedures, infrastructure, and supports were in place to facilitate implementation of EBPs? The report will use information on participant characteristics, data from the staff survey and from SAPRs, site visit data about enrollment in RPG and EBPs, grantees’ selection of EBPs, characteristics of frontline staff, and procedures and infrastructure to facilitate EBP implementation. These findings will describe features of the conceptual framework that highlights how the RPG program is expected to help change implementation systems, partnerships, and participant outcomes.

The fifth and final RPG2 report to Congress will be comprehensive, answering all key research questions in an effort to describe the elements of the theoretical framework described in the design report. It will use all of the data elements collected during the study and summarize key findings from the fourth report to Congress. To accomplish this goal, it will present results from three substudies:

1. Partnership study: The partnership study will describe the partnerships formed among each of the 17 RPG grantees, the agencies in the community implementing RPG services, and the organizations that have come together to support the RPG program. Using the partnership survey and site visit data from partner interviews, the study will address the following research questions: Who was involved in each RPG project, and how did the partners work together? To what extent were the grantees and their partners prepared to sustain their projects by the end of the grant period?

2. Implementation study: The implementation study will address five of the cross-site evaluation research questions related to the target populations of RPG projects; the selected EBPs; the procedures, infrastructures, and supports for EBP implementation; the ways in which EBPs were implemented; and the extent to which RPG projects prepared to sustain EBPs. This study will provide detailed information on the service logs of these EBPs, using data collected from the ESL system by April 2017, as well as a summary of findings from the previous report to Congress, which provided more detail on participant enrollment and case characteristics.

3. Outcomes study: The outcomes study will describe the changes that occur in the children, adults, and families who participate in the 17 RPG projects, answering the final research question for the cross-site evaluation: What were the well-being, permanency, and safety outcomes of children, and the recovery outcomes of adults, who received services from the RPG projects? The data source for this study will be the final submission of standardized instrument and administrative data in April 2017. The analyses of RPG2 grantees will focus primarily on how child and adult outcomes changed through their involvement in the RPG program, while the RPG3 analyses will focus on enrollment and baseline data.


REFERENCES

Abidin, R. R. (1995). Parenting stress index (3rd ed.). Odessa, FL: Psychological Assessment Resources.

Achenbach, T. M., & Rescorla, L. A. (2000). Manual for the ASEBA preschool forms and profiles. Burlington, VT: University of Vermont, Research Center for Children, Youth, and Families.

Achenbach, T. M., & Rescorla, L. A. (2001). Manual for ASEBA school-age forms and profiles. Burlington, VT: University of Vermont, Research Center for Children, Youth, and Families.

Administration for Children and Families. (2012). Regional partnership grants national cross-site evaluation and evaluation technical assistance. Washington, DC: U.S. Department of Health and Human Services.

Administration for Children and Families. (2014). Regional partnership grants to increase the well-being of, and to improve the permanency outcomes for, children affected by substance abuse. Washington, DC: U.S. Department of Health and Human Services. (Copies of closed Children’s Bureau discretionary grant funding opportunity announcements are available upon request. Please contact [email protected].)

Bavolek, S. J., & Keene, R. G. (1999). Adult-adolescent parenting inventory—AAPI-2: Administration and development handbook. Park City, UT: Family Development Resources, Inc.

Briere, J., Johnson, K., Bissada, A., Damon, L., Crouch, J., Gil, E., & Ernst, V. (2001). The trauma symptom checklist for young children (TSCYC): Reliability and association with abuse exposure in a multi-site study. Child Abuse and Neglect, 25(8), 1001–1014.

Dunn, W. (2002). The infant/toddler sensory profile manual. San Antonio, TX: The Psychological Corporation.

Gioia, G., Isquith, P., Guy, S., & Kenworthy, L. (2000). Behavior rating inventory of executive function. Child Neuropsychology, 6(3), 235–238.

Kisiel, C. L., Fehrenbach, T., Small, L., & Lyons, J. (2009). Assessment of complex trauma exposure, responses, and service needs among children and adolescents in child welfare. Journal of Child and Adolescent Trauma, 2, 143–160.

McLellan, A. T., Kushner, H., Metzger, D., Peters, R., Smith, I., Grisson, G., Pettinati, H., & Argeriou, M. (1992). The fifth edition of the Addiction Severity Index. Journal of Substance Abuse Treatment, 9(3), 199–213.

Radloff, L. S. (1977). The CES-D Scale: A self-report depression scale for research in the general population. Applied Psychological Measurement, 1(3), 385–401.


Sparrow, S. S., Cicchetti, D. V., & Balla, D. A. (2005). Vineland-II adaptive behavior scales: Survey forms manual. Circle Pines, MN: AGS Publishing.

Strong, D. A., Avellar, S. A., & Ross, C. (2014, February). RPG cross-site evaluation and technical assistance: First annual report. Princeton, NJ: Mathematica Policy Research.

Strong, D. A., Avellar, S. A., & Cole, R. (2015, April). RPG cross-site evaluation and technical assistance: Second annual report. Princeton, NJ: Mathematica Policy Research.

Strong, D. A., Avellar, S. A., Massad Francis, C., Hague Angus, M., & Mraz Esposito, A. (2013, October). Serving child welfare families with substance abuse issues: Grantees’ use of evidence-based practices and the extent of evidence. Children’s Bureau, Administration for Children and Families, U.S. Department of Health and Human Services. Contract No. HSP233201250024A. Available from Mathematica Policy Research, Princeton, NJ.

Strong, D. A., Snoek, E., & Massad Francis, C. (2015, June). Working paper: RPG grantees addressing child and adult trauma. Princeton, NJ: Mathematica Policy Research.

U.S. Department of Health and Human Services. (Forthcoming). 2012 and 2014 regional partnership grants to increase the well-being of, and to improve the permanency outcomes for, children affected by substance abuse: Third report to Congress.

U.S. Department of Health and Human Services. (2015, August). 2012 regional partnership grants to increase the well-being of, and to improve the permanency outcomes for, children affected by substance abuse: Second report to Congress. Princeton, NJ: Mathematica Policy Research.

U.S. Department of Health and Human Services. (2014, December). 2012 regional partnership grants to increase the well-being of, and to improve the permanency outcomes for, children affected by substance abuse: First report to Congress. Princeton, NJ: Mathematica Policy Research.
