
Review of Data Validation Results

for PY 2004

We’ve come a long way, baby!


What is Validation?

• Validation helps determine the accuracy of the state reports.

– Report Validation: makes sure the reports are calculated correctly

– Data Element Validation: checks whether the individual data elements that comprise the participant record are accurately captured and recorded (see the sketch below)
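To make the distinction concrete, here is a minimal Python sketch of the two validation types, assuming a hypothetical participant-record layout (the field name, reported rate, and source-document lookup are illustrative, not part of the actual WIA specification):

```python
# Minimal sketch of the two validation types described above.
# Field names and the comparison tolerance are illustrative assumptions.

def report_validation(records, reported_rate, tolerance=0.001):
    """Report validation: recompute an outcome rate from the underlying
    participant records and compare it to the rate the state reported."""
    numerator = sum(1 for r in records if r["employed_after_exit"])
    recomputed = numerator / len(records)
    return abs(recomputed - reported_rate) <= tolerance

def data_element_validation(record, source_docs):
    """Data element validation: check each element of a sampled participant
    record against its source documentation."""
    return all(value == source_docs.get(field) for field, value in record.items())

# Hypothetical usage: two records, one employed after exit, reported rate 50%.
records = [{"employed_after_exit": True}, {"employed_after_exit": False}]
print(report_validation(records, reported_rate=0.50))  # True
```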


Why are you inflicting this pain on us?

• Recent OIG and GAO reports have pointed out deficiencies in the quality and accuracy of ETA data.

• Data Validation gives states the opportunity to review their data collection practices and improve the collection and reporting of participant data.

• The integrity of state reports and participant files is greatly improving, allowing us to tell a more complete story about the folks we serve.


What happened with PY 2004 DV?

• 52 STATES COMPLETED WIA REPORT VALIDATION: YOU GUYS ROCK!

• WIA Report Validation results are required to be submitted to ETA prior to certifying and submitting the annual report. The due date for PY 2004 validation was October 1, 2005.

• Data Element Validation was due by February 1, 2006. As of Friday, February 3, 46 states had submitted their DEV results and 6 had requested an extension.

• DEV results are being analyzed by PROTECH and MPR; results are expected to be available by May 2006.


Wagner-Peyser RV for PY 2004

• We changed the date for validating the fourth quarter's ETA 9002/VETS 200 to August 2005.

• This validation also includes a random sample of 25 records for minimal data element validation

• Only 19 states submitted their RV results electronically; however, many states used DART for RV without submitting results to ETA.


TAA Validation for FY 2005

• TAA Validation made great strides over the FY 2003 and FY 2004 validations.

• As of February 3, 41 of 50 states had submitted their TAA DEV results, with 5 states requesting an extension.

• DEV analysis will be completed by May 2006, but we anticipate high error rates for application and exit dates.

• The average number of TAA cases per state is 118.


How are we doing?

• 23 states used the DRVS software to create their annual reports, which gave them perfect scores for Report Validation.

• Of the states that used other software for their reports, only 2 (Idaho and Puerto Rico) achieved an acceptable level of performance under report validation.

• An acceptable level is defined as having no report validation error rate greater than 2% for any core performance measure (see the sketch below).
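As a minimal sketch of that acceptance rule, here is how the check might look in Python, assuming hypothetical measure names and error rates (none of the values below are actual PY 2004 results):

```python
# Sketch of the acceptance rule: a state passes report validation only if
# no core performance measure's error rate exceeds the 2% threshold.
# Measure names and rates are illustrative assumptions.

def acceptable_performance(error_rates, threshold=0.02):
    """Return True when every core measure's report validation error rate
    is at or below the threshold."""
    return all(rate <= threshold for rate in error_rates.values())

rates = {"entered_employment": 0.011, "retention": 0.025, "earnings": 0.004}
print(acceptable_performance(rates))  # False: retention exceeds 2%
```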


State Data

[Slide diagram: the state's software-generated quarterly and annual report is compared against the same report run through the DRVS software; the outcomes (numerators and denominators) of the state-generated and DRVS-generated reports must differ by less than 2%.]


What are the Core Measures?

• Core performance measures refer to the 45 numerators, denominators, and rates for the core measures, excluding customer satisfaction (see the tally below):

– For Adults, Dislocated Workers, and Older Youth: entered employment, retention, earnings, and credential rates

– For Younger Youth: skill attainment, diploma, and retention rates
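To see where the count of 45 comes from, here is a short tally in Python that simply restates the list above: four measures for each of three funding streams plus three Younger Youth measures gives 15 measures, each with a numerator, a denominator, and a rate.

```python
# Tally of the core measures listed above: 15 measures, each contributing
# a numerator, a denominator, and a rate, for 15 * 3 = 45 values.

core_measures = {
    "Adults": ["entered employment", "retention", "earnings", "credential"],
    "Dislocated Workers": ["entered employment", "retention", "earnings", "credential"],
    "Older Youth": ["entered employment", "retention", "earnings", "credential"],
    "Younger Youth": ["skill attainment", "diploma", "retention"],
}
n_measures = sum(len(m) for m in core_measures.values())  # 15
print(n_measures, n_measures * 3)  # 15 measures -> 45 values
```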


What Happens If A State Fails?

• ETA is looking at defining "Failure to Report":

– Failure to submit on time

– Failure to conduct RV

– Failure to submit an accurate report

• > 2% error threshold for outcomes and other critical data elements

• For WIA:

– Failure to comply with TEGL 14-03 when submitting their annual state report


Possible Implications for "Failure to Report"

• For WIA:

– Automatically excluded from incentives determinations

– Possibility of financial sanctions

• Could result in the grantee's data not being included in any summary reports

– Basically places state performance at zero


What about the Non-Core Measures?

• These refer to the outcomes for special populations, the other outcome information, the other reported information, and the participation levels provided in the ETA 9091.

• EXAMPLE: many states question why we are looking at DD-214 forms for veterans. The answer is that we need to make sure we are accurately counting and serving participants who claim a veteran's preference. The DD-214 is the only acceptable source for verification.


How did states do on those measures?

• 5 states fell into the “good” performance range for non-core measures: Colorado, Idaho, North Carolina, New Jersey and Oregon

• We allow more errors for the non-core measures.

• We must be more rigorous with the core measures because they are the basis for incentives and sanctions.


Who are the worst offenders?

• Adults: earnings change and retention had a greater than 2% average error rate

• Dislocated Workers: the credential, earnings replacement, entered employment, and retention rates

• Older Youth: earnings change

• Younger Youth: skill attainment and retention


What will happen next year?

• ETA needs to revise the software and DV elements to meet the new reporting requirements.

• Data Validation will continue to be required despite the implementation of common measures and new reporting requirements.

• The changes in reporting will lead to a delay in implementing error rates, but states should be prepared to justify their outcomes to Regional Performance staff and will still be subject to DV monitoring reviews.


Who do we complain to?

• For complaints about Data Validation, the software, and technical assistance, please contact your Regional Performance Specialist.

• For compliments, praise and other good words about data validation, please contact Traci DiMartini in the National Office at dimartini.traci@dol.gov.
