
Training Program Quality Assurance and Evaluation

Best Practices for Worker Training

Foundations

• Minimum Criteria Document (1910.120 Appendix E)
• Cooperative Agreement Requirements
• Awardee Evaluations

Minimum Criteria Document: Suggested Program Quality Control Criteria

1. Training Plan

2. Program Management

3. Facilities and Resources

4. Quality Control and Evaluation

5. Students

6. Institutional Environment and Administrative Support

Quality Control and Evaluation

• Advisory Committee and/or Outside Reviewers for overall policy guidance
• Adequate and appropriate quality control and evaluation program to account for:
  – Instructor performance
  – Course evaluation for improvement
  – Student evaluations

Key Evaluation Questions

• Quality and appropriateness of:
  – Program objectives (clarity and achievement)
  – Facilities and staff
  – Course material and mix of classroom and hands-on training
• Assessment of program strengths and weaknesses and needed improvements

Cooperative Agreements

• NIEHS – Stewardship and oversight roles
  – Guiding language for quality assurance and evaluation
  – NIEHS Review Panel: team of outside experts and agency staff
• Awardees – Lead responsibility in quality control and internal evaluation
  – Established their own evaluation systems

Cooperative Agreement Requirements

• Independent Board of Advisors
  – Appropriate training and expertise to evaluate and oversee the proposed worker training program
• Formal Quality Control and Evaluation Plan
  – Different forms depending on the nature of the student population and the awardee’s program culture

Why Evaluate

• Use positive feedback to build and expand programs
• Learn how to improve programs
• Determine the need for additional training
• Document learning, confidence building and workplace changes
• Accountability (legal/program requirements)

Multi-Program Evaluation: A Descriptive Review (February 1996)

• Review of over 50 awardee evaluation reports and 13 grant-related journal articles and publications
• Used in overall program review

Awardee Evaluations – Many Forms

• Focus on individuals and groups
• Qualitative (how and why) and quantitative (how much and how many)
• Descriptive (non-experimental designs) and trying to infer cause (quasi-experimental designs)

Awardee Evaluations – Focus

• Student perceptions of training
  – Many thousands of positive student ratings
• Course materials
  – Perceptions of usefulness
  – Post-training use
• Knowledge, skills and decision-making
  – Student self-assessments
  – Testing and performance assessments

Awardee Evaluations – Focus (continued)

• Changes in awareness, concerns and attitudes
• Improvements in post-training response actions to HAZMAT incidents
• Changes in personal protective practices
• Systematic changes in worksite programs, policies, preparedness and equipment
• Catalyzing of additional site-based training and sharing of information

An Evaluation of the NIEHS WETP – External Panel Report

“Not only has the NIEHS grant program provided training to hundreds of thousands of workers, managers and health and safety professionals, it has also made a substantial contribution to a more systematic, analytical and scientific approach to training program development, delivery and evaluation in terms of advancing the state of the art.”

December 28, 1995

Resource Guide for Evaluating Worker Training – Purposes

• To provide general and specific step-by-step guidance for both experienced and novice evaluators on how to design and carry out an evaluation
• To provide examples of evaluation instruments

Resource Guide Content

• Evaluation overview
• Special challenges to an evaluation team
• Evaluation methods
• Annotated bibliography
• Evaluation instruments (for off-the-shelf use or adaptation)

Evaluation: Building the Capacity to Learn

The Self-Sufficiency Research and Evaluation Project (SREP) – A Participatory Evaluation Model

SREP Partners

• AFSCME – American Federation of State, County and Municipal Employees, with the University of Massachusetts Lowell
• PACE – Paper, Allied Industrial, Chemical and Energy Workers International Union, with the Labor Institute and New Perspectives Consulting Group
• UAW – United Auto Workers, with the University of Michigan

SREP – A Multi-Organizational Collaborative

• Three partners: union-based occupational safety and health education programs
• Team-based: composed of worker-trainers, program staff and/or evaluators

SREP Overview – Workshops and Team Evaluation Projects

May 1998
  Workshop: Research and Evaluation Overview; Introduction to Program Theories
  Team projects: Develop and refine evaluation questions and designs

Aug 1998
  Workshop: Developing Evaluation Plans; Gathering and Analyzing Data
  Team projects: Refine data collection plans; begin data collection and analysis

Jan 1999
  Workshop: Developing Meaning and Promoting the Use of Findings
  Team projects: Ongoing data collection, analysis and report generation

May 1999
  Workshop: Collective Reflections; Developing Lessons Learned
  Team projects: Develop program diffusion; prepare lessons-learned report

Description of SREP Team Projects

• School district: short survey and focus group (AFSCME)
• Municipality: pilot individual interviews (AFSCME)
• Oil refinery: in-plant labor-management refinery team use of “Charting How Your Program Works” and monitoring of a new safety and health initiative (PACE)

Project Description (continued)

• Program-wide
  – Worker understanding of systems of safety: card-sort focus group (PACE)
  – Workplace impact: phone interviews (UAW)
• Week-long Training Conference
  – Quick feedback from and back to training program participants (UAW)

Model of Worker-Led, Team-Based Participatory Evaluation

1. Builds a community united in a shared commitment to the rights of all workers to safe and healthy workplaces.
2. Actively involves workers in all aspects of evaluation.
3. Is a collective effort, within and among partner organizations, that draws upon each other’s insights, strengths and experiences.

Model (continued)

4. Understands evaluation as a process of continuous learning rather than an end product.
5. Provides important ways to measure and document program successes.
6. Recognizes the importance of identifying program values and goals to guide evaluations.

Worker-Led, Team-Based Evaluation

Traditional
  Who:
  • Evaluation consultant, program administrator
  What they do:
  • Consultant designs, conducts, analyzes and writes the report
  • Worker-trainers and trainers may distribute and collect evaluation forms
  • Consultant recommends changes and future directions for programs

Participatory
  Who:
  • Team of worker-trainers, trainers, evaluation consultant, program administrator and staff
  What they do:
  • Team decides evaluation focus, design, data collection instruments, analysis, etc.
  • Consultant may provide more hands-on work while those internal to the program provide ideas and feedback
  • Team reflects on findings and decides implications for future program directions

Worker-Led, Team-Based Evaluation (continued)

Traditional
  When:
  • At the end of the project
  How/Who:
  • Formal written report for program administrators and funders
  Use:
  • To make judgments

Participatory
  When:
  • Throughout the project
  How/Who:
  • Variety of formats (formal written reports, group activities, newsletters) for worker-trainers, program administrators, funders and staff
  Use:
  • Learn how the program works to guide ongoing improvements
  • Expand original learning

SREP Is Ongoing

• Beginning May 30, SREP partners will start a new round of three-day workshops that involve participants in training, planning and organizing to carry out participatory evaluations of their programs
• We welcome inquiries about joining us
