Program Evaluation Tools and Strategies for Instructional Technology


Slide 1

Program Evaluation Tools and Strategies for Instructional Technology

Slide 2

[email protected] 978-251-1600 ext. 204

[email protected]

www.sun-associates.com/necc2006 (this presentation will be linked to that site)

Slide 3

Where Do We Stand?

Who’s working on an actual project? Current? Anticipated?

Your expectations for today

Slide 4

Workshop Goals

To review the key elements of effective program evaluation as applied to the evaluation of instructional technology

To consider evaluation in the context of your actual projects (via example projects)

Slide 5

Why Evaluate?

- To fulfill program requirements
  - NCLB, and hence Title IID, carry evaluation requirements
  - Most other state and federal proposals require an evaluation component
    - Not simply a statement that "we will evaluate," but actual information on who will evaluate, the evaluation questions, and the methodologies
- Project sustainability
- Generation of new and improved project ideas
- Others?

Slide 6

By Definition, Evaluation…

- Is both formative and summative
- Helps clarify project goals, processes, and products
- Should be tied to indicators of success written for your project's goals
- Is not a "test" or simply a checklist of completed activities
- Qualitatively, are you achieving your goals?
- What adjustments can be made to your project to realize greater success?

Slide 7

A Three-Phase Evaluation Process

- Evaluation Questions
  - Tied to the original project goals
  - Indicator rubrics allow for authentic, qualitative, and holistic evaluation
- Data Collection
  - Tied to the indicators in the rubrics
- Scoring and Reporting
  - Role of the evaluation committee

(pg. 5 in the workbook)

Slide 8

Who Evaluates?

- Committee of stakeholders (pg. 10)
- Outside facilitator?
- Task checklist (pg. 6)
- Other issues:
  - Perspective
  - Time-intensive

Slide 9

Project Sample

Slide 10

An Iterative Process

Evaluation breaks your vision down into increasingly observable and measurable pieces.

Slide 11

Goals Lead to Questions

- What do you want to see happen? These are your goals.
- Rephrase goals into questions.
- Achieving these goals requires a process that can be measured through a formative evaluation.

Slide 12

…And Then to Indicators

- What is it that you want to measure?
- What are the conditions of success, and to what degree are those conditions being met?
- By what criteria should performance be judged?
- Where should we look, and what should we look for, to judge performance success?
- What does the range in the quality of performance look like?
- How should different levels of quality be described and distinguished from each other?

Slide 13

- Indicators should reflect your project's unique goals and aspirations
  - Rooted in the proposed work
  - Indicators must reflect your own environment: what constitutes success for you might not for someone else
- Indicators need to be highly descriptive and can include both qualitative and quantitative measures
- You collect data on your indicators
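To make the indicator-to-rubric step concrete, here is a minimal sketch in Python. The indicator question, level names, and descriptors are invented examples, not content from the workshop workbook or an actual Sun Associates rubric:

```python
# A minimal, hypothetical sketch of one rubric indicator.
# The question, level names, and descriptors are invented examples.

from dataclasses import dataclass


@dataclass
class Indicator:
    """One rubric row: an evaluation question plus descriptive levels."""
    question: str                # the evaluation question it addresses
    descriptors: dict[str, str]  # level name -> highly descriptive text


rubric = [
    Indicator(
        question="Are teachers integrating technology into daily instruction?",
        descriptors={
            "Beginning": "Technology use is occasional and teacher-centered.",
            "Developing": "Students use technology for some curriculum tasks.",
            "Proficient": "Technology is routinely woven into student work.",
            "Exemplary": "Students independently choose tools to meet learning goals.",
        },
    ),
]

# Data collection targets these descriptors: for each level, ask what
# observable evidence (surveys, observations, artifacts) would place
# your program there.
for indicator in rubric:
    print(indicator.question)
    for level, text in indicator.descriptors.items():
        print(f"  {level}: {text}")
```

The point of the descriptive levels is that they name observable conditions, which is what makes the later data collection targeted rather than generic.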

Slide 14

Try it on a Sample

Using the Evaluation Logic Map, map your:
- Project purpose/vision
- Goals
- Objectives
- Actions

We’ll take 20 minutes for this… and then come back for indicators.
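As a hypothetical companion to this exercise, the map's structure can be sketched as nested data. The four levels mirror the list above; the sample project content is invented for illustration:

```python
# Hypothetical sketch of an Evaluation Logic Map as nested data.
# The purpose/goal/objective/action levels come from the exercise;
# the sample project content is invented.

logic_map = {
    "purpose": "All students use technology to support curriculum learning.",
    "goals": [
        {
            "goal": "Teachers integrate technology into core instruction.",
            "objectives": [
                {
                    "objective": "80% of teachers complete integration training.",
                    "actions": [
                        "Offer monthly after-school workshops.",
                        "Pair novice teachers with mentor teachers.",
                    ],
                },
            ],
        },
    ],
}

# Walking the map top-down shows the vision breaking into increasingly
# observable, measurable pieces -- the same decomposition used later to
# write evaluation questions and indicators.
print(logic_map["purpose"])
for goal in logic_map["goals"]:
    print(" Goal:", goal["goal"])
    for obj in goal["objectives"]:
        print("  Objective:", obj["objective"])
        for action in obj["actions"]:
            print("   Action:", action)
```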

Slide 15

1. Logic mapping the sample - 20 min

2. Writing questions and indicators - 30 min

3. Discuss with partner team - 20 min

4. Debrief with everyone - 20 min

Slide 16

To Summarize...

- Start with your proposal or technology plan
- Logic map the connections between actions, objectives, and goals
- From your goals/objectives, develop evaluation questions
- Questions lead to indicators
- Indicators are organized into rubrics
- Data collection flows from that rubric

Slide 17

Evidence/Data Collection

- Classroom observation, interviews, and work-product review
  - What are teachers doing on a day-to-day basis to address student needs?
- Focus groups and surveys
  - Measuring teacher satisfaction
- Triangulation with data from administrators and staff
  - Do other groups confirm that teachers are being served?
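One way to picture the triangulation step: compare how different respondent groups answer the same item. The sketch below is hypothetical Python with invented ratings and an arbitrary agreement threshold, not a method prescribed by the presenters:

```python
# Hypothetical triangulation check: do administrators and staff confirm
# what teachers report? All ratings are invented 1-5 responses to the
# same item ("Teachers' technology needs are being served").

from statistics import mean

responses = {
    "teachers":       [4, 3, 5, 4, 4, 2, 5],
    "administrators": [4, 4, 5, 3],
    "staff":          [3, 4, 4, 5, 3],
}

group_means = {group: mean(ratings) for group, ratings in responses.items()}
for group, avg in group_means.items():
    print(f"{group:14s} mean rating: {avg:.2f}")

# A wide gap between groups (here, an arbitrary 1-point threshold)
# flags a finding worth probing in interviews or focus groups.
spread = max(group_means.values()) - min(group_means.values())
print("Groups agree" if spread <= 1.0 else "Groups diverge: follow up")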

Slide 18

Data Collection Basics

- Review existing data
  - Current technology plan
  - Curriculum
  - District/school improvement plans
  - www.sun-associates.com/eval/sample
- Create a checklist for data collection
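Since the slide suggests building a checklist, here is a hypothetical minimal version (the sources and indicators are invented; the idea is to pair each data source with the rubric indicator it feeds):

```python
# Hypothetical data-collection checklist: each entry pairs a data source
# with the rubric indicator it informs. All items are invented examples.

checklist = [
    {"source": "Current technology plan", "indicator": "Alignment of goals",     "done": False},
    {"source": "Teacher survey",          "indicator": "Classroom integration",  "done": True},
    {"source": "Classroom observations",  "indicator": "Classroom integration",  "done": False},
]

for item in checklist:
    mark = "x" if item["done"] else " "
    print(f"[{mark}] {item['source']} -> {item['indicator']}")
```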

Slide 19

Surveys

Creating good surveys:
- Length
- Differentiation (teachers, staff, parents, community, etc.)
- Quantitative data
- Attitudinal data
- Timing/response rates (getting returns!)

www.sun-associates.com/eval/samples/samplesurv.html
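To make the response-rate and attitudinal-data points concrete, here is a hypothetical tally (all counts and ratings invented, simple percentage arithmetic) of survey returns by group and one Likert-style item:

```python
# Hypothetical tally of survey returns and one attitudinal (Likert) item.
# All counts and ratings below are invented for illustration.

from collections import Counter

surveyed = {"teachers": 120, "staff": 45, "parents": 300}
returned = {"teachers": 96, "staff": 30, "parents": 105}

# Response rate per group: returns are what make the data usable.
for group in surveyed:
    rate = returned[group] / surveyed[group] * 100
    print(f"{group:8s} response rate: {rate:.0f}%")  # teachers: 80%, etc.

# Attitudinal item: "Technology has improved my teaching."
# (1 = strongly disagree ... 5 = strongly agree)
teacher_ratings = [5, 4, 4, 3, 5, 2, 4, 4, 5, 3]
counts = Counter(teacher_ratings)
agreeing = sum(n for rating, n in counts.items() if rating >= 4)
print(f"Teachers agreeing (4 or 5): {agreeing / len(teacher_ratings):.0%}")
```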

Slide 20

Online Survey Tools

- VIVED
- Profiler
- LoTi
- Zoomerang
- SurveyMonkey.com

Slide 21

Survey Issues

- Online surveys produce high response rates
- Easy to report and analyze data
- Potential for abuse
- Depends on access to connectivity

Slide 22

Focus Groups/Interviews

- Teachers
- Parents
- Students
- Administrators
- Other stakeholders

Slide 23

Classroom Observations

- Using an observation template
- Using outside observers

Slide 24

Other Data Elements?

- Artifact analysis
  - A rubric for analyzing teacher and student work?
- Solicitation of teacher/parent/student stories
  - This is a way to gather truly qualitative data
  - What does the community say about the use and impact of technology?

Slide 25

Dissemination

- Compile the report
- Determine how to share the report
  - School committee presentation
  - Press releases
  - Community meetings

Slide 26

Conclusion

- Build evaluation into your technology planning effort
- Remember, not all evaluation is quantitative
- You cannot evaluate what you are not looking for, so it's important to develop expectations of what constitutes good technology integration

Slide 27

More Information

[email protected] 978-251-1600 ext. 204

[email protected]

www.sun-associates.com/evaluation
www.edtechevaluation.com