
DIFFERENCES IN PROFESSIONAL DEVELOPMENT TRAINING BETWEEN ONE CORPORATION AND ONE LARGE TEXAS PUBLIC SCHOOL DISTRICT

A Dissertation

by

YOLANDA E. SMITH

Submitted to the Graduate School
Prairie View A&M University
in partial fulfillment of the requirements for the degree of

DOCTOR OF PHILOSOPHY

February 2008

Major Subject: Educational Leadership


DIFFERENCES IN PROFESSIONAL DEVELOPMENT TRAINING BETWEEN ONE CORPORATION AND ONE LARGE TEXAS PUBLIC SCHOOL DISTRICT

A Dissertation

by

YOLANDA E. SMITH

Approved as to style and content by:

_____________________________
William Allan Kritsonis, Ph.D.
(Dissertation Chair)

_____________________________          _____________________________
Ben C. DeSpain, Ed.D.                   Douglas Hermond, Ph.D.
(Member)                                (Member)

_____________________________          _____________________________
David Herrington, Ph.D.                 Camille Gibson, Ph.D.
(Member)                                (Member)

_____________________________
William Parker, Ed.D.
(Dean, Graduate School)

February 2008


ABSTRACT

Differences in Professional Development Training Between One Corporation and One Large Texas Public School District. (February 2008)

Yolanda E. Smith, B.S., Texas Southern University; M.Ed., Prairie View A&M University

Chair of Advisory Committee: William Allan Kritsonis, Ph.D.

In a world of accountability, corporations want to ensure that the efforts given to professional development make a difference in performance. The purpose of the study was to compare public education professional development training programs with corporate sector professional development training programs using the Professional Development Assessment Tool (PDAT).

The following research questions guided this study:

Quantitative

What are the differences in participants' reactions, participants' learning, organizational support, and participants' use of knowledge and skills regarding the professional development training between public educators and corporate employees as measured by PDAT?

Qualitative

What are the differences in how the evaluation of participants' learning outcomes is determined between private corporations and public education as measured by Guskey's model?


This study utilized the triangulation design of the mixed-methods approach. It gathered quantitative data through the online PDAT survey/questionnaire. Qualitative data were collected in two parts. The first part was collected with the questionnaire. The second part, regarding overall effectiveness and forms of evaluation used, was obtained through interviews with upper management.

The t-test for two independent samples was used to determine if there was a significant difference in public educators' and corporate employees' ratings of professional development programs. Transcripts of the questionnaire were entered into NVivo software and coded according to the themes that emerged from the data gathered. Interviews were used as anecdotal records to support the quantitative data. Results from the quantitative data, the questionnaire, and the interviews were triangulated in order to strengthen the credibility of the data regarding the overall effectiveness of professional development programs.

The findings from this study were as follows: (1) There was a significant difference between public educators and corporate employees in how they viewed participants' reactions, participants' learning, organizational support, and the use of knowledge and skills; (2) There was a significant difference in the overall effectiveness of the professional development programs provided to public educators and corporate employees; (3) Employees indicated a positive attitude about learning new skills when the learning relates directly to their job; and (4) Evaluation can make a difference in determining the overall effectiveness and quality of professional development programs.


DEDICATION

In loving memory of my parents, Andrew Edwards Sr. and Dolores Hernandez Edwards, who made an unselfish choice to give me a better life than they could have ever known possible.

In loving memory of my brothers Andrew Edwards Jr. and Maurice Edwards, I carry you with me every day.

To my oldest sister, Margaret E. Fisher, the one I grew to know as mama, who became my first teacher in life and from whom my foundation was laid.

To Emmit E. Fisher Sr., the one I grew to know as daddy, who taught me the real meaning of studying and to always be responsible.

To my brothers and sisters, Raymond Edwards, Irma Green, Daniel Edwards, Nellie Smith, and Sylvia Edwards, because of your love and support, I went into this world believing I could do anything. You have each taught me things I needed.

To my daughters, Desiree, Darrelyn, De’Anna, and Dylana, you girls are the reason I live, love, hope and dream. Without you there is no me.

To my granddaughter, Brooke, you are the apple of my eye.

To my companion and best friend, Bill Wesley, for constantly encouraging and supporting me in all my endeavors. You challenge me to reach for my dreams.

To Eddie, Andy, Sarah, and Nathanlyn, thanks for sharing your parents and being my other brothers and sisters.

And finally, to my Savior, Jesus Christ, who gave me the following words to meditate on whenever I was tired, stressed, weary, and anxious:

"Finally, brethren, whatever things are true, whatever things are noble, whatever things are just, whatever things are pure, whatever things are lovely, whatever things are of good report, if there is any virtue and if there is anything praiseworthy, meditate on these things."

Philippians 4:8


ACKNOWLEDGEMENTS

I strongly believe that God puts people in your life for a reason and that the prayers of my mother have been answered. It is with great pleasure that I pause and give thanks to all who have encouraged and supported me across this new bridge of knowledge and wisdom. Without their assistance this dissertation would not exist. I call these wonderful people my support crew. They are:

Dr. William A. Kritsonis, my dissertation committee chair, words cannot express the love and respect I have for you and your wonderful wife, Dr. Mary Alice Kritsonis. You were my cheerleader throughout this process. Thank you for your faith in me and most of all for opening my eyes to the world of philosophy and the importance of publishing. I found a lifelong friend in you.

Dr. Douglas Hermond, no one knows statistics like you. You set the standards for this program. Thank you for challenging me to be the best.

Dr. Ben DeSpain, you were the reason I applied to the PVAMU doctoral program. Thank you for that 30-minute talk and for always inspiring me to better myself.

Dr. David Herrington, we met in the master's program. Thank you for the many talks and for always being so supportive of me. Thank you for being a constant friend throughout my schooling at PVAMU.

Dr. Camille Gibson, my outside committee member, thank you for your willingness to serve on my committee and for always getting my revisions to me early. I really appreciate your expertise and time.


Dr. Robert Marshall, the instructor who tells it like it is. You were right about being diligent and making sacrifices in order to finish on time. I miss hearing your stories. Thanks for the many words of wisdom.

Dr. Pamela Freeman, I met you in the master's program and respected your dedication. Thank you for writing the grant that started this wonderful doctoral program. (Yes, I remember when you were writing it.)

Dr. Arthur Petterway and Dr. Teresa Hughes, thanks for your wonderful advice, your help, and most of all for being there when I needed you.

Dr. Gwen Sample, I never met a black female with a doctorate until I met you. Thanks for your lifelong friendship and encouragement.

Andy Lamboso, thanks for meeting me after working hours and most of all, thanks for helping me with statistics.

Cohorts 1, 2, 3, and 4, thanks for your warm friendships; each of you has touched my life in a special way.

Mrs. Jennifer Young, my manager, this would not have been possible without your support in allowing me to miss work on Fridays to attend classes and to do my research. I can only wish all managers were like you.

Ms. Mary Alice Alexander, my spiritual advisor, where would I be if you didn't keep me on the straight and narrow?

My co-workers and friends, who have always supported me through my master's and doctoral degrees, thanks for everything.

And to all of my family members, I was able to accomplish my dreams because I knew that no matter what, I was loved.


TABLE OF CONTENTS

Page

ABSTRACT.................................................................................................................iii
DEDICATION...............................................................................................................v
ACKNOWLEDGEMENTS..........................................................................................vi
TABLE OF CONTENTS...........................................................................................viii
LIST OF TABLES AND FIGURES...........................................................................xii

CHAPTER I. INTRODUCTION..................................................................................1
Background of the Problem..............................................................................4
Statement of the Problem..................................................................................5
Research Questions...........................................................................................7
Null Hypotheses................................................................................................8
Purpose of the Study.........................................................................................9
Significance of the Study................................................................................10
Assumptions....................................................................................................10
Delimitations of the Study..............................................................................11
Limitations of the Study..................................................................................11
Definition of Terms.........................................................................................11
Organization of the Study...............................................................................13

CHAPTER II. REVIEW OF LITERATURE.............................................................15
Professional Development Overview.............................................................15
Background on Professional Development....................................................17
Evaluating Professional Development............................................................21
Participants' Reaction......................................................................................29
Participants' Learning......................................................................................31
Participants' Use of New Knowledge and Skills............................................33
Organization Support and Change..................................................................36
Return on Investment: Students'/Participants' Learning Outcomes...............39
Summary.........................................................................................................45

CHAPTER III. METHODS........................................................................................46
Introduction.....................................................................................................46
Research Questions.........................................................................................47
Null Hypotheses..............................................................................................48
Research Methods...........................................................................................48
Research Design..............................................................................................50
Pilot Study.......................................................................................................51
Subjects of the Study.......................................................................................52
Instrumentation...............................................................................................56
Research Procedures.......................................................................................58
Data Collection and Recording.......................................................................61
Data Analysis..................................................................................................63
Summary.........................................................................................................64

CHAPTER IV. ANALYSIS OF DATA......................................................................66
Findings...........................................................................................................70
Research Question 1........................................................................................71
Research Question 2........................................................................................73
Research Question 3........................................................................................75
Research Question 4........................................................................................77
Research Question 5........................................................................................83
Discussion.......................................................................................................99
Summary.......................................................................................................105

CHAPTER V. SUMMARY, CONCLUSIONS, AND RECOMMENDATIONS....107
Summary of Findings....................................................................................109
Conclusions...................................................................................................114
Recommendations.........................................................................................115
Recommendations for Further Study............................................................117

REFERENCES..........................................................................................................119

APPENDIXES...........................................................................................................128
Appendix A Professional Development Assessment Tool Survey...............129
Appendix B Interview Questions..................................................................132
Appendix C Permission Letter to School District.........................................134
Appendix D School District Approval Letter...............................................136
Appendix E Letter to Principals....................................................................138
Appendix F E-Mail from the Professional Development Department.........140
Appendix G Permission Letter to Private Corporation.................................142
Appendix H Private Corporation Approval Letter........................................144
Appendix I Human Participant Education for Research...............................146
Appendix J Institutional Review Board........................................................149

VITA..........................................................................................................................151

LIST OF TABLES AND FIGURES

Table or Figure                                                                  Page

2.1 Guskey 2000 Model for Evaluating Professional Development........................28
2.2 Past Researches on Professional Development..................................................44
4.1.1 Participants' Reactions between Organizations...............................................72
4.1.2 Participants' Reactions between Administration and Subordinates.................73
4.2.1 Participants' Learning between Organizations.................................................74
4.2.2 Participants' Learning between Administration and Subordinates...................75
4.3.1 Organizational Support between Organizations...............................................76
4.3.2 Organizational Support between Administration and Subordinates.................77
4.4.1 Participants' Use of Knowledge and Skills between Organizations..................79
4.4.2 Participants' Use of Knowledge and Skills between Administration and Subordinates.....80
4.5.1 Total Effectiveness between Organizations......................................................81
4.5.2 Total Effectiveness between Administration and Subordinates........................82
4.5.3 Summary of Rating in Terms of Weighted Means............................................83
4.6.1 Training Impacted Employees' Work Performance..........................................86
4.6.2 Training Affected Attitude on Learning New Things.......................................92
4.6.3 Training Enhanced Skills or Behaviors.............................................................96

CHAPTER I

INTRODUCTION

Part of being a professional is remaining up-to-date with current ideas, strategies, and practices in one's field. Too often, teachers view professional development as a waste of their time, something disconnected from their teaching, their students, and their classrooms (Vontz & Leming, 2006). However, like practitioners in other professional fields, educators must keep abreast of emerging knowledge and must be prepared to use it to continually refine their conceptual framework and knowledge skills. Lowden (2003) stated that effective professional development is a necessary component in all educational improvement efforts.

With the 2001 No Child Left Behind (NCLB) Act, public schools are required to hire highly qualified teachers and to keep them well trained. The Texas Education Agency (TEA) implements the NCLB Act by imposing the High, Objective, Uniform State Standard of Evaluation (HOUSSE). Under HOUSSE, teachers are required to demonstrate competency by completing 24 HOUSSE points. These points are earned by participating in professional development training and/or Continuing Professional Education (CPE).

In today's ever-changing business environment, corporations are always looking for ways to remain competitive (Ketter, 2006). Corporate companies' survival is based on looking ahead and predicting future trends; by doing so, they increase their profits. If there are no profits, there will soon be no company. Leaders who understand how to lead their organizations in an increasingly competitive, global environment recognize that a highly trained workforce improves performance (Ketter, 2006).

In a world of accountability, corporations want to ensure that the efforts given to professional development training are making a difference in performance. A growing number of top executives recognize learning as a fundamental driver of organizational performance (Ketter, 2006). Evaluation of professional development is needed to ensure that professional development is making a difference.

Why is evaluation of these efforts important? Guskey (2000) stated four reasons why evaluation is important: (1) Professional development is a continuous process, not an event; (2) It is a systematic effort to bring about change; (3) There is a need for better information to guide reforms; and (4) There is increased pressure at all levels of education for greater accountability.

In creating professional development for teachers, administrators often forget two very important traits: individuality and self-determination. Effective professional development involves teachers (Inge, 2005). According to Lowden (2003), teachers surveyed responded that either the district-level administrators (69.8%) or the building-level administrators (42.4%) were making decisions about what professional development content would be offered to them. Teachers often believe that administrators who conduct the workshops or seminars are too disconnected from the realities of the classroom.

Unlike public education, private corporations allow employees a choice in their personal development. For example, professionals like medical doctors, lawyers, and engineers are allowed to choose areas of specialty within their field. Medical doctors are limited in their choice of specialty according to their licensure score: with a high score they can become a brain surgeon, while with a low score they may have to settle for a less prestigious specialty like psychiatry. However, with NCLB, state examinations such as the Texas Assessment of Knowledge and Skills (TAKS) have become the driving force behind what public educators receive as professional development training. The professional development opportunities for educators should allow teachers and administrators to move from one interest area to the next without abandoning their basic calling.

Quality professional development, or the lack thereof, affects how teachers value their profession. Another problem facing professional development for public educators is that society interferes with the professional growth of teachers by questioning its economic worth. Surprisingly enough, parents and board members view teacher development as time taken away from the learning process of students. They expect teachers to be in the classroom at all times (Marczely, 1996).

Often the public views teachers as merely "glorified baby-sitters" who get paid huge salaries for nine months of work, and the expenditure of funds toward professional development that takes them out of the classroom is seen as a breach of the public trust and a waste of money (Marczely, 1996). In research conducted by Hackett (2005), it was stated:

Until we improve the methods used to measure the links among professional development, teacher performance, and student achievement, educators will be unable to convince parents, community leaders, and local school boards to provide sufficient time and funding necessary to improve our teachers' understanding and our students' performance. (p. 4)

This is in complete opposition to the philosophy of the corporate world, which views professional development as an investment in the future, according to the 2006 American Society for Training and Development (ASTD) State of the Industry Report. Such proactive corporate companies are ahead of reactive ones. On the other hand, school board members and parents use a reactive approach. School district money is spent on many things except meaningful professional development. If the future of this nation lies with the teachers who are at the front lines of developing its talents, then professional development should be looked upon as an investment in the future.

Background of the Problem

Large expenditures on training and the emphasis on organizational efficiency are critical, so corporations must measure the impact of their training efforts (Sekowski, 2002). Corporate training for effective performance has become critical for many organizations in the private sector (Swanson, 1994), where corporate training is regarded as essential. When corporate training contributes to effective performance and corporate executives are convinced of that, corporate training may receive considerable attention, high status, and sufficient funds (Mulder, Nijhof, & Brinkerhoff, 1995).

Educational leaders continue to create or endorse ineffective professional development training for teachers, while at the same time expecting students' test scores to improve. Experts suggest that if teachers and students are expected to attain higher levels of achievement, then there will need to be an increase in resources devoted to teacher development (Hackett, 2005). The missing component in professional development training is the evaluation process.

At the Education and Value Conflict conference in 1997, a concern regarding the educational system was expressed. One of the topics discussed was the comparison of the quality of education teachers received and the link between teacher education and the business world (Natale & Fenton, 1997). Natale and Fenton (1997) stated that it would be interesting to investigate the state of professional development in business and determine how it is similar to and different from professional development in the school system. This research attempted to begin the exploration of this issue.

Statement of the Problem

Approximately 2.8 billion dollars of Title II money is aimed at preparing, training, and recruiting high-quality teachers and principals to ensure that all are thoroughly proficient by the 2005-2006 school year. In the 1,184 pages of the NCLB legislation, it is difficult to find sections that do not mention professional development. That frequent reference to professional learning indicates that the federal government recognizes professional development's key role in achieving NCLB's ambitious goals (Richardson, 2002). The problem with the large amount of funds in the Title II allocation is the option clause that lists six ways to improve quality teaching. Richardson (2002) noted that although professional development is listed, it is hard for a district to describe how it will evaluate the quality of the professional development provided to teachers and demonstrate that the quality is better than in years before.

Under the NCLB Act, school districts are mandated to hire highly qualified teachers. To be highly qualified, teachers must hold at least a bachelor's degree from a four-year institution, hold full state certification, and demonstrate competence in their subject areas (NCLB, 2001). Federal and state governments have issued new mandates that require teachers to assist all students in attaining high levels of achievement, and they have placed increasing pressure on those charged with delivering professional development experiences that impact teacher and student performance (Hackett, 2005). The concerns regarding effective professional development for teachers have increased since the NCLB Act, forcing school districts to examine new ways to improve teachers' knowledge and its implementation in their classrooms.

Increasing standards for student performance at proficient levels have motivated state- and district-level changes in several areas, including professional development. Regarding adult development, Oja (cited in Meell, 1985) maintained the position that professional development should attempt to help teachers develop maturity on both the personal level and the cognitive level. Meell also noted in her research three reasons why professional development was ineffective: (1) negative attitudes toward professional development because of poor planning and organization of the activities; (2) activities that are impersonal and unrelated to the day-to-day problems of the participants; and (3) professional development that has a district-wide focus and does not meet the needs of the individual schools and teachers.

It is uncertain whether the same ineffectiveness exists in corporate training. The shortage of information regarding possible similarities and differences in participants' reactions, participants' learning, organizational support, and participants' use of knowledge and skills gained from professional development training, both in public education and in the corporate sector, is the focus of this study. This research was centered on the similarities and differences between public education and the corporate sector in how they evaluate their professional development training to determine its overall effectiveness.

Research Questions

The following quantitative and qualitative research questions guided the study:

Quantitative

1. What are the differences in participants' reactions regarding the professional development training between public educators and corporate employees as measured by PDAT?

2. What are the differences in participants' learning in professional development training between public educators and corporate employees as measured by PDAT?

3. What are the differences in organizational support for professional development between public educators and corporate employees as measured by PDAT?

4. What are the differences in participants' use of knowledge and skills gained from their professional development training program provided by the corporate sector and public education as measured by PDAT?

Qualitative

5. What are the differences in how the evaluation of participants' learning outcomes is determined by the corporate sector and public school district, based on Guskey's model?

The Professional Development Assessment Tool (PDAT) [see Appendix A] was the online survey instrument on which employees of the corporate sector and the public school district gave their ratings on the issues and concerns considered to measure the critical levels of evaluating professional development. Ratings on the PDAT became the basis for answering the four quantitative questions. The results determined whether the null hypotheses were accepted or rejected.

The three-question, open-ended questionnaire was given at the end of the online survey. Answers to the three questions provided the emergent themes for the qualitative dimension of the study. Explanations were supported by anecdotal records from the interviews of corporate managers and district administrators.

In order to answer the quantitative research questions, the following null hypotheses were formulated:

Null Hypotheses

Ho1: There are no statistically significant differences in participants' reactions regarding the professional development training provided between public educators and corporate employees as measured by PDAT.

Ho2: There are no statistically significant differences in participants' learning throughout their professional development training between public educators and corporate employees as measured by PDAT.

Ho3: There are no statistically significant differences in organizational support for professional development training between public educators and corporate employees as measured by PDAT.

Ho4: There are no statistically significant differences in participants' use of knowledge and skills gained from their professional development training program provided by the corporate sector and public education as measured by PDAT.
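As noted in the abstract, these hypotheses were tested with a t-test for two independent samples comparing PDAT ratings from public educators and corporate employees. The sketch below is only a minimal illustration of that kind of test, not the study's actual analysis: the rating values are hypothetical, and the use of Python with the scipy library is an assumption made for the example.

```python
# Minimal sketch of an independent-samples t-test on PDAT-style ratings.
# The rating values below are hypothetical; scipy is assumed to be available.
from scipy import stats

# Hypothetical PDAT ratings (e.g., on a 1-5 scale) from the two groups.
public_educators = [3.2, 2.8, 3.5, 3.0, 2.6, 3.1, 2.9, 3.4]
corporate_employees = [4.1, 3.8, 4.4, 3.9, 4.2, 3.7, 4.0, 4.3]

# t-test for two independent samples (two-tailed).
t_stat, p_value = stats.ttest_ind(public_educators, corporate_employees)

# Reject the null hypothesis at the conventional .05 level of significance.
alpha = 0.05
decision = "reject" if p_value < alpha else "fail to reject"
print(f"t = {t_stat:.3f}, p = {p_value:.4f}: {decision} the null hypothesis")
```

Each null hypothesis (Ho1 through Ho4) would be tested in this way on the corresponding PDAT subscale, with rejection indicating a statistically significant difference between the two groups.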

Purpose of the Study

The purpose of the study was to compare public education professional development training programs with corporate sector professional development training programs. Guskey's (2000) model of five critical levels of professional development evaluation was used to examine the presence and significance of professional development programs for teachers in public schools and in training programs for employees in the corporate business world. Both quantitative and qualitative data were used to answer the question, "How do we determine the effects and effectiveness of activities designed to enhance the professional knowledge and skills of participants so that they might, in turn, improve the learning of students or, in the case of corporate employees, job performance?" (Guskey, 2000, p. 1)

The five critical levels of evaluation were: (1) participants' reactions; (2) participants' learning; (3) organizational support and change; (4) participants' use of knowledge and skills; and (5) student learning outcomes. For this study, "students" were the "participants" who received the professional development training. Insight into the strengths and weaknesses of the professional development programs may impact how both public education and the corporate sector improve their professional development efforts.


Significance of the Study

Research is needed to determine the effectiveness of professional development programs. Professional development has long been regarded as a vital part of the continuing effort of teachers to develop and refine their insights and skills, and to adapt to changes (Ehrenberg & Brandt, 1976). "The need for continuous professional development of teachers may be the one thing that policy makers, researchers, professional associations, the public, and school personnel agree on" (Lieberman & Miller, 2007, p. 99).

In organizing a concept for professional development that is rooted in school renewal and everyday practices of teachers, researchers have found three areas that school personnel should consider. These suggestions included the following: (1) teacher career development and personal change; (2) school organization to support ongoing learning communities; and (3) educational reform networks that support teacher learning (Hawley & Rollie, 2007). It is important to look at every avenue for assistance in establishing a professional development program that will meet the three areas mentioned above.

Assumptions

1. All surveys were answered honestly and completely.

2. Public education and the corporate sector require professional development training in terms of required hours or specific content areas.

3. All employees received the online survey and questionnaire.

4. All participants had access to a computer.


Delimitations of the Study

1. The use of one public school district has provided enough data for the study.

2. The use of one corporation has provided enough data for the study.

Limitations of the Study

1. Educators (teachers, counselors, and administrators) in the public school district were the only individuals surveyed regarding their beliefs about their professional development program.

2. Employees with at least a bachelor's degree in the corporate sector were the only individuals surveyed regarding their beliefs about their professional development programs.

3. The principals distributed the survey/questionnaire to their employees.

4. The corporate managers distributed the survey/questionnaire to their employees.

5. Only one urban school district residing in the southern region of the United States with professional development programs was used.

6. Only one private company residing in the southern region of the United States with professional development programs was used.

7. The school district was already using Guskey's model.

Definition of Terms

The following key terms are defined in this study:

Effectiveness is considered as the match between results achieved and those needed or desired (Rothwell & Kazanas, 1992). This involves aligning learning activities with business needs and providing timely access to relevant learning opportunities (American Society for Training and Development, 2006). Effectiveness is measured in terms of successful implementation of new strategies demonstrated by a change in practice (Miller, 2006).

Efficiency is the ratio between the resources needed to achieve results (inputs) and the value of the results (outputs) (Rothwell & Kazanas, 1992). It is also the result of balancing centralized and decentralized aspects of the learning function's internal process improvement, use of technology, and strategic outsourcing (American Society for Training and Development, 2006).

Evaluation is the systematic investigation of merit or worth (Guskey, 2000). Evaluation is the systematic process of collecting and analyzing data in order to determine whether and to what degree objectives have been or are being achieved (Boulmetis & Dutwin, 2000). Evaluation is a science and an art (Kirkpatrick & Kirkpatrick, 2006).

Impact is the degree to which a program or project resulted in changes (Boulmetis & Dutwin, 2000).

Private Corporations are not open to, intended for, or controlled by the public. A private corporation is a legal entity that exists independently of the person or persons who have been granted the charter creating it and that is invested with many of the rights given to individuals: a corporation may enter into contracts, buy and sell property, etc. (Webster's, 1999).

Profession is work that is regarded as prestigious, generally on the grounds that its members are not only well-paid but also need lengthy academic training founded on some systematic body of knowledge; exercise considerable freedom of decision in their day-to-day work; recognize ethical standards in their activities; serve society; and continue to learn and develop the process while practicing it (Rowntree, 1981).

Professional Development is the organized and deliberate attempt to improve teaching and enhance student learning (Kremer-Hayon, 1991). As defined by Sparks and Loucks-Horsley (1989), professional development refers to those processes "that improve the job-related knowledge, skills, or attributes of school employees." Professional development is "an individual's gradual and continuing mastery of a field's body of knowledge, methods, and procedures. It implies that practitioners adhere to ethical standards appropriate to the field" (Rothwell & Kazanas, 1992, p. 328). Professional development is a process that is (1) intentional, (2) ongoing, and (3) systemic (Guskey, 2000).

Public Education is financed out of public funds (rates and taxes) (Rowntree, 1981). Free, government-supported schools are open to all citizens (Kritsonis, 2002).

Staff Development includes professional development activities that are designed to involve the whole staff in developing common goals or themes (Kennedy, 1996).

Training Programs are programs which encompass the complexity of activities in business and industry that involve skill acquisition for the improvement of employee productivity and preparation in future skill needs (Meell, 1985).

Organization of the Study

The study contains five chapters. Chapter I includes an introduction, background of the problem, statement of the problem, research questions, purpose of the study, significance of the study, assumptions, delimitations and limitations of the study, and definition of terms. Chapter II includes a review of the literature on the historical professional development of public education and private corporations. Chapter III of the study describes the research design, pilot study, subjects of the study, instrumentation, procedures, data collection and recording, and data analysis. Chapter IV presents the findings of the study based on the quantitative and qualitative research questions. A summary of the study, conclusions, and recommendations for further study are presented in Chapter V.


CHAPTER II

REVIEW OF LITERATURE

Professional Development Overview

To understand the meaning of professional development is to understand the epistemological nature of professional development. Professional development requires activities designed to build the personal strengths and creative talents of individuals and thus create the human resources necessary for organizational productivity. "The nature of professional development for teachers relates directly to the nature of teaching" (Adey, 2004, p. 143). Attention given to educational professional development has increased over the years. With the standards of highly qualified teachers coming out of the No Child Left Behind (NCLB) Act passed in 2001 and the demands for high standards with calls for improving quality, teachers have a need, as never before, to update and improve their skills through professional development. This leads us to the question: "Why has the professional development of teachers already exercised so many good minds for so long?" The answers involve demands for improvements in the quality of education (Adey, 2004). A better question is: How complex is it to research professional development? It would be interesting to investigate the state of professional development in business and how it is similar to and different from that within the school system (Natale & Fenton, 1997). This perspective formed the basis of the study.

"Effective professional development and day to day practice are inextricably bonded in the learning community" (Roberts & Pruitt, 2003, p. 55). Professional development is more meaningful when it addresses the needs of the teachers and employees of corporations. There are numerous studies pertaining to professional development, and throughout most of them, the effectiveness of professional development continues to surface (Kent, 2004; Labuda, 2004; Miller, 2004; Vontz & Leming, 2006). Unsuccessful professional development initiatives are those that were "done" to teachers rather than "with" teachers (Fullan & Hargreaves, 1996). Inge (2005) cited a study conducted by Marshall, Prichard, and Gunderson (2001), which identified 18 effective school districts out of 1,500 studied and concluded that one attribute consistent in the effective districts was that professional development was considered job-embedded. Job-embedded professional development strategies are associated with the characteristics of the learning community in that they are collaborative and offer opportunity for conversation, reflection, and inquiry. They are also in accord with the principle that adult learners respond best when dealing with real-life situations and problems (Roberts & Pruitt, 2003).

In the corporate sector, another term used for professional development training is "training and development." For organizations, the related field of training and development deals with the design and delivery of learning to improve performance, skills, or knowledge within organizations. Blanchard and Thacker (2007) stated that "training is a set of activities, whereas development is the desired outcome of those activities. Training provides the opportunity for learning, and development is the result of learning" (p. 20). The transfer of training to implementation is a major challenge facing training professionals (Kirkpatrick & Kirkpatrick, 2005). Top management wants to know what results the organization is getting from the hundreds of thousands of dollars spent annually on training. Instructors and course designers want to know what impact their programs are having on individuals and the organization. State legislatures trying to determine if they should continue to spend millions of dollars each year on staff development want to know if their investment is a wise one. Both the private sector and public education organizations have to justify the money being spent on performance improvement training of their employees (Blanchard & Thacker, 2007; Guskey, 2000; Kirkpatrick & Kirkpatrick, 2005; Parry, 1997).

This study was centered on the Guskey (2000) model for evaluating professional development programs. Participants' reactions, participants' learning, organizational support, participants' use of knowledge and skills, and student learning outcomes were the five levels explored. The background of professional development, evaluating professional development, the five components of Guskey's model, and a summary are included in this chapter.

Background on Professional Development

Public Education

"We can train teachers to use overhead projectors, doctors not to drop their stethoscopes, and priests not to spill the communion wine: but obviously these things, however important, do not lie at the heart of their profession" (Wilson, 1997, p. 279). Educating professionals is the key phrase. Professionals in this study were people holding a professional degree. A professional degree is a degree awarded in a subject such as law, education, engineering, and so on (Rowntree, 1981).

"Student learning and development do not occur without teacher learning and development" (Hargreaves, 2007, p. 37). The problem with professional development is how it is viewed. It is viewed as "inservice" for teachers, as "delivering" professional development, and, last but not least, let's not forget the word "training." According to Richardson (2007), people tend to perceive these words as negative:

Inservice brings to mind a style of learning in which consultants traveled far and wide to spend a half-day here, a full day there, presenting to teachers and principals about what they should do to develop their competency in a new practice. Delivering suggests that one person picks up a package of information and takes it to another person. Delivery does not even suggest that the package has been received. Training is appropriate for dogs and obedience school, but not the kind of active learning that professional educators should engage in. (p. 61)

Hargreaves (2007) describes five flaws of staff development, each of which also puts a negative spin on professional development. The five flaws are:

Presentism: staff development occurs largely to achieve short-term goals.

Authoritarianism: the underlying belief is that those with positional authority know what is best, and they are going to make sure that teachers comply with it.

Commercialism: school improvement is a multimillion-dollar business. So, too, is staff development. Textbooks, videotapes, and all the training guides and consultancy support make a lucrative living for many people, not all of whom are educators.

Evangelism: staff developers who appeal to the emotional dependency of their followers. These evangelists are knowledgeable, articulate, personable, and charismatic. They are called gurus.

Narcissism: staff developers who mainly love themselves. They put glitzy processes and glamorous performances before worthwhile and substantive products.

There has to be a more positive word to use in describing professional development training. Richardson (2007) suggested using the phrase "professional learning" (p. 64). Professional learning implies that someone's brain has been changed by the learning. Teachers must be positive role models for students. If teachers do not value their own learning, how can they expect their students to value their own learning? (Richardson, 2007)

In order for professional learning to take place, principals should serve their schools as leaders of professional learning (Sparks, 2005). Instead of teachers leaving their jobs to learn, teachers learn as they do their day-to-day work. Successful principals know high-quality professional development is critical and should take high priority. "Students pass through our schools only once and are the ultimate beneficiaries of the quality teaching and supportive relationships such professional learning can produce in every classroom and throughout the school community" (Sparks, 2005, p. 2).

Private Corporations

The business environment over the next decade is expected to place even more demands on management. Managers carry a different and more complex burden for ensuring the success of the enterprise than do non-managers. Indeed, it is management's responsibility to ensure that all systems and resources are integrated properly so the organization can achieve its objectives (Blanchard & Thacker, 2007). Performance management is the single largest contributor to organizational effectiveness. Effectively managing employee performance breeds organizational success (Walker, 2007). Knowing how to motivate your "A" performers as well as your "C" performers is the key to any organization's success. This is where the professional development of today's leaders is crucial.

One of the most frequent types of training provided by companies over the last several years is management development and executive leadership. For companies of all sizes, approximately 37% of the training budget goes toward management and executive training (Blanchard & Thacker, 2007). High-performance leaders value the opportunity to transfer their knowledge to others and always have teachable points of view on leadership. "Noel Tichy says it best in his book, The Leadership Engine: simply put, if you are not teaching, you're not leading" (Betof, 2007, p. 48). In his interview with Sparks, Michael Fullan stated that "investment in leadership development is important" (Sparks, 2003, p. 56). Effective professional development of leaders is just as important as effective professional development of the employees who work for them.

There is an overwhelming amount of research on professional development training for teachers, with few research efforts focusing on professional development training for employees of private corporations. Research on the evaluation of professional development training given to employees is limited. Why is evaluation so important? Accountability. Smith (2006) cited training managers and human resource managers who are asked for more accountability in spending the company's workforce training dollars. Furthermore, with some of the funds from the NCLB Act allocated to preparing and recruiting teachers, districts are having to describe how they will evaluate the quality of the professional development and demonstrate that more teachers are receiving quality professional development than have received it in the past (Richardson, 2002).

Evaluating Professional Development

Imagine an organization or business that decided it would not look at its profitability, return on investment, or its productivity. The manager of this company never looks at how well or poorly the subordinates are performing their jobs. This is what training is like when no evaluation is conducted (Blanchard & Thacker, 2007). Endorsing evaluation is a lot like endorsing regular visits to the dentist. People are quick to endorse both activities, but when it comes to doing either one, most people are very uncomfortable (Boulmetis & Dutwin, 2000).

Training programs of any kind will benefit more from using a model to evaluate the overall effectiveness of the program. So why is there so much resistance to training evaluation?

Blanchard and Thacker (2007) stated that training managers come up with many reasons for not evaluating training. Some of those reasons are:

There's nothing to evaluate. (Training is a luxury provided as a reward for good performance.)

No one really cares about it. (The most common rationale for not conducting training evaluations is that formal evaluation procedures are too expensive and time consuming, and no one really cares anyway.)

Evaluation is a threat to my job. (Managers fear the result of an evaluation might show failure of their program, which would then affect their careers.)

The problem is human nature. It is a known fact that people tend to do what is familiar and comfortable, even if it is not effective (Kirkpatrick & Kirkpatrick, 2005). Guskey (2000) listed some of the reasons for the lack of success in research on the elements of effective professional development:

Confused criteria of effectiveness (researchers and evaluators have not agreed on the most appropriate criteria to use in determining the effectiveness of professional development);

The misguided search for main effects (researchers only look for "main effects"; that is, components or processes that are consistent across programs and contexts); and

The neglect of quality issues (most researchers focus only on issues of quantity and neglect important quality issues).

There are many reasons organizations can use for not taking the time to evaluate their training programs. The bottom line is that the cost of evaluation might be too high, but the benefits of accountability should provide a balance.

Evaluation of a professional development program has two important goals: to improve the quality of the program and to determine its overall effectiveness (Guskey, 2000; Lowden, 2003). In order to evaluate the overall quality of professional development training given to employees, a model is needed. There are numerous models to use for evaluating professional development. One of the earliest models for evaluation is Tyler's Evaluation Model, developed during the 1930s and 1940s. Tyler's model includes a series of steps that he believed should be followed in any systematic evaluation. These steps are:

(1) Establish broad goals or objectives;

(2) Classify or order the goals or objectives;

(3) Define the goals or objectives in observable terms;

(4) Find situations in which achievement of the objectives is demonstrated;

(5) Develop or select measurement techniques;

(6) Collect performance data; and,

(7) Compare the performance data with the stated objectives.

Because Tyler’s was the first model, it brought direction and clarity to the educational world. In the private sector, Brinkerhoff’s (1987) model, “Six Stages of Effective Human Resource Development,” was used to assess how well the design and implementation of a training program matched organizational needs. Its six

stages are:

Evaluate Needs: indicates the need to evaluate the organization’s goals.

Evaluate Human Resource Development (HRD): sets up an appropriate program

design, taking into consideration the needed skills and knowledge for the human

resource development endeavor.

Evaluate Operation: examines the implementation of the program design and how

it fits with the specific organizational context.

Evaluate Learning: assesses immediate outcomes, such as changes in learning,

through workplace performance.

Evaluate Endurance of Learning: assesses the application of learning, as it relates

to immediate outcomes which are identifiable in the workplace, and ensures that

the learning is headed toward a payoff for the organization.


Evaluate the Payoff: assesses organizational benefits resulting from training.

Another of the most frequently used evaluation models in the corporate world is Kirkpatrick’s evaluation model, developed by Donald L. Kirkpatrick (1977, 1978, 1996).

Kirkpatrick’s model was designed to judge the quality, efficiency, and

effectiveness of supervisory training programs in business and industry. Kirkpatrick’s

model consists of a four-level evaluation process. The four levels are:

(1) Reaction evaluation which focuses on how participants feel about the

program.

(2) Learning evaluation which measures the knowledge, skills, and attitudes that

participants acquire as a result of the training.

(3) Behavior evaluation which measures the extent to which the on-the-job

behavior of participants changed because of the training.

(4) Results evaluation which assesses the bottom line in business and

industry.

Guskey (2000) stated that although Kirkpatrick’s model has been applied widely

in numerous settings, it has seen limited use in the education environment because of

inadequate explanatory power. Kirkpatrick’s model is good at answering the “what?” questions, but it leaves out the “why?” question. Guskey was influenced by Kirkpatrick’s model and thought it would be useful for professional development in education. However, Guskey found that when he used Kirkpatrick’s model he still was not getting positive results. In examining the program closely, he found that although things were done right from a training perspective, educators were sent back to organizations that did not support them in what they were asked to do (Guskey, 2005). Therefore, Guskey added a fifth level regarding organizational support and change.

For Guskey (2000) there are three types of evaluation; they are planning,

formative and summative. “Planning” evaluation takes place before a program or

activity begins. Evaluation that is used to modify or improve a professional

development program is called “formative” evaluation. Evaluation to determine the

overall effectiveness of a professional development program is called “summative”

evaluation.

How should Guskey’s model be used? First, Guskey suggests that each of these

levels is important in its own right. Each level provides different types of information

that can be used in both formative and summative ways. Formatively, there is a need to

find out what has been done well at each level and, if not done well, how it can be

improved. Summatively, evaluators need to know the effectiveness of elements at each

level to judge the true value and worth of any professional development endeavor.

Second, Guskey’s model includes five levels to gather information about

professional development and its design is hierarchically arranged from simple to

complex. Each level builds on those that come before. Guskey reminds the researcher that people must have a positive reaction to a professional development experience before they can be expected to benefit from it. They need to gain specific knowledge and skills

before looking to the organization for critical aspects of support or change.

Organizational support is necessary to gain high quality implementation of new policies

and practices. Appropriate implementation is a prerequisite to seeing improvements in

student learning.


Third, to evaluate professional development, Guskey (2002) recommends starting at the end, with the desired outcome, and then working backwards (a brief sketch of this sequence follows the list):

Begin with the improvement in learning that we are seeking (level 5);

Based on research, determine what is required in terms of policies and practices

to facilitate this learning (level 4);

Look at the changes in the organization that will be required for successful

implementation of these policies and practices (level 3);

Look at the knowledge and skills staff will require to successfully implement the

policies and practices (level 2); and finally

Look at the professional development that will be required to provide staff with

the required knowledge and skills (Guskey, 2002).
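To make the direction of this backward-planning sequence concrete, the following minimal Python sketch (an illustration added here, not part of Guskey’s published materials) simply lists the five levels and prints them in planning order, from the desired outcomes back to the professional development experience itself:

```python
# Illustrative sketch only: Guskey's five levels, with evaluation read from
# Level 1 upward and planning worked backward from Level 5 to Level 1.
GUSKEY_LEVELS = [
    (1, "Participants' reactions"),
    (2, "Participants' learning"),
    (3, "Organization support and change"),
    (4, "Participants' use of new knowledge and skills"),
    (5, "Student learning outcomes"),
]

def planning_order():
    """Planning starts with the desired outcome (Level 5) and works backward."""
    return list(reversed(GUSKEY_LEVELS))

if __name__ == "__main__":
    for number, name in planning_order():
        print(f"Plan for Level {number}: {name}")
```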

Guskey (2005) states that this planning process compels educators to plan not in terms of

what they are going to do but in terms of what they want to accomplish with their

students. He argues that most of the critical evaluation questions that need to be

addressed in determining a professional development program’s effectiveness should be

asked in the planning stage. Planning more carefully and more intentionally not only

makes evaluation easier, it also leads to much more effective professional development.

The National Staff Development Council (NSDC) 2006 standards flow in the same bottom-to-top direction as Guskey’s approach to evaluating professional development. The following

questions were used to guide the revision of the NSDC standards:

What are all students expected to know and be able to do?

What must teachers know and do in order to ensure student success?

Where must staff development focus to meet both goals?


Staff development standards provide direction for designing a professional development

experience that ensures educators acquire the necessary knowledge and skills (NSDC,

2006).

The NSDC has three sets of standards that Guskey helped to develop: context, process, and content standards. The content characteristics refer to the “what” of professional development. They concern the new knowledge, skills, and understanding that are the foundation of any professional development effort. Process variables refer to the “how” of professional development. They concern not only the types and forms of professional development activities but also the way those activities are planned, organized, carried out, and followed up. Context characteristics refer to the “who,” “when,” “where,” and “why” of professional development. They involve the organization, system, or culture in which professional development takes place and where the new understanding will be implemented, as defined by the organization (Guskey, 2000). The following Table 2.1 describes the five levels of Guskey’s model and how the information gathered at each level is used.


Table 2.1 Guskey 2000 Model for Evaluating Professional Development

Level 1: Participants’ Reactions
 What questions are addressed? Did they like it? Did the material make sense? Will it be useful?
 How will information be gathered? Questionnaires administered at the end of the session; focus groups; interviews.
 What is measured or assessed? Initial satisfaction with the experience.
 How will information be used? To improve program design and delivery.

Level 2: Participants’ Learning
 What questions are addressed? Did participants acquire the intended knowledge and skills?
 How will information be gathered? Paper-and-pencil instruments; simulations and demonstrations.
 What is measured or assessed? New knowledge and skills of participants.
 How will information be used? To improve program content, format, and organization.

Level 3: Organization Support and Change
 What questions are addressed? What was the impact on the organization? Did it affect organizational climate and procedures?
 How will information be gathered? Questionnaires; focus groups; structured interviews with participants and school or district administrators.
 What is measured or assessed? The organization’s advocacy, support, accommodation, facilitation, and recognition.
 How will information be used? To document and improve organizational support; to inform future change efforts.

Level 4: Participants’ Use of New Knowledge and Skills
 What questions are addressed? Did participants effectively apply the new knowledge and skills?
 How will information be gathered? Questionnaires; structured interviews with participants and their supervisors.
 What is measured or assessed? Degree and quality of implementation.
 How will information be used? To document and improve the implementation of program content.

Level 5: Student Learning Outcomes
 How will information be gathered? Questionnaires; structured interviews with students, parents, teachers, and/or administrators; participant portfolios.
 What is measured or assessed? Cognitive (performance and achievement); affective (attitudes and dispositions).
 How will information be used? To focus and improve all aspects of program design, implementation, and follow-up; to demonstrate the overall impact of professional development.


The primary focus of the research is evaluating the differences in the professional development training given to corporate employees in one corporation and to public educators in one school district. The researcher chose the Guskey 2000 model over the Kirkpatrick model because of the formative and summative evaluation methods it includes. In using the Guskey model to evaluate professional development programs, the overall quality of the programs provided to participants will be measured in terms of their impact on changes in participants’ knowledge, skills, attitudes, and beliefs. If Guskey’s model for education can be applied successfully in the corporate world, then any differences that emerge may yield insights for education. Guskey (2000) emphasized that in order for staff development to have an impact on students, it must first have an impact on the teachers who participate. It is likely that for professional development to have an impact on teachers’ performance, it must first have an impact on the individual teacher who participates. This is where the evaluation of professional development training programs is crucial for accountability. Participants’ Reactions, Participants’ Learning, Organizational Support, Participants’ Use of New Knowledge and Skills, and Student Learning Outcomes will be examined in one private corporation and one large Texas public school district.

Participants’ Reactions

In the level one part of the evaluation process the categories include content

questions, process questions, and context questions (National Staff Development Council,

1994, 1995a, 1995b), as cited in Guskey (2000). Evaluating how a participant felt about


a training session is one of the most common forms of professional development

evaluation. It basically asks the question “Did you like it?”

Content questions measure the relevance, utility, and timeliness of the topics

explored through the training experience. Content questions also focus on the new

knowledge, skills, and understandings that are the basis for the program. Some key

questions to be asked are: Were the issues explored relevant to your professional

responsibilities? Did the content make sense to you? Did the content relate to your

situation? Will what you learned be useful to you? Will you be able to apply what you

learned? Comments from participants will tend to be more positive when the content

addresses specific problems and offers practical, relevant solutions that can be

implemented immediately (Guskey, 2000).

Process questions relate to the conduct and organization of the professional

development experiences. In other words, they ask how things were done. Process

questions tend to focus on the program leaders. Some key questions to be asked are: Was

the leader knowledgeable and helpful? Was the leader or group facilitator well prepared?

Were goals and objectives clearly specified when you began? Was sufficient time

provided for the completion of tasks? Depending on the learning styles of the

participants, reactions to a specific form of professional development training may vary

(Guskey, 2000).

Context questions generally relate to the setting of the professional development

experience. Context deals more with the environment. Were the facilities conducive to

learning? Was the room the right size for the group? Was the lighting adequate? Level

one is the basic form of evaluation and the easiest to conduct (Guskey, 2000).


Data that measure participants’ reactions during professional development training help leaders spot trouble areas while validating the programs (Champion, 2003). Reactions of participants should be measured on all programs for two reasons: to let the participants know that trainers value their reactions, and to measure their reactions and obtain suggestions for improvements (Kirkpatrick & Kirkpatrick, 2005). Participants’ reactions allow evaluators to collect impressions and opinions of a training session, but opinions do not reflect what participants are really learning. Evaluators must make sure not to confuse participants’ reactions with participants’ learning (Champion, 2003).

Participants’ Learning

Evaluators always hope for a positive reaction from participants during a training

session. The most important part of an evaluation is whether learning took place.

Professional development is a purposeful and intentional process designed to enhance the

professional knowledge and skills that participants acquire as a result of their experience

(Guskey, 2000). In assessing participants’ learning there are three categories: cognitive

(knowledge and understanding), psychomotor (skills and behaviors), and affective

(attitudes and beliefs). One of the main goals in assessing participants’ learning is change in the affective category. Many professional development training programs or

activities have a goal to change the participants’ attitudes, beliefs, and dispositions. This

is accomplished by trying to gain commitment and enthusiasm from the participants.

Trying to obtain a commitment and enthusiasm from participants at the beginning of a

new presentation is hard; instead, professional development programs should provoke a

sense of curiosity, exploration, and experimentation (Guskey, 1998a).


Champion (2003) discussed some of the methods used to assess participants’ learning. One method is leadership academies, where a variety of tools are used to determine what participants know when they enter the academy and what they can do when they leave. Technology training programs are another method used to assess participants’ learning, by giving a pre-test and post-test to measure what each participant knows. Yearlong training programs are another method used to determine participants’ learning, by giving participants ongoing projects and allowing peers or program leaders to gather the data through observations. The last method is the workshop, in which training leaders periodically check participants’ understanding throughout the event by asking impromptu questions or by having participants evaluate a teacher on a video. In dealing with accountability, it is important for program developers to consider Champion’s (2003) 10 suggestions for laying the groundwork:

1. Avoid surprise ambushes. Let participants know their learning progress

will be checked frequently.

2. Design the professional learning experience to ensure participants’

learning success. Assume your adult learners are diverse and will need

varying amounts of assistance.

3. Check learning progress early and often. Avoid waiting until the end of

the program to measure everything.

4. Practice what you teach about assessment tools. Model how to use the

kinds of assessment tools you want your participants to use with their

students.


5. Use the learning data immediately to improve the program. Treat

whatever learning data you collect as formative data.

6. Respect the learners’ privacy. Avoid setting up a situation in which

participants must make personal learning results public.

7. Check learning at higher levels. Be sure to match the level of your

learning assessment with the program’s intended learner outcomes.

8. Before using any learning assessment tool, work out the bugs. Always

field-test your learning assessments before using them.

9. Assess the important constructs and skills. Limit the learning assessment

to the most important constructs and major skills.

10. Remember to move on to the next evaluation question. Once you have

evaluated participants’ learning, address the next question: “Are

participants using what they learned?”

When evaluating learning, you measure what knowledge was learned, what skills were developed or improved, and what attitudes were changed. It is important to measure

learning because no change in behavior can be expected unless one or more of these

learning objectives have been accomplished (Kirkpatrick & Kirkpatrick, 2006).

Participants’ Use of New Knowledge and Skills

The main focus of this level is “Did what participants learn through their

professional development experience affect their professional practice?” Guskey (2000)

stated four challenges in evaluating participants’ use of new knowledge and skills. First,

the challenge is to identify accurate, appropriate, and sufficient indicators of use. This

would relate to the action or behavior that should or should not take place in relation to


new skills. Second, the challenge is to identify the indicators and to specify dimensions

of both quantity (frequency and regularity of use) and quality (appropriateness and

adequacy of use). Third, the challenge is to determine if adequate time has been allowed

for relevant use to occur. Fourth, the challenge is that sufficient flexibility must be allowed for contextual adaptations.

Traditionally, professional development in schools consisted of activities such as attending conferences or working on curriculum during teacher workshop days. Kelleher (2003) stated that these strategies have proved to be inadequate.

First, they tend not to help teachers translate new learning into classroom instruction.

Second, these strategies are often not necessarily tied to specific building and district

goals for student learning. “The best professional development helps teachers to think

critically about their practice; to develop new instructional strategies, along with new

techniques for creating curriculum and assessments; and to measure how new practices

have affected student learning” (Kelleher, 2003, p. 751).

Trainers in the business world have struggled for decades to solve the transfer problem: getting people who have demonstrated that they understand the learning to actually apply what they have learned on the job (Salopek, 2006). The problem with transferring new skills into practice was that people generally gave training good reviews, returned to their work settings with a sincere commitment to change their behavior, and then went

back to their old habits. Training scores were high, but change scores sat at zero

(Patterson, 2006).

Patterson’s (2006) solution to this transfer problem was: “If you want to change

behavior, you have to influence cognitive, behavioral, and motivational factors. You


must provide appropriate cues; and you must provide and require ample opportunities for

practice” (p.20). Trainers/evaluators need to find a way to cue people to use their new

skills; this is the missing link to a successful learning outcome. In addition to knowing,

doing, and wanting, people have to recognize when it is time to put new skills into action.

They have to learn to recognize, and then respond to, a cue or entry condition. If trainers

do not provide those cues, then learners will still score high on their tests, but will fail to

implement the skills on the job.

In a research study conducted by Lowden (2003), it was found that teachers felt

the new knowledge and skills they learned as a result of professional development had an

impact on student achievement. Research conducted by Eister (2004) found that

beginning principals were able to transfer the professional development which they

received to their on-the-job application because the principals perceived the content to be

relevant. Another study on participants’ learning and use of new knowledge and skills was conducted by Zender (2002), who found that teachers with a higher level of involvement in the professional development experiences were more likely to use the new knowledge and skills presented in their training than teachers who had little or no involvement in the planning phase.

An important factor in transferring the new knowledge and skills is the needs of

the adult learners. Adults in the workforce learn somewhat differently than school age

students. They usually learn for a specific reason rather than just for the love of learning

(Roberts & Pruitt, 2003). They prefer to be in charge of their own learning; they also

learn best when the new concepts and skills are related to real-life circumstances. This is


another reason that job-embedded staff development is so effective (Roberts & Pruitt,

2003).

Organization Support and Change

Guskey (2000) stressed that without a systemic approach, organizational factors can hinder or prevent the success of improvement efforts, even when the individual aspects of professional development are done correctly. Most important is the culture of the organization; culture refers to the values, beliefs, and norms that operate within that organization. In looking at organization support and change, Guskey (2000)

suggests that the evaluation process should address:

Organization policies – assessing whether the policies of the organization are aligned with its goals and objectives.

Resources – assessing if adequate resources are available to implement

training with the organization.

Protection from intrusions - assessing if the work environment allocates

time for the planning of training, outside of regular work hours.

Provision of time – assessing whether or not adequate time is provided to

encourage professional development.

Openness to experimentation and alleviation of fear – assessing the

openness to learning and experimentation of the organization.

Collegial support – assessing how supportive and encouraging the efforts

of colleagues are in implementing change.


Supervisor’s leadership and support – assessing how supportive and

encouraging supervisors are in efforts their employees display towards

professional development.

Higher level administrators’ leadership and support – assessing whether or

not administrators support employees in opportunities of knowledge

sharing with other professionals in other organizations.

Recognition of success – assessing whether or not employees’ improvements will be acknowledged and honored in order to maintain motivation and give encouragement.

Research on organizational support and change conducted by Zender (2002) found that teachers who were highly involved in the professional development process had more insight into the budget allocation for materials used in classrooms; therefore, they were more likely to perceive the materials’ adoptions as being adequate. Lowden (2003) discovered that teachers who had the most effective professional development experiences positively evaluated the organization’s support and change. Miller (2006) found that the organizational support and change that took place in her research was one of the strongest features. Miller (2006) cited that the support and change was widespread at every level of the organization, which was important to the participants and contributed to the overall success of the model being tested.

Budget allocation for professional development training matters in addressing organizational support. Budget allocation provides insight into the following questions: (1) How important is continuing education for employees to the organization? (2) Do employees have access to the necessary technology? (3) Are leaders open to


suggestions for improvements in policies and procedures? The higher-level

administrators’ leadership and support must exist for organizational support to be present.

Higher-level management sets the tone for the culture of the organization. The National

Staff Development Council (NSDC) feels so strongly about committing more resources

as a key to effective professional learning that the organization adopted a resolution and

advocates a standard addressing this message. NSDC recommends that school systems

dedicate at least 10% of their budgets to staff development and at least 25% of an

educator’s work time to learning and collaboration with colleagues (Hirsh, 2003). The

2006 State of the Industry report stated that the average percentage of payroll devoted to professional development was 2.2%.

Every organization ventures into training in order to meet some intended goal or

objective for the purpose of organizational growth (Tsarouhas, 2004). Organizations expect that acquired learning will be implemented. Organizational management plays an important

role in building a shared culture and commitment to professional growth through the

resources and time made available for employees to participate (Lutz-Laine, 2000).

Another key in building organizational support is creating professional development

training with the organization’s goals in mind. However, although spending as a percentage of payroll is not increasing, Ketter (2006) reports that employees are receiving more hours of training than in past years, which suggests companies are becoming more efficient in providing learning opportunities.


Return on Investment

Students’/Participants’ Learning Outcomes

“Many big organizations, including school districts, regional centers, state

education offices, and educational foundations, are attempting to raise the bar on the

evaluation of professional learning that they sponsor” (Champion, 2005, p. 61). Talking

about evaluation in committee meetings is easy until the conversation shifts into action mode; then most supporters of accountability sometimes have second thoughts. Most stakeholders ask, “What is the payoff for the investment in measurement and evaluation?” Champion (2005) cited Phillips, Phillips, and Hodges’s (2004) book Make Training Evaluation Work, which addresses the question “Why do it?” In the book, the authors cited 15 different payoffs to consider. They were:

1. Provide response, meet requests and requirements. The grantor of program funds

or a governing board requires or requests in-depth measurements and evaluation

of the programs funded.

2. Justify the budget. Thorough program evaluation can show that learning and

training programs add value to the organization and are worth the budget

allocation.

3. Improve program design. Thorough evaluation digs deeply enough to point out

specific aspects of a program that need to be redesigned to get better results.

4. Identify and improve dysfunctional processes. Program evaluation can uncover

misalignments, such as learning and development programs that have been

implemented but do not fit the organization’s needs.


5. Enhance the transfer of learning. Transfer of new learning to the workplace

remains a major problem for the designers of staff learning. Thorough program

evaluation can help leaders anticipate and address hurdles that likely will inhibit

the transfer of newly learned knowledge and skills to the workplace.

6. Eliminate unnecessary or ineffective programs. Thorough evaluation can provide

sufficient information to help leaders decide whether a program should be

discontinued.

7. Expand implementation of successful programs. Thorough evaluation can help

leaders decide with confidence whether to expand a limited or pilot program to

broaden its positive effects.

8. Enhance the respect and credibility of the learning and development staff.

Substantive data that go beyond measuring how good participants feel can

enhance other’s opinions of staff developers. Staff developers can help increase

their credibility when they present information establishing that new knowledge

and skills have been applied in the workplace, made an impact, and shown a

return on the investment.

9. Satisfy client needs. Potential participants or clients want to know that a program has been examined thoroughly, with data collection at all levels, as evidence that it likely will have a good return on investment.

10. Build support from managers/administrators. Managers/Administrators often

view training and development programs as a nuisance because the programs take

staff away from their jobs.


Managers/Administrators often respond positively when they know that a

program has been thoroughly evaluated and proven to be worth the time away

from the job.

11. Strengthen relationships with key executives and administrators. Top

administrators have great influence over the learning and development function of

organizations. Their perception of staff development as a contributing partner is

critical. Thorough and credible evaluation adds to the top administrators’

impression and use of the department.

12. Set priorities for learning and development. Since the need for staff learning and

development in an organization nearly always exceeds available resources,

leaders must prioritize. Accurate data on program impact can help leaders make

data-driven decisions about their priorities with confidence.

13. Reinvent learning and development. Thorough measurement and evaluation can

help to ensure that learning and development programs are not simply icing on the

cake. Accurate evidence can reinvent learning and development as an essential,

not an extraneous activity.

14. Alter management’s perceptions of learning and development. Middle managers

often have negative views of learning and development programs and consider

them dispensable. Data can help change their minds.

15. Achieve a monetary payoff. Return-on-investment can be estimated by collecting

a variety of data, often easily found data, to determine some programs’ monetary

contribution to identified targets (e.g., absenteeism, retention, time to complete work, quality of team projects). A brief illustration of this calculation appears below.
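Payoff 15 refers to estimating a monetary return on investment. As a hedged illustration (the formula below is the widely used ROI calculation, not a procedure reported by Phillips, Phillips, and Hodges, 2004, or by this study, and the dollar figures are hypothetical), the figure can be expressed as net program benefits divided by program costs:

```python
def training_roi_percent(monetary_benefits: float, program_costs: float) -> float:
    """Standard training ROI: net program benefits as a percentage of program costs."""
    net_benefits = monetary_benefits - program_costs
    return (net_benefits / program_costs) * 100.0

# Hypothetical example: a program costing $40,000 that is credited with $90,000
# in monetary benefits (e.g., reduced absenteeism or faster work completion)
# yields an ROI of 125%.
print(training_roi_percent(90_000, 40_000))  # 125.0
```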


Many, if not all, of the payoffs designed for the corporate world can be determined by

using Guskey’s model. The terminology may be different. However, the end result is the

same when asking how to evaluate whether time, money, and effort result in better performance or outcomes. In other words, Guskey’s model asks, “Did all the

students/participants acquire the intended knowledge, skills, attitudes, beliefs, or

behaviors? This assumes, of course, that explicit student/participants’ learning goals were

identified when the program or activity was planned” (Guskey, 2000, p.211).

Miller (2006) used Guskey’s model in her research and found that the success

of student learning outcomes came from teachers and mentors coming together on a

regular basis with the primary consultant and discussing what they needed to do to

improve student learning. “Having teachers work together in classrooms was central to

the Guskey’s model. Students not only saw their teachers modeling knowledge sharing

but also were encouraged to share their knowledge with their classmates and with visiting

teachers during classroom visits” (Miller, 2006, p. 263). Lowden (2003) noted that

supervisors should be able to observe participants utilizing the new knowledge and skills

that they gained as a result of their own professional growth. This can be done by linking

the participants’ professional performance review to their personalized professional

development plan and the overall organization’s goals.

Corporations look at performance, while education looks at outcomes through students’ test scores and graduation rates. “There is an old saying among training

directors: When there are cutbacks in an organization, training people are the first to go”

(Kirkpatrick & Kirkpatrick, 2006).


There is a mountain of research on professional development for educators.

Among all the educational research there is hardly any on how the training given to

educators is being evaluated. There is even less research on the training and development

programs within corporations. The following Table 2.2 summarizes some of the past research conducted on professional development training using the same variables this study used.


Table 2.2 Past Research on Professional Development

Miller (2006). Professional Development in a Large School District: An Application of Guskey’s Model.
 Population/Sample: Grade one teachers, mentors, and principals.
 Variables: Participants’ reactions, knowledge and skills, organizational support, participants’ use of knowledge, impact.
 Methodology: Case study; quantitative and qualitative.
 Future research: Research linking professional development with student achievement in language arts.

Greene (2005). Quality Matters: A Different Perspective on the Relationship Between School Resources and Student Outcomes.
 Population/Sample: 303 public comprehensive high schools in New Jersey.
 Variables: Outcome variables (language arts and math gain scores); predictor variables (environment and resources).
 Methodology: Quantitative (correlational).
 Future research: Research on more efficient and effective allocation strategies.

Tsarouhas (2004). Understanding organizational context for the evaluation of training outcomes: A multi-site case study in the community mental health sector.
 Population/Sample: Four organizations in the mental health sector; 22 participants were interviewed.
 Variables: Guskey’s third level (organizational support and change).
 Methodology: Qualitative only (interviews).
 Future research: Various sectors besides education should be studied using Guskey’s model.

Lowden (2003). Evaluating the Effectiveness of Professional Development.
 Population/Sample: Certified K-12 teachers in two districts in New York State.
 Variables: Participants’ satisfaction, learning, organizational support and change, participants’ knowledge, student learning, teachers’ attitudes/beliefs.
 Methodology: Quantitative (survey only).
 Future research: Research on PD based on the New Reform; replication with a larger population; teacher perception of PD and teacher evaluation.


Summary

Two primary ways of intervening in the learning of individuals is through

schooling and through the development that takes place in corporations. Schools

are primarily influenced by tradition, and local control, while business

development is highly influenced by vendors and a pedagogy that fits with

business values (Natale & Fenton, 1997, p.68).

Guskey (2000) stated that evaluation at any of these five levels can be done well

or poorly, convincingly or laughably. The information gathered at each level is important

and can help improve professional development programs and activities. Sadly, the bulk

of professional development today is evaluated only at level one, if at all. Studies mentioned earlier in Table 2.2 are only the tip of the iceberg when it comes to research on

professional development. There is limited research comparing the differences or

similarities in professional development training between private corporations and public

education. It is important to evaluate the differences between private corporations and

public education since tomorrow’s employees will be affected by today’s educators.


CHAPTER III

METHODS

Introduction

The purpose of this study was to compare public education professional

development training programs with the corporate sector professional development

training programs. Guskey’s professional development evaluation model, comprising five critical levels, was used to examine the presence and significance of professional development programs for educators in a public school district and in training programs for employees in a private corporation. The five levels of this evaluation model included: (a) participants’ reactions; (b) participants’ learning; (c) organization support and change;

(d) participants’ use of knowledge and skills; and (e) student learning outcomes. For this

study “students” were the “participants” receiving professional development training.

In using Guskey’s model for evaluating professional development, the following question was addressed: What were the differences in participants’ reactions, participants’ learning, organization support and change, participants’ use of knowledge and skills, and student learning outcomes between a private corporation and a public school district? An instrument called the Professional Development Assessment Tool (PDAT), which included the survey/questionnaire, and the interview questions (see Appendix B) were utilized to obtain quantitative and qualitative data for this study.

Knowledge gained from the study may provide vital information that organizations can use to improve the professional development training given to their employees and to make a difference in employee performance. Evidence of accountability is


crucial in determining how organizations value professional development training as an

investment.

Research Questions

The following questions guided the quantitative and qualitative portions of this

study:

Quantitative

1. What are the differences in participants’ reactions regarding the professional

development training between public educators and corporate employees as

measured by PDAT?

2. What are the differences in participants’ learning in professional development

training between public educators and corporate employees as measured by

PDAT?

3. What are the differences in organizational support for professional development

between public educators and corporate employees as measured by PDAT?

4. What are the differences in participants’ use of knowledge and skills gained from

their professional development training program provided by the corporate sector

and public education as measured by PDAT?

Qualitative

5. What are the differences in how the evaluation of participants’ learning outcomes

is determined by the corporate sector and public school district, based on

Guskey’s model?


Null Hypotheses

Based on the quantitative research questions, the following hypotheses were

formulated:

Ho1: There are no statistically significant differences in participants’ reactions to

the professional development training provided between public educators

and corporate employees as measured by PDAT.

Ho2: There are no statistically significant differences in participants’ learning

throughout their professional development training outcomes between

public educators and corporate employees as measured by PDAT.

Ho3: There are no statistically significant differences in organizational support for

professional development training between public educators and corporate

employees as measured by PDAT.

Ho4: There are no statistically significant differences in participants’ use of

knowledge and skills gained from their professional development training

program provided by private corporations and public education as measured

by PDAT.

Research Methods

A mixed-methods study was utilized to examine the differences in professional

development training provided in a public school district and a private corporation.

Fraenkel and Wallen (2006) stated that increased attention has been given to mixed-

method studies and described Creswell’s three types of mixed methods designs:


Triangulation design is utilized when the researcher simultaneously collects both

quantitative and qualitative data, compares the results, and then uses those

findings to see whether they validate each other.

Explanatory design is used when the researcher first collects and analyzes

quantitative data, and then obtains qualitative data to follow up and refine the

quantitative findings.

Exploratory design is followed when the researcher first collects qualitative data

and then uses the findings to give direction to quantitative data collection. The

data are used to validate or extend the qualitative findings.

A triangulation design was used for this study. Quantitative data were collected

on participants’ reaction, participants’ learning, organizational support, and participants’

use of knowledge and skills using the web based survey tool called the Professional

Development Assessment Tool (PDAT).

Qualitative data were collected in two ways: (1) through the online questionnaire given at the end portion of the PDAT instrument relating to participants’ learning outcomes; and (2) through interviews conducted with professional development administrators and managers of the training and development department to explore how upper management evaluated the overall effectiveness of their professional development programs. The focus of the interview questions was on the evaluation process used to determine how the last two levels of Guskey’s model, namely participants’ use of knowledge and skills and student learning outcomes, were being evaluated.

The study utilized descriptive research methods. Fraenkel and Wallen (2006, p.

189) describe this type of research as: “it permits researchers to describe the information


contained in many, many scores with just a few indices, such as the mean, and median.”

A survey study also falls under the classification of descriptive research. Kritsonis (2005)

states that surveys are “an application of the scientific method to gather data from a

relatively large number of cases in order to describe a particular population” (p. 17). “When answers to a set of questions are solicited in person, the research is called an interview” (Fraenkel & Wallen, 2006, p. 12).

Research Design

This study utilized the triangulation design of the mixed-methods approach, in which both quantitative and qualitative information were gathered simultaneously.

Quantitative Data

Quantitative data were gathered regarding participants’ reactions, participants’

learning, organizational support and participants’ use of knowledge and skills. The

online survey was answered by respondents from a private corporation and a public

school district who had participated in professional development training or programs.

Qualitative Data

Qualitative data regarding overall professional development training effectiveness

were obtained through the online, open-ended questionnaire found at the end portion of

the PDAT survey, and through interviews with upper management in the human resource

departments in the public school district and in the private corporation. Under education,

upper management consisted of anyone serving in the capacity of assistant principal, principal, area superintendent, assistant superintendent, or director of the professional development department. Under the private corporation, upper management consisted of anyone serving as a first-line manager, second-line manager, assistant director, or director of the training and development department.

Pilot Study

A pilot study was necessary to test the reliability and validity of the procedure and

survey questions. Sixty respondents were purposively invited to participate: 30 educators from one school district in a large metropolitan area in the southwestern United States and 30 corporate employees from one private corporation. A test-retest method was used to determine the reliability of the instrument. Fraenkel and Wallen (2006) state, “A test-

retest method involves administering the same test twice to the same group after a certain

time interval has elapsed” (p. 159). Participants were asked to respond to the survey on

two occasions approximately three weeks apart.

A survey tool (PDAT) with 20 questions was utilized to gather the quantitative

data. Questions were tested for trustworthiness by using the content-related method.

Fraenkel and Wallen (2006) define content-related method as “the degree to which an

instrument logically appears to measure an intended variable; it is determined by expert

judgment” (p. G-2). Experts in the field of professional development evaluation

validated the questions by making sure they reflected Guskey’s model. Necessary changes were made based on input from the content experts. For the test-retest to work, participants were asked to put their name in place of the company name. A reliability coefficient was computed to test the reliability of the instrument. Changes were made to the instrument to include a one-click selection for organization and position title.

Qualitative data resulting from the online open-ended questionnaire involving Guskey’s fifth level of evaluation (student learning outcomes) and from the interview questions were tested for trustworthiness by using the content-related method. Any necessary changes were made based on input from the content experts. Results were categorized using the NVivo (version 7) software package. Frequencies for the responses by the different respondents (teachers, educational administrators, engineers, and management) pertaining to the different categories were tallied, and percentages were computed. Listing of categories was based on the total frequencies; those categories identified most by the respondents were listed first, followed by those with lower frequencies.
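The tallying and percentage step described above can be illustrated with a minimal Python sketch; the respondent groups match those named in the paragraph, but the coded responses and category labels below are hypothetical, since the actual coding was performed in NVivo 7:

```python
from collections import Counter, defaultdict

# Hypothetical coded responses: (respondent group, category assigned during coding).
coded_responses = [
    ("teacher", "test scores"),
    ("teacher", "classroom observation"),
    ("educational administrator", "test scores"),
    ("engineer", "performance review"),
    ("management", "performance review"),
    ("management", "test scores"),
]

# Tally category frequencies overall and by respondent group.
totals = Counter(category for _, category in coded_responses)
by_group = defaultdict(Counter)
for group, category in coded_responses:
    by_group[group][category] += 1

grand_total = sum(totals.values())
# Categories are listed from the most to the least frequently identified.
for category, count in totals.most_common():
    print(f"{category}: {count} ({count / grand_total:.0%} of all responses)")
for group, counter in by_group.items():
    print(f"{group}: {dict(counter)}")
```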

Isaac and Michael (1995) stated three basic considerations for pre-testing:

(1) select a sample of individuals who are representative of the population toward which

the questionnaire is eventually intended; (2) administer the pre-test under conditions

comparable to those anticipated in the final study; (3) check the percent of responses as

an estimate of what will occur in the final run.

The pilot study proved to be a good indication of what the final run would

produce. Corporate employees who rarely participated in outside surveys gave a higher

rate of return. In the hope of a higher return during the actual study, the researcher

modified the approach that was used for the public education employees based on the

results of the pilot study and expert advice. Pilot study participants’ data were excluded

from the research.

Subjects of the Study

This study collected data from educators and employees who work for a school district and a private corporation, each with at least 3,000 professional employees, based in the Region 4 area in Texas. Region 4 was selected based on the data from Texas

Education Agency (TEA) as being the largest region in the state. One school district and


one private corporation that were similar in size or revenues and had an active

professional development or training and development department were used. Purposive

sampling was used in selecting a school district and a private corporation. “Purposive

sampling allows the researcher to select a sample that [they] believe, based on prior

information will provide the data [they] need” (Fraenkel & Wallen, 2006, p.101). A

school district with a professional staff population of at least 3,000 was selected. A private corporation that was a sub-contractor to the National Aeronautics and Space Administration (NASA) and had a professional staff population or revenues similar to those of the selected school district was selected for this study.

Quantitative Data

Public Education

A database of school districts in Texas was obtained from the Texas Education Agency (TEA) and the Region 4 2006-2007 school district directory. Professional staff

population and revenue of the districts were obtained from the Academic Excellence

Indicator System (AEIS) report. There were seven school districts in the Region 4 area that had at least 3,000 professional staff members. Only one school district had a total

revenue budget of over a billion dollars. A letter was sent to the school district (see

Appendix C). Once the school district granted approval (See Appendix D), a cluster

sampling of schools took place. “An advantage of cluster sampling is that it can be used

when it is difficult or impossible to select a random sample of individuals from a

population sampling frame” (Fraenkel & Wallen, 2006, pp. 97-98). A letter was sent to the school building principals who were selected (see Appendix E).


The Raosoft sample size calculator was used to help determine the appropriate sample size

for a population of 15,000 employees. With a margin of error of 5% and a confidence

level of 95%, Raosoft recommended a sample size of 375. Salant and Dillman (1994)

confirm the sample size of 375 for a population of 15,000. “For descriptive studies, a

sample size with a minimum number of 100 is essential” (Fraenkel & Wallen, 2006, p.

104). For the public education employees, with a population of approximately 15,000 professional staff members, a cluster random sampling was used to reach a sample of at least 375 by selecting one high school, one middle school, two elementary schools, and one administration building. This gave the researcher a sample of 402. With only 192 of the 402 responding, the researcher solicited the help of the school district’s professional development personnel. The school district’s professional development department distributed the survey via e-mail to all the schools with a full-time mentor on campus and to

central office staff members (see Appendix F). This gave the researcher a sample size of

735 participants. The total number of participants that completed the survey was 475.

The rate of return was 65%.

Private Corporation

Private corporations that were sub-contractors to the National Aeronautics and

Space Administration (NASA) were identified from two resources. A list of contractors

was obtained from the NASA home page and from the Greater Houston Partnership web

page. Professional staff size and the revenues of each corporation were obtained from

each corporation’s web page. In looking at private corporations, the researcher focused more on their revenue. The corporation whose revenue was closest to one billion dollars

was considered. This was done to match the public and private groups. There were four


corporations that fit the criteria of the researcher. A letter was sent to the private

corporation (see Appendix G). Once approval to participate was granted (see Appendix H), cluster sampling was also used to select the departments within the selected

corporation.

The Raosoft sample size calculator was used to help determine the appropriate sample size

for a population of 3,000 employees. With a margin of error of 5% and a confidence

level of 95%, Raosoft recommended a sample size of 341. Salant and Dillman (1994) confirm the sample size of 341 for a population of 3,000. Cluster sampling produced

three departments giving a total sample size of 476; 304 participants completed the

survey for a return rate of 64%.

From these two sources a total of 1,211 participants were invited to participate.

From the invited participants 779 responded to the online survey. A 64% overall return

rate was obtained.
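The sample sizes recommended by the Raosoft calculator can be reproduced with a standard formula. The sketch below assumes a 95% confidence level (z = 1.96), a 5% margin of error, the most conservative 50% response distribution, and a finite population correction; these are common calculator defaults rather than settings documented in this study beyond the margin of error and confidence level stated above:

```python
import math

def required_sample_size(population: int, margin_of_error: float = 0.05,
                         z: float = 1.96, p: float = 0.5) -> int:
    """Cochran's formula with a finite population correction (assumed defaults)."""
    n0 = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)  # infinite-population estimate
    n = n0 / (1 + (n0 - 1) / population)                  # finite population correction
    return math.ceil(n)

print(required_sample_size(15_000))  # 375 (public school district)
print(required_sample_size(3_000))   # 341 (private corporation)
```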

Qualitative Data

The online questionnaire was given to all participants (educational administrators,

teachers/counselors, corporate managers, and corporate employees). Out of those who

participated, 369 responded to the questionnaire focusing on Guskey’s (2000) fifth level,

student learning outcomes. One-on-one interviews involved those serving in the position

of educational administrator or corporate manager. There were four educational

administrators and four corporate managers who were interviewed. Guskey (2000) stated

that one of the most efficient ways of gathering information on affective student/participant

outcomes is through interviews with their supervisors.


To protect the identities of the participants who were interviewed, the researcher coded their names and positions by using generic labels:

(CM)- For anyone serving in the capacity of corporate management;

(e.g. director, second line manager, and first line manager).

(EA)- For anyone serving in the capacity of educational administrator;

(e.g. principals, assistant principals, superintendent, area superintendent).

For confidentiality of the data from the online survey a random numeric code was

electronically generated and assigned to each survey/questionnaire submitted. The

company name was not used, just the words “public education” or “corporate sector.” All transcripts collected through the study are secured on a password-protected internet site

or in a safe deposit box for seven years. The researcher is the only person with the

password. For security purposes the password will be changed every three months.

When there was an opportunity to mask the data, it was done.

Instrumentation

Quantitative Data

A web-based survey/questionnaire, the PDAT, was distributed, and data were collected as responses were electronically sent back to the researcher via e-mail. PDAT consists of 20 Likert-type items and three open-ended questions developed by the researcher. PDAT

was worded so that either an educator or corporate employee could answer the questions.

PDAT was validated for content by experts in the field of professional development

evaluation.

PDAT has five categories. The first four categories deal with participants’

reactions, participants’ learning, organizational support, and participants’ use of

Page 69: Yolanda E. Smith, Dissertation, Dr. William Allan Kritsonis, Dissertation Chair

knowledge and skills. Under each category, a general statement was made, followed by five statements addressing that category. Each of the five statements was answered by selecting “strongly agree”, “agree”, “neutral”, “disagree”, or “strongly disagree”. “Strongly agree” received a score of five points; “agree” received four points; “neutral” received three points; “disagree” received two points; and “strongly disagree” received one point. Scores for descriptors within each indicator ranged from one to five, so each category score ranged from 5 to 25 and total scores ranged from 20 to 100. A category score closer to 25 indicated that the participant believed the professional development training was effective in that category, whereas a category score closer to 5 indicated that the participant believed the training was ineffective in that category. For overall effectiveness, a total score closer to 100 indicated that the participant viewed the professional development training as effective, and a total score closer to 20 indicated that the participant viewed it as ineffective.
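As an illustration of the scoring scheme just described, the Python sketch below computes category and total scores for a single hypothetical respondent; the response values are invented for illustration and are not data from the study.

    # Likert weights used by the PDAT scoring scheme described above.
    LIKERT = {"strongly agree": 5, "agree": 4, "neutral": 3,
              "disagree": 2, "strongly disagree": 1}

    # One hypothetical respondent: four scored categories, five statements each.
    responses = {
        "participants' reactions":     ["agree", "strongly agree", "agree", "neutral", "agree"],
        "participants' learning":      ["agree", "neutral", "agree", "disagree", "agree"],
        "organizational support":      ["strongly agree", "agree", "agree", "neutral", "agree"],
        "use of knowledge and skills": ["neutral", "agree", "disagree", "agree", "agree"],
    }

    category_scores = {name: sum(LIKERT[answer] for answer in answers)   # each category: 5 to 25
                       for name, answers in responses.items()}
    total_score = sum(category_scores.values())                          # overall: 20 to 100

    print(category_scores)
    print(total_score)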

Qualitative Data

An online questionnaire containing three open-ended questions and interview

questions were used to evaluate the overall effectiveness of professional development

training provided to employees. The NVivo software package (version 7) was used to develop emergent themes and to minimize misinterpretation. Guskey (2000) suggested that questionnaires or interviews can be used to gather information on participants’ use of knowledge and skills and on student learning outcomes. An explanation of the interview process was given prior to the interview questions. Interview questions


were guided by the last level of Guskey’s (2000) model, student learning outcomes.

Validity and Reliability

“Validity refers to the appropriateness, meaningfulness, correctness, and usefulness of the inferences a researcher makes. Reliability refers to the consistency of scores or answers from one administration of an instrument to another and from one set of items to another” (Fraenkel & Wallen, 2006, p. 150). The triangulation of the quantitative data analysis, the data collated from the online questionnaire, and the interviews supported the validity and reliability of the mixed-methods triangulation design.

For the quantitative part of the study, a pilot study was performed. To test the reliability of the PDAT instrument, a test-retest was completed. The test-retest yielded a reliability coefficient of +0.88, indicating a strong relationship between the two sets of scores obtained. Experts in the field of professional development evaluation validated the questions by making sure they reflected Guskey’s (2000) model. Necessary changes were made based on input from National Staff Development Council (NSDC) content experts.
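The coefficient reported above reflects the correlation between the two pilot administrations of the instrument. Purely as an illustration, a test-retest coefficient can be computed as in the sketch below; the paired scores are hypothetical and do not reproduce the study's pilot data.

    from scipy.stats import pearsonr

    # Hypothetical total PDAT scores for the same pilot respondents on two occasions.
    first_administration  = [78, 85, 62, 90, 74, 88, 69, 81]
    second_administration = [80, 83, 65, 92, 71, 89, 70, 84]

    r, _ = pearsonr(first_administration, second_administration)
    print(round(r, 2))   # the study reported a coefficient of +0.88 for its actual pilot data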

Research Procedures

Quantitative Data

Public Education

Once the school district was identified, the researcher went to the school district

website and downloaded the permission to conduct research application. Upon

completion of the permission to conduct research application, the researcher turned in all


required documentation with a letter (see Appendix C) to the assistant superintendent of

research. Permission was granted and a letter to that effect (see Appendix D) was sent to

the researcher. A cluster random sampling was done to select schools from all parts of

the district. Selected school principals were notified by e-mail (see Appendix E) along

with the PDAT website attached with the school district approval letter.

The principal sent all teachers and administrators of the school building an e-

mail with the website information and directions. As a backup plan, a letter explaining who the researcher was, the purpose of the study, and the website address of the PDAT survey/questionnaire was placed in the boxes of all teachers and administrators. In the

event the principal did not forward the website to the employees of the building, the

professional development department sent an e-mail out to all school building mentors

and personnel in the administration building. Upon completion of the survey, the

participants hit the submit button and the results were saved on the website. Notification

that someone responded to the survey was forwarded directly to the researcher’s personal

e-mail address.

Private Corporations

Once permission was granted by the ethics and legal department, a letter was sent

to the researcher (see Appendix G). A cluster random sampling was performed to select

departments within the corporation. Directors of selected departments were contacted by

the researcher. Directors who volunteered to participate distributed the PDAT survey/questionnaire to all professional employees through their mass e-mail distribution lists. The e-mail described the purpose of the survey and clearly stated that participation was voluntary. Two weeks after the initial e-mail was sent, the researcher


delivered a 3x4 index card reminding employees of the survey. Upon completion of the

survey, the participants hit the submit button and the results were saved on the website.

Notification that someone participated in the survey was forwarded directly to the

researcher’s personal e-mail address.

Qualitative Data

Step two of the research was the qualitative portion of the study. Qualitative data

were collected in two ways. First, the qualitative data were collected through the

questionnaire portion of the online PDAT instrument. Second, the qualitative data were

obtained through interviews with eight upper management personnel. Four management

personnel within the private corporation and four administrators within the school district

were interviewed. First-line managers and/or administrators working in either the professional development department or the training and development area of the human resources department were asked to participate. Interview questions addressed how they evaluated the last level of Guskey’s model: student learning outcomes.

An explanation of Guskey’s model, as it related to the training session evaluations, was given before the interview questions. Interviewees were given an overview

of the Guskey (2000) model and an explanation of why this information was important to

the researcher. The purpose of the interview was to determine the extent to which the

human resource department administrators viewed their evaluation process for

effectiveness as similar to the views of the employees receiving the training.

Administrative personnel (such as training and development directors and managers)

were given the same online survey as their employees. One-on-one interviews focusing

on how their organization evaluates the overall effectiveness of professional development


training provided to their employees took place any time after the online survey was

distributed. The time frame was one hour to two weeks after they had completed the

online PDAT survey/questionnaire.

Data Collection and Recording

Quantitative

Data collection took place in two parts. First, the quantitative data were collected

from the web-based survey/questionnaire. A letter and a hard copy of the survey were

sent to the superintendent of the school district requesting permission to survey its

professional staff. Once permission was granted, the researcher notified the principals.

Hughes (2006) cited Gall’s (2003) statement: “Researchers must inform each individual about what will occur during the research study, the information to be disclosed to the researchers, and the intended use of the research data that are to be collected” (p. 69). The PDAT web-based survey is a five-point, Likert-scale survey tool used for collecting information online. Once completed, the information was exported into an Excel file. The Excel file was coded and downloaded into the Statistical Package for the Social Sciences (SPSS), version 13, for analysis and kept on a password-secured internet site.

Qualitative

Second, the qualitative data were collected in two parts. Part one came from the questionnaire at the bottom portion of the PDAT survey/questionnaire. Part two was collected from the interview portion. An interview was administered to the professional/training development managers who volunteered to participate. Guskey (2000) recommends giving structured interviews to supervisors in determining the overall


effectiveness. This study used a structured, open-ended questioning format. Data from the interviews were recorded as handwritten notes. The notes of the interviews are being

kept in a bank safe deposit box for seven years. Interviewees’ names were not recorded,

but job titles were.

Qualitative data regarding the evaluation process and overall effectiveness of

professional development training were collected through online, open-ended

questionnaires and interviews. Results from the questionnaire were placed under

categories suggested by Guskey’s (2000) model. Transcripts of the questionnaires were entered into the NVivo software system (version 7.0) and coded according to the themes that

emerged from the data gathered. Themes that emerged from the data were compiled and

compared between public education and the corporate sector.

The researcher triangulated the results from the quantitative data analysis, the

information from the online open-ended questionnaire and the interviews in order to

strengthen the credibility of the data regarding the overall effectiveness of professional

development training programs. Regarding the transferability of the results, the interview questions used could be asked in either an educational environment or a corporate-sector environment. Information gathered may be used to help improve professional

development for both educators and/or corporate employees. Finally, one day after the

interview process, the questions and answers given to the researcher were e-mailed back

to the participants for verification and corrections.


Data Analysis

Quantitative

Descriptive statistics and analyses were conducted to test each question for each variable. After the collection of data, the next step was to test for statistical significance at the criterion value of p ≤ .05. Collected data regarding the weighted means of the two groups (public education sector and private corporation) were exported into the Statistical Package for the Social Sciences (SPSS). The type of data collected called for a t-test for independent means to test the hypotheses. The t-test for independent means is a parametric test of significance used to determine whether there is a statistically significant difference between the means of two independent samples (Fraenkel & Wallen, 2006). The independent variable was group membership: respondents from the public education sector versus respondents from the private corporation. The dependent variables were the ratings given on participants’ reactions, participants’ learning, organizational support, and participants’ use of knowledge and skills, whose weighted means were computed to indicate the overall effectiveness of the professional development.
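The study performed this test in SPSS (version 13). Purely as an illustration of the same procedure, the sketch below runs an independent-samples t-test in Python with SciPy on hypothetical category scores and applies the p ≤ .05 decision rule described above.

    from scipy import stats

    # Hypothetical category scores for each group (not the study's data).
    public_education    = [21, 19, 23, 20, 18, 22, 24, 19, 21, 20]
    private_corporation = [19, 18, 20, 19, 17, 21, 20, 18, 19, 20]

    t, p = stats.ttest_ind(public_education, private_corporation)
    decision = "reject" if p <= 0.05 else "fail to reject"
    print(f"t = {t:.3f}, p = {p:.3f}: {decision} the null hypothesis")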

Qualitative

Information from the qualitative data was organized in frequency distribution tables. After each emergent theme, the frequencies of responses from the different groups of respondents were tallied and percentages were computed. Frequencies were headcounts or tallies indicating the number of cases in a particular category or the total number of cases measured (Sirkin, 2006). A table consisting of columns for educators, education administration, corporate employees, and corporate administration was formed, with the categories listed as rows on the left side.
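As an illustration of the tallying just described, the sketch below counts hypothetical coded responses and computes percentages; the theme labels stand in for the NVivo-coded themes and are not the study's data.

    from collections import Counter

    # Hypothetical coded responses from one group of respondents.
    coded_responses = ["eager to learn", "no or little change", "eager to learn",
                       "more open-minded", "eager to learn", "more positive outlook"]

    counts = Counter(coded_responses)
    total = len(coded_responses)
    for theme, frequency in counts.most_common():
        print(f"{theme:<22} {frequency:>3} {100 * frequency / total:5.1f}%")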


Data collection of this study involved a triangulation of the following:

(a) quantitative data analysis from the online survey; (b) qualitative data from the online

questionnaire; and (c) qualitative data from the interviews conducted with the

professional development and training development administrators. Triangulation of this information helped strengthen the credibility of the survey study and validate whether the use, or lack of use, of an evaluation process impacted the overall effectiveness of professional development training programs.

Summary

The purpose of this study was to compare public education professional

development training programs with the corporate sector professional development

training programs. Guskey’s professional development evaluation model consisting of

five critical levels was used to examine the presence and significance of professional

development programs for educators in a public school district and in training programs

for employees in a private corporation. The five levels of this evaluation model included:

(a) participants’ reactions; (b) participants’ learning; (c) organization support and change;

(d) participants’ use of knowledge and skills; and (e) student learning outcomes. For this

study “students” were the “participants” receiving professional development training.

An instrument called the Professional Development Assessment Tool (PDAT), which included the survey and questionnaire, was given to respondents from a private corporation and a public school district to obtain quantitative and qualitative

data for this study. Interviews of educational administrators and corporate managers of

the sectors involved in the study provided support to the qualitative portion of the study.


Quantitative data on the ratings of respondents from both sectors, expressed in

terms of weighted means, on the critical levels of the evaluation model were presented in

tabular form. The Statistical Package for the Social Sciences (SPSS) was utilized to

determine whether there was a significant difference between the comparable means.

Data for the qualitative portion of the study showed the emergent themes resulting

from the open-ended questionnaire. Resulting themes were presented in frequency

distribution tables. Results of the interviews supported the qualitative portion of the

study regarding how evaluation of professional development training was undertaken by

the respective sector.


CHAPTER IV

ANALYSIS OF DATA

The purpose of the study was to compare public education professional

development training programs existing in one public school district with a private

corporation’s professional development training programs. The five critical levels of the professional development evaluation model advocated by Guskey were used to examine

the presence and significance of professional development programs for educators in a

public school district and in training programs for employees in a private corporation.

These critical levels included participants’ reactions, participants’ learning,

organizational support, participants’ use of knowledge and skills and participants’

learning outcomes.

One private company with headquarters in Texas, having 3,000 employees and a

Training and Development Department agreed to participate in the study. Permission

was also given by a public school district in Texas which was comparable in terms of

location and budget to be included in the study.

Quantitative data obtained from the Professional Development Assessment Tool

(PDAT) survey were used to determine if differences existed in how employees in public

education and employees in the corporate sector rated the effect of their professional

development programs. Weighted means were added for each of the four levels of the

tool used which included participants’ reaction, participants’ learning, organizational

support, and participants’ use of knowledge and skills. To determine the overall level of

effectiveness of professional development, the total weighted means for both sectors were

added and compared. The t-test for two independent samples of the Statistical Package


for the Social Sciences (SPSS) software was utilized to determine whether the difference in the means was significant. Comparison was done between the ratings (expressed in

weighted means, also computed using the SPSS software package) given by the

employees of the corporate sector and educators in the public school district for each of

the critical levels. Another comparison for each of the same critical levels was done

between the managers in the corporate sector or administrators in the public school

district and their respective subordinates. The significance level was set at p ≤ 0.05.

Emergent themes were determined from the online, open-ended questionnaire that was designed to follow the online survey. Tabulated results showed the

emergent themes and the number of times these were mentioned by both respondents

from the public school district and the private corporation. Results of the one-on-one

interviews were compared with these emergent themes.

Regarding why the corporate sector should evaluate its professional development, Parry (1997) offers this standpoint:

“Top management wants to know what results the organization is getting from the

hundreds of thousands of dollars spent annually in training. Instructors and

course designers want to know what impact their programs are having on

individuals and the organization. Trainees and their supervisors want to know

what kind of payoff they can expect from taking time away from productive work

to participate in a course” (p. 1).

From the public education sector, Hackett (2005) voices his concern that methods

used to measure the links among professional development, teacher performance and

student achievement have to be improved. Otherwise, “educators will be unable to


convince parents, community leaders, and local school boards to provide sufficient time

and funding necessary to improve our teachers’ understanding and our students’

performance” (Hackett, 2005, p. 4).

In the public education system, it is hoped that the benefits of staff development will redound to the betterment of students. Participants in the professional development should first feel the impact of the training in order to create a sort of positive osmosis between educators and students. Evaluation helps to determine if the program is

meeting its organizational objectives and points the way to continuous improvement

(Roberts & Pruitt, 2003).

The following research questions were generated to compare how employees in the corporate sector, represented by the private company, and educators in the selected public school district gave their ratings on the issues and concerns

included in the Professional Development Assessment Tool (PDAT).

Quantitative

1. What are the differences in participants’ reactions regarding the professional

development training between public educators and corporate employees as measured

by PDAT?

2. What are the differences in participants’ learning in professional development

training between public educators and corporate employees as measured by PDAT?

3. What are the differences in organizational support for professional development

between public educators and corporate employees as measured by PDAT?

4. What are the differences in participants’ use of knowledge and skills gained from


their professional development training program provided by the corporate sector

and public education as measured by PDAT?

The respondents from both corporate and public education sectors rated the issues and

concerns included in the Professional Development Assessment Tool, regarding

participants’ reactions, participants’ learning, organizational support on professional

development and participants’ use of knowledge and skills. Issues and concerns were all

positively stated.

To gather the quantitative data, the following weights were given to the ratings of the

respondents: 5 for “strongly agree”, 4 for “agree”, 3 for “neutral”, 2 for “disagree” and 1

for “strongly disagree”. Computations of the weighted means were based on these assigned

values.

For the qualitative dimension of the study, the following research question was

utilized.

Qualitative

5. What are the differences in how the evaluation of participants’ learning outcomes

is determined between private corporations and public education, based on

Guskey’s model?

The above research question was supported by the following three open-ended

questions:

1. In what ways has your professional development training impacted

your work performance?

2. In what ways has your professional development training affected your attitude

about learning new things?


3. In what ways has your professional development training enhanced your skills or

behaviors?

The emergent themes were determined from the responses to these three

questions by the respondents from both the corporate sector and the public school

district. A frequency table showed the emergent themes and the number of times these

were mentioned by the respondents for each of the questions. The percentages were

computed based on the total number of respondents; the totals may have varied since

some responses may have included more than one theme or respondents refrained from

giving an answer.

Findings

Quantitative

For the quantitative portion of the study, the ratings given by respondents for each

of the critical areas which included participants’ reactions, participants’ learning,

organizational support and participants’ use of knowledge and skills, were tallied,

assigned weights and weighted means were computed using the SPSS software package

(version 13.0). From the weighted means, the t-test for two independent samples was

computed using the same statistical software package. Results were analyzed and a

decision was made whether to accept or reject the null hypothesis for each comparison of

means.


Research Question One

1. What are the differences in participants’ reactions regarding the professional

development training between public educators and corporate employees as

measured by PDAT?

The following null hypothesis was formulated to answer the above question:

Ho1: There are no statistically significant differences in the participants’ reactions

regarding the professional development training provided between public

educators and corporate employees as measured by PDAT.

The issues and concerns that were used to signify participants’ reactions to

professional development training received included clarity of objectives of the trainings,

appropriateness of materials presented, expertise of the presenters, relevance of topics

and depth of trainings received.

The ratings of the respondents from both corporate sector and public school

district for the issues and concerns cited in the participants’ reactions were tallied and

results of the subsequent computations for weighted means and t-tests are shown in Table

4.1.1 and Table 4.1.2.

The total weighted mean for the ratings of educators (administrators and

teachers/counselors) regarding their reactions to professional development training was

20.70. This is shown in Table 4.1.1. For the corporate sector, the total weighted mean

was 19.68. Using the t-test for two independent samples, the difference in the means was

shown to be statistically significant. The decision was to reject the null hypothesis.


Table 4.1.1

Descriptive Statistics on Participants’ Reactions Regarding the Professional Development

Training Received

                         f      Mean    s.d.    t        df    Sig. (2-tailed)
Public School District   465    20.70   4.15    3.736*   770   .000
Private Corporation      307    19.68   2.85

*Significant at p ≤ 0.05

Table 4.1.2 shows the comparison of the ratings on the participants’ reactions

regarding professional development training received, between the district administrators

or corporate managers and their respective subordinates.

For the public school district, the educational administrators gave ratings with a weighted mean of 22.22, higher than the teachers/counselors, whose weighted mean was 20.59. The

t-test value of 2.247 was statistically significant. This indicated that the public school

administrators had higher expectations of the trainings received by personnel in the

district.

For the corporate managers and other employees in the private corporation, their

ratings, which resulted in weighted means of 19.40 and 19.70 respectively, showed no

significant difference. Both the managers and subordinates had about the same level of

expectations regarding the trainings received by employees of the corporation.


Table 4.1.2.

Comparison of Ratings on Participants’ Reactions Between Educational Administrators

or Corporate Managers and their Respective Subordinates

                         f      Mean    s.d.    t         df    Sig. (2-tailed)
Educational Admin.       35     22.22   2.64    2.247*    463   .025
Teachers/Counselors      430    20.59   4.22
Corporate Managers       27     19.40   2.60    -.513**   302   .608
Other Employees          277    19.70   2.88

*Significant at p ≤ 0.05   **Not Significant

Research Question Two

2. What are the differences in participants’ learning in professional development

training between public educators and corporate employees as measured by

PDAT?

The following null hypothesis was formulated to answer the above question:

Ho2: There are no statistically significant differences in participants’ learning

throughout their professional development training outcomes between public

educators and corporate employees as measured by PDAT.

The issues and concerns regarding participants’ learning included opportunities to

engage in problem solving activities during the duration of the training, range of choices

in topics, usefulness of materials learned to everyday activities, positive application of

objectives and acquisition of new knowledge and skills. Respondents from both the

corporate sector and public school district gave their ratings for each of the issues or

concerns cited.


The total weighted mean for the ratings of educators (administrators and teachers/counselors) regarding participants’ learning from professional development training was 19.83. This is shown in Table 4.2.1. For the corporate sector, the total

weighted mean was 18.24. Using the t-test for two independent samples, the difference in

the means was shown to be statistically significant. The decision was to reject the null

hypothesis.

Table 4.2.1.

Descriptive Statistics on Participants’ Learning Regarding the Professional Development

Training Received

                         f      Mean    s.d.    t        df    Sig. (2-tailed)
Public School District   463    19.83   4.34    5.482*   764   .000
Private Corporation      303    18.24   3.22

*Significant at p ≤ 0.05

It is interesting to note that in the ratings of educational administrators or

corporate managers compared to their subordinates in the area of participants’ learning,

the difference in their weighted means was not statistically significant. The results are

shown in Table 4.2.2. This means that the level of expectations regarding what can be

learned from the professional development training by the educational administrators was

about the same as the teachers and counselors; the same was true between the corporate

managers and their subordinates.


Table 4.2.2.

Comparison of Ratings on Participants’ Learning Between Educational Administrators or

Corporate Managers and their Respective Subordinates

                         f      Mean    s.d.    t         df    Sig. (2-tailed)
Educational Admin.       35     20.85   3.35    1.429**   461   .154
Teachers/Counselors      428    19.76   4.41
Corporate Managers       26     18.46   2.64    .392**    298   .695
Other Employees          274    18.20   3.29

**Not Significant

Research Question Three

3. What are the differences in organizational support for professional development

between public educators and corporate employees as measured by the PDAT?

The following null hypothesis was formulated to answer the above question:

Ho3: There are no statistically significant differences in the organizational support

for professional development training between public educators and corporate

employees as measured by PDAT.

The issues and concerns on organizational support provided by a private corporation and a public school district included alignment of the trainings with the organization’s mission, vision, and goals; sufficient funding for training in the budget; the presence of incentives; and allowing employees to participate in trainings. These were considered elements of the organization’s support for training and development.

The comparison of ratings, expressed in terms of weighted means, regarding

organizational support is shown in Table 4.3.1. Employees in the private corporation


gave a higher rating, shown in the weighted mean of 19.42; the weighted mean for the public school district was 18.81. Results of the computations of the t-test for two independent samples showed that the difference in the weighted means was statistically

significant. The decision was to reject the null hypothesis.

This situation for the private corporation was expected. The private corporation

in this study had a reasonable amount of funding for training and development of their

employees. Incentives, tuition reimbursement and career opportunities are linked to the

private corporation’s operations involving training and development of its employees.

For the public school district, new employees normally go through the orientation

program given before students report for school during the new school year. Trainings

include new trends in teaching strategies, classroom management, employee benefits,

district procedures, and other related matters. Since training hours were part of the

teachers’ work days, no monetary incentive was given. Other trainings during the year

may happen during school days or weekends. Employees attend these trainings since they

are required to accumulate a certain number of hours to satisfy the district requirement

that may be a part of the appraisal system.

Table 4.3.1.

Descriptive Statistics on Organizational Support for the Professional Development

Training

                         Number  Mean    Std. Dev.   t         df    Sig. (2-tailed)
Public School District   462     18.81   4.45        -2.049*   764   .041
Private Corporation      304     19.42   3.26

*Significant at p ≤ 0.05


Table 4.3.2 shows the comparison of ratings, in terms of weighted means, on

organizational support between the educational administrators or corporate managers and

their respective subordinates. The t-test result for the public school district was 2.641 and

3.106 for the private organization. Both results were statistically significant. Educational

administrators and corporate managers are aware of what their respective organization

has planned for the employees’ training and development and the monetary requirements

for the different activities. In this aspect, the managers gave a higher rating compared to

their respective subordinates.

Table 4.3.2.

Comparison of Ratings on Organizational Support Between Educational Administrators

or Corporate Managers and their Respective Subordinates

                         Number  Mean    Std. Dev.   t        df    Sig. (2-tailed)
Educational Admin.       35      20.71   4.66        2.641*   460   .009
Teachers/Counselors      427     18.65   4.40
Corporate Managers       27      21.25   3.81        3.106*   299   .002
Other Employees          274     19.23   3.17

*Significant at p ≤ 0.05

Research Question Four

4. What are the differences in participants’ use of knowledge and skills gained from

their professional development training program provided by the corporate sector

and public education as measured by PDAT?

The following null hypothesis was formulated to answer the above question:


Ho4: There are no statistically significant differences in the participants’ use of

knowledge and skills gained from their professional development training

program provided by the corporate sector and public education as measured by

PDAT.

Issues and concerns regarding participants’ use of knowledge and skills included being able to implement the knowledge learned, a positive effect on the employee’s behavior, and implementation assessed by a mentor.

Table 4.4.1 shows that the total weighted mean of the public school district of

17.45 was significantly different from the private corporation’s weighted mean of 16.01.

Results of the computations for t-test for two independent samples showed that the

difference in the weighted mean was statistically significant. The decision was to reject

the null hypothesis.

The results show that training and development for employees in a public school district had an immediate effect on how they do their jobs on a daily basis. On the other hand, employees hired by a private corporation may be oriented to company policies and regulations during their early days with the organization, but they were already expected to be able to fill the role they were hired to do. Training and development for them

may be needed on a situational basis, like learning new measures in safety procedures to

improve the safety rating of the company.


Table 4.4.1.

Descriptive Statistics on Participants’ Use of Knowledge and Skills from the Professional

Development Training Received

                         Number  Mean    Std. Dev.   t        df    Sig. (2-tailed)
Public School District   459     17.45   3.67        6.069*   763   .000
Private Corporation      306     16.01   2.32

*Significant at p ≤ 0.05

Table 4.4.2 shows the comparison of ratings on participants’ use of knowledge

and skills between educational administrators or corporate managers and their respective

subordinates. The difference between the means of educational administrators (18.87)

and the teachers/counselors (17.33) was found to be statistically significant. Educational

administrators may also have expected that the training and development programs would

produce positive and meaningful impact on those who attended the training programs.

The weighted mean of the corporate managers (16.22) on participants’ use of

knowledge and skills, compared to the mean of other employees (15.98) was not

statistically different. Corporate managers and the other employees had the basic

knowledge and skills required when they were hired or promoted to their respective

positions. Additional training and development programs may enhance their level of

expertise and both managers and subordinates see this possibility at almost the same

level.


Table 4.4.2.

Comparison of Ratings on Participants’ Use of Knowledge and Skills Between

Educational Administrators or Corporate Managers and their Respective Subordinates

                         f      Mean    s.d.    t        df    Sig. (2-tailed)
Educational Admin.       33     18.87   3.66    2.339*   457   .020
Teachers/Counselors      426    17.33   3.65
Corporate Managers       27     16.22   2.04    .496**   301   .621
Other Employees          276    15.98   2.35

*Significant at p ≤ 0.05   **Not Significant

Table 4.5.1 presents the descriptive statistics on the overall effectiveness of the professional development training for the public school district and the private corporation. The means for the participants’ reaction, participants’ learning,

organizational support and participants’ use of knowledge and skills were added together

and the new totals became the basis for comparison. Table 4.5.1 also shows the

combined weighted mean of public educators for all four levels considered was 76.70; the

combined weighted mean for corporate employees was 73.19. Computations of the t-test

for two independent samples showed that this difference was statistically significant.

This was expected since the ratings of the public school district were higher compared to

the ratings of employees in the corporate sector in three areas of participants’ reaction,

participants’ learning and participants’ use of knowledge and skills. The employees of

the corporate sector gave higher ratings, expressed in terms of a higher weighted mean,

only in the aspect of organizational support.


Table 4.5.1.

Descriptive Statistics on Total Effectiveness of the Professional Development Training

Received

                         f      Mean    s.d.     t        df    Sig. (2-tailed)
Public School District   444    76.70   14.91    3.600*   573   .000
Private Corporation      296    73.19   9.36

*Significant at p ≤ 0.05

Table 4.5.2 shows that the total combined weighted mean of the educational administrators for all four levels under study was 82.57 and that the comparable mean of the teachers/counselors was 76.25. The t-test result (2.354) for two independent samples indicated that this difference was statistically significant.

The total combined weighted mean for the four levels of corporate managers was

75.50 and the comparable mean for the other corporate employees was 72.92. The t-test result of 1.338 for two independent samples indicated that the difference between the weighted means for corporate managers and their subordinates was not statistically significant. Both corporate managers and subordinates are aware that placement in jobs

was based on expertise and merit. In a private corporation, employees are hired based on

their background knowledge and skills. Focus of their training may include policies and

procedures of the company and legal requirements like safety procedures and other

compliance issues. Managers had more leadership training than their subordinates, which enabled them to implement required changes. Unlike the

public school district, the private corporation under study was not following a specific

evaluation model for their training and development programs.


Table 4.5.2.

Comparison of Ratings on Total Effectiveness of Professional Development Training

Between Educational Administrators or Corporate Managers and their Respective

Subordinates

                         f      Mean    s.d.     t         df    Sig. (2-tailed)
Educational Admin.       33     82.57   12.48    2.354*    442   .019
Teachers/Counselors      411    76.25   15.01
Corporate Managers       26     75.50   7.55     1.338**   291   .182
Corporate Employees      267    72.92   9.53

*Significant at p ≤ 0.05   **Not Significant

The summary of the ratings of the respondents expressed in terms of weighted

means is shown in Table 4.5.3. Both educational administrators and corporate managers

generally gave higher ratings for the different areas, indicated by the higher weighted

means, compared to their subordinates (except for participants’ reaction of other

employees in the private corporation).


Table 4.5.3

Summary of Ratings in Terms of Weighted Means

                              Educational  Teachers/                Corporate  Other
                              Admin.       Counselors  Combined     Managers   Employees  Combined
Participants' Reactions         22.22        20.59       20.70        19.40      19.70      19.68
Participants' Learning          20.85        19.76       19.83        18.46      18.20      18.24
Organizational Support          20.71        18.65       18.81        21.25      19.23      19.42
Use of Knowledge and Skills     18.87        17.33       17.45        16.22      15.98      16.01
OVER-ALL                        82.57        76.25       76.70        75.50      72.92      73.19

Qualitative

For the qualitative portion of the study, the following data and findings were used to

answer the fifth research question:

5. What are the differences in how the evaluation of participants’ learning outcomes

is determined between private corporations and public education based on

Guskey’s model?

The findings regarding the qualitative portion of the study are presented as

follows: (1) an explanation of the results of the open-ended questionnaire regarding

participants’ learning outcomes; (2) frequency tables showing the emergent themes and

the percentage of respondents giving the same responses; and (3) anecdotal records,


showing the views and opinions of district educational administrators and corporate

managers regarding how they evaluated their employees implementing the new

knowledge and skills learned to indicate whether the training they received was effective.

Answers of some respondents belonged to more than one emergent theme; the

total number of answers may have exceeded the total number of respondents. The

percentage shown after the total responses given for each emergent theme was computed

based on the total number of respondents.

The major focus for the following questions asked on the online questionnaire

dealt with participants’ learning outcomes. There were three open-ended questions,

found in the tail portion of the survey, for participants to answer:

1: In what ways has your professional development training impacted your work

performance?

Responses to the above question are shown in Table 4.6.1. Respondents from the

private corporation considered the following top five reasons on how professional

development training impacted their work performance: (1) learned new skills essential

to job (14.8 %); (2) increased productivity/efficiency (11.8 %); (3) learned something

new (10.8 %); (4) familiar with better techniques (9.8 %); and (5) improved human

relations (9.3 %). Of the respondents from the private corporation, 9.3 % answered that

they had no training.

Respondents in the public school district considered the following as having

impacted their work performance: (1) increased productivity/efficiency (19.0 %);

(2) learned new skills essential to the job (17.3 %); (3) familiar with better techniques

(15.1 %); (4) minimal or no impact (11.7 %); and, (5) learned something new (10.0%).


Four of the top five reasons were mentioned by respondents from both the private

corporation and the public school district. These reasons included learned skills essential

to job, increased productivity/efficiency, learned something new and familiar with better

techniques.

Improved human relations was another significant reason cited for how training impacted employees. Corporate managers sought more positive interactions between their employees and smooth interpersonal relationships that contribute to better work performance. Corporate employees are required to be team players; therefore, a lot of their professional development training is focused on interpersonal skills. The biggest difference was that 11.7% of employees in the public school district indicated their training had minimal or no impact on their work performance.


Table 4.6.1

Ways That Professional Development Training Impacted Employees’ Work Performance

Private Corporation Public School District

Emergent Themes: Freq. Percent Freq. Percent

a. Learned skills essential to job 30 14.8 62 17.3

b. Increased productivity/efficiency 24 11.8 68 19.0

c. Learned something new 22 10.8 36 10.0

d. Familiar with better techniques 20 9.8 54 15.1

e. Improved human relations 19 9.3 7 2.0

f. No training 19 9.3 2 0.6

g. Minimal or no impact 18 8.9 42 11.7

h. Improved leadership skills 13 6.4 12 3.4

i. Able to handle additional workload 11 5.4 3 0.8

j. Able to handle problems 7 3.4 11 3.1

k. Not able to apply in work area 7 3.4 23 6.4

l. Able to set priorities 5 2.4 5 1.4

m. Improved communication skills 5 2.4 4 1.1

n. Incentive to work harder 4 1.9 7 2.0

o. Proactive/positive outlook - - 17 4.7

p. No support from administration - - 5 1.4

Total 204 100 % 358 100 %

Managers of corporate employees evaluated the impact of professional

development training on work performance through the goals of the employees. There


were no formal ways of evaluating the impact on work performance of the

professional development training received by their employees. As one manager stated,

“it’s not being done, right now it’s left up to the individual to use the skills they have

learned.” Another manager mentioned, “Goals on the performance appraisal help drive

the training needed. Managers work with employees to help find training and use

experience to see if the training will be applicable.” Upper management concurred with

the other managers and stated “Leadership development for one management style is

based on their personal desire.”

Educational administrators evaluated the impact of professional development

training through student learning outcomes. As one administrator put it: “all

professional development requirements focus on students’ success of learning.” Another

administrator stated that “the goal of the district which is “student achievement” is the

overall process used to determine the link between learning and individual/organizational

performance.”

Corporate employees agreed with their management, stating that there was no formal evaluation process for determining the impact of the professional development training received. Employees decided whether or not to use it. Employees agreed that there should be a common goal to guide them in their work. Some corporate employees viewed their professional development as having no impact on their work performance because they were not able to implement the new skills learned.

As one employee stated:

My work performance has improved in some areas, based on what training I have

taken. But after getting back to my regular routine, I find that I often don’t get to


put my training to good use and no one is checking to see if I’m using the skills

I’ve learned.

Another employee stated:

Much of the training involves emphasizing the importance to a lot of ideas and

techniques that we all know but don’t acknowledge or use. This increased

awareness is good and brings about some positive change. I even try out some

ideas but this is most often short lived back in the “rat race”. So I think it only

incrementally improves my performance with the potential for greater benefit if

there continued to be some forcing function holding me accountable.

In dealing with how training has impacted employees’ performance, one employee

simply stated: “Sometimes it’s hard to tell but I’m sure that training will eventually

impact my work performance.” Another employee stated: “There has been limited

professional training that would improve or impact work performance.”

Employees of the school district agreed with the administrators on having multiple evaluation processes. They agreed upon the same goal, which was “student achievement.” Having the same goal made it easier to evaluate implementation. Most employees were

employees who stated they would benefit more from professional development that was

more specific and along their core area of expertise like Science, Mathematics, English

and others. Most comments dealt with content related material, relevance or on how to

meet individual student needs.

In regards to content material and material being relevant, one teacher stated:

“Content related material is more meaningful.” Another teacher stated: “It depends on


how it relates to what I teach. If related directly, then it has a greater impact.” Comments

from other teachers are as follows:

“The use of common assessments helped with student learning.”

“Greater focus and organization; more student centered activities; new types of

assessment tools helped with student achievement.”

“The training I’ve selected is more relevant to me because I know exactly what I

need.”

“If I have a choice in what training I take, it has a more positive effect on how I

teach my students.”

“The more specific the training, the larger an effect it has had. For example, a

development session on teaching AP English Literature will be much more meaningful

than District Curriculum Day.”

2: In what ways has your professional development training affected your attitude about

learning new things?

Responses of both respondents from the private corporation and the public school

district to the above question are shown in Table 4.6.2. On the different ways that

development training affected their attitude on learning new things, respondents from the

private corporation cited the following five main reasons: (1) eager to learn (20.8 %); (2)

no or little change (16.4 %); (3) more positive outlook (10.9 %); (4) more open-minded

(10.4 %); and, (5) realized more to learn (6.6 %).

Respondents from the public school district identified the following five main

reasons: (1) eager to learn (19.9 %); (2) learned new strategies (13.6 %); (3) no or little


change (12.3 %); (4) more positive outlook (12.3 %); and, (5) more open-minded (9.5

%).

Employees from both sectors identified the same top four reasons on how

professional development training affected their attitude on learning new things. The

same reasons identified by respondents from both sectors included: eager to learn, no or

little change, more positive outlook and more open minded.

Respondents from the private corporation differed in their fifth reason: they realized there was more to learn, possibly because they were aware that more job opportunities would be open to them if they performed well in their current job and acquired the added knowledge and skills needed for another job as a possible promotion. Improvement in career path is a bright opportunity in a private corporation. One employee stated: “I find

myself wanting to learn more.” Another employee stated: “I noticed I still had a lot to

learn.”

Teachers and counselors may also avail themselves of opportunities to become an

administrator. However, additional schooling and additional training under the

mentorship of another administrator are required. Job openings may be available, but

competition is stiff.

Respondents in the public school district referred to learning new strategies as

their second main reason. In their college training or in their Alternative Certification

Program (ACP), teachers may have learned the basic theories of teaching and learning.

Classroom and new teaching strategies expounded by experts during the training may

have enlightened the teachers on what they can do to achieve the goal of “student


achievement.” The following statements from teachers described how their attitude has

been affected by training received:

“It has enhanced my attitude. I understand the importance of PD and its profound

impact on my teaching.” “My attitude in learning new ideas or strategies has given me

the continuous support that is needed to enlighten my students to have a “fresh look” at

new objectives.” “I enjoy attending training, because I always want to learn about new

strategies that may be better for my students. I always experiment with the new strategies

that were given to me from my training.” And finally, another teacher stated: “I have a

more positive outlook because I am excited to try new strategies.”


Table 4.6.2

Ways Professional Development Training Affected Attitude on Learning New Things

Private Corporation Public School District

Emergent Themes: Freq. Percent Freq. Percent

a. Eager to learn 38 20.8 63 19.9

b. No or little change 30 16.4 39 12.3

c. More positive outlook 20 10.9 38 12.3

d. More open-minded 19 10.4 30 9.5

e. Realized more to learn 12 6.6 20 6.3

f. Learned new strategies 11 6.0 43 13.6

g. Saw future benefits 11 6.0 14 4.4

h. Appreciated different outlooks 9 4.9 9 2.8

i. Renewed interest in job 9 4.9 15 4.7

j. Prepared for more meaningful tasks 9 4.9 11 3.5

k. Discouraging 5 2.7 18 5.7

l. Felt support/improved morale 4 2.2 7 2.2

m. Affected home environment 3 1.6 - -

n. Training too long 3 1.6 1 0.3

o. Not aligned with job - - 9 2.8

Total 183 100 % 317 100 %

There was no formal way of evaluating how an employee’s attitude had been affected by the material he/she received from professional development training. One corporate manager indicated that no formal post-evaluation was being done. Managers just held


informal conversations such as: “Did you like the class and was the class helpful?” Several

managers stated that the only somewhat formal evaluation is done through the

performance appraisal process and is done mostly for difficult employees. When it

comes to how first-line managers evaluate participants’ change in attitude and/or behavior

after receiving professional development training, one manager simply stated:

It’s not evaluated. Managers need to be more proactive in following up to ensure

employee training is beneficial and they are bringing something back to the office

to implement and work on towards individual and group growth. Employees are

more in control of the choices they make as far as training. Management has

always been reactive. They see a problem with an employee, and they decide that

the employee needs to go to training to help with the problem. If a manager was

more proactive, they would set up specific training flows for each individual

based on their future goals and follow up their training with mock situations to

help them practice the skills they’ve just learned.

Another manager stated: “It’s not being done. As manager, I should track their learning

with the employee’s Performance Appraisal goals and objectives.” Upper management

concurred with the above statement by saying: “Goals of the employees is how they

evaluate training. The employee set [their personal goals] that they want to achieve.

Training and Development (T&D) offer a list of courses and we simply pick what’s out

there.”

Educational administrators evaluated the attitudes of their employees continuously, as one administrator put it: “We use observation and two way communications with

employees to see how they are doing.” Upper administration concurred by stating:


Surveys are used. Negative comments are taken seriously. We look into what

can be done differently. Because everyone has a say into the development service

they receive, attitudes are changing in a positive way. Learning is becoming a

district wide attitude.

3: In what ways has your professional development training enhanced your skills or

behaviors?

Responses of educators and corporate employees regarding what ways

professional development training has enhanced their skills or behaviors are shown in

Table 4.6.3. Respondents from the private corporation cited the following top reasons:

(1) learned rudiments of the job (20.8 %); (2) no significant change (11.4 %);

(3) improved human relations (8.9 %); (4) increased productivity/efficiency (8.4 %); (5)

reinforced previous training, better communication skills and respect people’s

opinions/attitudes (the three tied at 7.4 %).

Respondents from the public school district offered a slightly different set of

reasons: (1) increased productivity/efficiency (22.5 %); (2) learned rudiments of job (19.2

%); (3) no significant change (16.0 %); (4) saw new perspectives (13.3 %); and, (5)

developed positive attitudes (7.4 %).

Respondents from both sectors agreed on the three top ways that professional development training enhanced their knowledge and skills; these included: (1) learned rudiments of job; (2) increased productivity/efficiency; and (3) no significant change. Corporate employees identified learning rudiments of the job as their top reason, while educators cited increased productivity/efficiency as theirs.


There was no significant change for corporate employees and educators concerning the second reason, which was increased productivity/efficiency. Employees in the corporate sector saw no significant change since they already had the knowledge and skills required of their respective jobs. Hours of professional development training were more meaningful for employees in the corporate sector who aimed to acquire knowledge and skills related to career development, paving the way for possible promotion.

Participation in training programs was available to public school employees through an Internet website called ETRAIN. Participation was mostly free of charge if the training was district-sponsored. However, if trainings required certain fees, such as university summer or non-district trainings, approval of the principal was required, unless the teacher paid the expenses himself/herself. Trainings during weekends could provide incentives. What was important was that employees completed at least 45 hours of district-mandated training by the end of the year.

Employees in the private corporation followed a series of trainings designed by the Training and Development Department. The orientation program for all employees normally included company rules, regulations, and policies (vacation, sick leave, disciplinary, and others). Trainings were conducted in-house. Promising employees who were being considered for a certain career path could be sent for more training.


Table 4.6.3.

Ways Professional Development Training Enhanced Skills or Behaviors

Private Corporation Public School District

Emergent Themes: Freq. Percent Freq. Percent

a. Learned rudiments of job 42 20.8 65 19.2

b. No significant change 23 11.4 54 16.0

c. Improved human relations 18 8.9 6 1.8

d. Increased productivity/efficiency 17 8.4 76 22.5

e. Reinforced previous training 15 7.4 8 2.4

f. Respect people’s opinions/attitudes 15 7.4 8 2.4

g. Better communication skills 15 7.4 8 2.4

h. Enhanced leadership skills 12 5.9 18 5.3

i. Saw new perspectives 12 5.9 45 13.3

j. Decision making/problem solving 11 5.5 5 1.5

k. Developed positive attitudes 9 4.5 25 7.4

l. Better time management 6 3.0 4 1.2

m. Able to multi-task 4 2.0 1 0.3

n. More safety conscious 3 1.5 - -

o. Learned new techniques/strategies - - 10 2.9

p. Make learning fun - - 5 1.5

Total 202 100 % 338 100 %


In looking at ways of enhancing skills or behaviors in participants who have received professional development training, one corporate manager stated:

No one from the training and development department has ever contacted their

department to see how best they can serve them. As manager, I’m given a budget

from my manager; then I divide the money amongst the employees. I give more

money to those employees that are on the Leadership Development Plan. Budget

is a line item for each employee. Employees choose which class they would like

to participate in.

Upper management added to the above comment by stating:

More follow up is done on technical training than in leadership training. No

follow up is done with individuals who take the initiative for their own

development; however, follow up is done on those employees that are

recommended for development training when there is a concern about a weakness

they have.

Another manager mentioned how employees have input in the process by stating:

As manager, I saw a need for a particular class, and the training and development

department developed one based on my request. There is no set list of courses an

employee must take. There are a series of courses which were linked to a

management prep flow.

Educational leaders use observations, surveys, two-way communications, and open-door policies to evaluate the participants’ enhanced skills or behaviors. One

administrator stated:


With the new direction of the professional development service department,

everyone has a say in the type of development training they need. Part of the

service contract we have with the professional development department is the

evaluation of service they provide at different levels. Constant online surveys are

given to see how the participants are using the material taught to them and to see

what didn’t work and what needs to change.

Another educational administrator supported the above statement by saying:

“Teachers use a self-report called Management by Objectives to try to make their training more meaningful. Teachers have a say in all training they participate in; this includes content areas like Math, Science, and other content areas.”

Corporate employees stated the following regarding enhanced skills or behaviors due to

their professional development training:

“It’s hard to implement anything that I have learned.” “It helped with my plans

for advancement.” “Behavior has never been negative when I learn new things.” “I’ve

been gathering skills as required in my training plan that will help me advance to another

position.” “The training has definitely increased my skills in the subject areas, but I am

not always assigned tasks that allow me to implement those skills.”

Employees from the school district had the following to say regarding enhanced skills or

behaviors due to their professional development training:

“My professional development training has been a positive motivating factor to

continue to enhance my skills and continue to reflect on my behaviors in order to

improve my teaching and increase student learning.” “When related to the area I teach, it

has enhanced my skill significantly.” “Every now and again I am able to attend a training


that I’ve selected to meet specific needs and usually those impact skills and behaviors

positively.” “It depends: if I can use the material it’s great; if not, it’s a waste of my time.”

“Being able to implement new skills always has a positive effect.”

Discussion

From the ratings of respondents from both the private corporation and the public school district, weighted means were computed after weights were assigned to the responses. The t-test for two independent samples was utilized to determine whether the difference between the means at each of the critical levels was statistically significant.
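For reference, a minimal sketch of the statistic behind this comparison is the common pooled-variance form of the t-test for two independent samples (the symbols below are generic; the group standard deviations and sample sizes are not restated in this passage):

\[
t = \frac{\bar{X}_1 - \bar{X}_2}{s_p \sqrt{\frac{1}{n_1} + \frac{1}{n_2}}},
\qquad
s_p = \sqrt{\frac{(n_1 - 1)s_1^{2} + (n_2 - 1)s_2^{2}}{n_1 + n_2 - 2}}
\]

Here \(\bar{X}_1\) and \(\bar{X}_2\) are the two group means (for example, the weighted means of the public school district and the private corporation), \(s_1\) and \(s_2\) are the group standard deviations, \(n_1\) and \(n_2\) are the group sizes, and the computed t is judged against the critical value at p ≤ 0.05 with \(n_1 + n_2 - 2\) degrees of freedom.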

The mean of the public school district on participants’ reactions regarding

professional development received was 20.70; the comparable mean for the private

corporation was 19.68. The computed t-test value of 3.736 was statistically significant at

p ≤ 0.05. The null hypothesis was rejected. This indicated that the ratings given by

respondents of the public school district were significantly higher than those given by

respondents of the private corporation.

Ratings given by the corporate managers and educational administrators were

compared to the ratings given by their respective subordinates. The weighted mean of the

educational administrators (22.2) was significantly higher than the

weighted mean of 20.59 resulting from the ratings of teachers/counselors. Educational

administrators were more optimistic in their ratings, possibly due to their direct

involvement in the preparation and execution of the training programs. When ratings of

the corporate managers (Mean = 19.40), expressed in terms of the weighted mean, were compared to the ratings of their subordinates (Mean = 19.70), the t-test result showed no

significant difference between the weighted means.


Results from the PDAT regarding the respondents’ ratings on participants’

learning showed a weighted mean of 19.83 for the public school district and 18.24 for the

private corporation. The computed t-test value of 5.482 was statistically significant;

consequently the null hypothesis was rejected. Public school educators viewed the

learning of new skills more positively than corporate employees.

In comparing the weighted means of educational administrators (20.85) and

teachers/counselors (19.76), the resulting t-test value was not statistically significant.

Comparison of the weighted means of the corporate managers (18.46) and their

subordinates (18.20) similarly resulted in no significant difference.

In the critical area of organizational support, respondents from the corporate

sector gave a higher rating, expressed in a weighted mean of 19.42, compared to 18.81 for the public school district. The difference was statistically significant.

When the weighted means of the corporate managers and educational

administrators were compared to the weighted means of their respective subordinates

using the t-test for two independent samples, both computations showed statistical

significance. Educational administrators and corporate managers were more aware of

what their respective organization had planned for the trainings of employees,

specifically the monetary requirement.

Comparison of the weighted means of the public school district (17.45) and the

private corporation (16.01) regarding the aspect of participants’ use of knowledge and

skills showed a significant difference. Public school educators considered their ability to

implement their skills more positively than corporate employees.


Educational administrators gave significantly higher ratings expressed in the

weighted mean of 18.87, compared to the weighted mean of 17.33 resulting from the

ratings of teachers/counselors.

On the critical level of participants’ use of knowledge and skills, the weighted

mean of the corporate managers (16.22) did not differ significantly from the weighted mean of other employees (15.98). Both corporate managers and other employees were aware that employees hired already possessed the knowledge and skills

required in the different job areas.

Because the public school district had higher weighted means in all the critical areas except organizational support, the difference between its total weighted mean of 76.70 and the total weighted mean of 73.19 for the private corporation was computed to be statistically significant. Based on the criteria set by Guskey’s model of

evaluation involving the different critical areas, the overall effectiveness of the training

program was more positively evaluated by the public school district.

The educational administrators’ evaluation of the overall effectiveness of the training program (82.57) was significantly higher than the ratings of the teachers/counselors (76.25).

Comparison of the evaluation regarding the overall effectiveness of the professional development training by the corporate managers (Mean = 75.50) and their subordinates (Mean = 72.92) yielded no statistically significant difference. The bulk of the professional development training given was leadership training. Corporate managers were able to implement the soft skills learned during their training more easily than their employees. Lack of implementation was the reason for the difference in the mean score regarding overall effectiveness. The private corporation was not utilizing a model for evaluating its training and development programs.

For the qualitative portion of the study, which was based on the responses to the open-ended questionnaire, the following key points were noted.

(1) Responses to how training and development impacted work performance came from both the corporate and public school sectors. The following themes emerged: learned skills

essential to the job, increased productivity/efficiency, learned something new, and

became familiar with better techniques. Improved human relations was another major way in which respondents indicated their training and development had impacted their work performance.

(2) Respondents from both sectors identified the following reasons as having affected their attitude about learning new things: eagerness to learn, a more positive outlook, and more open-mindedness in their dealings. A good

percentage of respondents (12.3 % for the public school district and 16.4 % for the

private corporation) indicated that the professional development training produced no or

little change in them.

(3) Public school educators and private corporation employees agreed on three top

reasons that professional development training enhanced their skills and behaviors.

These included: learned rudiments of the job, increased productivity/efficiency and no

significant change. Corporate employees added the aspects of improved human relations,

better communication skills, and reinforcement of previous training to the reasons that enhanced their skills or behaviors. Educators cited seeing new perspectives and developing positive attitudes as additional ways in which professional development training enhanced their

skills and behaviors.


Interviews with corporate managers and educational administrators yielded the following findings:

1) Managers of corporate employees evaluated the impact of professional

development training on work performance through the goals of the employees. For the

private corporation involved in the study, there were no formal ways of evaluating the impact of the professional development training received by employees on their work performance.

2) Educational administrators evaluated the impact of professional development

training through the student learning outcomes. Another way of expressing learning

outcomes was through students’ success in learning or student achievement. This has

been the focus of the district’s organizational activities.

3) There was no formal way for evaluating how an employee’s attitude has been

affected by the material he/she received from professional development training in the

private corporation. Evaluation was rather informal.

4) Educational administrators evaluated the attitudes of their employees

continuously through online surveys to see how they were using the material taught to

them and to see what worked and what needed to be changed.

5) In the private corporation, more follow up was done on technical training than

in leadership training. No follow up was done with individuals who undertook personal

initiative for their own development. Follow up was done on employees who were

recommended for development training to alleviate a certain weakness.

6) Educational leaders used observations, surveys, two-way communications and

open-door policies to evaluate the participants’ enhanced skills or behaviors.


The quantitative portion of the study was based on a Likert-type scale where

respondents from the private corporation and the public school district gave their ratings

on the different issues and concerns regarding the critical areas of participants’ reactions,

participants’ learning, organizational support and participants’ use of knowledge and

skills. Weights were assigned to the responses and weighted means were computed.

Comparison of means was done through the t-test for two independent samples.
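As an illustration only, a minimal sketch of this kind of analysis is shown below in Python; it is not the SPSS procedure the study actually used, and the scores are hypothetical placeholders rather than data from the study:

# Illustrative sketch: group means of Likert-based totals and a t-test for
# two independent samples (hypothetical scores, not the study's data).
import numpy as np
from scipy import stats

# Each value stands for one respondent's total across the twenty Likert items,
# after weights were assigned to the response choices.
school_district = np.array([21, 20, 22, 19, 21, 23, 20, 20])
private_corporation = np.array([19, 20, 18, 21, 19, 18, 20, 20])

# Group means (the "weighted means" reported in the study are of this kind).
print("District weighted mean:", school_district.mean())
print("Corporation weighted mean:", private_corporation.mean())

# t-test for two independent samples; the null hypothesis is rejected
# when the p-value is at or below 0.05.
t_stat, p_value = stats.ttest_ind(school_district, private_corporation)
print("t =", round(t_stat, 3), " p =", round(p_value, 3))

In the study itself the computations were carried out in SPSS, and the resulting means and t-test values were interpreted through the critical levels of Guskey’s model.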

For the qualitative portion of the study respondents gave their views regarding the

three open-ended questions on how professional development training impacted their

work performance, how it affected their attitude about learning new things and how

professional development training enhanced their skills and behaviors. Responses were

categorized under emergent themes and corresponding frequencies were tallied and

percentages were computed. A comparison of these themes from both sectors provided a

set of priorities on how the private corporation and the public school district considered

the different aspects of their professional development training.

Interviews with the corporate managers and educational administrators provided additional support for the study, since feedback was drawn from them regarding the

need for evaluating the professional development program.

It is important to mention that, when selecting organizations to participate in this study, the researcher did not know that one of the organizations was using Guskey’s model. This was only discovered during the interview process.

Through interviews it was discovered that the school district utilized Guskey’s

model by working with the desired outcome in mind during the planning process.

Guskey (2005) states:


Most of the critical evaluation questions that need to be addressed in determining a professional development program’s effectiveness should be asked

in the planning stage. Planning more carefully and more intentionally not only

makes evaluation easier, it also leads to much more effective professional

development. (p.22)

The private corporation’s training and development department in this study used only level one of the evaluation model. No follow-up was provided. Kirkpatrick (2007) stated: “the number one myth corporations have is [that] using a smile-sheet for level 1 and pre- and post-tests for level 2, while hoping for the best, is an appropriate use of the four levels” (p. 35). One can conclude that the reason the public school district in this study fared better than the private corporation was its use of an evaluation model.

Kandola (2000) stated: “Given the enormous amount of training carried out each year

and, consequently the enormous amount of money spent on it, it is surprising to find how

little evaluation is carried out. Or is it?” (p. 30)

Summary

The study utilized the triangulation design of a mixed-methods approach to compare the professional development training between a private corporation and a public school district. Five critical levels of Guskey’s evaluation model, which included participants’ reactions, participants’ learning, organizational support, participants’ use of knowledge and skills, and participants’ learning outcomes, became the focus of the quantitative and qualitative dimensions of the study.

A Likert-type survey with twenty items was rated by the respondents from both a private corporation and a public school district. Responses were assigned weights, and weighted means were determined using SPSS. The differences between the weighted means of the corporate sector and the public school district were tested for statistical significance using the t-test for two independent samples. Another comparison was made between the ratings of corporate managers and educational administrators and their respective subordinates. Analysis resulted in the null hypothesis being rejected.

The qualitative portion of the study was based on three open-ended questions which determined the ways training and development impacted work performance, the reasons that affected the respondents’ attitude about learning new things, and the ways professional development training enhanced the respondents’ skills and behaviors. Emergent themes were identified and presented in tabulated form.

In conclusion, the quantitative data showed a significant difference between the corporate sector’s and public education’s views regarding their professional development training programs. The qualitative data yielded insight into the overall effectiveness of the professional development training programs. Overall, data from the qualitative portion supported the quantitative data by giving evidence of what was valued in the corporate sector and in public education regarding the quality of their professional development programs. The qualitative data also supported the quantitative findings by showing how the use of an evaluation model made a difference in determining the overall effectiveness of the training provided to employees in public education.


CHAPTER V

SUMMARY, CONCLUSIONS, AND RECOMMENDATIONS

Summary

The purpose of the study was to compare public education professional

development training programs with the corporate sector professional development

training programs. A professional development evaluation model, developed by Thomas Guskey in 2000 and including five critical levels, was used to examine the presence and

significance of professional development programs for educators in a public school

district and in training programs for employees in a private corporation. Both

quantitative and qualitative data were collected and analyzed to determine the overall

effectiveness of professional development training programs found in both educational

and corporate sectors.

Data obtained through the online survey formed the quantitative portion of the

study; results expressed in terms of weighted means were used to determine whether

there was a significant difference in the ratings of educators and corporate employees on

participants’ reactions, participants’ learning, organizational support, and participants’

use of knowledge and skills received from professional development training.

Data for the qualitative portion, regarding the evaluation process and the overall effectiveness of participants’ learning outcomes, were collected through the online, open-ended questionnaire and through interviews of educational administrators and corporate managers.


Research Questions

The following research questions guided the study:

Quantitative

1. What are the differences in participants’ reactions regarding the professional

development training between public educators and corporate employees as

measured by PDAT?

2. What are the differences in participants’ learning in professional development

training between public educators and corporate employees as measured by

PDAT?

3. What are the differences in organizational support for professional development

between public educators and corporate employees as measured by PDAT?

4. What are the differences in participants’ use of knowledge and skills gained from

their professional development training program provided by the corporate sector

and public education as measured by PDAT?

Qualitative

5. What are the differences in how the evaluation of participants’ learning outcomes

is determined by the corporate sector and public school district, based on

Guskey’s model?

Based on the quantitative research questions, the following hypotheses were

formulated:

Null Hypotheses

Ho1: There are no statistically significant differences in participants’ reactions to

the professional development training provided between public educators


and corporate employees as measured by PDAT.

Ho2: There are no statistically significant differences in participants’ learning

throughout their professional development training between

public educators and corporate employees as measured by PDAT.

Ho3: There are no statistically significant differences in organizational support for

professional development training between public educators and corporate

employees as measured by PDAT.

Ho4: There are no statistically significant differences in participants’ use of

knowledge and skills gained from their professional development training

program provided by private corporations and public education as measured

by PDAT.

Summary of Findings

From the results in Chapter IV regarding the quantitative and qualitative data, the

researcher made the following conclusions:

1. The t-test for two independent samples indicated that there was a significant

difference in participants’ reactions regarding the professional development training

between the public educators and corporate employees as measured by the Professional

Development Assessment Tool (PDAT). Consequently the null hypothesis was rejected.

This indicated that the ratings given by respondents of the public school district were

significantly higher than those given by respondents of the private corporation.

Ratings given by the corporate managers and educational administrators were

compared to the ratings given by their respective subordinates. The weighted mean of

the educational administrators was significantly higher than the weighted mean resulting from the ratings of teachers/counselors. Educational

administrators were more optimistic in their ratings, possibly due to their direct

involvement in the preparation and execution of the training programs. When ratings of

the corporate managers, expressed in terms of the weighted mean, were compared to the

ratings of their subordinates, the t-test result showed no significant difference between

the weighted means.

2. Results from the PDAT regarding the respondents’ ratings on participants’

learning showed a computed t-test value which was statistically significant; consequently

the null hypothesis was rejected. Public school educators viewed the learning of new

skills more positively than corporate employees.

In comparing the weighted means of educational administrators and

teachers/counselors, the resulting t-test value was not statistically significant. Comparison

of the weighted means of the corporate managers and their subordinates similarly showed

no significant difference.

3. In the critical area of organizational support, respondents from the corporate

sector gave a higher rating, expressed as a weighted mean, compared to the public school

district. The difference was statistically significant; consequently the null hypothesis was

rejected. The higher ratings of the private corporation may have been due to the adequate

funding budgeted for training and development, known to the corporate managers and

other employees. Tuition reimbursement was seen as a positive factor by both

management and other employees.

When the weighted means of the corporate managers and educational

administrators were compared to the weighted means of their respective subordinates


using the t-test for two independent samples, both computations showed statistical

significance. Educational administrators and corporate managers were more aware of

what their respective organization had planned for the trainings of employees,

specifically the monetary requirement.

4. Comparison of the weighted means of the public school district and the private

corporation regarding the aspect of participants’ use of knowledge and skills showed

a significant difference; consequently, the null hypothesis was rejected. Public school

educators considered their ability to implement their skills more positively than corporate

employees.

Educational administrators gave significantly higher ratings expressed in the

weighted mean compared to the weighted mean resulting from the ratings of

teachers/counselors.

On the critical level of participants’ use of knowledge and skills, the weighted

mean of the corporate managers did not differ significantly from the weighted mean of other employees. Both corporate managers and other employees were aware that employees hired already possessed the knowledge and skills required in the different job

areas.

5. Because the public school district had higher weighted means in all the critical areas except organizational support, the difference between its total weighted mean and the total weighted mean for the private corporation was computed to be statistically significant. Based on the criteria set by Guskey’s model of evaluation involving the different critical areas, the overall effectiveness of the training program was more

positively evaluated by the public school district.


Evaluation of the overall effectiveness of the training program given by the

educational administrators was significantly higher than the ratings of the

teachers/counselors. Possible reasons for this more positive outlook of educational

administrators included their awareness of Guskey’s model in their training program,

involvement in the creation and execution of the different programs, and their loftier

expectations regarding the impact of the training programs.

Comparison of the evaluation regarding the overall effectiveness of the

professional development training by the corporate managers and their subordinates

produced no statistically significant difference. The private corporation was not utilizing

Guskey’s model or any model for evaluating their training and development programs.

The analysis of the qualitative data in Chapter IV led the researcher to draw the

following conclusions:

(1) Responses to how training and development impacted work performance from

both corporate and public school sectors included the following: learned skills essential to

the job, increased productivity/efficiency, learned something new, and became familiar

with better techniques. Improved human relations was another major way in which private corporation respondents indicated their training and development had impacted their work performance.

(2) Respondents from both sectors identified the following reasons as having

affected their attitude about learning new things: eagerness to learn, more positive

outlook, and more open-mindedness in their dealings. Some respondents indicated that the professional development training produced little or no change in them.

(3) Public school educators and private corporation employees agreed on three

main reasons on ways that professional development training enhanced their skills and


behaviors. These included: learned rudiments of the job, increased

productivity/efficiency and no significant change. Corporate employees added the

aspects of improved human relations, better communication skills, and reinforcement of previous training to the reasons that enhanced their skills or behaviors. Educators cited seeing new perspectives and developing positive attitudes as additional ways in which professional

development training enhanced their skills and behaviors.

Interviews with corporate managers and educational administrators led to the following conclusions:

1) Managers of corporate employees evaluated the impact of professional

development training on work performance through the goals of the employees. For the

private corporation involved in the study, there were no formal ways of evaluating the

impact on work performance through or as a result of the professional development

training received by their employees. Evaluation was done rather informally or not at all.

Educational administrators evaluated the impact of professional development

training through the student learning outcomes. Another way of expressing learning

outcomes was through students’ success in learning or student achievement. This has

been the focus of the district’s organizational activities.

2) In the private corporation involved in the study, there was no formal way of evaluating how an employee’s attitude had been affected by the material he/she received from professional development training. Again, evaluation was informal or not done at all.

Educational administrators evaluated the attitudes of their employees

continuously. Mentors had been assigned to beginning teachers and help was given in


any area needed. The professional development department evaluated the participants’

progress throughout the year with ongoing surveys and homework assignments.

3) In the private corporation, more follow up was done on technical training than

in leadership training. No follow up was done with individuals who undertook personal

initiative for their own development. Follow up was done on employees who were

recommended for development training to remedy a certain weakness.

Educational leaders used observations, surveys, two-way communications and

open-door policies to evaluate the participants’ enhanced skills or behaviors.

4) The school district was already using Guskey’s model and the corporation

was not. This might have accounted for some of the differences between the corporation

and public school system.

Conclusions

Quantitative and qualitative data were collected to reveal the overall effectiveness

of a professional development training program given to employees of one large public

school district and employees of one corporation. Quantitative data collected from the

online PDAT survey/questionnaire instrument revealed that public school district employees rated their professional development training higher than the corporate employees in all critical areas except organizational support. The difference between the district’s total weighted mean of 76.70 and the total weighted mean of 73.19 for the private corporation was computed to be statistically significant. Based on the criteria set by

Guskey’s model of evaluation involving the different critical areas, the overall

effectiveness of the training program was more positively evaluated by the public school

district.


Qualitative data were collected in two ways. The first part was through the online open-ended questionnaire given to employees and management regarding participants’ learning outcomes (Guskey’s fifth level). Participants’ learning outcomes demonstrate the overall impact of professional development training, most importantly the implementation of new skills learned. Qualitative data gathered from employees of both the private sector and the school district indicated their eagerness to learn new things

essential to their jobs; however, implementation of new skills learned was a problem.

The second part of the qualitative data was collected through interviews with

management and administrators regarding how they evaluated implementation of new

skills learned by their subordinates. The school district used Guskey’s model for

planning and evaluating their professional development programs. The private

corporation did not use an evaluation model for designing and evaluating their training

programs.

Recommendations

The data gathered in this study suggested there was a significant difference in

how public educators from one school district and corporate employees from one

corporation viewed the overall effectiveness of their professional development programs.

There was not a significant difference in how corporate managers and corporate

employees viewed their professional development programs. It is clear from this study that the public school district functions differently from the corporate sector. The following practical suggestions, drawn from the implications of the findings, are supported by this study:

1. Organizations must move away from the one-shot evaluation process, better known as participants’ reactions, level one of the Guskey and Kirkpatrick models. Data obtained through the study confirmed that participants’

reactions should be considered during the planning process but not as a driver

for evaluating the overall success of a program. The school district used the

reactions of their participants to solicit suggestions for improvements and to

see what did or did not work for them.

2. In developing a training program, it is important to discover the needs of the

participants. Implementation is the key for evaluating participants’ learning.

This study confirmed that when the objectives and goals of the participants were made clear, it was easier to assess their learning. The school district’s

professional development department demonstrated this skill when developing

a program for the school they were servicing. The school determined the

outcome it wanted, and the professional development department assisted in achieving the desired outcome.

3. Organizational support is the key for implementation of new knowledge and

skills learned. This study showed that allowing all participants an active voice

in the planning process helped ensure the effectiveness of the program.

Another important key in building support for a training program was having

a common goal. The school district professional development department

implemented these key components when meetings were held with the school

building principals and teachers to determine the needs of the students and

school. In the corporate sector, employees developed their own personal

goals. Their goals may or may not have been aligned to the goals of the

company.


4. Participants’ learning outcomes must be the beginning stage of the planning

process. The designer of a development program must have the outcome as the first step of planning. This study affirmed that when the participants’ needs were taken into consideration, successful implementation was easier to

evaluate.

5. Using an evaluation model is crucial to the success of a professional

development program. Evaluation must be a continuous and systematic effort

to bring about a positive change. Objectives must be clear from the

beginning. Evaluation of how those objectives are to be met should be established in the planning process.

Recommendations for Further Study

Based on the results of the study, the researcher recommends the following areas for further study:

Public Education

1. A study should be conducted to explore the professional development

differences between educational leaders across school districts.

2. A study should be conducted to explore professional development programs among school districts, comparing those that use an evaluation model with those that do not.

3. A study should be conducted using a different evaluation model to see if the

results are similar.


4. A statewide study should be conducted in education using Guskey’s model to

determine the overall effectiveness of state mandated professional

development programs.

5. A study should be conducted between rural school districts and urban school

districts using Guskey’s model.

Corporations

6. A study should be conducted examining the correlation between revenue and

program effectiveness.

7. A study should be conducted to see whether corporate employees who are active participants in the planning stage of professional development training differ from those who are not active participants.

8. A study should be conducted to explore the differences in training and development programs among Fortune 500 corporations utilizing personal goals versus corporate goals.

9. A study should be conducted to see if corporate managers are allowed to observe participants utilizing the new knowledge and skills they gained as a result of their own professional growth.


REFERENCES

Adey, P. (2004). The professional development of teachers: Practice and theory.

Dordrecht, Netherlands: Kluwer Academic Publishers.

American Society for Training and Development (2006). 2006 state of the industry

report. Alexandria, VA: Author.

Betof, E. (2007, March). Teachable points of view for leadership. ASTD Training +

Development, 61(3), 48-53.

Blanchard, P.N., & Thacker, J.W. (2007). Effective training: Systems, strategies, and

practices (3rd ed.). New Jersey: Pearson Prentice Hall.

Boulmetis, J., & Dutwin, P. (2000). The ABCs of evaluation: Timeless techniques for

program and project managers. San Francisco, CA: Jossey-Bass Publishers.

Brinkerhoff, R.O.(1987). Achieving results from training. San Francisco, CA: Jossey-

Bass Limited.

Champion, R. (2005, Summer). Taking measure: Identify the payoffs for investment in

better program evaluation. Journal of Staff Development, 26(3), 61-62.

Champion, R. (2003, Winter). The real measure of professional development programs

effectiveness lies in what the participants learn. Journal of Staff Development,

24(1), 75-76.

Eister, I. (2004). Professional development content and delivery strategies perceived by

urban, elementary principals and expert staff developers that are beneficial in the

transfer of learning from professional development occurrences to on-the-job

application. ProQuest Information and Learning Company (UMI No. 3150000).


Fraenkel, J. R., & Wallen, N. E. (2006). How to design and evaluate research in

education (6th ed.). New York: McGraw Hill.

Fullan, M., & Hargreaves, A. (1996). What’s worth fighting for in your school. New

York: Teachers College, Columbia University.

Gall, M. D., Gall, J. P., & Borg, W.R. (2003). Educational research: An introduction.

Boston, MA: Allyn and Bacon.

Greene, G. K. (2005). Quality matters: A different perspective on the relationship

between school resources and student outcomes. ProQuest Information and

Learning Company (UMI No. 3175689).

Guskey, T. R. (1998). Follow-up is key, but it’s often forgotten. Journal of Staff

Development,19(2), 7-8.

Guskey, T. R. (2000). Evaluating professional development. Thousand Oaks, CA: Corwin

Press, Incorporated.

Guskey, T. R. (2002). Getting to the root of the gap. School Administrators, 59(7), 23-25.

Guskey, T. R. (2005/2006, Winter). A conversation with Thomas R. Guskey. The

Evaluation Exchange. 11(4), 12-13.

Hackett, J. (2005). Exploring the links among professional development: Teacher

performance, and student achievement. Pro-Quest Information and Learning

Company (UMI No. 3169621).

Hargreaves, A. (2007, Summer). Five flaws of staff development and the future beyond.

Journal of Staff Development, 28(3), 37-38.

Hawley, W. D., & Rollie, D. L. (2007). The keys to effective schools: Educational reform

as continuous improvement (2nd ed.). Thousand Oaks, CA: Corwin Press.


Hirsh, S. (2003, Summer). Teacher development takes time and money, but it’s the only

sure way to improve student performance. Journal of Staff Development, 24(3), 8-

11.

Hughes, T. A. (2006). The relationship between professional learning communities and

student achievement in high schools. Unpublished doctoral dissertation, Prairie

View A & M University, Prairie View, TX.

Husby, V. (2005). Individualizing professional development: A framework for meeting

school and district goals. Thousand Oaks, CA: Corwin Press.

Inge, R. R. (2005). A survey of school principals and teachers regarding teachers’

professional development participation. ProQuest Information and Learning

Company (UMI No. 3178947).

Isaac, S., & Michael, W. B. (1995). Handbook in research and evaluation for education

and the behavioral sciences (3rd ed.). San Diego, CA: Edits.

Kandola, B. (2000, July). Training evaluation: How to get results. Training Journal, 30.

Retrieved January 6, 2008, from ABI/INFORM Global database. (Document

ID: 56392537).

Kennedy, J. E. (1996). Professional growth decisions of mid-career teachers and

influencing work context factors. Dissertation Abstracts International, 57(6),

2296. (UMI No. 9632054).

Kent, A. M. (2004, Spring). Improving teacher quality through professional development.

Education, 124(3), 427-435.

Kelleher, J. (2003, June). A model for assessment-driven professional development. Phi

Delta Kappan, 84 (10), 751.


Ketter, P. (2006, December). Investing in learning; Looking for performance. ASTD

Training + Development 60(12), 31-33.

Kirkpatrick, D. L., & Kirkpatrick, J. D.(2005). Transferring learning to behavior. San

Francisco, CA: Berrett-Koehler Publishers, Incorporated.

Kirkpatrick, D. L., & Kirkpatrick, J. D. (2006). Evaluating training programs. San

Francisco, CA: Berrett-Koehler Publishers, Incorporated.

Kirkpatrick, D. L. (1977). Evaluating training programs: Evidence vs. proof. ASTD

Training + Development, 31(11), 9-12.

Kirkpatrick, D. L. (1978). Evaluating in-house training programs. ASTD Training +

Development, 32(9), 6-9.

Kirkpatrick, D. L. (1996). Great ideas revisited. Techniques for evaluating training

programs. Revisiting Kirkpatrick’s four-level model. ASTD Training +

Development, 32(9), 54-59.

Kirkpatrick, J. (2007, August). The hidden power of Kirkpatrick’s four levels. ASTD

Training + Development, 61(8), 34-37.

Kremer-Hayon, L. (1991). Teacher professional development: The elaboration of a

concept. European Journal of Teacher Education, 14(1), 79-85.

Kritsonis, W. (2002). Schooling: Historical- philosophical-contemporary events and

milestones. Mansfield, OH: BookMasters.

Kritsonis, W. (2005). Practical applications of educational research and basic statistics.

Mansfield, OH: BookMasters.


Labuda, C. B. (2004). The impact of professional development program on the

implementation of problem-solving strategies in the classroom. Unpublished

doctoral dissertation, University of Houston, Houston, TX.

Laine-Lutz, S. W. M. (2000). Organizational support for staff development: Exemplary

practices in education and the private sector. ProQuest Information and Learning

Company (UMI No. 3038613).

Laine, S. W. M., & Otto, C. (2000). Professional development in education and the

private sector: Following the leaders. In A.M. Kent, Improving teacher quality

through professional development. Education, 124(3), 427-435.

Lieberman, A. (1995). Practices that support teacher development. Phi Delta Kappan,

76(8), 591-596.

Lieberman, A., & Miller, L. (2007). Transforming professional development:

Understanding and organizing learning communities. In W.D. Hawley & D.L.

Rollie (Eds.). The keys to effective schools: Educational reform as continuous

improvement (2nd ed.). Thousand Oaks, CA: Crown Press.

Loucks-Horsley, S. (1987). Continuing to learn: A guidebook for teacher development.

Andover, MA: Regional Laboratory for Educational Improvement of the

Northeast and Islands.

Lowden, C. (2003). Evaluating the effectiveness of professional development. ProQuest

Information and Learning Company (UMI No. 3081025).

Marczely, B. (1996). Personalizing professional growth. Thousand Oaks, CA: Corwin

Press, Incorporated.

Marshall, J. C., Pritchard, R. J., & Gunderson, B. H. (2001, February). Professional

development: What works and what doesn’t. Principal Leadership, 1(6), 64-68.


Maxwell, J. A. (2005). Qualitative research design: An interactive approach. Thousand

Oaks, CA: Sage Publications.

Meell, M. A. (1985). The impact of motivational strategies on staff development

programs in education and on training programs in business and industry:

Implications for teacher education. Unpublished doctoral dissertation, University

of Houston, Houston, TX.

Miller, L. M. (2006). Professional development in a large school district: An application

of Guskey model. Unpublished doctoral dissertation, University of Toronto,

Toronto, Canada.

Miller, R.W. M. (2004). A retrospective analysis of Hewlett-Packard’s software job skills

professional development program. Unpublished doctoral dissertation, University

of Houston, Houston, TX.

Mulder, M., Nijhof, W., & Brinkerhoff, R. (1995). Corporate training for effective

performance. Norwell, MA: Kluwer Academic Publishers.

Natale, S. M., & Fenton, M. B. (1997). Business education and training: A value-laden

process: Volume one, education and value conflict. Lanham, MD: University

Press of America.

National Staff Development Council (1994). Standards for staff development: Middle

level edition. Oxford, OH: Author.

National Staff Development Council (1995a). Standards for staff development:

Elementary school edition. Oxford, OH: Author.

National Staff Development Council (1995b). Standards for staff development: High

school edition. Oxford, OH: Author.


National Staff Development Council (2006). Standards. Retrieved October 27, 2006,

from http://www.nsdc.org/standards/about/index.cfm

No Child Left Behind Act (2001). Retrieved January 3, 2006, from www.ed.gov

Oja, S. (1980). Adult development is implicit with staff development. Journal of Staff

Development, 1(3), 7-56.

Parry, S. B. (1997). Evaluating the impact of training. Alexandria, VA: American Society

for Training and Development.

Patterson, K. (2006, October). Old dogs, new tricks: T + D talked with Kerry Patterson.

ASTD Training + Development, 60(10), 20-21.

Phillips, J. J., Phillips, P. P., & Hodges, T. K.(2004). Making training evaluation work.

Alexandria, VA: American Society of Training and Development Press.

Richardson, J. (2007, Summer). No tears for the dear departed “inservice” – its time has

come. Journal of Staff Development, 28(3), 61-64.

Richardson, J. (2002, September) Leave no teacher behind. Retrieved October 15, 2006,

from http://www.nsdc.org/library/publications/results/res9-02rich.cfm

Roberts, S. M., & Pruitt, E. Z. (2003). Schools as professional learning communities:

Collaborative activities and strategies for professional development. Thousand

Oaks, CA: Sage Publishing.

Rossett, A. (2007, February). Leveling the levels. ASTD Training + Development, 61(2),

49-53.


Rothwell, W. J., & Kazanas, H. C. (1992). Mastering the instructional design process: A

systematic approach. San Francisco, CA: Jossey-Bass Publishers.

Rowntree, D. (1981). A dictionary of education. Totowa, NJ: Barnes & Noble.

Salopek, J.J. (2006, October). Old dogs, new tricks. ASTD Training + Development,

60(10), 20-21.

Salant, P., & Dillman, D. (1994). How to conduct your own survey. United States of

America: John Wiley & Sons, Incorporated.

Sekowski, G. J. (2002). Evaluating training outcomes: Testing an expanded model of

training outcome criteria. ProQuest Information and Learning Company (UMI

No. 3076227).

Sirkin, R. M. (2006). Statistics for the social sciences. Thousand Oaks, CA: Sage

Publication.

Smith, D. W. (2006). A return on investment study for an engineering company in

Huntsville, Alabama using a community college web-based training program.

ProQuest Information and Learning Company (UMI No. 3211246).

Sparks, D. (2005, April). Principals serve schools as leaders of professional learning.

Retrieved on July 20, 2007, from

http://www.nadc.org/library/publications/results/res4-05spar.cfm

Sparks, D. (2003, Winter). Interview with Michael Fullan: Change agent. Journal of Staff

Development, 24(1), 55-58.

Sparks, D.,& Hirsh, S. (1997). A new vision for staff development. Alexandria, VA:

Association for Supervision and Curriculum Development.


Swanson, R. A. (1994). Analysis for improving performance: Tools for diagnosing

organizations and documenting workplace expertise. San Francisco, CA: Berrett-

Koehler.

Tsarouhas, A. (2004). Understanding organizational context for evaluation of training

outcomes: A multi-site case study in the community mental health sector.

Unpublished doctoral dissertation, University of Ottawa, Ottawa, Canada.

Vontz, T., & Leming, R. (2005, Fall/2006, Winter). Designing and implementing

effective professional development in civic education. International Journal of

Social Education, 20(2), 67-88.

Walker, A.G. (2007, February). Is performance management as simple as abc? ASTD

Training + Development, 61(2), 54-57.

Webster’s new world college dictionary (4th ed.). (1999). New York, NY: Macmillan,

USA.

Wilson, J. (1997). First steps in education professionals. In S. Natale & M. Fenton

(Eds.), Business education and training: A value-laden process (pp.279-283).

Lanham, MD: University Press of America.

Zender, G. (2002). An evaluation of a science professional development model:

Examining participants’ learning and use of new knowledge and skills,

organizational support and change, and student learning outcomes. ProQuest

Information and Learning Company (UMI No. 3074837).


APPENDIXES


APPENDIX A

PROFESSIONAL DEVELOPMENT ASSESSMENT TOOL SURVEY


APPENDIX B

INTERVIEW QUESTIONS


Qualitative Interview Questions

The following questions will be used as a guide during the interview process in

the qualitative portion of the study. The qualitative portion of the survey will be

administered to department heads, administrators and managers of the training and

development and professional development departments. The qualitative questions will

focus on Guskey’s level four (Participants’ use of knowledge and skills) and level five

(Student learning outcomes).

1. How is the professional development provided to your employees evaluated for

overall effectiveness? (probe: are goals, desired outcomes, or criteria clearly defined?)

2. Describe the process your organization uses to link learning to individual and organizational performance.

3. Describe ways in which follow-up training or evaluation is provided to ensure participants are implementing the new skills learned.

4. Does your organization discuss ways for improvement with employees outside the Professional/Training and Development department? (probe: Are employees involved in the professional development planning process?)

5. Describe ways in which your organization evaluates participants’ change in attitude and/or behavior.

6. How is the training being offered selected? Who determines what training is needed?

7. Does your organization use an evaluation model? If so which one?

8. How much money is allocated per employee for P.D. training? Is there a different

budget allocated for administrators/management verses what’s allocated for

employees? or (What percentage of your overall budget is allocated to professional

development training?)


APPENDIX C

PERMISSION LETTER TO SCHOOL DISTRICT


[Date]


Dear School District:

It is a pleasure for me to contact you. I am a PhD student in Educational Leadership at Prairie View A & M University. I am conducting a research study on evaluating professional development and its overall effectiveness.

The purpose of my study is to examine the differences in how private corporations evaluate their professional development training versus how public education evaluates its professional development training. In looking at the evaluation process, I hope to answer the question: Does the evaluation process determine the overall effectiveness of professional development training?

Permission to survey your employees is requested. Employees will be asked to complete an online survey titled the Professional Development Assessment Tool (PDAT). The instrument will take approximately 10-12 minutes to complete. No risks are associated with the study. Attached you will find a hard copy of the instrument. The benefits of participation will be significant in that the study will provide valuable data to school districts and private corporations about the overall effectiveness of professional development training given to employees.

Thank you very much for considering my study. If you have any questions, please feel free to contact me or my dissertation advisor at your earliest convenience.

Sincerely,

Yolanda E. Smith
PhD Student in Educational Leadership
Prairie View A & M University
Sr. Robotic Instructor, United Space Alliance
4026 Almond Lake Dr., Houston, Texas 77047
(713) 703-0429 cell
[email protected]

William Allan Kritsonis, PhD
Professor & Dissertation Advisor
PhD Program in Educational Leadership
College of Education, Prairie View A&M University
Prairie View, TX 77446
281-550-5700
[email protected]

Ms. Marcia Sheldon
Prairie View A & M Research & Development
P. O. Box 4149, Prairie View, Texas 77446
(936) 261-1588
[email protected]

APPENDIX D

SCHOOL DISTRICT APPROVAL LETTER


APPENDIX E

LETTER TO PRINCIPALS


[Date]


Dear School Principal:

It is a pleasure for me to contact you. I am a PhD student in Educational Leadership at Prairie View A & M University. I am conducting a research study on evaluating professional development and its overall effectiveness.

The purpose of my study is to examine the differences in how private corporations evaluate their professional development training versus how public education evaluates its professional development training. In looking at the evaluation process, I hope to answer the question: Does the evaluation process determine the overall effectiveness of professional development training?

Permission to survey your employees is requested. Employees will be asked to complete an online survey titled the Professional Development Assessment Tool (PDAT). The instrument will take approximately 10-12 minutes to complete. No risks are associated with the study. Attached you will find a hard copy of the instrument. The benefits of participation will be significant in that the study will provide valuable data to school districts and private corporations about the overall effectiveness of professional development training given to employees.

Thank you very much for considering my study. If you have any questions, please feel free to contact me or my dissertation advisor at your earliest convenience.

Sincerely,

Yolanda E. Smith
PhD Student in Educational Leadership
Prairie View A & M University
Sr. Robotic Instructor, United Space Alliance
4026 Almond Lake Dr., Houston, Texas 77047
(713) 703-0429 cell
[email protected]

William Allan Kritsonis, PhD
Professor & Dissertation Advisor
PhD Program in Educational Leadership
College of Education, Prairie View A&M University
Prairie View, TX 77446
281-550-5700
[email protected]

Ms. Marcia Sheldon
Prairie View A & M Research & Development
P. O. Box 4149, Prairie View, Texas 77446
(936) 261-1588
[email protected]


APPENDIX F

E-MAIL FROM THE PROFESSIONAL DEVELOPMENT DEPARTMENT


Please see Exxxxxxx's message below. Today, she and Sxxxxxxxxx participated in an interview with a person working on her doctorate. The focus of the dissertation is the comparison of P.D. as delivered by public education and that of the corporate world. Exxxxxxx says that the survey takes about five minutes and would be good for us to complete. If you have time, you can find the link in her message below. It must be completed today.

XXXXXX XXXXXX
Professional Development Services
713-XXX-XXX
High Quality Professional Learning to Support Student Achievement

From: XXXXXXXXXXX
Sent: Tuesday, September 18, 2007 2:33 PM
To: XXXXXXXXXXXXX
Subject: RE: thank you

We should have recorded it! It was good for us to do this. It would also be great if our department members participated in the survey, which will be a part of her research. It takes about 5 minutes at www.pdat.speedsurvey.com. (She approached principals at a time when they were inundated with surveys and they let her know it; therefore, she only received 197 responses from public educators.) If we are going to participate, we must complete it today.

XXXXXXXXXXXXX, Manager
Professional Development Services
XXXXXXXXXXX Independent School District
XXXXXXXX Street
Houston, TX
713-xxxxxxxxxxx
High Quality Professional Learning to Support Student Achievement

From: xxxxxxxxxxxxxx
Sent: Tuesday, September 18, 2007 1:16 PM
To: xxxxxxxxxxxxxxxxxxxxx
Subject: thank you

Thank you for making yourselves available to be interviewed by Yolanda Smith today.

xxxxxxxxxxxxxx
Professional Development Services
713-xxxxxxxxxxx
High Quality Professional Learning to Support Student Achievement


APPENDIX G

PERMISSION LETTER TO PRIVATE CORPORATION


[Date]


Dear Private Corporation:

It is a pleasure for me to contact you. I am a PhD student in Educational Leadership at Prairie View A & M University. I am conducting a research study on evaluating professional development and its overall effectiveness.

The purpose of my study is to examine the differences in how private corporations evaluate their professional development training versus how public education evaluates its professional development training. In looking at the evaluation process, I hope to answer the question: Does the evaluation process determine the overall effectiveness of professional development training?

Permission to survey your employees is requested. Employees will be asked to complete an online survey titled the Professional Development Assessment Tool (PDAT). The instrument will take approximately 10-12 minutes to complete. No risks are associated with the study. Attached you will find a hard copy of the instrument. The benefits of participation will be significant in that the study will provide valuable data to school districts and private corporations about the overall effectiveness of professional development training given to employees.

Thank you very much for considering my study. If you have any questions, please feel free to contact me or my dissertation advisor at your earliest convenience.

Sincerely,

Yolanda E. Smith
PhD Student in Educational Leadership
Prairie View A & M University
Sr. Robotic Instructor, United Space Alliance
4026 Almond Lake Dr., Houston, Texas 77047
(713) 703-0429 cell
[email protected]

William Allan Kritsonis, PhD
Professor & Dissertation Advisor
PhD Program in Educational Leadership
College of Education, Prairie View A&M University
Prairie View, TX 77446
281-550-5700
[email protected]

Ms. Marcia Sheldon
Prairie View A & M Research & Development
P. O. Box 4149, Prairie View, Texas 77446
(936) 261-1588
[email protected]

APPENDIX H

PRIVATE CORPORATION APPROVAL LETTER


APPENDIX I

HUMAN PARTICIPANT EDUCATION FOR RESEARCH


APPENDIX J

INSTITUTIONAL REVIEW BOARD


VITA

YOLANDA E. SMITH
4026 Almond Lake Dr.
Houston, Texas 77047

EDUCATIONAL HISTORY

Texas Southern University, Houston, Texas, B.S. in Mathematics, August 1994

Prairie View A & M University, Prairie View, Texas, M.Ed. in Educational Administration, December 2004

EMPLOYMENT HISTORY

1998 - 2008 Sr. Robotic Instructor, United Space Alliance

1994 - 1998 Mathematics Teacher, Houston ISD

1990 - 1994 Teacher Aide, Houston ISD