

The Baldrige Education Criteria for Performance Excellence Framework
Empirical test and validation

Masood Abdulla Badri and Hassan Selim
Department of Business Administration, College of Business & Economics, United Arab Emirates University, Al Ain, United Arab Emirates

Khaled Alshare and Elizabeth E. Grandon
Accounting & Computer Information System Department, Emporia State University, Emporia, Kansas, USA, and

Hassan Younis and Mohammed Abdulla
Department of Business Administration, College of Business & Economics, United Arab Emirates University, Al Ain, United Arab Emirates

Abstract

Purpose – The purpose of this paper is to empirically test the causal relationships in the Malcolm Baldrige National Quality Award (MBNQA) Education Performance Excellence Criteria.

Design/methodology/approach – Using a sample of 220 respondents from 15 United Arab Emirates (UAE) universities and colleges, results of regression analysis and confirmatory structural equation modeling show that all of the hypothesized causal relationships in the Baldrige model are statistically significant.

Findings – A comprehensive "measurement model" grounded in the Baldrige Performance Excellence in Education Criteria for the 33 items of measurement is developed, tested, and found to be valid and reliable. Leadership is identified as a driver for all components in the Baldrige system, including measurement, analysis and knowledge management; strategic planning; faculty and staff focus; and process management. All Baldrige components (categories) are significantly linked with organizational outcomes as represented by the two categories of organizational performance results and student, stakeholder and market focus. The paper also tests the statistical fit of the only Baldrige model dealing with higher education, published in 1998 by Winn and Cameron.

Research limitations/implications – The data obtained are based on a sample of UAE higher education institutions. Studies in other countries should be conducted using the developed model to ensure the reliability of the results obtained.

Practical implications – The paper provides a greater understanding of the linkages between the elements making up the MBNQA Education Performance Excellence Criteria model, facilitating the guiding role that award models play in the implementation of quality management in higher education.

Originality/value – For the first time, an instrument of the MBNQA Education Performance Excellence Criteria is developed and tested. A new in-depth and holistic perspective for examining the relationships and linkages in the MBNQA Education Performance Excellence Criteria model is provided.

Keywords Baldrige Award, Quality awards, Higher education, Performance measures, United Arab Emirates

Paper type Research paper



Received March 2005; revised August 2005

International Journal of Quality & Reliability Management, Vol. 23 No. 9, 2006, pp. 1118-1157. © Emerald Group Publishing Limited, 0265-671X. DOI 10.1108/02656710610704249


Introduction
Many researchers, encouraged by case study success stories, have called for evidence from large-scale studies on the effectiveness of quality management programs, such as the Baldrige Criteria (Meyer and Collier, 2001; Bigelow and Arndt, 1995, 2000; Motwani et al., 1996; Gann and Restuccia, 1994). The MBNQA has evolved from a means of recognizing and promoting exemplary quality management practices to a comprehensive framework for world-class performance, widely used as a model for improvement. As such, its underlying theoretical framework is of critical importance, since the relationships it portrays convey a message about the route to competitiveness (Flynn and Saladin, 2001). It becomes imperative that the relationships between constructs be tested and validated. This is important because organizations allocate substantial resources toward improvement of their processes based on the relationships in the Baldrige framework.

There are only a few studies that fully address Baldrige in the area of education. Evans (1997) initially discussed the MBNQA and institutions of higher education by relating it to learning and curriculum issues and identifying what higher education should be teaching, based upon a survey of Baldrige Award winners. Using the findings of the Evans study as a baseline, Weinstein et al. (1998) identified an apparent gap between the Baldrige Award winners' perceptions and current practice in higher education institutions. While developing a curriculum based upon Baldrige principles has received noteworthy attention, what is not readily evident within the literature is the actual application of MBNQA concepts as part of the educational delivery process. Belohlav et al. (2004) described how several faculty members in the Department of Management at DePaul University designed, developed, and delivered course material using the MBNQA framework both as part of the structure and as a point in their individual classes. They concluded that end-of-term student evaluations indicated that the approach led to a higher level of student engagement in the learning process, as evidenced by more abundant and higher-quality feedback to the instructors.

Winn and Cameron (1998) examined the validity of the proposed relationships among the MBNQA dimensions using data from higher education. They developed a survey instrument of the processes, practices, and outcomes of quality at a large Midwestern university in the USA. Through psychometric tests, they showed that the seven MBNQA dimensions are distinct constructs and are measured reliably by the questionnaire items. To assess the validity of the framework's assumptions, three sets of regression analyses were conducted. The relationship between the leadership dimension and each of the four system dimensions was strong and statistically significant. They concluded that the assumed relationship between an organization's leadership and each of the quality processes was definitely supported. Using structural equation modeling, the same authors proceeded to perform statistical analysis of the MBNQA framework as a whole. They presented an alternative framework of relationships that took into account the lack of direct effects on quality and operational results from leadership, information and analysis, and strategic planning, and the lack of direct effects on customer focus and satisfaction from leadership, information and analysis, and human resource development and management. The alternative model evidenced acceptable goodness-of-fit with the data.


While the Baldrige Award in education has captured the attention of decision makers, there has been little empirical research examining the usefulness of the award criteria to guide the actions of organizations that seek to improve performance (Goldstein and Schweikhart, 2002; Arif and Smiley, 2004). This research takes a step toward providing senior leaders in educational organizations with a valid means of making those decisions. The published Baldrige model (Education Criteria for Performance Excellence) (NIST, 2004) is shown in Figure 1. The general MBNQA theory that "Leadership drives the system which creates results" suggests that the performance relationships are recursive (Meyer and Collier, 2001). When Baldrige quality experts defined the performance relationships among the seven categories, uncertain of the true direction of causation, they defaulted to the premise that all categories are related and used two-headed arrows among all Baldrige categories.

Figure 1. Baldrige Education Criteria for Performance Excellence model

We seek to add to the growing body of support related to the validity of the general Baldrige framework by examining it at the level of its theoretical constructs as it relates to the education industry in an international context. By moving beyond the specific criteria, we seek to examine the model in a larger context, as a theoretical model for quality management in higher education. We tested whether there was empirical evidence that the relationships between the theoretical constructs held. To this end, we examined individual relationships between categories and overall relationships between categories when they acted as an integrated system. We hypothesized that the seven Baldrige categories were related in a recursive causal model and that the sign of each path coefficient was positive. So, for example, Leadership's direct effects in the causal model were represented in two ways: first, as the leadership score increased, the scores of the other dimensions of strategic planning, faculty and staff focus, student, stakeholder, and market focus, and process management increased as well; and second, as the leadership score increased, the organizational results dimension scores should also increase. Leadership's indirect effects were represented by increases in the leadership score causing the organizational results scores to increase through leadership's influence on the mediating dimensions in between. The award criteria were studied to determine if the Baldrige theory of relationships among the seven Baldrige categories was supported in UAE higher education institutions.

The objectives of this study of the Baldrige Education Criteria for Performance Excellence model were to:

. develop a comprehensive measurement model, with associated constructs and scales, that accurately captured the content of the MBNQA-Education Criteria for Performance Excellence;

. address whether the seven Baldrige categories represented a good model for higher education organizations (especially in the UAE); and

. provide insight into the strength and direction of causation among the seven Baldrige categories.

The insights gained from these objectives should contribute to the quality management, performance measurement, and education literature. While the seven categories and the associated structural (causal) model in the original and education criteria were similar, the specific measures addressed within each category (i.e. the measurement model) were significantly different. For example, the original Baldrige Criteria (NIST, 1995), most applicable to manufacturing, defined the customer as the buyer of goods and services; however, the Baldrige Education Criteria for Performance Excellence (NIST, 2004) defined customers as the students, their families, communities, governments, and investors in students. Hence, the customer-driven measures used to develop the scales and measurement model for the Baldrige Education Criteria for Performance Excellence were different from those of the original Baldrige Criteria.

The importance of the study
The recent trends of decreasing financial support for educational institutions, increasing education costs, more local and global competition, changing student expectations and backgrounds, and greater engagement of students and communities in continuous lifelong learning require higher education institutions to do more with less. Under such pressure, administrators of these institutions should be concerned about the quality of their products. Thus, a solid theoretical model that helps them in managing the quality of education would be highly appreciated. Additionally, the importance of this study lies in the fact that it attempted to test the model at the level of the theoretical constructs (items) rather than at the criteria level (dimensions), which validates the model in a broader context as a theory of quality management. Moreover, no study, to the best of our knowledge, has utilized the Baldrige Educational Criteria for Performance Excellence as a framework for studying quality management in educational institutions, especially in a non-Western country; instead, researchers have used the original (business) Baldrige criteria. Even though the MBNQA framework acknowledged that the educational criteria were built on the same seven dimensions (categories) used for the business criteria, it did not assume that the requirements of all organizations were necessarily addressed in the same way (ECPE, 2005). The Baldrige Criteria for Education project is dedicated to improving educational organizations across the nation by providing leaders with resources for improvement that will make a difference when implemented as designed. Moreover, as leaders continue to improve their understanding of making meaningful changes in their organizations, the wealth of resources and tools available to everyone will also improve.

Review of literature
In general, much of the published work on the quality aspects of higher education has concentrated on effective course delivery mechanisms and the quality of courses and teaching (Oldfield and Baron, 2000; Athiyaman, 1997; Bourner, 1998; Cheng and Tam, 1997; McElwee and Redman, 1993; Palihawadana, 1996; Soutar and McNiel, 1996; Varey, 1993; Yorke, 1992). In particular, commentaries and case examples of quality initiatives appeared, but most authors focused on the applicability of quality principles and tools to the education setting (Chaffee and Sherr, 1992; Seymour, 1993; Sherr and Lozier, 1991; Cornesky et al., 1991; Marchese, 1993). More evidence has yet to be produced to confirm the effectiveness of quality programs and processes on desired organizational outcomes in higher education (Winn and Cameron, 1998). Addressing this dearth was a key objective of this research.

The MBNQA framework and the European Foundation for Quality Management (EFQM) model have become templates for most quality awards in many countries (Mackerron et al., 2003). These frameworks are widely adopted by organizations as a means of self-assessment to enhance performance. They represent an operational assessment tool for quality management practices. As indicated by the large number of criteria guidelines that have been distributed, many organizations use these criteria to assess their organizational quality. The applicability and usefulness of both the MBNQA and EFQM models is evident from the vast empirical research that exists (Mackerron et al., 2003; Stewart, 2003; Da Rosa et al., 2003; George et al., 2003; Li and Yang, 2003; Castka et al., 2003). However, in this study, we concentrate on the MBNQA Education Performance Excellence Framework, since it was the most popular model used at UAE institutions (Badri and Abdulla, 2004).

The seven dimensions in the MBNQA are hypothesized to have a particular relationship to each other, as illustrated in Figure 1. Although the Baldrige criteria and framework are widely accepted in practice, there is surprisingly little theoretical and empirical evidence of their validity (Ford and Evans, 2000). Several studies presented empirical analyses of the original Baldrige Criteria in the manufacturing environment and provided evidence that the performance relationships in the Baldrige causal model were supported in US firms. Most recently, York and Miree (2004) examined the relationship between TQM and financial performance using a sample of Baldrige Award winners, and replicated the analysis with a second sample of state quality award-winning companies and three different sets of financial performance measures. Baldrige quality award winners generally had better financial performance than their peers both before and after winning a quality award.

Several studies examined the issue of the validity of the Baldrige framework and criteria more directly. Authors such as Keinath and Gorski (1999) have used state quality awards as surrogates for the Baldrige Award, since data on actual scores can often be obtained from state award agencies, and most state awards are virtually identical to the Baldrige Award. Pannirselvam et al. (1998) reported an empirical analysis of data from the Arizona Governor's Quality Award (AGQA), whose criteria mirror the original Baldrige Criteria (with only minor editing). Their objective was to provide evidence of validity for the AGQA model and to generalize the validity to the MBNQA Criteria. They concluded that the MBNQA measurement model (vis-a-vis AGQA data) was reliable and valid. However, they did not evaluate dependent relationships among the Baldrige categories (i.e. the structural model).

In a similar line of inquiry, Pannirselvam and Ferguson (2001) tested the validity of the relationships between the categories by modifying the 1992 Baldrige framework into an eight-construct model, separating customer focus and satisfaction into two separate constructs. Their results provided evidence to confirm the validity of the modified framework. Similarly, Ford and Evans (2000) conducted a detailed analysis of the content validity of the strategic planning category. Evans and Ford (1997) examined the relationship between the Baldrige core values and the processes embedded in the criteria. Evans (1997) proposed a causal model describing the key linkages in the Baldrige framework; however, the model was not tested. Handfield and Ghosh (1995) used structural equation modeling to empirically test the linkages between criteria in the 1992 framework. They reported empirical support for numerous causal relationships among the seven categories of the Baldrige model in the manufacturing environment. Similar to Handfield and Ghosh (1995), Wilson and Collier (2000) used structural equation modeling of the 1992 framework, concluding that a modified set of five Baldrige causal relationships was a good predictor of organizational performance. Other studies that examined the causal relationships in the MBNQA in industries other than education include Khanna et al. (2002), Goldstein and Schweikhart (2002), Flynn and Saladin (2001), Dow et al. (1999) and Samson and Terziovski (1999).

The findings in these studies provided statistical support for the Baldrige theory of performance relationships depicted in the Baldrige causal model. Most of the studies found that the Leadership dimension is a driver of quality (Meyer and Collier, 2001; Winn and Cameron, 1998; Pannirselvam and Ferguson, 2001; Flynn and Saladin, 2001). Although each of these studies contributed to the validation of the Baldrige framework, they all focused on the 1992 framework. It is important to understand the evolution of the framework and investigate the validity of the 2004 framework as it pertains to higher education, particularly given the major re-engineering of the criteria since 1992.

Since questions have been raised about the lack of evidence for the causal relationships underlying the quality framework in higher education organizations, this research addressed two questions:

(1) Are the proposed relationships between the categories in the Baldrige Education Criteria for Performance Excellence framework valid?

(2) What is the strength of the relationships between the different quality management constructs prescribed by the criteria?

We addressed these two questions using data from higher education organizations in the UAE. The paper also presents detailed results and implications for higher education authorities in the UAE.


Research methodology
Research model: dimensions and categories
Leadership is the key driver in the MBNQA. Without the involvement and commitment of senior leaders, the quality management journey becomes difficult and at times impossible (Vora, 2002). The MBNQA model evaluates top management leadership's ability to instill quality values and customer focus among employees, and to continuously improve their leadership styles. In higher education, senior leaders should inspire and motivate the entire workforce and should encourage all faculty and staff to contribute, develop and learn, be innovative, and be creative. The governance body is ultimately responsible to all stakeholders for the ethics, vision, actions, and performance of the organization. Senior leaders should serve as role models through their ethical behavior and personal involvement in planning, communication, coaching, development of future leaders, review of organizational performance, and faculty and staff recognition (Vora, 2002). As role models, they can reinforce ethics, values, and expectations while building leadership, commitment, and initiative throughout the organization. In addition to their important role within the organization, senior leaders have other avenues to strengthen education. Reinforcing the learning environment in the organization might require building community support and aligning community and business leaders and community services with this aim. The leadership dimension in the Baldrige Education Criteria for Performance Excellence includes six categories: organizational leadership (senior leadership direction, organizational governance, organizational performance review) and social responsibility (responsibility to the public, ethical behavior, and support of key communities).

The emphasis of the MBNQA, with respect to the "strategic planning" criterion, is on keeping up with marketing changes and needs, and using advanced technology for launching new products and services (Khoo and Tan, 2003; Mak, 1999, 2000). The strategic planning dimension examines how the organization develops strategic objectives and action plans, how strategic objectives and action plans are deployed, and how progress is measured. For higher education, the category stresses that learning-centered education and operational performance are key strategic issues that need to be integral parts of the organization's overall planning. For example, e-learning-centered education is a strategic view of education. The focus is on the drivers of key factors in educational success such as student learning, student persistence, student and stakeholder satisfaction, new markets, and market share. Learning-centered education focuses on the real needs of students, including those derived from market requirements and citizenship responsibilities. The criteria emphasize that improvement and learning need to be embedded in work processes. The Strategic Planning category examines how the organization understands key student, stakeholder, market, and societal requirements as input to setting strategic directions. The requirements in the Strategic Planning category encourage strategic thinking and acting to develop a basis for a distinct leadership position in the market. The strategic planning dimension in the Baldrige Education Criteria for Performance Excellence has four categories: strategy development (strategy development process, and strategic objectives) and strategy deployment (action plan development and deployment, and performance projections).

In the Baldrige framework for education, Student, Stakeholder, and Market Focus addresses how the organization seeks to understand the needs of current and future students and stakeholders and to understand the markets, with a focus on delighting students and stakeholders, building loyalty, and meeting students' and stakeholders' expectations. The MBNQA stresses this issue in its "customer and market focus" criterion by highlighting the importance of developing listening and learning skills in responding to customers' opinions and complaints. The MBNQA criteria, in evaluating customer relations, determine how special training and the career needs of customer-contact employees are met. For higher education, this dimension considers relationships an important part of an overall listening, learning, and performance excellence strategy. The criteria also evaluate trends in customer satisfaction and how these trends compare with competitors as a means to assess the effectiveness of the company's customer-relations management process. The student and stakeholder satisfaction and dissatisfaction results provide vital information for understanding students, stakeholders, and markets. In many cases, these results and trends provide the most meaningful information, not only on students' and stakeholders' views but also on their actions and behaviors, such as student persistence and positive referrals. The student, stakeholder, and market focus dimension is reflected by three categories; however, for the purpose of this research, we split this dimension into two sub-dimensions for a better representation. Hence, we have two dimensions and four categories: the first dimension is student, stakeholder and market knowledge, with two categories (student knowledge, and stakeholder and market knowledge); and the second dimension is student and stakeholder relationship and satisfaction, with two categories (student and stakeholder relations, and student and stakeholder satisfaction determination).

The MBNQA criteria provide for the evaluation of data from the support processes. They evaluate information analysis at different levels of the business. The MBNQA does not call for the evaluation of the financial performance of an organization; however, it does evaluate the ability of the institution to link quality and operational data to financial performance. It evaluates the methods used to continuously improve the information gathering and analysis cycle. In higher education, the measurement, analysis, and knowledge management dimension is the main point within the criteria for all key information about effectively measuring and analyzing performance and managing organizational knowledge to drive improvement in student and operational outcomes. It calls for the alignment of the organization's programs and offerings with its strategic objectives. The dimension addresses knowledge management and all basic performance-related information, as well as how such information is analyzed and used to optimize organizational performance. The measurement, analysis and knowledge management dimension is given by four categories: measurement and analysis of organizational performance (performance measures, and performance analysis) and information and knowledge management (data and information availability, and organizational knowledge).

The MBNQA criteria emphasize the need for human resource plans to support and help achieve the organization's goals. In higher education, faculty and staff focus addresses key human resource practices – those directed toward creating and maintaining a high-performance workplace with a strong focus on students and learning, and toward developing faculty and staff for adaptation to change. The dimension covers faculty and staff development and management requirements in an integrated way, aligned with the organization's strategic objectives. The faculty and staff focus includes the work environment and the faculty and staff support climate. To reinforce the basic alignment of workforce management with overall strategy, the criteria also cover faculty and staff planning as part of overall planning in the strategic planning dimension. The faculty and staff dimension is given by seven categories: work systems (organization and management, faculty and staff performance management system, and hiring and career progression); faculty and staff learning and motivation (faculty and staff education, training, and development, and motivation and career development); and faculty and staff well-being and satisfaction (work environment, and faculty and staff support and satisfaction).

The MBNQA "process management" criterion examines how new products and services are designed to meet customer needs and how critical customer needs and competitor characteristics are identified. The MBNQA is non-prescriptive regarding the tools used to control process quality. In discussing process management, prior research focused on the main process. Additionally, the MBNQA model evaluates the process management of support services. The criteria evaluate supplier quality management more thoroughly, measuring not only the methods used to inspect incoming material but also the actions taken to improve the quality of supplied material and hence reduce the cost of inspection. The criteria also evaluate the methods used by the business to audit and improve its own quality assessment practices. In higher education, process management is the focal point within the Education Criteria for all key processes. Built into the category are the central requirements for efficient and effective process management: effective education design and delivery; a focus on student learning; linkage to students, stakeholders, suppliers, and partners and a focus on learning-centered processes that create value for all key stakeholders; and evaluation, continuous improvement, and organizational learning. Agility, operational efficiencies tied to changes in revenue, and cycle time reduction are increasingly important in all aspects of process management and organizational design. It is crucial to utilize key measures for tracking all aspects of the overall process management. The process management dimension is given by two categories: learning-centered processes and support processes.

In higher education, the organizational performance results category provides a results focus that encompasses student learning; student and stakeholder satisfaction; and overall budgetary, financial, and market performance. It also covers initiatives that seek to create a positive, productive, learning-centered, and supportive work environment; governance structure and social responsibility; and recognition of results for all key processes and process improvement activities. Through this focus, the criteria's purposes – superior value of offerings as viewed by students, stakeholders, and markets; superior organizational performance as reflected in operational, legal, ethical, and financial indicators; and organizational and personal learning – are maintained. Thus, this dimension provides "real-time" information (measures of progress) for evaluation and improvement of educational programs, offerings, services, and organizational processes, in alignment with the overall operational strategy. It calls for analysis of organizational results data and information to determine overall organizational performance. Responses should include comparison information, with brief descriptions of how the organization ensures the appropriateness of each comparison. Comparable organizations might include those of similar types and sizes, both domestic and international, as well as organizations serving similar populations of students. The organizational performance results dimension is given by six categories: student learning results; student and stakeholder results; budgetary, financial and market results; faculty and staff results; organizational effectiveness results; and governance and social responsibility results.

Questionnaire development and pilot test
To investigate the MBNQA Education dimensions, an instrument was developed to survey the level of practice of the quality items in the 33 categories. The seven Baldrige dimensions were operationalized through items on the questionnaire that captured the key elements in the MBNQA Application Guidelines. Items were guided by the criteria specified in the Malcolm Baldrige Award Application Guidelines.

Several steps were taken to ensure that the questionnaire used in this study provided a valid measurement of the Baldrige Education Criteria for Performance Excellence. The measurement of each of the 33 Baldrige Education categories, which cannot be measured directly, was operationalized using a scale of items. Each scale was developed based on a thorough review and understanding of the criteria (dimensions). Additionally, the content and wording of the items were directly traceable to the Baldrige Education Criteria for Performance Excellence. The number of items for each category was determined so that the content of the dimension was adequately addressed. Because the Baldrige Criteria do not prescribe particular methodologies or practices, the items were intended to identify whether, rather than how, relevant management and quality issues were addressed. For example, a scale item for the Leadership dimension (organizational leadership: senior leadership direction) (see the Appendix) asked whether senior leaders practice creating strategic directions (rather than specifying whether a particular method was used). In the development of these scales, prior scales used in different settings such as manufacturing and healthcare were selected (Flynn and Saladin, 2001; Meyer and Collier, 2001; Meyer and Schweikhart, 2002).

Each item was measured using a seven-point Likert scale. Several college and university faculty members and administrators assisted with pre-testing the questionnaire and provided valuable feedback on wording and on useful performance measures to be included in the questionnaire. This helped to establish content validity and focus the questionnaire on the MBNQA Education Criteria for Performance Excellence (NIST, 2004). For example, the Leadership dimension (organizational leadership: senior leadership direction) used the following survey questions: "Our senior leadership creates strategic direction" and "Senior leaders communicate a clear vision." These questions were tied to Baldrige Education Criteria for Performance Excellence Dimension 1.1, note (1), which states: "Organizational directions relate to creating the vision for the organization and to setting the context for strategic objectives and action plans." All survey questions were tied to specific criteria in the 2004 Baldrige Education Criteria for Performance Excellence (NIST, 2004).
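As an illustration of how such Likert-scale items can be organized for analysis, the short Python sketch below codes a few hypothetical seven-point responses and aggregates them by Baldrige category; the item labels, wording, and values are invented for illustration and are not the study's actual survey items.

```python
import pandas as pd

# Hypothetical responses from three participants on a 7-point Likert scale
# (1 = strongly disagree ... 7 = strongly agree); items are illustrative only.
responses = pd.DataFrame({
    "lead_dir_1": [6, 7, 5],   # e.g. "Our senior leadership creates strategic direction"
    "lead_dir_2": [6, 6, 5],   # e.g. "Senior leaders communicate a clear vision"
    "strat_dev_1": [5, 6, 4],  # e.g. an item from the strategy development process scale
})

# Map each questionnaire item to its Baldrige Education category
item_to_category = {
    "lead_dir_1": "Senior leadership direction",
    "lead_dir_2": "Senior leadership direction",
    "strat_dev_1": "Strategy development process",
}

# Average the items within each category to obtain a simple category score per respondent
category_scores = responses.T.groupby(item_to_category).mean().T
print(category_scores)
```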

Forty-three individuals participated in a pilot test that was conducted to determine the reliability of the measurement scales. Participants included university professors, deans, academic policy advisors, administrators, and senior college leaders. Cronbach's coefficient alpha was one measure used to evaluate reliability, and a guideline of 0.60 was used for the new scales in this study (Nunnally, 1967; Meyer and Collier, 2001). Some items were dropped to improve the reliability of the scales and shorten the instrument without compromising content validity (in the Appendix, the dropped items are identified with an asterisk in the last column). Alpha values ranged from 0.820 to 0.909 for the pilot test and from 0.857 to 0.925 for the main study (Table I), indicating excellent internal consistency of the scales.
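For readers wishing to reproduce this kind of reliability check, the following minimal Python sketch computes Cronbach's coefficient alpha for the items of a single scale; the responses shown are invented placeholders, not the study's data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) array of Likert scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Invented 7-point Likert responses: six respondents by four items of one category
scores = np.array([
    [6, 6, 5, 6],
    [7, 7, 6, 7],
    [4, 5, 4, 5],
    [6, 5, 6, 6],
    [5, 5, 5, 4],
    [7, 6, 7, 7],
])
print(round(cronbach_alpha(scores), 3))  # values >= 0.60 met the pilot-test guideline
```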

Study sample
Colleges and universities in the UAE composed the population studied in this research. The study was conducted at the facility level, so that each university or college was counted separately in the sample, regardless of its affiliation with a university or college system. Some small colleges in the country were not included in this study because they lacked the minimum resource requirements to be considered; for example, two of these small colleges operated through a single office in a single building. In addition, such universities or colleges usually have not developed extensive quality management systems. Small colleges that did not have any sort of accreditation from the Ministry of Higher Education in the UAE were also excluded from the study. It should be noted that universities and colleges provide a wide variety of educational services and are complex organizations. The Baldrige Criteria must account for this complexity and the broad range of human resource (faculty and staff), process, and information management (measurement, analysis, and knowledge management) issues that these organizations face. A total of 15 universities and colleges participated in the study.

The questionnaire was only mailed to individuals after a phone call informing them of the study, apologizing for the size of the questionnaire, and encouraging them to be honest and objective in responding to each item. Titles of the individuals contacted included vice chancellors, deputy vice chancellors, associate deputy vice chancellors, advisors, deans, vice deans, associate deans, assistant deans, academic department chairs, and unit heads. In all cases, it was made certain that each individual was familiar with the practice of each item on the questionnaire at their institution.

The questionnaire was e-mailed to 409 individuals in 15 facilities. In total, 224 individuals completed and returned the questionnaire, for a response rate of 54.7 percent. Six of the questionnaires were missing substantial results data (mostly in the student, stakeholder, and market focus and organizational performance results categories) and were not included in further analysis, resulting in a final sample size of 220. A small number of missing data points were replaced with scale-average scores.
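The scale-average replacement can be expressed compactly; the sketch below, using an invented toy DataFrame, fills a missing item response with that respondent's mean on the remaining items of the same scale (one plausible reading of "scale-average scores").

```python
import numpy as np
import pandas as pd

# Invented toy data: three items belonging to one scale, with one missing response
scale_items = pd.DataFrame({
    "item_1": [6, 5, np.nan, 7],
    "item_2": [6, 4, 5, 7],
    "item_3": [5, 5, 6, 6],
})

# Replace each missing value with that respondent's average over the scale's observed items
filled = scale_items.apply(lambda row: row.fillna(row.mean()), axis=1)
print(filled)
```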

Research hypotheses
The research hypotheses provided a comprehensive evaluation of the theory and performance relationships proposed in the Malcolm Baldrige National Quality Award Education Criteria for Performance Excellence (NIST, 2004). These hypotheses addressed specific causal relationships among the seven Baldrige categories.

As mentioned earlier, the Baldrige theory states that "Leadership drives the system which creates results" (Meyer and Collier, 2001; Winn and Cameron, 1998; Pannirselvam and Ferguson, 2001). Figure 1 presents this model, indicating the relationships between the different quality management and performance evaluation constructs. The exogenous (independent) factor in the model was leadership. The endogenous factors were strategic quality planning; faculty and staff focus; process management; measurement, analysis, and knowledge management; student, stakeholder, and market focus; and organizational performance results.

Table I. The Baldrige categories, number of items, scale reliabilities, and percent variance explained (pilot study and main study). Entries give the number of items with Cronbach's alpha in parentheses; the last column is the percent variance explained in the main study.

Category | Pilot study | Main study | Percent variance explained (main study)
Leadership
  Senior leadership direction | 11 (0.883) | 4 (0.905) | 85.789
  Organizational governance | 6 (0.830) | 4 (0.911) | 70.709
  Organizational performance review | 11 (0.846) | 5 (0.901) | 65.968
  Responsibilities to the public | 9 (0.890) | 4 (0.914) | 65.694
  Ethical behavior | 8 (0.909) | 5 (0.915) | 66.702
  Support of key communities | 4 (0.857) | 3 (0.900) | 89.801
Strategic development
  Strategy development process | 11 (0.867) | 5 (0.907) | 75.505
  Strategic objectives | 7 (0.862) | 4 (0.901) | 81.985
  Action plan development and deployment | 8 (0.839) | 5 (0.921) | 71.593
  Performance projection | 6 (0.845) | 3 (0.919) | 73.679
Student, stakeholder, and market focus
  Student knowledge | 10 (0.853) | 5 (0.901) | 72.246
  Stakeholder and market knowledge | 9 (0.846) | 4 (0.905) | 71.457
  Student and stakeholder relationships | 6 (0.868) | 3 (0.915) | 87.823
  Student and stakeholder satisfaction determination | 6 (0.820) | 3 (0.916) | 72.469
Measurement, analysis, and knowledge management
  Performance measurement | 8 (0.870) | 4 (0.906) | 83.883
  Performance analysis | 7 (0.881) | 4 (0.905) | 90.025
  Data and information availability | 10 (0.844) | 5 (0.871) | 73.602
  Organizational knowledge | 6 (0.852) | 3 (0.911) | 81.766
Faculty and staff focus
  Organization and management of work | 7 (0.836) | 3 (0.857) | 73.778
  Faculty and staff performance management | 8 (0.875) | 4 (0.916) | 85.617
  Hiring and career progression | 9 (0.833) | 4 (0.925) | 67.114
  Faculty and staff education, training and development | 8 (0.832) | 5 (0.871) | 69.339
  Motivation and career development | 6 (0.852) | 3 (0.911) | 81.797
  Work environment | 6 (0.854) | 4 (0.915) | 82.190
  Faculty and staff support and satisfaction | 9 (0.838) | 5 (0.872) | 68.509
Process management
  Learning-centered processes (LCP) | 10 (0.864) | 5 (0.906) | 76.610
  Support processes (SP) | 9 (0.860) | 5 (0.910) | 76.536
Organizational performance results
  Student learning results | 10 (0.879) | 5 (0.906) | 84.883
  Student- and stakeholder-focused results | 9 (0.885) | 4 (0.901) | 89.988
  Budgetary, financial and market results | 10 (0.884) | 5 (0.912) | 88.443
  Faculty and staff results | 9 (0.883) | 5 (0.915) | 88.711
  Organizational effectiveness results | 12 (0.868) | 6 (0.895) | 76.750
  Governance and social responsibility results | 9 (0.859) | 5 (0.911) | 77.383

Four specific research hypotheses were formulated to test the directional relationships between leadership and the four system dimensions:

H1. Leadership has a positive influence on Process Management.

H2. Leadership has a positive influence on Faculty and Staff Focus.

H3. Leadership has a positive influence on Strategic Planning.

H4. Leadership has a positive influence on Measurement, Analysis, and Knowledge Management.

Two specific research hypotheses were formulated to test the directional relationships between leadership and the two results dimensions:

H5. Leadership has a positive influence on Student, Stakeholder, and Market Focus.

H6. Leadership has a positive influence on Organizational Performance Results.

Eight hypotheses were formulated to test the directional relationship between each of the system dimensions and each of the two results dimensions, as listed below:

H7. Process Management has a positive influence on Student, Stakeholder, and Market Focus.

H8. Process Management has a positive influence on Organizational Performance Results.

H9. Faculty and Staff Focus has a positive influence on Student, Stakeholder, and Market Focus.

H10. Faculty and Staff Focus has a positive influence on Organizational Performance Results.

H11. Strategic Planning has a positive influence on Student, Stakeholder, and Market Focus.

H12. Strategic Planning has a positive influence on Organizational Performance Results.

H13. Measurement, Analysis, and Knowledge Management has a positive influence on Student, Stakeholder, and Market Focus.

H14. Measurement, Analysis, and Knowledge Management has a positive influence on Organizational Performance Results.

Additionally, six within-system hypotheses were formulated to test the Baldrige theory that management systems should be "built upon a framework of measurement, information and data, and analysis" (Meyer and Collier, 2001; NIST, 1995, p. 4):


H15. Measurement, Analysis, and Knowledge Management has a positive influence on Strategic Planning.

H16. Measurement, Analysis, and Knowledge Management has a positive influence on Faculty and Staff Focus.

H17. Measurement, Analysis, and Knowledge Management has a positive influence on Process Management.

H18. Strategic Planning has a positive influence on Process Management.

H19. Strategic Planning has a positive influence on Faculty and Staff Focus.

H20. Faculty and Staff Focus has a positive influence on Process Management.

Finally, the last hypothesis tested the Baldrige theory that improving internal capabilities and organizational performance results leads to improved external performance (customer satisfaction):

H21. Organizational Performance Results has a positive influence on Student, Stakeholder, and Market Focus.

Each of these 21 hypothesized relationships was supported by the general theory that "Leadership drives the system which creates results". The general theory guided our assumption of a recursive causal model and the direction of each of the specific hypotheses.

The review of literature indicated that there was only one study that dealt with the MBNQA categories in higher education: the Winn and Cameron (1998) study. They empirically examined the relationships between the MBNQA categories using data from higher education, administering a 190-item survey based on the MBNQA criteria to all permanent non-instructional staff at a large Midwestern university. Factor analysis indicated that the seven categories were reliable and valid. Winn and Cameron used confirmatory path analysis to determine if the relationships between categories suggested by the MBNQA framework were supported. Results from the LISREL analysis indicated that not all of the relationships in the framework were supported. As a result, they generated a modified model that fit the data very well. In this study, we used Winn and Cameron's (1998) modified model to test whether the data collected here fit that model as well. Thus, we stated the following hypothesis:

H22. The Winn and Cameron (1998) modified model will provide good fit statistics using the current data.

Analysis methods
To test hypotheses H1 to H21, two different procedures were used; both examined the relationships among the MBNQA dimensions. First, multiple regression analysis examined the relationships among the dimensions individually. Second, structural equation modeling examined the predicted relationships among all dimensions in the overall framework together (capturing the integrative direct and indirect effects). Structural equation modeling was also used to test H22.


Structural equation modeling consists of two components, a measurement model and a structural model (Hair et al., 1995; Hoyle, 1995; Bollen and Long, 1993; Bollen, 1989). The measurement model includes the relationships between the dimensions (Baldrige subcategories) and the questionnaire items (indicators) that operationalize measurement of those dimensions. For this study, the measurement model included the 33 categories of the Baldrige Educational Criteria for Performance Excellence and the 141 questionnaire items (see Appendix) that comprise the measurement scales for the categories. The results of statistical tests for the structural model are valid only if the measurement model uses reliable scales that accurately measure the content of the MBNQA Educational Criteria for Performance Excellence. The structural model consisted of the relationships that link the Baldrige dimensions to their respective categories, as well as the dependent causal relationships that link the seven Baldrige dimensions to one another.

In addition to testing the first 21 hypotheses, the structural equation model also served as a test of theory verification of the Baldrige Education Criteria for Performance Excellence framework. To test the Winn and Cameron (1998) model (H22), structural equation modeling was also used.
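As an illustration only (not the authors' original code), a path model of this kind can be specified in lavaan-style syntax and estimated with the open-source semopy package in Python; the variable names below are hypothetical stand-ins for the category scores, and the paths shown are a reduced subset of the hypothesized relationships.

```python
import pandas as pd
import semopy  # pip install semopy

# Reduced, illustrative subset of the hypothesized structural (path) model;
# variable names are hypothetical column names for dimension scores.
model_desc = """
StrategicPlanning ~ Leadership
ProcessManagement ~ Leadership + StrategicPlanning
Results ~ Leadership + ProcessManagement
"""

def fit_sem(data: pd.DataFrame):
    """Fit the path model to a DataFrame whose columns match the variable names above."""
    model = semopy.Model(model_desc)
    model.fit(data)                       # maximum-likelihood estimation by default
    estimates = model.inspect()           # path coefficients and p-values
    fit_stats = semopy.calc_stats(model)  # chi-square, RMSEA, CFI, and related indices
    return estimates, fit_stats
```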

Scale reliabilities
The internal consistency method was used to test the reliability of the research constructs. As suggested by Nunnally (1967), the coefficient alpha developed by Cronbach (1951) was used to test for internal consistency. A Cronbach alpha value of 0.70 is considered the criterion for internal consistency for established scales (Nunnally, 1967). Although the Cronbach alpha values for these constructs were acceptable, we decided that a more conservative measure of reliability should be calculated to confirm that these constructs were reliable (Pannirselvam and Ferguson, 2001). Therefore, the amount of variance captured by each category in relation to the amount of variance due to measurement error was also calculated for each construct (a method suggested by Fornell and Larcker, 1981).
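For reference (a standard formulation, not reproduced from the paper), the Fornell and Larcker (1981) variance-extracted measure for a construct with k items, standardized loadings \(\lambda_i\), and error variances \(\mathrm{Var}(\varepsilon_i)\) is

\[ \rho_{vc} = \frac{\sum_{i=1}^{k} \lambda_i^{2}}{\sum_{i=1}^{k} \lambda_i^{2} + \sum_{i=1}^{k} \mathrm{Var}(\varepsilon_i)} \]

Values above 0.50 indicate that a construct captures more variance than measurement error, which is the benchmark applied in the results reported below.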

Scale unidimensionality, which was tested and confirmed for each scale, was evaluated in the main study data set using the Carmines and Zeller (1979) guidelines. These guidelines have also been recommended by other researchers dealing with the psychometric properties of scales (Meyer and Collier, 2001). The percent of variance explained by the first principal component of each measurement scale is given in Table I, addressing the Carmines and Zeller (1979) criterion that the first component of each scale should explain more than 40 percent of the variance in the items. These results show that the scales meet the Carmines and Zeller (1979) criterion. The two remaining criteria (a large eigenvalue for the first component and small, fairly equal eigenvalues for subsequent components) were also evaluated and upheld in the main study data set. Principal component analysis was used to reduce the item responses to a single score for each of the 33 Baldrige categories. In this case, the first component score for each category was used in subsequent analysis.
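A minimal sketch of this unidimensionality check and score reduction, using scikit-learn on standardized item responses (invented placeholder data, not the study's), might look as follows:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Invented placeholder matrix: rows are respondents, columns are the items of one scale
items = np.array([
    [6, 6, 5, 6],
    [7, 7, 6, 7],
    [4, 5, 4, 5],
    [6, 5, 6, 6],
    [5, 5, 5, 4],
    [7, 6, 7, 7],
])

z = StandardScaler().fit_transform(items)   # standardize items before PCA
pca = PCA().fit(z)

# Carmines and Zeller-style check: first component should explain > 40% of item variance
first_pct = 100 * pca.explained_variance_ratio_[0]
print(f"Variance explained by first component: {first_pct:.1f}%")

# Reduce the items to a single category score (first principal component score)
category_score = pca.transform(z)[:, 0]
print(category_score)
```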

Results
Scale reliabilities
The reliability of each of the 33 scales (categories) used in this study was re-evaluated based on the main study data set. Cronbach's alpha values for the 33 measurement scales ranged from 0.857 to 0.944, exceeding guidelines for adequate reliability (Nunnally, 1967; Flynn et al., 1990; Meyer and Collier, 2001), as shown in Table I (before and after dropping certain items). The values were well above the minimum recommended value of 0.70. The average variance explained by each factor was greater than 50 percent, indicating that the variance captured by each construct was greater than the variance due to measurement error (Fornell and Larcker, 1981).

Regression analysis: relationships among dimensions
The MBNQA Education Criteria for Performance Excellence framework (shown in Figure 1) assumes that the following relationships exist:

. A direct relationship exists between leadership and the four system dimensions of measurement, analysis, and knowledge management; strategic planning; process management; and faculty and staff focus.

. A direct relationship exists between leadership and the two outcome dimensions of student, stakeholder, and market focus and organizational performance results.

. A direct relationship exists between the four system dimensions (measurement, analysis, and knowledge management; strategic planning; process management; and faculty and staff focus) and the two outcome dimensions (student, stakeholder, and market focus, and organizational performance results).

To assess the validity of the framework's assumptions, three sets of regression analyses were conducted. The first regressed each of the four system dimensions on the leadership dimension. The standardized regression coefficients produced by this analysis are reported in Table II. The relationship between the leadership dimension and each of the system dimensions was strong and statistically significant; the assumed relationship between an organization's leadership and each of the quality processes is definitely supported. Table III reports the relationships between the leadership dimension and the two outcome dimensions and between the system dimensions (individually) and the outcome dimensions. When each of the outcome dimensions was regressed on the leadership dimension, the resulting relationships were also significant. That is, the regression analysis revealed that the leadership dimension had a statistically significant effect on organizational performance results and on student, stakeholder, and market focus. In summary, leadership significantly influenced the organization's systems and outcomes. Further results indicated that all four system dimensions (individually) had relatively strong and statistically significant effects on the two outcome dimensions.

Table II. Regression of the four system dimensions on leadership (predictor dimension: leadership)

Statistic | Strategic planning | Process management | Faculty and staff focus | Measurement, analysis, and knowledge management
Adjusted R2 | 0.764 | 0.788 | 0.724 | 0.786
Beta | 0.874 | 0.888 | 0.851 | 0.887
p | < 0.000 | < 0.000 | < 0.000 | < 0.000

Next, we ran two sets of multiple regressions in which the two outcome dimensions were the dependent variables and the four system dimensions were the independent variables (see Table IV). The four system dimensions collectively had relatively strong and statistically significant effects on the outcome dimensions. They accounted for approximately 84 percent of the variation in the student, stakeholder, and market focus dimension and approximately 93 percent of the variation in the organizational performance results dimension. The exceptions were relatively weak relationships between the student, stakeholder, and market focus dimension and both the strategic planning dimension and the faculty and staff focus dimension; these relationships were not statistically significant. In summary, the regression analyses showed that leadership had a significant effect on the four system dimensions and the outcome dimensions. In turn, the system dimensions had a significant effect on the outcome dimensions. The direct effects of leadership on organizational outcomes assumed in the MBNQA framework were therefore supported.

Table III. Regression results of the two outcome dimensions on the driver (leadership) and the four system dimensions (individually)

Predictor dimension                                   Student, stakeholder,   Organizational
                                                      and market focus        performance results
Leadership                         Adjusted R2        0.870                   0.632
                                   Beta (b)           0.933                   0.796
                                   p                  <0.000                  <0.000
Strategic planning                 Adjusted R2        0.830                   0.636
                                   Beta (b)           0.911                   0.798
                                   p                  <0.000                  <0.000
Process management                 Adjusted R2        0.819                   0.819
                                   Beta (b)           0.906                   0.905
                                   p                  <0.000                  <0.000
Faculty and staff focus            Adjusted R2        0.724                   0.780
                                   Beta (b)           0.851                   0.884
                                   p                  <0.000                  <0.000
Measurement, analysis, and         Adjusted R2        0.925                   0.608
knowledge management               Beta (b)           0.962                   0.781
                                   p                  <0.000                  <0.000

Table IV. Multiple regression results of the two outcome dimensions on the four system dimensions

                                                   Student, stakeholder,          Organizational
                                                   and market focus               performance results
Independent variables                              Beta (b)    t        Sig.      Beta (b)    t        Sig.
Strategic planning                                 -0.135     -1.465    0.144      0.133      2.236    0.000
Process management                                  1.257      8.973    0.000      0.448      4.953    0.000
Faculty and staff focus                             0.058      0.474    0.636     -0.374     -4.760    0.000
Measurement, analysis, and knowledge management    -0.306     -3.430    0.001      0.754     13.105    0.000
Adjusted multiple R2                                0.839                          0.933
F test                                              284.601 (p < 0.000)            757.277 (p < 0.000)
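The regression analyses summarized in Tables II-IV can be reproduced along the following lines; this is a hedged sketch using simulated category scores and illustrative column names, not the study's data set. Standardizing the columns before fitting would yield the standardized beta coefficients reported in the tables.

import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated stand-in for the composite category scores (220 respondents)
rng = np.random.default_rng(2)
n = 220
leadership = rng.normal(size=n)
df = pd.DataFrame({
    "leadership": leadership,
    "strategic_planning": 0.9 * leadership + rng.normal(scale=0.5, size=n),
    "process_management": 0.9 * leadership + rng.normal(scale=0.5, size=n),
    "faculty_staff_focus": 0.85 * leadership + rng.normal(scale=0.5, size=n),
    "measurement_analysis_km": 0.9 * leadership + rng.normal(scale=0.5, size=n),
})
df["student_stakeholder_market_focus"] = (0.6 * df["process_management"]
                                          + 0.3 * df["measurement_analysis_km"]
                                          + rng.normal(scale=0.4, size=n))

def regress(data, y_col, x_cols):
    # Ordinary least squares with an intercept term
    X = sm.add_constant(data[x_cols])
    return sm.OLS(data[y_col], X).fit()

# Simple regressions of each system dimension on leadership (cf. Table II)
for dim in ["strategic_planning", "process_management", "faculty_staff_focus", "measurement_analysis_km"]:
    fit = regress(df, dim, ["leadership"])
    print(dim, "adjusted R2 =", round(fit.rsquared_adj, 3))

# Multiple regression of one outcome on the four system dimensions (cf. Table IV)
fit = regress(df, "student_stakeholder_market_focus",
              ["strategic_planning", "process_management", "faculty_staff_focus", "measurement_analysis_km"])
print(fit.params.round(3))
print(fit.pvalues.round(4))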

The Baldrige Education Criteria for Performance Excellence model fit
The root mean square error of approximation (RMSEA) is a measure of model fit that is not dependent on sample size (Hair et al., 1995; Browne and Mels, 1994; Steiger, 1990). Many other fit measures (e.g. chi-square, goodness of fit index) are highly dependent on sample size. The following guidelines were used to determine model fit using RMSEA: RMSEA < 0.05, good model fit; 0.05 < RMSEA < 0.10, reasonable model fit; RMSEA > 0.10, poor model fit (Browne and Mels, 1994, pp. 86-87; Browne and Cudeck, 1993). The computed RMSEA value for the model was 0.057, indicating a reasonable model fit.
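For reference, a small sketch of this RMSEA guideline applied to a fitted model; it uses the common point-estimate formula RMSEA = sqrt(max(chi2 - df, 0) / (df * (N - 1))), the arguments in the example call are placeholders, and SEM packages may use slightly different conventions.

from math import sqrt

def rmsea(chi2, df, n):
    # Point-estimate RMSEA = sqrt(max(chi2 - df, 0) / (df * (n - 1)))
    return sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

def interpret_rmsea(value):
    # Guidelines from Browne and Mels (1994) and Browne and Cudeck (1993)
    if value < 0.05:
        return "good model fit"
    if value < 0.10:
        return "reasonable model fit"
    return "poor model fit"

value = rmsea(chi2=850.0, df=476, n=220)   # placeholder chi-square, not the study's software output
print(f"RMSEA = {value:.3f} ({interpret_rmsea(value)})")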

The overall fit of the model can be tested using the chi-square (χ2) statistic. For a good fit, the χ2 value should be low and non-significant. The χ2 value for the model was 1342.32, which was significant (p = 0.0). This would suggest that the model was not confirmed by the sample data. The significance level of χ2, however, is sensitive to sample size and multivariate normality. Therefore, other indicators of fit, such as χ2/df, Bentler's (1990) comparative fit index (CFI), Joreskog and Sorbom's (1993) goodness of fit index (GFI), Bollen's (1989) incremental fit index (IFI), and the non-normed fit index (NNFI), which correct for these factors, should also be used to assess the adequacy of the model (Joreskog and Sorbom, 1993). With 476 degrees of freedom, the χ2/df ratio is 2.82, which was less than the ratio of five suggested in the literature. All the measures of goodness of fit for the model tested were above the desired 0.9 level. The CFI, IFI, NNFI and GFI were 0.94, 0.91, 0.92, and 0.90, respectively. These fit indices indicated an acceptable fit between the model and the data (Bollen, 1989).
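The incremental fit indices mentioned above can be computed from the chi-square statistics of the tested model and of the independence (null) model; the sketch below uses the standard CFI and NNFI formulas, with a placeholder null-model chi-square because that value is not reported in the text.

def fit_indices(chi2_m, df_m, chi2_0, df_0):
    # chi2_m, df_m: tested model; chi2_0, df_0: independence (null) model
    ratio = chi2_m / df_m                                                 # commonly expected to fall below 5
    d_m = max(chi2_m - df_m, 0.0)
    d_0 = max(chi2_0 - df_0, 0.0)
    cfi = 1.0 - d_m / max(d_m, d_0)                                       # Bentler (1990) comparative fit index
    nnfi = ((chi2_0 / df_0) - (chi2_m / df_m)) / ((chi2_0 / df_0) - 1.0)  # non-normed (Tucker-Lewis) index
    return ratio, cfi, nnfi

# Reported model chi-square and df, with a hypothetical null-model chi-square for illustration only
ratio, cfi, nnfi = fit_indices(chi2_m=1342.32, df_m=476, chi2_0=15000.0, df_0=528)
print(round(ratio, 2), round(cfi, 2), round(nnfi, 2))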

The standardized path coefficients for the set of causal relationships are presented in Figure 2. We noticed that all paths were significant at the 0.01 or the 0.05 level. Table V shows the results of model estimation, including path estimates, standard errors, and results of t-tests for the significance of the paths. A two-tailed t-test was performed on each path estimate to evaluate its statistical significance. The results of testing the research hypotheses provided empirical support for all of the causal relationships in the Baldrige Education Criteria for Performance Excellence model. However, the level of significance differed from one path to another. Hypotheses H1 through H4 addressed a causal influence of leadership on each of the system categories. We noticed that leadership had a great influence on these four categories (path estimates varied from 0.60 to 0.75). The support of hypotheses H1 to H4 indicated that leadership is an overall driver of strategic planning, process management, faculty and staff focus, and measurement, analysis and knowledge management in higher education.
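One way to specify the hypothesized paths (H1 through H21) as a path model over composite category scores is sketched below with the third-party semopy package; the data are simulated stand-ins, the variable names are illustrative, and the published estimates were produced with dedicated SEM software rather than this exact specification. A path's t-value, as reported in Table V, is approximately its estimate divided by its standard error.

import numpy as np
import pandas as pd
import semopy

# Simulated composite scores loosely consistent with the hypothesized structure (220 respondents)
rng = np.random.default_rng(3)
n = 220
ld = rng.normal(size=n)
mk = 0.75 * ld + rng.normal(scale=0.6, size=n)
sp = 0.60 * ld + 0.40 * mk + rng.normal(scale=0.6, size=n)
fs = 0.60 * ld + 0.40 * mk + 0.30 * sp + rng.normal(scale=0.6, size=n)
pm = 0.65 * ld + 0.40 * mk + 0.30 * sp + 0.40 * fs + rng.normal(scale=0.5, size=n)
opr = 0.50 * ld + 0.70 * mk + 0.25 * sp + 0.25 * fs + 0.25 * pm + rng.normal(scale=0.5, size=n)
ssf = 0.20 * ld + 0.40 * mk + 0.50 * sp + 0.50 * fs + 0.80 * pm + 0.60 * opr + rng.normal(scale=0.5, size=n)
df = pd.DataFrame({"leadership": ld, "measurement_km": mk, "strategic_planning": sp,
                   "faculty_staff_focus": fs, "process_management": pm,
                   "org_performance_results": opr, "student_stakeholder_focus": ssf})

# Structural paths corresponding to H1-H21 (regressions among observed composites)
desc = """
measurement_km ~ leadership
strategic_planning ~ leadership + measurement_km
faculty_staff_focus ~ leadership + measurement_km + strategic_planning
process_management ~ leadership + measurement_km + strategic_planning + faculty_staff_focus
org_performance_results ~ leadership + measurement_km + strategic_planning + faculty_staff_focus + process_management
student_stakeholder_focus ~ leadership + measurement_km + strategic_planning + faculty_staff_focus + process_management + org_performance_results
"""

model = semopy.Model(desc)
model.fit(df)
print(model.inspect())   # path estimates, standard errors, and p-values (t is roughly estimate / standard error)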

Figure 2. The general Baldrige model (Education)

We also noted that leadership had a significant influence on both outcome categories, with path estimates of 0.22 (student, stakeholder and market focus) and 0.54 (organizational performance results). These values gave support to hypotheses H5 and H6.

The results of testing the research hypotheses also provided empirical support for the influence of the four system categories on both outcome categories. The highest path estimate of 0.81 reflected the significant influence of process management on student, stakeholder and market focus. Thus, support was given to the next eight hypotheses, H7 through H14.

The influences of the system categories on each other were also evident from the path estimates and significance levels. Thus, hypotheses H15 to H20 were also supported.

The last hypothesis in the Baldrige framework, dealing with the two outcome categories, was supported as well. Results showed that organizational performance results positively affected student, stakeholder and market focus. In summary, considering the regression and structural equation model results, it was possible to conclude that hypotheses H1 to H21 were all supported.

Table V. Path estimates for the structural model

Hypothesis  Path                                                                                     Point estimate  t-value      Standard error
H1          Leadership -> Process management                                                         0.67            19.50091**   0.035
H2          Leadership -> Faculty and staff focus                                                    0.62            16.2695**    0.038
H3          Leadership -> Strategic planning                                                         0.60            14.77693**   0.041
H4          Leadership -> Measurement, analysis and knowledge management                             0.75            23.47116**   0.032
H5          Leadership -> Student, stakeholder and market focus                                      0.22            2.418208*    0.090
H6          Leadership -> Organizational performance results                                         0.54            11.29924**   0.048
H7          Process management -> Student, stakeholder and market focus                              0.81            27.94886**   0.029
H8          Process management -> Organizational performance results                                 0.26            3.203337*    0.082
H9          Faculty and staff focus -> Student, stakeholder and market focus                         0.57            12.23809**   0.047
H10         Faculty and staff focus -> Organizational performance results                            0.27            3.449619*    0.078
H11         Strategic planning -> Student, stakeholder and market focus                              0.58            14.28437**   0.041
H12         Strategic planning -> Organizational performance results                                 0.25            3.057055*    0.082
H13         Measurement, analysis and knowledge management -> Student, stakeholder and market focus  0.44            7.94642**    0.055
H14         Measurement, analysis and knowledge management -> Organizational performance results     0.70            21.23975**   0.033
H15         Measurement, analysis and knowledge management -> Strategic planning                     0.44            7.83642**    0.056
H16         Measurement, analysis and knowledge management -> Faculty and staff focus                0.43            7.29013**    0.059
H17         Measurement, analysis and knowledge management -> Process management                     0.42            6.74385**    0.062
H18         Strategic planning -> Faculty and staff focus                                            0.33            5.627313**   0.059
H19         Strategic planning -> Process management                                                 0.32            5.18103**    0.062
H20         Faculty and staff focus -> Process management                                            0.62            16.2699**    0.038
H21         Organizational performance results -> Student, stakeholder and market focus              0.59            13.73065**   0.043

Notes: *path significant at p < 0.05; **path significant at p < 0.01


The Winn and Cameron model fit
Winn and Cameron (1998) found that their research did not validate the Baldrige framework. As a result, they modified the framework and used exploratory analysis to suggest an alternative, statistically significant model. The alternative framework included the direct effects of leadership on each of the four system categories, the direct effects of strategic quality planning on management of process quality and on customer focus and satisfaction, the direct effect of human resource development and management on quality and operational results, and the direct effects of management of process quality on customer focus and satisfaction and on quality and operational results.

On the other hand, this alternative framework took into account the lack of direct effects on quality and operational results from leadership, information and analysis, and strategic quality planning, and the lack of direct effects on customer focus and satisfaction from leadership, information and analysis, and human resource development and management. However, it also recognized the indirect effects of leadership and the direct effects of information and analysis, strategic quality planning, and human resource development and management on the outcome variables. Based on its ability to account for these predictive relationships and the fact that it had an acceptable goodness of fit with the data, the plausibility of the alternative framework (modified model) was supported in Winn and Cameron's (1998) study. They concluded that leadership affected the outcomes through effects mediated by the organizational systems.

After fitting the Winn and Cameron model to the current data, all of the measures of goodness of fit were above the desired 0.9 level, except for one. The CFI, IFI, NNFI and GFI were 0.94, 0.92, 0.91, and 0.89, respectively. These fit indices indicated an acceptable fit between the model and the data (Bollen, 1989). The standardized path coefficients for the set of causal relationships are presented in Figure 3. We noticed that all paths were significant at the 0.01 or the 0.05 level. These results provided support for the acceptance of H22. It is interesting to note that our analysis provided evidence to confirm the validity of the original Baldrige criteria (2004). The differences between Winn and Cameron's results and the results obtained in this study could be partially explained by differences in the samples studied.

Figure 3. The Winn and Cameron (1998) model with current data

Discussion
The major finding of this research related to the role of leadership in the Baldrige Education Criteria for Performance Excellence. Leadership has a direct causal influence on each of the components of the Baldrige System: process management, faculty and staff focus, strategic planning, and measurement, analysis and knowledge management. Leadership causes direct positive changes in each of the Baldrige System categories. This result confirmed the Baldrige theory that leadership drives the System. These results corresponded to previous research (see Meyer and Collier, 2001; Winn and Cameron, 1998; Belohlav et al., 2004; Ford and Evans, 2000; Goldstein and Schweikhart, 2002; Handfield and Ghosh, 1995; Wilson, 1997).

The study showed that leadership was the most important enabler for achieving educational performance excellence. We assumed that effective leadership modulated the implementation of performance excellence in universities and colleges. Senior leaders have a significant influence on, and the ability to make changes to, the educational system. Thus, their role is crucial. Leadership must guide every system, strategy, and process for achieving excellence. However, there are several enablers of quality and performance excellence in higher education: strategic planning; faculty and staff focus; student, stakeholder and market focus; process management; and measurement, analysis and knowledge management. These enablers influenced six outcomes: student learning results; student and stakeholder results; budgetary, financial and market results; faculty and staff results; organizational effectiveness results; and governance and social responsibility results.

Our research also showed evidence of an important causal relationship from leadership to measurement, analysis and knowledge management. The influence of leadership on measurement, analysis and knowledge management (0.75) was relatively stronger than leadership's influence on the other system categories of process management, faculty and staff focus, and strategic planning (0.67, 0.62, and 0.60, respectively).

The stronger influence of leadership on measurement, analysis and knowledge management was also addressed in other empirical studies (Meyer and Collier, 2001). This means that the leaders of quality-driven institutions recognized the critical role of university information systems in providing systems of measurement, information, and data analysis.

This study also showed that leadership's role in a university's quality management system was both direct and indirect: direct, as indicated by the significant paths from leadership to organizational performance results and to student, stakeholder and market focus; and indirect, as leadership influenced outcomes through the four system categories of measurement, analysis and knowledge management, process management, strategic planning, and faculty and staff focus.

In other industries, such as healthcare, Meyer and Collier (2001) did not find any support for direct effects of leadership on customer and stakeholder satisfaction. Likewise, in manufacturing, Handfield and Ghosh (1995) and Wilson (1997) did not find direct linkages between leadership and the outcome categories. In the education industry, our study showed that measurement, analysis and knowledge management was a driver of within-system performance, with a significant causal influence on each of the other system categories: strategic planning, faculty and staff focus, and process management. These relationships identified measurement, analysis and knowledge management as the critical link in the Baldrige System.

A comparison of within-system causal linkages for the published Baldrige Education model showed that we clarified the direction and strength of causation within the Baldrige System. These results corresponded with those of Wilson (1997) and Wilson and Collier (2000) for manufacturing firms, Winn and Cameron (1998) for education institutions, and Meyer and Collier (2001) for healthcare. The statistically significant causal influence of measurement, analysis and knowledge management on the other system categories supported the Baldrige theory that an effective organization needs to be built upon a framework of measurement, information, data, and analysis (NIST, 2004). Hence, university colleges, departments, administrative units and other systems must be linked by an effective information system, and this was reflected by the significant linkage of measurement, analysis and knowledge management to the other system categories. We also noted that measurement, analysis and knowledge management had a direct causal influence on both outcomes, organizational performance results and student, stakeholder and market focus. This relationship indicated that the effective use of measurement, information, and data, all addressed in the Baldrige Criteria, represented key assets in organizational performance (Meyer and Collier, 2001; Winn and Cameron, 1998).

Results showed that faculty and staff development and satisfaction had a positive causal influence on student and stakeholder satisfaction. The research also found an important causal relationship from Baldrige process management to student and stakeholder satisfaction in UAE higher education (the strongest link, with a path estimate of 0.81). These results provided evidence that the design and delivery of educational and non-educational processes are critical to student and stakeholder satisfaction and should be managed from their perspectives.

Organizational performance results had a positive causal influence on student, stakeholder, and market focus. This performance relationship supported the Baldrige theory that improving internal capabilities and performance results leads to improved external performance (Meyer and Collier, 2001; Collier, 1991; Collier and Wilson, 1997). The results of this research provide impetus for senior leaders in higher education to focus on improving faculty and staff resources and process management, both of which had a direct causal influence on customer satisfaction, and to strive for improved internal performance outcomes that also help to create improved customer satisfaction.

Strategic planning had a statistically significant causal influence on both of the two outcome categories. This result was in contrast to the outcomes of other studies performed in healthcare (Meyer and Collier, 2001; Wilson and Collier, 2000), which found that strategic planning did not exert any significant causal influence on the focus and satisfaction of patients and other stakeholders. In healthcare, it may be difficult for some hospitals to develop and deploy strategic plans because authorities are uncertain what to include in the mission statement (Meyer and Collier, 2001; Gibson et al., 1990; Calem and Rizzo, 1995). Our results reflected the fact that higher education institutions are usually under pressure to obtain accreditation for their programs and offerings. Most international accrediting agencies, such as ABET (engineering programs) or AACSB (business programs), require universities to develop clear and specific strategies toward educational excellence. Moreover, measures and indicators of performance excellence in higher education may differ from those of healthcare organizations. However, it might be more realistic if performance results were not compiled into a single construct. In higher education, there are many categories of performance results and outcomes, each of which might address a certain aspect of the educational process (e.g. curriculum, delivery methods, teaching, advising, research, job placements, campus life). Similar arguments were also made by other authors (e.g. Meyer and Schweikhart, 2002; Meyer and Collier, 2001).

Conclusions and implications
Before discussing conclusions and implications, the limitations of the study should be acknowledged. The use of self-reported information is always a concern in studies of this nature. In addition, even though the sample size of this study (220) was consistent with recommendations given by Anderson and Gerbing (1988), it was marginal according to recommendations given by Hoelter (1983) and Hair et al. (1995). Considering the size of the theoretical models tested in this study, the sample was just within the range of acceptability. Despite these shortcomings, the results provided useful insights for administrators in higher education institutions and for researchers.

This study used a confirmatory structural equation modeling and testing approach to empirically validate many of the causal relationships in the MBNQA Education model. The research empirically tested the Baldrige education framework, namely that there is a significant relationship between the leadership, systems, and processes of higher education organizations and the consequent outcomes. Specifically, this study focused on determining the extent to which higher education results are explained by the Baldrige Criteria. By providing empirical evidence of the nature of the relationships between what organizations do and the results they achieve, this study offered decision makers, managers, and researchers evidence that the Baldrige framework is a useful tool for developing and managing quality systems in institutions of higher education.

The research aimed at exploring the nature of educational quality at UAE higher education institutions. It designed and presented a reliable and valid self-assessment tool for higher education based mainly on the Baldrige Education Criteria for Performance Excellence, which are recognized as embodying the most comprehensive set of quality concepts. Through the survey results, institutions of higher education, or individual schools wishing to undertake TQM programs, are able to diagnose their quality status, identify their strengths and weaknesses, and develop action plans after performing a thorough cost-benefit analysis.

The Malcolm Baldrige Education Criteria for Performance Excellence encourage higher education organizations to address quality across a broad range of issues. Universities and colleges that wish to compare equitably with Award winners must produce evidence of leadership and long-term planning, initiate verifiable quality control procedures, address the happiness and well-being of faculty and staff and, above all, work toward student and stakeholder satisfaction and market focus. The criteria argue strongly for customer-driven organizations, high levels of employee involvement, and information-based management. Many universities could utilize the criteria as a framework for implementing a quality program and establishing benchmarks for measuring future progress.

Implications for senior administrators in higher education institutions
The results provided insight for higher education leaders into the dominant role leadership plays in the effective implementation of quality management systems. Strong support of quality initiatives from senior-level management has long been cited as the starting point for an organization's quest to achieve a quality-driven culture. These results corresponded with Winn and Cameron's (1998) finding that strong support by senior administrators was an accelerator in the implementation of quality initiatives at educational institutions.

The results of this study have some important implications for senior leaders in institutions of higher education. Many institutions want to improve the quality of their programs, offerings, and services, but they might be uncertain as to which quality philosophy is the best one to use. Some higher education institutions might focus on the philosophy of a single quality guru in planning their improvement process. Often, these philosophies provide sound principles for senior leaders involved in quality improvements, but they seldom provide a comprehensive system for the measurement and evaluation of quality efforts at the organizational level.


A second implication for managers is drawn from our construct validity analysis. From our structural equation modeling analysis, we found that each of the items was an important part of a representative category. We also noted that all seven categories were correlated with each other. The fact that each category was correlated with the others indicates that quality improvement efforts concentrated on only one or a few of these categories would be less effective. Senior leaders will need to plan and execute a concerted effort on several fronts to achieve world-class quality education.

One of the most important results of this study was the presentation of a reliable and valid instrument based on the Baldrige Education Criteria for Performance Excellence. This instrument could be utilized by senior leaders at institutions of higher education as a self-assessment tool. Self-assessment is important because it helps institutions of higher education to define their quality system and to select student, stakeholder, and market focus quality objectives. A major motivation in developing the survey instrument we used was to make it simple enough to assist senior leaders in higher education in conducting internal MBNQA Education Excellence self-assessments.

In particular, the tested model provided guidelines on how to proceed with a quality improvement strategy in higher education. Assuming that committed and effective leadership is in place, the first step is to gather and utilize information on the internal and external environments. The model indicated that this information should feed the development of a strategic quality plan, which in turn guides the design and development of a faculty and staff management system as well as a set of organizational processes focused on quality. The design of these organizational processes should form a base because they are the most important elements influencing the core outcome dimensions of student, stakeholder and market focus and university performance results.

Publicizing the use of the MBNQA Education Criteria for Performance Excellence is one way of raising awareness of quality management in institutions of higher education in the UAE (and in comparable institutions in the Gulf Cooperation Council). It would help identify areas for improvement. When pursuing customer-focused and market-driven quality strategies, these criteria and standards can also provide reference points for higher education organizations. Finally, all levels of faculty and staff in a college or university might take the initiative to fulfill their different needs for education and training in quality management. To further the quality movement in higher education, senior leaders should take a leading role in promoting contemporary, strategic quality management concepts and practices. Likewise, they should play an active role in the UAE's efforts to improve the quality of the educational system.

Implications for researchers
As mentioned, the organizational performance results dimension is conceptually broad, measuring many facets of internal and external university/college performance. We found it extremely difficult to identify general items to capture the two outcome components of the Baldrige framework in education. Higher education organizations deal with many different issues and priorities. More research on specific outcomes for different facets of higher education, for both internal and external customers, is needed to identify specific and clear measures or indicators of performance and satisfaction (e.g. student segments, disciplines, majors, research, administration, campus life, job placement, alumni activities, interdepartmental links, accreditation).


Future research can improve upon our findings by evaluating other educational units using different samples at other educational organizations, such as public and private school systems. Another plausible direction for future research is to test the model across different cultures (countries); this would make the Baldrige Education Criteria for Performance Excellence more generalizable. Our research adds to the rich body of endeavors to find the best causal model of organizational performance for higher education. Other research should reinvestigate the general model tested here and explore other competing (alternative) models.

References

Anderson, J.C. and Gerbing, D.W. (1988), "Structural equation modeling in practice: a review and recommended two-step approach", Psychological Bulletin, Vol. 103 No. 3, pp. 411-23.
Arif, M. and Smiley, F. (2004), "Baldrige theory in practice: a working model", International Journal of Educational Management, Vol. 18 No. 5, pp. 324-8.
Athiyaman, A. (1997), "Linking student satisfaction and service quality perceptions: the case of university education", European Journal of Marketing, Vol. 31 No. 7, pp. 528-40.
Badri, M. and Abdulla, M. (2004), "Awards of excellence in institutions of higher education: an AHP approach", International Journal of Educational Management, Vol. 18 No. 4, pp. 224-42.
Belohlav, J., Cook, L. and Heiser, D. (2004), "Using the Malcolm Baldrige National Quality Award in teaching: one criterion, several perspectives", Decision Sciences Journal of Innovation Education, Vol. 2 No. 2, pp. 153-76.
Bentler, P. (1990), "Comparative fit index in structural models", Psychological Bulletin, Vol. 107 No. 2, pp. 238-46.
Bigelow, B. and Arndt, M. (1995), "Total quality management: field of dreams?", Health Care Management Review, Vol. 20 No. 4, pp. 15-25.
Bigelow, B. and Arndt, M. (2000), "The more things change, the more they stay the same", Health Care Management Review, Vol. 25 No. 1, pp. 65-72.
Bollen, K. (1989), Structural Equations with Latent Variables, Wiley, New York, NY.
Bollen, K. and Long, J. (1993), Testing Structural Equation Models, Sage Publications, Newbury Park, CA.
Bourner, T. (1998), "More knowledge, new knowledge: the impact of education and training", Education+Training, Vol. 40 No. 1, pp. 11-14.
Browne, M. and Cudeck, R. (1993), "Alternative ways of assessing model fit", in Bollen, K.A. and Long, J.S. (Eds), Testing Structural Equation Models, Sage Publications, Newbury Park, CA, pp. 136-62.
Browne, M. and Mels, G. (1994), RAMONA User's Guide, Department of Psychology, The Ohio State University, Columbus, OH.
Calem, P. and Rizzo, J. (1995), "Competition and specialization in the hospital industry: an application of Hotelling's location model", Southern Economic Journal, Vol. 61 No. 4, pp. 1182-98.
Carmines, E. and Zeller, R. (1979), Reliability and Validity Assessment, Sage Publications, Beverly Hills, CA.
Castka, P., Bamber, C. and Sharp, J. (2003), "Measuring teamwork culture: the use of modified EFQM Model", Journal of Management Development, Vol. 22 No. 2, pp. 149-70.


Chaffee, E. and Sherr, L. (1992), Quality: Transforming Postsecondary Education, ASHE-ERIC Education Report No. 3, ASHE-ERIC Clearinghouse on Higher Education, Washington, DC.
Cheng, Y. and Tam, W. (1997), "Multi-models of quality in education", Quality Assurance in Education, Vol. 5 No. 1, pp. 22-32.
Collier, D. (1991), "A service quality process map for credit card processing", Decision Sciences, Vol. 22 No. 2, pp. 406-19.
Collier, D. and Wilson, D. (1997), "The role of automation and labor in determining customer satisfaction in a telephone repair service process", Decision Sciences, Vol. 28 No. 3, pp. 689-708.
Cornesky, R. and Associates (1991), Implementing Total Quality Management in Higher Education, Magna Publications, Madison, WI.
Cronbach, L. (1951), "Coefficient alpha and the internal structure of tests", Psychometrika, Vol. 16 No. 3, pp. 297-334.
Da Rosa, M., Saraiva, P. and Diz, H. (2003), "Excellence in Portuguese higher education institutions", Total Quality Management, Vol. 14 No. 2, pp. 189-97.
Dow, D., Samson, D. and Ford, S. (1999), "Exploding the myth: do all quality management practices contribute to superior quality performance?", Production and Operations Management, Vol. 8 No. 1, pp. 1-27.
Education Criteria for Performance Excellence (2005), p. 9, available at: www.quality.nist.gov/PDF_files/2005_Education_Criteria.pdf (accessed 20 April 2005).
Evans, J. (1997), "Critical linkages in the Baldrige award criteria: research models and educational challenges", Quality Management Journal, Vol. 5 No. 1, pp. 13-30.
Evans, J. and Ford, M. (1997), "Value-driven quality", Quality Management Journal, Vol. 4 No. 4, pp. 19-31.
Flynn, B. and Saladin, B. (2001), "Further evidence on the validity of the theoretical models underlying the Baldrige criteria", Journal of Operations Management, Vol. 19 No. 6, pp. 617-52.
Flynn, B., Sakakibara, S., Shroeder, R., Bates, K. and Flynn, E. (1990), "Empirical research methods in operations management", Journal of Operations Management, Vol. 9 No. 2, pp. 250-84.
Ford, M. and Evans, J. (2000), "Conceptual foundations of strategic planning in the Malcolm Baldrige criteria for performance excellence", Quality Management Journal, Vol. 7 No. 1, pp. 8-26.
Fornell, C. and Larcker, D. (1981), "Evaluating structural equation models with unobservable variables and measurement error", Journal of Marketing Research, Vol. 18 No. 1, pp. 39-50.
Gann, M. and Restuccia, J. (1994), "Total Quality Management in health care: a view of current and potential research", Medical Care Review, Vol. 51 No. 4, pp. 467-500.
George, C., Cooper, F. and Douglas, A. (2003), "EFQM Excellence Model in a local authority", Managerial Auditing, Vol. 18 No. 2, pp. 122-7.
Gibson, C., Newton, D. and Cochran, D. (1990), "An empirical investigation of the nature of hospital mission statements", Healthcare Management Review, Vol. 15 No. 3, pp. 35-45.
Goldstein, S. and Schweikhart, S. (2002), "Empirical support for the Baldrige Award Framework in US hospitals", Health Care Management Review, Vol. 27 No. 1, pp. 62-75.
Hair, J., Anderson, R., Tatham, R. and Black, W. (1995), Multivariate Data Analysis, 4th ed., Prentice-Hall, Upper Saddle River, NJ.


Handfield, R. and Ghosh, S. (1995), "An empirical test of linkages between the Baldrige criteria and financial performance", Proceedings of the Decision Sciences Institute, Vol. 3, Decision Sciences Institute, Atlanta, GA, pp. 1713-15.
Hoelter, J.W. (1983), "The analysis of covariance structures: goodness-of-fit indices", Sociological Methods and Research, Vol. 11 No. 3, pp. 325-44.
Hoyle, R. (Ed.) (1995), Structural Equation Modeling: Concepts, Ideas, and Applications, Sage Publications, Thousand Oaks, CA.
Joreskog, K.G. and Sorbom, D. (1993), LISREL 8: Analysis of Linear Structural Relationships by Maximum Likelihood, Instrument Variables and Least Squares Methods, 8th ed., Scientific Software, Morresville, IN.
Keinath, B. and Gorski, B. (1999), "An empirical study of the Minnesota quality award evaluation process", Quality Management Journal, Vol. 6 No. 1, pp. 29-39.
Khanna, V., Vrat, P., Shankar, R. and Sahay, B. (2002), "Developing causal relationships for a TQM index for the Indian automobile sector", Work Study, Vol. 51 No. 7, pp. 364-73.
Khoo, H. and Tan, K. (2003), "Managing for quality in the USA and Japan: differences between the MBNQA, DP and JQA", The TQM Magazine, Vol. 15 No. 1, pp. 14-24.
Li, M. and Yang, J. (2003), "A decision model for self-assessment of business process based on the EFQM Excellence Model", International Journal of Quality & Reliability Management, Vol. 20 No. 2, pp. 164-88.
McElwee, G. and Redman, T. (1993), "Upward appraisal in practice: an illustrative example using the Qualed model", Education+Training, Vol. 35 No. 2, pp. 27-31.
Mackerron, G., Masson, R. and McGlynn, M. (2003), "Self assessment: use at operational level to promote continuous improvement", Production Planning & Control, Vol. 14 No. 1, pp. 82-9.
Mak, W. (1999), "Cultivating a quality mind-set", Total Quality Management, Vol. 10 Nos 4/5, pp. 622-6.
Mak, W. (2000), "The Tao of people-based management", Total Quality Management, Vol. 11 Nos 4/5/6, pp. 537-43.
Marchese, T. (1993), "TQM: a time for ideas", Change, Vol. 25 No. 3, pp. 10-13.
Meyer, S. and Collier, D. (2001), "An empirical test of the causal relationships in the Baldrige Health Care Pilot Criteria", Journal of Operations Management, Vol. 19 No. 4, pp. 403-25.
Meyer, S. and Schweikhart, S. (2002), "Empirical support for the Baldrige Award Framework in US hospitals", Health Care Management Review, Vol. 27 No. 1, pp. 62-75.
Motwani, J., Sower, V. and Brashier, L. (1996), "Implementing TQM in the health care sector", Health Care Management Review, Vol. 21 No. 1, pp. 73-82.
NIST (1995), Malcolm Baldrige National Quality Award, 1995 Award Criteria, National Institute of Standards and Technology, Gaithersburg, MD.
NIST (2004), Education Criteria for Performance Excellence, National Institute of Standards and Technology, Gaithersburg, MD.
Nunnally, J.C. (1967), Psychometric Theory, McGraw-Hill, New York, NY.
Oldfield, B. and Baron, S. (2000), "Student perceptions of service quality in a UK university business and management faculty", Quality Assurance in Education, Vol. 8 No. 2, pp. 85-95.
Palihawadana, D. (1996), "Modeling student evaluation in marketing education", Proceedings of the 1996 Annual Marketing Education Group Conference.
Pannirselvam, G. and Ferguson, L. (2001), "A study of the relationships between the Baldrige categories", International Journal of Quality & Reliability Management, Vol. 18 No. 1, pp. 14-34.


Pannirselvam, G., Siferd, S. and Ruch, W. (1998), "Validation of the Arizona governor's quality award criteria: a test of the Baldrige criteria", Journal of Operations Management, Vol. 16 No. 5, pp. 529-50.
Samson, D. and Terziovski, M. (1999), "The relationship between total quality management practices and operational performance", Journal of Operations Management, Vol. 17 No. 4, pp. 393-409.
Seymour, D. (1993), On Q: Causing Quality in Higher Education, Oryx Press, Phoenix, AZ.
Sherr, L. and Lozier, G. (1991), "Total quality management in higher education", in Sherr, L. and Tetter, D. (Eds), New Directions for Institutional Research, Association for Institutional Research, Louisville, KY.
Soutar, G. and McNeil, M. (1996), "Measuring service quality in a tertiary institution", Journal of Educational Administration, Vol. 34 No. 1, pp. 72-82.
Steiger, J. (1990), "Structural model evaluation and modification: an interval estimation approach", Multivariate Behavioral Research, Vol. 25 No. 2, pp. 173-80.
Stewart, A. (2003), "An investigation of suitability of the EFQM Excellence Model for a pharmacy department with NHS trust", International Journal of Health Care Quality Assurance, Vol. 16 No. 2, pp. 65-76.
Varey, R. (1993), "The course for higher education", Managing Service Quality, September, pp. 45-9.
Vora, M. (2002), "Business excellence through quality management", Total Quality Management, Vol. 13 No. 8, pp. 1151-9.
Weinstein, L., Petrick, J. and Saunders, P. (1998), "What higher education should be teaching about quality – but is not", Quality Progress, Vol. 1998, pp. 91-5.
Wilson, D. and Collier, D. (2000), "An empirical investigation of the Malcolm Baldrige National Quality Award causal model", Decision Sciences, Vol. 31 No. 2, pp. 361-90.
Wilson, D.D. (1997), "An empirical study to test the causal linkages implied in the Malcolm Baldrige National Quality Award", dissertation, The Ohio State University, Columbus, OH.
Winn, B. and Cameron, K. (1998), "Organizational quality: an examination of the Malcolm Baldrige quality framework", Research in Higher Education, Vol. 39 No. 5, pp. 491-512.
York, K. and Miree, C. (2004), "Causation or covariation? An empirical re-examination of the link between TQM and financial performance", Journal of Operations Management, Vol. 22 No. 3, pp. 291-311.
Yorke, M. (1992), "Quality in higher education: a conceptualization and some observations on the implementation of a sectoral quality system", Journal of Higher Education, Vol. 16 No. 2, pp. 34-46.

Further reading
NIST (1995a), Malcolm Baldrige National Quality Award 1995 Education Pilot Criteria, National Institute of Standards and Technology, Gaithersburg, MD.
NIST (1995b), Malcolm Baldrige National Quality Award 1995 Health Care Pilot Criteria, National Institute of Standards and Technology, Gaithersburg, MD.
NIST (1999), Malcolm Baldrige National Quality Award 1998 Criteria for Performance Excellence, National Institute of Standards and Technology, Gaithersburg, MD.


Appendix

Table AI shows the Malcolm Baldrige Education Criteria for Performance Excellence categories and items used in the pilot study. Items marked with ** were deleted in the main study.

For items 1 to 215, please indicate how often the following occur in your college or university. Scale anchors are 1, 2, 3, 4, 5, 6, or 7, where (1) Not at all . . . (4) Sometimes . . . (7) Always.

Leadership (Organizational leadership-Senior leadership direction)1. Senior leaders create strategic directions2. Senior leaders communicate a clear vision3. Senior leaders guide in setting organizational values * *

4. Senior leaders set specific action plans for successful implementation ofstrategic objectives * *

5. Senior leaders show strong commitment to policies and strategies6. Senior leaders guide in setting performance expectations * *

7. Senior leaders continuously communicate with staff and faculty8. Senior leaders continuously address the needs of students and community * *

9. Senior leaders create an environment characterized by ethical behavior * *

10. Senior leaders create an environment that encourages learning * *

11. Senior leaders create an environment that takes into account key developmentneeds of students, staff and faculty * *

Leadership (Organizational leadership-Organizational governance)12. Our governance system ensures accountability of staff and faculty members13. Our governance system ensures monitoring the performance of our senior leaders14. Our governance system ensures protection of students’ interests15. Our senior leaders are accessible to students and faculty and staff16. Our governance system ensures protection of faculty and staff interests * *

17. Our governance system ensures protection of community interests * *

Leadership (Organizational leadership-Organizational performance review)18. Senior leaders continuously review our organizational performance19. Senior leaders continuously review our organizational capabilities * *

20. Senior leaders communicate the importance of continuous improvement andquality

21. Senior leaders continuously use reviews to assess our performance relative toour competitors * *

22. Senior leaders continuously use reviews to assess our progress relative to shortand long term goals * *

23. We have an established set of performance measures * *

24. Senior leaders use our performance measures for setting future directions * *

25. We have a formal procedure to evaluate our senior leaders26. External bodies perform some organization performance reviews27. Leadership performance evaluation is supported by feedback and survey data

from faculty and staff28. Leadership performance evaluation is supported by feedback and survey data

from parents * *

Leadership (Social responsibility-Responsibilities to the public)29. Our leaders address the impact of our programs and offerings on society30. We establish key measures for achieving international accreditation requirements

(continued )Table AI.

IJQRM23,9

1148

Page 32: IJQRM The Baldrige Education Criteria for Performance ...docshare01.docshare.tips/files/22994/229949346.pdf · The Baldrige Education Criteria for Performance Excellence Framework

Deleted inmain study

31. We establish key measures for achieving local-national accreditation requirements * *

32. We establish key measures for addressing risk associated with our programs * *

33. We integrate public responsibility into performance improvement efforts * *

34. In our planning, we anticipate public’s concern with our programs and offerings35. In our planning, we anticipate public’s concern with our future programs and

offerings36. We support and encourage the community service of our faculty * *

37. We give students the opportunity to develop their social and citizenship valuesand skills * *

Leadership (Social responsibility-ethical behavior)38. We ensure ethical behavior in all our students * *

39. We ensure ethical behavior in all our faculty and staff40. We ensure ethical behavior in all our higher administration41. We have established clear measures to monitor ethical behavior of students,

faculty and staff42. We have established clear measures to monitor ethical behavior of our partners

(i.e. vendors) * *

43. Our organization is sensitive to public issues44. We practice and support good citizenship in our organization * *

45. We try to portray ourselves as role models when it comes to publicresponsibility, ethics and citizenship

Leadership (Social responsibility-support of key communities46. Our faculty is actively engaged in support of our key communities47. Our senior leaders are actively engaged in support of our key communities * *

48. Our organization supports efforts to strengthen our local communities49. We lead efforts to improve community services, including environmental programs

Strategic planning (strategy development-strategy development process)50. We follow a formal/informal process of strategy development * *

51. We utilize various types of forecasts, projections, options, and scenarios indecision making about our future

52. Our strategies usually lead to changes or modifications in programs, services,and use of technologies. * *

53. We involve faculty and staff when developing our strategies * *

54. We involve stakeholders when developing our strategies * *

55. We perform studies to identify the factors that affect our organization’s future56. We gather and analyze relevant data and information for our strategic planning

process * *

57. We take a long-term view when planning for our organization’s futureopportunities and directions * *

58. Our strategic development process is student, stakeholders, and market-focused59. Our strategic development process takes into account our competitors

weaknesses and strengths60. We ensure that our strategic planning addresses student learning and development

Strategic planning (strategy development-strategic objectives)61. We specify timetables for accomplishing our strategic objectives * *

62. Our strategic objectives directly address the challenges outlined in ourorganizational profile

(continued ) Table AI.

The BaldrigeEducation

Criteria

1149

Page 33: IJQRM The Baldrige Education Criteria for Performance ...docshare01.docshare.tips/files/22994/229949346.pdf · The Baldrige Education Criteria for Performance Excellence Framework

Deleted inmain study

63. Our strategic objectives are aimed at developing a competitive leadershipposition in our educational offerings

64. Our long-term vision guides our day-to-day activities * *

65. Our strategic objectives address both short- and long-term challenges andopportunities

66. Our strategic objectives balance the needs of all student and key stakeholders67. Partnership with our community support our strategic plans * *

Strategic Planning (strategy deployment-action plan development and deployment)68. We convert our strategic objectives into short- and long-term action plans to

accomplish the objectives * *

69. Strategic plans are translated into specific requirements for each work unit ordepartment

70. Improvement plans are regularly upgraded71. We continuously assess progress relative to these action plans * *

72. We allocate necessary resources for carrying out these action plans73. We use key measures and indicators in tracking progress relative to action plans74. Strategic decisions are evaluated with objectives measures or indicators * *

75. We continuously develop human resource plans (i.e. education and training) thatwill enable accomplishment of our strategic objectives and action plans

Strategic Planning (Strategy deployment-performance projection)76. We use key established measures or indicators to performance projection77. Short and long term decisions and actions are aligned with our strategic plans * *

78. We compare our projected performance with the projected performance ofcompetitors and key benchmarks

79. Our strategic plans include reducing waste (including idle time, materials, etc.)in all departments * *

80. We use measures or indicators to track dynamic, competitive performance factors81. Our tracking mechanism of performance measures or indicators are utilized as

key diagnostic tool * *

Student, stakeholders, and market focus (Student, stakeholders, and marketknowledge-student knowledge)

82. We have well established mechanism for determining student needs andexpectations

83. We have created a climate conductive to learning84. We analyze student complaints to improve our services * *

85. We conduct regular student surveys for better listening and learning * *

86. Our educational programs and services address the needs of “special students” * *

87. We have an effective student placement service unit88. We provide a variety of extracurricular activities * *

89. Our educational programs emphasize “problem solving” approaches90. Our educational programs emphasize “learning and communication skills” * *

91. Our educational programs emphasize “critical thinking skills”

Student, stakeholders, and market focus (student, stakeholders, and marketknowledge-stakeholders and market knowledge)

92. Our programs are relevant to community needs * *

93. Our educational programs are dynamic and keep pace with market changes94. We conduct regular visits to high schools to promote our university and

programs * *

(continued )Table AI.

IJQRM23,9

1150

Page 34: IJQRM The Baldrige Education Criteria for Performance ...docshare01.docshare.tips/files/22994/229949346.pdf · The Baldrige Education Criteria for Performance Excellence Framework

Deleted inmain study

95. We conduct regular visits to community and industry to promote our universityand programs * *

96. We use feedback from our alumni to assess our programs and offerings * *

97. We use feedback from our stakeholders to assess our programs and offerings98. We conduct regular stakeholders’ surveys for better listening and learning * *

99. We take into consideration changing methods of delivering educational services100. In planning our programs, we take into account global and international

requirements

Student, stakeholders, and market focus (student and stakeholder relationship andsatisfaction-student and stakeholder relationships)101. We continuously build active relationships with students and stakeholders102. We have developed partnerships and alliances with students and stakeholders * *

103. We build active relationships to enhance student performance and expectations * *

104. We have modern mechanism for students and stakeholders to accessinformation about our programs

105. We have modern mechanism for students/stakeholders to make complaintsabout our programs/ services

106. We have set a process that ensures that complaints are resolved effectively andpromptly * *

Student, stakeholders, and market focus (student and stakeholder relationship andsatisfaction-student and stakeholder satisfaction determination)107. We have established effective mechanism for determining student/stakeholders

satisfaction/ dissatisfaction108. We use students/stakeholders satisfaction/dissatisfaction information to

improve programs/services109. We use “drop-out rates”, “absenteeism”, “complaint data” as methods to

determine student/stakeholder satisfaction/ dissatisfaction * *

110. We use modern technologies (internet) for determining satisfaction/dissatisfaction * *

111. We use satisfaction/dissatisfaction data to determine value, cost and revenueimplications * *

112. We seek information from staff and faculty for building long-term partnershipwith students and stakeholders

Measurement, analysis, and knowledge management (measurement and analysis oforganizational performance-Performance measurement)113. We collect and integrate information on evidence of student learning114. We collect and integrate information for tracking daily operations * *

115. We use data and information for tracking overall organization performance116. We use data and information to support organization decision making * *

117. Information systems are used to link our programs and services with studentoutcomes

118. We obtain data and information by benchmarking and seeking competitivecomparisons

119. We collect and utilize information on mistakes, complaints, and customerdissatisfaction * *

120. We ensure the effective use of key comparative data from within and outside theeducational community * *

(continued ) Table AI.

The BaldrigeEducation

Criteria

1151

Page 35: IJQRM The Baldrige Education Criteria for Performance ...docshare01.docshare.tips/files/22994/229949346.pdf · The Baldrige Education Criteria for Performance Excellence Framework

Deleted inmain study

Measurement, analysis, and knowledge management (measurement and analysis oforganizational performance-performance analysis)121. Our performance analysis includes examining trends122. Our performance analysis includes organizational and academic community

projections * *

123. Our performance analysis includes technology projections * *

124. Our performance analysis includes comparisons and cause and effectrelationships

125. Our performance analysis help determine root causes and set priorities forresource use

126. Our performance analysis draws upon all types of data (student, programs,stakeholders, market, operational, budgetary and comparative data)

127. Results of our performance analysis contribute highly to senior leaders’ reviewand strategic planning * *

Measurement, analysis, and knowledge management (information and knowledge management - data and information availability)
128. We ensure the availability of high quality information for key users
129. We ensure the availability of timely data and information for key users
130. Our data and information are accessible to our partners (communities and stakeholders) * *
131. We ensure that our hardware and software are reliable, secure and user friendly
132. We ensure that data, information and organizational knowledge enjoy appropriate levels of security and confidentiality * *
133. We ensure that data, information and organizational knowledge enjoy integrity, reliability, accuracy and timeliness * *
134. We encourage the use of electronic information
135. Our information systems are standardized across departments
136. We encourage the use of the internet for information storage and access * *
137. We encourage the use of advanced information technology to communicate with our students * *

Measurement, analysis, and knowledge management (information and knowledge management - organizational knowledge)
138. We ensure that our people keep current with changing educational needs and directions
139. We constantly develop innovative solutions that add value for our students
140. We constantly develop innovative solutions that add value for stakeholders * *
141. The focus of our knowledge management is on the knowledge that our people need to do their work * *
142. The focus of our knowledge management is on the knowledge we need to improve processes, programs and services * *
143. Our organizational knowledge system focuses on the identification and sharing of best practices

Faculty and staff focus (work systems - organization and management of work)
144. We have effective ways to organize and manage work and jobs to promote empowerment and innovation
145. We ensure that the skills and experiences of our staff and faculty are equitably distributed * *


146. We have effective ways to organize and manage work and jobs to achieve the agility to keep current with educational service needs * *
147. We motivate employees by improved job design * *
148. Our work system capitalizes on the diversity of culture and thinking of our faculty, staff and communities * *
149. We achieve effective communication and skill sharing across departments and functions
150. Our work system ensures ongoing education and training for our staff and faculty

Faculty and staff focus (work systems - faculty and staff performance management system (PMS))
151. Our PMS includes feedback to faculty and staff
152. Our PMS supports a stakeholder focus * *
153. Our compensation, recognition, and related reward and incentive practices reinforce high performance work * *
154. Our PMS is characterized by a focus on student achievement and innovation
155. Our compensation and recognition system is tied to efforts in community and university service
156. Our compensation and recognition system is tied to student evaluation of teaching and classroom performance
157. Our compensation and recognition approaches include rewarding exemplary performances
158. Our PMS emphasizes consistency between compensation and recognition * *

Faculty and staff focus (work systems - hiring and career progression)
159. We have an effective mechanism to identify skills needed by potential staff and faculty * *
160. We have an effective way of recruiting and hiring faculty and staff
161. We have an effective way of retaining faculty and staff
162. We ensure that our faculty and staff represent diverse ideas, cultures, and thinking
163. We have established effective succession planning for senior leadership and supervisory positions * *
164. We manage effective career progression for all faculty throughout the organization * *
165. We manage effective career progression for all administrative and technical staff throughout the organization
166. We ensure that our faculty and staff are appropriately certified and licensed when required * *
167. Our faculty promotion process is based on accepted principles of academic performance * *

Faculty and staff focus (faculty and staff learning and motivation - faculty and staff education, training and development)
168. Our faculty and staff education and training contribute to the achievement of our action plans * *
169. We utilize faculty and staff education and training delivery programs both inside and outside our organization * *
170. Our faculty and staff education and training addresses our key needs associated with our organizational performance improvement and technological change
171. We seek and use input from faculty and staff and their supervisors on education and training needs


172. We deliver education and training to our staff and faculty using diverse modern methods
173. We reinforce the use of new knowledge and skills obtained by faculty and staff on the job
174. We regularly evaluate the effectiveness of education and training obtained * *
175. We provide appropriate orientation of new faculty and staff as part of our education and training programs * *

Faculty and staff focus (faculty and staff learning and motivation - motivation and career development)
176. We have effective ways of motivating faculty and staff to develop and utilize their full potential
177. We use formal/informal mechanisms to help faculty and staff attain job- and career-related development and learning objectives * *
178. Faculty and staff appraisals include personal improvement plans
179. We provide many opportunities for faculty and staff professional development
180. Our senior leaders and supervisors help faculty and staff attain job- and career-related development and learning objectives * *
181. To help faculty and staff utilize their full potential we use individual development plans that address their career and learning objectives * *

Faculty and staff focus (faculty and staff well-being and satisfaction - work environment)
182. Our work environment supports the well-being and development of all employees
183. We continuously work to improve workplace health, safety, security and ergonomics
184. We ensure that our faculty and staff take part in improving workplace health, safety, security and ergonomics * *
185. We have established a set of measures or indicators for each of these key workplace factors
186. We continuously solicit faculty and staff to communicate to us their work environment problems
187. We ensure workplace preparedness for emergencies or disasters * *

Faculty and staff focus (faculty and staff well-being and satisfaction - faculty and staff support and satisfaction)
188. We have established key factors that affect faculty and staff well-being, satisfaction and motivation
189. Our key factors are segmented for our diverse workforce * *
190. We support our faculty and staff via services, benefits, and policies
191. We provide various faculty and staff support services (e.g. counseling, career development, day-care)
192. We provide various recreational and cultural activities to our faculty and staff
193. The services, benefits and policies are tailored to the needs of our diverse workforce * *
194. We use formal/informal assessment methods and measures to determine faculty and staff well-being, satisfaction and motivation
195. We relate assessment findings to key organizational performance results to identify priorities for improving our work environment * *
196. We ensure effective resolution of faculty and staff problems and grievances * *


Process management (learning-centered processes - LCP)
197. We have effective ways of determining and ensuring our LCP * *
198. We use effective key LCP that deliver our educational programs and offerings
199. Our LCP create value for the organization, students, and our key stakeholders * *
200. Our LCP address student educational and developmental needs to maximize their success
201. We incorporate inputs from students, faculty, staff and stakeholders to determine key LCP requirements * *
202. We ensure that our faculty and staff are properly prepared to deliver our LCP
203. Our LCP take into account student learning rate differences * *
204. We incorporate new technology and organizational knowledge into the design of our LCP
205. We use key performance measures for the control and improvement of our LCP
206. We continuously improve our LCP to maximize student success and improve educational programs * *

Process management (support processes - SP)
207. We have effective ways of determining and ensuring our key SPs
208. We use effective key SPs for supporting our LCP
209. We incorporate inputs from students, faculty, staff and stakeholders to determine key SP requirements * *
210. We design our SPs to meet all the key requirements we have already identified * *
211. We incorporate new technology and organizational knowledge into the design of our SPs
212. We use key performance measures for the control and improvement of our SPs
213. We try to minimize overall costs associated with process and performance audits and SPs * *
214. We prevent errors and rework in designing our SPs * *
215. We continuously improve our SPs to achieve better performance and to keep current with organizational needs

For items 216 to 274, please indicate your college or university's position relative to your competitors on each of the following. Scale anchors are 1, 2, 3, 4, 5, 6, or 7, where (1) Significantly worse . . . (4) About the same . . . (7) Significantly better. (A minimal scoring sketch for these responses is given after the table.)
Organizational performance results (student learning results)
216. Overall measures or indicators of student learning results
217. The effectiveness of our programs segmented by majors and disciplines
218. Current levels and trends in key measures or indicators of student learning * *
219. Student learning results (and trends) for each student segment * *
220. Student learning results represented by requirements derived from our markets
221. Correlation between education design and delivery and student learning * *
222. Improvement trends in student admission qualifications
223. Improvement in student learning beyond that which could be attributed to entry-level qualifications * *
224. Educational services attributes as evidence of student and stakeholder satisfaction * *
225. Positive referrals to and recommendation of our services by students and stakeholders


Organizational performance results (student- and stakeholder-focused results)
226. Relevant data that determine and predict our performance as reviewed by students * *
227. Current levels and trends in key measures or indicators of student satisfaction * *
228. Current levels and trends in key measures or indicators of stakeholders' satisfaction * *
229. Student and stakeholder loyalty * *
230. Student and stakeholder perceived value of the organization
231. Student and stakeholder relationship after graduation (alumni loyalty)
232. Results of student/stakeholder satisfaction measures * *
233. Trends of gains and losses of students from or to other schools or alternative means of education
234. Feedback from students and stakeholders on their assessment of our educational operation

Organizational performance results (budgetary, financial and market results)
235. Trend data on instructional and general administration expenditure per student
236. Trend data on cost per academic credit
237. Maintaining control over cost while better utilizing income and resources * *
238. Budgetary and financial results as tools for better utilization of resources * *
239. Key budgetary, financial and market indicators * *
240. The effectiveness of management of financial resources * *
241. Financial measures' data
242. Current levels and trends in key measures or indicators of market performance and market share
243. Designing and experimenting with realistic scenarios reflecting budget increases and decreases
244. Current levels and trends in key measures or indicators of student enrolment and transfer rate * *

Organizational performance results (faculty and staff results)
245. Creating and maintaining a positive and productive environment for faculty and staff
246. Creating and maintaining a learning-centered environment for faculty and staff * *
247. Creating and maintaining a caring environment for faculty and staff
248. Enjoying effective faculty and staff work system performance * *
249. Trends showing improvements in job classification and work design * *
250. Local and regional comparative data on faculty and staff well-being
251. Improved levels of faculty and staff satisfaction
252. Extent of training and cross-training of staff and faculty * *
253. Trends showing improvements in faculty turnover and absenteeism

Organizational performance results (organizational effectiveness results)
254. Experiencing annual increases in overall productivity of scientific research measures * *
255. Experiencing improvements in timeliness in all key educational and student support areas
256. Continuously improving admission standards * *
257. Annual improvements in administrative performance * *
258. Annual funds and budgets allocated for scientific research
259. Annual funds and budgets allocated to innovation in teaching
260. Emphasis on athletic programs * *



261. Increased use of web-based technologies
262. Cost containment initiatives and redirection of resources * *
263. Experiencing positive annual increases in external funds obtained through research and services
264. Recording positive annual increases in the number of faculty research publications
265. Maintaining effective management of financial resources * *
Organizational performance results (governance and social responsibility results)
266. Showing upward scores of stakeholders' trust in the organization * *
267. Maintaining current accreditation of programs while working towards seeking accreditation of other programs * *
268. Appropriately and optimally using the funds allocated by the federal government * *
269. Advisory boards and senior leaders continuously tracking relevant performance measures on a regular basis
270. Considering senior leaders to be accountable for their actions * *
271. Support for key communities and other public purposes
272. Demonstrating high standards of overall conduct
273. Measures of environmental and regulatory compliance
274. Continuously enjoying positive governance/ethical performance measures from stakeholders
Note: Items marked * * were deleted in the main study
Table AI.
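The paper forms its Baldrige category scores from subsets of these seven-point items, with the * * items dropped in the main study. The following is a minimal illustrative sketch of such an aggregation, not the authors' actual scoring procedure: the simple-averaging rule and the sample answers are assumptions made here for illustration, while the item ranges and the deleted-item flags are taken directly from Table AI.

# Illustrative sketch (Python): roll item responses up into category scores,
# dropping the items flagged "* *" (deleted in the main study) in Table AI.
# The averaging rule and the sample responses are assumptions for illustration
# only; they are not the authors' scoring procedure or data.

category_items = {
    # Process management (learning-centered and support processes), items 197-215
    "process_management": list(range(197, 216)),
    # Organizational performance results (student learning results), items 216-225
    "student_learning_results": list(range(216, 226)),
}

# Items in these ranges marked "* *" in Table AI (deleted in the main study)
deleted_in_main_study = {197, 199, 201, 203, 206, 209, 210, 213, 214,
                         218, 219, 221, 223, 224}

def category_score(responses, items, deleted=deleted_in_main_study):
    """Average the 1-7 responses over the retained (non-deleted) items."""
    kept = [i for i in items if i not in deleted and i in responses]
    return sum(responses[i] for i in kept) / len(kept) if kept else None

# One hypothetical respondent's answers, keyed by item number
responses = {198: 5, 200: 6, 202: 4, 204: 5, 205: 6, 207: 5, 208: 4,
             211: 6, 212: 5, 215: 6, 216: 5, 217: 4, 220: 6, 222: 5, 225: 7}

for name, items in category_items.items():
    print(name, round(category_score(responses, items), 2))

Averaging retained items is only one plausible way to summarize the responses; the paper's confirmatory analysis treats the items as indicators of latent constructs, so this sketch should be read purely as a convenience for exploring the appendix's data layout.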

About the authors
Masood Abdulla Badri is a Professor of Production and Operations Management in the Department of Business Administration, College of Business & Economics, United Arab Emirates University, Al Ain, United Arab Emirates. He is the corresponding author and can be contacted at: [email protected]
Hassan Selim is an Associate Professor of Management Information Systems in the Department of Business Administration, College of Business & Economics, United Arab Emirates University, Al Ain, United Arab Emirates.
Khaled Alshare is an Associate Professor of Computer Information Systems in the Accounting & Computer Information System Department, Emporia State University, Emporia, Kansas, USA.
Elizabeth E. Grandon is an Assistant Professor in the Accounting & Computer Information System Department, Emporia State University, Emporia, Kansas, USA.
Hassan Younis is an Assistant Professor of Management in the Department of Business Administration, College of Business & Economics, United Arab Emirates University, Al Ain, United Arab Emirates.
Mohammed Abdulla is an Associate Professor of Management in the Department of Business Administration, College of Business & Economics, United Arab Emirates University, Al Ain, United Arab Emirates.
