

International Journal of Project Management 29 (2011) 155–164

A case study approach for developing a project performance evaluation system

Qing Cao *, James J. Hoffman 1

Area of Information Systems and Quantitative Sciences, Rawls College of Business, Texas Tech University, Lubbock, TX 79409, United States

Received 27 October 2009; received in revised form 11 February 2010; accepted 18 February 2010

doi:10.1016/j.ijproman.2010.02.010

* Corresponding author. Tel.: +806 742 3919. E-mail addresses: [email protected] (Q. Cao), [email protected] (J.J. Hoffman).
1 Tel.: +1 806 928 1364.

Abstract

Prior project management research has identified a wide variety of measures that describe the outcomes of a project and the input characteristics that impact outcomes. In practice, however, project schedules are still used as the sole project performance measure in some firms. Although the use of project schedules is still a good practice for some companies, for other companies the use of project schedules as the sole project performance measure can result in industrial projects falling behind schedule and coming in over-budget. In order to examine how the evaluation of project performance can be improved, a two-step approach is documented that was used to design a new project performance evaluation system at Honeywell Federal Manufacturing & Technologies (FM&T) that would enable managers to audit a project and determine where improvements could be made. Lessons learned from the development of the project performance evaluation system at Honeywell Federal Manufacturing & Technologies are then discussed.
© 2010 Elsevier Ltd and IPMA. All rights reserved.

Keywords: Project management; Performance measures; Case study; Data envelopment analysis

1. Introduction

We often hear or read about projects that are late, not completed correctly, and/or over-budget. Amazingly, different groups of people still claim that these projects have been successful. Prior project management research has identified a wide variety of measures that describe the outcomes of a project and the input characteristics that impact outcomes (Banker et al., 1984; Ling et al., 2009; Prabhakar, 2008; Thomas and Fernandez, 2008). The most commonly used project outcome measures include cost, schedule, technical performance outcomes and client satisfaction. Although in general terms project performance is recognized as a multidimensional parameter (Baccarini, 1999; Bannerman, 2008; Shenhar et al., 2001), several organizations still evaluate project performance primarily through cost and schedule performance measures (Might and Fischer, 1985).

One possible outcome of the use of project schedules as the sole project performance measure is that industrial projects can fall behind schedule and come in over-budget. For example, in the case of Honeywell Federal Manufacturing & Technologies (FM&T), project schedules were used as the sole project performance measure. This method centered on measuring ongoing and final project performance against project goals. While this approach provided some basis for evaluating the extent of success across projects, it did not explicitly take into account differences in project characteristics which may have impacted cost and schedule performance, nor did it take into consideration the appropriateness of project goals (Freeman and Beale, 1992).

Over the years, several studies have examined approaches to improve management practices (Fortune and White, 2006; Lewis, 2000; Sullivan and Beach, 2009; Yu et al., 2005).

One of these approaches is cross-project learning (i.e., based on productivity), which has been identified as being vital for any organization seeking to continuously improve its project management practices (Lewis, 2000). The first step in cross-project learning is to identify outstanding projects that can serve as role models. A minimum prerequisite for identifying these best practice projects is the ability to measure productivity-based performance. Measuring project performance allows for the creation of incentives that are likely to yield higher performance.

In order to improve the evaluation of project performance at FM&T, the company decided to participate in a research project focused on designing a new evaluation system that would enable managers to audit a project and determine where improvements could be made. Specifically, a two-step approach, with cross-project learning serving as part of the theoretical foundation, was used to design the new project performance evaluation system.

In addition to designing a new project performance evaluation system, the following research questions are also examined as part of the research project at FM&T:

Research Question 1: Does the use of project schedules as the sole project performance measure result in the majority of projects at FM&T being inefficient?

Research Question 2: Will the development and implementation of a new performance management system provide both tangible and intangible benefits for FM&T?

Research Question 3: Will engaging in cross-project learning provide benefits to FM&T?

The purpose of the current paper is to examine these research questions and to illustrate how a case study approach can be used to develop a new project performance evaluation system at FM&T. Lessons that can be learned during the implementation of a new project performance evaluation system are also presented.

In the next section of the paper the project management literature is reviewed, specifically the literature regarding performance measurement of project-based activities. Next, the two-step approach that we utilized for designing the new performance evaluation system for FM&T is discussed. Results from the case study are then presented along with the answers to the research questions posed above. The paper then concludes with a discussion of the lessons learned from the development of the new project performance evaluation system.

2. Literature review

Project management is different from manufacturing-type operations in that project management is the business process of producing a unique product, service, or result over a finite period of time (Project Management Institute, 2004). The primary challenge of project management is to achieve all of the project goals and objectives while adhering to project constraints (Harrison and Lock, 2004). Extant studies have documented various measurements that describe outputs of a project and input factors that impact outputs (Dumaine, 1989; Morris and Hough, 1987; Shenhar and Dvir, 2007; Turner, 2009). According to Belassi and Tukel (1996), project success factors are rather multidimensional and include factors related to the project (e.g., size, urgency); factors related to the project managers and team members (e.g., competence, leadership); and factors related to the external environment (e.g., customer, market). Although there is no universally agreed definition of project output measures, the most cited project output variables are comprised of cost, schedule, technical performance outputs, and customer satisfaction (Kerzner, 2004; Pinto and Slevin, 1988).

In spite of the multidimensional nature of project performance, cost and schedule performance measures still remain the most widely used methods of project performance evaluation by organizations in the real world (Project Management Institute, 2004). Moreover, most of the project performance evaluation methods used by organizations do not explicitly consider key input variables that add value for the client (Farris et al., 2006). Because of this, the design and use of performance measurement systems has received considerable attention in recent years (Kennerley and Neely, 2003). Neely et al. (1997) note that inadequately designed performance measures can result in dysfunctional behavior, often due to a method of calculation that encourages individuals to pursue inappropriate courses of action.

Although the importance of performance measurement has long been recognized by practitioners and academics from a variety of functional disciplines (Neely et al., 2005), and even though many organizations have redesigned their measurement systems to ensure that they take into consideration their current environment and strategies, few organizations appear to have systematic processes in place to ensure that their performance measurement systems continue to reflect their environment and strategies (Kennerley and Neely, 2003; Neely, 1999). Recent research pertaining to the implementation and use of performance management systems has identified the most severe problems organizations encounter as being: lack of top management commitment; performance management getting a low priority or its use being abandoned after a change in management; not having a performance management culture; management putting a low priority on implementation; and people not seeing enough benefit from performance management (de Waal and Counet, 2009).

The literature reviewed above indicates that there is quite a bit of agreement regarding what constitutes project success (i.e., delivering value to the client), and that while achieving time, cost, and quality targets contributes to value to the client, these are not the primary success criteria. As mentioned above, in this study we examine a company (FM&T) which still uses project schedules as a key performance measure. In order to improve the evaluation of project performance at FM&T we design a performance management system that will enable managers to audit a project and determine where improvements can be made. Project management research going as far back as the 1980s, which purports that time, cost and quality are not per se important (e.g., Morris and Hough, 1987; Shenhar and Dvir, 2007; Turner, 2009), serves as the basis for our design approach. Specifically, we propose a project performance evaluation approach that allows managers to explicitly consider differences in input variables across projects when evaluating project outputs. Moreover, we also suggest that project managers employ the data envelopment analysis (DEA) method, which does not require managers to specify variable weights a priori, in their project performance evaluations (Charnes et al., 1978).

3. Scope of research project and methodology

3.1. Company overview

We were invited by Mr. Joe Vance to help develop a systematic approach to enhance engineering project performance at FM&T. Mr. Vance is the Director of Engineering at the Honeywell Federal Manufacturing & Technologies plant in Kansas City, Missouri. FM&T provides high-tech production services to government agencies including the National Nuclear Security Administration (NNSA). For more than half a century, Honeywell and its predecessors have manufactured some of the NNSA's most intricate and technically demanding products at the Kansas City Plant. As one of the nation's most diverse low-volume, high-reliability production facilities, the Kansas City Plant is at the heart of the NNSA nuclear weapons complex. Traditionally, the plant has taken product requirements from the NNSA and designs from the national laboratories, procured supplies as needed, and produced quality components and systems for other nuclear weapons complex sites and the military. These capabilities also have formed the basis for FM&T's work-for-others program, which provides services, products, and systems for homeland security, the Department of Defense and other government agencies (source: Honeywell Federal Manufacturing & Technologies webpage: http://www.honeywell.com/sites/kcp/).

3.2. Case study approach

In the first step of designing the project performance evaluation system we employed a case study approach to establish a viable set of productivity metrics (engineering project inputs and outputs). A key feature of this research project was the close collaboration between the researchers and the case study site (i.e., we interviewed the technical manager and more than 10 engineers from various departments at FM&T), in order to ensure that the performance model and results were readily understood by organizational personnel, as well as reflective of actual practices within the organization. The grounded theory approach (Glaser and Strauss, 1967; Strauss and Corbin, 1998) was adopted in this step because it provides a set of procedures to inductively develop a framework from data, and allowed us to focus on contextual elements as well as the consequences of productivity measurement in the organization (Orlikowski, 1993).

A case study is an “empirical inquiry that investigates a contemporary phenomenon within its real-life context, especially when the boundaries between the phenomenon and context are not clearly evident” (Yin, 1994). Additionally, Benbasat et al. (1987) have stressed that a case study is a powerful methodology that allows researchers as well as practitioners to study information systems in natural settings, learn about the state of the art, and generate theories from practice. A case study also allows researchers and practitioners alike to understand the nature and complexity of the process that is taking place and gain an in-depth understanding of the phenomenon under study. In addition, a case study is used for studying new phenomena where quantitative research methodologies are not possible or appropriate (Benbasat et al., 1987; Yin, 1994). The use of a case study approach is appropriate for the current research project since engineering project performance evaluation is a new managerial endeavor that has great implications for FM&T, but has not yet been adopted and used.

3.3. Case study – productivity metrics

3.3.1. Data collection

We interviewed eleven employees from the organization who have been involved in various engineering projects at FM&T. The eleven participants interviewed in this study were from different functional areas; therefore, they offered different perspectives on how to measure engineering project performance in the organization.

Face-to-face interviews were conducted by the researcher. Structured and semi-structured questions were asked during the interviews. Interview questions included general questions asking subjects to describe their involvement and experience in the engineering projects, as well as sub-questions such as why an engineering project performance evaluation system is important, what the current system at FM&T is, and what they think about the effectiveness of the current system. The researcher further asked the interviewees to brainstorm on project-based productivity measures (inputs and outputs) deemed to be viable and fair. The interview protocol is attached in Appendix I. Each interview lasted approximately 1 h. The interviews were audio-recorded and notes were taken by the researcher during each interview. The interviews were transcribed and analyzed.

Table 1 shows the data collection checklist that summarizes the data resources used in this study.


Table 1
Data collection checklist.

Data type        Resources                         Number
Interviews       Technical manager                 1
                 Mechanical engineers              4
                 Electrical engineers              3
                 Purchasing engineers              3
Documentations   Current performance measurement   Yes
                 Organization website              Yes


3.4. Case analysis

According to Strauss and Corbin (1998), data analysis of grounded theory research starts with open coding, in which “data are broken down into discrete parts, closely examined, and compared for similarities and differences” (p. 102). In this initial stage of data analysis, the researchers reviewed the interview transcripts, looked for “discrete incidents, ideas, events, and acts” and then gave names (or “conceptual labels”) to those concepts (Strauss and Corbin, 1998). Throughout the data analysis, the researchers performed constant “comparative analysis,” that is, when the objects, events, acts, or happenings shared some common characteristics, they were grouped together and formed a category that captured their shared characteristics. Categories, as defined by Strauss and Corbin (1998), “are concepts, derived from data, that stand for phenomena” (p. 114). By doing so, the researchers were able to reduce the vast amount of raw interview data into smaller, more manageable pieces of data.

4. Research results

4.1. Categories and concepts

Following the principles of open coding (Strauss and Corbin, 1998), two researchers went through all of the interview transcripts, the notes taken by the researchers during the interviews, and the relevant documentation, and identified a list of concepts from the raw data. The researchers compared the emerged concepts across different participants and multiple data resources for validation. For example, concepts that emerged from the interview transcript of participant A were then corroborated with the interview transcript of participant B, or checked against the documentation. Triangulation across data resources helps to strengthen the emerging concepts. Additionally, prior literature was used for “supplemental validation,” that is, references from prior literature gave validation for the accuracy of the findings and helped in naming the concepts (Chatzoglou and Soteriou, 1999; Farris et al., 2006; Herrero and Salmeron, 2005; Pinto and Slevin, 1988).

Through discussions, the two researchers reached consensus on the concepts identified from the data, as well as on the naming and phrasing of the concepts. The two researchers then reviewed all concepts and grouped them into categories based on similar characteristics shared among the concepts. Results from the open coding include a list of categories and concepts (inputs and outputs shown in Table 2) related to project-based productivity measures at FM&T.

FM&T engineers agreed that project duration was the key output of interest for the case study application, and that the driving force behind the business process improvement was the need to reduce project duration. In the project management literature, “time” represents a key category of project performance measures, and minimizing project duration is one objective that a project-based organization can pursue (Chatzoglou and Soteriou, 1999). Other potentially relevant output measures, such as quality and customer satisfaction, were not being tracked by FM&T. As such, project duration was utilized as the output variable in this study.

After identifying the output variable of interest, the next step was to identify the input variables necessary to capture important differences between projects. Input variables were identified through consultation with engineers and the technical manager at FM&T (to accurately describe their practices) and through a review of the project management literature. Effort (Variable 2) describes the total amount of person-days consumed by the project. This variable is under the influence of the project manager, but is fixed beyond a certain minimum point. While inefficient project management practices can increase effort through rework, there is a certain minimum amount of work that must be completed to meet the objectives of the project – that is, there is a minimum level of effort. Therefore, effort can be viewed as a cost measure, and also as a measure related to project scope or size.

Project staffing (Variable 3) describes the concentration of labor resources on the project. Specifically, project staffing describes the average number of people scheduled to work on a project each project day, thus capturing resource assignment decisions within FM&T. Obtaining and scheduling labor resources is a significant portion of any project manager's job, and is also a concern of top management.

Priority (Variable 4) indicates the importance (urgency) assigned to a project by top management. Project priority can be rated on a nine-point scale, with “1” representing the lowest level of priority and “9” representing the highest level of priority. Thus, while priority is actually an interval variable, the relatively large number of intervals suggests that it can be treated like a continuous variable. All else being equal, a higher-urgency project would be expected to achieve a shorter project duration than a lower-urgency project, because higher-urgency projects would receive more attention and experience shorter turnaround times in resource requests and other administrative tasks.

Number of engineers (Variable 5) indicates the number of engineers available at FM&T to support a project, not the actual number of engineers directly assigned to a project. All else being equal, increasing the number of engineers should allow them to give more attention to individual projects, thereby reducing the turnaround time for administrative tasks and, ultimately, reducing project duration. Increasing the number of engineers could also allow them to specialize in a particular type of project, thereby increasing the efficiency of project oversight.

Table 2
Productivity metrics.

No.  Type    Variable               Definition                                                      Units
1    Output  Project duration       Work days to complete project                                   Days
2    Input   Effort                 Work content of the project                                     Person-days
3    Input   Project staffing       Number of people on project / effort                            People/day
4    Input   Priority               Urgency of project                                              Interval (1 = lowest priority; 9 = highest priority)
5    Input   Number of engineers    Number of engineers at the functional area during the project   People
6    Input   Technical complexity   Technical difficulty and uncertainty of project                 Categorical (1 = most complex; 3 = least complex)

The last variable, technical complexity (Variable 6), describes the technical difficulty and uncertainty of a project. Although related to effort (i.e., more technically complex projects tend to involve more work content), technical complexity captures additional elements that affect project duration, such as the extent of risk, the need for testing, the need for increased coordination between functions, and the degree of technological uncertainty. Engineering projects at FM&T can be categorized according to their general level of technical complexity, with “1” representing the most technically complex projects, “2” representing projects of medium technical complexity, and “3” representing the least technically complex projects.

4.2. Data envelopment analysis (DEA) approach

In the second stage of the research project, data envelopment analysis (DEA), a non-parametric linear programming method, was used to assess project productivity. DEA was introduced in 1978 by Charnes et al. (1978); it is a fractional programming model that estimates the relative efficiencies of a homogeneous set of units by considering multiple sets of inputs and outputs. According to Charnes et al. (1978), DEA has the following advantages in assessing project productivity:

• It does not require functional relationships between inputs and outputs.
• Multiple inputs and outputs can be considered concurrently.
• It has the ability to identify inefficient projects.
• Using DEA sensitivity analysis, the sources and amounts of inefficiency for each inefficient project can be found.

Data envelopment analysis is also a widely accepted benchmarking approach for exploring project productivity efficiency (Stensrud and Myrtveit, 2005). Several prior research efforts have used DEA to analyze software efficiency, including Banker et al. (1991), who used total labor hours as the single input measure with DEA for investigating the productivity of software maintenance projects. Additionally, in a Communications of the ACM article, Herrero and Salmeron (2005) explained how a DEA model could be used in systems analysis and design to rank software projects' technical efficiency. They found DEA to be a better means of measurement than other traditional methods.

DEA has two basic models associated with it: the CRS and VRS models (see Appendix II for a discussion of basic DEA models). The CRS model developed by Charnes et al. (1978) assumes constant returns to scale (CRS), while the VRS model created by Banker et al. (1984) assumes variable returns to scale (VRS). CRS models provide the most conservative measure of efficiency (i.e., the most aggressive DEA project duration targets). Under CRS, all units are compared against a frontier defined by units operating at the most productive scale size. Units operating under any diseconomies of scale, therefore, cannot be 100% efficient. On the other hand, VRS models allow units operating under diseconomies of scale to form part of the frontier, as long as they perform better than their most similar peers (e.g., those operating under similar diseconomies of scale). Choosing which model to use depends on both the characteristics of the data set and the question being analyzed. Stensrud and Myrtveit (2005) assume that software engineering projects exhibit variable returns to scale with non-linear relationships between input and output, while Farris et al. (2006) presume that engineering projects show constant returns to scale. In this research project we assume that diseconomies of scale might exist for many input variables. For instance, increasing project staffing beyond a certain level may yield diminishing returns in project duration, due to congestion. Similarly, projects with large amounts of effort could also experience diminishing returns to scale.

For the current research project, FM&T did not want units operating under diseconomies of scale (e.g., over-staffed projects or projects with increased effort due to rework or other inefficient practices) to be considered 100% efficient. Instead, FM&T wanted to draw aggressive comparisons based on the performance of best practice units. Thus, the CRS model output was deemed to be most appropriate since it identifies inefficiency due to diseconomies of scale and benchmarks performance against units that are operating at the most productive scale size.
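To make the mechanics concrete, the sketch below scores one unit with the input-oriented CRS (CCR) envelopment model, solved as a linear program. The study itself used DEA Excel Solver, so this scipy-based version is only an illustration of the underlying model under standard assumptions: the function name, the use of scipy, and the small data slice (projects 1, 2, 3 and 8 from Table 3, with duration plugged in as the Table 2 output) are illustrative choices, not the implementation used at FM&T.

```python
# Minimal input-oriented CRS (CCR) DEA sketch, not the DEA Excel Solver setup
# used in the study. Decision vector is [theta, lambda_1..lambda_n]; we
# minimize theta subject to X*lambda <= theta*x_o, Y*lambda >= y_o, lambda >= 0.
import numpy as np
from scipy.optimize import linprog

def crs_efficiency(X, Y, o):
    """CRS efficiency of unit o. X: (m inputs x n units), Y: (s outputs x n units)."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.zeros(1 + n)
    c[0] = 1.0                                   # objective: minimize theta
    A_in = np.hstack([-X[:, [o]], X])            # X*lambda - theta*x_o <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])    # -Y*lambda <= -y_o
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -Y[:, o]]),
                  bounds=[(0, None)] * (1 + n))
    return res.fun                               # theta = 1.0 -> on the frontier

# Hypothetical check on a slice of Table 3 (projects 1, 2, 3, 8):
X = np.array([[924.0, 558.0, 730.0, 129.0],      # effort (person-days)
              [0.05, 0.06, 0.07, 0.12]])          # project staffing
Y = np.array([[1759.0, 2826.0, 1088.0, 1286.0]])  # project duration (days)
scores = [crs_efficiency(X, Y, o) for o in range(X.shape[1])]
```

Solving this program once per project, with all five inputs and the duration output, is what yields relative efficiency ratios of the kind reported in Table 4; a score of 1 places the project on the CRS frontier.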


4.3. Data gathering

Using the project productivity metrics that were derived during the first stage of the research project, we gathered input and output data from 20 engineering projects at FM&T (see Table 3). We then used DEA to identify the most efficient projects. Next, a sensitivity analysis was performed to seek the causes of the inefficient projects and to identify factors of efficiency that could be targeted for improvement. These inefficiencies are caused by input and/or output slacks. An input slack for a project means the project can reduce its input by the slack amount without reducing the output(s), while an output slack means the project will have to increase its output(s) by the slack amount to become efficient.

Table 3
Project input and output data.

Project   Effort   Project staffing   Priority   Number of engineers   Complexity   Project duration
1         924      0.05               8          2                     3            1759
2         558      0.06               6          3                     1            2826
3         730      0.07               7          1                     3            1088
4         677      0.13               5          2                     1            2566
5         561      0.05               8          4                     3            1576
6         732      0.03               7          4                     3            1421
7         323      0.17               4          3                     2            788
8         129      0.12               7          2                     2            1286
9         482      0.06               9          4                     3            921
10        528      0.04               6          5                     3            744
11        146      0.08               8          2                     1            646
12        143      0.04               5          2                     1            982
13        479      0.02               8          2                     2            825
14        123      0.21               9          2                     1            598
15        252      0.08               7          4                     3            450
16        86       0.14               6          3                     3            736
17        310      0.05               5          2                     1            888
18        185      0.06               9          2                     3            567
19        682      0.07               5          3                     1            920
20        256      0.09               8          2                     2            599

5. Results and discussions

DEA Excel Solver was used with the engineering project productivity metrics to generate the relative efficiency for each of the 20 engineering projects in various functional areas at Honeywell FM&T. Projects were considered efficient if their relative efficiency ratios equaled one and regarded as inefficient if they obtained a relative efficiency ratio of less than one. The results indicate that five engineering projects were deemed efficient while fifteen projects were rated as inefficient (see Table 4, column 2). In this way, DEA clearly identified the most efficient projects.

A sensitivity analysis was then performed in order to determine the causes of the inefficient projects. Table 4 presents the input slacks (i.e., excess input) for the inefficient projects; in column 2 the efficiency ratios are presented in descending order, and columns 3–7 show the slacks for each input variable (effort, project staffing, priority, number of engineers, and complexity). The results reveal that projects 2, 4, 6, 8, and 12 were efficient, and all other projects were inefficient. Among the inefficient projects, the efficiency ratios ranged from 0.26973 to 0.89755. The informational value of the slacks in columns 3 through 7 is that they reveal, from an input standpoint, how and to what degree the inefficient project teams could make their projects efficient. For example, if the Project 19 team could reduce its effort by approximately 63 person-days, and managers could correspondingly adjust the number of engineers supporting the project and its complexity rating, the project would become efficient without changing its current output values. It is interesting to note that the sensitivity analysis reveals that efficiency is controlled not only by the project team but also by other factors (management and the nature of the projects).

Table 4
Sensitivity analysis (columns 3–7 are input slacks).

Project   Input-oriented CRS efficiency   Effort      Project staffing   Priority   Number of engineers   Complexity
2         1.00000                         0.00000     0.00000            0.00000    0.00000               0.00000
4         1.00000                         0.00000     0.00000            0.00000    0.00000               0.00000
6         1.00000                         0.00000     0.00000            0.00000    0.00000               0.00000
8         1.00000                         0.00000     0.00000            0.00000    0.00000               0.00000
12        1.00000                         0.00000     0.00000            0.00000    0.00000               0.00000
1         0.89755                         465.03101   0.00000            3.49048    0.00000               2.06104
13        0.87283                         97.58252    0.00000            3.84046    0.00000               0.58189
16        0.85848                         0.00000     0.05151            1.14463    1.43079               1.43079
3         0.84801                         331.99688   0.00424            3.81606    0.00000               2.12003
5         0.66873                         0.00000     0.00000            1.43933    0.64880               1.09463
14        0.66314                         0.00000     0.09947            3.45073    0.50592               0.00000
11        0.64374                         0.00000     0.01287            2.63104    0.43111               0.00000
17        0.52710                         0.00000     0.00000            0.25665    0.00000               0.06359
7         0.46483                         0.00000     0.05825            0.00000    0.53029               0.58342
18        0.45829                         0.00000     0.00000            1.69506    0.00000               0.83235
10        0.39433                         0.00000     0.00000            0.24548    0.84350               0.58061
19        0.38210                         62.66431    0.00000            0.00000    0.23881               0.04776
20        0.37472                         0.00000     0.00443            0.96224    0.00000               0.26112
9         0.36451                         0.00000     0.00000            0.97257    0.36451               0.68870
15        0.26973                         0.00000     0.00000            0.00000    0.36487               0.38744
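As a reading aid for Table 4, the short helper below applies the standard input-oriented CRS projection, target_i = θ · x_i − slack_i, to Project 19 using its inputs from Table 3 and its row in Table 4. The function name and dictionary layout are illustrative choices, not something defined in the study.

```python
# Illustrative only: efficient input targets implied by a DEA score and its
# input slacks, using the standard projection x_target = theta * x - slack.
def input_targets(theta, inputs, slacks):
    """inputs and slacks are dicts keyed by input-variable name."""
    return {k: theta * inputs[k] - slacks[k] for k in inputs}

# Project 19: Table 3 inputs and Table 4 efficiency/slacks.
theta_19 = 0.38210
x_19 = {"effort": 682, "staffing": 0.07, "priority": 5,
        "engineers": 3, "complexity": 1}
s_19 = {"effort": 62.66431, "staffing": 0.0, "priority": 0.0,
        "engineers": 0.23881, "complexity": 0.04776}
print(input_targets(theta_19, x_19, s_19))
# effort target ~ 0.3821 * 682 - 62.66 ~ 198 person-days: the ~63 person-day
# slack is the reduction required beyond the proportional (radial) contraction.
```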

6. Research questions answered and lessons learned during the development of the new project performance evaluation system at FM&T

During the development of the new project performance evaluation system at FM&T, the research questions posed in this study were answered. Specifically, we found the answer to Research Question 1 to be that the use of project schedules as the sole project performance measure did result in the majority of projects at FM&T being inefficient. In terms of Research Question 2, we found that the development and implementation of the new performance management system provided both tangible and intangible benefits for FM&T. Regarding Research Question 3, we found that engaging in cross-project learning did provide benefits to FM&T. In addition to answering these research questions, we learned several lessons during the development of the new system. These lessons are discussed below.


6.1. Lesson #1 – the drawbacks of using project schedules as the sole project performance measure

One outcome of FM&T's use of project schedules as the sole project performance measure was a trend at the company for some industrial projects to be inefficient (i.e., behind schedule and/or over-budget). As mentioned above, DEA Excel Solver was used with the engineering project productivity metrics to generate the relative efficiency for each of the 20 engineering projects in various functional areas at Honeywell FM&T, with projects considered efficient if their relative efficiency ratios equaled one and inefficient otherwise. The results indicate that five engineering projects were deemed efficient while fifteen projects were rated as inefficient.

6.2. Lesson #2 – why previous project productivity improvement methods failed

Over the past decade, FM&T tried numerous times to better manage its engineering projects armed with various productivity improvement methods such as total quality management, six sigma, and earned value project management. FM&T identified three possible reasons why most of the previous efforts failed. The first reason was that most previous attempts rendered little implementable value, either due to the complexity of the system (e.g., the earned value system) or due to disagreement among different functional areas (e.g., design engineering vs. manufacturing engineering). The second reason was that in the past the projects were done mostly in a sequential way, meaning there was virtually no communication among different engineering functions. The third reason was that there was a great need for a systematic (generalizable) approach across all functional areas.

6.3. Lesson #3 – the benefits of cross-project learning

Engaging in cross-project learning provided several benefits for FM&T. One benefit is that identifying and studying best practice projects turned out to be an invaluable source of learning for all members of the company. Additionally, once identified, the best projects served as role models for guiding the company in terms of how to improve. Identifying best practice projects also allowed FM&T's stakeholders to benchmark its projects. This is important since customers of engineering services companies are increasingly demanding that performance benchmarks based on past project performance be included in bidding proposals (Luu et al., 2008). Therefore, firms like FM&T must identify and provide benchmarks in the bidding process in order to stay competitive. Benchmarks also provided FM&T with a basis for setting compensation schemes, determining whom to promote, and identifying the company's best performers.

6.4. Lesson #4 – intangible benefits from the new project performance evaluation system

Currently, two functional areas at FM&T (the design and manufacturing departments) are using the new project performance evaluation system, and although the procurement and sales departments have not yet adopted the new system, they have requested that the system be revised to better suit their needs. Overall, there has been a great deal of positive feedback from the departments and project teams who have used the system. Specifically, the departments and project teams have reported several intangible benefits associated with the use of the new system. For example, they have found the new system to be relatively easy to implement. They have also found that the system promotes cross-functional efforts, and that these efforts pay off, resulting in fewer disputes and more consensus via interactions. According to the design engineering department head, “the new system provides a viable tool for design engineers to change their mindset and effectively carry out the concurrent engineering (CE) design concept in our department.” Additionally, they found that the new evaluation system promotes cross-project learning, which is very helpful for project managers in dealing with resource allocation, personnel management, and budget control.

6.5. Lesson #5 – tangible benefits from the new project performance evaluation system

FM&T also derived several tangible benefits from the new project performance evaluation system. A recent analysis of 20 projects that were completed after the new system had been implemented showed that the average efficiency rate associated with the projects rose to 0.854, up from an average of 0.684 for 20 comparable projects that were completed before the new system was implemented. The average time to complete the 20 projects was also reduced by 25% when compared to the average time it took to complete similar projects before the new system was implemented. Additionally, FM&T reported a visible decrease in sick calls among engineers during the 10 months since the system was implemented.

Results from the second stage of the research project also highlighted tangible benefits. Specifically, results indicated that the proposed engineering project productivity metrics and DEA approach not only can detect inefficient engineering projects, but also can provide information and guidance for managers at FM&T in terms of how to improve the engineering projects.

7. Conclusions

In this research project we utilized a two-step approach to design a system that would improve the engineering process at FM&T. Our first step was to use a case study approach to derive the engineering project performance metrics. In step two we employed data envelopment analysis (DEA) to show the ability of the engineering project productivity metrics (i.e., the input and output variables identified in the case study stage of the research project) to measure the performance of the engineering projects at FM&T and to explore the efficiency of those projects. DEA was also used to perform a sensitivity analysis to identify factors of efficiency to target for improvement.

Through multiple case studies with engineers and a manager at FM&T, viable engineering project productivity metrics (input and output variables) were developed to evaluate project performance. Because the metrics were developed based on the inputs of FM&T engineers from various functional areas, they are more realistic, reliable, and generalizable.

Based on the initial results from using the new system, FM&T plans on using the system in other functional areas within the company. Additionally, although the new system is currently being used at the project or team level, in the future FM&T plans on using the new system at the individual level, which will ultimately tie project performance to the company's reward system.

It is important to note that the results obtained through the case study method are often new hypotheses or theories, explanations of change or development processes, or even normative instructions. Although the material and its processing are empirical, the material is usually formed of a small number of cases (in this case 20 different projects within a single company). Thus, there is a potential problem with generalizing the results obtained by the case study method, since the extent to which results obtained in a limited number of cases can be generalized to a larger group (i.e., to companies other than Honeywell) is not precisely known. This means that the results must be regarded as more or less probable hypotheses.

This said, however, the concepts we applied to design the project performance evaluation system for FM&T and the lessons we learned should still provide a good reference point for companies whose goal is to improve their engineering project performance and to reduce cost and schedule overruns. This is important given that, as competitive pressures increase, companies find themselves under more and more pressure to cut costs while maintaining productivity.

Appendix I – Interview protocol

Demographic information

Your gender
A. Female
B. Male

Your age
A. 18–25
B. 25–38
C. 38–45
D. >45

The highest educational degree you have received
A. High school
B. Bachelor's degree
C. Graduate degree

What is your position at the organization?
Your business unit name
How many employees are in your business unit?
How many employees report to you?
How long have you been working in this organization?

Questions:

• Is it important for your department to have a project productivity measurement system? Why, why not? What is the purpose of having such a system in general?
• What do you think about the system (i.e., is it viable or fair)? Why?
• Is the current productivity measurement system capable of enabling comparisons of performance across projects?
• Do you think we need to revamp the current system?
• Can you describe in general what the current project productivity measures are?
• What are the performance measures (a.k.a., the output variables) of a project?
• Is project duration one of the major output variables?
• Do you think that output needs to include cost, scope and customer satisfaction?
• What other factors do you think we need to incorporate in the output measures?
• Input variables are very important to capture differences between projects. What are the input variables for projects in the current system?
• What input variables do you think are the most important in the current system or for a new system?
• Do you think input variables should include effort, project staffing, priority, technical complexity, etc.? What other input variables do you think should be included in the system?

Appendix II – Basic DEA models

Charnes et al. (1978) initially introduced the DEA model to measure the relative efficiency of decision making units (DMUs) using multiple inputs to produce multiple outputs. They addressed constant returns to scale (CRS). CRS assumes that there is a proportional change between inputs and outputs. The CRS efficiency represents technical efficiency (TE), which measures inefficiencies due to the input/output configuration as well as due to the size of the operation. Banker et al. (1984) presented a DEA model to determine whether there are any inefficiencies attributable to disadvantageous conditions under which a DMU is operating, which are not directly related to the inputs and outputs, and to allow for a larger peer group to be considered. They addressed variable returns to scale (VRS). VRS assumes that a proportional change in inputs does not necessarily result in a proportional change in outputs. The DEA VRS model can be obtained through the addition of a convexity constraint ($\sum_{j=1}^{n} \lambda_j = 1$) to the DEA CRS model. The VRS efficiency represents pure technical efficiency (PTE), that is, a measure of efficiency without scale efficiency (SE). It is thus possible to decompose TE (assuming CRS) into PTE and SE; scale efficiency can be estimated by dividing TE by PTE (SE = TE/PTE).
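For reference, the input-oriented envelopment forms of the two models discussed above can be stated compactly, following the standard formulations in Charnes et al. (1978) and Banker et al. (1984); the notation (x for inputs, y for outputs, λ for intensity weights, unit o under evaluation) is the conventional one rather than notation taken from this study:

```latex
\begin{align*}
\min_{\theta,\,\lambda} \quad & \theta \\
\text{s.t.} \quad & \sum_{j=1}^{n} \lambda_j x_{ij} \le \theta\, x_{io}, \quad i = 1,\dots,m \\
                  & \sum_{j=1}^{n} \lambda_j y_{rj} \ge y_{ro}, \quad r = 1,\dots,s \\
                  & \lambda_j \ge 0, \quad j = 1,\dots,n \qquad \text{(CRS model)} \\
                  & \sum_{j=1}^{n} \lambda_j = 1 \qquad \text{(convexity constraint added for VRS)}
\end{align*}
```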

References

Baccarini, D., 1999. The logical framework method for defining project success. Project Manage. J. 30 (4), 25–32.

Bannerman, P.L., 2008. Defining project success: a multilevel framework. In: Proc. PMI Research Conference, Warsaw, Poland.

Banker, R.D., Srikant, M., Kemerer, C.F., 1991. A model to evaluate variables impacting the productivity of software maintenance projects. Manage. Sci. 37 (1), 1–18.

Banker, R.D., Charnes, A., Cooper, W.W., 1984. Some models for estimating technical and scale inefficiencies in data envelopment analysis. Manage. Sci. 30 (9), 1078–1092.

Belassi, W., Tukel, O., 1996. A new framework for determining critical success/failure factors in projects. Int. J. Project Manage. 14 (3), 141–151.

Benbasat, I., Goldstein, D.K., Mead, M., 1987. The case research strategy in studies of information systems. MIS Quart. 11 (3), 369–386.

Charnes, A., Cooper, W.W., Rhodes, E., 1978. Measuring the efficiency of decision making units. Eur. J. Oper. Res. 2 (6), 429–444.

Chatzoglou, P.D., Soteriou, A.C., 1999. A DEA framework to assess the efficiency of the software requirements capture and analysis process. Decision Sci. 30 (2), 503–531.

de Waal, A., Counet, H., 2009. Lessons learned from performance management system implementations. Int. J. Prod. Perform. Manage. 58 (4), 367–390.

Dumaine, B., 1989. How managers can succeed through speed. Fortune 19 (4), 54–59.

Farris, J., Groesbeck, R.L., Van Aken, E.M., Letens, G., 2006. Evaluating the relative performance of engineering design projects: a case study using data envelopment analysis. IEEE Trans. Eng. Manage. 53 (3), 471–482.

Fortune, J., White, D., 2006. Framing of project critical success factors by a systems model. Int. J. Project Manage. 24 (1), 53–65.

Freeman, M., Beale, P., 1992. Measuring project success. Project Manage. J. 23 (1), 8–17.

Glaser, B., Strauss, A., 1967. The Discovery of Grounded Theory: Strategies for Qualitative Research. Aldine de Gruyter, New York.

Harrison, F., Lock, D., 2004. Advanced Project Management: A Structured Approach. Gower Publishing Ltd., p. 34.

Herrero, I., Salmeron, J., 2005. Using the DEA methodology to rank software technical efficiency. Commun. ACM 48 (1), 101–105.

Kennerley, M., Neely, A., 2003. Measuring performance in a changing business environment. Int. J. Oper. Prod. Manage. 23 (2), 213–229.

Kerzner, H., 2004. Advanced Project Management: Best Practices on Implementation. John Wiley & Sons.

Lewis, J.P., 2000. The Project Manager's Desk Reference. McGraw-Hill, New York.

Ling, F., Low, S., Wang, S., Lim, H., 2009. Key project management practices affecting Singaporean firms' project performance in China. Int. J. Project Manage. 27 (1), 59–71.

Luu, V., Kim, S., Huynh, T., 2008. Improving project management performance of large contractors using benchmarking approach. Int. J. Project Manage. 26 (7), 758–769.

Might, R.J., Fischer, W.A., 1985. The role of structural factors in determining project management success. IEEE Trans. Eng. Manage. 32 (2), 71–77.

Morris, P., Hough, G., 1987. The Anatomy of Major Projects: A Study of the Reality of Project Management, vol. 1. John Wiley & Sons Ltd., Chichester, UK.

Neely, A., 1999. The performance measurement revolution: why now and what next. Int. J. Oper. Prod. Manage. 19 (2), 205–228.

Neely, A., Gregory, M., Platts, K., 2005. Performance measurement system design: a literature review and research agenda. Int. J. Oper. Prod. Manage. 25 (12), 1228–1263.

Neely, A., Richards, H., Mills, J., Platts, K., Bourne, M., 1997. Designing performance measures: a structured approach. Int. J. Oper. Prod. Manage. 17 (11), 1131–1153.

Orlikowski, W.J., 1993. CASE tools as organizational change: investigating incremental and radical changes in systems development. MIS Quart. 17 (3), 309–340.

Pinto, J., Slevin, D., 1988. Project success: definitions and measurement techniques. Project Manage. J. 19 (1), 67–73.

Prabhakar, G.P., 2008. What is project success: a literature review. Int. J. Bus. Manage. 3 (9), 3–10.

Project Management Institute, 2004. A Guide to the Project Management Body of Knowledge, ANSI/PMI 99-001-2004, 3rd ed. Newtown Square, PA.

Shenhar, A., Dvir, D., 2007. Reinventing Project Management: The Diamond Approach to Successful Growth & Innovation. Harvard Business School Publishing.

Shenhar, A.J., Dvir, D., Levy, O., Maltz, A.C., 2001. Project success: a multidimensional strategic concept. Long Range Plann. 34 (4), 699–725.

Stensrud, E., Myrtveit, I., 2005. Identifying high performance ERP projects. IEEE Trans. Software Eng. 29 (5), 398–416.

Strauss, A., Corbin, J., 1998. Basics of Qualitative Research: Techniques and Procedures for Developing Grounded Theory. SAGE Publications, Thousand Oaks, California.

Sullivan, J., Beach, R., 2009. Improving project outcomes through operational reliability: a conceptual model. Int. J. Project Manage. 27 (8), 765–775.

Thomas, G., Fernandez, W., 2008. Success in IT projects: a matter of definition? Int. J. Project Manage. 26 (7), 733–742.

Turner, J.R., 2009. The Handbook of Project-based Management: Leading Strategic Change in Organizations. McGraw-Hill.

Yin, R.K., 1994. Case Study Research: Design and Methods. Sage Publications, Thousand Oaks, California.

Yu, A., Flett, P., Bowers, J., 2005. Developing a value-centred proposal for assessing project success. Int. J. Project Manage. 23 (6), 428–436.