performanta serviciilor in sectorul public

55
International Journal of Operations & Production Management Performance measurement system design: A literature review and research agenda Andy Neely Mike Gregory Ken Platts Article information: To cite this document: Andy Neely Mike Gregory Ken Platts, (1995),"Performance measurement system design", International Journal of Operations & Production Management, Vol. 15 Iss 4 pp. 80 - 116 Permanent link to this document: http://dx.doi.org/10.1108/01443579510083622 Downloaded on: 22 October 2014, At: 09:48 (PT) References: this document contains references to 143 other documents. To copy this document: [email protected] The fulltext of this document has been downloaded 14408 times since 2006* Users who downloaded this article also downloaded: NONE, Andy Neely, Mike Gregory, Ken Platts, (2005),"Performance measurement system design: A literature review and research agenda", International Journal of Operations & Production Management, Vol. 25 Iss 12 pp. 1228-1263 Andy Neely, (1999),"The performance measurement revolution: why now and what next?", International Journal of Operations & Production Management, Vol. 19 Iss 2 pp. 205-228 Umit S. Bititci, UTrevor Turner, Carsten Begemann, (2000),"Dynamics of performance measurement systems", International Journal of Operations & Production Management, Vol. 20 Iss 6 pp. 692-704 Access to this document was granted through an Emerald subscription provided by 607121 [] For Authors If you would like to write for this, or any other Emerald publication, then please use our Emerald for Authors service information about how to choose which publication to write for and submission guidelines are available for all. Please visit www.emeraldinsight.com/authors for more information. About Emerald www.emeraldinsight.com Emerald is a global publisher linking research and practice to the benefit of society. The company manages a portfolio of more than 290 journals and over 2,350 books and book series volumes, as well as providing an extensive range of online products and additional customer resources and services. Emerald is both COUNTER 4 and TRANSFER compliant. The organization is a partner of the Committee on Publication Ethics (COPE) and also works with Portico and the LOCKSS initiative for digital archive preservation. *Related content and download information correct at time of download. Downloaded by UNIVERSITY OF GREENWICH At 09:48 22 October 2014 (PT)

Upload: deea-coste

Post on 15-Jan-2016

264 views

Category:

Documents


0 download

DESCRIPTION

Performanta in sectorul public

TRANSCRIPT

Page 1: Performanta serviciilor in sectorul public

International Journal of Operations & Production ManagementPerformance measurement system design: A literature review and research agendaAndy Neely Mike Gregory Ken Platts

Article information:To cite this document:Andy Neely Mike Gregory Ken Platts, (1995),"Performance measurement system design", International Journal ofOperations & Production Management, Vol. 15 Iss 4 pp. 80 - 116Permanent link to this document:http://dx.doi.org/10.1108/01443579510083622

Downloaded on: 22 October 2014, At: 09:48 (PT)References: this document contains references to 143 other documents.To copy this document: [email protected] fulltext of this document has been downloaded 14408 times since 2006*

Users who downloaded this article also downloaded:NONE, Andy Neely, Mike Gregory, Ken Platts, (2005),"Performance measurement system design: A literature review andresearch agenda", International Journal of Operations & Production Management, Vol. 25 Iss 12 pp. 1228-1263Andy Neely, (1999),"The performance measurement revolution: why now and what next?", International Journal ofOperations & Production Management, Vol. 19 Iss 2 pp. 205-228Umit S. Bititci, UTrevor Turner, Carsten Begemann, (2000),"Dynamics of performance measurement systems", InternationalJournal of Operations & Production Management, Vol. 20 Iss 6 pp. 692-704

Access to this document was granted through an Emerald subscription provided by 607121 []

For AuthorsIf you would like to write for this, or any other Emerald publication, then please use our Emerald for Authors serviceinformation about how to choose which publication to write for and submission guidelines are available for all. Pleasevisit www.emeraldinsight.com/authors for more information.

About Emerald www.emeraldinsight.comEmerald is a global publisher linking research and practice to the benefit of society. The company manages a portfolio ofmore than 290 journals and over 2,350 books and book series volumes, as well as providing an extensive range of onlineproducts and additional customer resources and services.

Emerald is both COUNTER 4 and TRANSFER compliant. The organization is a partner of the Committee on PublicationEthics (COPE) and also works with Portico and the LOCKSS initiative for digital archive preservation.

*Related content and download information correct at time of download.Dow

nloa

ded

by U

NIV

ER

SIT

Y O

F G

RE

EN

WIC

H A

t 09:

48 2

2 O

ctob

er 2

014

(PT

)

Page 2: Performanta serviciilor in sectorul public

IJOPM15,4

80

Performance measurementsystem design

A literature review and research agendaAndy Neely, Mike Gregory and Ken Platts

Manufacturing Engineering Group, University of Cambridge, UK

IntroductionWhen you can measure what you are speaking about, and express it in numbers, you knowsomething about it…[otherwise] your knowledge is of a meagre and unsatisfactory kind; itmay be the beginning of knowledge, but you have scarcely in thought advanced to the stage ofscience. (Lord Kelvin, 1824-1907).

Performance measurement is a topic which is often discussed but rarelydefined. Literally it is the process of quantifying action, where measurement isthe process of quantification and action leads to performance. According to themarketing perspective, organizations achieve their goals, that is they perform,by satisfying their customers with greater efficiency and effectiveness thantheir competitors[1]. The terms efficiency and effectiveness are used precisely inthis context. Effectiveness refers to the extent to which customer requirementsare met, while efficiency is a measure of how economically the firm’s resourcesare utilized when providing a given level of customer satisfaction. This is animportant point because it not only identifies two fundamental dimensions ofperformance, but also highlights the fact that there can be internal as well asexternal reasons for pursuing specific courses of action[2]. Take, for example,one of the quality-related dimensions of performance – product reliability. Interms of effectiveness, achieving a higher level of product reliability might leadto greater customer satisfaction. In terms of efficiency, it might reduce the costsincurred by the business through decreased field failure and warranty claims.Hence the level of performance a business attains is a function of the efficiencyand effectiveness of the actions it undertakes, and thus:

● Performance measurement can be defined as the process of quantifyingthe efficiency and effectiveness of action.

● A performance measure can be defined as a metric used to quantify theefficiency and/or effectiveness of an action[3].

This paper was produced during the research project – Manufacturing Strategy andPerformance Measurement – which was sponsored by the ACME Directorate of SERC undergrant number GR/H 21470.

International Journal of Operations& Production Management, Vol. 15No. 4, 1995, pp. 80-116. © MCBUniversity Press, 0144-3577

Dow

nloa

ded

by U

NIV

ER

SIT

Y O

F G

RE

EN

WIC

H A

t 09:

48 2

2 O

ctob

er 2

014

(PT

)

Page 3: Performanta serviciilor in sectorul public

Performancemeasurement

system design

81

● A performance measurement system can be defined as the set of metricsused to quantify both the efficiency and effectiveness of actions[4,5].

Even with these definitions performance measurement remains a broad topic.Hence this article will focus on the issues associated with the design ofperformance measurement systems, rather than the detail of specific measures.

The remainder of the article has been split into four main sections. It isstructured around the framework shown in Figure 1, which highlights the factthat a performance measurement system can be examined at three differentlevels:

(1) the individual performance measures;

(2) the set of performance measures – the performance measurementsystem as an entity; and

(3) the relationship between the performance measurement system and theenvironment within which it operates.

As an example of this, take the “set of measures” shown in Table I, which areused by a leading US supplier of computer-based information-processingequipment[6]. At the level of the individual measure, this “performancemeasurement system” can be analysed by asking questions such as:

● What performance measures are used?

● What are they used for?

● How much do they cost?

● What benefit do they provide?

Figure 1.A framework for

performancemeasurement system

design

Individualmeasures

Individualmeasures

Individualmeasures

Individualmeasures

Performancemeasurement

system

Theenvironment

Dow

nloa

ded

by U

NIV

ER

SIT

Y O

F G

RE

EN

WIC

H A

t 09:

48 2

2 O

ctob

er 2

014

(PT

)

Page 4: Performanta serviciilor in sectorul public

IJOPM15,4

82

At the next higher level, the system can be analysed by exploring issues suchas:

● Have all the appropriate elements (internal, external, financial, non-financial) been covered?

● Have measures which relate to the rate of improvement been introduced?● Have measures which relate to both the long- and short-term objectives

of the business been introduced?● Have the measures been integrated, both vertically and horizontally?● Do any of the measures conflict with one another?

And at the highest level, the system can be analysed by assessing:● whether the measures reinforce the firm’s strategies;● whether the measures match the organization’s culture;

Table I.Typical monthlyperformance measures

Category of measure Measures used

Shipments ActualPerformance-to-build planCurrent backlog

Inventories Total (weeks and $)ScrapExcessObsolete

Variances Purchase priceProduction burdenMaterials acquisitionMaterials burdenMaterials usageLabour

Labour performance EfficiencyUtilizationProductivityOverhead percentageOvertimeAbsenteeismIndirect: direct radio

Capital AppropriationsExpenditures

Spending Salaries and benefitsControllable expensesNon-controllable expenses

Headcount DirectIndirectTotalBy functional areas

(Adapted from [6])

Dow

nloa

ded

by U

NIV

ER

SIT

Y O

F G

RE

EN

WIC

H A

t 09:

48 2

2 O

ctob

er 2

014

(PT

)

Page 5: Performanta serviciilor in sectorul public

Performancemeasurement

system design

83

● whether the measures are consistent with the existing recognition andreward structure;

● whether some measures focus on customer satisfaction;

● whether some measures focus on what the competition is doing.

In the next three sections the issues surrounding these, and similar questions,will be explored more fully. In the fourth, the implications of this review will besummarized in the form of a research agenda.

Individual measures of performanceAs Figure 1 shows, all performance measurement systems consist of a numberof individual performance measures. There are various ways in which theseperformance measures can be categorized, ranging from Kaplan andNorton’s[7] balanced scorecard through to Fitzgerald et al.’s[8] framework ofresults and determinants. The rationale underlying this article is thatperformance measures need to be positioned in a strategic context, as theyinfluence what people do. Measurement may be the “process of quantification”,but its affect is to stimulate action, and as Mintzberg[9] has pointed out, it isonly through consistency of action that strategies are realized.

Following their review of the manufacturing strategy literature, Leong etal.[10] claim that it is widely accepted that the manufacturing task, and hencethe key dimensions of manufacturing’s performance, can be defined in terms ofquality, delivery speed, delivery reliability, price (cost), and flexibility. Despitethis assertion, however, confusion still exists over what these generic termsactually mean. Wheelwright[11], for example, uses flexibility in the context ofvarying production volumes, while Tunälv[12] uses it to refer to a firm’s abilityto introduce new products rapidly. And, as shown in Table II, other authorssuch as Garvin[13], Schonberger[14], Stalk[15], Gerwin[16], and Slack[17] haveall pointed out that the generic terms quality, time[18] cost and flexibilityencompass a variety of different dimensions[19]. It would be impractical, then,

Table II.The multiple

dimensions of quality,time, cost and

flexibility

Quality Time FlexibilityQ1: Performance T1: Manufacturing lead time F1: Material qualityQ2: Features T2: Rate of production introduction F2: Output qualityQ3: Realiability T3: Deliver lead time F3: New productQ4: Conformance T4: Due-date performance F4: Modify productQ5: Technical durability T5: Frequency of delivery F5: DeliverabilityQ6: Serviceability F6: VolumeQ7: Aesthetics Cost F7: MixQ8: Perceived quality C1: Manufacturing cost F8: Resource mixQ9: Humanity C2: Value addedQ0: Value C3: Selling price

C4: Running costC5: Service cost

Dow

nloa

ded

by U

NIV

ER

SIT

Y O

F G

RE

EN

WIC

H A

t 09:

48 2

2 O

ctob

er 2

014

(PT

)

Page 6: Performanta serviciilor in sectorul public

IJOPM15,4

84

to review all the possible measures of manufacturing’s performance in thisarticle. Hence only a selection of the most important measures relating toquality, time, cost and flexibility will be discussed.

One of the problems with the performance measurement literature is that it isdiverse. This means that individual authors have tended to focus on differentaspects of performance measurement system design. Business strategists andorganizational behaviourists, for example, have explored the rationaleunderlying the use of performance measures more fully than the productionand operations management community, and as this is an important part of theperformance measurement system design process, it will also be explored inthis section.

One of the techniques used to gather data for this review was a survey whichexamined the use of performance measures in UK SMEs – small and medium-sized manufacturing enterprises[20,21]. A key finding of that survey was thatthe cost of measurement is an issue of great concern to managers in SMEs.Indeed in the free-form section of the questionnaire one of the respondentswrote:

For small and medium sized companies often the best justification is “feel”, even when thenumbers don’t add up. Measurement is a luxury – success and failure are obvious.

To date, however, the authors have been unable to identify any studies whichhave explored how the cost-benefit relationship of performance measures canbe analysed.

Performance measures relating to qualityTraditionally quality has been defined in terms of conformance to specificationand hence quality-based measures of performance have focused on issues suchas the number of defects produced and the cost of quality. Feigenbaum[22] wasthe first to suggest that the true cost of quality is a function of the prevention,appraisal and failure costs. Campanella and Corcoran[23] offer the following asdefinitions of these three types of cost:

Prevention costs are those costs expended in an effort to prevent discrepancies, such as thecosts of quality planning, supplier quality surveys, and training programmes;

Appraisal costs are those costs expended in the evaluation of product quality and in thedetection of discrepancies, such as the costs of inspection, test, and calibration control;

Failure costs are those costs expended as a result of discrepancies, and are usually dividedinto two types:

● Internal failure costs are costs resulting from discrepancies found prior to delivery of theproduct to the customer, such as the costs of rework, scrap, and material review;

● External failure costs are costs resulting from discrepancies found after delivery of theproduct to the customer, such as the costs associated with the processing of customercomplaints, customer returns, field services, and warranties.

Crosby’s assertion[24] that “quality is free” is based on the assumption that, formost firms, an increase in prevention costs will be more than offset by adecrease in failure costs. Basically, the logic underlying the cost of quality

Dow

nloa

ded

by U

NIV

ER

SIT

Y O

F G

RE

EN

WIC

H A

t 09:

48 2

2 O

ctob

er 2

014

(PT

)

Page 7: Performanta serviciilor in sectorul public

Performancemeasurement

system design

85

literature is that for a given set of organizational conditions there is an optimallevel of quality. The cost of quality is a measure of the extra cost incurred by theorganization because it is either under- or over-performing. It is commonlyargued that this can be as much as 20 per cent of net sales[23].

Plunkett and Dale[25] point out that, although conceptually appealing, theacademic rigour of the cost of quality model is debatable. It is based onassumptions and estimates, rather than on data. And like the economic orderquantity (EOQ) model, it is questionable whether an optimum level of qualityreally exists. More relevant for performance measurement system design,however, is the point made by Crosby[26]. He says that many companies fail tointegrate the cost of quality model with their management process. That is,although managers estimate the cost of quality they fail to take appropriateactions to reduce it.

With the advent of total quality management (TQM) the emphasis has shiftedaway from “conformance to specification” and towards customer satisfaction.As a result the use of customer opinion surveys and market research hasbecome more widespread. The establishment of the Malcolm Baldrige NationalQuality Award in the USA and the European Quality Award reflects this trend.Table III (adapted from[27])shows how entrants for the 1991 Baldrige Awardwere judged. Note how the strongest weighting was given to customersatisfaction (300 of the 1,000 points available).

Other common measures of quality include statistical process control[28,29]and the Motorola six-sigma concept. Motorola is one of the world’s leadingmanufacturers and suppliers of semiconductors. It set a corporate quality goalof achieving six-sigma capability (3.4 defects per million parts) by 1992 andrecently one 27 person unit reported its 255th consecutive week without a singledefect[30]. These last two measures of quality raise an important issue relevantto performance measurement system design because they focus on themeasurement of the process rather than the output.

Performance measures relating to timeTime has been described as both a source of competitive advantage and thefundamental measure of manufacturing performance[15,31]. Under the just-in-time (JIT) manufacturing philosophy the production or delivery of goods justtoo early or just too late is seen as waste[32]. Similarly, one of the objectives ofoptimized production technology (OPT) is the minimization of throughputtimes[33].

Galloway and Waldron[34-37] have developed a time-based costing systemknown as throughput accounting. It is based on the following threeassumptions[38]:

(1) Manufacturing units are an integrated whole whose operating costs in the short term arelargely predetermined. It is more useful and infinitely simpler to consider the entire cost,excluding material, as fixed and to call the cost the “total factory cost”.

Dow

nloa

ded

by U

NIV

ER

SIT

Y O

F G

RE

EN

WIC

H A

t 09:

48 2

2 O

ctob

er 2

014

(PT

)

Page 8: Performanta serviciilor in sectorul public

IJOPM15,4

86

(2) For all businesses, profit is a function of the time taken to respond to the needs of themarket. This in turn means that profitability is inversely proportional to the level ofinventory in the system, since the response time is itself a function of all inventory.

(3) It is the rate at which a product contributes money that determines relative productprofitability. And it is the rate at which a product contributes money compared to the rateat which the factory spends it that determines absolute profitability[34].

Table III.Scoring the 1991Baldrige award

1.0 Leadership (100 points)1.1 Senior executive leadership (40)1.2 Quality values (15)1.3 Management for quality (25)1.4 Public responsibility (20

2.0 Information and analysis (70 points)2.1 Scope and management of quality data and information (20)2.2 Competitive comparisons and benchmarks (30)2.3 Analysis of quality data and information (20)

3.0 Strategic quality planning (60 points)3.1 Strategic quality planning process (35)3.2 Quality goals and plans (25)

4.0 Human resource utilization (150 points)4.1 Human resource management (20)4.2 Employee involvement (40)4.3 Quality education and training (40)4.4 Employee recognition and performance measurement (25)4.5 Employee well-being and morale (25)

5.0 Quality assurance of products and services (150 points)5.1 Design and introduction of quality products and services (35)5.2 Process quality control (20)5.3 Continuous improvement of processes (20)5.4 Quality assessment (15)5.5 Documentation (10)5.6 Business process and support service quality (20)5.7 Supplier quality (20)

6.0 Quality results (180 points)6.1 Product and service quality results (90)6.2 Business processes, operational, and supportive service quality results (50)6.3 Supplier quality results (40)

7.0 Customer satisfaction (300 points)7.1 Determining customer requirements and expectations (30)7.2 Customer relationship management (50)7.3 Customer service standards (20)7.4 Commitment to customers (15)7.5 Complaint resolution for quality improvement (25)7.6 Determining customer satisfaction (20)7.7 Customer satisfaction results (70)7.8 Customer satisfaction comparison (70)

(Adapted from [27])

Dow

nloa

ded

by U

NIV

ER

SIT

Y O

F G

RE

EN

WIC

H A

t 09:

48 2

2 O

ctob

er 2

014

(PT

)

Page 9: Performanta serviciilor in sectorul public

Performancemeasurement

system design

87

Galloway and Waldron’s philosophy is that contribution should be measured interms of the rate at which money is received rather than as an absolute. Hencethey define the key throughput accounting ratio as return per factory hourdivided by cost per factory hour, where:

Practical examples of the use of throughput accounting are still rare. One of thefirst to be published was the case of Garrett Automotive[39]. In that report itwas claimed that throughput accounting was one of the factors that helpedGarrett Automotive double its profits. Interestingly the authors also identifiedthree problems with the adoption of throughput accounting, namely:

(1) It can be difficult to identify the constraints and the bottlenecks correctly.(2) The reduction in stocks and work-in-progress stemming from the use of

throughput accounting means a short-term profit problem as feweroverheads are carried forward in the stocks. This gives a one-off profitreduction.

(3) Reduced inventories highlight problems which have been hidden foryears.

Of course, one can argue that these problems are not due to throughputaccounting. Indeed they appear to be more akin to the problems (andopportunities) that arise from an OPT implementation.

In a different context House and Price[40] recommend the use of the Hewlett-Packard return map to monitor the effectiveness of the new productdevelopment process. Figure 2 shows a sample return map. Note how itintegrates the dimensions of time and cost. House and Price argue that thereturn map can be used to provide superordinate goals which, given anappropriate organizational culture, will encourage managers from marketing,R&D and manufacturing to work together.

Fooks[41] reports that Westinghouse has used similar cost-time profiles formore than a decade. Basically the idea is that any set of business activities orprocesses can be defined as a collection of costs over time. In Westinghouse:

Vertical lines on the profile represent purchased materials and services. In a factory, they’rethe raw materials; in the office, they’re supplies, outside services, and information. Diagonallines represent work – dollars per hour of cost added over time. The slope of the line dependson the workers’ pay rates and on the duration of the work. Horizontal lines are wait times –when nothing is happening to the process but time is going by. In factories the material sits instorerooms or in the aisles; in offices, the information resides in in-baskets or electronicallyinside computers. Reducing wait times – which can approach 95 per cent of elapsed time –offers the most immediate opportunity for improvement efforts[41].

Return per factory hour = Sale price – material costTime on the key resource

Cost per factory hour =Total factory cost

Total time available on the key resource

Dow

nloa

ded

by U

NIV

ER

SIT

Y O

F G

RE

EN

WIC

H A

t 09:

48 2

2 O

ctob

er 2

014

(PT

)

Page 10: Performanta serviciilor in sectorul public

IJOPM15,4

88

On an alternative tack, an interesting approach to the design of time-basedperformance measures is proposed by Azzone et al.[42]. They suggest thatcompanies that seek to employ time as a means of competitive advantageshould use the generic set of measures shown in Table IV. Note how these reflectthe efficiency (internal configuration) and effectiveness (external configuration)dimensions of performance that were discussed earlier.

Performance measures relating to costThe development of management accounting has been well documented byJohnson[43-49], among others, and his work shows that many of the

Table IV.Measures for time-based competition

Internal configuration External configuration

R&D Number of changes in Development time forEngineering time projects new products

∆ average time between twosubsequent innovations

Operations Adherence to due dates Outgoing qualityThroughput time Incoming quality Manufacturing cost

Distance travelledValue-added time (as apercentage of total time)Schedule attainment

Sales and marketing Complexity of procedures Cycle timeOrder processing lead time Size of batches of information Bid time

(Adapted from [42])

Figure 2.The Hewlett-Packardreturn map

DevelopmentInvestigation Manufacturing sales

Time (months)

Sales

Profit

Investment

Cumulative costs and revenues(millions of pounds)

Source: Adapted from [40]

Dow

nloa

ded

by U

NIV

ER

SIT

Y O

F G

RE

EN

WIC

H A

t 09:

48 2

2 O

ctob

er 2

014

(PT

)

Page 11: Performanta serviciilor in sectorul public

Performancemeasurement

system design

89

management accounting systems used today are based on assumptions thatwere made 60 years ago. Indeed Garner’s[50] review of the accounting literatureindicates that most of the so-called sophisticated cost accounting theories andpractices had been developed by 1925[51]. Take, for example, return oninvestment (ROI). In 1903 the three DuPont cousins decided to adopt adecentralized structure for their diverse business interests and hence had toface the problem of how they could control them.

Pierre DuPont rejected the [then] widely used measure of profits or earnings as a percentageof sales or costs, because it failed to indicate the rate of return on capital invested[51].

Instead, DuPont developed the accounting measure return on investment andused this to assess both the efficiency of each business unit, and DuPont’ssuccess as a whole[44,52]. ROI is still commonly used for the same purposetoday, but it is now widely recognized that it often induces short-termism[53].

Johnson and Kaplan’s[54] thesis is that because the business environment haschanged dramatically in the last 60 years, management accounting is based onassumptions which are no longer valid. One of the most widely criticizedpractices is the allocation of indirect labour and overhead according to the directlabour cost[54]. In the early 1900s direct labour made up the majority of the fullproduct cost. Hence it made sense to allocate overheads to products accordingto their direct labour content. With the increasing use of advancedmanufacturing technologies, however, direct labour cost now typically accountsfor only 10-20 per cent of the full product cost[55], while overhead constitutes30-40 per cent[56]. This leads to overhead burdens of between 400 and 1,000 percent and hence even a relatively small change in a product’s direct labourcontent can have a massive impact on its cost structure. Furthermore, thepractice of allocating overheads according to direct labour hours encouragesmanagers to concentrate on trying to minimize the number of direct labourhours attributed to their cost centre, while ignoring overhead. Johnson andKaplan[54] argue that these problems are likely to become more severe in thefuture as product life cycles become shorter and hence an ever increasingproportion of the full product cost will take the form of research anddevelopment overhead.

It should be noted that there is not universal agreement within the accountingfraternity that Johnson and Kaplan[54] are correct. In 1988 the UK’s CharteredInstitute of Management Accountants (CIMA) instigated a study designed toexplore what criticisms were then being levelled at management accounting inthe USA. Following their investigation, Bromwich and Bhimani[57] reportedthat the perception at the time was that management accounting systems had ashort-term orientation, lacked a strategic focus, relied on redundantassumptions concerning the manufacturing process, and were too often used toprovide data for external financial reporting rather than that which wasnecessary for managing the business. In their conclusion, however, Bromwichand Bhimani contend that contrary to the suggestions of, among others,Johnson and Kaplan:

Dow

nloa

ded

by U

NIV

ER

SIT

Y O

F G

RE

EN

WIC

H A

t 09:

48 2

2 O

ctob

er 2

014

(PT

)

Page 12: Performanta serviciilor in sectorul public

IJOPM15,4

90

The evidence and arguments advanced by advocates of wholesale changes in managementaccounting are not yet sufficient to justify the wholesale revision of management accountingin the UK[57].

Murphy and Braund[56] support this thesis and question the academic rigourof some of the data with which Kaplan supports his position.

As a result of the criticisms levelled at traditional management accountingCooper[58-63] developed an approach known as activity-based costing (ABC).Jeans and Morrow[64] argue that ABC overcomes many of managementaccounting’s traditional problems, such as:

● Management accounting has become distorted by the needs of financialreporting; in particular, costing systems are driven by the need to valuestock, rather than to provide meaningful product costs.

● Direct labour has shrunk as a percentage of total cost for the majority ofmanufacturing companies, yet it is still by far the most common basis ofloading overheads onto products.

● Overhead costs are no longer a mere burden to be minimized. Overheadfunctions such as product design, quality control, customer service,production planning and sales order processing are as important to thecustomer as are the physical processes on the shopfloor.

● Complexity has increased. Production processes are more complex,product ranges wider, product life cycles shorter, quality higher.

● The marketplace is more competitive. Global competition is a reality inmost sectors. Every business should be able to assess the trueprofitability of the sectors it trades in, understand product costs andknow what drives overhead. The cost management systems shouldsupport process improvement and the performance measures should bein line with strategic and commercial objectives.

In 1985 Miller and Vollmann[65] pointed out that while many managers focuson the visible costs, e.g. direct labour, direct material, etc., the majority ofoverheads are caused by the “invisible” transaction costs. Cooper appears tohave based his early work on this paper and the central assumption underlyingABC is that it is activities, and not products, that cause cost:

In an activity-based costing system, the cost of a product is the sum of all activities requiredto manufacture and deliver the product[60].

Troxel and Weber[66] say that ABC is not really a new concept and they suggestthat it has undergone three phases of development. In the first, ABC was notformally recognized. It was merely seen as an advanced version of traditionalcost accounting and indeed there is evidence to suggest that some firms wereusing ABC in the late 1960s. In the second phase of its development, which wasled by academics such as Cooper, ABC became recognized as a costing systemin its own right, but many practitioners were put off by the thought that theymight have to scrap their existing cost accounting system in order to introduce

Dow

nloa

ded

by U

NIV

ER

SIT

Y O

F G

RE

EN

WIC

H A

t 09:

48 2

2 O

ctob

er 2

014

(PT

)

Page 13: Performanta serviciilor in sectorul public

Performancemeasurement

system design

91

it. Troxel and Weber[66] argue that it was not until phase three that it wasfinally recognized that ABC was not an alternative to traditional costaccounting, but that it could be used as part of the strategic decision-makingprocess. Hence they agree with Kaplan’s[67] assertion that there should be adisconnection between external financial reporting and the systems used togather information for strategic decision making.

Despite the fact that ABC is seen by many US firms as a low payoff activityit has been subject to only a small amount of criticism[68]. Piper andWalley[69,70] question whether the assumption that activities cause cost isvalid. They point out that the claim that ABC provides more accurate productcosts has, first, never been proven and second, is based on a comparison withtraditional cost accounting systems, rather than more modern ones such ascontribution costing. Allen[71] pursues a similar theme when he points out thatABC bases its analysis on the existing cost structure of a firm. Hence it does notencourage managers think radically and examine whether their businessprocesses can be redesigned. Other problems with ABC include the fact that itignores opportunity costs[72] and that most of the data required has to beestimated[73]. Currently the view in the ascendancy appears to be that ABC isbest used as a means of analysis on a one-off basis[66]. Indeed, a KPMGconsultant recently told one of the authors that in his experience the greatestbenefit of ABC was that it forced managers to think about how they couldreduce the hidden or “invisible” cost of transaction.

For the purpose of this article, then, it is sufficient to note that ABC wasoriginally driven by the need to generate more accurate product costs. Recently,however, it has become increasingly apparent that ABC’s benefit is largely afunction of process analysis[74]. This links in with the concept of businessprocess redesign which involves looking at information which flows across,rather than down through, an organization. For the interested reader a fullerreview of the ABC literature is provided by Innes and Mitchell[75,76].

Another widely documented cost-based performance measure isproductivity. This is conventionally defined as the ratio of total output to totalinput[77]. Hence:

Productivity is a measure of how well resources are combined and used to accomplish specific,desirable results[78].

Ruch[79] has pointed out that higher productivity can be achieved in a numberof ways, including:

● increasing the level of output faster than that of the input (managedgrowth);

● producing more output with the same level of input (working smarter);

● producing more output with a reduced level of input (the ideal);

● maintaining the level of output while reducing the input (greaterefficiency);

Dow

nloa

ded

by U

NIV

ER

SIT

Y O

F G

RE

EN

WIC

H A

t 09:

48 2

2 O

ctob

er 2

014

(PT

)

Page 14: Performanta serviciilor in sectorul public

IJOPM15,4

92

● decreasing the level of output, but decreasing the level of input more(managed decline).

Problems arise with the measurement of productivity because it is difficult notonly to define inputs and outputs, but also to quantify them[77]. Craig andHarris[80] suggest that firms should seek to measure total, rather than partial,productivity. This point was later picked up by Hayes et al.[81] when theydescribed how firms could measure total factor productivity. For the interestedreader more detail on productivity is available in[82-88].

Performance measures relating to flexibilitySlack[89] identifies range, cost and time as dimensions of flexibility, althoughhe later modifies this model so that it includes only range and response, whererange refers to the issue of how far the manufacturing system can change andresponse focuses on the question of how rapidly and cheaply it can change[17].

Gerwin[16] observes that very little is known about the implications offlexibility for manufacturing management and suggests that “part of theproblem arises from the lack of operational measures of flexibility”. Afteridentifying various dimensions of flexibility he suggests the followingmeasures:

Mix flexibility can be measured by the number of components handled by theequipment…Where this figure is heavily influenced by short term fluctuations in marketdemand an average over a given time period can be used. However, the range of componentcharacteristics handled is probably a more sophisticated measure of mix flexibility. Amanufacturing process may handle a small number of different components but they may bevery different from each other. Another measure to consider is the ratio of the number ofcomponents processed by the equipment to the total number processed by the factory.

One possible raw measure of changeover flexibility is the number of componentsubstitutions made over a given time period. However, a correction also needs to be made herefor the degree to which the new and old component differ from each other. There may be a lowfrequency of changes but they may involve very dissimilar components. An alternativeapproach suggested by Gustavsson[90] is to calculate the ratio of the equipment investmentrelevant for the next product to the total investment.

Modification flexibility can be measured in terms of the number of design changes made ina component per time period.

Rerouting flexibility has a long-term aspect which is salient when machines are taken out ofproduction to accommodate major design changes. There is also a short-term aspect whicharises from the necessity to cope with machine shutdowns due to equipment or qualityproblems. A measure for each aspect should reflect the following mutually exclusivepossibilities: it is possible to reroute components directly affected by the stoppage; reroutingis not possible but production of other components produced by the machine ormanufacturing system continues; and all production stops. Alternatively Buzacott[91]suggested measuring rerouting flexibility by the drop in production rate when a machinestoppage occurs.

Volume flexibility needs to be considered at the aggregate level as well as at the level ofindividual components. Volume changes depend upon how high capacity limits are set andhow rigid are these limits. Flexibility can be measured in terms of the average volumefluctuations that occur over a given time period divided by the capacity limit.

Material flexibility exists when there are adjustment mechanisms at one stage of amanufacturing process which identify and then correct or adapt to variations arising

Dow

nloa

ded

by U

NIV

ER

SIT

Y O

F G

RE

EN

WIC

H A

t 09:

48 2

2 O

ctob

er 2

014

(PT

)

Page 15: Performanta serviciilor in sectorul public

Performancemeasurement

system design

93

previously. For example, in manual systems operators can locate bent metal and adjust it orproperly position it in the machines. Otherwise, quality problems and machine breakdownswill mount. Material flexibility can be measured by the extent of variations in key dimensionaland metallurgical properties handled by the equipment. Sequencing flexibility could bemeasured by the number of different sequences handled by the equipment with the lower limitbeing inviable sequence and the upper limit being random processing.

Cox[92] sees the concept of flexibility as a measure of the efficiency with whichthe manufacturing process can be changed. He focuses on the issues of product-mix and volume flexibility and argues that the measures shown in Table V canbe used to operationalize the first of these concepts.

Applications of performance measurement?Managers find it relatively easy to decide what they should be measuring. Asmentioned earlier one of the ways in which data were collected for this reviewwas through a survey of small-to-medium-sized manufacturing enterprises inthe UK. Of the respondents to that survey 69 per cent agreed or strongly agreedwith the statement “we find it easy to decide which of the financial aspects ofmanufacturing we should be measuring”, while 72 per cent agreed or stronglyagreed with the statement “we find it easy to decide which of the non-financialaspects of manufacturing (quality, lead times, etc.) we should be measuring”.Indeed it could be argued that managers find it too easy to decide what theyshould be measuring. When, for example, four senior managers were recentlyasked to identify what they thought should be measured in their firm theyidentified over 100 different measures. The problem facing this particularcompany, then, is not identifying what could be measured, but reducing the listof possible measures to a manageable set. One way of doing this might be to

Table V.Measures of product-

mix

Element Measure(s)

Vendor lead time Percentage which can be obtained in X days or lessLabour: job classes 100 less the number of job classesLabour: cross training Percentage of workforce trained to do two or more jobsLabour: transference Percentage of workforce doing more than one production job in

any given monthSet-up time Percentage of equipment changed over in X minutes or lessCycle time Make time divided by total time in systemProgrammable equipment Percentage of equipment programmableMultipurpose equipment Percentage of equipment with multiple versus single productLot size Percentage of products for which economic lot size is smaller than X“Pull” production Percentage of product made under kanban or a similar systemWIP inventory Work-on-station divided by total work on floor(Adapted from [92])

Dow

nloa

ded

by U

NIV

ER

SIT

Y O

F G

RE

EN

WIC

H A

t 09:

48 2

2 O

ctob

er 2

014

(PT

)

Page 16: Performanta serviciilor in sectorul public

IJOPM15,4

94

explore the rationale underlying the introduction of specific measures ofperformance.

In the manufacturing literature it is frequently argued that performancemeasures should be derived from strategy; that is, they should be used toreinforce the importance of certain strategic variables[93]. And although thisdoes not always appear to happen in reality[21], the link between performancemeasurement and strategy has been extensively explored in the businessstrategy literature. Many of the early writers on strategy, such as Andrews[94]and Ansoff[95], believed that strategies were synonymous with plans. In reality,however, strategies are much more complex because they evolve as decisionsare made and courses of action are pursued[9]. Take, for example, Nissan,where the espoused business strategy is “to build profitably the highest qualitycar sold in Europe”[96]. If Nissan’s purchasing manager were to decideindependently to buy low-cost, low-quality components then Nissan could endup following a strategy radically different to the one it had planned to adopt.

The key theme here is consistency – consistency of both decision making andaction – because a strategy is only realized as decisions are made and coursesof action are pursued. Indeed, it has been argued that a strategy can only besaid to exist when one can identify a consistent pattern of decisions and actionwithin a firm[9]. Hence an important question is how can one induceconsistency of decision making and action within a firm?

Hrebiniak and Joyce[97] argue that as humans are “calculative receptors” astrategic control system can be used to influence their behaviour. The processstarts with the receipt of a stimulus; an external crisis, the normal job/taskdemands, or a request from a supervisor. Calculative receptors interpret thestimulus, assess the perceived costs and benefits of various responses and arelikely to select whichever course of action they believe will maximize their gain.Control, which in this context includes performance measurement andfeedback, follows action. Finally, rewards or sanctions are used to reinforce ormodify behaviour depending on the employee’s performance and on theappropriateness of the course of action pursued. Hence to business strategiststhe rationale underlying the introduction of a performance measure is that it isone element of a strategic control system and can be used to influencebehaviour.

Despite the academic interest in strategic control systems, there has beenrelatively little empirical research on their use[98]. In 1979 Horovitz surveyed 52European companies and found little evidence to suggest that firms actuallyuse strategic controls. He does, however, describe how one firm used theirperformance measurement system to reinforce the importance of customerservice:

In one British electronics company investigated, top management has in fact put emphasis ona critical element in control. Whereas usual performance (i.e. financial results) is only reportedevery quarter, top management closely monitors – daily if needed – customer satisfaction,explicitly defined to be its major strategic strength: no equipment sold to a customer shall bedown for more than 12 hours. To check on this, every morning and afternoon the chiefexecutive is warned when any equipment has been down for more than 12 hours and

Dow

nloa

ded

by U

NIV

ER

SIT

Y O

F G

RE

EN

WIC

H A

t 09:

48 2

2 O

ctob

er 2

014

(PT

)

Page 17: Performanta serviciilor in sectorul public

Performancemeasurement

system design

95

corrective action is immediately taken at the highest level to replace or send a part to thecustomer. A systematic procedure has been set up whereby a service man unable to repairequipment within 2 hours notifies his superior, who in turn notifies his superior after 2 morehours (and so on up to the chief executive) in order to allow close control over what has beendefined as a distinctive competence by the company: no down time whatever the costs[99].

More recently Goold and Quinn[100] surveyed 200 of the largest Britishcompanies and report that only 11 per cent of them claimed to have a strategiccontrol system. Interestingly these findings can be contrasted with those ofDaniel and Reitsperger[101]. They surveyed 26 Japanese automotive andconsumer electronics firms and found that:

Japanese firms have taken to heart the strategic management literature advocating strategiccontrols…Our findings indicate that modifications of management control systems byJapanese manufacturers are applied in Japanese plants as well as in operations abroad. Thesefindings and the success of Japanese manufacturers in penetrating world markets support thenormative theory that management control systems should be modified to fit strategy (p. 616).

There is also some evidence that Japanese firms use their managementaccounting systems to influence behaviour. Hiromoto[102] argues that “high-level Japanese managers seem to worry less about whether an overheadallocation system reflects the precise demands each product makes oncorporate resources than about how the system affects the cost-reductionpriorities of middle managers and shop-floor workers”. Morgan andWeerakoon[103] concur with this view. It should be noted, however, that this isnot a particularly novel idea. Indeed as long ago as 1974 Hopwood[104]suggested that managers should pay more attention to the behaviouralimplications of management accounting.

Data on Japanese management accounting practices are relatively rare.Yoshikawa et al.[105] surveyed 200 Scottish and 500 Japanese manufacturingcompanies and found that the Japanese cost accountants placed more emphasison the generation of financial statements than did the Scottish ones. It isinteresting to compare this finding with Johnson and Kaplan’s[54] assertionthat the fact that cost accounting is driven by the need to generate financialstatements is one of its greatest weaknesses. Yoshikawa et al.[105] also foundthat Japanese companies pay more attention to product costing at the designstage. This is consistent with Sakurai’s[106] paper on target costing. He arguesthat many Japanese firms, especially those in the automotive industry, assumethat selling prices are set by the market and hence if a company knows whatreturn it wants to achieve, a target cost for the product can be established.

Simons[107-109] extends the notion that performance measures can be usedto influence behaviour. He argues that management control systems can also beused as a means of surveillance, motivation, monitoring performance,stimulating learning, sending signals or introducing constraints. These are allthemes that will be picked up again later. Before then, however, we will explorethe concept of a performance measurement system as an entity and also therelationship between the performance measurement system and theenvironment within which it operates.

Dow

nloa

ded

by U

NIV

ER

SIT

Y O

F G

RE

EN

WIC

H A

t 09:

48 2

2 O

ctob

er 2

014

(PT

)

Page 18: Performanta serviciilor in sectorul public

IJOPM15,4

96

The performance measurement system as an entityThe previous section focused on the individual measures which togetherconstitute a performance measurement system. This one will adopt a slightlydifferent slant and examine the performance measurement system as a whole.It will begin by identifying the various dimensions of a performancemeasurement system. Then it will review a specific performance measurementsystem design process which has been proposed.

As discussed earlier Leong, et al.[10] have suggested that the manufacturingtask, and hence the key dimensions of manufacturing performance, can bedefined in terms of quality, time, price (cost), and flexibility. Other authors takea different stance. Following their study of performance measurement in theservice sector, Fitzgerald et al.[8] suggest that there are two basic types ofperformance measure in any organization – those that relate to results(competitiveness, financial performance), and those that focus on thedeterminants of the results (quality, flexibility, resource utilization andinnovation). This suggests that it should be possible to build a performancemeasurement framework around the concepts of results and determinants.

Perhaps the best known performance measurement framework is Kaplanand Norton’s[7] “balanced scorecard” (see Figure 3), which is based on theprinciple that a performance measurement system should provide managerswith sufficient information to address the following questions:

● How do we look to our shareholders (financial perspective)?

● What must we excel at (internal business perspective)?

● How do our customers see us (customer perspective)?

● How can we continue to improve and create value (innovation andlearning perspective)?

Figure 3.The balanced scorecard

Financial perspective

How do we look toour shareholders?

Innovation andlearning perspective

Can we continue toimprove and create value?

Customer perspective

How do ourcustomers see us?

Internal businessperspective

What must we excel at?

Source: Adapted from [7]

Dow

nloa

ded

by U

NIV

ER

SIT

Y O

F G

RE

EN

WIC

H A

t 09:

48 2

2 O

ctob

er 2

014

(PT

)

Page 19: Performanta serviciilor in sectorul public

Performancemeasurement

system design

97

Consultants from KPMG claim to have used the balanced scorecardsuccessfully both internally and with a number of their clients, but while itprovides a useful framework there is little underlying it, in terms of the processof performance measurement system design[110,111]. In addition, the balancedscorecard contains a serious flaw because if a manager were to introduce a setof measures based solely on it, he would not be able to answer one of the mostfundamental questions of all – what are our competitors doing (the competitorperspective)?

Keegan et al.[112] proposed a similar, but lesser known performancemeasurement framework – the performance measurement matrix. As with thebalanced scorecard, its strength lies in the way it seeks to integrate differentdimensions of performance, and the fact that it employs the generic terms“internal”, “external”, “cost” and “non-cost” enhances its flexibility. That is, theperformance measurement matrix should be able to accommodate any measurewhich fits within the framework provided by the balanced scorecard, while theconverse – take, for example, competitor performance – may not be true.

Rather than proposing frameworks, other authors prefer to provide criteriafor performance measurement system design. Globerson[113], for example,suggests that the following guidelines can be used to select a preferred set ofperformance criteria:

● Performance criteria must be chosen from the company’s objectives.

● Performance criteria must make possible the comparison oforganizations which are in the same business.

● The purpose of each performance criterion must be clear.

● Data collection and methods of calculating the performance criterionmust be clearly defined.

● Ratio-based performance criteria are preferred to absolute number.

● Performance criteria should be under control of the evaluatedorganizational unit.

● Performance criteria should be selected through discussions with thepeople involved (customers, employees, managers).

● Objective performance criteria are preferable to subjective ones.

Similarly, Maskell[114] offers seven principles of performance measurementsystem design:

(1) The measures should be directly related to the firm’s manufacturingstrategy.

(2) Non-financial measures should be adopted.

(3) It should be recognized that measures vary between locations – onemeasure is not suitable for all departments or sites.

(4) It should be acknowledged that measures change as circumstances do.

Dow

nloa

ded

by U

NIV

ER

SIT

Y O

F G

RE

EN

WIC

H A

t 09:

48 2

2 O

ctob

er 2

014

(PT

)

Page 20: Performanta serviciilor in sectorul public

IJOPM15,4

98

(5) The measures should be simple and easy to use.

(6) The measures should provide fast feedback.

(7) The measures should be designed so that they stimulate continuousimprovement rather than simply monitor.

In the late 1980s General Motors invested some $20 million defining a set of 62primary measures that could be applied consistently at various organizationallevels. (See Figure 4). They distinguish between measures of results, e.g. qualityand responsiveness, and measures of the process of strategy implementation(Merchent[115] and Fitzgerald et al.[8] make similar distinctions[116].) Therationale underlying this “integrated” performance measurement system is thatit should ensure that GM employees retain their focus on continuousimprovement through teamwork in the key business activities[117].

Dixon et al.[118] present an interesting structured methodology for auditingwhether a firm’s performance measurement system encourages continuousimprovement. They describe a performance measurement questionnaire (PMQ)which consists of three stages. In the first, general data on both the companyand respondent are collected. In the second, the respondent is asked to identifythose areas of improvement that are of long-term importance to the firm and tosay whether the current performance measurement system inhibits or supportsappropriate activity. In the third, the respondent is asked to compare andcontrast what is currently most important for the firm with what themeasurement system emphasizes.

[Figure 4. General Motors’ integrated performance measurement system: a matrix showing the number of primary measures at each organizational level (corporate; group; divisional/SBU; plant/field; department/cell) for each key business activity (product initiation; operations; marketing, sales and service; retail customer satisfaction; shareholder satisfaction; people development/employee satisfaction). Source: Adapted from [117]]

The data are collected using seven-point Likert scales and then four types of analysis are conducted. The first is alignment analysis, in which the extent of match between the firm’s strategies, actions and measures is assessed. The second is congruence analysis, which provides more detail on the extent to which the strategies, actions and measures are mutually supportive. The third is consensus analysis, in which the data are analysed according to management position or function. And the fourth is confusion analysis, in which the range of responses, and hence the level of disagreement, is examined.
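To illustrate the last two of these analyses, the following minimal sketch computes a consensus comparison and a confusion (disagreement) measure from seven-point Likert responses. The respondent data and the disagreement threshold are hypothetical; Dixon et al. do not prescribe any particular implementation:

```python
# A minimal sketch of the consensus and confusion analyses applied to one
# PMQ item scored on a seven-point Likert scale. The responses and the
# disagreement threshold are hypothetical.
from statistics import mean

# (management function, score) pairs for a single questionnaire item
responses = [
    ("manufacturing", 6), ("manufacturing", 5),
    ("marketing", 2), ("marketing", 3),
    ("finance", 4),
]

# Consensus analysis: compare mean scores across management functions
by_function = {}
for function, score in responses:
    by_function.setdefault(function, []).append(score)
for function, scores in by_function.items():
    print(f"{function}: mean score {mean(scores):.1f}")

# Confusion analysis: the range of responses indicates the level of
# disagreement among respondents
scores = [score for _, score in responses]
spread = max(scores) - min(scores)
print("range of responses:", spread,
      "(high disagreement)" if spread >= 3 else "(broad agreement)")
```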

Hayes and Abernathy[119] focus on a different dimension of performance measurement – namely, the fact that many traditional measures of financial performance encourage managers to adopt a short-term perspective. They hypothesize that the use of short-term financial controls may partly be to blame for the economic decline of the USA:

Although innovation, the lifeblood of any vital enterprise, is best encouraged by an environment that does not unduly penalise failure, the predictable result of relying too heavily on short-term financial measures – a sort of managerial remote control – is an environment in which no one feels he or she can afford a failure or even a momentary dip in the bottom line[119].

One of the most comprehensive studies of short-termism is the one reported by Banks and Wheelwright[53]. They conducted a series of in-depth interviews with managers and planners in six major US firms and found that short-termism encouraged managers to delay capital outlays; postpone operating expenses; reduce operating expenses; and make other operating changes such as varying the product mix, the delivery schedules, or the pricing strategy. Furthermore, they suggest that one of the ways in which short-termism can be minimized is by establishing performance measures which reflect both the short and the long term. This recommendation is supported by Kaplan[120].

Another issue that has to be considered when designing a performance measurement system is conflict, as illustrated by Fry and Cox[121]. They cite the case of a firm where the plant manager was primarily concerned with return on investment, the product group managers were evaluated according to the number of orders that were shipped on time, and the supervisors and operatives were measured according to standard hours produced. This measurement system encouraged the supervisors and operatives to save set-up time by producing batches larger than those scheduled. Hence some orders were delayed and the product group managers had to sanction overtime to ensure good due-date performance. This, in turn, had a negative impact on the plant manager’s performance, which was measured by return on investment.

Finally, the perspective adopted by Computer Aided Manufacturing – International (CAM-I) provides an interesting insight. CAM-I is a consortium of industrialists and academics which sponsors, among other things, a research project on cost management systems (CMS):

The goal of a cost management system is to provide information to help companies use resources profitably to produce services or products that are competitive in terms of cost, quality, functionality, and timing in the world market. Within this context a cost management system can be defined as a management planning and control system with the following objectives:


● To identify the cost of resources consumed in performing significant activities of the firm (accounting models and practices);

● To determine the efficiency and effectiveness of the activities performed (performance measurement);

● To identify and evaluate new activities that can improve the future performance of the firm (investment management);

● To accomplish the three previous objectives in an environment characterised by changing technology (manufacturing practices)[122].

CAM-I projects tend to adopt an activity-based approach. That is, they focus on the process of doing something, rather than the output. As discussed earlier, this has implications for the performance measurement system design task because, as Deming[123] and others have argued, gaining control over a process, and hence producing things right first time, is probably more cost-effective than repairing things after the event.

This section has sought to illustrate the complexity of the performance measurement system. To attempt to produce a single unifying framework at this stage seems unrealistic. Indeed, as Blenkinsop and Davis[124] report, one has to consider all of the following when designing a performance measurement system:

● Departmental goal-setting without creating inconsistencies in policy or excessive interdepartmental conflict.

● Whether the measure is a valid indicator of the performance of the group.

● An appropriate mix of integration and differentiation (i.e. goals set both horizontally and vertically within the framework of the organizational chart).

● A thorough understanding of the existing measurement systems, both formal and informal, spoken and unspoken, as they are perceived.

● Management consensus concerning the organization’s objectives and the means at its disposal for attaining them.

● The corporate culture.

● Long-, short- and medium-term goals (both financial and non-financial), not a fixation with “this month’s” sales figure.

● Part-ownership of problems – so that a solution has to be found across functional boundaries and the escape route, “it’s somebody else’s fault” (often the ethereal “company’s” fault), no longer has any meaning or validation.

● Total commitment from all involved, so that the “end-of-the-month” syndrome – a system driven by sales value – does not rear its ugly head at the end of the first month following implementation and each and every subsequent month thereafter.


The authors are currently involved in a project which adopts the view that the best way to overcome this complexity is by producing a process for designing a measurement system, rather than a framework. Work on this is rare. Indeed, the only examples of processes for the design of performance measurement systems that the authors have been able to identify consist of a set of steps with little underlying content. The work of Wisner and Fawcett[125] is typical. They propose the following nine-step “process” for developing a performance measurement system, but make no attempt to explain how it can be operationalized (a sketch of the measure cascade the steps imply follows the list):

(1) Clearly define the firm’s mission statement.

(2) Identify the firm’s strategic objectives using the mission statement as a guide (profitability, market share, quality, cost, flexibility, dependability, and innovation).

(3) Develop an understanding of each functional area’s role in achieving the various strategic objectives.

(4) For each functional area, develop global performance measures capable of defining the firm’s overall competitive position to top management.

(5) Communicate strategic objectives and performance goals to lower levels in the organization. Establish more specific performance criteria at each level.

(6) Assure consistency with strategic objectives among the performance criteria used at each level.

(7) Assure the compatibility of performance measures used in all functional areas.

(8) Use the performance measurement system to identify competitive position, locate problem areas, assist the firm in updating strategic objectives and making tactical decisions to achieve these objectives, and supply feedback after the decisions are implemented.

(9) Periodically re-evaluate the appropriateness of the established performance measurement system in view of the current competitive environment.
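As a minimal sketch of the cascade implied by steps (2) to (6) – strategic objectives decomposed into functional measures, with a simple consistency check – consider the following; all objective and measure names are hypothetical:

```python
# A minimal sketch of the measure cascade implied by steps (2)-(6):
# functional measures are tied back to strategic objectives, and a simple
# check flags any measure with no strategic parent. All names are
# hypothetical.
strategic_objectives = {"quality", "dependability", "flexibility"}

functional_measures = {
    "manufacturing": {
        "first-pass yield": "quality",
        "schedule adherence": "dependability",
    },
    "marketing": {
        "order-to-quote time": "flexibility",
        "sales calls per week": None,  # not tied to a strategic objective
    },
}

# Step (6): assure consistency with strategic objectives at each level
for function, measures in functional_measures.items():
    for measure, objective in measures.items():
        if objective not in strategic_objectives:
            print(f"{function}: '{measure}' is not linked to any strategic objective")
```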

The performance measurement system and its environment
Once a performance measurement system has been developed it has to be implemented. Among other things this means that the performance measurement system will have to interact with a wider environment. There are two fundamental dimensions to this environment. The first is the internal one – that is, the organization. The second is the external one – that is, the market within which the organization competes. These will be discussed in turn in this section.


The internal environment
Earlier the concept of a strategic control system was introduced. There the performance measurement system is seen as but a part of a wider system which includes goal setting, feedback, and reward or sanction. Business strategists argue that the wider system has to match the business strategy[97,126]. Organizational culturists, such as Weick[127], suggest that strategies and cultures are synonymous. Hence one can argue that the performance measurement system has to be consistent with the organization’s culture. Indeed it is easy to imagine what would happen in a culture based on blame if, for example, one were to introduce a measure of the number of defects produced by each operative. The culture of the organization would encourage everyone to lie. Hence one can question whether there would be any point in introducing such a measure.

Another dimension to the problem arises because of the functional structure adopted by many organizations. Shapiro[128] discusses the conflict between marketing and manufacturing and suggests that it is partly a result of the evaluation and reward systems used in many firms.

One prime reason for the marketing/manufacturing conflict is that the two functions are evaluated on the basis of different criteria and receive rewards for different activities. On the one hand, the marketing people are judged on the basis of profitable growth of the company in terms of sales, market share, and new markets entered. Unfortunately, the marketers are sometimes more sales-oriented than profit-oriented. On the other hand, the manufacturing people are often evaluated on running a smooth operation at minimum cost. Similarly unfortunately, they are sometimes more cost-oriented than profit-oriented.

The system of evaluation and reward means that the marketers are encouraged to generate change, which is one hallmark of the competitive marketplace. To be rewarded, they must generate new products, enter new markets, and develop new programmes. But the manufacturing people are clearly rewarded for accepting change only when it significantly lowers their costs.

Because the marketers and manufacturers both want to be evaluated positively and rewarded well, each function responds as the system asks it to in order to protect its self-interest[128, p. 108].

At a higher level, KPMG, a UK management consultancy, reports that “increasing numbers of executive directors of KPMG client companies express concern that the information they receive neither enables them to measure performance against their chosen strategy and objectives, nor helps them in their strategic decision-making process. The common complaints are of too much data and too little analysis”. More specifically, and following a survey of 150 of The Times 1,000 companies, excluding the top 200, KPMG found that:

● Most companies used targets and performance indicators based on internal financial standards. External comparisons and non-financial targets were not widely used.

● Information used to monitor performance was rated poor or average by just under half the companies contacted in terms of its relevance, accuracy, timeliness, completeness, cost-effectiveness and presentation. Dissatisfaction appeared to be most marked in the cost-effectiveness and presentation of information.


● A majority of the respondents rated the information available to formulate and review strategy as poor or average, while 42 per cent said that at times it failed to highlight critical issues. Strategic planners were less satisfied than their accounting colleagues with the relevance, timeliness and completeness of the information that was presented.

● The primary objective of most of the companies surveyed was industry leadership, achievement of target earnings per share, or a certain market share. Achievement of these objectives was measured by financial criteria in three-quarters of the organizations.

● Internal information such as cost profiles, product profitability and past financial performance appeared to dominate the information set. External information, such as domestic and overseas macroeconomic data, demographic projections, EC policies and impending legislation, was not reported as being widely used in strategy formulation or monitoring[129].

In one of the more academic studies of performance measurement, Richardson and Gordon[130] adopt an entirely different perspective and explore whether performance measures change as products move through their life cycle and hence begin to compete on different factors. At the outset of their study they hypothesized that:

● as products move through their life cycle, the appropriate performance measures will change;

● performance measures will be easier to develop for products late in their life cycle as these tend to compete on cost rather than innovativeness;

● dysfunctional consequences will result if measures are not appropriate;

● in multi-product facilities “traditional” measures will inhibit innovation;

● manufacturing managers will respond to their perceived measures of performance.

Having collected data during interviews with the chief executive and manufacturing managers of 15 Canadian electronics companies, Richardson and Gordon[130] observed that:

● the performance measures used did not change as products moved through their life cycle;

● the measures did not become better defined later in the products’ life cycle, primarily because the performance measures used tended to focus on the plant as a whole, rather than individual products;

● inappropriate measures did introduce dysfunctionality;

● traditional measures did inhibit innovation;

● managers did respond to their perceived measures of performance.

Yet another perspective is adopted by Crawford and Cox[131]. They suggest that instead of trying to link measures to the manufacturing strategy, the organization’s culture or the product life cycle, one should seek to integrate the measures and the manufacturing system. They therefore conducted a series of case studies and suggest that the following guidelines can be used to design a performance measurement system suitable for a just-in-time (JIT) manufacturing environment:

● Performance to schedule criteria should evaluate group, not individual, work.

● Specific numeric standards, or goals, should be established for the performance to schedule criteria and these goals should be revised once they have been met.

● Specific numeric standards are not required for inventory and quality criteria; improving trends are needed.

● Performance criteria should be measured in ways that are easily understood by those whose performance is being evaluated.

● Performance data should be collected, where possible, by those whose performance is being evaluated.

● Graphs should be the primary method of reporting performance data.

● Performance data should be available for constant review.

● Schedule performance should be reported daily or weekly.

● A monthly reporting cycle for inventory performance and quality performance is sufficient.

● The reporting system should not replace frequently held performance review meetings.

The external environment
For the purposes of this review the external environment is assumed to consist of two distinct elements – customers and competitors. As discussed earlier, a truly balanced performance measurement system would provide managers with information relating to both of these. Measures of customer satisfaction have already been discussed (see performance measures relating to quality), hence this section will focus on the measurement of competitor performance. One technique that can be used to do this is benchmarking, although benchmarking does not have to focus on competitors. Indeed there are four basic types of benchmarking:

(1) Internal. Internal to a corporation, but perhaps external to a plant or a particular business unit. One of the major advantages of internal benchmarking is that it minimises problems of access and data confidentiality.

(2) Competitive. This is probably the most beneficial form of benchmarking, but the collection of data which is directly comparable is very difficult.


(3) Functional. This involves functional comparison with companies which are similar, but not direct competitors.

(4) Generic. The study and comparison of truly generic business processes, e.g. order entry, invoicing.

Benchmarking is proving to be a topic of interest to both academics and consultants. One of the first benchmarking studies to be documented was the one carried out by Garvin[132]. He collected data on the performance of US and Japanese air conditioning manufacturers in the early 1980s. More recently Womack et al.[88] have completed a study of the automotive industry. Over a five-year period they collected detailed performance data, such as that shown in Table VI, on most of the world’s major automotive plants and from these were able to identify a two-to-one performance gap along a number of dimensions (a worked calculation of this gap follows Table VI). In 1993 Andersen Consulting[133] reported a similar finding, following their study of automotive components suppliers. Meanwhile in the USA the MIT/Sloan Foundation is continuing to fund a number of industry-specific benchmarking projects, including ones which focus on semi-conductors, textiles, computers, steel, pharmaceuticals, chemical processes and process equipment.

In terms of academic research, Voss et al.[134-137] have produced a series of workbooks which discuss the process of benchmarking innovation. They identify four dimensions to innovation:

(1) product innovation;
(2) product development;
(3) process innovation;
(4) technology acquisition.

They argue that firms should seek to benchmark themselves along each of these dimensions.

Industrial interest in benchmarking is also high. The Britain’s Best Factories Award, run by Management Today, was based on a benchmarking exercise for the first time in 1992. Each entrant filled in a self-assessment questionnaire which covered issues such as the plant profile; the nature of the manufacturing operations; the cost structure; the inventory profile; the employee profile; product innovation; management information; and market positioning.

Table VI. Comparative assembly performance data

                                     GM Framingham    Toyota Takaoka
Gross assembly hours per car              40.7              18.0
Adjusted assembly hours per car           31                16
Assembly defects per 100 cars            130                45
Assembly space per car                     8.1               4.8
Inventory of parts (average)           2 weeks            2 hours

Source: [88]
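To make the two-to-one claim concrete, the following minimal sketch computes the ratios from the Table VI figures; the calculation is added here for illustration only and is not part of Womack et al.’s analysis:

```python
# A worked illustration of the performance gap using the Table VI data.
# The figures are from [88]; the ratio calculation is added for
# illustration only.
framingham = {
    "gross assembly hours per car": 40.7,
    "adjusted assembly hours per car": 31.0,
    "assembly defects per 100 cars": 130.0,
    "assembly space per car": 8.1,
}
takaoka = {
    "gross assembly hours per car": 18.0,
    "adjusted assembly hours per car": 16.0,
    "assembly defects per 100 cars": 45.0,
    "assembly space per car": 4.8,
}

for dimension, value in framingham.items():
    gap = value / takaoka[dimension]
    print(f"{dimension}: {gap:.1f}x")  # roughly a two-to-one gap throughout
```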


The judges, all based at Cranfield School of Management, UK, analysed the data and then visited the best plants before making a final decision. This innovation lends greater credibility to the Best Factories Award as previously the judgement process was based on consultants’ opinions rather than on data.

Some authors see benchmarking as a means of identifying improvement opportunities as well as monitoring the performance of competitors. Young[138], for example, argues that benchmarking is being used in this way by many large companies. He proposes that as most of the “low hanging fruit has been picked” the identification of improvement opportunities is becoming increasingly difficult. Hence managers are adopting benchmarking as a means of searching for best practice and new ideas. He identifies four steps in the benchmarking process:

(1) planning;
(2) analysis;
(3) integration;
(4) action.

Of course one danger with this approach is that the company that is searching for best practice will always be following rather than leading.

Perhaps the most comprehensive description of benchmarking, to date, has been provided by Camp[139]. He defines benchmarking as the search for industry best practices that lead to superior performance and bases his book on the nine-step benchmarking process shown in Figure 5. In terms of performance measurement system design, however, the work of Oge and Dickinson[140] is perhaps more relevant. They suggest that firms should adopt closed loop performance management systems which combine periodic benchmarking with ongoing monitoring/measurement (see Figure 6).

Implications – performance measurement research agenda
There appears to be a growing recognition that the measures of performance that companies have traditionally used are inappropriate for manufacturing businesses[51,120,141,142], not least because they:

● encourage short-termism, for example the delay of capital investment[53,143];

● lack strategic focus and do not provide data on quality, responsiveness and flexibility[144];

● encourage local optimization, for example manufacturing inventory to keep people and machines busy[145];

● encourage managers to minimize the variances from standard rather than seek to improve continually[146,147];

● fail to provide information on what their customers want and what their competitors are doing[7,139].


[Figure 5. The nine-step benchmarking process: (1) identify what is to be benchmarked; (2) identify comparative companies; (3) determine data collection method and collect data; (4) determine current performance “gap”; (5) project future performance levels; (6) communicate findings and gain acceptance; (7) establish functional goals; (8) implement specific actions and monitor progress; (9) recalibrate benchmarks. Source: Adapted from [139]]

[Figure 6. Closed loop performance management: periodic benchmarking (comparing internal and external processes to best practices, assessing progress, and structuring programmes to close gaps through implementation of best practice elements and continuous improvement) combined with ongoing monitoring/measurement of activities (standard processes, events, deliverables, programmes), processes (cost, quality, time) and products (performance, customer satisfaction, market share). Source: Adapted from [140]]


Performance measurement system design, or re-design, is firmly on the industrial agenda[148,149]. In terms of issues that need researching, this review has identified the following as key.

Issues associated with individual measures of performance
Issues associated with individual measures of performance include:

● Is performance measurement a luxury for the SMME (small or medium manufacturing enterprise)?

– Which performance measures are of greatest value to SMMEs?

– Do the costs of some measures outweigh their benefit?

– Is this only an issue for SMMEs or does it apply to all sizes of organization?

● Should measures focus on processes, the outputs of processes, or both?

● Is time the fundamental measure of manufacturing performance?

● How can the measurement of total factor productivity be simplified? (see the sketch after this list)

● How can flexibility, which is often simply a property of the “system”, be measured?

● How can performance measures be designed so that they encourage inter-functional co-operation?

● How can measures which do not encourage short-termism be designed?

● How can performance measures be designed so that they encourage appropriate behaviour?

● Can “flexible” measures which take account of the changing business environment be defined?

● How should the data generated as a result of a particular measure be displayed?

● How can one ensure that the management loop is closed – that corrective action follows measurement?
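Taking the total factor productivity question above as an example, the following minimal sketch shows one common textbook formulation – output value divided by the sum of input costs. The figures are hypothetical and this review does not prescribe the formula:

```python
# A minimal sketch of a simplified total factor productivity calculation:
# output value divided by the sum of input costs over the same period.
# This is one common textbook formulation, not a method prescribed by the
# review; all figures are hypothetical.
output_value = 1_200_000.0  # value of goods produced in the period

input_costs = {
    "labour": 400_000.0,
    "capital": 250_000.0,
    "materials": 300_000.0,
    "energy": 50_000.0,
}

tfp = output_value / sum(input_costs.values())
print(f"total factor productivity: {tfp:.2f}")  # output per unit of input
```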

Issues associated with the performance measurement system as an entity
These issues ask the questions:

● What are the “definitive” principles of performance measurement system design?

● How can measures be integrated both across an organization’s functions and through its hierarchy?

● How can conflicts between performance measures be eliminated?

● What techniques can managers use to reduce their list of “possible” measures to a meaningful set?


– Would a “generic” performance measurement system facilitate this process?

– Would a performance measurement framework facilitate this process?

– Or is a process-based approach required?

– What are the relative advantages and disadvantages of the above?

● Do “generic” performance measurement systems actually exist?

● What does a truly “balanced scorecard” constitute?

● Can a practicable performance measurement system design process be specified?

● Can a “flexible” performance measurement system which takes account of the changing business environment be defined?

● How can the cost-benefit of a performance measurement system be analysed?

Issues associated with the system and its environment
Issues associated with the system and its environment raise the following points:

● What implications do emerging concepts, such as ABC, BPR or the notion of core competencies, have for performance measurement?

● Why do firms fail to integrate their performance measures into their strategic control systems?

● How can we ensure that the performance measurement system matches the firm’s strategy and culture?

● To which dimensions of the internal and external environment does the performance measurement system have to be matched?

The final issue, and one which has not yet been touched, is that of predictive performance measurement. The assumption underpinning this review, and indeed much of the work on performance measurement to date, is that managers use measures both to monitor past performance and to stimulate future action. Increasingly, however, people are beginning to look for “predictive” measures – measures, such as statistical process control (SPC), which show that something is going out of control before too much damage has been done. A key item on the performance measurement research agenda must therefore be the identification, and/or development, of “predictive performance measures”.
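By way of illustration, the following is a minimal sketch of a Shewhart-style individuals chart, one simple predictive measure in the SPC spirit; the data and control limits are hypothetical:

```python
# A minimal sketch of a predictive measure in the SPC spirit: a
# Shewhart-style individuals chart flags observations that fall outside
# mean +/- 3 standard deviations before further damage is done.
# The data are hypothetical.
from statistics import mean, stdev

baseline = [10.1, 9.8, 10.0, 10.2, 9.9, 10.1, 10.0, 9.7, 10.3, 10.0]
centre = mean(baseline)
sigma = stdev(baseline)
upper, lower = centre + 3 * sigma, centre - 3 * sigma

for observation in [10.1, 10.4, 11.2]:
    in_control = lower <= observation <= upper
    print(f"{observation}: {'in control' if in_control else 'out of control'}")
```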

Notes and references
1. Kotler, P., Marketing Management: Analysis, Planning and Control, Prentice-Hall, Englewood Cliffs, NJ, 1984.
2. Slack, N., The Manufacturing Advantage: Achieving Competitive Manufacturing Operations, Mercury, London, 1991.


3. This can be expressed either in terms of the actual efficiency and/or effectiveness of an action, or in terms of the end result of that action.
4. In this context the term metric refers to more than simply the “formula” used to calculate the measure. For a given performance measure to be specified it is necessary to define, among other things, the title of the measure, how it will be calculated (the formula), who will be carrying out the calculation, and from where they will get the data. See Neely[5] for a fuller explanation.
5. Neely, A.D., “Performance measurement system design – third phase”, draft of the fourth section of the “Performance Measurement System Design Workbook”, April 1994.
6. Kaplan, R.S. (Ed.), Measures for Manufacturing Excellence, Harvard Business School Press, Boston, MA, 1990.
7. Kaplan, R.S. and Norton, D.P., “The balanced scorecard – measures that drive performance”, Harvard Business Review, January-February 1992, pp. 71-9.
8. Fitzgerald, L., Johnston, R., Brignall, S., Silvestro, R. and Voss, C., Performance Measurement in Service Business, CIMA, London, 1991.
9. Mintzberg, H., “Patterns in strategy formulation”, Management Science, Vol. 24 No. 9, 1978, pp. 934-48.
10. Leong, G.K., Snyder, D.L. and Ward, P.T., “Research in the process and content of manufacturing strategy”, OMEGA International Journal of Management Science, Vol. 18 No. 2, 1990, pp. 109-22.
11. Wheelwright, S.C., “Manufacturing strategy – defining the missing link”, Strategic Management Journal, Vol. 5, 1984, pp. 77-91.
12. Tunälv, C., “Manufacturing strategy – plans and business performance”, International Journal of Operations & Production Management, Vol. 12 No. 3, 1992, pp. 4-24.
13. Garvin, D.A., “Competing on the eight dimensions of quality”, Harvard Business Review, November-December 1987, pp. 101-9.
14. Schonberger, R.J., Creating a Chain of Customers, Guild Publishing, London, 1990.
15. Stalk, G., “Time – the next source of competitive advantage”, Harvard Business Review, July-August 1988, pp. 41-51.
16. Gerwin, D., “An agenda of research on the flexibility of manufacturing processes”, International Journal of Operations & Production Management, Vol. 7 No. 1, 1987, pp. 38-49.
17. Slack, N., “The flexibility of manufacturing systems”, International Journal of Operations & Production Management, Vol. 7 No. 4, 1987, pp. 35-45.
18. The two factors delivery speed and delivery reliability are both assumed to fall into the generic category time for the purpose of Figure 3. They are shown as factors T3 and T4 respectively.
19. Neely, A.D. and Wilson, J.R., “Measuring product goal congruence: an exploratory study”, International Journal of Operations & Production Management, Vol. 12 No. 4, 1992, pp. 45-52.
20. Neely, A.D. and Mills, J.F., Manufacturing in the UK – Report on a Survey of Performance Measurement and Strategy Issues in UK Manufacturing Companies, Manufacturing Engineering Group, London, June 1993.
21. Neely, A.D., Mills, J.F., Platts, K.W., Gregory, M.J. and Richards, A.H., “Realising strategy through measurement”, International Journal of Operations & Production Management, Vol. 14 No. 3, 1994, pp. 140-52.
22. Feigenbaum, A.V., Total Quality Control, McGraw-Hill, New York, NY, 1961.
23. Campanella, J. and Corcoran, F.J., “Principles of quality costs”, Quality Progress, April 1983, pp. 16-22.


24. Crosby, P.B., Quality is Free, McGraw-Hill, New York, NY, 1972.
25. Plunkett, J.J. and Dale, B.G., “Quality costs: a critique of some economic cost of quality models”, International Journal of Production Research, Vol. 26 No. 11, 1988, pp. 1713-26.
26. Crosby, P.B., “Don’t be defensive about the cost of quality”, Quality Progress, April 1983, pp. 38-9.
27. Garvin, D.A., “How the Baldrige Award really works”, Harvard Business Review, November-December 1991, pp. 80-93.
28. Deming, W.E., Quality, Productivity and Competitive Position, MIT, Cambridge, MA, 1982.
29. Price, F., Right First Time, Gower, Aldershot, 1984.
30. Zairi, M., “TQM-based performance measurement – practical guidelines”, Technical Communications, Letchworth, 1992.
31. Drucker, P.F., “The emerging theory of manufacturing”, Harvard Business Review, May-June 1990, pp. 94-102.
32. Potts, D., “Just-in-time improves the bottom line”, Engineering Computers, September 1986, pp. 55-60.
33. Goldratt, E.M. and Cox, J., The Goal: Beating the Competition, Creative Output Books, Hounslow, 1986.
34. Galloway, D. and Waldron, D., “Throughput accounting part 1 – the need for a new language for manufacturing”, Management Accounting, November 1988, pp. 34-5.
35. Galloway, D. and Waldron, D., “Throughput accounting part 2 – ranking products profitability”, Management Accounting, December 1988, pp. 34-5.
36. Galloway, D. and Waldron, D., “Throughput accounting part 3 – a better way to control labour costs”, Management Accounting, January 1989, pp. 32-3.
37. Galloway, D. and Waldron, D., “Throughput accounting part 4 – moving on to complex products”, Management Accounting, February 1989, pp. 40-1.
38. Note how closely these assumptions match the OPT measures: operating costs, inventory and throughput. Waldron used to work for Creative Output – the company that sold the OPT system developed by Goldratt.
39. Darlington, J., Innes, J., Mitchell, F. and Woodward, J., “Throughput accounting – the Garrett Automotive experience”, Management Accounting, April 1992, pp. 32-8.
40. House, C.H. and Price, R.L., “The return map: tracking product teams”, Harvard Business Review, January-February 1991, pp. 92-100.
41. Fooks, J.H., Profiles for Performance: Total Quality Methods for Reducing Cycle Time, Addison-Wesley, Reading, MA, 1992.
42. Azzone, G., Masella, C. and Bertelè, U., “Design of performance measures for time-based companies”, International Journal of Operations & Production Management, Vol. 11 No. 3, 1991, pp. 77-85.
43. Johnson, H.T., “Early cost accounting for internal management control: Lyman Mills in the 1850s”, Business History Review, Winter 1972, pp. 466-74.
44. Johnson, H.T., “Management accounting in early integrated industry – E.I. duPont de Nemours Powder Company 1903-1912”, Business History Review, Summer 1975, pp. 184-204.
45. Johnson, H.T., “The role of history in the study of modern business enterprise”, The Accounting Review, July 1975, pp. 444-50.
46. Johnson, H.T., “Management accounting in an early multidivisional organisation: General Motors in the 1920s”, Business History Review, Winter 1978, pp. 450-517.
47. Johnson, H.T., “Markets, hierarchies, and the history of management accounting”, Third International Congress of Accounting Historians, London, August 1980.


48. Johnson, H.T., “Toward a new understanding of nineteenth-century cost accounting”, The Accounting Review, July 1981, pp. 510-18.
49. Johnson, H.T., “The search for gain in markets and firms: a review of the historical emergence of management accounting systems”, Accounting, Organisations and Society, Vol. 2 No. 3, 1983, pp. 139-46.
50. Garner, S.P., Evolution of Cost Accounting to 1925, University of Alabama Press, Tuscaloosa, AL, 1954.
51. Kaplan, R.S., “The evolution of management accounting”, The Accounting Review, Vol. 59 No. 3, 1984, pp. 390-418.
52. Chandler, A.D., The Visible Hand – Managerial Revolution in American Business, Harvard University Press, Boston, MA, 1977.
53. Banks, R.L. and Wheelwright, S.C., “Operations versus strategy – trading tomorrow for today”, Harvard Business Review, May-June 1979, pp. 112-20.
54. Johnson, H.T. and Kaplan, R.S., Relevance Lost – The Rise and Fall of Management Accounting, Harvard Business School Press, Boston, MA, 1987.
55. In the electronics industry it is even less – perhaps only two or three per cent.
56. Murphy, J.C. and Braund, S.L., “Management accounting and new manufacturing technology”, Management Accounting, February 1990, pp. 38-40.
57. Bromwich, M. and Bhimani, A., “Management accounting – evolution not revolution”, Management Accounting, October 1989, pp. 5-6.
58. Cooper, R., “The two-stage procedure in cost accounting: part 1”, Journal of Cost Management, Summer 1987, pp. 43-51.
59. Cooper, R., “The two-stage procedure in cost accounting: part 2”, Journal of Cost Management, Fall 1987, pp. 39-45.
60. Cooper, R., “The rise of activity-based cost systems: part I – what is an activity-based cost system?”, Journal of Cost Management, Summer 1988, pp. 45-54.
61. Cooper, R., “The rise of activity-based cost systems: part II – when do I need an activity-based cost system?”, Journal of Cost Management, Fall 1988, pp. 41-8.
62. Cooper, R., “The rise of activity-based cost systems: part III – how many cost drivers do you need and how should you select them?”, Journal of Cost Management, Winter 1989, pp. 34-46.
63. Cooper, R., “The rise of activity-based cost systems: part IV – what do activity-based systems look like?”, Journal of Cost Management, Spring 1989, pp. 34-46.
64. Jeans, M. and Morrow, M., “The practicalities of using activity-based costing”, Management Accounting, November 1989, pp. 42-4.
65. Miller, J.G. and Vollmann, T.E., “The hidden factory”, Harvard Business Review, September-October 1985, pp. 142-50.
66. Troxel, R.B. and Weber, M.G., “The evolution of activity-based costing”, Journal of Cost Management, Spring 1990, pp. 14-22.
67. Kaplan, R.S., “One cost system isn’t enough”, Harvard Business Review, January-February 1988, pp. 61-6.
68. Kim, J.S. and Miller, J.G., “Building the value factory: a progress report for US manufacturing”, Boston University Manufacturing Roundtable, October 1992.
69. Piper, J.A. and Walley, P., “Testing ABC logic”, Management Accounting, September 1990, pp. 37-42.
70. Piper, J.A. and Walley, P., “ABC relevance not found”, Management Accounting, March 1991, pp. 42-54.
71. Allen, D., “Never the twain shall meet?”, Accountancy Age, January 1989, p. 21.


72. Dugdale, D., “Costing systems in transition”, Management Accounting, January 1990, pp. 38-41.
73. Maskell, B., “Relevance regained – an interview with Professor Robert S. Kaplan”, Management Accounting, September 1988, pp. 38-42.
74. Cooper, R. and Kaplan, R.S., The Design of Cost Management Systems, Prentice-Hall, London, 1991.
75. Innes, J. and Mitchell, F., “Activity-based costing research”, Management Accounting, May 1990, pp. 28-9.
76. Innes, J. and Mitchell, F., Activity-based Costing: A Review with Case Studies, CIMA, London, 1990.
77. Burgess, T.F., “A review of productivity”, Work Study, January/February 1990, pp. 6-9.
78. Bain, D., The Productivity Prescription – The Manager’s Guide to Improving Productivity and Profits, McGraw-Hill, New York, NY, 1982.
79. Ruch, W.A., “The measurement of white-collar productivity”, National Productivity Review, Autumn 1982, pp. 22-8.
80. Craig, C.E. and Harris, C.R., “Total productivity measurement at the firm level”, Sloan Management Review, Vol. 14 No. 3, 1973, pp. 13-29.
81. Hayes, R.H., Wheelwright, S.C. and Clark, K.B., Dynamic Manufacturing: Creating the Learning Organisation, Free Press, New York, NY, 1988.
82. Clark, K.B., Hayes, R.H. and Lorenz, C., The Uneasy Alliance – Managing the Productivity-Technology Dilemma, Harvard Business School Press, Boston, MA, 1985.
83. Kendrick, J.W., Improving Company Productivity: Handbook with Case Studies, Johns Hopkins University Press, Baltimore, MD, 1984.
84. Lieberman, M.B., Lau, L.J. and Williams, M.D., “Firm level productivity and management influence: a comparison of US and Japanese automobile producers”, Management Science, Vol. 36 No. 10, 1990, pp. 1193-1215.
85. Mundel, M., “Measuring total productivity of manufacturing organisations”, unpublished, Kraus International, White Plains, NY, 1987.
86. Sink, D.S., Productivity Management – Planning, Measurement and Evaluation, Control and Improvement, John Wiley and Sons, Chichester, 1985.
87. Skinner, W., “The productivity paradox”, Harvard Business Review, July-August 1986, pp. 55-9.
88. Womack, J.P., Jones, D.T. and Roos, D., The Machine that Changed the World, Maxwell Macmillan, Oxford, 1990.
89. Slack, N., “Flexibility as a manufacturing objective”, International Journal of Operations & Production Management, Vol. 3 No. 3, 1983, pp. 4-13.
90. Gustavsson, S.O., “Flexibility and productivity in complex production processes”, International Journal of Production Research, Vol. 22 No. 5, 1984, pp. 801-8.
91. Buzacott, J.A., “The fundamental principles of flexibility in manufacturing systems”, Proceedings of the First International Congress on Flexible Manufacturing Systems, Brighton, 1982, pp. 13-22.
92. Cox, T., “Towards the measurement of manufacturing flexibility”, Production and Inventory Management, Vol. 30 No. 1, 1989, pp. 68-72.
93. Skinner, W., “Manufacturing – missing link in corporate strategy”, Harvard Business Review, May-June 1969, pp. 136-45.
94. Andrews, K.R., The Concept of Corporate Strategy, Dow Jones-Irwin, Homewood, IL, 1971.
95. Ansoff, H.I., Corporate Strategy – An Analytic Approach to Business Policy for Growth and Expansion, revised edition, Penguin, Harmondsworth, 1986.


96. Gibson, I., “Our company’s philosophy: Nissan Motor Manufacturing (UK) Limited”, undated.
97. Hrebiniak, L.G. and Joyce, W.F., Implementing Strategy, Macmillan, New York, NY, 1984.
98. Goold, M. and Quinn, J.J., “The paradox of strategic controls”, Strategic Management Journal, Vol. 11, 1990, pp. 43-57.
99. Horovitz, J.H., “Strategic control: a new task for top management”, Long Range Planning, Vol. 12, 1979, pp. 2-7.
100. Goold, M. and Quinn, J.J., “Strategic control: a survey of current practice”, working paper, Ashridge Strategic Management Centre, London, 1988.
101. Daniel, S.J. and Reitsperger, W.D., “Linking quality strategy with management control systems: empirical evidence from Japanese industry”, Accounting, Organisations and Society, Vol. 16 No. 7, 1991, pp. 601-18.
102. Hiromoto, T., “Another hidden edge: Japanese management accounting”, Harvard Business Review, July-August 1988, pp. 22-6.
103. Morgan, M.J. and Weerakoon, P.S.H., “Japanese management accounting, its contribution to the Japanese economic miracle”, Management Accounting, June 1989, pp. 40-3.
104. Hopwood, A.G., Accounting and Human Behaviour, Prentice-Hall, Englewood Cliffs, NJ, 1974.
105. Yoshikawa, T., Innes, J. and Mitchell, F., “Japanese management accounting – a comparative survey”, Management Accounting, November 1989, pp. 20-3.
106. Sakurai, M., “Target costing and how to use it”, Journal of Cost Management, Summer 1989, pp. 39-50.
107. Simons, R., “Accounting control systems and business strategy: an empirical analysis”, Accounting, Organisations and Society, Vol. 12 No. 4, 1987, pp. 357-74.
108. Simons, R., “The role of management control systems in creating competitive advantage: new perspectives”, Accounting, Organisations and Society, Vol. 15 No. 1/2, 1990, pp. 127-43.
109. Simons, R., “Strategic orientation and top management attention to control systems”, Strategic Management Journal, Vol. 12, 1991, pp. 49-52.
110. Hazell, M. and Morrow, M., “Performance measurement and benchmarking”, Management Accounting, December 1992, pp. 44-5.
111. Kaplan, R.S. and Norton, D.P., “Putting the balanced scorecard to work”, Harvard Business Review, September-October 1993, pp. 134-47.
112. Keegan, D.P., Eiler, R.G. and Jones, C.R., “Are your performance measures obsolete?”, Management Accounting, June 1989, pp. 45-50.
113. Globerson, S., “Issues in developing a performance criteria system for an organisation”, International Journal of Production Research, Vol. 23 No. 4, 1985, pp. 639-46.
114. Maskell, B., “Performance measures of world class manufacturing”, Management Accounting, May 1989, pp. 32-3.
115. Merchant, K.A., Control in Business Organisations, Pitman, Marshfield, MA, 1985.
116. Merchant suggests that as business managers need discretion, goals (and hence controls) should be set for them which relate to the results that they are expected to achieve rather than the actions that they have to take. Fitzgerald et al. are not as prescriptive, but still suggest that a distinction should be made between measures which relate to results (profits, ROI…) and measures which report on the determinants of those results (quality, flexibility, resource utilization).
117. CAM-I, Minutes of the meeting held in Milan, Italy, 23-25 April 1991.
118. Dixon, J.R., Nanni, A.J. and Vollmann, T.E., The New Performance Challenge – Measuring Operations for World-Class Competition, Dow Jones-Irwin, Homewood, IL, 1990.


119. Hayes, R.H. and Abernathy, W.J., “Managing our way to economic decline”, Harvard Business Review, July-August 1980, pp. 67-77.
120. Kaplan, R.S., “Yesterday’s accounting undermines production”, Harvard Business Review, Vol. 62, 1984, pp. 95-101.
121. Fry, T.D. and Cox, J.F., “Manufacturing performance: local versus global measures”, Production and Inventory Management Journal, Vol. 30 No. 2, 1989, pp. 52-6.
122. Berliner, C. and Brimson, J.A., Cost Management for Today’s Advanced Manufacturing: The CAM-I Conceptual Design, Harvard Business School Press, Boston, MA, 1988.
123. Deming, W.E., Out of the Crisis, MIT, Cambridge, MA, 1986.
124. Blenkinsop, S. and Davis, L., “The road to continuous improvement”, OR Insight, Vol. 4 No. 3, 1991, pp. 23-6.
125. Wisner, J.D. and Fawcett, S.E., “Link firm strategy to operating decisions through performance measurement”, Production and Inventory Management Journal, Third Quarter, 1991, pp. 5-11.
126. Lorange, P., Implementation of Strategic Planning, Prentice-Hall, Englewood Cliffs, NJ, 1982.
127. Weick, K., “The significance of corporate culture”, in Frost, P., Moore, L., Louis, M., Lundberg, C. and Martin, J. (Eds), Organisational Culture, Sage, Beverly Hills, CA, 1985, pp. 381-9.
128. Shapiro, B.P., “Can marketing and manufacturing coexist?”, Harvard Business Review, September-October 1977, pp. 104-14.
129. KPMG, Information for Strategic Management – A Survey of Leading Companies, KPMG Management Consulting, London, 1990.
130. Richardson, P.R. and Gordon, J.R.M., “Measuring total manufacturing performance”, Sloan Management Review, Winter 1980, pp. 47-58.
131. Crawford, K.M. and Cox, J.F., “Designing performance measurement systems for just-in-time operations”, International Journal of Production Research, Vol. 28 No. 11, 1990, pp. 2025-36.
132. Garvin, D.A., “Quality on the line”, Harvard Business Review, September-October 1983, pp. 65-75.
133. Andersen Consulting, The Lean Enterprise Benchmarking Project, Andersen Consulting, London, February 1993.
134. Voss, C., Coughlan, P., Chiesa, V. and Fairhead, J., “Innovation – self assessment and benchmarking – a framework”, prepared by London Business School for the DTI Innovation Unit, July 1992.
135. Voss, C., Coughlan, P., Chiesa, V. and Fairhead, J., Innovation – A Benchmarking and Self Assessment Framework, London Business School, October 1992.
136. Voss, C., Coughlan, P., Chiesa, V. and Fairhead, J., Scoresheets for Quantifying Performance in Managing Innovation, London Business School, October 1992.
137. Voss, C., Coughlan, P., Chiesa, V. and Fairhead, J., “Developing and testing benchmarking and self assessment frameworks in manufacturing”, Management and New Production Systems, 4th International Production Management Conference of EIASM, London, 5-6 April 1993, pp. 563-84.
138. Young, S., “Checking performance with competitive benchmarking”, Professional Engineering, February 1993, pp. 14-15.
139. Camp, R.C., Benchmarking – The Search for Industry Best Practices that Lead to Superior Performance, ASQC Quality Press, Milwaukee, WI, 1989.


140. Oge, C. and Dickinson, H., “Product development in the 1990s – new assets for improved capability”, Economist Intelligence Unit, Japan Motor Business, December 1992, pp. 132-44.
141. Kaplan, R.S., “Measuring manufacturing performance – a new challenge for managerial accounting research”, The Accounting Review, Vol. 58 No. 4, 1983, pp. 686-705.
142. Kaplan, R.S., “Accounting lag – the obsolescence of cost accounting systems”, California Management Review, Vol. 28 No. 2, 1986, pp. 174-99.
143. Hayes, R.H. and Garvin, D.A., “Managing as if tomorrow mattered”, Harvard Business Review, May-June 1982, pp. 70-9.
144. Skinner, W., “The decline, fall and renewal of manufacturing”, Industrial Engineering, October 1974, pp. 32-8.
145. Hall, R.W., Zero Inventories, Dow Jones-Irwin, Homewood, IL, 1983.
146. Schmenner, R.W., “Escaping the black holes of cost accounting”, Business Horizons, January-February 1988, pp. 66-72.
147. Turney, P.B.B. and Anderson, B., “Accounting for continuous improvement”, Sloan Management Review, Vol. 30 No. 2, 1989, pp. 37-48.
148. Lynch, R.L. and Cross, K.F., Measure Up – The Essential Guide to Measuring Business Performance, Mandarin, London, 1991.
149. Geanuracos, J. and Meiklejohn, I., Performance Measurement: The New Agenda, Business Intelligence, London, 1993.

Dow

nloa

ded

by U

NIV

ER

SIT

Y O

F G

RE

EN

WIC

H A

t 09:

48 2

2 O

ctob

er 2

014

(PT

)

Page 39: Performanta serviciilor in sectorul public

This article has been cited by:

1. Professor Andy Neely, Professor Irene C.L. Ng, Professor Rajkumar Roy, Yufeng Zhang, Lihong Zhang. 2014. Organizingcomplex engineering operations throughout the lifecycle. Journal of Service Management 25:5, 580-602. [Abstract] [FullText] [PDF]

2. Dominique EstampeSupply Chain Management Modeling 1-35. [CrossRef]3. Jaakko Tätilä, Pekka Helkiö, Jan Holmström. 2014. Exploring the performance effects of performance measurement system

use in maintenance process. Journal of Quality in Maintenance Engineering 20:4, 377-401. [Abstract] [Full Text] [PDF]4. Gunter Festel, Martin Würmseher. 2014. Benchmarking of industrial park infrastructures in Germany. Benchmarking: An

International Journal 21:6, 854-883. [Abstract] [Full Text] [PDF]5. Paolo Taticchi, Patrizia Garengo, Sai S. Nudurupati, Flavio Tonelli, Roberto Pasqualino. 2014. A review of decision-support

tools and performance measurement and sustainable supply chain management. International Journal of Production Research1-22. [CrossRef]

6. Hella Abidi, Sander de Leeuw, Matthias Klumpp, Beverly Wagner, Beverly Wagner. 2014. Humanitarian Supply ChainPerformance Management: A Systematic Literature Review. Supply Chain Management: An International Journal 19:5/6. .[Abstract] [PDF]

7. Liliana Avelar-Sosa, Jorge Luis García-Alcaraz, Osslan Osiris Vergara-Villegas, Aidé Aracely Maldonado-Macías, Giner Alor-Hernández. 2014. Impact of traditional and international logistic policies in supply chain performance. The InternationalJournal of Advanced Manufacturing Technology . [CrossRef]

8. Kwee Keong Choong. 2014. The fundamentals of performance measurement systems. International Journal of Productivityand Performance Management 63:7, 879-922. [Abstract] [Full Text] [PDF]

9. Loredana Di Pietro, Eleonora Pantano, Francesca Di Virgilio. 2014. Frontline employees׳ attitudes towards self-servicetechnologies: Threats or opportunity for job performance?. Journal of Retailing and Consumer Services 21, 844-850. [CrossRef]

10. Xenophon Koufteros, Anto (John) Verghese, Lorenzo Lucianetti. 2014. The effect of performance measurement systems onfirm performance: A cross-sectional and a longitudinal study. Journal of Operations Management 32, 313-336. [CrossRef]

11. Karine Araujo Ferreira, Robson Nogueira Tomas, Rosane Lúcia Chicarelli Alcântara. 2014. A theoretical framework forpostponement concept in a supply chain. International Journal of Logistics Research and Applications 1-16. [CrossRef]

12. Suresh Kumar Jakhar, Mukesh Kumar Barua. 2014. An integrated model of supply chain performance evaluation and decision-making using structural equation modelling and fuzzy AHP. Production Planning & Control 25, 938-957. [CrossRef]

13. Harri Laihonen, Aki Jääskeläinen, Sanna Pekkola. 2014. Measuring performance of a service system – from organizations tocustomer-perceived performance. Measuring Business Excellence 18:3, 73-86. [Abstract] [Full Text] [PDF]

14. Giustina Secundo, Gianluca Elia. 2014. A performance measurement system for academic entrepreneurship: a case study.Measuring Business Excellence 18:3, 23-37. [Abstract] [Full Text] [PDF]

15. Fernando A. F. Ferreira, Sérgio P. Santos, Paulo M. M. Rodrigues, Ronald W. Spahr. 2014. How to create indices for bankbranch financial performance measurement using MCDA techniques: an illustrative example. Journal of Business Economicsand Management 15, 708-728. [CrossRef]

16. Payman Ahi, Cory Searcy. 2014. An analysis of metrics used to measure performance in green and sustainable supply chains.Journal of Cleaner Production . [CrossRef]

17. Debora M. Gutierrez, Luiz Felipe Scavarda, Luiza Fiorencio, Roberto A. Martins. 2014. Evolution of the performancemeasurement system in the Logistics Department of a broadcasting company: An action research. International Journal ofProduction Economics . [CrossRef]

18. Kwee Keong Choong. 2014. Has this large number of performance measurement publications contributed to its betterunderstanding? A systematic review for research and applications. International Journal of Production Research 52, 4174-4197.[CrossRef]

19. Bedanand Upadhaya, Rahat Munir, Yvette Blount. 2014. Association between performance measurement systems andorganisational effectiveness. International Journal of Operations & Production Management 34:7, 853-875. [Abstract] [FullText] [PDF]

20. Fiorenzo Franceschini, Maurizio Galetto, Elisa Turina. 2014. Impact of performance indicators on organisations: a proposalfor an evaluation model. Production Planning & Control 25, 783-799. [CrossRef]

21. Frank Acito, Vijay Khatri. 2014. Business analytics: Why now and what next?. Business Horizons . [CrossRef]22. Rebecca Geiger and, Andreas AschenbrückerPerformance measurement and management in German universities 337-363.

[Abstract] [Full Text] [PDF] [PDF]23. David W.C. Wong, K.L. Choy, Harry K.H. Chow, Canhong Lin. 2014. Assessing a cross-border logistics policy using a

performance measurement system framework: the case of Hong Kong and the Pearl River Delta region. International Journalof Systems Science 45, 1306-1320. [CrossRef]

Dow

nloa

ded

by U

NIV

ER

SIT

Y O

F G

RE

EN

WIC

H A

t 09:

48 2

2 O

ctob

er 2

014

(PT

)

Page 40: Performanta serviciilor in sectorul public

24. Davide Luzzini, Federico Caniato, Gianluca Spina. 2014. Designing vendor evaluation systems: An empirical analysis. Journal of Purchasing and Supply Management 20, 113-129. [CrossRef]

25. Steven A. Melnyk, Umit Bititci, Ken Platts, Jutta Tobias, Bjørn Andersen. 2014. Is performance measurement and management fit for the future?. Management Accounting Research 25, 173-186. [CrossRef]

26. PETER SMITH, LISA CALLAGHER, XINLEI HUANG. 2014. ALLIANCE SCOPE AND FIRM PERFORMANCE IN THE BIOTECHNOLOGY INDUSTRY. International Journal of Innovation Management 18, 1440008. [CrossRef]

27. Dr Stefan Schaltegger, Prof Roger Burritt, Adolf Acquaye, Andrea Genovese, John Barrett, S.C. Lenny Koh. 2014. Benchmarking carbon emissions performance in supply chains. Supply Chain Management: An International Journal 19:3, 306-321. [Abstract] [Full Text] [PDF]

28. Gunter Festel, Martin Würmseher. 2014. Benchmarking of energy and utility infrastructures in industrial parks. Journal of Cleaner Production 70, 15-26. [CrossRef]

29. Deborah Agostino, Bauke Steenhuisen, Michela Arnaboldi, Hans de Bruijn. 2014. PMS development in local public transport: Comparing Milan and Amsterdam. Transport Policy 33, 26-32. [CrossRef]

30. Stefania Veltri, Giovanni Mastroleo, Michaela Schaffhauser-Linzatti. 2014. Measuring intellectual capital in the university sector using a fuzzy logic expert system. Knowledge Management Research & Practice 12, 175-192. [CrossRef]

31. Federico Caniato, Davide Luzzini, Stefano Ronchi. 2014. Purchasing performance management systems: an empirical investigation. Production Planning & Control 25, 616-635. [CrossRef]

32. Junxiao Liu, Peter E.D. Love, Jim Smith, Michael Regan, Monty Sutrisna. 2014. Public-Private Partnerships: a review of theory and practice of performance measurement. International Journal of Productivity and Performance Management 63:4, 499-512. [Abstract] [Full Text] [PDF]

33. Felipe Calderon, Li Choy Chong. 2014. Dilemma of sustainable lending. Journal of Sustainable Finance & Investment 4, 192-209. [CrossRef]

34. Jong Hun Woo, Young Joo Song. 2014. Systematisation of ship production management and case study for ship block assembly factory. International Journal of Computer Integrated Manufacturing 27, 333-347. [CrossRef]

35. Samsul Islam, Tava Olsen. 2014. Truck-sharing challenges for hinterland trucking companies. Business Process Management Journal 20:2, 290-334. [Abstract] [Full Text] [PDF]

36. Olli Teriö, Jaakko Sorri, Kalle Kähkönen, Jukka Hämäläinen. 2014. Environmental index for Finnish construction sites. Construction Innovation 14:2, 245-262. [Abstract] [Full Text] [PDF]

37. Baofeng Huo, Xiande Zhao, Honggeng Zhou. 2014. The Effects of Competitive Environment on Supply Chain Information Sharing and Performance: An Empirical Study in China. Production and Operations Management 23, 552-569. [CrossRef]

38. Valeria Belvedere. 2014. Defining the scope of service operations management: an investigation on the factors that affect the span of responsibility of the operations department in service companies. Production Planning & Control 25, 447-461. [CrossRef]

39. Morteza Shafiee, Farhad Hosseinzadeh Lotfi, Hilda Saleh. 2014. Supply chain performance evaluation with data envelopment analysis and balanced scorecard approach. Applied Mathematical Modelling. [CrossRef]

40. Anca Draghici, Anca-Diana Popescu, Luminita Maria Gogan. 2014. A Proposed Model for Monitoring Organizational Performance. Procedia - Social and Behavioral Sciences 124, 544-551. [CrossRef]

41. Felix T.S. Chan, Ashutosh Nayak, Ratan Raj, Alain Yee-Loong Chong, Tiwari Manoj. 2014. An innovative supply chain performance measurement system incorporating Research and Development (R&D) and marketing policy. Computers & Industrial Engineering 69, 64-70. [CrossRef]

42. Ioannis E. Tsolas. 2014. Firm credit risk evaluation: a series two-stage DEA modeling framework. Annals of Operations Research. [CrossRef]

43. Julie Whitfield, Leonardo A.N. Dioko, Don E. Webber. 2014. Scoring environmental credentials: a review of UK conference and meetings venues using the GREENER VENUE framework. Journal of Sustainable Tourism 22, 299-318. [CrossRef]

44. Patrizia Garengo, Milind Kumar Sharma. 2014. Performance measurement system contingency factors: a cross analysis of Italian and Indian SMEs. Production Planning & Control 25, 220-240. [CrossRef]

45. Grant MacKerron, Maneesh Kumar, Andreas Benedikt, Vikas Kumar. 2014. Performance management of suppliers in outsourcing project: case analysis from the financial services industry. Production Planning & Control 1-17. [CrossRef]

46. Marlen Christin Jurisch, Wolfgang Palka, Petra Wolf, Helmut Krcmar. 2014. Which capabilities matter for successful business process change?. Business Process Management Journal 20:1, 47-67. [Abstract] [Full Text] [PDF]

47. Mike Perkins, Anna Grey, Helge Remmers. 2014. What do we really mean by “Balanced Scorecard”?. International Journal of Productivity and Performance Management 63:2, 148-169. [Abstract] [Full Text] [PDF]

48. N.E. Myeda, S.M. Zaid, R. Sulaiman, N. Mahyuddin, M.A. Othuman Mydin, N.A. Agus Salim. 2014. The Implementation of Performance Measurement System (PMS): Malaysian Facilities Management (FM) Industry. MATEC Web of Conferences 15, 01014. [CrossRef]

49. Chia Chi Sun. 2014. Assessing the Relative Efficiency and Productivity Growth of the Taiwan LED Industry: DEA and Malmquist Indices Application. Mathematical Problems in Engineering 2014, 1-13. [CrossRef]

50. Fernando A. F. Ferreira, Sérgio P. Santos, Paulo M. M. Rodrigues, Ronald W. Spahr. 2014. Evaluating retail banking service quality and convenience with MCDA techniques: a case study at the bank branch level. Journal of Business Economics and Management 15, 1-21. [CrossRef]

51. Maurice Bonney, Mohamad Y. Jaber. 2013. Deriving research agendas for manufacturing and logistics systems: A methodology. International Journal of Production Economics. [CrossRef]

52. Pamela Danese. 2013. Supplier integration and company performance: A configurational view. Omega 41, 1029-1041. [CrossRef]

53. U.S. Bititci, S.U.O. Firat, P. Garengo. 2013. How to compare performances of firms operating in different sectors?. Production Planning & Control 24, 1032-1049. [CrossRef]

54. Ben Clegg, Jillian MacBryde and Prasanta Dey, Mike Bourne, Andrey Pavlov, Monica Franco-Santos, Lorenzo Lucianetti, Matteo Mura. 2013. Generating organisational performance. International Journal of Operations & Production Management 33:11/12, 1599-1622. [Abstract] [Full Text] [PDF]

55. Kwee Keong Choong. 2013. Understanding the features of performance measurement system: a literature review. Measuring Business Excellence 17:4, 102-121. [Abstract] [Full Text] [PDF]

56. Neetu Yadav, Mahim Sagar. 2013. Performance measurement and management frameworks. Business Process Management Journal 19:6, 947-971. [Abstract] [Full Text] [PDF]

57. Per J. Agrell, Adel Hatami-Marbini. 2013. Frontier-based performance analysis models for supply chain management: State of the art and research directions. Computers & Industrial Engineering 66, 567-583. [CrossRef]

58. Luisa D. Huaccho Huatuco, Jairo Rafael Montoya-Torres, Nicky Shaw and Anisoara Calinescu, Nancy Bocken, David Morgan, Steve Evans. 2013. Understanding environmental performance variation in manufacturing companies. International Journal of Productivity and Performance Management 62:8, 856-870. [Abstract] [Full Text] [PDF]

59. Luisa D. Huaccho Huatuco, Jairo Rafael Montoya-Torres, Nicky Shaw and Anisoara Calinescu, Paolo Taticchi, Flavio Tonelli, Roberto Pasqualino. 2013. Performance measurement of sustainable supply chains. International Journal of Productivity and Performance Management 62:8, 782-804. [Abstract] [Full Text] [PDF]

60. Dimitrios P. Kafetzopoulos, Evangelos L. Psomas, Panagiotis D. Kafetzopoulos. 2013. Measuring the effectiveness of the HACCP Food Safety Management System. Food Control 33, 505-513. [CrossRef]

61. Andreea-Ioana Coste, Adriana Tiron Tudor. 2013. Service Performance - Between Measurement and Information in the Public Sector. Procedia - Social and Behavioral Sciences 92, 215-219. [CrossRef]

62. Graham Manville, Martin Broad. 2013. Changing Times for Charities: Performance management in a Third Sector Housing Association. Public Management Review 15, 992-1010. [CrossRef]

63. Majid Ramezan, Mohammad Ebrahim Sanjaghi, Hassan Rahimian Kalateh Baly. 2013. Organizational change capacity and organizational performance. Journal of Knowledge-based Innovation in China 5:3, 188-212. [Abstract] [Full Text] [PDF]

64. Sara Haji‐Kazemi, Bjørn Andersen. 2013. Application of performance measurement as an early warning system. International Journal of Managing Projects in Business 6:4, 714-738. [Abstract] [Full Text] [PDF]

65. Stefanos C. Doltsinis, Svetan Ratchev, Niels Lohse. 2013. A framework for performance measurement during production ramp-up of assembly stations. European Journal of Operational Research 229, 85-94. [CrossRef]

66. Lamia Berrah, Laurent Foulloy. 2013. Towards a unified descriptive framework for industrial objective declaration and performance measurement. Computers in Industry 64, 650-662. [CrossRef]

67. Elena Giovannoni, Maria Pia Maraghini. 2013. The challenges of integrated performance measurement systems. Accounting, Auditing & Accountability Journal 26:6, 978-1008. [Abstract] [Full Text] [PDF]

68. Vincenza Esposito, Ernesto De Nito, Mario Pezzillo Iacono, Lucia Silvestri. 2013. Dealing with knowledge in the Italian public universities. Journal of Intellectual Capital 14:3, 431-450. [Abstract] [Full Text] [PDF]

69. Salah Alhyari, Moutaz Alazab, Sitalakshmi Venkatraman, Mamoun Alazab, Ammar Alazab. 2013. Performance evaluation of e‐government services using balanced scorecard. Benchmarking: An International Journal 20:4, 512-536. [Abstract] [Full Text] [PDF]

70. Hany Abd Elshakour M. Ali, Ibrahim A. Al-Sulaihi, Khalid S. Al-Gahtani. 2013. Indicators for measuring performance of building construction companies in Kingdom of Saudi Arabia. Journal of King Saud University - Engineering Sciences 25, 125-134. [CrossRef]

71. Andrew Taylor, Margaret Taylor. 2013. Antecedents of effective performance measurement system implementation: an empirical study of UK manufacturing firms. International Journal of Production Research 1-14. [CrossRef]

72. Louis Raymond, Marie Marchand, Josée St-Pierre, Louise Cadieux, François Labelle. 2013. Dimensions of small business performance from the owner-manager's perspective: a re-conceptualization and empirical validation. Entrepreneurship & Regional Development 25, 468-499. [CrossRef]

73. Kwee Keong Choong. 2013. Are PMS meeting the measurement needs of BPM? A literature review. Business Process Management Journal 19:3, 535-574. [Abstract] [Full Text] [PDF]

74. Kai Foerstl, Evi Hartmann, Finn Wynstra, Roger Moser. 2013. Cross‐functional integration and functional coordination in purchasing and supply management. International Journal of Operations & Production Management 33:6, 689-721. [Abstract] [Full Text] [PDF]

75. Patxi Ruiz‐de‐Arbulo‐Lopez, Jordi Fortuny‐Santos, Lluís Cuatrecasas‐Arbós. 2013. Lean manufacturing: costing the value stream. Industrial Management & Data Systems 113:5, 647-668. [Abstract] [Full Text] [PDF]

76. Dewan Md Zahurul Islam, Thomas H. Zunder, Ronald Jorna. 2013. Performance evaluation of an online benchmarking tool for European freight transport chains. Benchmarking: An International Journal 20:2, 233-250. [Abstract] [Full Text] [PDF]

77. Dominique Estampe, Samir Lamouri, Jean-Luc Paris, Sakina Brahim-Djelloul. 2013. A framework for analysing supply chain performance evaluation models. International Journal of Production Economics 142, 247-258. [CrossRef]

78. Marcelo Bronzo, Paulo Tarso Vilela de Resende, Marcos Paulo Valadares de Oliveira, Kevin P. McCormack, Paulo Renato de Sousa, Reinaldo Lopes Ferreira. 2013. Improving performance aligning business analytics with process orientation. International Journal of Information Management 33, 300-307. [CrossRef]

79. Manoj Kumar Mohanty, Padmabati Gahan. 2013. Supplier performance measurement in discrete manufacturing industry – empirical study on Indian manufacturing sector. Journal of Business Economics and Management 14, 330-347. [CrossRef]

80. Mathias Jönsson, Carin Andersson, Jan-Eric Ståhl. 2013. Conditions for and development of an information technology-support tool for manufacturing cost calculations and analyses. International Journal of Computer Integrated Manufacturing 26, 303-315. [CrossRef]

81. Hannes Johnson, Mikael Johansson, Karin Andersson, Björn Södahl. 2013. Will the ship energy efficiency management plan reduce CO2 emissions? A comparison with ISO 50001 and the ISM code. Maritime Policy & Management 40, 177-190. [CrossRef]

82. Constantin Blome, Tobias Schoenherr, Daniel Rexhausen. 2013. Antecedents and enablers of supply chain agility and its effect on performance: a dynamic capabilities perspective. International Journal of Production Research 51, 1295-1318. [CrossRef]

83. Rahat Munir, Kevin Baird, Sujatha Perera. 2013. Performance measurement system change in an emerging economy bank. Accounting, Auditing & Accountability Journal 26:2, 196-233. [Abstract] [Full Text] [PDF]

84. Tuomas Korhonen, Teemu Laine, Petri Suomala. 2013. Understanding performance measurement dynamism: a case study. Journal of Management & Governance 17, 35-58. [CrossRef]

85. Jennifer A. Farris, Eileen M. Van Aken, Geert Letens. Organizational Performance Measurement. [CrossRef]

86. Fiorenzo Franceschini, Elisa Turina. 2013. Quality improvement and redesign of performance measurement systems: an application to the academic field. Quality & Quantity 47, 465-483. [CrossRef]

87. Horst Meier, Henning Lagemann, Friedrich Morlock, Christian Rathmann. 2013. Key Performance Indicators for Assessing the Planning and Delivery of Industrial Services. Procedia CIRP 11, 99-104. [CrossRef]

88. Laura Grosswiele, Maximilian Röglinger, Bettina Friedl. 2013. A decision framework for the consolidation of performance measurement systems. Decision Support Systems 54, 1016-1029. [CrossRef]

89. Evangelos L. Psomas, Dimitrios P. Kafetzopoulos, Christos V. Fotopoulos. 2012. Developing and validating a measurement instrument of ISO 9001 effectiveness in food manufacturing SMEs. Journal of Manufacturing Technology Management 24:1, 52-77. [Abstract] [Full Text] [PDF]

90. Josep Bisbe, Ricardo Malagueño. 2012. Using strategic performance measurement systems for strategy formulation: Does it work in dynamic environments?. Management Accounting Research 23, 296-311. [CrossRef]

91. Rafael Henrique Palma Lima, Luiz Cesar Ribeiro Carpinetti. 2012. Analysis of the interplay between knowledge and performance management in industrial clusters. Knowledge Management Research & Practice 10, 368-379. [CrossRef]

92. Alexandre Veronese Bentes, Jorge Carneiro, Jorge Ferreira da Silva, Herbert Kimura. 2012. Multidimensional assessment of organizational performance: Integrating BSC and AHP. Journal of Business Research 65, 1790-1799. [CrossRef]

93. Khalfan Zahran Al Hijji, Andrew M. Cox. 2012. Performance measurement methods at academic libraries in Oman. Performance Measurement and Metrics 13:3, 183-196. [Abstract] [Full Text] [PDF]

94. Pedro Gustavo Siqueira Ferreira, Edson Pinheiro de Lima, Sergio E. Gouvea da Costa. 2012. Perception of virtual team’s performance: A multinational exercise. International Journal of Production Economics 140, 416-430. [CrossRef]

95. Bunjongjit Rompho, Sununta Siengthai. 2012. Integrated performance measurement system for firm's human capital building. Journal of Intellectual Capital 13:4, 482-514. [Abstract] [Full Text] [PDF]

96. Ahmad Saleh Shatat, Zulkifli Mohamed Udin. 2012. The relationship between ERP system and supply chain management performance in Malaysian manufacturing companies. Journal of Enterprise Information Management 25:6, 576-604. [Abstract] [Full Text] [PDF]

97. Fahriye Uysal. 2012. An Integrated Model for Sustainable Performance Measurement in Supply Chain. Procedia - Social and Behavioral Sciences 62, 689-694. [CrossRef]

98. Tajul Ariffin Masron, Zamri Ahmad, Norizan Baba Rahim. 2012. Key Performance Indicators vs Key Intangible Performance Among Academic Staff: A Case Study of a Public University in Malaysia. Procedia - Social and Behavioral Sciences 56, 494-503. [CrossRef]

99. M. Gupta, S. Andersen. 2012. Revisiting local TOC measures in an internal supply chain: a note. International Journal of Production Research 50, 5363-5371. [CrossRef]

100. Umit Bititci, Patrizia Garengo, Viktor Dörfler, Sai Nudurupati. 2012. Performance Measurement: Challenges for Tomorrow. International Journal of Management Reviews 14, 305-327. [CrossRef]

101. Jan Stentoft Arlbjørn, Teit Lüthje. 2012. Global operations and their interaction with supply chain performance. Industrial Management & Data Systems 112:7, 1044-1064. [Abstract] [Full Text] [PDF]

102. Ricardo Chalmeta, Sergio Palomero, Magali Matilla. 2012. Methodology to develop a performance measurement system in small and medium-sized enterprises. International Journal of Computer Integrated Manufacturing 25, 716-740. [CrossRef]

103. Pedro Sena Ferreira, A.H.M. Shamsuzzoha, Cesar Toscano, Pedro Cunha. 2012. Framework for performance measurement and management in a collaborative business environment. International Journal of Productivity and Performance Management 61:6, 672-690. [Abstract] [Full Text] [PDF]

104. Radoslav Škapa, Alena Klapalová. 2012. Reverse logistics in Czech companies: increasing interest in performance measurement. Management Research Review 35:8, 676-692. [Abstract] [Full Text] [PDF]

105. Junyang Shao, Roger Moser, Michael Henke. 2012. Multidimensional supply performance framework: A conceptual development and empirical analysis. International Journal of Production Economics 138, 26-34. [CrossRef]

106. Marcelo Bronzo, Marcos Paulo Valadares de Oliveira, Kevin McCormack. 2012. Planning, capabilities, and performance: an integrated value approach. Management Decision 50:6, 1001-1021. [Abstract] [Full Text] [PDF]

107. P.R.C. Gopal, Jitesh Thakkar. 2012. A review on supply chain performance measures and metrics: 2000‐2011. International Journal of Productivity and Performance Management 61:5, 518-547. [Abstract] [Full Text] [PDF]

108. Nopadol Rompho, Sakun Boon‐itt. 2012. Measuring the success of a performance measurement system in Thai firms. International Journal of Productivity and Performance Management 61:5, 548-562. [Abstract] [Full Text] [PDF]

109. Stefan Strecker, Ulrich Frank, David Heise, Heiko Kattenstroth. 2012. MetricM: a modeling method in support of the reflective design and use of performance measurement systems. Information Systems and e-Business Management 10, 241-276. [CrossRef]

110. Sima Ajami, Saeedeh Ketabi. 2012. Performance Evaluation of Medical Records Departments by Analytical Hierarchy Process (AHP) Approach in the Selected Hospitals in Isfahan. Journal of Medical Systems 36, 1165-1171. [CrossRef]

111. H. J. Doeleman, Steven ten Have, Kees Ahaus. 2012. The moderating role of leadership in the relationship between management control and business excellence. Total Quality Management & Business Excellence 23, 591-611. [CrossRef]

112. Paolo Taticchi, Kashi Balachandran, Flavio Tonelli. 2012. Performance measurement and management systems: state of the art, guidelines for design and challenges. Measuring Business Excellence 16:2, 41-54. [Abstract] [Full Text] [PDF]

113. Gianpaolo Iazzolino, Domenico Laise, Laura Marraro. 2012. Business multicriteria performance analysis: a tutorial. Benchmarking: An International Journal 19:3, 395-411. [Abstract] [Full Text] [PDF]

114. Christian Homburg, Martin Artz, Jan Wieseke. 2012. Marketing Performance Measurement Systems: Does Comprehensiveness Really Improve Performance?. Journal of Marketing 76, 56-77. [CrossRef]

115. John Taylor, Claire Baines. 2012. Performance management in UK universities: implementing the Balanced Scorecard. Journal of Higher Education Policy and Management 34, 111-124. [CrossRef]

116. Andreas Taschner. 2012. Umsetzungsoptionen eines KMU-spezifischen Kennzahlen-Managements. ZfKE – Zeitschrift für KMU und Entrepreneurship 60, 111-135. [CrossRef]

117. Gerhard Speckbacher, Paul Wentges. 2012. The impact of family control on the use of performance measures in strategic target setting and incentive compensation: A research note. Management Accounting Research 23, 34-46. [CrossRef]

118. Peter Letmathe, Marcus Schweitzer, Marc Zielinski. 2012. How to learn new tasks: Shop floor performance effects of knowledge transfer and performance feedback. Journal of Operations Management 30, 221-236. [CrossRef]

119. Narimantas Kazimieras Paliulis, Laura Uturytė–Vrubliauskienė. Performance Measurement System to Evaluate the Efficiency of E-Business 895-903. [CrossRef]

120. Kaushik Mandal, Chandan Kumar Banerjee. 2012. An Empirical Identification of Performance Gap in Engineering Education Program from the Perspective of Stakeholders. International Journal of Trade, Economics and Finance 281-292. [CrossRef]

121. Parveen Farooquie, Abdul Gani, Arsalanullah K Zuberi, Imran Hashmi. 2012. An empirical study of innovation-performance linkage in the paper industry. Journal of Industrial Engineering International 8, 23. [CrossRef]

122. Mattias Elg, Beata Kollberg. 2012. Conditions for reporting performance measurement. Total Quality Management & Business Excellence 23, 63-77. [CrossRef]

123. Laura Uturytė-Vrubliauskienė, Eglė Kvieskaitė. Social Network in Higher Educational Institution: Effective Management Discourse 937-942. [CrossRef]

124. Mathies Pohl, Kai Förstl. 2011. Achieving purchasing competence through purchasing performance measurement system design—A multiple-case study analysis. Journal of Purchasing and Supply Management 17, 231-245. [CrossRef]

125. Klaus Möller, Ramin Gamerschlag, Finn Guenther. 2011. Determinants and effects of human capital reporting and controlling. Journal of Management Control 22, 311-333. [CrossRef]

126. Anthony A. Hines, António Pimenta da Gama. 2011. An expanded model of marketing performance. Marketing Intelligence & Planning 29:7, 643-661. [Abstract] [Full Text] [PDF]

127. Renata Gomes Frutuoso Braz, Luiz Felipe Scavarda, Roberto Antonio Martins. 2011. Reviewing and improving performance measurement systems: An action research. International Journal of Production Economics 133, 751-760. [CrossRef]

128. Suwit Srimai, Jack Radford, Chris Wright. 2011. Evolutionary paths of performance measurement. International Journal of Productivity and Performance Management 60:7, 662-687. [Abstract] [Full Text] [PDF]

129. W.H. Ip, S.L. Chan, C.Y. Lam. 2011. Modeling supply chain performance and stability. Industrial Management & Data Systems 111:8, 1332-1354. [Abstract] [Full Text] [PDF]

130. Mohammad Ali Shafia, Mohammad Mahdavi Mazdeh, Mahboobeh Vahedi, Mehrdokht Pournader. 2011. Applying fuzzy balanced scorecard for evaluating the CRM performance. Industrial Management & Data Systems 111:7, 1105-1135. [Abstract] [Full Text] [PDF]

131. Ioannis E. Tsolas. 2011. Modelling profitability and effectiveness of Greek-listed construction firms: an integrated DEA and ratio analysis. Construction Management and Economics 29, 795-807. [CrossRef]

132. Kim Sundtoft Hald, Chris Ellegaard. 2011. Supplier evaluation processes: the shaping and reshaping of supplier performance. International Journal of Operations & Production Management 31:8, 888-910. [Abstract] [Full Text] [PDF]

133. Nik Elyna Myeda, Syahrul Nizam Kamaruzzaman, Michael Pitt. 2011. Measuring the performance of office buildings maintenance management in Malaysia. Journal of Facilities Management 9:3, 181-199. [Abstract] [Full Text] [PDF]

134. Md Habib‐Uz‐Zaman Khan, Abdel K. Halabi, Kurt Sartorius. 2011. The use of multiple performance measures and the balanced scorecard (BSC) in Bangladeshi firms. Journal of Accounting in Emerging Economies 1:2, 160-190. [Abstract] [Full Text] [PDF]

135. F A F Ferreira, S P Santos, P M M Rodrigues. 2011. Adding value to bank branch performance evaluation using cognitive maps and MCDA: a case study. Journal of the Operational Research Society 62, 1320-1333. [CrossRef]

136. SHUN CHUAN HO, WILLIAM YU CHUNG WANG, DAVID J. PAULEEN, PING HO TING. 2011. PERSPECTIVES ON THE PERFORMANCE OF SUPPLY CHAIN SYSTEMS: THE EFFECTS OF ATTITUDE AND ASSIMILATION. International Journal of Information Technology & Decision Making 10, 635-658. [CrossRef]

137. Beata Kollberg, Mattias Elg. 2011. The practice of the Balanced Scorecard in health care services. International Journal of Productivity and Performance Management 60:5, 427-445. [Abstract] [Full Text] [PDF]

138. Changiz Valmohammadi, Azadeh Servati. 2011. Performance measurement system implementation using Balanced Scorecard and statistical methods. International Journal of Productivity and Performance Management 60:5, 493-511. [Abstract] [Full Text] [PDF]

139. Yufeng Zhang, Mike Gregory. 2011. Managing global network operations along the engineering value chain. International Journal of Operations & Production Management 31:7, 736-764. [Abstract] [Full Text] [PDF]

140. M. Adel El-Baz. 2011. Fuzzy performance measurement of a supply chain in manufacturing companies. Expert Systems with Applications 38, 6681-6688. [CrossRef]

141. A.R. Rezaei, T. Çelik, Y. Baalousha. 2011. Performance measurement in a quality management system. Scientia Iranica 18, 742-752. [CrossRef]

142. Carlos Ferreira Gomes, Mahmoud M. Yasin. 2011. Toward the promotion of effective performance of entry‐level managers: the case of Portugal. Competitiveness Review 21:3, 288-305. [Abstract] [Full Text] [PDF]

143. Istemi Demirag, Iqbal Khadaroo. 2011. Accountability and value for money: a theoretical framework for the relationship in public–private partnerships. Journal of Management & Governance 15, 271-296. [CrossRef]

144. Kang Zhao, Akhil Kumar, John Yen. 2011. Achieving High Robustness in Supply Distribution Networks by Rewiring. IEEE Transactions on Engineering Management 58, 347-362. [CrossRef]

145. Bo Enquist, Carolina Camén, Mikael Johnson. 2011. Contractual governance for public service value networks. Journal of Service Management 22:2, 217-240. [Abstract] [Full Text] [PDF]

146. Danilo Hisano Barbosa, Marcel Andreotti Musetti. 2011. The use of performance measurement system in logistics change process. International Journal of Productivity and Performance Management 60:4, 339-359. [Abstract] [Full Text] [PDF]

147. P. Parthiban, Mark Goh. 2011. An integrated model for performance management of manufacturing units. Benchmarking: An International Journal 18:2, 261-281. [Abstract] [Full Text] [PDF]

148. Michael Bourlakis, T.C. Melewar, Ruth Banomyong, Nucharee Supatn. 2011. Selecting logistics providers in Thailand: a shippers' perspective. European Journal of Marketing 45:3, 419-437. [Abstract] [Full Text] [PDF]

149. R. Rajesh, S. Pugazhendhi, K. Ganesh, C. Muralidharan, R. Sathiamoorthy. 2011. Influence of 3PL service offerings on client performance in India. Transportation Research Part E: Logistics and Transportation Review 47, 149-165. [CrossRef]

150. S.S. Nudurupati, U.S. Bititci, V. Kumar, F.T.S. Chan. 2011. State of the art literature review on performance measurement. Computers & Industrial Engineering 60, 279-290. [CrossRef]

151. Adrienn Molnár, Xavier Gellynck, Filiep Vanhonacker, Taras Gagalyuk, Wim Verbeke. 2011. Do chain goals match consumer perceptions? the case of the traditional food sector in selected European Union countries. Agribusiness 27, 221-243. [CrossRef]

152. Markus Perkmann, Andy Neely, Kathryn Walsh. 2011. How should firms evaluate success in university-industry alliances? A performance measurement system. R&D Management 41, 202-216. [CrossRef]

153. Dawei Lu, Alan Betts. 2011. Why process improvement training fails. Journal of Workplace Learning 23:2, 117-132. [Abstract] [Full Text] [PDF]

154. Andrew Potter, Paul Childerhouse, Ruth Banomyong, Nucharee Supatn. 2011. Developing a supply chain performance tool for SMEs in Thailand. Supply Chain Management: An International Journal 16:1, 20-31. [Abstract] [Full Text] [PDF]

155. Andrey Pavlov, Mike Bourne. 2011. Explaining the effects of performance measurement on performance. International Journal of Operations & Production Management 31:1, 101-122. [Abstract] [Full Text] [PDF]

156. Fuyume Sai. 2011. Applying Fuzzy Outranking Method to Measurement System of Multidimensional Performance. Procedia - Social and Behavioral Sciences 25, 345-352. [CrossRef]

157. N.E.M. Nik-Mat, S.N. Kamaruzzaman, M. Pitt. 2011. Assessing The Maintenance Aspect of Facilities Management through a Performance Measurement System: A Malaysian Case Study. Procedia Engineering 20, 329-338. [CrossRef]

158. Ki Yeon Chung, Yoon Seok Chang, Hannu Vanharanta, Jussi Kantola. 2011. Evaluation of radio-frequency identification projects in public sectors of Korea. Human Factors and Ergonomics in Manufacturing & Service Industries 21, 44-51. [CrossRef]

159. Angele Pieters, Charlotte van Oirschot, Henk Akkermans. 2010. No cure for all evils. International Journal of Operations & Production Management 30:11, 1112-1139. [Abstract] [Full Text] [PDF]

160. Qiang Wang, Baofeng Huo, Fujun Lai, Zhaofang Chu. 2010. Understanding performance drivers of third‐party logistics providers in mainland China. Industrial Management & Data Systems 110:9, 1273-1296. [Abstract] [Full Text] [PDF]

161. Dimitris Papakiriakopoulos, Katerina Pramatari. 2010. Collaborative performance measurement in supply chain. Industrial Management & Data Systems 110:9, 1297-1318. [Abstract] [Full Text] [PDF]

162. Ruggero Sainaghi. 2010. A meta‐analysis of hotel performance. Continental or worldwide style?. Tourism Review 65:3, 46-69. [Abstract] [Full Text] [PDF]

163. Lie-Chien Lin, Tzu-Su Li. 2010. An integrated framework for supply chain performance measurement using six-sigma metrics. Software Quality Journal 18, 387-406. [CrossRef]

164. Timothy Tunde Oladokun. 2010. Towards value‐creating corporate real estate assets management in emerging economies. Journal of Property Investment & Finance 28:5, 354-364. [Abstract] [Full Text] [PDF]

165. Roman Schneider, Rui Vieira. 2010. Insights from action research: implementing the balanced scorecard at a wind‐farm company. International Journal of Productivity and Performance Management 59:5, 493-507. [Abstract] [Full Text] [PDF]

166. Fernando Bernardi de Souza, Sílvio R.I. Pires. 2010. Theory of constraints contributions to outbound logistics. Management Research Review 33:7, 683-700. [Abstract] [Full Text] [PDF]

167. Ioannis Manikas, Leon A. Terry. 2010. A case study assessment of the operational performance of a multiple fresh produce distribution centre in the UK. British Food Journal 112:6, 653-667. [Abstract] [Full Text] [PDF]

168. Professor Joseph Sarkis, Sarah Shaw, David B. Grant, John Mangan. 2010. Developing environmental supply chain performance measures. Benchmarking: An International Journal 17:3, 320-339. [Abstract] [Full Text] [PDF]

169. Paul M. Gibbons, Stuart C. Burgess. 2010. Introducing OEE as a measure of lean Six Sigma capability. International Journal of Lean Six Sigma 1:2, 134-156. [Abstract] [Full Text] [PDF]

170. Abhijit Basu, Rosemary Howell, Deepa Gopinath. 2010. Clinical performance indicators: intolerance for variety?. International Journal of Health Care Quality Assurance 23:4, 436-449. [Abstract] [Full Text] [PDF]

171. Mahmoud M. Yasin, Carlos F. Gomes. 2010. Performance management in service operational settings: a selective literature examination. Benchmarking: An International Journal 17:2, 214-231. [Abstract] [Full Text] [PDF]

172. R.M. Chandima Ratnayake, Tore Markeset. 2010. Technical integrity management: measuring HSE awareness using AHP in selecting a maintenance strategy. Journal of Quality in Maintenance Engineering 16:1, 44-63. [Abstract] [Full Text] [PDF]

173. Gunjan Soni, Rambabu Kodali. 2010. Internal benchmarking for assessment of supply chain performance. Benchmarking: An International Journal 17:1, 44-76. [Abstract] [Full Text] [PDF]

174. Jussi Lehtinen, Tuomas Ahola. 2010. Is performance measurement suitable for an extended enterprise?. International Journal of Operations & Production Management 30:2, 181-204. [Abstract] [Full Text] [PDF]

175. Ben Clegg, Alberto Bayo‐Moriones, Alejandro Bello‐Pintado, Javier Merino‐Díaz de Cerio. 2010. 5S use in manufacturing plants: contextual factors and impact on operating performance. International Journal of Quality & Reliability Management 27:2, 217-230. [Abstract] [Full Text] [PDF]

176. K. Hafeez, K.H.A. Keoy, M. Zairi, R. Hanneman, S.C. Lenny Koh. 2010. E-supply chain operational and behavioural perspectives: an empirical study of Malaysian SMEs. International Journal of Production Research 48, 525-546. [CrossRef]

177. Alex Nunn. Performance management and neo-liberal labour market governance: the case of the UK 77-99. [Abstract] [Full Text] [PDF]

178. Al Bento, Lourdes Ferreira White. An exploratory study of strategic performance measurement systems 1-26. [Abstract] [Full Text] [PDF]

179. Barbara B. Flynn, Baofeng Huo, Xiande Zhao. 2010. The impact of supply chain integration on performance: A contingency and configuration approach. Journal of Operations Management 28, 58-71. [CrossRef]

180. Rafael Henrique Palma Lima, Luiz Cesar Ribeiro Carpinetti. 2010. Proposal of a method for performance measurement system design and implementation of a software application in SMEs. International Journal of Business Performance Management 12, 182. [CrossRef]

181. Matthieu Lauras, Guillaume Marques, Didier Gourc. 2010. Towards a multi-dimensional project Performance Measurement System. Decision Support Systems 48, 342-353. [CrossRef]

182. Subhash Wadhwa, Madhawanand Mishra, Felix T.S. Chan, Y. Ducq. 2010. Effects of information transparency and cooperation on supply chain performance: a simulation study. International Journal of Production Research 48, 145-166. [CrossRef]

183. Sarah Shaw, David B. Grant, John Mangan. 2010. Developing environmental supply chain performance measures. Benchmarking: An International Journal 17, 320-339. [CrossRef]

184. P. A. Smart, H. Maddern, R. S. Maull. 2009. Understanding Business Process Management: Implications for Theory and Practice. British Journal of Management 20, 491-507. [CrossRef]

185. S. Zorzut, V. Jovan, D. Gradišar, G. Mušič. 2009. Closed-loop control of a polymerisation plant using production performance indicators (PIs). International Journal of Computer Integrated Manufacturing 22, 1128-1143. [CrossRef]

186. André de Waal, Karima Kourtit, Peter Nijkamp. 2009. The relationship between the level of completeness of a strategic performance management system and perceived advantages and disadvantages. International Journal of Operations & Production Management 29:12, 1242-1265. [Abstract] [Full Text] [PDF]

187. Luis E. Quezada, Felisa M. Cordova, Pedro Palominos, Katherine Godoy, Jocelyn Ross. 2009. Method for identifying strategic objectives in strategy maps. International Journal of Production Economics 122, 492-500. [CrossRef]

188. Murali Sambasivan, Tamizarasu Nandan, Zainal Abidin Mohamed. 2009. Consolidation of performance measures in a supply chain environment. Journal of Enterprise Information Management 22:6, 660-689. [Abstract] [Full Text] [PDF]

189. Christos V. Fotopoulos, Dimitrios P. Kafetzopoulos, Evangelos L. Psomas. 2009. Assessing the critical factors and their impact on the effective implementation of a food safety management system. International Journal of Quality & Reliability Management 26:9, 894-910. [Abstract] [Full Text] [PDF]

190. Khalid Al-Mutawah, Vincent Lee, Yen Cheung. 2009. A new multi-agent system framework for tacit knowledge management in manufacturing supply chains. Journal of Intelligent Manufacturing 20, 593-610. [CrossRef]

191. Sebastjan Zorzut, Dejan Gradišar, Vladimir Jovan, Gašper Mušič. 2009. Use of a procedural model in the design of production control for a polymerization plant. The International Journal of Advanced Manufacturing Technology 44, 1051-1062. [CrossRef]

192. MARK FRANCIS. 2009. PRIVATE-LABEL NPD PROCESS IMPROVEMENT IN THE UK FAST MOVING CONSUMER GOODS INDUSTRY. International Journal of Innovation Management 13, 467-499. [CrossRef]

193. Karen Fryer, Jiju Antony, Susan Ogden. 2009. Performance management in the public sector. International Journal of Public Sector Management 22:6, 478-498. [Abstract] [Full Text] [PDF]

194. Dr Hokey Min, Jitesh Thakkar, Arun Kanda, S.G. Deshmukh. 2009. Supply chain performance measurement framework for small and medium scale enterprises. Benchmarking: An International Journal 16:5, 702-723. [Abstract] [Full Text] [PDF]

195. Alan Meekings, Simon Povey, Andy Neely. 2009. Performance plumbing: installing performance management systems to deliver lasting value. Measuring Business Excellence 13:3, 13-19. [Abstract] [Full Text] [PDF]

196. Olivia Pinon, Kelly Fry, John-Paul Clarke. The Air Transportation System as a Supply Chain. [CrossRef]

197. Mithat Zeydan, Cüneyt Çolpan. 2009. A new decision support system for performance measurement using combined fuzzy TOPSIS/DEA approach. International Journal of Production Research 47, 4327-4349. [CrossRef]

198. Taco Elzinga, Bé Albronda, Frits Kluijtmans. 2009. Behavioral factors influencing performance management systems' use. International Journal of Productivity and Performance Management 58:6, 508-522. [Abstract] [Full Text] [PDF]

199. Andrea Kasztler, Karl‐Heinz Leitner. 2009. An SNA‐based approach for management control of intellectual capital. Journal of Intellectual Capital 10:3, 329-340. [Abstract] [Full Text] [PDF]

200. Ola Johansson, Henrik Pålsson. 2009. The impact of Auto‐ID on logistics performance. Benchmarking: An International Journal 16:4, 504-522. [Abstract] [Full Text] [PDF]

201. Manoochehr Najmi, Rahim Ehsani, Ahmad Sharbatoghlie, Mohammad Saidi‐Mehrabad. 2009. Developing an integrated dynamic model for evaluating the performance of research projects based on multiple attribute utility theory. Journal of Modelling in Management 4:2, 114-133. [Abstract] [Full Text] [PDF]

202. Jacques Martin, Claudio Baccarani, Glória Antunes, António Pires, Virgílio Machado. 2009. Process improvement measures in social area organisations. The TQM Journal 21:4, 334-352. [Abstract] [Full Text] [PDF]

203. Alan Pilkington, Jack Meredith. 2009. The evolution of the intellectual structure of operations management—1980–2006: A citation/co-citation analysis. Journal of Operations Management 27, 185-202. [CrossRef]

204. Jim Rooney, Suresh Cuganesan. 2009. Contractual and Accounting Controls in Outsourcing Agreements: Evidence from the Australian Home Loan Industry. Australian Accounting Review 19, 80-92. [CrossRef]

205. Ioannis Manikas, Leon A. Terry. 2009. A case study assessment of the operational performance of a multiple fresh produce distribution centre in the UK. British Food Journal 111:5, 421-435. [Abstract] [Full Text] [PDF]

206. Kerry Walsh, Jiju Antony. 2009. An assessment of quality costs within electronic adverse incident reporting and recording systems. International Journal of Health Care Quality Assurance 22:3, 203-220. [Abstract] [Full Text] [PDF]

207. Thomas Åhrén, Aditya Parida. 2009. Maintenance performance indicators (MPIs) for benchmarking the railway infrastructure. Benchmarking: An International Journal 16:2, 247-258. [Abstract] [Full Text] [PDF]

208. Mattias Elg, Beata Kollberg. 2009. Alternative arguments and directions for studying performance measurement. Total Quality Management & Business Excellence 20, 409-421. [CrossRef]

209. Stefan Schmid, Katharina Kretschmer. 2009. Performance evaluation of foreign subsidiaries: A review of the literature and a contingency framework. International Journal of Management Reviews. [CrossRef]

210. VITTORIO CHIESA, FEDERICO FRATTINI, VALENTINA LAZZAROTTI, RAFFAELLA MANZINI. 2009. AN EXPLORATORY STUDY ON R&D PERFORMANCE MEASUREMENT PRACTICES: A SURVEY OF ITALIAN R&D-INTENSIVE FIRMS. International Journal of Innovation Management 13, 65-104. [CrossRef]

211. Luca Mari, Valentina Lazzarotti, Raffaella Manzini. 2009. Measurement in soft systems: Epistemological framework and a case study. Measurement 42, 241-253. [CrossRef]

212. Vittorio Chiesa, Federico Frattini, Valentina Lazzarotti, Raffaella Manzini. 2009. Performance measurement of research and development activities. European Journal of Innovation Management 12:1, 25-61. [Abstract] [Full Text] [PDF]

213. Ali Uyar. 2009. Quality performance measurement practices in manufacturing companies. The TQM Journal 21:1, 72-86. [Abstract] [Full Text] [PDF]

214. Stephan Schmidberger, Lydia Bals, Evi Hartmann, Christopher Jahns. 2009. Ground handling services at European hub airports: Development of a performance measurement system for benchmarking. International Journal of Production Economics 117, 104-116. [CrossRef]

215. Gunjan Soni, Rambabu Kodali. 2009. Performance value analysis for the justification of the leagile supply chain. International Journal of Business Performance Management 11, 96. [CrossRef]

216. Ygal Bendavid, Élisabeth Lefebvre, Louis A. Lefebvre, Samuel Fosso-Wamba. 2009. Key performance indicators for the evaluation of RFID-enabled B-to-B e-commerce applications: the case of a five-layer supply chain. Information Systems and e-Business Management 7, 1-20. [CrossRef]

217. Rodney McAdam, Shirley‐Ann Hazlett, Karen Anderson‐Gillespie. 2008. Developing a conceptual model of lead performance measurement and benchmarking. International Journal of Operations & Production Management 28:12, 1153-1185. [Abstract] [Full Text] [PDF]

218. Dimitrios Koufopoulos, Vassilios Zoumbos, Maria Argyropoulou, Jaideep Motwani. 2008. Top management team and corporate performance: a study of Greek firms. Team Performance Management: An International Journal 14:7/8, 340-363. [Abstract] [Full Text] [PDF]

219. C.Y. Lam, S.L. Chan, W.H. Ip, C.W. Lau. 2008. Collaborative supply chain network using embedded genetic algorithms. Industrial Management & Data Systems 108:8, 1101-1110. [Abstract] [Full Text] [PDF]

220. Carmen Carnero, Sheila Delgado. 2008. Maintenance audit by means of value analysis technique and decision rules. Journal of Quality in Maintenance Engineering 14:4, 329-342. [Abstract] [Full Text] [PDF]

221. Robert Johnston, Panupak Pongatichat. 2008. Managing the tension between performance measurement and strategy: coping strategies. International Journal of Operations & Production Management 28:10, 941-967. [Abstract] [Full Text] [PDF]

222. I.P.S. Ahuja, J.S. Khamba. 2008. Total productive maintenance: literature review and directions. International Journal of Quality & Reliability Management 25:7, 709-756. [Abstract] [Full Text] [PDF]

223. Baofeng Huo, Willem Selen, Jeff Hoi Yan Yeung, Xiande Zhao. 2008. Understanding drivers of performance in the 3PL industry in Hong Kong. International Journal of Operations & Production Management 28:8, 772-800. [Abstract] [Full Text] [PDF]

224. Prasanta K. Dey, Seetharaman Hariharan, Ozren Despic. 2008. Managing healthcare performance in analytical framework. Benchmarking: An International Journal 15:4, 444-468. [Abstract] [Full Text] [PDF]

225. Alberto Grando, Valeria Belvedere. 2008. Exploiting the balanced scorecard in the Operations Department: the Ducati Motor Holding case. Production Planning & Control 19, 495-507. [CrossRef]

226. Kevin McCormack, Marcelo Bronzo Ladeira, Marcos Paulo Valadares de Oliveira. 2008. Supply chain maturity and performance in Brazil. Supply Chain Management: An International Journal 13:4, 272-282. [Abstract] [Full Text] [PDF]

227. Marie Marchand, Louis Raymond. 2008. Researching performance measurement systems. International Journal of Operations & Production Management 28:7, 663-686. [Abstract] [Full Text] [PDF]

228. Shuki Dror. 2008. The Balanced Scorecard versus quality award models as strategic frameworks. Total Quality Management & Business Excellence 19, 583-593. [CrossRef]

229. Marvin B. Lieberman, Jina Kang. 2008. How to measure company productivity using value-added: A focus on Pohang Steel (POSCO). Asia Pacific Journal of Management 25, 209-224. [CrossRef]

230. Ronald Gleich, Jaideep Motwani, Andreas Wald. 2008. Process benchmarking: a new tool to improve the performance of overhead areas. Benchmarking: An International Journal 15:3, 242-256. [Abstract] [Full Text] [PDF]

231. Joongsan Oh, Seung‐Kyu Rhee. 2008. The influence of supplier capabilities and technology uncertainty on manufacturer‐supplier collaboration. International Journal of Operations & Production Management 28:6, 490-517. [Abstract] [Full Text] [PDF]

232. Chieh-Yu Lin. 2008. Determinants of the adoption of technological innovations by logistics service providers in China. International Journal of Technology Management and Sustainable Development 7, 19-38. [CrossRef]

233. Jean-François Henri, Marc Journeault. 2008. Environmental performance indicators: An empirical study of Canadian manufacturing firms. Journal of Environmental Management 87, 165-176. [CrossRef]

234. Ben Clegg, Yufeng Zhang, Mike Gregory, Yongjiang Shi. 2008. Global engineering networks (GEN). Journal of Manufacturing Technology Management 19:3, 299-314. [Abstract] [Full Text] [PDF]

235. Panupak Pongatichat, Robert Johnston. 2008. Exploring strategy‐misaligned performance measurement. International Journal of Productivity and Performance Management 57:3, 207-222. [Abstract] [Full Text] [PDF]

236. Mostafa Jazayeri, Robert W. Scapens. 2008. The Business Values Scorecard within BAE Systems: The evolution of a performance measurement system. The British Accounting Review 40, 48-70. [CrossRef]

237. Wai Peng Wong, Kuan Yew Wong. 2008. A review on benchmarking of supply chain performance measures. Benchmarking: An International Journal 15:1, 25-51. [Abstract] [Full Text] [PDF]

238. Paul D. Cousins, Benn Lawson, Brian Squire. 2008. Performance measurement in strategic buyer‐supplier relationships. International Journal of Operations & Production Management 28:3, 238-258. [Abstract] [Full Text] [PDF]

239. Matthew Hall. 2008. The effect of comprehensive performance measurement systems on role clarity, psychological empowerment and managerial performance. Accounting, Organizations and Society 33, 141-163. [CrossRef]

240. I.P.S. Ahuja, J.S. Khamba. 2008. An evaluation of TPM initiatives in Indian industry for enhanced manufacturing performance. International Journal of Quality & Reliability Management 25:2, 147-172. [Abstract] [Full Text] [PDF]

241. Benita M. Beamon, Burcu Balcik. 2008. Performance measurement in humanitarian relief chains. International Journal of Public Sector Management 21:1, 4-25. [Abstract] [Full Text] [PDF]

242. Dejan Gradišar, Sebastjan Zorzut, Vladimir Jovan. 2008. Production Control of a Polymerization Plant Based on Production Performance Indicators. Organizacija 41. [CrossRef]

243. Dia Zeglat, Yuksel Ekinci, Andrew Lockwood. Service quality and business performance 209-236. [CrossRef]

244. P.L. González Torre, V. Ordóñez Alvarez, A. Zapico Granda. 2008. CARACTERIZACIÓN DE LA INDUSTRIA AUXILIAR DEL AUTOMÓVIL ATENDIENDO AL GRADO DE IMPLANTACIÓN DE LAS PRÁCTICAS MEDIOAMBIENTALES. Investigaciones Europeas de Dirección y Economía de la Empresa 14, 35-50. [CrossRef]

245. M. Bennour, D. Crestani. 2007. Formalization of a process activity performance estimation approach using human competencies. International Journal of Production Research 45, 5743-5768. [CrossRef]

246. Krystin Zigan, Fraser Macfarlane, Terry Desombre. 2007. Intangible resources as performance drivers in European hospitals. International Journal of Productivity and Performance Management 57:1, 57-71. [Abstract] [Full Text] [PDF]

247. Sujatha Perera, Pamela Baker. 2007. Performance measurement practices in small and medium size manufacturing enterprises in Australia. Small Enterprise Research 15, 10-30. [CrossRef]

248. Adem Göleç, Harun Taşkın. 2007. Novel methodologies and a comparative study for manufacturing systems performance evaluations. Information Sciences 177, 5253-5274. [CrossRef]

249. Yingli Wang, Chandra S. Lalwani. 2007. Using e‐business to enable customised logistics sustainability. The International Journal of Logistics Management 18:3, 402-419. [Abstract] [Full Text] [PDF]

250. Massimiliano Bonacchi, Leonardo Rinaldi. 2007. DartBoards and Clovers as new tools in sustainability planning and control. Business Strategy and the Environment 16, 461-473. [CrossRef]

251. Carlos F. Gomes, Mahmoud M. Yasin, João V. Lisboa. 2007. The dimensionality and utilization of performance measures in a manufacturing operational context. Cross Cultural Management: An International Journal 14:4, 286-306. [Abstract] [Full Text] [PDF]

252. Helen Reijonen, Raija Komppula. 2007. Perception of success and its effect on small firm performance. Journal of Small Business and Enterprise Development 14:4, 689-701. [Abstract] [Full Text] [PDF]

253. Carlos F. Gomes, Mahmoud M. Yasin, João V. Lisboa. 2007. The effectiveness of hospitality service operations: measurement and implementation concerns. International Journal of Contemporary Hospitality Management 19:7, 560-573. [Abstract] [Full Text] [PDF]

254. Qi Zhou Moss, Johanna Alho, Keith Alexander. 2007. Performance measurement action research. Journal of Facilities Management 5:4, 290-300. [Abstract] [Full Text] [PDF]

255. Viara Popova, Jan Treur. 2007. A specification language for organisational performance indicators. Applied Intelligence 27, 291-301. [CrossRef]

256. Eric O. Olsen, Honggeng Zhou, Denis M.S. Lee, Yoke‐Eng Ng, Chow Chewn Chong, Pean Padunchwit. 2007. Performance measurement system and relationships with performance results. International Journal of Productivity and Performance Management 56:7, 559-582. [Abstract] [Full Text] [PDF]

257. Karen Anderson, Rodney McAdam. 2007. Reconceptualising benchmarking development in UK organisations: the effects of size and sector. International Journal of Productivity and Performance Management 56:7, 538-558. [Abstract] [Full Text] [PDF]

258. T.F. Burgess, T.S. Ong, N.E. Shaw. 2007. Traditional or contemporary? The prevalence of performance measurement system types. International Journal of Productivity and Performance Management 56:7, 583-602. [Abstract] [Full Text] [PDF]

259. Chris Clegg, Craig Shepherd. 2007. ‘The biggest computer programme in the world…ever!’: time for a change in mindset?. Journal of Information Technology 22, 212-221. [CrossRef]

260. Sai Nudurupati, Tanweer Arshad, Trevor Turner. 2007. Performance measurement in the construction industry: An action case investigating manufacturing methodologies. Computers in Industry 58, 667-676. [CrossRef]

261. Bob Lillis, Robin Lane. 2007. Auditing the strategic role of operations. International Journal of Management Reviews 9, 191-210. [CrossRef]

262. Vittorio Chiesa, Federico Frattini. 2007. Exploring the differences in performance measurement between research and development: evidence from a multiple case study. R&D Management 37, 283-301. [CrossRef]

263. Paul D. Cousins, Benn Lawson. 2007. The Effect of Socialization Mechanisms and Performance Measurement on Supplier Integration in New Product Development. British Journal of Management 18, 311-326. [CrossRef]

264. Paul Folan, Jim Browne, Harinder Jagdev. 2007. Performance: Its meaning and content for today's business research. Computers in Industry 58, 605-620. [CrossRef]

265. Aditya Parida, Gopi Chattopadhyay. 2007. Development of a multi‐criteria hierarchical framework for maintenance performance measurement (MPM). Journal of Quality in Maintenance Engineering 13:3, 241-258. [Abstract] [Full Text] [PDF]

266. Mike Bourne, Steven Melnyk, Norman Faull, Monica Franco‐Santos, Mike Kennerley, Pietro Micheli, Veronica Martinez, Steve Mason, Bernard Marr, Dina Gray, Andrew Neely. 2007. Towards a definition of a business performance measurement system. International Journal of Operations & Production Management 27:8, 784-801. [Abstract] [Full Text] [PDF]

267. Chieh-Yu Lin. 2007. Supply chain performance and the adoption of new logistics technologies for logistics service providers in Taiwan. Journal of Statistics and Management Systems 10, 519-543. [CrossRef]

268. Mark Stevenson, Martin Spring. 2007. Flexibility from a supply chain perspective: definition and review. International Journal of Operations & Production Management 27:7, 685-713. [Abstract] [Full Text] [PDF]

269. Veronica Martinez, Zoe Radnor, Patrizia Garengo, Giovanni Bernardi. 2007. Organizational capability in SMEs. International Journal of Productivity and Performance Management 56:5/6, 518-532. [Abstract] [Full Text] [PDF]

270. Veronica Martinez, Zoe Radnor, Zoe J. Radnor, David Barnes. 2007. Historical analysis of performance measurement and management in operations management. International Journal of Productivity and Performance Management 56:5/6, 384-396. [Abstract] [Full Text] [PDF]

271. Angappa Gunasekaran, Bulent Kobu. 2007. Performance measures and metrics in logistics and supply chain management: a review of recent literature (1995–2004) for research and applications. International Journal of Production Research 45, 2819-2840. [CrossRef]

272. Aghahowa Enoma, Stephen Allen. 2007. Developing key performance indicators for airport safety and security. Facilities 25:7/8, 296-315. [Abstract] [Full Text] [PDF]

273. Bob Lillis, Robin Lane. 2007. Auditing the strategic role of operations. International Journal of Management Reviews. [CrossRef]

274. Ana Maria Garcia Perez, Marian Garcia Martinez. 2007. The agrifood cooperative netchain. A theoretical framework to study its configuration. Food Economics - Acta Agriculturae Scandinavica, Section C 4, 31-39. [CrossRef]

275. Shuki Dror. 2007. A process causality approach given a strategic frame. Journal of Modelling in Management 2:1, 40-55. [Abstract] [Full Text] [PDF]

276. Shankar Purbey, Kampan Mukherjee, Chandan Bhar. 2007. Performance measurement system for healthcare processes. International Journal of Productivity and Performance Management 56:3, 241-251. [Abstract] [Full Text] [PDF]

277. Carlos F. Gomes, Mahmoud M. Yasin, João V. Lisboa. 2007. An empirical investigation of manufacturing performance measures utilization. International Journal of Productivity and Performance Management 56:3, 187-204. [Abstract] [Full Text] [PDF]

278. Li‐cheng Chang. 2007. The NHS performance assessment framework as a balanced scorecard approach. International Journal of Public Sector Management 20:2, 101-117. [Abstract] [Full Text] [PDF]

279. Elliot Bendoly, Eve D. Rosenzweig, Jeff K. Stratman. 2007. Performance Metric Portfolios: A Framework and Empirical Analysis. Production and Operations Management 16, 257-276. [CrossRef]

280. Suzanne Bergin-Seers, Leo Jago. 2007. Performance measurement in small motels in Australia. Tourism and Hospitality Research 7, 144-155. [CrossRef]

281. Vittorio Chiesa, Federico Frattini, Valentina Lazzarotti, Raffaella Manzini. 2007. How do measurement objectives influence the R&D performance measurement system design?. Management Research News 30:3, 187-202. [Abstract] [Full Text] [PDF]

282. Imad Alsyouf. 2007. The role of maintenance in improving companies’ productivity and profitability. International Journal of Production Economics 105, 70-78. [CrossRef]

283. Nizamettin Bayyurt. 2007. The Performance of Turkish Manufacturing Firms in Stable And Unstable Economic Periods. South East European Journal of Economics and Business 2. [CrossRef]

284. Jitesh Thakkar, S.G. Deshmukh, A.D. Gupta, Ravi Shankar. 2006. Development of a balanced scorecard. International Journal of Productivity and Performance Management 56:1, 25-59. [Abstract] [Full Text] [PDF]

285. Beata Kollberg, Jens J. Dahlgaard, Per‐Olaf Brehmer. 2006. Measuring lean initiatives in health care services: issues and findings. International Journal of Productivity and Performance Management 56:1, 7-24. [Abstract] [Full Text] [PDF]

286. Umit S. Bititci, Kepa Mendibil, Sai Nudurupati, Patrizia Garengo, Trevor Turner. 2006. Dynamics of performance measurement and organisational culture. International Journal of Operations & Production Management 26:12, 1325-1350. [Abstract] [Full Text] [PDF]

287. Helen Atkinson. 2006. Strategy implementation: a role for the balanced scorecard?. Management Decision 44:10, 1441-1460. [Abstract] [Full Text] [PDF]

288. Alan Pilkington, Robert Fitzgerald. 2006. Operations management themes, concepts and relationships: a forward retrospective of IJOPM. International Journal of Operations & Production Management 26:11, 1255-1275. [Abstract] [Full Text] [PDF]

289. Beata Kollberg, Mattias Elg. 2006. Challenges Experienced in the Development of Performance Measurement Systems in Swedish Health Care. Quality Management in Health Care 15, 244-256. [CrossRef]

290. L. Berrah, G. Mauris, F. Vernadat. 2006. Industrial performance measurement: an approach based on the aggregation of unipolar or bipolar expressions. International Journal of Production Research 44, 4145-4158. [CrossRef]

291. Prasanta Kumar Dey, Seetharaman Hariharan, Benjamin Thomas Clegg. 2006. Measuring the operational performance of intensive care units using the analytic hierarchy process approach. International Journal of Operations & Production Management 26:8, 849-865. [Abstract] [Full Text] [PDF]

292. Jean-François Henri. 2006. Management control systems and strategy: A resource-based perspective. Accounting, Organizations and Society 31, 529-558. [CrossRef]

293. Fiorenzo Franceschini, Maurizio Galetto, Domenico Maisano, Luciano Viticchiè. 2006. The Condition of Uniqueness in Manufacturing Process Representation by Performance/Quality Indicators. Quality and Reliability Engineering International 22:10.1002/qre.v22:5, 567-580. [CrossRef]

294. Carlos F. Gomes, Mahmoud M. Yasin, João V. Lisboa. 2006. Key performance factors of manufacturing effective performance. The TQM Magazine 18:4, 323-340. [Abstract] [Full Text] [PDF]

295. Aditya Parida, Uday Kumar. 2006. Maintenance performance measurement (MPM): issues and challenges. Journal of Quality in Maintenance Engineering 12:3, 239-251. [Abstract] [Full Text] [PDF]

296. Benita M. Beamon, Stephen A. Kotleba. 2006. Inventory management support systems for emergency humanitarian relief operations in South Sudan. The International Journal of Logistics Management 17:2, 187-212. [Abstract] [Full Text] [PDF]

297. Imad Alsyouf. 2006. Measuring maintenance performance using a balanced scorecard approach. Journal of Quality in Maintenance Engineering 12:2, 133-149. [Abstract] [Full Text] [PDF]

298. Jillian MacBryde, Zoe Radnor, Jairo R. Montoya‐Torres. 2006. Manufacturing performance evaluation in wafer semiconductor factories. International Journal of Productivity and Performance Management 55:3/4, 300-310. [Abstract] [Full Text] [PDF]

299. Jillian MacBryde, Zoe Radnor, Craig Shepherd, Hannes Günter. 2006. Measuring supply chain performance: current research and future directions. International Journal of Productivity and Performance Management 55:3/4, 242-258. [Abstract] [Full Text] [PDF]

300. Angappa Gunasekaran, Goran D. Putnik, Sérgio D. Sousa, Elaine M. Aspinwall, A. Guimarães Rodrigues. 2006. Performance measures in English small and medium enterprises: survey results. Benchmarking: An International Journal 13:1/2, 120-134. [Abstract] [Full Text] [PDF]

301. Marco Busi, Umit S. Bititci. 2006. Collaborative performance management: present gaps and future research. International Journal of Productivity and Performance Management 55:1, 7-25. [Abstract] [Full Text] [PDF]

302. Andy Neely. 2005. The evolution of performance measurement research. International Journal of Operations & Production Management 25:12, 1264-1277. [Abstract] [Full Text] [PDF]

303. Marc Wouters, Mark Sportel. 2005. The role of existing measures in developing and implementing performance measurement systems. International Journal of Operations & Production Management 25:11, 1062-1082. [Abstract] [Full Text] [PDF]

304. Alberto Bayo-Moriones, Javier Merino-Díaz de Cerio. 2005. Employee involvement: Its interaction with advanced manufacturing technologies, quality management, and inter-firm collaboration. Human Factors and Ergonomics in Manufacturing 14:10.1002/hfm.v14:2, 117-134. [CrossRef]

305. Paul Folan, Jim Browne. 2005. A review of performance measurement: Towards performance management. Computers in Industry 56, 663-680. [CrossRef]

306. Douglas G. Harper, Leonhard E. Bernold. 2005. Success of Supplier Alliances for Capital Projects. Journal of Construction Engineering and Management 131, 979-985. [CrossRef]

307. Macarena Sacristán Díaz, María José Álvarez Gil, José A. Dominguez Machuca. 2005. Performance measurement systems, competitive priorities, and advanced manufacturing technology. International Journal of Operations & Production Management 25:8, 781-799. [Abstract] [Full Text] [PDF]

308. Krishan M. Gupta, A. Gunasekaran. 2005. Costing in new enterprise environment. Managerial Auditing Journal 20:4, 337-353. [Abstract] [Full Text] [PDF]

309. Karen Anderson, Rodney McAdam. 2005. An empirical analysis of lead benchmarking and performance measurement. International Journal of Quality & Reliability Management 22:4, 354-375. [Abstract] [Full Text] [PDF]

310. Beata Kollberg, Mattias Elg, Jan Lindmark. 2005. Design and Implementation of a Performance Measurement System in Swedish Health Care Services. Quality Management in Health Care 14, 95-111. [CrossRef]

311. I Egan, J M Ritchie, P D Gardiner. 2005. Measuring performance change in the mechanical design process arena. Proceedings of the Institution of Mechanical Engineers, Part B: Journal of Engineering Manufacture 219, 851-863. [CrossRef]

312. Patrizia Garengo, Stefano Biazzo, Umit S. Bititci. 2005. Performance measurement systems in SMEs: A review for a research agenda. International Journal of Management Reviews 7:10.1111/ijmr.2005.7.issue-1, 25-47. [CrossRef]

313. Sérgio D. Sousa, Elaine Aspinwall, Paulo A. Sampaio, A. Guimarães Rodrigues. 2005. Performance measures and quality tools in Portuguese small and medium enterprises: survey results. Total Quality Management & Business Excellence 16, 277-307. [CrossRef]

314. Kit Fai Pun, Anthony Sydney White. 2005. A performance measurement paradigm for integrating strategy formulation: A review of systems and frameworks. International Journal of Management Reviews 7:10.1111/ijmr.2005.7.issue-1, 49-71. [CrossRef]

315. E. Tapinos, R.G. Dyson, M. Meadows. 2005. The impact of the performance measurement systems in setting the ‘direction’ in the University of Warwick. Production Planning & Control 16, 189-198. [CrossRef]

316. Monica Franco-Santos, Mike Bourne. 2005. An examination of the literature relating to issues affecting how companies manage through measures. Production Planning & Control 16, 114-124. [CrossRef]

317. Carlos Ochoa‐Laburu, Gene R. Simons, R. Trachtenberg. 2005. Cross‐national evaluation and benchmarking of manufacturing SMEs using an expert system based assessment tool (QuickView). Benchmarking: An International Journal 12:1, 16-29. [Abstract] [Full Text] [PDF]

318. Shamsuddin Ahmed, Masjuki Hj. Hassan, Yap Hui Fen. 2005. Performance Measurement and Evaluation in an Innovative Modern Manufacturing System. Journal of Applied Sciences 5, 385-401. [CrossRef]

319. Annabel Jackson. 2005. Falling from a Great Height: Principles of Good Practice in Performance Measurement and the Perils of Top Down Determination of Performance Indicators. Local Government Studies 31, 21-38. [CrossRef]

320. Stefan Tangen. 2005. Demystifying productivity and performance. International Journal of Productivity and Performance Management 54:1, 34-46. [Abstract] [Full Text] [PDF]

321. Petri Suomala. Life Cycle Perspective in the Measurement of New Product Development Performance 523-700. [Abstract] [Full Text] [PDF]

322. K.K.B. Hon. 2005. Performance and Evaluation of Manufacturing Systems. CIRP Annals - Manufacturing Technology 54, 139-154. [CrossRef]

323. Stefan Tangen. 2004. Performance measurement: from philosophy to practice. International Journal of Productivity and Performance Management 53:8, 726-737. [Abstract] [Full Text] [PDF]

324. L. Berrah, G. Mauris, F. Vernadat. 2004. Information aggregation in industrial performance measurement: rationales, issues and definitions. International Journal of Production Research 42, 4271-4293. [CrossRef]

325. H'ng Gaik Chin, Muhamad Zameri Mat Saman. 2004. Proposed analysis of performance measurement for a production system. Business Process Management Journal 10:5, 570-583. [Abstract] [Full Text] [PDF]

326. Chris Morgan. 2004. Structure, speed and salience: performance measurement in the supply chain. Business Process Management Journal 10:5, 522-536. [Abstract] [Full Text] [PDF]

327. Denis Borenstein, João Luiz Becker, Vaner José do Prado. 2004. Measuring the efficiency of Brazilian post office stores using data envelopment analysis. International Journal of Operations & Production Management 24:10, 1055-1078. [Abstract] [Full Text] [PDF]

328. Karen Anderson, Rodney McAdam. 2004. A critique of benchmarking and performance measurement. Benchmarking: An International Journal 11:5, 465-483. [Abstract] [Full Text] [PDF]

329. Gülçin Büyüközkan. 2004. A success index to evaluate e-Marketplaces. Production Planning & Control 15, 761-774. [CrossRef]

330. Carlos F. Gomes, Mahmoud M. Yasin, João V. Lisboa. 2004. A literature review of manufacturing performance measures and measurement in an organizational context: a framework and direction for future research. Journal of Manufacturing Technology Management 15:6, 511-530. [Abstract] [Full Text] [PDF]

331. Monica Franco‐Santos, Mike Bourne, Russell Huntington. 2004. Executive pay and performance measurement practices in the UK. Measuring Business Excellence 8:3, 5-11. [Abstract] [Full Text] [PDF]

332. Michel J. Leseure, Joachim Bauer, Kamal Birdi, Andy Neely, David Denyer. 2004. Adoption of promising practices: a systematic review of the evidence. International Journal of Management Reviews 5-6:10.1111/ijmr.2004.5-6.issue-3-4, 169-190. [CrossRef]

333. Clemens Lohman, Leonard Fortuin, Marc Wouters. 2004. Designing a performance measurement system: A case study. European Journal of Operational Research 156, 267-286. [CrossRef]

334. Igal M. Shohet, Sarel Lavy. 2004. Healthcare facilities management: state of the art review. Facilities 22:7/8, 210-220. [Abstract] [Full Text] [PDF]

335. Jean‐François Henri. 2004. Performance measurement and organizational effectiveness: bridging the gap. Managerial Finance 30:6, 93-123. [Abstract] [PDF]

336. Steven A Melnyk, Douglas M Stewart, Morgan Swink. 2004. Metrics and performance measurement in operations management: dealing with the metrics maze. Journal of Operations Management 22, 209-218. [CrossRef]

337. James O'Kane. 2004. Simulating production performance: cross case analysis and policy implications. Industrial Management & Data Systems 104:4, 309-321. [Abstract] [Full Text] [PDF]

338. J. Schmitz, K.W. Platts. 2004. Supplier logistics performance measurement: Indications from a study in the automotive industry. International Journal of Production Economics 89, 231-243. [CrossRef]

339. Carlo Rafele. 2004. Logistic service measurement: a reference framework. Journal of Manufacturing Technology Management 15:3, 280-290. [Abstract] [Full Text] [PDF]

340. H. A. Bassioni, A. D. F. Price, T. M. Hassan. 2004. Performance Measurement in Construction. Journal of Management in Engineering 20, 42-50. [CrossRef]

341. Andrew Greasley. 2004. Process improvement within a HR division at a UK police force. International Journal of Operations & Production Management 24:3, 230-240. [Abstract] [Full Text] [PDF]

342. Thomas Grünberg. 2004. Performance improvement. International Journal of Productivity and Performance Management 53:1, 52-71. [Abstract] [Full Text] [PDF]

343. Hung-Da Wan, F. Frank Chen. 2004. A Framework for Performance-Driven Web-Based Manufacturing. Journal of the Chinese Institute of Industrial Engineers 21, 527-534. [CrossRef]

344. I. J. Chen, A. Paulraj. 2004. Understanding supply chain management: critical research and a theoretical framework. International Journal of Production Research 42, 131-163. [CrossRef]

345. Stefan Tangen. 2003. An overview of frequently used performance measures. Work Study 52:7, 347-354. [Abstract] [Full Text] [PDF]

346. Petri Suomala, Ilkka Jokioinen. 2003. The patterns of success in product development: a case study. European Journal of Innovation Management 6:4, 213-227. [Abstract] [Full Text] [PDF]

347. Stefan A. Seuring, Julia Koplin, Torsten Behrens, Uwe Schneidewind. 2003. Sustainability assessment in the German detergent industry: from stakeholder involvement to sustainability indicators. Sustainable Development 11:10.1002/sd.v11:4, 199-212. [CrossRef]

348. Bernard Marr, Gianni Schiuma. 2003. Business performance measurement – past, present and future. Management Decision 41:8, 680-687. [Abstract] [Full Text] [PDF]

349. Jill MacBryde, Kepa Mendibil. 2003. Designing performance measurement systems for teams: theory and practice. Management Decision 41:8, 722-733. [Abstract] [Full Text] [PDF]

350. Monica Franco, Mike Bourne. 2003. Factors that play a role in “managing through measures”. Management Decision 41:8, 698-710. [Abstract] [Full Text] [PDF]

351. Felix T.S. Chan, H.J. Qi. 2003. An innovative performance measurement method for supply chain management. Supply Chain Management: An International Journal 8:3, 209-223. [Abstract] [Full Text] [PDF]

352. Felix T.S. Chan, H.J. Qi. 2003. Feasibility of performance measurement system for supply chain: a process‐based approach and measures. Integrated Manufacturing Systems 14:3, 179-190. [Abstract] [Full Text] [PDF]

353. Kwai-Sang Chin, Kit-Fai Pun, Henry Lau. 2003. Development of a knowledge-based self-assessment system for measuring organisational performance. Expert Systems with Applications 24, 443-455. [CrossRef]

354. John Maleyeff. 2003. Benchmarking performance indices: pitfalls and solutions. Benchmarking: An International Journal 10:1, 9-28. [Abstract] [Full Text] [PDF]

355. Mike Kennerley, Andy Neely. 2003. Measuring performance in a changing business environment. International Journal of Operations & Production Management 23:2, 213-229. [Abstract] [Full Text] [PDF]

356. Md. Mostaque Hussain, A. Gunasekaran. 2002. An institutional perspective of non‐financial management accounting measures: a review of the financial services industry. Managerial Auditing Journal 17:9, 518-536. [Abstract] [Full Text] [PDF]

357. Peter Fredriksson. 2002. Modular assembly in the car industry—an analysis of organizational forms’ influence on performance. European Journal of Purchasing & Supply Management 8, 221-233. [CrossRef]

358. Sérgio P. Santos, Valerie Belton, Susan Howick. 2002. Adding value to performance measurement by using system dynamics and multicriteria analysis. International Journal of Operations & Production Management 22:11, 1246-1272. [Abstract] [Full Text] [PDF]

359. F.J. O’Donnell, A.H.B. Duffy. 2002. Modelling design development performance. International Journal of Operations & Production Management 22:11, 1198-1221. [Abstract] [Full Text] [PDF]

360. Dilanthi Amaratunga, David Baldry. 2002. Performance measurement in facilities management and its relationships with management theory and motivation. Facilities 20:10, 327-336. [Abstract] [Full Text] [PDF]

361. Rodney McAdam, Brian Bailie. 2002. Business performance measures and alignment impact on strategy. International Journal of Operations & Production Management 22:9, 972-996. [Abstract] [Full Text] [PDF]

362. Kit‐Fai Pun, Ip‐Kee Hui, Henry C.W. Lau, Hang‐Wai Law, Winston G. Lewis. 2002. Development of an EMS planning framework for environmental management practices. International Journal of Quality & Reliability Management 19:6, 688-709. [Abstract] [Full Text] [PDF]

363. Liane Easton, David J. Murphy, John N. Pearson. 2002. Purchasing performance evaluation: with data envelopment analysis. European Journal of Purchasing & Supply Management 8, 123-134. [CrossRef]

364. Kit-Fai Pun. 2002. Development of an integrated total quality management and performance measurement system for self-assessment: A method. Total Quality Management 13, 759-777. [CrossRef]

365. F. T. S. Chan, H. J. Qi. 2002. A fuzzy basis channel-spanning performance measurement method for supply chain management. Proceedings of the Institution of Mechanical Engineers, Part B: Journal of Engineering Manufacture 216, 1155-1167. [CrossRef]

366. Md. Mostaque Hussain, A. Gunasekaran. 2002. Non‐financial management accounting measures in Finnish financial institutions. European Business Review 14:3, 210-229. [Abstract] [Full Text] [PDF]

367. Donald C.K. Chan, K.L. Yung, Andrew W.H. Ip. 2002. An application of fuzzy sets to process performance evaluation. Integrated Manufacturing Systems 13:4, 237-246. [Abstract] [Full Text] [PDF]

368. A.M. Ahmed. 2002. Virtual integrated performance measurement. International Journal of Quality & Reliability Management 19:4, 414-441. [Abstract] [Full Text] [PDF]

369. Erkki K Laitinen. 2002. A dynamic performance measurement system: evidence from small Finnish technology companies. Scandinavian Journal of Management 18, 65-99. [CrossRef]

370. Stanislav Karapetrovic, Walter Willborn. 2002. Self‐audit of process performance. International Journal of Quality & Reliability Management 19:1, 24-45. [Abstract] [Full Text] [PDF]

371. Srinivas Talluri, Joseph Sarkis. 2002. A methodology for monitoring system performance. International Journal of Production Research 40, 1567-1582. [CrossRef]

372. Robert Johnston, Lin Fitzgerald, Eleni Markou, Stan Brignall. 2001. Target setting for evolutionary and revolutionary process change. International Journal of Operations & Production Management 21:11, 1387-1403. [Abstract] [Full Text] [PDF]

373. Helen Driva, Kulwant S. Pawar, Unny Menon. 2001. Performance evaluation of new product development from a company perspective. Integrated Manufacturing Systems 12:5, 368-378. [Abstract] [Full Text] [PDF]

374. Mel Hudson, Andi Smart, Mike Bourne. 2001. Theory and practice in SME performance measurement systems. International Journal of Operations & Production Management 21:8, 1096-1115. [Abstract] [Full Text] [PDF]

375. Md. Mostaque Hussain, A. Gunasekaran. 2001. Activity‐based cost management in financial services industry. Managing Service Quality: An International Journal 11:3, 213-226. [Abstract] [Full Text] [PDF]

376. Bruce H. Andrews, John J. Carpentier, Tracy L. Gowen. 2001. A New Approach to Performance Measurement and Goal Setting. Interfaces 31, 44-54. [CrossRef]

377. Andy Neely, Roberto Filippini, Cipriano Forza, Andrea Vinelli, Jasper Hii. 2001. A framework for analysing business performance, firm innovation and related contextual factors: perceptions of managers and policy makers in two European regions. Integrated Manufacturing Systems 12:2, 114-124. [Abstract] [Full Text] [PDF]

378. A. De Toni, S. Tonchia. 2001. Performance measurement systems ‐ Models, characteristics and measures. International Journal of Operations & Production Management 21:1/2, 46-71. [Abstract] [Full Text] [PDF]

379. Manoochehr Najmi, Dennis F. Kehoe. 2001. The role of performance measurement systems in promoting quality development beyond ISO 9000. International Journal of Operations & Production Management 21:1/2, 159-172. [Abstract] [Full Text] [PDF]

380. Imad Alsyouf. Balanced Scorecard Concept Adapted to Measure Maintenance Performance 227-234. [CrossRef]

381. V. Hanna, N.D. Burns, C.J. Backhouse. 2000. Realigning organisational variables to support workplace behaviour. International Journal of Operations & Production Management 20:12, 1380-1391. [Abstract] [Full Text] [PDF]

382. Stefan Holmberg. 2000. A systems perspective on supply chain measurements. International Journal of Physical Distribution & Logistics Management 30:10, 847-868. [Abstract] [Full Text] [PDF]

383. Andy Neely, John Mills, Ken Platts, Huw Richards, Mike Gregory, Mike Bourne, Mike Kennerley. 2000. Performance measurement system design: developing and testing a process‐based approach. International Journal of Operations & Production Management 20:10, 1119-1145. [Abstract] [Full Text] [PDF]

384. Lawrence Dooley, Kathryn Cormican, Siobhan Wreath, David O'Sullivan. 2000. Supporting Systems Innovation. International Journal of Innovation Management 04, 277-297. [CrossRef]

385. Dilanthi Amaratunga, David Baldry. 2000. Assessment of facilities management performance in higher education properties. Facilities 18:7/8, 293-301. [Abstract] [Full Text] [PDF]

386. Umit S. Bititci, Trevor Turner, Carsten Begemann. 2000. Dynamics of performance measurement systems. International Journal of Operations & Production Management 20:6, 692-704. [Abstract] [Full Text] [PDF]

387. Daniel R Dolk. 2000. Integrated model management in the data warehouse era. European Journal of Operational Research 122, 199-218. [CrossRef]

388. J.S. Busby, A. Williamson. 2000. The appropriate use of performance measurement in non‐production activity. International Journal of Operations & Production Management 20:3, 336-358. [Abstract] [Full Text] [PDF]

389. P Suwignjo, U.S Bititci, A.S Carrie. 2000. Quantitative models for performance measurement system. International Journal of Production Economics 64, 231-241. [CrossRef]

390. Concha Álvarez‐Dardet, Gloria Cuevas‐Rodríguez, Ramón Valle‐Cabrera. 2000. Value‐Based Management: Performance Measurement Systems for Human Resources. Journal of Human Resource Costing & Accounting 5:1, 9-26. [Abstract] [PDF]

391. H Driva, K.S Pawar, U Menon. 2000. Measuring product development performance in manufacturing organisations. International Journal of Production Economics 63, 147-159. [CrossRef]

392. Paul A. Phillips. 1999. Hotel performance and competitive advantage: a contingency approach. International Journal of Contemporary Hospitality Management 11:7, 359-365. [Abstract] [Full Text] [PDF]

393. T.C. Bond. 1999. The role of performance measurement in continuous improvement. International Journal of Operations & Production Management 19:12, 1318-1334. [Abstract] [Full Text] [PDF]

394. Lawrence Dooley, David O'Sullivan. 1999. Decision support system for the management of systems change. Technovation 19, 483-493. [CrossRef]

395. Albert H.C. Tsang, Andrew K.S. Jardine, Harvey Kolodny. 1999. Measuring maintenance performance: a holistic approach. International Journal of Operations & Production Management 19:7, 691-715. [Abstract] [Full Text] [PDF]

396. Daniel B Waggoner, Andy D Neely, Mike P. Kennerley. 1999. The forces that shape organisational performance measurement systems. International Journal of Production Economics 60-61, 53-60. [CrossRef]

397. Benita M. Beamon. 1999. Measuring supply chain performance. International Journal of Operations & Production Management 19:3, 275-292. [Abstract] [Full Text] [PDF]

398. Andy Neely. 1999. The performance measurement revolution: why now and what next?. International Journal of Operations & Production Management 19:2, 205-228. [Abstract] [Full Text] [PDF]

399. Archie Lockamy III. 1998. Quality‐focused performance measurement systems: a normative model. International Journal of Operations & Production Management 18:8, 740-766. [Abstract] [Full Text] [PDF]

400. Benita M. Beamon, Victoria C. P. Chen. 1998. Performability-based fleet sizing in a material handling system. The International Journal of Advanced Manufacturing Technology 14, 441-449. [CrossRef]

401. Giovanni Azzone, Giuliano Noci. 1998. Identifying effective PMSs for the deployment of “green” manufacturing strategies. International Journal of Operations & Production Management 18:4, 308-335. [Abstract] [Full Text] [PDF]

402. Robin C. Daniels, N.D. Burns. 1997. Behavioural consequences of performance measures in cellular manufacturing. International Journal of Operations & Production Management 17:11, 1066-1080. [Abstract] [Full Text] [PDF]

403. Andy Neely, Huw Richards, John Mills, Ken Platts, Mike Bourne. 1997. Designing performance measures: a structured approach. International Journal of Operations & Production Management 17:11, 1131-1152. [Abstract] [Full Text] [PDF]

404. Alberto De Toni, Guido Nassimbeni, Stefano Tonchia. 1997. An integrated production performance measurement system. Industrial Management & Data Systems 97:5, 180-186. [Abstract] [Full Text] [PDF]

405. Liliane Pintelon, Frank Van Puyvelde. 1997. Maintenance performance reporting systems: some experiences. Journal of Quality in Maintenance Engineering 3:1, 4-15. [Abstract] [Full Text] [PDF]

406. Cecil C. Bozarth, William L. Berry. 1997. Measuring the Congruence Between Market Requirements and Manufacturing: A Methodology and Illustration. Decision Sciences 28:10.1111/deci.1997.28.issue-1, 121-150. [CrossRef]

407. Stan Brignall, Joan Ballantine. 1996. Performance measurement in service businesses revisited. International Journal of Service Industry Management 7:1, 6-31. [Abstract] [Full Text] [PDF]

408. A. De Toni, S. Tonchia. 1996. Lean organization, management by process and performance measurement. International Journal of Operations & Production Management 16:2, 221-236. [Abstract] [Full Text] [PDF]

409. Milanka Slavova, Neva Yalman. Behavioral Branding as a Customer-Centric Strategy 396-413. [CrossRef]

410. Jan van den Berg, Guido van Heck, Mohsen Davarynejad, Ron van Duin. Inventory Management, a Decision Support Framework to Improve Operational Performance 268-288. [CrossRef]

411. Voulgarakis Nikolaos, Aidonis Dimitris, Achillas Charisios, Triantafillou Dimitrios, Moussiopoulos Nicolas. Selection of the 3rd/4th Party Logistics Provider 296-312. [CrossRef]

412. Voulgarakis Nikolaos, Aidonis Dimitris, Achillas Charisios, Triantafillou Dimitrios, Moussiopoulos Nicolas. Selection of the 3rd/4th Party Logistics Provider: 197-213. [CrossRef]

413. Jan van den Berg, Guido van Heck, Mohsen Davarynejad, Ron van Duin. Inventory Management, a Decision Support Framework to Improve Operational Performance 581-600. [CrossRef]

414. Kijpokin Kasemsap. The Role of Information System within Enterprise Architecture and their Impact on Business Performance 262-284. [CrossRef]
