To cite this article: Sarel Lavy, John A. Garcia, Phil Scinto & Manish K. Dixit (2014) Key performance indicators for facility performance assessment: simulation of core indicators, Construction Management and Economics, 32:12, 1183-1204, DOI: 10.1080/01446193.2014.970208
To link to this article: http://dx.doi.org/10.1080/01446193.2014.970208
Published online: 08 Dec 2014.
Key performance indicators for facility performance assessment: simulation of core indicators
SAREL LAVYa*, JOHN A. GARCIAb, PHIL SCINTOc and MANISH K. DIXITa
aDepartment of Construction Science, Texas A&M University, College Station, TX 77843-3137, USA
bAlpha Facilities Solutions, San Antonio, TX, USA
cScinto Statistical Services, Cleveland, OH, USA
Received 5 November 2013; accepted 22 September 2014
Assessing a facility’s performance is important for measuring its contribution towards organizational goals.
Among many approaches to performance assessment is the holistic key performance indicator (KPI) approach.
However, there are numerous KPIs available, and the chosen KPI needs to be relevant to facility goals and must
be calculated, analysed and evaluated to allow for the future state of the facility to be acceptable at the lowest
cost. The value of several key descriptive analytics in facility performance assessment may be enhanced through
the use of simulation. Simulation transforms the descriptive analytics into predictive and prescriptive analytics by
allowing for the robust assessment of plans and future outcomes through the creation of multiple scenarios, in
this case, for an education facility. The simulation approach quantifies the interrelationship and interdependence
of KPIs, and is potentially effective in analysing how maintenance expenditures can be optimized to maintain a
desired level of Condition Index as demonstrated by several simulation scenarios.
Keywords: Educational facilities, facility management, key performance indicators, maintenance, simulation.
Introduction
The context of this study
Evaluating a facility’s performance is critical for track-
ing its contribution towards accomplishing organiza-
tional goals and for making future decisions regarding
facilities management (Douglas, 1996; Amaratunga
et al., 2000; Barrett and Baldry, 2003; Cable and Davis,
2004). A facility supports core business activities by
providing a conducive and productive work environ-
ment (Douglas, 1996). A range of measurement
approaches have been proposed in the literature, such
as benchmarking, balanced scorecard, critical success
factor, and key performance indicator (KPI) method
(Lavy et al., 2010), with a preference for the KPI
method being emphasized (Lavy et al., 2010). How-
ever, an extensive list of KPIs exists that includes indi-
cators that are either not quantifiable, not measurable,
or provide redundant measurements (Neely et al.,
1997; Shohet, 2006). A need to weed out unnecessary
and repetitive indicators and to derive a concise list of
core, quantifiable, and measurable indicators has been
highlighted by numerous scholars (Slater et al., 1997;
Hinks and McNay, 1999; Augenbroe and Park, 2005;
Gumbus, 2005; Shohet, 2006).
Facility managers face increasing pressure to prior-
itize the direction of limited resources to address main-
tenance and capital renewal needs. The consequences
of failing to prioritize sustainment and capital funding
appropriately may include unscheduled facility
maintenance, loss of system availability, and additional
expense to repair or replace failed components or
systems due to short-notice or emergency procurements.
Equipped with KPI simulation (modelling) tools, facil-
ity managers can better prioritize maintenance and cap-
ital renewal actions by developing scenarios to examine
the interrelationships between funding levels, funding
timing, business rules and critical KPIs. This enables
them to make more informed decisions, and offers
another way to communicate funding priority require-
ments to other stakeholders in order to build enterprise
awareness and support for requirements.
*Author for correspondence. E-mail: [email protected]
An organizational plan is divided into three levels.
The highest level contains organizational goals, which
focus the attention of different stakeholders and constit-
uents toward achieving a set of goals that are believed to
advance the organization. In the case of education facil-
ities, organizational goals may include raising student
scores, decreasing student absenteeism, reducing tea-
cher turnover, eliminating lost time in class, and reduc-
ing student accidents to the lowest possible level, among
others. The second level, which supports these organi-
zational goals, is the facility performance assessment.
This level involves setting targets for aspects that affect
the facility’s operation, such as personnel (e.g. recruit-
ing, training, retaining), maintenance issues (e.g. fre-
quency, quality), environmental issues (e.g. noise, air
quality, aesthetics), space (e.g. amount of space, quality
of space, privacy), among other factors. If an organiza-
tion is unsuccessful in assessing its progress towards
reaching these targets at the facility performance assess-
ment level, how does it know that it is still on track to
achieve its organizational goals? These targets can, of
course, be refined and modified with time, as per
changes that may occur in the business environment
as well as changes that may occur in the overall organi-
zational goals (level one). Level three is where key per-
formance indicators may be found. At this level, the
organization develops metrics (performance indicators)
to help it measure (past), analyse trends (present), and
predict (future) how successful it is and what action is
to be taken in order to successfully achieve the targets
set at the second level: facility performance assessment.
In other words, the KPIs (in level three) support reach-
ing the targets set in facility performance assessment
(level two), which in turn, support reaching the organi-
zational goals (level one). This is the typical hierarchy of
an organizational plan.
The scope of this paper is limited to the third level
of the organizational plan hierarchy, i.e., the level of
development of key performance indicators in order
to set facility performance assessment targets. This
study focuses on educational facilities because various
studies, such as the American Federation of Teachers
(2006), Young et al. (2003), Cash and Twiford (2009),
and Collins and Parson (2010), provide strong evidence
that the condition of an education facility (level three in
the organizational plan hierarchy) affects the performance
of not only the students, but also the teachers (level one
in the organizational plan hierarchy).
Research goal and objectives
To make performance measurement more meaningful,
its analysis should also focus on predicting future
impacts and on prescribing measures to improve facility
performance. In Lavy et al. (2014a, 2014b), core KPIs
and the variables that affect them were identified, listed
and discussed, and equations to calculate them were
derived. The goal of this study is to propose a platform
to simulate dynamic, complex facility performance out-
puts (metrics), utilizing realistic data inputs. The data
inputs are also dynamic, complex and subject to special
cause (war, natural disasters, etc.) and common cause
(natural market fluctuations) variability. Input vari-
ables, which themselves may be metrics and KPIs,
include facility factors such as cost of construction,
renewal rate, maintenance costs and rates, and covari-
ates such as cost of capital, interest rates, etc. As real
data is not readily available for all input variables due
to the complex nature of this topic, the following are
the specific objectives for this study:
(1) Simulate the identified core KPI outputs for
facility performance assessment using hypo-
thetical input data.
(2) Demonstrate how simulation allows for the
study of correlations and relationships between
and among KPIs and input variables within a
single system and for the entire facility.
(3) Highlight the sensitivity of outputs and out-
comes to variability in the input variables.
(4) Demonstrate that, due to variability and future
uncertainty, simulation is a valuable tool for
generating possible scenarios and making deci-
sions based upon forecasts and logic.
Literature review
Facility performance assessment
Measuring a facility’s performance includes reviewing
past and current conditions to compare its performance
within similar and across different organizations. The
process of performance assessment evaluates the contri-
bution of a facility to achieving organizational goals,
and therefore is vital to establishing future facility
management strategies (Kincaid, 1994; Lebas, 1995;
Amaratunga and Baldry, 2000; Barrett and Baldry,
2003; Cable and Davis, 2004). Decisions relating to a
facility’s extension, acquisition, and strategic changes
depend on reviewing the facility’s performance (Douglas,
1996; Amaratunga and Baldry, 2000). Syakima et al.
(2011) explained that the term ‘facilities’ includes the ser-
vices required to support core business activities of an
organization, whereas the term ‘performance’ relates to
efficiency and effectiveness. Hence, facility performance
means assessing a facility’s efficiency and effectiveness in
providing services for organizational business activities.
The performance measurement helps in understanding
the impacts of management decisions on success and
failure of the facility portfolio and in suggesting possible
improvements (Cable and Davis, 2004). According to
Atkin and Brooks (2000), in a performance assessment,
it is important to identify issues crucial to organizational
success. Such issues can be tied to organizational
goals and objectives and eventually to performance
measures.
Among the most common approaches for
performance evaluation are benchmarking, balanced
scorecard, post-occupancy evaluation (POE), key per-
formance indicators (KPIs) and critical success factors
(CSFs). According to Ho et al. (2000), the performance
metrics that include KPIs can be used for making com-
parisons within and across organizations. Hitchcock
(2002) and O’Sullivan et al. (2004) stated that perfor-
mance metrics can define performance objectives clearly
and quantifiably. Assessing a facility’s performance
using a set of KPIs provides the user with an opportunity
to select the indicators of choice (Lavy et al., 2010).
Performance assessment of school facilities
According to Syakima et al. (2011), the assessment of
performance must focus on physical condition (poor,
good or excellent), functional appropriateness (ability
of facilities to support required functions) and issues
relating to user satisfaction and productivity. In the
case of school facilities, the quality of the built environ-
ment affects the performance of students and teachers
significantly. Research has shown that the condition
of school facilities is strongly and positively correlated
with student achievement (American Federation of
Teachers, 2006; Cash and Twiford, 2009; Collins
and Parson, 2010). If socioeconomic factors are con-
trolled, a significant difference (5–17 percentile points)
between academic performance of students from poor
and standard school conditions is suggested by studies
such as American Federation of Teachers (2006),
Young et al. (2003) and Earthman (2002). The condi-
tion of school facilities also has an impact on teacher
empowerment, which is strongly related to student aca-
demic achievements (Collins and Parson, 2010).
The factors responsible for poor school conditions
can be categorized as physical, spatial and environmental
(Young et al., 2003; Lippman, 2010). The physical con-
dition of a school includes factors such as interior and
exterior conditions, for example age of the building and
its components, structural damage, leakage, cracks,
and expired building components. Spatial factors
include aspects that affect the adequacy of space to fulfil
the required function. Despite having sufficient floor
area, a school facility could have inadequate space due
to the lack of proper space management. Three types
of spaces exist in a school environment: instructional
(e.g. classrooms), supplemental (e.g. laboratories) and
support spaces (e.g. gymnasium and auditorium), with
space standards for each. The National Clearinghouse
for Educational Facilities provides space standards for
school settings at various levels such as elementary, mid-
dle and high school. The space standards are provided in
gross square foot area per student for different annual
enrolment categories. Space standards for various
instructional, supplemental and support school spaces
are also provided by state agencies such as Rhode Island
Department of Education or RIDE (2007) and counties
such as Clark County School District in Nevada (Fife,
2006). The assessment of space management can be per-
formed by comparing existing floor areas with those
specified by the standards. The quality of school spaces
is governed by issues such as cleanliness, indoor air qual-
ity, lighting, thermal comfort and noise. Young et al.
(2003) covered these issues under the environmental
category.
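As a trivial illustration of that comparison, the following Python check assumes a hypothetical space standard expressed in gross square feet per student; the actual NCEF, RIDE and Clark County standards vary by school level and annual enrolment category.

```python
def space_adequacy(existing_gsf, enrolment, standard_gsf_per_student):
    """Ratio of existing gross square footage to the floor area required
    by a (hypothetical) space standard; >= 1.0 suggests adequate space."""
    required_gsf = enrolment * standard_gsf_per_student
    return existing_gsf / required_gsf

# Hypothetical middle school: 90 000 gsf, 600 students, 140 gsf/student.
print(round(space_adequacy(90_000, 600, 140), 2))  # 1.07
```

Even when this ratio exceeds 1.0, as noted above, poor space management can still leave individual instructional, supplemental or support spaces inadequate.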
Although numerous sets of KPIs have been derived
in the literature to cover most aspects of a facility’s per-
formance, the sets are extensive and, in some cases,
include KPIs that are not relevant, measurable, or
quantifiable (Shohet, 2006; Lavy et al., 2010). For
instance, Hinks and McNay (1999) derived a list of
172 KPIs categorized under eight categories: business
benefits, equipment, space, environment, change,
maintenance/services, consultancy, and general.
According to the literature (Slater et al., 1997; Ho et al.,
2000), a succinct list of core KPIs that are quantifiable
on the basis of readily available information is still lack-
ing. The existing extensive lists must be filtered through
a set of criteria to develop a concise list of those indica-
tors that express one or more aspects of performance
assessment effectively (Slater et al., 1997; Ho et al.,
2000). Hinks and McNay (1999) and Slater et al.
(1997) suggested establishing concise lists of no more
than 4–6 and 7–12 KPIs, respectively. Literature also
recommends identifying the core KPIs that not only
cover financial aspects but also focus on aspects such
as business, organization goals, and other non-financial
qualitative aspects (user satisfaction and productivity).
The core KPIs must also be measurable and quantifi-
able on the basis of readily available information in
the industry in order to make genuine comparisons
(Tsang, 1998; Tsang et al., 1999; Ho et al., 2000;
Chan et al., 2001; Shohet, 2003; Cable and Davis,
2004; Augenbroe and Park, 2005; Gumbus, 2005).
Performance measurement approaches using
KPIs
According to Ahluwalia (2008), a building houses a
range of components (up to 170 different types of
components) that have differing maintenance require-
ments, making the building’s maintenance a complex
exercise. Moreover, each component has a different
service life, making replacement and its schedule com-
plicated. One conventional focus of facility mainte-
nance is on reducing maintenance expenditures while
providing a healthy, comfortable and safe workplace
for occupants (De Groote, 1995; Horner et al., 1997;
Tsang et al., 1999; Office of the Legislative Auditor,
2000; Kutucuoglu et al., 2001; Shohet and Lavy,
2004). It is also vital to determine how often a facility
is replacing its components that are expired or nearing
the end of their service life (Association of Higher Edu-
cation Facilities Officers et al., 2003; Kinnaman,
2007). The Facility Condition Index (FCI) is an indica-
tor often used for the condition assessment of a facility,
demonstrating the impact of deferred maintenance as a
ratio of deferred maintenance to total current replace-
ment value (Vanier, 2000; Briselden and Cain, 2001;
Teicholz and Edgar, 2001). The condition assessment
at each building system level and at overall building
level is more accurate, as it demonstrates specific
defects and their severity (Ahluwalia, 2008). Ahluwalia
(2008) defined the Condition Index (CI) as a value
between 0 and 100, derived mathematical expressions
for it, and conducted a case study.
In order to evaluate the maintenance performance
of a facility, a Maintenance Efficiency Indicator
(MEI) was proposed by Lavy and Shohet (2004). An
MEI is defined as the ratio of maintenance expenditure
to a facility’s CI. The amount spent on deferred main-
tenance was calculated as a ratio of actual to targeted
deferred maintenance by Lavy et al. (2014b), who also
defined a Replacement Efficiency Indicator (REI) to be
the ratio of the sum of capital renewals to the total cost
of expired systems in a given year. The purpose of MEI
and REI is to provide metrics to assess the maintenance
and replacement of a facility.
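To make these indicator definitions concrete, here is a minimal Python sketch of the ratios described above. The function names and sample figures are illustrative assumptions; note that the paper quotes CI on a 0-100 scale in Ahluwalia's definition but uses a 0-1 scale in its simulations.

```python
def fci(deferred_maintenance, replacement_value):
    """Facility Condition Index: deferred maintenance as a ratio of the
    total current replacement value (Vanier, 2000)."""
    return deferred_maintenance / replacement_value

def mei(maintenance_expenditure, condition_index):
    """Maintenance Efficiency Indicator: maintenance expenditure
    relative to the facility's CI (Lavy and Shohet, 2004)."""
    return maintenance_expenditure / condition_index

def rei(capital_renewals, cost_of_expired_systems):
    """Replacement Efficiency Indicator: capital renewals over the
    total cost of expired systems in a given year (Lavy et al., 2014b)."""
    return capital_renewals / cost_of_expired_systems

# Illustrative figures only, in $k (not taken from the paper):
print(fci(deferred_maintenance=1_500, replacement_value=10_000))  # 0.15
print(mei(maintenance_expenditure=250, condition_index=0.85))
print(rei(capital_renewals=400, cost_of_expired_systems=500))     # 0.8
```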
Apart from the facility’s maintenance, replacement,
and condition, it is important to evaluate the suitability
of a facility to house a particular function. The func-
tional suitability depends on the provision of sufficient
space in the facility and effective space management
(Douglas, 1993/94; Schroeder et al., 1995; Douglas,
1996). Other indicators, such as indoor air quality,
are responsible for health, comfort and safety of
building occupants and hence could affect a facility’s
overall performance (Fowler et al., 2005; Prakash,
2005; Mozaffarian, 2008).
A facility’s performance over its life cycle can also be
assessed using various computer and web-based tools
currently available. Tools such as BLAST, ECOTECT,
TRNSYS, HOT 2000, Energy Plus, DOE 2.1 (US
DOE), RIUSKA, MOIST 3.0, CONDENSE, Airpak
and hyglRC can be used to simulate the economic and
environmental performance of a built facility (Crawley
et al., 2005). Although a facility’s performance can be
simulated using computer programs, it still needs vali-
dation (Hammad et al., 2005). Moreover, complex
inputs are often required for running the simulation,
and collecting them can be either time-consuming or
expensive (Bazjanac, 2001; Hamza and Horne, 2007).
Among studies focused on KPIs are Shohet (2003),
who developed and validated four KPIs for healthcare
facilities by performing six case studies; Ahluwalia
(2008) derived an approach to assess the condition of
building components on the basis of maintenance data
and validated the approach using a case study approach;
and Enoma and Allen (2007) developed a set of KPIs
for assessing the performance of airport facilities using
a case study method to validate the KPIs.
Data analytics for facility management
KPIs have become a part of a growing area of study
called analytics, a field that is more prescriptive than
descriptive in nature (Cooper, 2012; Neiger et al.,
2012). By prescriptive, we mean that decisions regard-
ing how to improve the performance of a facility are
made based on data analysis. These decisions are
actionable rather than a mere description or reporting.
In the context of post-compulsory education, Cooper
(2012) defined the field of analytics as ‘the process of
developing actionable insights through problem defini-
tion and the application of statistical models and anal-
ysis against existing and/or simulated future data’. In
simpler terms, analytics is the process of generating
and obtaining an optimal, realistic decision based on
existing or simulated data.
Elias (2011) discussed a five-step process of aca-
demic analytics that involves data capturing, reporting,
predicting, acting, and refining. The gathered and
refined data is then analysed from three perspectives
(Evans, 2012). The first perspective is descriptive, which
answers the questions of what has happened in the past
and what is currently happening (Fitz-enz, 2009;
Evans, 2012). According to Fitz-enz (2009), descrip-
tive analytics help identify and analyse the relationships
and differences of various groups of a dataset. The
second perspective is predictive, which predicts what
will happen next based on the analysis of past data
(Evans, 2012). Prescriptive is the third level, also the
most advanced one, which utilizes the process of opti-
mization to identify the best solution. In other words,
it prescribes how to achieve the best outcome consider-
ing the effects and variability (Fitz-enz, 2009; Evans,
2012). This is the recommendation phase where deci-
sion support tools are coupled with expert opinions
to create tactical and strategic guidance for the organi-
zation. The process of data analysis may be performed
using actual data or simulated data that is based on
reasonable assumptions (Cooper, 2012). In summary,
the process of analytics can be effectively used with
simulated data to analyse the relationships and impacts
of KPIs.
Simulations in facility management
Simulation as a descriptive or prescriptive tool has been
used in maintenance and operations management
(Montazer et al., 2003). In a study of steel rolling mills,
Bala Krishnan (1992) divided an entire multistage
machine into sub-systems and simulated maintenance
policy options. According to his findings, opportunistic
maintenance is far better than other maintenance poli-
cies. Chen et al. (2003) developed and used a Markov
Regenerative Process (MRGP) model to simulate con-
dition-based maintenance with generally distributed
inspection intervals as opposed to the exponential dis-
tribution adopted by previous studies. In an effort to
integrate a maintenance evaluation tool into a build-
ing’s design and construction phase, Dessouky and
Bayer (2002) developed a simulation model using
maintenance type, maintenance quality attributes, and
task hours to estimate excess labour hours. By optimiz-
ing labour hours, they were able to allocate a mainte-
nance cost to a building’s construction and design
phase. Montazer et al. (2003) noted how simulation
has become not only a descriptive but also a predictive
tool for making decisions about future strategies in
operations management.
Another area in which simulation has been exten-
sively used as a descriptive, predictive, and prescriptive
tool is the assessment of a facility’s energy performance.
Zhu (2006) conducted an energy simulation study of a
complex facility in the southeast region of the United
States to help facility managers optimize the facility’s
energy performance. In this study, eQuest was used
to simulate the energy performance in order to devise
strategies to save energy use in the facility’s operations
(prescriptive role). According to Augenbroe (2002),
simulation modelling that was once used mainly during
the design and construction phase is now being applied
extensively to post-construction phases such as
commissioning and facility management. In fact,
simulation has become an integral part of the whole
building design, engineering, and operations process
(Augenbroe, 2002).
Research methods
A simulation approach was applied in this study to ana-
lyse the impacts of key input variables on facility KPI
estimates and robustness, as well as relationships
between and among input variables and KPIs. The
KPIs calculated from simulating outputs from the input
variables include CI, MEI, and REI. The main variable
considered for optimization is the Net Present Value of
Dollars Spent. A complete list of the variables and
abbreviations used in this paper may be found in the
Appendix. The simulation process is presented in a
high level flowchart in Figure 1 (life cycle simulation
of a single system). The flowchart illustrates the process
of converting input variables into KPIs and the Net
Present Value of Dollars Spent throughout the process.
The following sections describe the simulation sce-
narios and the main assumptions made to perform
them.
Simulation scenarios
This study contains two parts. In Part 1, we simulate
and examine input and output variable relationships
for a single system within a facility (note that the single
system may be the facility itself). In Part 2, we simulate
and examine input and output variable relationships for
multiple systems within a facility. The difference
between Part 1 and Part 2 is an added level of complex-
ity. One could imagine adding sub-systems within sys-
tems which would add yet another level of complexity.
Downsides to adding levels of complexity include the
need to gather additional data, adding more time-
consuming and complex simulations, and the introduc-
tion of bias. Bias is introduced because the finer the
details introduced, the higher the likelihood that other,
more subtle but just as important, fine details would be
missed. On the positive side, adding levels may reduce
the variability in outputs, as well as better map out the
impact and relationships between and among major
KPIs, such as Condition Index (CI), Maintenance Effi-
ciency Index (MEI), and Replacement Efficiency Index
(REI). Ultimately, recognizing the mutual impacts of
these KPIs may lead to a better understanding of orga-
nizational goals, such as students’ test scores and
absenteeism rates. The following paragraphs describe
Part 1 and Part 2 in more detail.
Part 1: This included an analysis of the relationships
between CI and its variables, such as Plant Replace-
ment Value (PRV) and/or System Replacement Value
(SRV), Deferred Maintenance (DFM), and Dollars
Spent on Maintenance (DSM) at a facility level. In this
part, four scenarios were analysed. The service life of
the educational facility was assumed as 50 years. The
main goal at this stage was to identify maintenance
budget and equipment replacement strategies to main-
tain a desirable CI while optimizing the total spending
on maintenance.
Part 2: In this part, the KPIs were calculated and
analysed for their relationships and dependence at indi-
vidual system levels. The main goal at this stage was to
study the impacts of KPIs at individual system levels for
the overall condition of the facility. In this scenario, five
building systems were considered in the calculation of
the Facility Condition Index: Roof, HVAC, Plumbing,
Electrical, and Other (the remaining major systems in a
typical building, combined into one system).
Values for the input data used in the simulations for
Parts 1 and 2 are listed in Tables 1 and 2, respectively;
four scenarios were developed for each part. While
numerous input variables are possible, the maintenance
budget and the years until replacement are the only
variables changed over the four scenarios. The other
difference to note between the two parts is that built-
in variability is shown in the plots of KPIs in Part 2,
but not in Part 1. All costs in the simulation scenarios
are given in thousands of dollars ($k). Also, the plant
represented in Table 1 is a different plant than the
one in Table 2; therefore, only trends can be compared
between the two, but not costs.
To better explain the simulations, let us take, for
example, scenario 1 of Part 1 (line item #13 in Table 1)
and analyse its input variables: one building system
represents all building systems and components. The
estimated life cycle of this system is 45 years (line item
#1), and it originally cost $10 000 to install (line item
#4). The annual average increase in the System
Replacement Value is 3%, distributed as 2% for infla-
tion (line item #11) plus an additional 1% increase
due to environmental/safety laws/other concerns (line
item #12). This rate, however, is an estimated average,
and is subject to variability according to the beta distri-
bution; therefore, it is possible that in any single year,
the combined rate could be 2% or 4%, or anything in
[Figure 1 flowchart, reconstructed from the extracted text: set initial values for the variables in Table 1; for S = 1 to the total number of simulations, calculate the exponential distribution for maintenance costs based on the SRV when new and the future projected system replacement value; then, for P = 1 to the total number of years: if it is time to replace the system, replace it, record the money spent on the new system, and update the exponential maintenance cost schedule; otherwise, calculate maintenance costs from the exponential cost schedule subject to Gaussian variability; calculate the amount spent on maintenance based on the budget, subject to Gaussian variability; calculate this year's deferred maintenance as the difference between the calculated costs and the calculated amount spent; update total deferred maintenance as the previous DFM plus costs due to inflation (rate randomly drawn from a beta distribution) plus this year's DFM; update the SRV based on rates of inflation for the system/environment (rates randomly drawn from a beta distribution); and calculate CI, MEI and REI for the year. At the end of the years loop, calculate the Net Present Value of the amount spent on maintenance and the amount spent on replacement; repeat until the end of the simulations loop.]
Figure 1 Life cycle simulation of a single system
between. This specific scenario assumes that the system
is kept for 45 years (variable X, line items #6 and 13),
which means that it would be replaced at the end of its
service life (similar to line item #1). It also assumes that
the maintenance rate in the first year is 2% of the
System Replacement Value (line item #2), increasing
to 100% of the System Replacement Value in its last
year of service life (45) (line item #3). However, only
70% of its maintenance needs are funded each year
throughout its entire service life (variable Y, line items
#8 and 13). The unfunded maintenance goes into a
deferred maintenance backlog, which increases at a rate
of 2% per year (line item #10). The annual discount
rate, or the rate for opportunity cost of capital, is con-
sidered to average 2% (line item #9), and is also subject
to variability according to the beta distribution. The
value shown for the Current System (Plant) Replace-
ment Value (end of Year 1) (line item #5) is the value
in line item #4 plus the average rate of inflation (line
items #11 and 12). The Renewal Rate Factor of PRV
Table 1 Input data for Part 1
Item # Variables/Parameters Value field
1 Estimated Life Cycle (Years) 45
2 Initial Maintenance Rate or Maintenance Rate in the First Year (%) of PRV 2
3 Maintenance Rate in the Final Year of Life (%) of PRV 100
4 System (Plant) Replacement Value when New/Replaced $10 000
5 Current System (Plant) Replacement Value (end of Year 1) $10 300
6 Years Until Planned Replacement X
7 Renewal Rate Factor of PRV for Replacement 1.05
8 *Current Budget for Maintenance as a Percent of Yearly Maintenance Y
9 *The Discount Rate; Rate for Opportunity Cost of Capital (%) 2
10 *Rate of Inflation for Deferred Component Maintenance (%) 2
11 *Rate of Inflation for System (%) 2
12 *PRV Rate of Increase Due to Environmental/Safety Laws/Concerns (%) 1
Scenario
13 Scenario 1: X = 45, Y = 70%
14 Scenario 2: X = 50, Y = 50%
15 Scenario 3: X = 23, Y = 50%
16 Scenario 4: X = 45, Y = 99.9%
Note: *These variables are subject to variability according to the beta distribution in the simulation.
Table 2 Input data for Part 2
Variables/Parameters
Value fields for each system
Roof HVAC Plumbing Electrical Other
Estimated Life Cycle (Years) 20 30 20 18 20
Initial Maintenance Rate (%) 2 1.5 0.25 0.65 2
Final Maintenance Rate (%) 100 100 100 100 100
PRV when New/Replaced $110 $1400 $800 $1200 $2100
Current PRV (end of Year 1) $113 $1442 $824 $1236 $2163
Years Until Planned Replacement X X X X X
Renewal Rate Factor 1.2 1.1 1.17 0.84 0.77
*Budget for Maintenance Y Y Y Y Y
*The Discount Rate (%) 2 2 2 2 2
*Rate of Inflation for DFM (%) 2 2 2 2 2
*Rate of Inflation for System (%) 2 2 2 2 2
*Environmental, etc. Rate (%) 1 1 1 1 1
Scenario
Scenario 1: X = (Estimated Life Cycle), Y = 70%
Scenario 2: X = (Estimated Life Cycle + 5), Y = 50%
Scenario 3: X = ((Estimated Life Cycle/2) + 1), Y = 50%
Scenario 4: X = (Estimated Life Cycle), Y = 99.9%
Note: *These variables are subject to variability according to the beta distribution in the simulation.
for Replacement (in line item #7) represents the
additional cost factor to replace one system with
another (the cost to remove, haul, and dispose of the
old system), if a replacement is pursued (for example,
in scenario 3, where the value of variable X is less than
the building’s life cycle). Similarly, scenarios 2 through
4 in Part 1 (line items #14 through 16, respectively), as
well as scenarios 1 through 4 in Part 2 (as shown in
Table 2) can be analysed in terms of their input
variables.
The simulation program was constructed using
Microsoft Excel 2010. Based on user inputs, a period
of 50 years for the facility was simulated up to 100
times. The KPIs were calculated and graphed based
upon the simulated results.
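The authors' model was built in Excel and its exact settings are not reproduced here; the following Python sketch is a minimal re-expression of the Figure 1 loop under stated assumptions (the beta shape parameters, the Gaussian noise level, and the CI formula CI = 1 - DFM/SRV are all illustrative choices, not values from the paper).

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_system(years=50, life=45, srv_new=10_000, init_rate=0.02,
                    final_rate=1.00, budget_pct=0.70, replace_at=45,
                    renewal_factor=1.05, discount=0.02, inflation=0.03):
    """One life cycle simulation of a single system (cf. Figure 1).
    Costs are in $k; returns the final-year CI, the final DFM backlog,
    and the Net Present Value of dollars spent."""
    # Exponential maintenance cost schedule fitted between the initial
    # and final maintenance rates (assumption 2).
    b = np.log(final_rate / init_rate) / (life - 1)
    srv, dfm, npv_spent = float(srv_new), 0.0, 0.0
    for year in range(1, years + 1):
        age = (year - 1) % replace_at + 1
        if age == 1 and year > 1:
            # Replace the system; the renewal factor covers removing,
            # hauling and disposing of the old system (Table 1, item 7).
            npv_spent += renewal_factor * srv / (1 + discount) ** year
            dfm = 0.0
        # Maintenance need from the exponential schedule, with Gaussian
        # variability, capped at the SRV (assumption 5).
        need = init_rate * np.exp(b * (age - 1)) * srv
        need = min(need * max(0.0, rng.normal(1.0, 0.05)), srv)
        spent = budget_pct * need                # funded share of need
        npv_spent += spent / (1 + discount) ** year
        # Unfunded work joins the DFM backlog, which compounds at a
        # beta-distributed inflation rate (mean 2% with these shapes).
        dfm = dfm * (1 + rng.beta(2, 98)) + (need - spent)
        srv *= 1 + inflation                     # SRV growth (assumption 6)
        ci = max(0.0, 1.0 - dfm / srv)           # illustrative CI, 0 to 1
    return ci, dfm, npv_spent
```

Under these assumptions, the four Part 1 scenarios from Table 1 could then be compared as follows:

```python
scenarios = {
    "Scenario 1": dict(replace_at=45, budget_pct=0.70),
    "Scenario 2": dict(replace_at=50, budget_pct=0.50),
    "Scenario 3": dict(replace_at=23, budget_pct=0.50),
    "Scenario 4": dict(replace_at=45, budget_pct=0.999),
}
for name, params in scenarios.items():
    ci, dfm, npv = simulate_system(**params)
    print(f"{name}: final CI = {ci:.2f}, NPV of $ spent = {npv:,.0f}k")
```

A fuller implementation would also draw the budget and discount rates from beta distributions, track MEI and REI each year, and repeat each run up to 100 times, as described above.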
Assumptions
While the simulation scenarios represent an oversimpli-
fied example of an educational building, extensive col-
laboration and debate should be employed for
analysing actual cases. The following assumptions were
made in this study for the development of the simula-
tion scenarios:
(1) The Facility Condition Index is an aggregate of
the CI of each individual system weighted only
by costs. Therefore, if the HVAC system is
more expensive than the Roof serving class-
rooms, by default, the HVAC system would
have a larger influence on the CI. Nevertheless,
the authors recognize that there may be other
methods to weight the individual systems
rather than cost (such as importance to the
overall mission of the organization) in calculat-
ing the CI. (A cost-weighted rollup of this kind is sketched in the code after this list of assumptions.)
(2) The life of a building system is based upon an
exponential rate of deterioration model where
the costs of maintenance increase exponentially
as the system ages (New Zealand National
Asset Management Steering Group, 2011).
The maintenance model adopted in this study
is shown in Figure 2. The maintenance costs
in the first and the final year of a system’s
service life are assumed as a percentage of its
System Replacement Value (SRV). According
to the maintenance model:
(a) Maintenance Cost Index = 1000 × {(Maintenance Cost) / (SRV)}
(b) Figure 2 shows three examples of compo-
nents with 10, 20, and 30 years of service
life. Each one presents two scenarios: (1)
initial maintenance cost is 0.1% of SRV
and final maintenance cost is 100% of
SRV; and (2) initial maintenance cost is
10% of SRV and final maintenance cost
is 100% of SRV. The exponential equa-
tions and trend lines in Figure 2 are set
simply by commanding Microsoft Excel
to fit an exponential curve between the ini-
tial and final maintenance costs (this fit is sketched in the code after this list of assumptions). Neverthe-
less, the authors recognize that the values
used in this assumption for both initial
and final maintenance costs, as well as
the pattern of deterioration, may differ on
a case-by-case basis.
(3) The total cost of maintenance is the sum of
preventive and corrective maintenance. The
fraction of preventive and corrective mainte-
nance in the total maintenance cost was
assumed based on the inferences of the survey
conducted by Whitestone Research Corpora-
tion (1999). According to the survey results,
each building system has a unique value rang-
ing from 21% to 63% of the total maintenance
cost that occurs due to corrective maintenance.
(4) The cost and probability of corrective mainte-
nance are not functions of the accumulated
deferred maintenance to date. One might
hypothesize that both the cost and probability
of corrective maintenance would increase with
the accumulation of deferred maintenance.
One reason that we assume an exponential rate
of deterioration model for the life of building
systems is that we do not currently have a
quantifiable estimate of the relationship
between the cost and probability of corrective
maintenance and the accumulation of deferred
maintenance.
(5) A user of the simulation program could specify
the years until replacement of a system even if
the length of time exceeds the estimated service
life of the system. In such a case, the
maintenance cost in a given year could not
exceed the SRV for that year.
(6) The SRV is assumed to increase in time as a
function of the rate of inflation and the rate
of cost increase due to environmental factors,
safety regulations, etc.
(7) The budget for maintenance is set as a percent-
age of expected yearly maintenance, and dis-
tributed as a random variable that follows the
beta probability distribution. In addition, the
rates of inflation are distributed as random vari-
ables following the beta probability distribu-
tion. The beta distribution is a family of
continuous probability distributions. It is
defined on an interval greater than 0 and less
than 1, with the shape of the distribution
defined by two positive shape parameters,
denoted by α and β. The advantage of the beta
distribution is that it is flexible in characterizing
percentages and rates. Also, the interest
rate converges in law to a beta distribution
(Delbaen and Shirakawa, 2002). Beta-distributed rate draws are illustrated in the code sketch after this list of assumptions.
(8) While the simulation could start at any life
cycle stage for each system, in this study, all
systems are assumed to start as new systems.
(9) Traditional life cycle cost analysis (LCCA) is a
technique used to determine the total cost of
facility ownership, including the first cost of
acquisition and/or construction (Langston,
2005). For existing facilities, the total cost of
ownership incorporates the sunk costs of
acquiring the facility, the ongoing costs of
operations and maintenance, and the future
costs of disposing of a facility. To minimize
[Figure 2: Maintenance Cost Index (y), parameterized by the exponential distribution as a function of years (x), the initial maintenance rate and the expected system life; x axis: Years Since New/Replacement (0-30); y axis: Maintenance Cost Index approaching System Replacement Value (0-1000); fitted exponential trend lines shown for each case]
Figure 2 Maintenance cost model based on the assumption of an exponential rate of deterioration
[Figure 3: PRV $, DFM $ and $ Spent on Maintenance (left axis) and Condition Index, 0 to 1 (right axis), plotted against Age of Plant or System in Years, 0-50]
Figure 3 Scenario 1 showing KPI relationships over the building’s service life (NPV of $ spent = $121k)
the variability associated with land acquisition
and initial construction costs, as well as future
disposal or resale costs, this paper focuses
solely on operation and maintenance costs for
a performance period of 50 years.
(10) There are three independent appraisal method-
ologies commonly used to evaluate real estate
in the United States (Ling and Archer, 2012):
(a) Cost approach: sums the value of real estate
(land) and the depreciated value of the
improvements to the land (facilities, equip-
ment, infrastructure, etc.) to determine a
total replacement cost for a property. The
cost approach is commonly used when
appraising specialty properties (e.g. facto-
ries, public works, laboratories, etc.).
(b) Sales comparison: assumes that a prop-
erty’s market value is closely correlated to
the current value of a substitutable compa-
rable property. The sales comparison
approach depends on an assessment of
[Figure 4: PRV $, DFM $ and $ Spent on Maintenance (left axis) and Condition Index, 0 to 1 (right axis), plotted against Age of Plant or System in Years, 0-50]
Figure 4 Scenario 2 showing KPI relationships over service life (NPV of $ spent = $130k)
[Figure 5: PRV $, DFM $ and $ Spent on Maintenance (left axis) and Condition Index, 0 to 1 (right axis), plotted against Age of Plant or System in Years, 0-50]
Figure 5 Scenario 3 showing KPI relationships over service life (NPV of $ spent = $47k)
similar marketplace transactions to estab-
lish value, and is commonly used in resi-
dential markets.
(c) Income capitalization approach: develops a
financial pro forma based on the antici-
pated market return of a property, and
establishes cost based on the discounted
cash value of that property over a given per-
formance period. The income approach is
commonly used to evaluate commercial or
investment properties.
To minimize the variability associated with
real estate market cycles and regional land
values, this paper utilizes the cost approach
to establish the Plant Replacement Value
(PRV). For the purpose of this analysis,
[Figure 6: PRV $, DFM $ and $ Spent on Maintenance (left axis) and Condition Index, 0 to 1 (right axis), plotted against Age of Plant or System in Years, 0-50; annotation: NPV of $ spent = $166 507]
Figure 6 Scenario 4 showing KPI relationships over service life (NPV of $ spent = $166k)
[Figure 7: simulated CI, 0 to 1, plotted against Years Since Facility Baseline Measurement, 1-51]
Figure 7 Simulated CI under scenario 1 based on user inputs and assumptions
PRV is based on the depreciated value of
the improvements to the land (facilities,
equipment, infrastructure, etc.) and
excludes the acquisition cost of the land
itself, in order to focus the discussion on
operation and maintenance costs.
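The short sketch below, referenced from assumptions 1, 2 and 7 above, illustrates three of these mechanics in Python: the cost-weighted rollup of system CIs into a facility CI, the exponential maintenance cost schedule fitted between the initial and final maintenance rates, and beta-distributed rate draws. The beta shape parameters shown are assumptions chosen to give a 2% mean; the paper does not report the parameters it used.

```python
import numpy as np

def facility_ci(system_cis, system_costs):
    """Facility CI as an aggregate of individual system CIs weighted
    only by cost (assumption 1)."""
    weights = np.asarray(system_costs) / np.sum(system_costs)
    return float(np.dot(weights, system_cis))

def maintenance_cost_index(init_rate, final_rate, life):
    """Yearly Maintenance Cost Index, 1000 * (maintenance cost / SRV),
    fitted as y = a * exp(b * x) through the initial and final
    maintenance rates (assumption 2 and Figure 2)."""
    b = np.log(final_rate / init_rate) / (life - 1)
    a = init_rate * np.exp(-b)                 # so that y(1) = init_rate
    years = np.arange(1, life + 1)
    return 1000 * a * np.exp(b * years)

# A 20-year component maintained from 2% of SRV up to 100% of SRV:
index = maintenance_cost_index(0.02, 1.00, 20)
print(index[0], index[-1])                     # 20.0 ... 1000.0

# Beta-distributed rates (assumption 7), bounded in (0, 1) with mean
# alpha / (alpha + beta); here a ~2% inflation rate:
rng = np.random.default_rng(1)
print(rng.beta(2, 98, size=5))

# Cost-weighted facility CI over the five Part 2 systems (Table 2 PRVs;
# the CI values are made up for illustration):
print(facility_ci([0.90, 0.70, 0.80, 0.95, 0.85],
                  [110, 1400, 800, 1200, 2100]))
```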
Results
Part 1
In Part 1, we simulated various input variables in
order to calculate the Condition Index (CI), Plant
Replacement Value (PRV), Deferred Maintenance
(DFM), and Dollars Spent on Maintenance in four dif-
ferent scenarios over a facility’s service life of 50 years.
Figures 3 through 6 show the results of the simulations
for the four scenarios of Part 1. The left side axis repre-
sents all the KPIs with monetary values (PRV, DFM,
and $ Spent on Maintenance), while the right side axis
represents the Condition Index, with values ranging
from a minimum of 0 (not functioning at all) to a
maximum of 1 (condition is as good as new). It must
be emphasized that the minimum acceptable threshold
for the Condition Index is up to the facility manager in
each specific organization to determine, based on the
organization’s mission, goals, policies, and standards.
As seen in Figures 3 through 5, as DFM increased,
the value of CI decreased. At the point that DFM
equalled PRV, the CI reached its minimum value, but
funds were still increasingly being spent on mainte-
nance. This is demonstrated in Figures 3 and 4. After
the point where DFM equals PRV, the spending con-
tinues but the CI cannot be improved until system
replacement. The situation is akin to making minimum
payments on credit card debt: while payments may be
the smallest they can be without going into default,
the debt can never be paid and the amount spent over
the life of the loan adds up to many times the cost of the
original loan. The key to understanding and using these
graphs is knowing that, while taking on some deferred
maintenance (akin to taking on debt) is advantageous
due to the time value of money, too much deferred
maintenance (akin to too much debt) leads to poor
conditions that are increasingly expensive to maintain.
One might expect that the best overall CI over the
50 years would have occurred under scenario 4 (highest
budget of 99.9%), and the worst under scenario 2 (low-
est budget and postponed replacement). This assump-
tion was shown to be correct, as seen in Figures 3
through 6. The least amount of money spent on
maintenance occurred in scenario 3 ($47k), followed
by scenario 1 ($121k). Maintenance budgeting and
replacement strategies in these scenarios prevented an
excessive build-up of DFM. In this part, it was interest-
ing to note that the strategy in scenario 3 led to much
less maintenance expenditure than in scenario 1: this
was due to an early replacement strategy that included
the least years until planned replacement. Note that
this strategy did not work in all scenarios: it only
worked because of the assumption of the exponential
[Figure 8: simulated CI, 0 to 1, plotted against Years Since Facility Baseline Measurement, 1-51]
Figure 8 Simulated CI under scenario 2 based on user inputs and assumptions
rate of deterioration, which may or may not be true.
Scenarios 2 and 4 were the most expensive because
neither struck the balance described above: scenario 2
took on too much deferred maintenance, while
scenario 4 took on almost none.
Even though the overall maintenance costs over the
entire building’s service life were lowest under scenario
3, the CI was under 80% for 20 of the 50 years of ser-
vice life. Scenario 1 had its CI below 80% for 28 of the
50 years of service life, and in scenario 2, the CI was
below 80% in 37 of the 50 years of service life. It is evi-
dent that the best scenario in terms of CI was scenario
4, in which the CI was always close to 100%. However,
this was also the most expensive scenario ($166k,
which is 253% more expensive than scenario 3), and,
if such a high CI was not necessary, a significant
amount of maintenance dollars need not have been
spent.
tains a desired CI while spending the least amount on
maintenance. This strategy should also be more robust
to changes in assumptions than other strategies. In
other words, a strategy that is slightly more expensive,
but more insensitive to slight changes to the budget,
the interest rate, or the discount rate might be better
than a strategy that is optimal under assumed condi-
tions, but changes drastically in response to any
changes in the assumptions.
Part 2
In Part 2, we looked at the relationships between CI,
PRV, DFM and Dollars Spent on Maintenance, but
the calculations were rolled up over five different
building systems (Roof, HVAC, Plumbing, Electrical,
and Other). Random variability was incorporated into
the calculations and the results of the simulations were
plotted in order to highlight the variability. Once
again, four different maintenance budget and replace-
ment strategy scenarios were considered, similar to
those in Part 1. Studying a facility at both facility
and individual system levels generated similar results
in terms of CI and Dollars Spent on Maintenance.
Using a rollup approach required more upfront work
because each system, and even sub-system, was char-
acterized and assessed individually. The incorporation
of variability into the simulation also improved the
relationship to actual practice, and allowed for more
‘real-world’ robust and sensitivity analysis. In this
part, the original cost to install the building (all five
systems combined) was $5610; and therefore, the val-
ues below cannot be compared to those calculated in
Part 1.
The least amount of money spent on maintenance
occurred in scenarios 3 (NPV over 50 years of $35k)
and 1 ($51k), with scenario 3 being the lowest. The
best scenario for Condition Index was, again, scenario
4 (~$69k) where CI was always close to 100%. How-
ever, it was almost twice as expensive as scenario 3
(the least expensive). The CI, rolled up over all sys-
tems, under scenarios 1 through 3 is shown in Figures
7 through 9, respectively. The variability in the CI cal-
culations can also be seen in these Figures. It can be
noticed that the variability in the estimated CI
increased as the age of the system moved further away
from the point when it was new or replaced; that hap-
pened because there is more uncertainty as we move
further into forecasting the future.
Due to an early replacement on a system-by-system
basis, the CI under scenario 3 did not fall below 80%
(see Figure 9). Overall, it appeared to be the best strat-
egy that resulted in a balanced maintenance cost and
CI. However, care must be taken in assessing scenario
3; as mentioned previously, this strategy worked well
due to the assumption of the exponential rate of deteri-
oration. This strategy also worked because the facility’s
CI was calculated as an aggregate of the CIs of all indi-
vidual systems weighted only by their costs. This means
that the facility’s CI could be over 80% even if some
individual system’s or multiple systems’ CIs were still
below 80%. This strategy could be misleading from
the standpoint of organizational goals: for instance, if
the facility’s CI was above 80% but the CI of the
HVAC system dropped below 60%, it might adversely
affect factors such as students’ performance and/or
absenteeism more than for any other system.
Two other KPIs that could help find a robust strat-
egy that maintained an appropriate CI while spending
the least amount of money on maintenance were the
Maintenance Efficiency Indicator (MEI) and the
Replacement Efficiency Indicator (REI). It was no
coincidence that the best overall MEI and REI, given
that we were targeting 100% for these KPIs, belonged
to scenario 3, as seen in Figures 10 through 12 (note
that the MEI for scenario 4 is not shown because it is
extremely high and distorts the graph). The target CI
for the MEI graphs was 80%. Therefore, in this sce-
nario, MEI of 100% meant that the exact amount nec-
essary to maintain the target CI was being spent in the
given year on DFM. MEI below or above 100% dem-
onstrated a lack, or a surplus, of maintenance expendi-
ture, respectively. Conventionally, a facility’s CI greater
than the targeted CI resulted in an MEI higher than
100%, and vice versa. However, note that even if the
facility’s CI was greater than the target CI, the MEI
could drop below 100%, if the necessary preventive
maintenance actions were ignored.
Maintenance rate sensitivity analysis
As seen from the results of Parts 1 and 2, a simulation
approach based on hypothetical data can be utilized for
understanding the interrelationships and interdepen-
dence of KPIs and key variables affecting them. This
approach can also show the combined effects of KPIs
over the higher level organizational goals such as stu-
dents’ scores and absenteeism. However, these results
were based on some assumptions such as the initial
maintenance rate, the final maintenance rate, the
renewal rate, inflation rate, and life of systems after
replacement. Through sensitivity analyses, the best
solution over a range of assumptions can be calculated.
For example, Figure 13 shows the Net Present Value
(NPV) as a function of the number of years until
replacement (x axis), the initial maintenance rate (set
at 2% in all but one scenario) and the maintenance rate
[Figure 9: simulated CI, 0 to 1, plotted against Years Since Facility Baseline Measurement, 1-51]
Figure 9 Simulated CI under scenario 3 based on user inputs and assumptions
[Figure 10: REI and MEI, 0-1000%, plotted against Years Since Facility Baseline Measurement, 1-51]
Figure 10 MEI and REI under scenario 1 based on user inputs and assumptions
in the final year, over a period of 50 years. In all scenar-
ios, the annual maintenance budget was assumed at
99.9% (maximum possible rate to keep the CI at 1 at
all times), and the initial cost of the system (PRV)
was considered at $10 000. The maintenance rate in
the final year ranged from 100% to 80%, 60%, 40%,
20%, 10%, 4%, and 2% (the last value yields a mainte-
nance rate constant over the entire service life).
The following is observed from Figure 13:
(1) The NPV was found to behave in a saw-tooth
pattern because it is a function of how many
replacements occur over 50 years of service life,
in addition to years until replacement, initial
maintenance rate, and final maintenance rate.
So any jumps that occur are due to a different
[Figure 11: REI and MEI, 0-1000%, plotted against Years Since Facility Baseline Measurement, 1-51]
Figure 11 MEI and REI under scenario 2 based on user inputs and assumptions
[Figure 12: REI and MEI, 0-1000%, plotted against Years Since Facility Baseline Measurement, 1-51]
Figure 12 MEI and REI under scenario 3 based on user inputs and assumptions
KPIs for facility performance assessment 1197
Dow
nloa
ded
by [
Tex
as A
&M
Uni
vers
ity L
ibra
ries
] at
08:
30 1
7 D
ecem
ber
2014
number of replacements within the 50-year
period. If the NPV were calculated over a
period of infinite years, there would be no
saw-tooth pattern. However, calculating over
a period of infinite years is not necessarily the
best method because optimizing over infinity
is not realistic. Therefore, it is very important
that the NPV be optimized over the true period
of study, whether that is 25, 50 or 100 years.
(2) The optimum NPV for a maintenance rate of
100% and 80% in the final year was achieved
when replacement was performed at 18 years,
i.e., two replacements over a period of 50 years.
A near-optimum NPV for these scenarios
was achieved at 26 years.
(3) The optimum NPV for all other maintenance
rates in the final year was achieved when
replacement was performed at 26 years, i.e.,
one replacement over a period of 50 years. The
reader should keep in mind that this may not be
optimal for any rates over a period of 52 years
because of the added cost of the system being
replaced twice instead of once.
(4) The NPV was flat, or close to flat, for replacement periods of 26 to 50 years for all final-year maintenance rates below 10%.
This leads to the following two major
conclusions:
(1) For a service life of 50 years, the optimum, or
near-optimum, NPV was achieved with a
replacement at 26 years in all scenarios.
(2) If there is a high degree of confidence that the initial maintenance rate was 2%, and that the final maintenance rate was no more than 4% of the System Replacement Value, there was found to be no financial justification for replacing a system expected to last 50 years. In these scenarios, if only economic reasoning is considered, the recommendation would be to keep maintaining the system; however, this might severely impact the condition of that system, which might in turn have other negative effects on the building and/or the building users.
As mentioned earlier, there is a saw-tooth pattern in
the NPV, which means that it is very important to opti-
mize the NPV over the true period of service life of the
building. While optimizing it over a five-year period is
too short term because it only serves ‘selfish’ short-term
benefits, a period of 1000 years, on the other hand, is
unrealistic. Simulations may be performed over a range
of periods to identify the optimum solution to this
question.
In Figures 14 and 15, the NPV is calculated over periods of 25 years and 150 years, respectively. Even though the life expectancy of the system is the same in Figures 13 through 15, the number of years until replacement that optimizes the NPV depends on the number of years over which the NPV is optimized (25, 50 or 150).

Figure 14 Net Present Value over a period of 25 years under various maintenance rate assumptions

Figure 15 Net Present Value over a period of 150 years under various maintenance rate assumptions
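Since the paper does not publish its cash-flow model, the following Python sketch is only a plausible reconstruction of this kind of horizon sweep. The $10 000 PRV and the 50-year life come from the text; the linear maintenance-rate ramp, the 2%/4% initial and final rates, the 5% discount rate and the renewal rate factor of 1 are our assumptions.

# Hypothetical reconstruction of the sensitivity sweep, not the authors'
# model: cash flows are maintenance (a linear rate ramp) plus periodic
# replacement, discounted to present value over a chosen study horizon.
def npv_of_costs(replace_at, horizon, prv=10_000.0, life=50,
                 rate_initial=0.02, rate_final=0.04, discount=0.05):
    total, age = 0.0, 0
    for year in range(1, horizon + 1):
        age += 1
        # Maintenance rate ramps linearly from the initial to the final
        # rate over the planned life of the system (our assumption).
        rate = rate_initial + (rate_final - rate_initial) * (age - 1) / (life - 1)
        cash = rate * prv
        if age == replace_at:      # planned replacement; renewal factor 1 assumed
            cash += prv
            age = 0
        total += cash / (1.0 + discount) ** year
    return total

# The optimum replacement year shifts with the study horizon, echoing
# Figures 13 through 15 qualitatively (including the saw-tooth effect
# of a different number of replacements fitting inside each horizon).
for horizon in (25, 50, 150):
    best = min(range(10, 51), key=lambda r: npv_of_costs(r, horizon))
    print(f"{horizon:3d}-year horizon: lowest-cost replacement at year {best}")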
Conclusions
Under the specific set of assumptions utilized in this
study, Part 1 shows that scenario 4 (full maintenance)
maintained the highest Condition Index for the build-
ing throughout its service life; however, scenario 3
(early replacement) was the least expensive to imple-
ment while still maintaining a better overall CI than
scenarios 1 or 2. Scenario 3 was also found to be the
least expensive in Part 2 of the simulation, where one
building was substituted by five separate building sys-
tems that together rolled up to the building level. The
conclusions themselves may not be important because
they may differ somewhat from real-world settings
and assumptions. However, it is important to note that
the analysis and conclusions are not all obvious and
could not have been drawn without the use of simula-
tion. Is early replacement of a system always the best
strategy? Of course it is not. The best strategy depends
on all of the circumstances, inputs, influences, and
variabilities considered and analysed simultaneously.
Simulation is the tool that enables this pursuit.
As facility managers are faced with increasing pres-
sure to prioritize the direction of limited resources, this
paper develops a tool that may assist them in their
decision-making regarding maintenance and replace-
ment of building systems. Better prioritization of main-
tenance and capital renewal actions could be achieved
by using a simulation (modelling) tool to examine the
interrelationships between the variables that are to be
considered for making more informed decisions. The
significance of using the KPI approach for assessing a
facility’s performance has been highlighted in the liter-
ature. This paper builds upon the proposed set of five
KPIs, as identified in Lavy et al. (2014a, 2014b). The
analysis performed in this paper shows that the interde-
pendence and the relationships of KPIs can be investi-
gated using the simulation approach. Moreover, simulation
can also be used to understand the combined impact of
KPIs on organizational goals. However, to truly
understand the state of a facility and how the facility’s
maintenance expenditure is optimized while keeping
the CI at a desired level, a simulation based upon actual
data is needed. The process begins with understanding
goals, objectives, functions and inputs, as well as cap-
turing and analysing relevant data. Key performance
indicators (KPIs) are an important part of the data to
be captured, but much more information needs to be
collected and stored in order to understand the cause
and effects among various KPIs.
This study had four main objectives. The first
objective was to simulate the identified core KPI out-
puts for facility performance assessment using hypo-
thetical input data. This is clearly demonstrated in
both Parts 1 and 2, where computer simulations were
conducted to analyse the effect of core KPIs on facility
performance assessment. The second objective of this
study was to demonstrate how simulation allows for
the study of correlations and relationships between
and among KPIs. This was conducted in both Parts 1
and 2, and is evident in Figures 3 through 12, where
correlations between KPIs are presented and analysed.
The third objective of this paper was to highlight the
sensitivity of outputs and outcomes to changes in input
variables. Evidence of this may be found in Figures
13 through 15, where the sensitivity of an output vari-
able (Net Present Value over a 50-year period) was
tested against changes in one input variable (final year’s
maintenance rate).
The fourth objective of this study was to demon-
strate that due to variability and future uncertainty,
simulation is a valuable tool for generating future pos-
sible scenarios and making decisions based upon
forecasts and logic. The accomplishment of this objective
is evident throughout the paper, as simulation is a
powerful tool in understanding and projecting relation-
ships among KPIs, their inputs, and their outputs.
Simulations may be used to play out complex relation-
ships and interactions, test sensitivity to changes in
inputs and assumptions, predict the effect of various
scenarios, and optimize objectives. However, as with
any predictive tool, the results are only as good as the
assumptions and inputs. Extremely important ques-
tions to answer include whether or not we have identi-
fied all the potential variables that influence KPIs and
whether the assumptions regarding variable relation-
ships and behaviours are reasonable. In this paper,
the authors demonstrate the power and potential of
understanding and using KPIs through computer sim-
ulations. Additionally, output variability increases as the prediction refers to a point further from the present time, because uncertainty grows as one forecasts further into the future.
There is no ‘right or wrong’ answer to the question
of how to better prioritize available resources. Is a five-
year solution preferred over a 10-year, 20-year, or 50-
year solution, or vice versa? We argue that individual
users should determine the best solution for their spe-
cific circumstances. As long as decision-makers under-
stand the hierarchy of how key performance indicators
may affect facility performance assessment decisions,
which in turn, may affect the achievement of organiza-
tional goals, simulation tools can be used to examine
various scenarios. In our simulation, we estimated a
building’s life cycle of 50 years, which is considered
as the economic service life of a building, but the tool
is flexible enough to work with different assumptions
as well.
One additional area of concern involves short- and
long-term objectives, goals, and decision-making. It is
not illogical for a president, chief executive officer, chief
financial officer, facility manager, or a person holding
any other function/position in an organization to plan
their actions based on the amount of time (‘term’) they
will serve in their position. However, this type of plan-
ning strategy may result in deferring decisions or
actions, such as for maintenance and/or replacement
of building systems and components, to a later date.
This planning may be akin to scenario 2 in our examples, and could lead to taking on too much deferred maintenance while maintenance expenditure remains poor; a point may then be reached at which the Condition Index can no longer be improved, making this a very expensive strategy.
Composite KPIs can more directly link KPIs such
as those addressed in this paper to ultimate
performance goals/metrics. For example, while there
are assumed correlations linking some of these KPIs
to facility performance assessment and, ultimately, to
organizational goals, e.g., student academic perfor-
mance, there needs to be a balance of critical systems’
KPIs (such as HVAC), and functional KPIs (such as
space, technology, access, air quality) to better link to
organizational performance metrics (in the case of edu-
cation facilities: test scores, promotion rates, etc.).
For future research, the authors propose:
(1) Validating the identified core KPIs and the sim-
ulation results using actual data. Data to be
used for such a study could be obtained from
different organizations spread over various
industries.
(2) Selecting actual facilities with available histori-
cal data supporting these KPIs, and comparing
assumptions yielding optimum simulated
results (using the simulation methods described
in this paper) with simulation results using
actual data. This will be the first step to charac-
terize the relationship between the simulation
models and actual operations to better define
facility asset management decision support.
(3) Identifying composite KPIs to better link KPIs
defined in this paper to organizational perfor-
mance.
References
Ahluwalia, S.S. (2008) A framework for efficient condition
assessment of the building infrastructure, Doctoral disser-
tation, University of Waterloo, Ontario.
Amaratunga, D. and Baldry, D. (2000) Assessment of
facilities management performance in higher education
properties. Facilities, 18(7/8), 293–301.
Amaratunga, D., Baldry, D. and Sarshar, M. (2000) Assess-
ment of facilities management performance – what next?
Facilities, 18(1/2), 66–75.
American Federation of Teachers (2006) Building Minds,
Minding Buildings, Turning Crumbling Schools into Environ-
ments for Learning, American Federation of Teachers,
Washington, DC.
Association of Higher Education Facilities Officers (APPA),
Federal Facilities Council (FFC), Holder Construction
Company, International Facility Management Association
(IFMA), and National Association of State Facilities
Administration (NASFA) (2003) Asset Lifecycle Model for
Total Cost of Ownership Management, Framework, Glossary
& Definitions, available at http://goo.gl/abhT31 (accessed
25 February 2009).
Atkin, B. and Brooks, A. (2000) Total Facilities Management,
Blackwell Science, Oxford.
Augenbroe, G. (2002) Trends in building simulation. Build-
ing and Environment, 37(8–9), 891–902.
Augenbroe, G. and Park, C.S. (2005) Quantification meth-
ods of technical building performance. Building Research
& Information, 33(2), 159–72.
Bala Krishnan, N.T. (1992) A simulation model for mainte-
nance planning, in Proceedings of Annual Symposium on Reli-
ability and Maintainability, Las Vegas, NV, 21–23 January,
pp. 109–18.
Barrett, P. and Baldry, D. (2003) Facilities Management:
Towards Best Practice, Blackwell Science, Oxford.
Bazjanac, V. (2001) Acquisition of building geometry in the
simulation of energy performance. Paper presented at the
Building Simulation Conference, Rio de Janeiro, 13–15
August.
Briselden, D. and Cain, D. (2001) The facilities condition
index: a useful tool for capital asset planning. Facility Man-
ager, 17(4), 33–7.
Cable, J.H. and Davis, J.S. (2004) Key Performance Indicators for
Federal Facilities Portfolios, Federal Facilities Council Technical
Report 147, National Academies Press, Washington, DC.
Cash, C. and Twiford, T. (2009) Improving student achieve-
ment and school facilities in a time of limited funding, avail-
able at http://goo.gl/fzRq1u (accessed 12 January 2012).
Chan, K.T., Lee, R.H.K. and Burnett, J. (2001) Mainte-
nance performance: a case study of hospitality engineering
systems. Facilities, 19(13/14), 494–503.
Chen, D., Cao, Y., Trivedi, K.S. and Hong, Y. (2003) Pre-
ventive maintenance of multi-state system with phase-type
failure time distribution and non-zero inspection time.
International Journal of Reliability, Quality and Safety Engi-
neering, 10(3), 323–44.
Collins, T.N. and Parson, K.A. (2010) School climate and
student outcomes. Journal of Cross-Disciplinary Perspectives
in Education, 3(1), 34–9.
Cooper, A. (2012) What is analytics? Definition and essential
characteristics. CETIS Analytics Series, 1(5), 1–10.
Crawley, D.B., Hand, J.W., Kummert, M. and Griffith, B.T.
(2005) Contrasting the Capabilities of Building Energy Perfor-
mance Simulation Programs, Report sponsored by United
States Department of Energy, Washington, DC, University
of Strathclyde, Glasgow, and University of Wisconsin,
Madison, WI.
De Groote, P. (1995) Maintenance performance analysis: a
practical approach. Journal of Quality in Maintenance Engi-
neering, 1(2), 4–24.
Delbaen, F. and Shirakawa, H. (2002) An interest rate model
with upper and lower bound. Asia-Pacific Financial Mar-
kets, 9(3), 191–209.
Dessouky, Y.M. and Bayer, A. (2002) A simulation and
design of experiments modeling approach to minimize
building maintenance costs. Computers & Industrial Engi-
neering, 43(3), 423–36.
Douglas, J. (1993/94) Developments in appraising the total
performance of buildings. Structural Survey, 12(6), 10–5.
Douglas, J. (1996) Building performance and its relevance to
facilities management. Facilities, 14(3/4), 23–32.
Earthman, G.I. (2002) School facility conditions and student
academic achievement. UCLA’s Institute for Democracy,
Education and Access University of California, Los
Angeles, CA, available at http://bit.ly/1HFE0jU.
Elias, T. (2011) Learning analytics: definitions, processes and
potential, available at http://goo.gl/SnJCZk (accessed 10
July 2013).
Enoma, A. and Allen, S. (2007) Developing key performance
indicators for airport safety and security. Facilities, 25(7/8),
296–315.
Evans, J.R. (2012) Business analytics: the next frontier for
decision sciences. Decis Line, 43(2), 4–6.
Fife, G.A. (2006) 2006 Middle School Prototype, Space Require-
ments and Special Conditions, New School and Facility Plan-
ning, Clark County School District, NV.
Fitz-enz, J. (2009) Predicting people: from metrics to analyt-
ics. Employment Relations Today, 36(3), 1–11.
Fowler, K.M., Solana, A.E. and Spees, K. (2005) Building
Cost and Performance Metrics: Data Collection Protocol, Paci-
fic Northwest National Laboratory, Richland, WA.
Gumbus, A. (2005) Introducing the Balanced Scorecard:
creating metrics to measure performance. Journal of Man-
agement Education, 29(4), 617–30.
Hammad, A., Fazio, P. and He, H.S. (2005) Computer tool to
achieve better performance and integration of building
envelopes with structural and mechanical systems. Paper
presented at the 33rd Annual General Conference of the
Canadian Society for Civil Engineering, Toronto, 2–4 June.
Hamza, N. and Horne, M. (2007) Building information
modeling: empowering energy conscious design. Paper pre-
sented at the 3rd International ASCAAD Conference on
Embodying Virtual Architecture, Alexandria, Egypt, 28–
30 November.
Hinks, J. and McNay, P. (1999) The creation of a manage-
ment-by-variance tool for facilities management perfor-
mance assessment. Facilities, 17(1/2), 31–53.
Hitchcock, R.J. (2002) High Performance Commercial Building
Systems Program, Element 2 – Project 2.1 – Task 2.1.2, Stan-
dardized Building Performance Metrics, Final Report, Build-
ing Technology Department, Lawrence Berkeley National
Laboratory, Berkeley, CA.
Ho, D.C.H., Chan, E.H.W., Wong, N.Y. and Chan, M.
(2000) Significant metrics for facilities management bench-
marking in the Asia Pacific region. Facilities, 18(13/14),
545–55.
Horner, R.M.W., El-Haram, M.A. and Munns, A.K. (1997)
Building maintenance strategy: a new management
approach. Journal of Quality in Maintenance Engineering, 3
(4), 273–80.
Kincaid, D.G. (1994) Measuring performance in facility
management. Facilities, 12(6), 17–20.
Kinnaman, M. (2007) Searching for excellence using APPA’s
facilities performance indicators. Paper presented at the
National Collegiate FM Technology Conference at the
Massachusetts Institute of Technology, Cambridge, MA,
7–10 August.
Kutucuoglu, K.Y., Hamali, J., Irani, Z. and Sharp, J.M.
(2001) A framework for managing maintenance using per-
formance measurement systems. International Journal of
Operations and Production Management, 21(1/2), 173–94.
Langston, C.A. (2005) Life-Cost Approach to Building Evalua-
tion, Elsevier Butterworth-Heinemann, Oxford.
Lavy, S. and Shohet, I.M. (2004) Integrated maintenance
management of hospital buildings: a case study. Construc-
tion Management and Economics, 22(1), 25–34.
Lavy, S., Garcia, J.A. and Dixit, M.K. (2010) Establishment
of KPIs for facility performance measurement: review of lit-
erature. Facilities, 28(9), 440–64.
Lavy, S., Garcia, J.A. and Dixit, M.K. (2014a) KPIs for
facility’s performance assessment, Part I: identification
and categorization of core indicators. Facilities, 32(5/6),
256–74.
Lavy, S., Garcia, J.A. and Dixit, M.K. (2014b) KPIs for facil-
ity’s performance assessment, Part II: identification of vari-
ables and deriving expressions for core indicators. Facilities,
32(5/6), 275–94.
Lebas, M.J. (1995) Performance measurement and perfor-
mance management. International Journal of Production Eco-
nomics, 41(1–3), 23–35.
Ling, A. and Archer, W. (2012) Real Estate Principles: A Value
Approach, 4th edn, McGraw-Hill, New York, NY.
Lippman, P.C. (2010) Can the physical environment have an
impact on the learning environment? CELE Exchange,
2010/13, OECD Publishing, available at http://goo.gl/
ogxy1w (accessed 3 June 2014).
Montazer, M.A., Ece, K. and Alp, H. (2003) Simulation
modeling in operations management. Paper presented at
the 14th Annual Conference of the Production and Oper-
ations Management Society, Savannah, GA, 4–7 April.
Mozaffarian, R. (2008) Life-cycle indoor air quality compar-
isons between LEED certified and non-LEED certified
buildings, MS thesis, University of Florida, Gainesville, FL.
Neely, A., Richards, H., Mills, J., Platts, K. and Bourne, M.
(1997) Designing performance measures: a structured
approach. International Journal of Operations and Production
Management, 17(11), 1131–52.
Neiger, B.L., Thackeray, R., van Wagenen, S.A., Hanson,
C.L., West, J.H., Barnes, M.D. and Fagen, M.C. (2012)
Use of social media in health promotion: purposes, key per-
formance indicators, and evaluation metrics. Health Promo-
tion Practice, 13(2), 159–64.
New Zealand National Asset Management Steering Group
(2011) International Infrastructure Management Manual,
4th edn, New Zealand National Asset Management Steer-
ing Group and the Institute of Public Works Engineering
Australia, Wellington, New Zealand.
Office of the Legislative Auditor (2000) Preventive Mainte-
nance for Local Government Buildings, Report #00-06, Office
of the Legislative Auditor, State of Minnesota, St. Paul,
MN, 12 April.
O’Sullivan, D.T., Keane, M.M., Kelliher, D. and Hitchcock,
R.J. (2004) Improving building operation by tracking per-
formance metrics throughout the building lifecycle
(BLC). Energy and Buildings, 36(11), 1075–90.
Prakash, P. (2005) Effect of indoor environmental quality on
occupant’s perception of performance: a comparative
study, MS thesis, University of Florida, Gainesville, FL.
Rhode Island Department of Education (2007) RIDE School
Construction Regulations, Rhode Island Department of Edu-
cation, RI.
Schroeder, C.G., Baird, J., Roberts, K.B. and Cohen, G.L.
(1995) Studies for the Development of a Space Utilization
Index, Technical Report 95/34, US Army Corps of Engi-
neers, Construction Engineering Research Laboratories,
September.
Shohet, I.M. (2003) Building evaluation methodology for set-
ting maintenance priorities in hospital buildings. Construc-
tion Management and Economics, 21(7), 681–92.
Shohet, I.M. (2006) Key performance indicators for strategic
healthcare facilities maintenance. Journal of Construction
Engineering and Management, 132(4), 345–52.
Shohet, I.M. and Lavy, S. (2004) Healthcare facilities man-
agement: state of the art review. Facilities, 22(7/8), 210–20.
Slater, S.F., Olson, E.M. and Reddy, V.K. (1997) Strategy-
based performance measurement. Business Horizons, 40(4),
37–44.
Syakima, N., Sapri, M. and Shahril, A.R. (2011) Measuring
performance for classroom facilities. Paper presented at
the 2011 International Conference on Sociality and Eco-
nomics Development, Kuala Lumpur, 4–5 June.
Teicholz, E. and Edgar, A. (2001) Facility Condition Assess-
ment Practices, available at http://goo.gl/XXPUQH
(accessed 15 August 2009).
Tsang, A.H. (1998) A strategic approach to managing
maintenance performance. Journal of Quality in Mainte-
nance Engineering, 4(2), 87–94.
Tsang, A.H.C., Jardine, A.K.S. and Kolodny, H. (1999)
Measuring maintenance performance: a holistic approach.
International Journal of Operations & Production Manage-
ment, 19(7), 691–715.
Vanier, D.J. (2000) Asset management 101: a primer. Paper
presented at the Innovations in Urban Infrastructure Seminar
of the APWA International Public Works Congress, 10–12
September, Louisville, KY.
Whitestone Research Corporation (1999) Estimates of
Unscheduled Facility Maintenance, Whitestone Report,
Whitestone Research Corporation.
Young, E., Green, H.A., Patrick-Roehrich, L., Joseph, L. and
Gibson, T. (2003) Do K-12 School Facilities Affect Education
Outcomes?, The Tennessee Advisory Commission on Inter-
governmental Relations, Nashville, TN.
Zhu, Y. (2006) Applying computer-based simulation to energy
auditing: a case study. Energy and Buildings, 38(5), 421–8.
Appendix
List of the variables and abbreviations used
in this paper
Estimated Life Cycle (Years)
This is the estimated life of the system of interest. Changing the life cycle affects the rate of deterioration of the system and the rate of maintenance costs. The simulation may start at any point of the life cycle for the system, but for the purposes of this paper, all systems start as new.
Initial Maintenance Rate (%)
This is the expected rate of maintenance in the first year of the system. The amount of maintenance ($) expected for the system in the first year is calculated by multiplying the rate by the System Replacement Value at the end of the first year.
System Replacement Value (SRV)/Plant Replacement Value (PRV)
This is the cost to replace the system/plant without accounting for the Renewal Rate. It is assumed to increase due to the Rate of Inflation for System and the Environmental Rate.
Final Maintenance Rate (%)
This is the expected rate of maintenance in the final year of the system. The amount of maintenance ($) expected for the system in the final year is calculated by multiplying the rate by the projected System Replacement Value at the end of the final year.
Years Until Planned Replacement
This is when we plan to replace the system, which does not have to be at the end of the life of the system. The simulation allows us to study how KPIs are affected by altering this value.
Renewal Rate Factor
This rate is multiplied by the SRV/PRV to estimate the cost of replacement, accounting for the use of existing infrastructure or the cost of tear-down and haul-away.
Budget for Maintenance
Treated as a random variable in the simulation (we have an expected target, but it fluctuates from year to year due to variability). This is the rate of the SRV that determines how much is spent during the year on maintenance. The difference between the amount spent on maintenance and the amount needed to be spent on maintenance is the Deferred Maintenance. If the budget is 100%, the amount spent on maintenance is simply the current SRV, and there is no Deferred Maintenance for that particular year.
Deferred Maintenance (DFM)
The difference between the amount ($) spent on maintenance and the amount ($) needed to be spent on maintenance.
Targeted Deferred Maintenance
The amount of DFM ($) tolerated to maintain a target CI.
The Discount Rate (%)
Treated as a random variable in the simulation (we have an expected target, but it fluctuates from year to year due to variability). This is the rate for the opportunity cost of capital, which is used in the calculation of the Net Present Value.
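The 'random variable' treatment can be as simple as drawing each year's rate around the expected target. The paper does not state which distribution was used; the normal distribution, the 0.5% spread and the seed below are our assumptions for illustration only.

# Hypothetical sketch of a fluctuating discount rate (distribution assumed).
import random

def yearly_discount_rates(years, target=0.05, spread=0.005, seed=42):
    """Draw one discount rate per simulated year around the expected target."""
    rng = random.Random(seed)
    return [max(rng.gauss(target, spread), 0.0) for _ in range(years)]

print(yearly_discount_rates(5))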
Net Present Value of Dollars Spent
The equivalent present value of all future cash outflows that enable maintenance and replacement (capital renewal) activities to be conducted, given a specific discount rate.
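In standard discounted-cash-flow notation, with C_t denoting the cash outflow for maintenance and capital renewal in year t, r the Discount Rate, and T the study horizon:

NPV = Σ_{t=1}^{T} C_t / (1 + r)^t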
Rate of Inflation for DFM (%)
Treated as a random variable in the simulation (we have an expected target, but it fluctuates from year to year due to variability). The cost of repair, for maintenance that is not performed, is estimated to increase by this rate of inflation each year.
Rate of Inflation for System (%)
Treated as a random variable in the simulation (we have an expected target, but it fluctuates from year to year due to variability). The cost of replacement for a system is estimated to increase by this rate each year due to inflation.
Environmental Rate (%)
Treated as a random variable in the simulation (we have an expected target, but it fluctuates from year to year due to variability). The cost of replacement for a system is estimated to increase by this rate each year due to environmental laws and regulations, updated building codes, etc.
Condition Index (CI)
CI is a metric to assess the readiness of a system, plant and/or facility. If the CI equals 1, the system is 'as good as new'; if the CI equals 0, the system life is expired.

CI = 1 − (Total Current Deferred Maintenance / Total Current SRV)
Maintenance Efficiency Indicator (MEI)
MEI is a metric to assess the efficiency of maintenance expenditure. An MEI of 100% means that the exact amount necessary to maintain the target CI was spent in the given year on deferred maintenance. An MEI below 100% means that not enough was spent, and an MEI above 100% means that more was spent than needed. If the MEI equals 60%, for example, that means we underspent by 40% on what needed to be spent on deferred maintenance to either maintain the target CI or take care of preventive maintenance costs. If the MEI equals 110%, then we spent 10% more than necessary to maintain the target CI.

MEI = 100 × (Amount Spent on Maintenance (Excluding Replacement) ($) / (DFM − Targeted DFM))

Replacement Efficiency Indicator (REI)
REI is a metric to assess the efficiency of system replacement expenditure. Given that systems typically do not expire each year, REI is not a smooth continuous function. An REI below 100% means that not enough was spent on replacement, and an REI above 100% means that more was spent than needed. In this paper, the REI is capped at a maximum of 200%.

REI = 100 × ((Amount Spent on System Replacement ($) + 0.001) / (Total Cost of Expiring Systems ($) + 0.001))