Quality Improvement and Information Technology Infrastructure Costs in
Software Product Development: A Longitudinal Analysis
Donald E. Harter University of Michigan Business School
Ann Arbor, MI 48109-1234 Phone: (734) 764-3108
Fax: (734) 936-0279 Email: [email protected]
Sandra A. Slaughter*** Graduate School of Industrial Administration
Carnegie Mellon University Pittsburgh, PA 15213 Phone: (412) 268-2308 Fax: (412) 268-7345
Email: [email protected]
*** contact author for this paper
Do not cite, quote, or distribute without permission of the authors
Last Updated: October 31, 2001
Quality Improvement and Information Technology Infrastructure Costs in
Software Product Development: A Longitudinal Analysis
Abstract
Information technology (IT) infrastructure activities include shared computing services like computer
operations, data management, and configuration management. These activities facilitate, but are not directly
involved in, software product development. IT infrastructure costs represent a substantial portion of firms’ IT
budgets. This highlights the importance of innovations that yield significant cost savings in infrastructure
activities. Although increasing evidence indicates that quality improvement reduces costs in software product
development, it is not known whether IT infrastructure activities also benefit from quality improvement. In
this study, we examine the link between software process maturity, product quality and the costs of
infrastructure activities. We evaluate monthly cost data collected in IT infrastructure activity centers over ten
years of software product development in a major IT company. During this time, the company advanced
several levels in process maturity as assessed by the Software Engineering Institute’s Capability Maturity
Model™. We find that IT infrastructure activities benefit substantially from quality improvement. The greatest
marginal cost reductions are realized in activities that occur later in the software product life cycle.
Organizational inertia influences the rapidity with which the infrastructure activity centers benefit from
higher product quality. Finally, our findings suggest diminishing returns to quality improvement in
infrastructure activities.
Key Words: Software Process Improvement; Information Technology Infrastructure Costs; Software
Quality; Capability Maturity Model; Organizational Inertia.
Quality Improvement and Information Technology Infrastructure Costs in Software Product Development: A Longitudinal Analysis
1. Introduction
In information-intensive industries where state-of-the-art development and deployment of
information technology (IT) are strategic necessities, IT infrastructure is a central issue. IT infrastructure is a
critical enabler of competitive performance and improved business processes (Broadbent et al. 1999; Hamel
and Prahalad 1994) and is key to the development time, cost and feasibility of implementing innovative
information systems (Duncan 1995). According to Broadbent et al. (1996), IT infrastructure is “the enabling
base of shared IT capabilities which provide the foundation for other business systems… This capability
includes both the internal technical (equipment, software and cabling) and managerial expertise required to
provide reliable services” (p. 175). IT infrastructure activities include shared services such as computer
operations, data management, and configuration management (Broadbent et al. 1999; Boehm 1981). These
activities facilitate, but are not directly involved in, software product development.
Empirical studies indicate that the costs of IT infrastructure are substantial, often exceeding those of
software development. A study of IT investment patterns across 27 firms in seven countries by Weill and
Broadbent (1998) reveals that the firms spent on average more than half (58%) of their total IT investments
on infrastructure and infrastructure services and the remainder on the development of informational, strategic
and transactional systems. Information-intensive firms spent even more (65%) of their total IT investments on
infrastructure and infrastructure services. A broad-based analysis of IT costs across 2,151 U.S. corporations
by Strassman (1999) found that 40% of IT spending is on services to support IT infrastructure. As these
examples illustrate, the costs of IT infrastructure activities can be substantial, and can rival or outweigh
systems development costs (McLean and Wilkes 1990). This highlights the importance of innovations that
result in significant cost savings in IT infrastructure activities.
An innovation that is yielding performance benefits in software development is software process
improvement. A software process is a set of well-defined procedures leading to the development of software
products (Curtis et al. 1992; Zahran 1998). Software processes are improved by implementing a variety of
practices such as training, quality assurance, measurements, design and code reviews, and change control
(Dekleva and Drehmer 1992). A mature software process consistently yields software products that meet
design specifications (Krishnan and Kellner 1999). Research has linked software process maturity to higher
product quality, increased development productivity, and reduced development cycle time and costs (Harter
et al. 2000; Krishnan et al. 2000; Goldenson and Herbsleb 1995). This work has theorized that the primary
driver of improved software development performance is reduced rework due to better software processes.
While increasing evidence supports the performance benefits of quality improvement for software
development, it is not known whether IT infrastructure activities also benefit. It is reasonable, however,
to posit that improved product development processes could yield significant cost savings in infrastructure
activities. A basic conjecture is that as software development processes become more
consistent, product quality improves, and less rework is required in the infrastructure activities that support
product development. For example, consider program control. A primary function of this activity is to create
and monitor software development schedules and plans. If software development processes are inconsistent,
and software products have many defects leading to development delays, significant rework is required in
program control to re-schedule and re-budget product development. However, as software processes mature
and become less variable, products have fewer defects and are more likely to be on schedule and within
budget. Thus, the costs of rework in program control are reduced. Generalizing this logic suggests that the
cost savings from rework avoidance in IT infrastructure activities could be substantial. However, the impact of
quality improvement on IT infrastructure activities is not known.
In this study, we build a conceptual framework to link software process maturity, product quality, and
IT infrastructure costs. Our objective is to model the IT infrastructure activity-specific costs and benefits of
quality improvement. We evaluate the relative magnitudes of the costs and benefits in constructing our
hypotheses. We also model the effects of organizational inertia on IT infrastructure costs. Given the long-
term nature of investments in IT infrastructure, it is likely that organizational inertia would delay the
responsiveness of infrastructure activities to quality improvements. Our framework is evaluated by examining
monthly cost data collected in nine IT infrastructure activity centers over ten years of software development
in a major IT company. During this time, the company advanced several levels in process maturity as
assessed by the Software Engineering Institute’s Capability Maturity Model (CMM™). Our paper is organized
as follows. The next section develops our research framework and hypotheses. Section 3 provides details
about our research site and data collection. Model estimation and results are presented in Section 4. In the
final sections, we interpret and discuss our findings, discuss the contributions and limitations of this study and
provide directions for future research.
2. Research Framework and Hypotheses
2.1. Research Framework
Our research framework (Figure 1) depicts the cost impact of quality improvement on IT
infrastructure activities. The framework integrates two models that interrelate software process maturity,
product quality, organizational inertia, and infrastructure costs. The first model relates software product
quality to process maturity, controlling for other factors. Software quality at time t is modeled as a function of
process maturity at time t, as process improvements directly impact product development activities:

Software Product Quality_t = f(Software Process Maturity_t, Software Development Controls_t)

The second model relates costs in IT infrastructure activity center c at time t to software product quality at
time t, controlling for other factors. The effects of software process improvements on IT infrastructure costs
are mediated through software product quality. Because organizations can be slow to change even in light of
efficiency improvements, we include the effect of organizational inertia via lagged IT infrastructure costs:

IT Infrastructure Costs_c,t = f(Software Product Quality_t, IT Infrastructure Costs_c,t-1, IT Infrastructure Controls_t)

In the following sections, we explain and justify our models in more detail and present our hypotheses.
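The functional forms above can be written out as an explicit pair of equations. The linear specification below is a hedged illustration consistent with the framework, not the authors' actual estimating equations; the symbols (Q_t, M_t, C_{c,t}) and coefficients are our own shorthand.

```latex
% Illustrative linear forms (our notation, not the authors' estimated models):
%   Q_t      -- software product quality at time t
%   M_t      -- software process maturity at time t
%   C_{c,t}  -- cost of IT infrastructure activity center c at time t
%   X_t, Z_t -- vectors of development and infrastructure controls
\begin{align}
  Q_t     &= \beta_0 + \beta_1 M_t + \boldsymbol{\gamma}'\mathbf{X}_t + \varepsilon_t \\
  C_{c,t} &= \alpha_0 + \alpha_1 Q_t + \alpha_2 C_{c,t-1}
             + \boldsymbol{\delta}'\mathbf{Z}_t + u_{c,t}
\end{align}
```

Under Hypotheses 1 and 2, one would expect beta_1 > 0 and alpha_1 < 0; the lagged cost term C_{c,t-1} captures organizational inertia.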
--------- insert Figure 1 here ------
2.2 Software Process Maturity and Software Product Quality
The software engineering literature suggests a strong association between software process maturity
and product quality (e.g., Harter et al. 2000; Krishnan et al. 2000; Herbsleb et al. 1997). Software processes
become more mature, that is, more disciplined and consistent as organizations add practices like design
reviews, code reviews, configuration control, and measurement (Krishnan and Kellner 1999). These practices
can reduce defects in the software product through identification of disparities between requirements, design
specifications, and the code. Such activities ascertain whether the developed software matches customer
expectations and design specifications early in the software development life cycle. In particular, several
studies indicate that processes promoting early detection and correction of errors help to design quality into
the delivered software products (e.g., Laitenberger and DeBaud 2000; Slaughter et al. 1998; Dyer and
Kouchakdjian 1990). Therefore, in line with prior research, we expect that:
Hypothesis 1: Higher software process maturity is associated with higher software product quality.
We state this hypothesis controlling for the potential effects of product size and the use of computer-
aided software engineering (CASE) tools in software development. Product size has been identified as a
primary determinant of product defects (Basili and Musa 1991). As products increase in size, there are more
opportunities to introduce errors. In addition, large products can have more modules and module interactions,
further increasing the likelihood of defects. Finally, larger products tend to be more complex both
functionally and technically, and software complexity is strongly associated with software defects (Banker et
al. 2002). Thus, we expect that product quality will decrease with product size. We also control for the use of
CASE tools in software development. CASE tools automate the software development process by generating
software code to match design parameters entered into the tools by software engineers. Because the code is
automatically generated, CASE tools introduce a consistency in the code and reduce syntactical errors. In
addition, CASE tools facilitate increased end user involvement in the software development process (Finlay
and Mitchell 1994). Thus, we expect that the use of CASE tools is associated with higher product quality.
2.3 Software Product Quality and the Costs of IT Infrastructure Activities
The effects of software process maturity are mediated through product quality in terms of their
impacts on infrastructure costs. This is because process improvements are focused directly on facilitating
software development, while IT infrastructure activities indirectly benefit from these improvements. Building
upon the work of Boehm (1981) and Weill and Broadbent (1998), we delineate five functional categories of
IT infrastructure activities that support software development: product management, process management,
integration management, operations management, and documentation production. Product Management
manages client relations, establishes product schedules and budgets, monitors development progress, controls
resource allocation, and ensures timely product delivery. Process Management creates procedures to manage
the software development process, monitors compliance, and audits products and processes to ensure quality.
Integration Management establishes technical standards for software and data integration, monitors
compliance, and controls changes to technical content that affects other product lines. Operations
Management controls the system level architecture, runs production software for users, and provides
assistance to the software development team through systems expertise and operations support.
Documentation Production involves the physical production of specification documents, user manuals,
operator manuals, and maintenance documentation. In the following sections, we delineate the tasks and
responsibilities of each activity and describe the impact of quality improvement. We then state our second
hypothesis that relates software quality to IT infrastructure costs.
2.3.1 Product Management. Product Management includes Senior Management and Program Control.
Senior Management has a primary focus on the long-term issues associated with software production (Zahran
1998). Senior Management provides a periodic review of the product development and infrastructure
functions and allocates the resources for long-term improvement of software processes (Paulk et al. 1994).
Senior management commitment is considered a requirement for the success of quality improvements in
software development (Alter 1991). Although generating an organizational commitment to quality
improvement could be very challenging, the actual time invested (cost incurred) is limited, relative to senior
managers’ other responsibilities. Senior Management can benefit from quality improvement in several ways.
Poorly constructed products with many defects require significant management time to resolve issues of
schedule and budget; this may require reallocation of resources. Product defects also have the potential to
negatively impact the relationship with the client, and senior managers may need to re-negotiate contract
terms or otherwise mitigate the negative consequences of product defects on client relations. As software
quality improves, there are fewer defects and reduced instances requiring intervention with the client and
reallocation of resources to resolve schedule and budget problems. Because Senior Management usually
includes highly paid executives, the cost savings from reduced rework in this center could be significant.
Program Control is a key factor in software development success (Douglas 1999). The purpose of
Program Control is to provide visibility into development performance (Paulk et al. 1994), facilitate software
development management by measuring progress against cost and schedule constraints (Aptman 1986), and
identify activities not meeting their cost and schedule objectives (Ray 1998). Historically, Program Control
was used in engineering and construction projects, but has recently become important in the management of
software development (Findley 1998). As the first step of the planning and control function, Program Control
develops work breakdown structures, work package descriptions, initial schedules with milestones, cost
estimates and budgets, and resource profiles (Kibler 1992). After establishing cost estimates and schedules,
Program Control updates the schedules and budgets based on the accumulation of costs and resources over
time (Martin 1992, Jones 1994). Program Control has an investment in quality improvement that is limited to
modifying schedule and budget templates to reflect new and improved software development processes.
Potential cost savings in Program Control from higher product quality are significant and would accrue from
fewer revisions to schedule and budget baselines. Products with many defects frequently require changes to
both schedule and budget. Thus, as quality improves, there is potential for significant reduction in costs in this
IT infrastructure activity due to rework avoidance in re-estimating, re-scheduling and re-budgeting.
2.3.2 Process Management. Process Management includes Configuration Management and Quality
Assurance. Configuration Management is often identified as an important aspect in the management of
technology projects (Guthrie 1984). The goal of software configuration management is to ensure that
delivered software meets user requirements and performance criteria (Bersoff 1984). Configuration
Management is responsible for establishing baselines against which progress is measured and for collecting
necessary metrics (Humphrey 1989). Configuration Management is also responsible for the storage and
recording of software products and associated documentation (Jones 1994). Stored items are cross-referenced
between software and documentation to ensure traceability from the software backward through
the specifications to the original requirement (Paulk et al. 1994). This enables management and control of
changes in requirements, design specifications, and software code. Configuration Management’s investment
in software process maturity and quality improvement involves initial effort to establish tracking and
monitoring procedures, and ongoing effort to collect metrics that facilitate change control and traceability.
Benefits of quality improvement to Configuration Management derive from reduced rework in a number of
areas. With fewer product defects, there are fewer versions of changed software code, fewer migrations of
changed code, and fewer modifications to software-documentation cross-references.
Quality Assurance in software development evolved from sophisticated techniques developed to
assure quality in the hardware industry (Gupta 1989, Buckley and Poston 1984) and has become an important
component in the high technology industry (Guthrie 1984). Unfortunately, Quality Assurance has often been
perceived as an impediment to software development (Turnbull 1986). Quality Assurance personnel are
functionally independent of the software development process (Buckley and Poston 1984) and have important
process and product monitoring responsibilities focused on ensuring software product quality. Quality
Assurance activities include the creation and monitoring of development standards, in addition to the auditing
of software products and processes (Boehm 1981, Ferry 1985). The objectives of Quality Assurance are to
provide management visibility into processes and products (Paulk et al. 1994) and to provide confidence that
the software meets all technical requirements (Zahran 1998). Additional responsibilities include reviewing
development plans, test plans, and test results (Humphrey 1989). Quality Assurance establishes the policies
and procedures that are essential to improving process maturity. Benefits of increased product quality to
Quality Assurance accrue from several sources. A high level of defects creates additional work for Quality
Assurance. Each code change to fix a defect must be reviewed, as well as new and modified test plans, test
results for the changed code, and updates to documentation. Thus, as product quality improves, there is
correspondingly less rework required to review revised software, test plans and results, and documentation.
2.3.3 Integration Management. Integration Management includes Software Integration and Data
Integration. Software Integration activity centers are established to facilitate inter-group coordination in
organizations that develop products that interact through software interfaces (Paulk et al. 1994). Failures in
software integration have been notorious in delaying major programs (McKenna 1996, Evers 1996). Software
Integration manages the technical specifications for interfaces and interactions between software products and
development teams. The quality investment in Software Integration is limited to the initial development of
standards for interface development and inter-group coordination. The benefit of quality improvement in
Software Integration is reduced rework associated with the renegotiation of interface specifications between
software product development teams. As the number of product defects decreases, there is correspondingly
less effort required to renegotiate interface specifications. Software Integration is typically staffed with highly
paid senior engineers who have broad technical and functional knowledge. Thus, as with Senior Management
and Program Control, there is potential for significant cost savings from less rework.
Large-scale systems rely on integrated databases to facilitate the sharing of data among multiple
software products and at distributed processing sites (Adrian 1986). Data Integration creates tools such as
data dictionaries to store descriptions of data elements and relationships among elements, and to control
access to data to ensure a common understanding for users and developers (Pressman 1997, White 1987). To
facilitate data access, Data Integration maintains rules and instructions for locating data (Henderson 1987)
and for cross-referencing data to software products (Venkatakrishnan 1988). The quality investment in Data
Integration is limited to initial development of data integration standards to control access to database
elements, editing criteria, and relationships among elements. Similar to the effect on Software Integration, the
benefits of quality improvement in Data Integration include reduction in rework associated with the re-
negotiation of data definitions, database access rules, and cross-references between data and software code.
2.3.4 Operations Management. Operations Management includes activities that support the day-to-
day operation of the data center. This function has the smallest quality investment relative to the other
functional areas, due to the emphasis on supporting software production activities rather than software
development activities. We distinguish Systems Support from Operations activities in the Operations
Management function. Systems Support includes specialists in operating systems, database management
system software, security software, and telecommunications systems. Systems Support activities include the
installation and maintenance of operating systems, database management systems, security systems, tape
management systems, networks, backup and recovery, etc. Investments in quality improvement in Systems
Support are generally limited to initial development of processes associated with each component of the
systems software. Benefits from improved process maturity include reduced involvement in system failures
resulting from software product defects. Each failure has the potential to require significant effort by Systems
Support to help resolve the problem. Thus, as quality improves, and defects decrease, system failures due to
defects also decrease, and there is less effort required in this activity. Systems Support staff are highly compensated
due to the level of expertise required. Thus, as with Senior Management and Program Control, there is
potential for significant reduction in costs due to less effort in resolving system failures.
Operations is traditionally characterized as automated data processing or computer operations and
involves the day-to-day running of the data center by computer operators. Computer operations represents a
significant cost in the IT budget. For example, Keen argues that every one-dollar investment in software
development incurs an annual liability of 20 cents in operations (Keen 1991). Operations activities include
scheduling and running batch programs, controlling computer site workload, managing printers and tapes, and
monitoring disk, memory, and telecommunications workload. Investments in quality improvement in
Operations are generally limited to training of staff on new processes. However, significant benefits can
accrue to Operations from quality improvement. Each defect could involve code changes and tests, requiring
Operations to conduct additional runs of new, changed, and existing batch programs. With improved quality,
there are fewer re-runs of batch programs due to system crashes in both production and development tests.
2.3.5 Documentation Production. Documentation is essential to the management of software
development because it is a primary mechanism available to non-technical managers to monitor the
development of their products (Beach 1984). Leavitt (1977) reported that documentation costs are an
important component of software costs, and Jones (1986) characterized documentation as one of the hidden
costs of software development. Boehm (1981) found that 51% to 54% of all software development activity
results in a document as its end product, versus only 34% resulting in actual code. Examples of
documentation include design specifications, user guides, programmer manuals, operator guides, and
maintenance manuals (Jones 1994). Similar to Configuration Management and Quality Assurance, there are
initial and ongoing quality investments in Documentation. Documentation supports the process redesign
necessary for higher levels of process maturity. However, each product defect that requires code changes
could also require changes to documentation as well as additional inspection to determine that software and
documentation are in sync. Thus, as quality improves, there is reduced rework due to fewer errors in
documentation and fewer consistency errors between software and documentation.
Based upon the task descriptions for each IT infrastructure activity elaborated in the sections above, we
expect that there are initial investments required to establish procedures and mechanisms for quality
improvement. However, after initial set up of work practices and standards, on an ongoing basis, investments
in quality improvement are significantly lower, while potential benefits are high. Higher quality products
have fewer defects and thus require less rework in the activities supporting software product development.
Overall, we expect that less rework would translate into lower IT infrastructure activity costs as personnel
reduce the time spent on re-doing tasks. Therefore,
Hypothesis 2: Higher software product quality is associated with lower costs in IT infrastructure
activities.
Table 1 summarizes the tasks accomplished and the quality investments and benefits in each IT
infrastructure activity center.
----- insert Table 1 here -----
2.4 Organizational Inertia
Organizational inertia, i.e., slowness to change, is an impediment to process improvement associated
with quality initiatives (Reger et al. 1994). The theory of organizational inertia originates with the seminal
work by Hannan and Freeman (1984). According to Hannan and Freeman, organizations do not often succeed
in making radical changes in response to environmental changes because of structural inertia. Structural
inertia is high when the speed of organizational change is much slower than the rate at which environmental
conditions change. Inertia limits both the rate of change and the organization’s ability to adapt (Laverty
1996). Organizations with established work routines are particularly inert. An established work practice has
little ambiguity in execution and a known history of returns, while a new practice has highly uncertain future
returns (March 1981). In fact, Hannan and Freeman (1989) argue that the capacity to respond quickly to
environmental changes competes with the capacity to perform reliably and accountably. Studies by Greve
(1999; 1998) have found that organizations with highly effective work practices are likely to suffer
performance decrements when implementing changes. Disrupting internal routines, structures, and linkages in
core work activities makes an organization vulnerable to lower performance until it can rebuild these
elements (Tushman and Romanelli 1985). Some suggest that this is why adoption of best practices that are
new to the organization is difficult (Rumelt et al. 1994).
Organizational inertia may particularly apply to the specialized infrastructure activities supporting
software development. Given the magnitude of investments in IT infrastructure, the decision-making horizon
tends to be oriented toward the long term rather than the short term (Weill and Broadbent 1998). In addition,
each IT infrastructure activity center has a unique set of work practices that reflects its organizational identity
(Fiol and Huff 1992; Albert and Whetten 1985). This organizational identity could act as an inertial barrier
hindering organizational change (Reger et al. 1994). In addition, IT infrastructure activities are linked in
complex ways to software development processes, with complicated interdependencies. Thus, changes to core
work practices in software development could require adjustments in IT infrastructure activities that are
difficult to predict, causing coordination uncertainty and reducing work performance.
Reger and colleagues argue that one of the important aspects that distinguishes quality initiatives
from other types of organizational change is the redistribution of resources. Quality improvement can
increase productivity, reducing the resources required to perform IT infrastructure activities.
However, organizational inertia could delay the management response (Gresov et al. 1993) to quality
improvement. When inertia is high, managers are less able to recognize and respond to the need for change
and often delay in spite of evidence necessitating change (Cohen 1998). Inertia becomes more salient in
organizational units that develop very focused tasks and skill sets. In software development, Fichman and
Kemerer (1997) and Ravichandran (2000) have found that organizations are relatively worse at process
innovation when they have less diverse information flows, technical knowledge and activities. In IT
infrastructure activities such as Data Integration, staff are very specialized in their knowledge and application
of database concepts and cannot easily take on tasks outside of the database domain. The tendency not to
change increases as the switching costs of moving personnel to other departments rise (Hannan and Freeman
1984). Special-purpose units have small margins of error because they cannot readily reduce or expand the
scope of their activities in response to temporary changes in the environment. Therefore, the most specialized
IT infrastructure activities, such as Data Integration, may be the slowest to change.
For the reasons articulated above, we expect that organizational inertia will influence the impact of
quality improvement on all IT infrastructure activity centers, and particularly on those that have very
specialized work practices. This implies that changes in IT infrastructure activities will lag actual productivity
gains from improved software process maturity. Thus,
Hypothesis 3: Organizational inertia is associated with higher costs in IT infrastructure activities.
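The role of the lagged cost term can be made concrete with a standard partial-adjustment interpretation. This is our own illustration, not a derivation from the paper; C*, lambda, and the theta coefficients are hypothetical notation.

```latex
% Partial-adjustment sketch (our illustration, not the authors' model):
%   C^*_{c,t} -- the "target" cost level implied by current product quality
%   \lambda \in [0, 1) -- inertia (higher \lambda = slower adjustment)
\begin{align}
  C_{c,t}     &= \lambda\, C_{c,t-1} + (1 - \lambda)\, C^{*}_{c,t} \\
  C^{*}_{c,t} &= \theta_0 + \theta_1 Q_t
\end{align}
% Each period the center closes only the fraction (1 - \lambda) of the gap
% between current and target costs, so the short-run effect of quality is
% (1 - \lambda)\theta_1 while the long-run effect is \theta_1: the full cost
% savings from a quality improvement are realized only gradually.
```

In this reading, a more specialized activity center (e.g., Data Integration) corresponds to a larger lambda, so its costs respond more slowly to quality gains.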
Our hypotheses for IT infrastructure costs are stated controlling for the magnitude of software
development workload (the size of products under development) and software production workload (the size
of products supported) in each activity center. The work in software development and production drives work
in IT infrastructure and thus should be positively related to IT infrastructure costs. We also control for a
change in human resource policy by the organization in this study that involved a decision to hire individuals
with higher educational qualifications to staff many of the jobs in IT infrastructure. Educational qualifications
are primary elements of individuals’ human capital endowments. Consistent with human capital theory
(Becker 1975), we expect that individuals with higher levels of education are compensated more highly to
reflect their greater human capital endowments, leading to higher costs in IT infrastructure activities after
implementation of the new hiring policy.
3. Research Design and Methodology
3.1 Research Setting
We analyze data collected over approximately ten years from nine IT infrastructure activity centers in
the systems integration division of a major IT company. The company operated internationally with over
9,000 employees and $750 million per year in revenue, with contracts to commercial, international, and
government clients. The company utilized a matrix management approach to facilitate rapid reassignment of
staff. The systems integration division in the company consisted of approximately 300 staff members
generating $67 million in revenue per year on software development and systems integration projects. The IT
infrastructure activity centers in this division supported the development of approximately 3.4 million lines of
code for a material requirements planning (MRP) system from 1985 to 1994 for a government client. The division
designed the system on IBM compatible mainframes using a third generation language. The activity centers
also supported the operation of the software products in this system after customer acceptance.
The company was under contract to develop software products in a fixed price incentive agreement.
Under this agreement, costs incurred above a ceiling price were entirely absorbed by the company. Both IT
infrastructure activity costs and software development costs were included in this ceiling price. The client and
company shared cost savings resulting from process improvement. This incentive agreement encouraged the
company to seek opportunities to reduce costs in both software development and infrastructure activities.
During the ten-year period of the study, changes occurred in the use of CASE tools and in the company's hiring
policies. CASE tools were introduced in Software Development in the first year. In the second year, senior
management recognized the need for engineers with better conceptual skills to effectively use the CASE
technology, and changed the division’s hiring policy to a minimum requirement of an undergraduate degree,
with strong preference given to candidates with graduate degrees.
3.2 Data Collection Methods
Several types of data were collected for this analysis: IT infrastructure costs, process maturity levels,
software product defects, and the volume of development and production activity. Infrastructure cost data
were collected from the corporate time reporting and cost accounting system. Employees were required to
record all time expended on bi-monthly timesheets, which were individually reviewed and approved by both
administrative and management personnel. Timesheet data were tracked by IT infrastructure activity cost
centers and reported monthly. All costs were audited for accuracy by internal and external auditors. Process
maturity data were collected by external divisions and agencies to provide independent assessments of the
company’s software development processes. Independent auditors and senior corporate personnel external to
the systems integration division performed five software process maturity assessments during the ten-year
period. These independent teams used the Software Engineering Institute’s CMM (Paulk et al. 1994) to assess
the maturity of software development and infrastructure activities. The independent auditors and outside
teams reviewed the division’s processes and interviewed employees and managers to determine process
maturity for each evaluation. The Configuration Management activity center maintained a database on
software product defects identified during development and acceptance tests. Configuration Management also
used automated tools to count the lines of source code for each software product. All software products were
written in the same programming language. The use of a single programming language as well as automated
counting tools ensured a consistent approach to measurement of lines of code over the ten-year period.
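The automated line counting described above can be sketched as follows. This is a hypothetical illustration, not the company's actual tool: the convention of marking full-line comments with "*" is an assumption, since the paper does not name the third generation language used.

```python
# Hypothetical sketch of automated line-of-code counting like that used by
# Configuration Management: count non-blank source lines, excluding
# full-line comments. The "*" comment marker is an assumed convention.

def count_lines(source_text, comment_marker="*"):
    """Count non-blank, non-comment lines in a source file's text."""
    count = 0
    for line in source_text.splitlines():
        stripped = line.strip()
        if stripped and not stripped.startswith(comment_marker):
            count += 1
    return count

sample = """* update the customer balance
MOVE TOTAL TO BALANCE

ADD TAX TO BALANCE"""
loc = count_lines(sample)  # 2 countable lines
```

Applying one counting rule to every product, as the division did, is what makes the size measure comparable across the ten-year period.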
3.3 Construct Measurement
We operationalize the major constructs in our models as follows. IT infrastructure costs and lagged
IT infrastructure costs (or organizational inertia) are assessed in terms of the inflation-adjusted dollar costs
incurred in each IT infrastructure activity center for the current month and prior month, respectively.1 The
primary cost in all IT infrastructure activity centers is the time recorded by personnel to support software
products under development and completed software products in production. Thus, the cost data include
predominantly labor (not hardware) costs.2 Process maturity is evaluated using the Software Engineering
Institute’s CMM (Paulk et al. 1994) based upon a number of external assessments of the IT company, as
described earlier. We averaged the company’s assessed CMM maturity levels across the software products
developed in each month. Each CMM maturity level conceptually represents a distinct evolutionary plateau in
an upward path toward achieving excellence in software development (Humphrey 1989). A study of the
measurement properties of the CMM by Dekleva and Drehmer (1997) indicates that the five CMM maturity
levels form a cumulative hierarchy that is necessary to establish a pattern of growth in terms of improvement
1 The consumer price index (CPI) established by the Bureau of Labor Statistics of the U.S. government was used to deflate costs in all IT infrastructure activity centers to a common year.
2 In the Documentation center, costs also include the cost of paper, and in Configuration Management, costs also include the cost of file folders.
in software development practices.3 The company in this study advanced from CMM level 1 (“initial”) to
CMM level 3 (“defined”) over the ten years of the study. Product quality is assessed as the ratio of thousand
lines of code developed each month to the total number of defects found in software product development and
acceptance tests each month. Independent test teams conducted system level development tests using a test
database and a test plan to perform a technical test of the system design. Customers conducted an acceptance
test of the software by using a test database and test plan to verify that functional requirements were satisfied
and by using a full database to stress test the system under simulated live conditions. All defects found in
development and acceptance tests were recorded in a defect database.
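The quality measure defined above (thousand lines of code developed per defect found) can be computed directly. The function and figures below are illustrative only; the defect counts come from the kind of defect database the paper describes, not from the study's data.

```python
# Illustrative computation of the product quality construct: KLOC developed
# in a month divided by defects found in development and acceptance tests
# that month. Inputs are hypothetical example values.

def product_quality(kloc_developed, defects_found):
    """Return the monthly quality ratio (KLOC per defect)."""
    if defects_found == 0:
        raise ValueError("quality ratio undefined when no defects are found")
    return kloc_developed / defects_found

# Example month: 25 KLOC developed, 48 defects found.
quality = product_quality(25.0, 48)  # about 0.52 KLOC per defect
```

Note that a higher ratio means better quality, which is why the cost equations later show negative coefficients on product quality.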
We include a number of control variables in our models. Two variables measure the workload in the
IT infrastructure activity centers. Development Workload is the magnitude (in thousand lines of code) of
products under development that were supported by the infrastructure activity centers in each month.
Production Workload is the magnitude (in thousand lines of code) of completed products that were supported
by the infrastructure activity centers in each month. For the purpose of this analysis, the measure of lines of
code produced per month was derived based on the start and end date of each product, resulting in an estimate
of the lines of code generated during each month of product development. The cumulative sum of lines of
code produced each month reflects the size of software products in production. Thus, lines of code in
development and in production reflect the magnitude of support required in the IT infrastructure activity
centers. Several other variables control for the potential effects of management and technical policy changes
(described earlier) that occurred in the company over the time period of the study. CASE tools were
introduced later in the first year of the study’s timeframe. CASE tool introduction is operationalized as a
binary variable that is set to “1” when CASE tools are introduced, and is zero prior to then. In the second
year, senior management implemented a change in hiring policies and focused on hiring individuals with
advanced degrees for jobs in IT infrastructure activities. The change in hiring policy is operationalized as a
binary variable that is set to “1” when new hiring policies were implemented, and is zero prior to then.
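The construction of the workload and policy variables described in this subsection can be sketched as follows. All data are hypothetical; the even monthly allocation of lines of code is an assumption consistent with, but not stated in, the derivation described above.

```python
# Minimal sketch (hypothetical data) of the control variables: monthly
# development workload derived from product start/end dates, production
# workload as the cumulative KLOC of completed code, and 0/1 step dummies
# for the CASE-tool introduction and hiring-policy change.

def monthly_kloc(total_kloc, n_months):
    """Evenly allocate a product's size across its development months."""
    return [total_kloc / n_months] * n_months

def step_dummy(n_months, start_month):
    """0 before start_month (0-indexed), 1 from that month onward."""
    return [1 if m >= start_month else 0 for m in range(n_months)]

# Example: a 120-KLOC product developed over 12 months.
dev_workload = monthly_kloc(120.0, 12)       # 10 KLOC per month
production_workload = []
cum = 0.0
for kloc in dev_workload:
    cum += kloc
    production_workload.append(cum)          # cumulative code produced
case_dummy = step_dummy(12, start_month=3)   # CASE introduced in month 3
```

The cumulative series mirrors the paper's logic that lines of code in production reflect the support burden for completed products.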
4. Analysis and Results
4.1 Statistical Models
Although no research has evaluated the linearity of the relationships among process maturity, size, and IT infrastructure costs, the effects of size and other factors on software development costs are log-linear. Both economies and diseconomies of scale have been observed for development costs (Banker and Slaughter 1997, Banker et al. 1994). Thus, we adopted a log-linear specification for the statistical models and empirically confirmed that this specification was appropriate (Kmenta 1986). We estimated ten time series equations; the first equation estimates software product quality, and the other nine estimate costs in the nine IT infrastructure activity centers. Because the CASE and Hiring variables are binary indicators, they enter the equations in levels rather than logarithms:

ln(Product-Qualityt) = β00 + β10*ln(Process-Maturityt) + β20*ln(Development-Workloadt) + β30*CASEt + ε0 [1]

ln(Senior-Mgtt) = β01 + β11*ln(Senior-Mgtt-1) + β21*ln(Product-Qualityt) + β31*ln(Development-Workloadt) + β41*ln(Production-Workloadt) + β51*Hiringt + ε1 [2]

ln(Program-Controlt) = β02 + β12*ln(Program-Controlt-1) + β22*ln(Product-Qualityt) + β32*ln(Development-Workloadt) + β42*ln(Production-Workloadt) + β52*Hiringt + ε2 [3]

ln(Config-Mgtt) = β03 + β13*ln(Config-Mgtt-1) + β23*ln(Product-Qualityt) + β33*ln(Development-Workloadt) + β43*ln(Production-Workloadt) + β53*Hiringt + ε3 [4]

ln(Qual-Assurancet) = β04 + β14*ln(Qual-Assurancet-1) + β24*ln(Product-Qualityt) + β34*ln(Development-Workloadt) + β44*ln(Production-Workloadt) + β54*Hiringt + ε4 [5]

ln(Software-Integrationt) = β05 + β15*ln(Software-Integrationt-1) + β25*ln(Product-Qualityt) + β35*ln(Development-Workloadt) + ε5 [6]

ln(Data-Integrationt) = β06 + β16*ln(Data-Integrationt-1) + β26*ln(Product-Qualityt) + β36*ln(Development-Workloadt) + ε6 [7]

ln(Systems-Supportt) = β07 + β17*ln(Systems-Supportt-1) + β27*ln(Product-Qualityt) + β37*ln(Development-Workloadt) + β47*ln(Production-Workloadt) + β57*Hiringt + ε7 [8]

ln(Operationst) = β08 + β18*ln(Operationst-1) + β28*ln(Product-Qualityt) + β38*ln(Development-Workloadt) + β48*ln(Production-Workloadt) + β58*Hiringt + ε8 [9]

ln(Documentationt) = β09 + β19*ln(Documentationt-1) + β29*ln(Product-Qualityt) + β39*ln(Development-Workloadt) + β49*ln(Production-Workloadt) + β59*Hiringt + ε9 [10]

3 The CMM maturity levels reflect the top level of a multi-layer structure of software development capabilities. In 1993, the five maturity levels were decomposed into clusters of related practices or "key process areas" (KPAs; Paulk et al. 1993). An alternative operationalization of process maturity could involve use of KPA assessment data. However, as KPAs were not introduced into the CMM until 1993, KPA data cannot be used to assess process maturity in a longitudinal study such as ours, with its ten-year time period (from 1985 to 1994).

Each pair of product quality and infrastructure cost equations forms a fully recursive model. The residuals are not correlated across the pairs of equations. Thus, the model may be consistently estimated using
equation-by-equation ordinary least squares (OLS) (Greene 1997). We first estimated each equation using
OLS. The assumption of normality in the equations was not rejected using the Kolmogorov-Smirnov test
(Stephens 1986). We did not find evidence of multi-collinearity in the equations using the criteria specified in
Belsley et al. (1980). However, White’s test for heteroscedasticity (White 1980) suggested problems with
heteroscedastic data, and the Breusch-Godfrey test (Breusch and Godfrey 1981) indicated the presence of
serial correlation. Thus, we estimated the equations using the autoregressive conditional heteroscedasticity
(ARCH) technique to correct for both serial correlation and heteroscedasticity. The criteria specified by
Belsley et al. (1980) indicated one outlier in the cost equations for configuration management, data
integration, documentation and systems support and two outliers in the cost equation for operations.
However, our results from the equations estimated without outliers are consistent with those when all
observations are included. Therefore, we report our results for all equations with all observations included.
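The first estimation pass described above can be sketched with synthetic data. This is a minimal illustration, assuming made-up coefficient values (inertia 0.4, quality elasticity -0.4, workload elasticity 0.5); it shows only equation-by-equation OLS on one log-linear cost equation with a lagged dependent variable, and omits the ARCH corrections the study's reported estimates apply.

```python
# Simplified sketch of equation-by-equation OLS for one cost equation:
# ln(cost_t) on a constant, ln(cost_{t-1}), ln(quality_t), ln(workload_t).
# Data are synthetic; coefficient values are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 120  # ten years of monthly observations

ln_quality = rng.normal(0.0, 0.3, n)
ln_workload = rng.normal(3.0, 0.2, n)

# Generate ln(cost) from the assumed log-linear process with inertia.
ln_cost = np.zeros(n)
for t in range(1, n):
    ln_cost[t] = (1.0 + 0.4 * ln_cost[t - 1]
                  - 0.4 * ln_quality[t]
                  + 0.5 * ln_workload[t]
                  + rng.normal(0.0, 0.01))

# OLS via least squares on the logged regressors.
X = np.column_stack([np.ones(n - 1), ln_cost[:-1],
                     ln_quality[1:], ln_workload[1:]])
y = ln_cost[1:]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
# beta recovers approximately [1.0, 0.4, -0.4, 0.5]
```

Because every variable is logged, each estimated slope reads directly as an elasticity, which is how the coefficients are interpreted in the results below.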
4.2 Results
Descriptive statistics and pair-wise correlations for the variables in our data set are displayed in
Tables 2 and 3, respectively. The ARCH estimates for the product quality equation are shown in Table 4, and
the ARCH estimates for the IT infrastructure cost equations are shown in Table 5.
-------- insert Tables 2, 3, 4 and 5 here ---------
In the model for software quality [1], we find as expected, that higher levels of process maturity are
associated with higher levels of product quality (β10 = 1.252, p < 0.001). There are increasing returns to
process maturity in terms of improved product quality, i.e., a 1% increase in process maturity is associated
with a 1.25% improvement in product quality. Development workload is negatively associated with product
quality; as the size of development workload increases, software product quality decreases. The coefficient
for CASE tools is positive as we expected but is not statistically significant.
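The elasticity reading of the coefficient can be verified with simple arithmetic: in a log-log model with coefficient 1.252 (the estimate reported above), a 1% increase in the regressor implies a (1.01^1.252 - 1) proportional change in the dependent variable.

```python
# Arithmetic check of the elasticity interpretation in a log-log model:
# a 1% increase in process maturity scales product quality by 1.01**beta.
beta = 1.252
pct_change = (1.01 ** beta - 1) * 100
# pct_change is slightly above 1.25, i.e., more than proportional:
# increasing returns to process maturity.
```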
In the model for Senior Management [2], we find as anticipated, that higher levels of product quality
are associated with reduced Senior Management costs (β21 = -0.425, p < 0.001). The coefficient implies that a
1% improvement in product quality is associated with a 0.43% reduction in Senior Management costs. There
is significant inertia represented by the coefficient on the lagged Senior Management costs (β11 = 0.185, p <
0.01). As expected, development workload is positively and significantly correlated with Senior Management
costs. The coefficient for production workload is not statistically significant. Finally, the change in hiring
policy is associated with significantly higher Senior Management costs. The results for the Program Control
model [3] indicate relationships similar to those found for Senior Management. Program Control costs are
reduced by 0.36% for a 1% improvement in product quality (β22 = -0.360, p < 0.001). Costs in the current
month are significantly related to costs in the previous month (β12 = 0.344, p < 0.001). The coefficients for
both development workload and production workload are not statistically significant. The change in hiring
policy is associated with significantly higher Program Control costs.
Results for the Configuration Management model [4] indicate that the effect of product quality on
Configuration Management costs is negative and statistically significant (β23 = -0.189, p < 0.01). The
coefficient implies that a 1% improvement in product quality is associated with a 0.19% reduction in
Configuration Management costs. We find a significant lagged effect of costs on current Configuration
Management costs (β13 = 0.385, p < 0.001). The coefficients for both development workload and production
workload are not statistically significant. The change in hiring policy is associated with significantly higher
Configuration Management costs. In the Quality Assurance model [5], the relationship between product
quality and Quality Assurance costs is negative and statistically significant (β24 = -0.381, p < 0.001). A 1%
increase in product quality is associated with a 0.38% reduction in Quality Assurance costs. Monthly Quality
Assurance costs are positively and significantly related to previous month costs (β14 = 0.540, p < 0.001). As
expected, development workload is positively and significantly correlated with Quality Assurance costs. The
coefficients for production workload and for the change in hiring policy are not statistically significant.
In the Software Integration model [6], we find that a 1% improvement in product quality is related to
a 0.28% reduction in Software Integration costs (β25 = -0.275, p < 0.01). Software Integration costs are
positively and significantly correlated with costs in the prior month (β15 = 0.490, p < 0.001) and with
development workload. In the Data Integration model [7], we find that the relationship between product
quality and Data Integration costs is not statistically significant (β26 = -0.200, p > 0.10), contrary to our
hypothesis. Data Integration includes staff with very specialized skills. As Hannan and Freeman (1984)
suggest, organizations with high staff specialization have less flexibility in responding to changes in the
environment and cannot easily reduce or re-deploy resources. This is reinforced by the high positive value of
the coefficient on lagged costs (β16 = 0.670, p < 0.001). Data Integration costs are also positively and
significantly correlated with development workload. The Systems Support model [8] analysis finds a 1.30%
reduction in Systems Support costs when product quality improves by 1% (β27 = -1.296, p < 0.001). As
expected, there is a lagged effect on support costs (β17 = 0.229, p < 0.01). However, Systems Support costs
are not influenced by the development workload and the production workload. The change in hiring policy is
associated with significantly higher Systems Support costs. The results for the Operations model [9] are
similar to the Systems Support model. A 1% improvement in product quality is related to a 0.56% reduction
in Operations costs (β28 = -0.555, p < 0.001). There is a lagged effect of Operations costs from month to
month (β18 = 0.401, p < 0.001). The coefficients for development workload, production workload, and the
change in hiring policy are not statistically significant. In our final model, Documentation [10], we find that
increasing software product quality by 1% is associated with a 0.35% reduction in Documentation production
costs (β29 = -0.352, p < 0.001). There is a significant lagged effect on Documentation costs due to costs in the
prior month (β19 = 0.519, p < 0.001). The coefficients for development workload, production workload, and
the change in hiring policy are not statistically significant.
5. Discussion
In this section, we examine the results for each of our hypotheses and identify the common patterns
across IT infrastructure activities. We then examine the actual cost savings realized in each IT infrastructure
activity center at different levels of software process maturity.
5.1 Patterns of Results
Our first hypothesis concerning the relationship between software process maturity and software
product quality is supported. That is, we find that increases in software process maturity are associated with
increased product quality. This is consistent with our hypothesis and with prior research in this area (e.g.,
Krishnan et al. 2000; Harter et al. 2000). We also find support for our second hypothesis concerning the
relationship between software product quality and IT infrastructure costs. With the exception of Data
Integration, all of the other IT infrastructure activity centers experience a reduction in costs associated with
increased product quality. As we have noted earlier, Data Integration tasks require very specialized skill sets;
individuals in this center cannot easily be reassigned to other tasks. This could have inhibited the center’s
ability to take advantage of economies due to increased process maturity. Considering the results for the other
activity centers, we find that IT infrastructure activities concentrated in later stages of the product life cycle
benefit more from quality improvement. An explanation is that downstream activities such as Operations and
Systems Support are more influenced by defects since downstream activities can incur substantial rework in
resolving system failures due to poor quality. For example, it can be difficult and time-consuming to establish
cause and effect relationships between design choices and production outcomes. This is consistent with the
economics of software engineering which suggests that the effort required for rework is greater the later in the
software life cycle that a defect is detected (Boehm 1981).
Our third hypothesis concerning the relationship between lagged and current IT infrastructure costs is
supported: in all IT infrastructure activity centers, lagged costs are positively and significantly associated with
current costs. Thus, organizational inertia appears to play a significant role in determining how rapidly an
infrastructure activity center benefits from improvements in process maturity. The inertia coefficient is an
elasticity reflecting how slowly an activity center's expenditures change from one month to the next. There are several
interesting patterns in the coefficients of organizational inertia across infrastructure activity centers. First, we
see that less inertia is experienced in activity centers such as Senior Management and Systems Support, i.e.,
prior period expenditures have less effect on current period expenditures in these centers. Both Senior
Management and Systems Support have more diverse tasks and greater flexibility in assigning personnel to
different tasks and can therefore innovate and change their work practices more easily. This is consistent with
the research on organizational inertia and innovation (Hannan and Freeman 1984; Greve 1999, 1998;
Fichman and Kemerer 1997; Ravichandran 2000) which argues that organizations with more diverse
information flows, tasks and skill sets are better able to absorb new technologies and to assimilate innovative
work practices. In contrast, in units where staff are very specialized in their knowledge and work practices
such as in Data Integration, Software Integration, Quality Assurance, and Documentation, there is less
flexibility in reassigning personnel to different tasks, resulting in greater inertia, i.e., prior
period expenditures have a greater impact on current period expenditures.
There are also a number of interesting patterns in the findings for the control variables across IT
infrastructure activities. The activities closest to technical development have the least benefits from
economies of scale in product development. The Integration Management activity centers (Software
Integration and Data Integration) have the largest coefficients for development workload. This indicates that
these functions cannot easily be scaled up to handle large development workloads without significant
increases in staffing and associated costs. Finally, we find that the change in hiring policy to hire individuals
with higher levels of education impacts those activity centers where such individuals are more likely to be
deployed (e.g., Senior Management, Systems Support, and Program Control). IT infrastructure activity
centers such as Operations and Documentation are less likely to deploy personnel with higher degrees as the
tasks in these centers do not require advanced degrees and are not impacted by the change in hiring policy.
5.2 IT Infrastructure Cost Savings Realized at Different Process Maturity Levels
During the period of analysis, the IT infrastructure activity centers supported the development of
20,000 to 30,000 lines of code per month. The monthly infrastructure costs to support this software
development and the net cost savings from quality improvement are identified in Table 6. At CMM level 1 in
this company, product quality averaged 4.56 defects per KLOC, and monthly infrastructure costs per KLOC
averaged $21,396 (or $256,752 per KLOC per year). As the company improved processes and advanced from
CMM level 1 to level 2, product quality averaged 1.92 defects per KLOC, and the company reduced
infrastructure costs by $6,698 per KLOC per month (or just over $80,376 per KLOC in net cost savings per
annum). By further improving to level 3, product quality averaged 1.15 defects per KLOC, and the company
saved an additional $2,768 per KLOC per month (or about $33,216 in net cost savings per KLOC per year).
----- insert Table 6 here -----
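The per-annum figures quoted above follow directly from the monthly per-KLOC figures (a factor of 12); a quick consistency check, using the numbers as reported in the text:

```python
# Arithmetic check: monthly per-KLOC costs and savings annualize by 12.
monthly = {
    "level 1 cost": 21_396,         # monthly infrastructure cost per KLOC
    "level 1 to 2 savings": 6_698,  # monthly savings per KLOC
    "level 2 to 3 savings": 2_768,  # additional monthly savings per KLOC
}
annual = {k: v * 12 for k, v in monthly.items()}
# annual: 256,752; 80,376; 33,216 -- matching the per-year figures quoted
```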
The most significant cost savings appear in Senior Management. The greatest dollar savings are
realized in Senior Management because a small percentage reduction in executive time can have a significant
dollar impact, due to the high salaries of senior personnel. Senior management costs are reduced by $3,168
per KLOC per month in moving from CMM level 1 to level 2, and an additional $1,379 per KLOC per month
by advancing to level 3. By comparison, Program Control costs are reduced by $428 per KLOC per month
when improving from level 1 to level 2, and by an additional $195 per KLOC per month when moving to level 3.
Monthly Process Management costs and Integration Management costs exhibit some savings as processes
mature; however, the cost savings are relatively small in these activity centers. Configuration Management
costs decrease by $248 per KLOC per month when advancing from CMM level 1 to level 2; when advancing
to level 3, a further savings of $126 per KLOC per month is achieved. Quality Assurance costs decline by
$205 per KLOC per month when improving from CMM level 1 to level 2, and a further $92 per KLOC per
month when advancing to level 3. Software Integration costs are reduced by $198 per KLOC per month when
moving from CMM level 1 to level 2, and another $96 per KLOC per month when improving to CMM level
3. However, Data Integration savings are not statistically significant. The monthly expenditures in Operations
Management exhibit significant savings across activities. This is consistent with our earlier observation that
activities downstream in the software development life cycle experience a large impact from quality
improvement. Systems Support costs are reduced by $765 per KLOC per month when moving
from level 1 to level 2, and another $177 per KLOC per month when progressing to level 3. Operations
realizes a larger savings of $1,170 per KLOC per month when moving to level 2, and an additional $465 per
KLOC per month when moving to level 3. Documentation support costs also are significantly reduced when
processes improve. Documentation costs decline by $484 per KLOC per month when the organization
matures from CMM level 1 to level 2, and by $222 per KLOC per month when maturing to level 3.
In Figure 2, we graph product defect rates (defects per KLOC) and monthly IT infrastructure costs
per KLOC over time and at each level of process maturity in the company. As can be seen in this graph,
improving process maturity from CMM level 1 to level 2 has the greatest impact on both defect rates and IT
infrastructure costs. While both defect rates and IT infrastructure costs are reduced when the company
advances from CMM level 2 to level 3, the rate of improvement is less than that achieved in the initial
advancement from CMM level 1 to 2. This suggests declining returns to quality improvement in terms of
reductions in defect rates and IT infrastructure costs.
----- insert Figure 2 here -----
6. Conclusions
In this study, we have delineated the IT infrastructure activities that support software product
development. We have developed and empirically evaluated a conceptual framework to assess the cost impact
of quality improvement on IT infrastructure activities. In the context of a major software development effort
over a ten year time period, our analysis reveals that higher levels of process maturity are associated with
higher product quality as well as significant reductions in IT infrastructure costs. For eight of the nine
infrastructure activity centers at our research site, the benefits of quality improvement significantly outweigh
the investments, resulting in a net decrease in IT infrastructure costs.
Our study makes several contributions to academic research and managerial practice. A primary
contribution is our rigorous examination of the effect of quality improvement on IT infrastructure activities.
The literature on quality improvement has focused on software development activities. To the best of our
knowledge, no prior studies have considered and empirically evaluated the cost impact of quality initiatives
on infrastructure activities, although resources expended on IT infrastructure can be substantial. For example,
over the ten-year time period at our research site, 42.3% of total IT expenditures (excluding hardware) were
on infrastructure activities and 57.7% were on software development and maintenance. Our results imply that
economic frameworks for evaluating the cost impact of quality improvement should be broadened to include
IT infrastructure activities as well as software development activities. Another contribution of this study is
our examination of the effect of organizational inertia on infrastructure costs. Organizational inertia has been
well documented in the management literature, but few studies have modeled and quantified the effects of
inertia in the adoption of quality improvements in software engineering. Our results imply that organizational
inertia can reduce or delay the benefits obtained from quality improvement. Although process improvements
afford organizations the opportunity to reduce staff and costs, we find that the infrastructure activity centers at
our research site are slow to take full advantage of these opportunities. There are several explanations for this
delay. Workload variations over time can inhibit staff reductions since managers might be loath to reduce
staffing levels given the potential of increased workload in the near term. Specialization of task can also
inhibit staff reductions. Frequently, departments reduce staff by loaning or transferring personnel elsewhere
in the company. The requirement for specialized skills can reduce the flexibility in transferring or sharing
personnel. The lack of understanding that quality improvement reduces support requirements over time might
also lead to reluctance by infrastructure managers to reduce staffing. Managers could interpret short term cost
savings as anomalies rather than a result of quality improvement. Our results therefore have implications for
organizational design. At our research site, infrastructure activities such as Software Integration and Data
Integration are stand-alone centers. However, these centers could be combined, or the staff could be cross-
trained. By enabling flexible deployment of staff to a variety of tasks, the problem of inertia due to staff
specialization could be mitigated. An interesting opportunity for further research is to evaluate the efficacy of
varied organizational designs for different types of software development and infrastructure activities in the
context of quality improvement.
Our study focused on examining the impact of quality improvement on IT infrastructure activities
supporting a major software development effort over a long period of time in one organization. This design
has several strengths. The focus on one organization and one major application development effort has the
advantage of naturally controlling for contextual factors that could differ across firms and projects. We also
added variables in our models to control for any other major managerial and technical changes in the
company over the time period of the study. These controls in our research design and models strengthen the
internal validity of our results. In addition, our focused research design enabled us to examine an organization
over a significant period of time (ten years). This is important because the time horizon needed to make,
realize, and recoup investments in quality improvement appears to be relatively long in software engineering
(Herbsleb et al. 1997). For example, the company in our study required more than six years to advance two
levels in process maturity. Sterman et al. (1997) suggest that defect reduction is relatively slow when complex
product development processes are involved. This is because it is difficult to discern cause and effect
relationships, results from actions are not immediate, and activities often cross organization boundaries.
Software development processes have many of these characteristics. Thus, an essential feature of our research
design is that the time period was long enough for us to observe advancements in process maturity as well as
their impacts on IT infrastructure costs.
Although our focused research design enhances the internal validity of our findings, external validity
could be limited to organizational and software development contexts similar to the company in our study. As
no single study is definitive, the external validity of our results can be strengthened by replication. We see
several areas where future research would be particularly fruitful in this regard. First, we note that our
findings concerning the relationships between process maturity, product quality, and infrastructure costs are
valid only in the process maturity ranges observed in our data set, i.e., CMM levels 1 through 3. It is possible
that the investments and cost savings in IT infrastructure to achieve process maturity levels above CMM level
3 could be different. Thus, it is important for future work to examine the quality improvement investments
and cost savings at higher levels of process maturity. Further, our study has examined quality improvement in
one application domain and development environment (custom software development of an algorithmically
intense system in a mainframe development environment). It is possible that the economics of quality
improvement could be quite different in other domains and development environments. For example, Internet
software development occurs over very short cycles, using different methods, tools, and infrastructure than
traditional development (Cusumano and Yoffie 1999, MacCormack et al. 2001). Compared to traditional
development, rapid development cycles could speed the organization’s advancement to higher levels of
process maturity by shortening the length of time between implementing new practices and experiencing
results, thereby enabling quicker returns on quality investments. The economics of quality improvement may
be quite different in this development context. Thus, it is essential for future research to explore the impacts
of quality improvement in a variety of application domains and development environments.
7. References
Adrian, M. 1986. Dictionary key to data access, Computerworld, Framingham, 20(49) 37-38.
Albert, S., D. Whetten. 1985. Organizational identity, in L.L. Cummings and B.M. Staw (eds.), Research in Organizational Behavior 14 179-229, Greenwich, CT: JAI Press.
Alter, A.E. 1991. Software productivity: Increasing the yield, CIO, Framingham, 4(6) 23-25.
Amburgey, T.L., D. Kelly, W.P. Barnett. 1993. Resetting the clock: The dynamics of organizational change and failure, Administrative Science Quarterly 38 51-73.
Aptman, L.H. 1986. Project management: Setting controls, Management Solutions, Saranac Lake, 31(11) 31-33.
Banker, R.D., S.A. Slaughter. 1997. A field study of scale economies in software maintenance, Management Science 43(12) 1709-1725.
Banker, R.D., H. Chang, C.F. Kemerer. 1994. Evidence of economies of scale in software development, Information and Software Technology 36(5) 275-282.
Banker, R.D., S.M. Datar, C.F. Kemerer, D. Zweig. 2002. Software errors and software maintenance management, Information Technology and Management, forthcoming.
Basili, V., J. Musa. 1991. The future engineering of software: A management perspective, IEEE Computer 20(4) 90-96.
Beach, L.M. 1984. Managing software development, Information Management, Woodbury, 18(1) 20-21.
Becker, G.S. 1975. Human Capital, 2nd ed., Chicago: The University of Chicago Press.
Belsley, D.A., E. Kuh, R.E. Welsch. 1980. Regression Diagnostics: Identifying Influential Data and Sources of Collinearity, New York: Wiley and Sons.
Bersoff, E.H. 1984. Elements of software configuration management, IEEE Transactions on Software Engineering, New York, 10(1) 79-87.
Boehm, B.W. 1981. Software Engineering Economics, Englewood Cliffs, NJ: Prentice-Hall.
Breusch, T., L. Godfrey. 1981. A review of recent work on testing for autocorrelation in dynamic simultaneous models, in D. Currie, R. Nobay and D. Peel (eds.), Macroeconomic Analysis: Essays in Macroeconomics and Econometrics, London, England: Croom Helm, 63-105.
Broadbent, M., P. Weill, D. St. Clair. 1999. The implications of information technology infrastructure for business process redesign, MIS Quarterly 23(2) 159-182.
Broadbent, M., P. Weill, T. O'Brien, B.S. Neo. 1996. Firm context and patterns of IT infrastructure capability, Proceedings of the Seventeenth International Conference on Information Systems, J. DeGross, S. Jarvenpaa, and A. Srinivasan (eds.), Cleveland, OH, 174-194.
Buckley, F.J., R. Poston. 1984. Software quality assurance, IEEE Transactions on Software Engineering, New York, 10(1) 36-41.
Cohen, H.B. 1998. The performance paradox, The Academy of Management Executive, Ada, 12(3) 30-40.
Curtis, B., M. Kellner, J. Over. 1992. Process modeling, Communications of the ACM 35(9) 75-90.
Cusumano, M., D. Yoffie. 1999. Software development on Internet time, IEEE Computer 32(10) 60-69.
Dekleva, S., D. Drehmer. 1997. Measuring software engineering evolution: A Rasch calibration, Information Systems Research 8(1) 95-105.
Douglas, E.E. 1999. Establishing a project controls organization, Transactions of the American Association of Cost Engineers International, Morgantown, WV, 71-74.
Duncan, N.B. 1995. Capturing flexibility of information technology infrastructure: A study of resource characteristics and their measure, Journal of Management Information Systems 12(2) 37-57.
Dyer, M., A. Kouchakdjian. 1990. Correctness verification: Alternative to structural software testing, Information and Software Technology 32(1) 53-59.
Evers, S. 1996. DarkStar problems fixed as flight is set for March, Jane's Defence Weekly, January 31, 6.
Ferry, M.J. 1985. Quality assurance, Journal of Information Management, Atlanta, 6(2) 25-37.
Fichman, R., C. Kemerer. 1997. The assimilation of software process innovations: An organization learning perspective, Management Science 43(10) 1345-1363.
Findley, D.A. 1998. Controlling costs for software development projects, Transactions of the American Association of Cost Engineers International, Morgantown, WV, 6-9.
Finlay, P.N., A.C. Mitchell. 1994. Perceptions of the benefits from the introduction of CASE: An empirical study, MIS Quarterly, December, 353-370.
Fiol, C.M., A.S. Huff. 1992. Maps for managers: Where are we? Where do we go from here? Journal of Management Studies 29 267-285.
Goldenson, D., J. Herbsleb. 1995. After the Appraisal: A Systematic Survey of Process Improvement, Its Benefits, and Factors that Influence Success, CMU/SEI-95-TR-009, Software Engineering Institute.
Greene, W.H. 1997. Econometric Analysis, 3rd ed., New York: MacMillan Publishing Company.
Gresov, C., H.A. Haveman, T.A. Oliva. 1993. Organizational design, inertia, and the dynamics of competitive response, Organization Science 4(2) 181-208.
Greve, H.R. 1998. Performance, aspirations, and risky organizational change, Administrative Science Quarterly 43 58-86.
Greve, H.R. 1999. The effect of core change on performance: Inertia and regression toward the mean, Administrative Science Quarterly 44 590-614.
Gupta, Y.P. 1989. Software quality assurance, The International Journal of Quality and Reliability Management, Bradford, 6(4) 56-67.
Guthrie, K.M. 1984. Management controls in a high tech environment, Transactions of the American Association of Cost Engineers, Morgantown, WV, 1-7.
Hamel, G., C.K. Prahalad. 1994. Competing for the Future: Breakthrough Strategies for Seizing Control of Your Industry and Creating the Markets of Tomorrow, Boston, MA: Harvard Business School Press.
Hannan, M.T., J. Freeman. 1984. Structural inertia and organizational change, American Sociological Review 49 149-164.
Hannan, M.T., J. Freeman. 1989. Organizational Ecology, Cambridge, MA: Harvard University Press.
Harter, D.E., M.S. Krishnan, S.A. Slaughter. 2000. Effects of process maturity on quality, cycle time, and effort in software product development, Management Science 46(4) 451-466.
Haveman, H. 1992. Between a rock and a hard place: Organizational change and performance under conditions of fundamental environmental transformation, Administrative Science Quarterly 37 48-75.
Henderson, M.M. 1987. The importance of data administration in information management, Information Management Review, Frederick, 2(4) 41-47.
Herbsleb, J., D. Zubrow, D. Goldenson, W. Hayes, M. Paulk. 1997. Software quality and the capability maturity model, Communications of the ACM 40(6) 30-40.
Humphrey, W.S. 1989. Managing the Software Process, Reading, MA: Addison-Wesley Publishing.
Jones, C. 1986. How not to measure programming productivity (part 1), Computerworld 20(2) 65-72.
Jones, C. 1994. Assessment and Control of Software Risks, Upper Saddle River, NJ: Yourdon Press/Prentice Hall.
Keen, P. 1991. Shaping the Future: Business Redesign through Information Technology, Boston: Harvard Business School Press.
Kibler, B.E. 1992. Integrated cost and schedule control: A piece of cake, Cost Engineering, Morgantown, WV, 34(7) 15-21.
Kmenta, J. 1986. Elements of Econometrics, New York: Macmillan.
Krishnan, M.S., S. Kekre, C.H. Kriebel, T. Mukhopadhyay. 2000. An empirical analysis of productivity and quality in software products, Management Science 46(6) 745-759.
Krishnan, M.S., M.I. Kellner. 1999. Measuring process consistency: Implications for reducing software defects, IEEE Transactions on Software Engineering 25(6) 800-815.
Laitenberger, O., J. DeBaud. 2000. An encompassing life cycle centric survey of software inspection, Journal of Systems and Software 50(1) 5-31.
Laverty, K.J. 1996. Economic "short-termism": The debate, the unresolved issues, and the implications for management practice and research, The Academy of Management Review 21(3) 825.
Leavitt, D. 1977. Human factors also vital: Project confirms impact of programming techniques, Computerworld, Framingham, 11(11) 23.
MacCormack, A., R. Verganti, M. Iansiti. 2001. Developing products on "Internet time": The anatomy of a flexible development process, Management Science 47(1) 133-150.
March, J.G. 1981. Footnotes on organizational change, Administrative Science Quarterly 26 563-597.
Martin, B.A. 1992. Aspects of cost control, Cost Engineering, Morgantown, WV, 34(6) 19.
McKenna, J. 1996. C-130J software problems addressed, Aviation Week & Space Technology 144(6) 28.
McLean, E.R., R.B. Wilkes. 1990. Computer operations: A case of management neglect, Information Systems Management, Boston, 7(2) 73-76.
Paulk, M.C., C.V. Weber, B. Curtis, M.B. Chrissis. 1994. The Capability Maturity Model: Guidelines for Improving the Software Process, Reading, MA: Addison-Wesley Publishing Company.
Paulk, M.C., C.V. Weber, S. Garcia, M. Chrissis, M. Bush. 1993. Key practices of the Capability Maturity Model version 1.1, Carnegie Mellon University, Software Engineering Institute, Technical Report SEI-93-TR-025, http://www.sei.cmu.edu/publications/documents/93.reports/93.tr.025.html.
Pressman, R.S. 1997. Software Engineering: A Practitioner's Approach, New York: McGraw-Hill.
Ravichandran, T. 2000. Swiftness and intensity of administrative innovation adoption: An empirical study of TQM in information systems, Decision Sciences 31(3) 691-724.
Ray, M.C. 1998. Project control: Where it fits in, Transactions of the American Association of Cost Engineers International, Morgantown, WV, 15-16.
Reger, R.K., L.T. Gustafson, S.M. Demarie, J.V. Mullane. 1994. Reframing the organization: Why implementing total quality is easier said than done, The Academy of Management Review, Mississippi State, 19(3) 565.
Rumelt, R.P., D.E. Schendel, D.J. Teece (eds.). 1994. Fundamental Issues in Strategy: A Research Agenda, Boston: Harvard Business School Press.
Slaughter, S.A., D.E. Harter, M.S. Krishnan. 1998. Evaluating the cost of software quality, Communications of the ACM 41(8) 67-73.
Stephens, M.A. 1986. Goodness of Fit Techniques, New York: M. Dekker.
Sterman, J.D., N.P. Repenning, F. Kofman. 1997. Unanticipated side effects of successful quality programs: Exploring a paradox of organizational improvement, Management Science 43(4) 503-521.
Strassman, P. 1999. Information Productivity: Assessing the Information Management Costs of U.S. Industrial Corporations, New Canaan, CT: The Information Economics Press.
Turnbull, L. 1986. With standards, quality assurance serves as a helping hand, Data Management 24(4) 25-27.
Tushman, M.L., E. Romanelli. 1985. Organizational evolution: A metamorphosis model of convergence and reorientation, in L.L. Cummings and B.M. Staw (eds.), Research in Organizational Behavior 7 171-222, Greenwich, CT: JAI Press.
Venkatakrishnan, V. 1988. Difference in dictionaries, Computerworld, Framingham, 22(11) 13.
Weill, P., M. Broadbent. 1998. Leveraging the New Infrastructure: How Market Leaders Capitalize on Information Technology, Boston: Harvard Business School Press.
White, C.J. 1987. Dictionary priorities, Computer & Communications Decisions 19(14) 29.
White, H. 1980. Heteroskedasticity-consistent covariance matrix estimator and a direct test for heteroskedasticity, Econometrica 48(5) 817-838.
Zahran, S. 1998. Software Process Improvement: Practical Guidelines for Business Success, Essex, England: Addison Wesley Longman Ltd.
Table 1. IT Infrastructure Activities, Investments, and Benefits

Infrastructure Function | Infrastructure Center | Infrastructure Activity | Investments in Process Improvements | Benefits from Process Improvements
Senior Management | Senior Management | Executive management of development, maintenance, and support activities | Limited to process approvals | Reduced reviews and interventions
Product Delivery Management | Program Control | Schedule and budget tracking | Incorporate changes into schedule template | Reduced rework of schedules & budgets
Process Management | Configuration Management | Management of baseline documents, software & reviews | Investment in process control | Fewer iterations on software migrations and documentation
Process Management | Quality Assurance | Audit of processes & products (documents & software) | Investment in process development | Reduction in review of rework
Integration Management | Software Integration | Manage software interfaces and systems level integration | Development of software standards | Reduced rework of interfaces
Integration Management | Data Integration | Track database elements, field descriptions, edit criteria | Development of data standards | Reduced rework of data naming
Operations Management | Systems Support | Technical support from hardware and system specialists | Limited – more focused on production activity | Reduced system intervention due to failures
Operations Management | Operations | Operator support for development, test and production | Limited – more focused on production activity | Reduction in rerunning of production & test
Documentation | Documentation Production | System, user, and support documentation | Automation investment in documentation processes | Reduced rework of publications
Table 2. Descriptive Statistics

Variable | Mean | Std Deviation | Median | Minimum | Maximum
Process Maturity (CMM Level) | 1.99 | 0.61 | 2.14 | 1.00 | 2.78
Product Quality (KLOC/Defects) | 536.68 | 276.06 | 546.28 | 174.56 | 1442.10
Development Workload (KLOC) | 28.01 | 8.41 | 25.71 | 7.43 | 47.12
Production Workload (KLOC) | 1275.47 | 952.57 | 1380.72 | 0.00 | 3417.78
Senior Management Costs | $202,612.40 | 88,272.81 | 204,779.00 | 3,065.70 | 542,704.30
Program Control Costs | $34,151.07 | 14,299.49 | 34,698.13 | 9,761.85 | 81,641.65
Configuration Management Costs | $40,452.74 | 15,663.61 | 40,160.28 | 80.68 | 82,746.96
Quality Assurance Costs | $16,783.76 | 7,765.13 | 16,124.31 | 1,386.78 | 37,875.25
Software Integration Costs | $24,977.13 | 12,832.46 | 25,406.89 | 163.41 | 65,300.24
Data Integration Costs | $6,154.48 | 3,470.73 | 6,092.59 | 81.83 | 19,573.12
Systems Support Costs | $18,943.35 | 24,498.58 | 10,298.93 | 1,269.22 | 151,086.00
Operations Costs | $67,973.04 | 79,254.57 | 52,824.97 | 504.39 | 559,528.10
Documentation Costs | $48,771.03 | 38,426.23 | 36,869.05 | 5,009.86 | 203,203.30

Note: Numbers reported are monthly values.
Table 3. Correlation Matrix

                     | Process Maturity | Product Quality | Development Workload | Production Workload | CASE          | Hiring
Process Maturity     | 1.000
Product Quality      | 0.875 (0.000)    | 1.000
Development Workload | 0.288 (0.002)    | 0.080 (0.387)   | 1.000
Production Workload  | 0.620 (0.000)    | 0.621 (0.000)   | 0.301 (0.001)        | 1.000
CASE                 | 0.424 (0.000)    | 0.434 (0.000)   | 0.370 (0.000)        | 0.420 (0.000)       | 1.000
Hiring               | 0.561 (0.000)    | 0.486 (0.000)   | 0.372 (0.000)        | 0.537 (0.000)       | 0.736 (0.000) | 1.000

Notes: Pearson correlation coefficients are reported between pairs of continuous variables, with p values in parentheses. Spearman rank correlations are reported between pairs of continuous and discrete variables, with p values in parentheses. Phi correlations are reported between pairs of binary variables, with p values in parentheses.
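The two main measures described in the notes can be sketched in a few lines of Python. This is an illustration of the correlation methods, not the authors' code; tied ranks and significance tests are omitted for brevity.

```python
from statistics import mean

def pearson(x, y):
    """Pearson product-moment correlation between two equal-length sequences."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def spearman(x, y):
    """Spearman rank correlation: the Pearson correlation of the ranks
    (assumes no tied values; ties would require averaged ranks)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank + 1.0
        return r
    return pearson(ranks(x), ranks(y))
```

Following the table notes, pearson would apply to a continuous pair such as process maturity and product quality, and spearman to a continuous-discrete pair.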
Table 4. ARCH Quality Parameter Estimates (n=118) (standard errors, t statistics and two-tailed p values)

ln(Quality t) = β00 + β10* ln(Process Maturity t) + β20* ln(Development-Workload t) + β30* CASE t + ε0

Variable | Parameter | Estimate | s.e. | t | p
Intercept | β00 | 6.214 | 0.846 | 7.349 | 0.000
ln (Process Maturity t) | β10 | 1.252 | 0.092 | 13.641 | 0.000
ln (Development-Workload t) | β20 | -0.359 | 0.067 | -5.352 | 0.000
CASE t | β30 | 0.355 | 0.829 | 0.428 | 0.669
Wald Statistic | χ2(3) | 230.09 | | | 0.000

Note: n represents the number of months in the analysis.
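The fitted log-linear model can be used to sketch predicted quality levels. A minimal illustration (our code, not the paper's; the function name is ours, the workload value is Table 2's sample mean, and the CASE dummy is set to 0):

```python
import math

# Point estimates from Table 4 (ARCH model of ln product quality).
B0, B_MATURITY, B_DEVLOAD, B_CASE = 6.214, 1.252, -0.359, 0.355

def predicted_quality(cmm_level, dev_workload_kloc, case_in_use):
    """Predicted product quality (KLOC per defect) from the fitted model:
    ln(Quality) = B0 + B_MATURITY*ln(CMM) + B_DEVLOAD*ln(Workload) + B_CASE*CASE."""
    ln_q = (B0
            + B_MATURITY * math.log(cmm_level)
            + B_DEVLOAD * math.log(dev_workload_kloc)
            + B_CASE * (1.0 if case_in_use else 0.0))
    return math.exp(ln_q)

# The maturity coefficient (1.252) is an elasticity: moving from CMM level 1
# to level 2 at constant workload multiplies predicted quality by 2**1.252.
gain = predicted_quality(2.0, 28.01, False) / predicted_quality(1.0, 28.01, False)
```

Under these point estimates, gain works out to roughly a 2.4-fold improvement in KLOC per defect per doubling of the maturity level.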
Table 5. ARCH IT Infrastructure Costs Parameter Estimates (standard errors, t statistics and two-tailed p values)

ln(Infrastructure-cost c,t) = β0C + β1C* ln(Infrastructure-cost c,t-1) + β2C* ln(Quality t) + β3C* ln(Development-Workload t) + β4C* ln(Production-Workload t) + β5C* Hiring t + ε1

Coefficients are reported by infrastructure center as estimate (s.e., t, p).

Senior Mgt (n = 115): Intercept β0C 11.508 (1.082, 10.633, 0.000); ln(Costs t-1) β1C 0.185 (0.070, 2.644, 0.008); ln(Quality t) β2C -0.425 (0.073, -5.788, 0.000); ln(Development-Workload t) β3C 0.160 (0.091, 1.763, 0.078); ln(Production-Workload t) β4C 0.012 (0.013, 0.960, 0.337); Hiring t β5C 0.505 (0.121, 4.183, 0.000); Wald χ2 123.11 (p 0.000).

Program Control (n = 115): Intercept 7.976 (1.029, 7.751, 0.000); ln(Costs t-1) 0.344 (0.072, 4.757, 0.000); ln(Quality t) -0.360 (0.082, -4.372, 0.000); ln(Development-Workload t) 0.204 (0.140, 1.459, 0.145); ln(Production-Workload t) 0.006 (0.013, 0.457, 0.648); Hiring t 0.401 (0.127, 3.157, 0.002); Wald χ2 103.81 (p 0.000).

Config Mgt (n = 115): Intercept 6.927 (0.894, 7.748, 0.000); ln(Costs t-1) 0.385 (0.075, 5.148, 0.000); ln(Quality t) -0.189 (0.073, -2.597, 0.009); ln(Development-Workload t) 0.135 (0.127, 1.066, 0.286); ln(Production-Workload t) 0.001 (0.011, 0.120, 0.905); Hiring t 0.335 (0.111, 3.013, 0.003); Wald χ2 62.29 (p 0.000).

Quality Assurance (n = 115): Intercept 5.624 (1.164, 4.833, 0.000); ln(Costs t-1) 0.540 (0.076, 7.056, 0.000); ln(Quality t) -0.381 (0.107, -3.563, 0.000); ln(Development-Workload t) 0.256 (0.120, 2.130, 0.033); ln(Production-Workload t) 0.004 (0.021, 0.194, 0.847); Hiring t 0.310 (0.214, 1.447, 0.148); Wald χ2 246.15 (p 0.000).

Software Integration (n = 84): Intercept 4.834 (1.095, 4.413, 0.000); ln(Costs t-1) 0.490 (0.084, 5.845, 0.000); ln(Quality t) -0.275 (0.096, -2.862, 0.004); ln(Development-Workload t) 0.605 (0.153, 3.944, 0.000); ln(Production-Workload t) not applicable (2); Hiring t not applicable (3); Wald χ2 111.96 (p 0.000).

Data Integration (n = 84): Intercept 2.552 (1.691, 1.509, 0.131); ln(Costs t-1) 0.670 (0.075, 8.970, 0.000); ln(Quality t) -0.200 (0.215, -0.928, 0.354); ln(Development-Workload t) 0.457 (0.229, 1.991, 0.046); ln(Production-Workload t) not applicable (2); Hiring t not applicable (3); Wald χ2 816.61 (p 0.000).

Systems Support (n = 94): Intercept 14.391 (2.184, 6.589, 0.000); ln(Costs t-1) 0.229 (0.091, 2.517, 0.012); ln(Quality t) -1.296 (0.282, -4.591, 0.000); ln(Development-Workload t) -0.109 (0.242, -0.449, 0.654); ln(Production-Workload t) -0.012 (0.022, -0.535, 0.592); Hiring t 1.251 (0.243, 5.154, 0.000); Wald χ2 122.16 (p 0.000).

Operations (n = 103): Intercept 9.958 (1.706, 5.836, 0.000); ln(Costs t-1) 0.401 (0.060, 6.715, 0.000); ln(Quality t) -0.555 (0.155, -3.573, 0.000); ln(Development-Workload t) -0.061 (0.208, -0.296, 0.767); ln(Production-Workload t) 0.015 (0.036, 0.414, 0.679); Hiring t 0.145 (0.353, 0.410, 0.682); Wald χ2 203.66 (p 0.000).

Documentation (n = 110): Intercept 6.950 (1.592, 4.364, 0.000); ln(Costs t-1) 0.519 (0.086, 6.033, 0.000); ln(Quality t) -0.352 (0.119, -2.969, 0.003); ln(Development-Workload t) 0.103 (0.144, 0.720, 0.471); ln(Production-Workload t) not applicable (2); Hiring t -0.013 (0.171, -0.076, 0.939); Wald χ2 118.71 (p 0.000).

Notes: (1) n represents the number of months in the analysis; (2) Software Integration, Data Integration and Documentation are only for new development, not for production; (3) Hiring policies changed before creation of the Software Integration and Data Integration organizations.
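Because lagged costs appear on the right-hand side of the Table 5 specification, the quality coefficients β2C are short-run elasticities. A hedged sketch of the implied steady-state (long-run) elasticities β2C/(1 - β1C), computed from the point estimates; this derivation is our illustration of the dynamic specification, not a result reported in the table:

```python
# (b1, b2) = (lagged-cost, quality) point estimates per center, from Table 5.
ESTIMATES = {
    "Senior Mgt":           (0.185, -0.425),
    "Program Control":      (0.344, -0.360),
    "Config Mgt":           (0.385, -0.189),
    "Quality Assurance":    (0.540, -0.381),
    "Software Integration": (0.490, -0.275),
    "Data Integration":     (0.670, -0.200),
    "Systems Support":      (0.229, -1.296),
    "Operations":           (0.401, -0.555),
    "Documentation":        (0.519, -0.352),
}

def long_run_elasticity(b1, b2):
    # Setting cost_t = cost_{t-1} in ln(cost_t) = ... + b1*ln(cost_{t-1})
    # + b2*ln(quality_t) yields a long-run quality elasticity of b2 / (1 - b1).
    return b2 / (1.0 - b1)

long_run = {c: long_run_elasticity(b1, b2) for c, (b1, b2) in ESTIMATES.items()}
```

For example, the Senior Mgt short-run elasticity of -0.425 implies a long-run elasticity of about -0.52, and Systems Support shows the largest long-run response to quality.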
Table 6. Monthly Infrastructure Costs/KLOC of Development by SEI CMM Level

Infrastructure Activity | SEI CMM Level 1 | Level 2 | Level 3
Defects/KLOC | 4.56 | 1.92 | 1.15
Senior Management | $10,271 | $7,103 | $5,724
Program Control | $1,595 | $1,167 | $972
Configuration Management | $1,634 | $1,386 | $1,260
Quality Assurance | $729 | $524 | $432
Software Integration | $934 | $736 | $640
Data Integration | $197 | $166 | $150
Systems Support | $1,133 | $368 | $191
Operations | $3,063 | $1,893 | $1,428
Documentation | $1,840 | $1,356 | $1,134

Total Infrastructure Costs:
Monthly Costs per KLOC | $21,396 | $14,698 | $11,930
Monthly Costs for Project | $599,302 | $411,691 | $334,159
Annual Costs for Project | $7,191,624 | $4,940,292 | $4,009,912
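As an arithmetic cross-check (our sketch, not part of the paper; the 28.01 KLOC mean monthly development workload is taken from Table 2), the project-level rows follow from the per-KLOC totals:

```python
# Monthly infrastructure costs per KLOC at CMM levels 1, 2, 3, from Table 6.
COSTS = {
    "Senior Management":        (10271, 7103, 5724),
    "Program Control":          (1595, 1167, 972),
    "Configuration Management": (1634, 1386, 1260),
    "Quality Assurance":        (729, 524, 432),
    "Software Integration":     (934, 736, 640),
    "Data Integration":         (197, 166, 150),
    "Systems Support":          (1133, 368, 191),
    "Operations":               (3063, 1893, 1428),
    "Documentation":            (1840, 1356, 1134),
}
TOTAL_PER_KLOC = (21396, 14698, 11930)   # "Monthly Costs per KLOC" row
MEAN_DEV_WORKLOAD_KLOC = 28.01           # mean monthly workload (Table 2)

# Column sums of the activity rows match the printed totals to within $1
# of rounding in the source.
sums = tuple(sum(c[i] for c in COSTS.values()) for i in range(3))

# Project-level rows: per-KLOC total times the mean monthly workload.
monthly_project = tuple(round(t * MEAN_DEV_WORKLOAD_KLOC) for t in TOTAL_PER_KLOC)
annual_project = tuple(round(t * MEAN_DEV_WORKLOAD_KLOC * 12) for t in TOTAL_PER_KLOC)
```

Here monthly_project reproduces ($599,302, $411,691, $334,159) and annual_project reproduces ($7,191,624, $4,940,292, $4,009,912), the last two rows of the table.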
Figure 1. Infrastructure Costs Conceptual Framework
[Figure: hypothesized links (H1, H2, H3) among Software Process Maturity, Product Quality, Development Workload, Production Workload, CASE, Hiring, and lagged costs (organizational inertia), and Infrastructure Costs across the nine activity centers: Senior Management, Program Control, Configuration Management, Quality Assurance, Software Integration, Data Integration, Systems Support, Operations, and Documentation.]

Figure 2. Process Maturity, Defect Rates, and Infrastructure Costs
[Figure: annual time series, 1985-1994, of Defects per KLOC (left axis, 0.00 to 5.00) and Infrastructure Costs per KLOC (right axis, $0 to $25,000), annotated with CMM evaluations at Levels 1, 2, and 3.]