
Question framework for architectural description quality evaluation

Niina Hämäläinen · Jouni Markkula

Published online: 10 January 2009
© Springer Science+Business Media, LLC 2009
Software Qual J (2009) 17:215–228. DOI 10.1007/s11219-008-9068-1

N. Hämäläinen (corresponding author)
Aditro HRM Oy, P.O. Box 146, 40101 Jyväskylä, Finland
e-mail: [email protected]

J. Markkula
Department of Information Processing Science, University of Oulu, P.O. Box 3000, 90014 Oulu, Finland
e-mail: [email protected]

Abstract The challenges of architectural descriptions (AD), processes and practices have

become increasingly important for enterprise information system and software developers.

As the development and efficient usage of different architectures are highly dependent on

the quality of their documentation, there is an evident need for practical means for AD

evaluation. In this paper, we introduce a question framework for AD quality evaluation.

The framework was developed in a joint study with industry and validated by the industry

experts. This framework can be used as a practical tool for evaluating and further developing the quality of the AD within organisations.

Keywords Enterprise architecture · Software architecture · Architectural description · Documentation · Quality · Evaluation

1 Introduction

The present day challenging business environment and increasing complexity of information systems have gradually increased the significance of architectures and their

descriptions. Architecture processes, practices and descriptions have become more and

more important for companies. Software and enterprise architecture descriptions are

mainly used by developers, for describing and documenting architectures. However, they

have begun to gain a central role also as a means for communication between development,

management and business.

The quality of the documentation is a major factor in understanding and using the information it conveys. A warning example is presented by Rosen (2006): "…'shelfware'—


the architecture documents look spiffy on the shelf, and having them there demonstrates how smart you are to be able to understand the architecture. Unfortunately, in many cases they are never opened again, and certainly not by the development organisation". The quality of the architectural description (AD) determines its value, and consequently the value of the architecture work. Planned documentation methods and high quality of AD improve communication and collaboration between the stakeholders involved in architecture work. To ensure that the architectural documents can be well understood and correctly used, companies should have practices for their quality evaluation.

Quality management of documents in general is quite an extensively studied area. A number of papers have been published on concepts for understanding the quality of documents as well as on their quality assurance. The quality issues of conceptual representations and models have been tackled, for example, by Bolloju and Leung (2006), Claxton and McDougall (2000), Lindland et al. (1994), and Nelson and Monarchi (2007). In this literature, for example, quality aspects for conceptual models have been defined (e.g. Nelson and Monarchi 2007) and the quality of technical documentation analysed (e.g. Hargis

et al. 2004; McDavid 1999).

The previous studies highlight the significance of the quality of conceptual models.

Nelson and Monarchi (2007) state: "Good representations are necessary for good information systems." High quality conceptual models (such as AD) are seen as essential for the success of system development projects and for the understanding of the nature of the

systems, the processes, and the things of the real world (Nelson and Monarchi 2007). As

more and more software development is being outsourced, accurate, complete and valid

conceptual models are becoming critical (Nelson and Monarchi 2007). In addition, new

technologies for generating implementation code from architecture descriptions also

require reliable models (Li and Horgan 2000). As high quality conceptual models are

important from both human and technical viewpoints, it can be seen that, with the advancement of ICT technology and applications, the quality management of AD is also becoming a

central problem for industry as well as an important research area for academia.

Although research on quality management of documents is present in the literature,

previous studies seem to seldom focus on the quality of architecture descriptions. Especially, the quality evaluation of architecture descriptions is quite a rarely studied area. The

situation seems to be similar in both software and enterprise architecture fields. There

exists some literature and guidelines for documenting software architecture (Clements

et al. 2002; Fairbanks 2003; Fu et al. 2005; He et al. 2002; Rozanski and Woods 2005) and

for documenting enterprise architecture (Bernus 2003; Chapurlat et al. 2003; Jonkers et al. 2004; Lankhorst 2005; Polikoff and Coyne 2005). Architecture description concepts have also been established by standards (IEEE Std 1471-2000). When reviewing this

literature, it can be noticed that the documentation guidelines support choosing suitable

documentation aspects (architecture views). Some of the guidelines define quality criteria

for architecture descriptions. However, different guidelines highlight different quality

aspects and criteria. Thus, proper guidelines on how to carry out a quality evaluation of architecture descriptions do not appear to be available. It seems that quality evaluation

criteria for AD have not been well identified and analysed up to the present.

This paper addresses this prevalent problem. It presents a study of architecture documentation quality evaluation, carried out in the AISA (see acknowledgements) research

project, in co-operation with a group of companies. The objective of the research was to

develop a practical means for assessing the quality of the architecture descriptions in the

companies. The study was started with a literature review. Based on the analysis of the

related documentation quality evaluation factors presented in the literature, the main quality


aspects were identified and architecture description criteria and questions specified. Those

were used to form a preliminary evaluation question framework. After that, the preliminary

framework was complemented and assessed by industry practitioners using focus group

interviews and questionnaires, and the final framework was constructed.

The result of the study was a question framework for architectural documentation

quality evaluation, which was validated by the industry experts. The proposed framework

was planned to be a practical and flexible means for architecture documentation assessment,

which can be applied in companies to increase the quality of descriptions produced by

software and enterprise architects.

The structure of the paper is as follows. Sect. 2 introduces the context of architecture documentation and presents the literature sources for the background of architecture documentation evaluation. In Sect. 3, the research method used is explained. Sect. 4 presents the developed AD evaluation question framework. In the concluding Sect. 5, the results are discussed.

2 Architecture documentation

Enterprise architecture description is usually produced and used at the organisation level,

as an instrument to manage a company’s daily operations and future development

(Lankhorst 2005). For example, Kaisler et al. (2005) define enterprise architecture as "the main components of the organization, its information systems, the ways in which these components work together in order to achieve defined business objectives, and the way in which the information systems support the business processes of the organization". These

components include staff, business processes, technology, information, financial and other

resources, etc.

A software architecture description is mostly produced and utilised in system or software development work. A definition of software architecture is provided, for example, by Bass et al. (2003): "The software architecture of a program or computing system is the structure or structures of the system, which comprise software elements, the externally visible properties of those elements, and the relationships among them."

The concepts related to architectural documentation are formalized and standardized in the IEEE standard (IEEE Std 1471-2000) "Recommended Practice for Architectural Description". This has also been accepted as an ISO standard (ISO/IEC 42010:2007). In

addition to this, standards for architecture descriptions have also been developed and

defined by companies. For example, IBM has presented architecture description standards

(McDavid 1999; Youngs et al. 1999).

For this paper, the main architecture documentation concepts defined in IEEE-1471 are:

• Architecture: The fundamental organization of a system embodied in its components, their relationships to each other, and to the environment, and the principles guiding its design and evolution.

• System stakeholder: An individual, team, or organization (or classes thereof) with interests in, or concerns relative to, a system.

• Architectural description (AD): A collection of products to document an architecture.

• View: A representation of a whole system from the perspective of a related set of concerns.

• Viewpoint: A specification of the conventions for constructing and using a view. A pattern or template from which to develop individual views by establishing the purposes and audience for a view and the techniques for its creation and analysis.


• Concern: Those interests which pertain to the system's development, its operation or any other aspects that are critical or otherwise important to one or more stakeholders. Concerns include system considerations such as performance, reliability, security, distribution, and evolvability.

Relationships between these concepts are defined in Fig. 1, which presents a part of the IEEE-1471 conceptual model of AD.

Fig. 1 Architectural description related concepts (IEEE Std 1471-2000)
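To make the relationships between these concepts easier to follow, the sketch below renders a simplified part of the conceptual model as data types. This is an illustrative assumption of ours, not a definition taken from the standard; the class shapes, attribute names and the completeness helper are invented for the example.

```python
from dataclasses import dataclass, field

# Illustrative, simplified rendering of the IEEE-1471 concepts listed above.
# The class shapes and attribute names are assumptions made for this sketch.

@dataclass
class Stakeholder:
    name: str
    concerns: list[str]            # e.g. "performance", "security", "evolvability"

@dataclass
class Viewpoint:
    name: str                      # e.g. "Information viewpoint"
    covered_concerns: list[str]    # the concerns the viewpoint addresses
    notation: str                  # modelling technique or language used for conforming views

@dataclass
class View:
    viewpoint: Viewpoint           # each view is constructed according to one viewpoint
    models: list[str]              # a view may consist of one or more architectural models

@dataclass
class ArchitecturalDescription:
    """A collection of products (here: views) documenting the architecture of a system."""
    system: str
    stakeholders: list[Stakeholder]
    views: list[View] = field(default_factory=list)

    def uncovered_concerns(self) -> set[str]:
        """Stakeholder concerns that no view's viewpoint addresses (a simple completeness check)."""
        raised = {c for s in self.stakeholders for c in s.concerns}
        covered = {c for v in self.views for c in v.viewpoint.covered_concerns}
        return raised - covered
```

A check such as uncovered_concerns() corresponds to the completeness and consistency questions raised later in the evaluation framework: every concern of a declared stakeholder should be addressed by at least one view.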

In an architectural description, a view may consist of architectural models. Different types

are needed because of the varying stakeholders and concerns of the descriptions. The

enterprise architecture models can be categorised in the following way (Polikoff and

Coyne 2005):

• Ad hoc models: models that serve basic goals of communication and documentation and that are usually developed using simple drawing or presentation tools.

• Standardized models: models adopting a standard or framework-based approach and using case tools.

• Formal models: models that are based on reference architectures.

• Federated models: models that aggregate across diverse sources, using EA tools interoperating with diverse repositories of information.

• Executable models: active knowledge models that can be consulted by applications as well as humans.

Rozanski and Woods (2005) classify software architecture models as formal qualitative or

quantitative models and informal qualitative models (sketches). These are defined as follows:

• Qualitative models illustrate the key structural or behavioral elements, features, or attributes of the architecture being modelled.

• Quantitative models make statements about the measurable properties of an architecture, such as performance, resilience, and capacity.

• A sketch is a deliberately informal graphical model, created in order to communicate the most important aspects of an architecture to a non-technical audience. It may combine elements of a number of modelling notations as well as pictures and icons.

2.1 Architecture frameworks

Architectural frameworks have a central role in architecture documentation. These

frameworks provide structure for the architecture descriptions by identifying and


sometimes relating different architectural domains and the modelling techniques associated

with them (Steen et al. 2004). They typically define a number of conceptual domains or

aspects to be described (Steen et al. 2004).

Examples of enterprise architecture frameworks include Zachman’s Framework for

Enterprise Architecture (Zachman 1987), The Open Group Architecture Framework

(TOGAF 2007), the Archimate framework, and the ISO (ISO/IEC 10746) Reference Model of Open Distributed Processing (RM-ODP). Examples of software architecture frameworks include Kruchten's (1995) "4+1" View Model, the Software Engineering Institute (SEI) set of

views (Clements et al. 2002), Siemens Four View Model (Soni et al. 1995) and Rational

Architecture Description Specification (ADS).

As discovered by May (2005), viewpoints defined by different SA frameworks do not

completely correspond to each other. This seems to apply to EA frameworks as well. Currently, there seems to be no commonly accepted set of architectural viewpoints (May 2005;

Smolander et al. 2002). As Smolander et al. (2002) point out, architectural viewpoints

chosen by companies are often agreements between people depending on the organizational and project environment. In practice, the selection of architectural viewpoints is thus

based on the prevalent situation and characteristics in a company and in the project at hand.

2.2 Architecture documentation practices and realities

For organisational level practice assessment, a maturity model for enterprise architecture

representations and capabilities is introduced by Polikoff and Coyne (2005). This maturity

model consists of the following levels:

• Level 1 Ad hoc: No common reference framework; possible use of case tools; little commonality between descriptions produced by different people or groups.

• Level 2 Standardized: Established methodology for describing architectures; use of an industry standard/custom framework; methodology not fully supported and enforced by tools.

• Level 3 Formal: Methodology enforced by tools; reference architectures; multiple tools in use but from different vendors with a low level of interoperability; reference framework and architectural models cannot be readily queried.

• Level 4 Federated: Connections between different systems and tools established.

• Level 5 Executable: Models are consultable by applications at run time; knowledge about enterprise activities, systems and capabilities becomes a real time resource.

In companies, the architecture documentation practices are affected by the architects’

own practices as well as by company level practices. The architect’s decisions and choices

affect architecture documentation. Given a specific goal and focus, an architect decides

which aspects of an enterprise or a system are relevant and should be represented in the

model (Lankhorst 2005).

A company’s situation affects the possibilities for architecture documentation work. It is

necessary to know (Clements et al. 2002): who the developers will be and which skills are

available, what the budget is and what the schedule is. In addition, some other realities

relate to architecture documentation work, such as: resources and time limits; stakeholders' requirements; and needs for architecture documents, notations and tools. Architects often

do not have much time to do architecture design and analysis (Rozanski and Woods 2005).

The reality is that all projects and work have cost/benefit trade-offs. Architecture documentation is no different (Clements et al. 2002). A rough-and-ready model that is produced


early and becomes established and familiar to the team over time may be more useful than

something considered more fully that appears too late (Rozanski and Woods 2005).

Simple models are more useful in presentations to non-technical stakeholders, as well as

in the early stages of the architectural analysis for showing some key features. Sophisticated models are more useful as analysis, communication, and comprehension tools for

technical stakeholders, such as software developers (Rozanski and Woods 2005). The

range of phenomena addressed by enterprise and system modelling stretches multiple

disciplines. Several modelling languages and practices are used, and one cannot always

find a single person/profession that can guarantee the consistency of all models involved.

There are several factors affecting architecture work and documentation practices.

However, the developments in business and IT are leading to more and more complex

systems and environments. In order to deal with this, well planned and documented, high-quality architecture and architecture documentation have become more vital for organisations. In order to promote high-quality architecture work and efficient usage of the

architectures, companies need practical means for evaluating the quality of the AD.

3 Research method

The objective of the research was to develop a framework for evaluating the quality of AD,

which would be practical and usable within companies. As specific quality dimensions of

documents can be measured by asking probing questions (Smart 2002), the evaluation

framework was founded on questions. In developing the final framework, the following

research phases were conducted.

In the first phase, a literature review and analysis was carried out. As various information

sources should be used in this type of analysis (Worthen and Fitzpatrick 1997), different types

of material were utilised. The sources used included: models, findings and salient issues

raised in the literature; questions, concerns and values of practitioners; general evaluation and

quality models for documentation (e.g. technical documentation); views and knowledge of

expert consultants (comments and recommendations in articles published on the Internet). Based on the literature review and analysis, the preliminary evaluation question framework, consisting of the identified criteria, questions and metrics, was constructed.

In the second phase, a semi-structured focus group interview was organised for

assessing the initial framework. The participants from the companies were interviewed as

one group, in order to allow the group members to influence each other by responding to

the ideas and comments of the others (Krueger and Casey 2000). The group influence was

discovered to be fruitful, and the discussion brought up new aspects of the topic.

The focus group consisted of seven practitioners from five ICT user and service provider organisations. The expert practitioners were specialists in the management of enterprise and software architectures in their organisations. The organisations were: an architecture consultation company (10 employees); a banking, finance and insurance company (11,974 employees); a telecommunication company (4,989 employees); a business & IT consulting and development organisation (part of a large international company with a total of 329,373 employees); and a retail and service company (28,092 employees). The viewpoints presented by the interviewees were: business consultation, software architecture consultation, enterprise architecture, software architecture, marketing, and business and IT governance.

The preliminary evaluation question framework was presented to the group of practitioners. They were asked to evaluate its value and usefulness, based on their own

practical experiences. The interview was recorded and notes were written during the


session. The preliminary evaluation question framework was complemented based on the

results of the interview.

As the last empirical phase, a questionnaire for assessing the usefulness of the evaluation question framework was organised for the workshop participants. Four of them answered it. In the questionnaire, the practitioners assessed the importance of each criterion on a four-point scale (1 = important to evaluate, 2 = useful to evaluate, 3 = not necessary to evaluate, 4 = useless to evaluate). Some of the criteria were assessed at the more detailed question level. The median of the answers was calculated and used as the rating of importance.
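To make this rating step concrete, the minimal sketch below computes such median ratings. The answer values are invented for illustration and are not the actual questionnaire data.

```python
from statistics import median

# Hypothetical answers: for each criterion, the four practitioners' ratings on the
# four-point scale (1 = important to evaluate ... 4 = useless to evaluate).
answers = {
    "Stakeholders": [1, 1, 2, 1],
    "Usage": [2, 1, 2, 3],
    "Currency of SA description": [1, 2, 2, 1],
}

# The median of the answers for each criterion is used as its importance rating.
for criterion, ratings in answers.items():
    print(criterion, median(ratings))
```

With an even number of respondents the median can be fractional, which is why importance ratings such as 1.5 and 2.5 appear in the tables of the next section.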

The results of the focus group interview and questionnaire were then used for developing the final architecture description quality evaluation question framework, presented in

the next section.

4 Evaluation question framework

Three main aspects of the quality of documents can be identified, based on the literature.

These main quality aspects are: stakeholder and purpose orientation, content quality, and

presentation and visualisation quality. The first aspect, stakeholder and purpose orientation,

is used for evaluating how well documents are focused on their purpose and on the

stakeholders using them. The second aspect, content quality, is used for the evaluation of

the quality of the information included in the documents. The third aspect, presentation and

visualisation quality, is used for evaluating how well information is presented in the

documents.

In addition to these three quality aspects of the documents, the management of documents was identified to be the fourth main aspect related to the architecture description

quality, from the point of view of processes and practices.

The developed architecture documentation quality evaluation question framework was

organised according to the four identified main aspects:

(1) Stakeholder and purpose orientation of AD.
(2) Content quality of AD.
(3) Presentation and visualisation quality of AD.
(4) Management of AD.

Table 1 Evaluation question framework for the stakeholder and purpose orientation

Criterion: Stakeholders (importance: 1)
• Are the stakeholders of the description defined, and who are they?

Criterion: Purpose (importance: 1)
• Is the purpose of the description defined in relation to the stakeholders?

Criterion: Suitability for the stakeholders (importance: 1)
• Does the description provide the stakeholder with the desired knowledge?
• Does the description answer/correspond to the objective of the stakeholder?
• Does the description relate to a problem? Is a practical reason for the information evident?
• Is the information presented from the stakeholders' point of view?

Criterion: Usage (importance: 2)
• Frequency of use: How frequently the description is used or referenced.
• Number of users: The approximate number of personnel who will likely want or need to use the description.
• Variety of users: The variety of different functional areas or skill levels of personnel who will likely use the description.
• Impact of non-use: The level of adverse impact that is likely to occur if the description is not used properly.


Table 2 Evaluation question framework for the content

Criterion: Scope and focus
• Scope (importance: 1): Is it defined what part of reality will be described (e.g. only primary processes)?
• Aspects (importance: 1): Is it defined what aspects will be described?
• The level of detail (importance: 1): Is it defined what level of detail will be described?

Criterion: Currency of EA description (importance: 2)
• Does the information reflect the current enterprise?
• Were any changes made in the EA after the EA description was produced?
• Number and scope of projects with architectural effects carried out after the EA description was produced.
• Number and scope of architecture changes made after the EA description was produced.
• Degree to which the current version of the description is up to date (percentage, subjective evaluation).
• How long is it since the previous updating of the description?

Criterion: Currency of SA description (importance: 1.5)
• Does the information reflect the system?
• Have there been any changes in the system after the architecture description was produced?
• How long is it since the previous updating?

Criterion: Correctness of information (importance: 2)
• Verification of information: Is the information included in the description verified?
• Are there any incorrect arguments, or inaccurate or untrue reasoning?

Criterion: Correctness of EA (importance: 2.5)
• "Substantive" errors/deficiencies after the EA description has been released: Are there "substantive" errors/deficiencies? The number of "substantive" errors/deficiencies found (e.g. the number and type of change requests applied to EA principles).

Criterion: Correctness of SA (importance: 1.5)
• Correctness for stakeholders: Does the description correctly present the needs and concerns of the stakeholders?
• Correctness of solution: Does the description correctly define an architecture that will meet the stakeholders' needs?

Criterion: EA completeness (importance: 2)
• EA's coverage of business areas: The degree to which the EA description addresses the needs of each business area (e.g. subjective evaluation score 1–10).

Criterion: Sufficiency/completeness
• Description's coverage of required viewpoints (importance: 2): The degree to which the description addresses each required architectural viewpoint (e.g. subjective evaluation score 1–10).
• Sufficient amount of information (importance: 1.5): Is all the required information included in the description? Are all topics relating to the stakeholder's objectives and concerns covered, and only those topics? Is information repeated only when needed? Does the description contain irrelevant or superfluous elements?
• Sufficient level of detail (importance: 1.5): Does each topic have just the detail that the stakeholder needs?

Criterion: Consistency (importance: 1.5)
• Are views presenting different viewpoints in the description consistent with each other?



The framework is presented according to this organisation in the four corresponding tables. Table 1 presents the stakeholder and purpose orientation aspect criteria and questions, Table 2 the content quality, Table 3 the presentation and visualisation quality, and Table 4 the management of architecture descriptions. In the tables, each criterion is listed together with its related questions and possible metrics, and with the importance rating of the criterion (on a scale from 1 (high) to 4 (low)).
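For readers who keep such criteria catalogues in electronic form, the sketch below shows one possible way of capturing the framework's structure in code. The data types are our own illustrative assumption, not part of the framework; the criterion names, questions and ratings are excerpted from Table 1.

```python
from dataclasses import dataclass, field

@dataclass
class Criterion:
    """One evaluation criterion with its probing questions/metrics and importance rating."""
    name: str
    questions: list[str]
    importance: float   # 1 (high) to 4 (low); medians of ratings may be fractional, e.g. 1.5

@dataclass
class QualityAspect:
    """One of the four main quality aspects of the evaluation question framework."""
    name: str
    criteria: list[Criterion] = field(default_factory=list)

# Excerpt of Table 1 expressed in this structure.
stakeholder_orientation = QualityAspect(
    name="Stakeholder and purpose orientation of AD",
    criteria=[
        Criterion(
            name="Stakeholders",
            questions=["Are the stakeholders of the description defined, and who are they?"],
            importance=1,
        ),
        Criterion(
            name="Usage",
            questions=[
                "How frequently is the description used or referenced?",
                "How many personnel will likely want or need to use the description?",
            ],
            importance=2,
        ),
    ],
)

for criterion in stakeholder_orientation.criteria:
    print(f"{criterion.name} (importance {criterion.importance}): {len(criterion.questions)} question(s)")
```

Such a representation is only a convenience; the framework itself is notation-neutral and can equally well be maintained as a document or a spreadsheet.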

Table 1 presents the stakeholder and purpose orientation aspect of the framework. The

criteria are related to AD users and usage. The importance rating shows that, in general,

Table 3 Evaluation question framework for the presentation and visualisation

Criterion: Conformance to corporate standards (importance: 2.5)
• Does the presentation of the description conform to the corporate standards (if any) for such documents?

Criterion: Intuitiveness of the presentation (importance: 2)
• Does the description have an intuitive structure for the stakeholder? What is this intuitive structure? Does the description correspond to it? Is the recipient familiar with the structures used?

Criterion: Definition of the notation and structures (importance: 1.5)
• Does the description use a defined notation?
• Is the notation/structure of the description explained?
• Is the stakeholder familiar with the notation?

Criterion: Clarity of the vocabulary and concepts (importance: 1.5)
• Are the terms and concepts used known by the stakeholder?
• Are the terms used defined? Are the (new) concepts defined and explained?
• Are the names of elements descriptive? Are all of the description's elements defined so that their meanings, roles, and mapping to the real world are clear and not open to different interpretations?

Criterion: Information complexity (importance: 2)
• Is there too much information included in the model?
• The number of elements in the model. (Humans are only good at working with models that do not include more than 30 elements.)
• The number of types of elements in the model.
• The number of relations depicted in the model.
• The number and types of concepts.
• The number of architectural viewpoints. (Viewpoints reduce complexity.)

Criterion: Visual complexity (importance: 2)
• Proximity: Are related objects placed near to each other in a model?
• Continuity: Are there any right angles positioned next to each other? (Right angles should not be positioned next to each other in a model.)
• Closure: Are objects symmetric and regular? (This increases the readability of models and reduces the perceived complexity.)
• Similarity: Are similar objects presented in a similar way?
• Common fate: Are similar objects presented to move or function in a similar manner? (People have a tendency to perceive different objects that move or function in a similar manner as a unit.)


Table 4 Evaluation question framework for the architecture documentation management

Criterion: Maintenance of documentation
• Ownership (importance: 1): Are the staff responsible for the documentation clearly identified and supported?
• Maintenance practice (importance: 2): Is it known how the documentation will be maintained once it has been accepted? Is the frequency of updating known? Frequency of updates (number of updates per year or per project). Needs for updates (number of architecture changes made in a year, or in projects, that require a documentation update).
• Maintainability of documentation (importance: 2.5): The relative ease or difficulty with which the documentation can be updated, including revision dates and distribution of new versions, and the relative ease or difficulty with which the consistency between descriptions can be checked.

Criterion: Cost effectiveness
• Costs (importance: 2): Time and resources needed to produce or update architecture documentation (required man-days).
• Amount of documentation (importance: 3): Number of documents/models.
• Frequency of documentation updates (importance: 2.5): Updates per project or per year. Needs for updates (number of architecture changes made (in a year, in projects) that require a documentation update).

Criterion: Architectural framework and views
• Architecture framework (for EA and for SA) (importance: 1.5): Is there an existing architectural framework? Is the framework accepted in the organisation? Is the framework used in the EA documentation work?
• Architectural views (importance: 1.5): Are suitable architectural views chosen for the company or for the project? Are the viewpoints well defined: a viewpoint name; the stakeholders the viewpoint is aimed at; the concerns the viewpoint addresses; the language, modelling techniques, or analytical methods to be used in constructing a view based upon the viewpoint?

Criterion: Tools support
• Support for the organisation's framework and viewpoints (importance: 1): Do design tools support the framework and viewpoints that the organisation has chosen to use? Do design tools support production of the deliverables required?
• Suitability for stakeholders (importance: 1): Is there an ability to represent architecture descriptions (e.g. models and views) in a way meaningful to stakeholders (e.g. to non-technical stakeholders)?
• Repository for architecture documentation (importance: 1.5): Is there a repository for storage and dissemination of the captured information?


this aspect is seen as highly important in AD evaluation. It is essential that the stakeholders' interests are considered and that the purpose of the description is well defined with respect to those interests.

Table 2 presents the evaluation of the content quality with the criteria and questions

related to scope and focus, currency, correctness, completeness and consistency. The importance rating reveals that, with respect to the content, the scope and focus have the

highest importance. Questions related to sufficiency and consistency, as well as currency

and correctness in the case of SA, are also assessed to be quite important concerning

architectural quality evaluation.

Table 3 includes the presentation and visualisation criteria and questions. The criteria

are related to presentation standards, notation and structures, clarity and complexity.

Among these criteria, the definition of the notation and structures, as well as the clarity of the vocabulary and concepts used, were seen to be most essential, even if none of the criteria

received the highest rating.

The last main quality evaluation aspect, documentation management, is presented in

Table 4. The criteria of this aspect include maintenance of documentation, cost effectiveness, frameworks and views, and tool support. The results show that clearly defined responsibilities for maintaining the architecture descriptions are of the highest importance. In addition, it is essential to have appropriate tool support.

When summarizing the information in the four tables above, we can see the relevance of

all of the presented four main quality aspects. All of them include some criteria that are

considered highly important when the quality of the documentation is evaluated. The most

important quality criteria of the stakeholder and purpose orientation are the definition of the stakeholders and the purpose, and also the description's suitability to the stakeholders. With respect to the quality of content, the highest importance is given to the scope and focus of documents, followed by their sufficiency and consistency. In addition, the currency and correctness of SA descriptions in relation to the system is seen as vital. In the quality of presentation and visualization, the vocabulary and concepts, and their adequate definition and explanation, are the main concern. When considering the documentation management,

the most important quality criteria are clear ownership identification, defined architectural

framework and views, as well as appropriate tool support.

5 Conclusion

In the present day information system development and software engineering context, the

significance of well designed architectures and high quality AD has been continually

increasing. Current architecture documentation related questions and challenges in the industry appear to be related especially to the following issues: multiple stakeholders of architecture work; definition of the architecture framework and views used in the organization; decisions concerning what documents to produce; multiple existing notations and tools; and, in some cases, the lack of architecture documents.

Architectural descriptions are used as a communication tool. Poor quality descriptions

may focus the communication on irrelevant aspects. High quality descriptions support

more efficient communication about architecture issues and enhance the understanding of

the architecture. The understanding of architecture can be seen also as a prerequisite for the

proper application of the designed architecture. In this way, the quality of the AD even has an effect on the realization of the planned architecture.


As a solution to this architecture documentation quality management problem, we presented the evaluation question framework. It is a practical solution that can be utilized by companies in their aspiration to increase the quality of AD. The presented framework was

developed in co-operation with industry practitioners, which supports its practical validity.

The framework consists of the four identified documentation quality aspects (stakeholder and purpose orientation, content quality, presentation and visualization quality, and architecture documentation management) and the related criteria and questions. In the arranged focus group interview, the practitioners agreed about the specified four main aspects. They considered the aspects and criteria included in the framework to be valid, useful and helpful in the quality evaluation of AD. The focus group interview also brought up that the significance and meaning of AD is different for specialists representing different domains. Therefore, the views on the relevance of AD quality evaluation can vary between the specialists of different domains.

The industry practitioners involved in the study were architecture experts, mainly EA

and SA design and development specialists. Their perspectives might reveal much more

than the companies' other business and ICT stakeholders' perspectives. The points of view of the description users were not gathered in this study. Information about direct user

experiences would be a relevant extension to this research in the future.

The questionnaire supplemented the focus group interview and gave a rating of the

importance of each evaluation criterion. The limited number of replies by the focus group

members may have affected the reliability of the results. However, the importance ratings

were mainly quite consistent.

The presented framework would require further practical validation in different companies. An interesting direction to continue the research would also be to study the

documentation from different stakeholders’ perspective: how architecture documents can

be produced and managed efficiently when the reality is that different stakeholders need

different levels of information presented in different ways.

Previous research highlights the quality of conceptual representations, which is seen as important from both human and technical viewpoints (e.g. Nelson and Monarchi 2007). This study contributes to the quality evaluation of AD by identifying and defining the criteria and a group of questions that can be used in the evaluation, as well as by introducing a framework for carrying out practical assessments. In addition, some knowledge about the importance of the different evaluation criteria for the practitioners was produced.

Previous studies and literature present some guidelines and practices (mainly checklists)

for quality evaluation of AD. However, these guidelines and practices seem to be limited

only to one or a few quality aspects, or they do not clearly present the aspects that should

be evaluated. Therefore, the criteria do not seem to have been identified and analyzed well enough by the earlier research. In the present study, we approached the problem systematically, taking into account the different relevant aspects of document quality. With this analysis and synthesis, we presented a new, more comprehensive view of AD evaluation, which extends

the guidelines and practices presented earlier.

Some notes and recommendations for the companies can be derived from the results of

this study. The quality of AD should be a concern of the whole company, not of the

architects alone. The enterprise and software architects should ensure the quality of AD

and include the evaluation in their production process. The description quality evaluation

should be a standard part of architecture reviews. The companies can develop their own

quality evaluation checklists, which can be used in architecture design by the architects as

well as in architecture reviews by the reviewers. The results of this study can be utilised in

developing these checklists.
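As a concrete illustration of how a company-specific checklist could be derived from the framework, the sketch below selects criteria by an importance threshold and prints their questions as checklist items. The excerpted entries, the threshold value and the output format are our own illustrative assumptions, not values prescribed by the framework.

```python
# Minimal sketch: derive a review checklist from an excerpt of the framework.
framework = {
    "Stakeholder and purpose orientation of AD": [
        ("Stakeholders", 1, ["Are the stakeholders of the description defined, and who are they?"]),
        ("Usage", 2, ["How frequently is the description used or referenced?"]),
    ],
    "Management of AD": [
        ("Maintenance of documentation (ownership)", 1,
         ["Are the staff responsible for the documentation clearly identified and supported?"]),
        ("Cost effectiveness (amount of documentation)", 3,
         ["How many documents and models are maintained?"]),
    ],
}

IMPORTANCE_THRESHOLD = 2  # keep criteria rated 'important' or 'useful' to evaluate

for aspect, criteria in framework.items():
    selected = [(name, questions) for name, importance, questions in criteria
                if importance <= IMPORTANCE_THRESHOLD]
    if not selected:
        continue
    print(aspect)
    for name, questions in selected:
        for question in questions:
            print(f"  [ ] {name}: {question}")
```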


Acknowledgement This paper is based on the work carried out during the AISA project (Quality Management of Enterprise and Software Architectures) organized by the Information Technology Research Institute (ITRI), University of Jyväskylä. The AISA project was financed by the Finnish Funding Agency for Technology and Innovation (Tekes) and the participating companies: Elisa Oyj, OP Bank Group, IBM Finland, S Group, Tieturi, and A-Ware Oy. We wish to thank the participating companies for their co-operation. In addition, Tanja Ylimäki and Eetu Niemi participated in the validation of these results.

References

Bass, L., Clements, P., & Kazman, R. (2003). Software architecture in practice. Boston: Addison-Wesley.

Bernus, P. (2003). Enterprise models for enterprise architecture and ISO9000:2000. Annual Reviews in Control, 27(2), 211–220.

Bolloju, N., & Leung, F. S. K. (2006). Assisting novice analysts in developing quality conceptual models with UML. Communications of the ACM, 49(7), 108–112.

Chapurlat, V., Kamsu-Foguem, B., & Prunet, F. (2003). Enterprise model verification and validation: An approach. Annual Reviews in Control, 27(2), 185–197.

Claxton, J. C., & McDougall, P. A. (2000). Measuring the quality of models. The Data Administration Newsletter, TDAN.com, http://www.tdan.com/view-articles/4877. Accessed 31 Dec 2008.

Clements, P., Bachmann, F., Bass, L., Garlan, D., Ivers, J., Little, R., et al. (2002). Documenting software architectures: Views and beyond. Boston: Addison Wesley.

Fairbanks, G. (2003). Why can't they create architecture models like "Developer X"? An experience report. In Proceedings of the 25th International Conference on Software Engineering (pp. 548–552). Washington: IEEE Computer Society.

Fu, Y., Dong, Z., & He, X. (2005). An approach to validation of software architecture model. In Proceedings of the 12th Asia-Pacific Software Engineering Conference APSEC '05 (pp. 375–384). Washington: IEEE Computer Society.

Hargis, G., Carey, M., Hernandez, A. K., Hughes, P., Longo, D., Rouiller, S., et al. (2004). Developing quality technical information—A handbook for writers and editors. Upper Saddle River, NJ: Pearson Education.

He, X., Ding, J., & Deng, Y. (2002). Model checking software architecture specifications in SAM. In Proceedings of the 14th International Conference on Software Engineering and Knowledge Engineering SEKE'02 (pp. 271–274). New York: ACM.

IEEE Std 1471-2000. Recommended practice for architectural description of software-intensive systems.

ISO/IEC 10746. (1994). Reference model of open distributed processing (RM-ODP).

ISO/IEC 42010:2007. Systems and software engineering—Recommended practice for architectural description of software-intensive systems.

Jonkers, H., Lankhorst, M., Van Buuren, R., Hoppenbrouwers, S., Bosanque, M., & Van der Torre, L. (2004). Concepts for modeling enterprise architectures. International Journal of Cooperative Information Systems, 13(3), 257–287.

Kaisler, S. H., Armour, F., & Valivullah, M. (2005). Enterprise architecting: Critical problems. In Proceedings of the 38th Hawaii International Conference on System Sciences, HICSS'05 (p. 224b). Washington, DC: IEEE Computer Society.

Kruchten, P. (1995). The 4+1 view model of architecture. IEEE Software, 12(6), 42–50.

Krueger, R. A., & Casey, M. A. (2000). Focus groups: A practical guide for applied research. Thousand Oaks, CA: Sage Publications.

Lankhorst, M. (2005). Enterprise architecture at work: Modelling, communication, and analysis. Berlin, Heidelberg: Springer-Verlag.

Li, J., & Horgan, R. (2000). vSuds-SDL: A tool for testing software architecture specifications. Software Quality Journal, 8(4), 241–253.

Lindland, O. I., Sindre, G., & Solvberg, A. (1994). Understanding quality in conceptual modeling. IEEE Software, 11(2), 42–49.

May, N. (2005). A survey of software architecture viewpoint models. In Proceedings of the Sixth Australasian Workshop on Software and System Architectures (pp. 13–24). Melbourne, Australia: Swinburne University of Technology.

McDavid, D. (1999). A standard for business architecture description. IBM Systems Journal, 38(1), 12–31.

Nelson, H. J., & Monarchi, D. (2007). Ensuring the quality of conceptual representations. Software Quality Journal, 15(2), 213–233.

Polikoff, I., & Coyne, R. (2005). Towards executable enterprise models: Ontology and semantic web meet enterprise architecture. Journal of Enterprise Architecture, 1(1), 45–61.

Rosen, M. (2006). Opening statement. Cutter IT Journal, 19(3), 3–5.

Rozanski, N., & Woods, E. (2005). Software systems architecture: Using viewpoints and perspectives. Boston: Addison-Wesley Professional.

Smart, K. L. (2002). Commentaries: Assessing quality documents. ACM Journal of Computer Documentation, 26(3), 130–140.

Smolander, K., Hoikka, K., Isokallio, J., Kataikko, M., & Makela, T. (2002). What is included in software architecture? A case study in three software organizations. In Proceedings of the Ninth Annual IEEE International Conference and Workshop on the Engineering of Computer-Based Systems (ECBS'02) (pp. 131–138). Washington: IEEE Computer Society.

Steen, M., Akehurst, D., Doest, H., & Lankhorst, M. (2004). Supporting viewpoint-oriented enterprise architecture. In Proceedings of the Eighth IEEE International Enterprise Distributed Object Computing Conference (EDOC 2004) (pp. 201–211). Washington: IEEE Computer Society.

Soni, D., Nord, R., & Hofmeister, C. (1995). Software architecture in industrial applications. In Proceedings of the 17th International Conference on Software Engineering (pp. 196–207). New York: ACM.

TOGAF. (2007). The Open Group Architecture Framework, Version 8.1.1 "Enterprise Edition". The Open Group. http://www.opengroup.org/togaf/.

Worthen, B., & Fitzpatrick, J. (1997). Program evaluation: Alternative approaches and practical guidelines. New York: Addison Wesley Longman.

Youngs, R., Redmond-Pyle, D., Spaas, P., & Kahan, E. (1999). A standard for architecture description. IBM Systems Journal, 38(1), 32–50.

Zachman, J. A. (1987). A framework for information systems architecture. IBM Systems Journal, 26(3), 276–292.

Author Biographies

Niina Hämäläinen is a software architect at Aditro HRM Oy. She received her Ph.D. in Computer Science from the University of Jyväskylä in 2008. Previously (1999–2008), she worked as a project manager and researcher at the Information Technology Research Institute (ITRI), University of Jyväskylä. This study is based on the research carried out at ITRI. Hämäläinen's research interests include enterprise and software architecture management, and especially evaluation and measurement methods and practices in the architecture management area.

Jouni Markkula is an assistant professor at the University of Oulu, Finland. He received his Ph.D. in Computer Science from the University of Jyväskylä in 2003. His main research interest areas are software engineering, software processes and mobile services. He has also led and managed several research projects in these fields, in co-operation with industry. Before the University of Oulu, Dr. Markkula was working at the Information Technology Research Institute (ITRI) of the University of Jyväskylä as a research director.
