How Do Software Architects Specify and Validate Quality Requirements?


ECSA 2014: an empirical study on how software architects specify and validate quality requirements.

TRANSCRIPT

How Do Software Architects Specify and Validate Quality Requirements?

Andrea Caracciolo
http://scg.unibe.ch/staff/caracciolo

ECSA 2014, 29 August 2014

Quality Requirements

Example: "The rendering operation has to be completed in less than 4 ms."

Two questions follow: how is such a requirement specified, and how is it validated against the implementation?

Empirical Study

14 interviews, 34 survey responses.

Specification

Two dimensions: Audience (customer, architect, developer) and Intent (descriptive, prescriptive).

Specification — Taxonomy

Examples: Performance (response time, throughput), Compatibility (signature, communication, ...), Maintainability (meta-annotations, dependencies).

Validation

Three approaches observed: none, manual, automated.

Validation: None. No explicit check is performed; compliance is delegated to a framework or a code generator.

Validation: Manual. Compliance is checked through code reviews and direct interaction.

Validation: Automated

[Bar chart: for each quality attribute in the taxonomy, the share of respondents who validate it automatically, manually, or not at all; x-axis 0% to 100%.]

Automated validation is not prevalent: on average, only about 40% of the reported requirements are validated automatically, and only a few attributes exceed 50% automation.

Automated validation is needed: for many attributes, manual validation is still more common than automated validation.

Tool support is insufficient: on average, only about 10% of respondents use a third-party tool for a given constraint, and tool usage stays below 25% for almost all attributes; this holds even for very critical requirements (16%).

Formalization

[Bar chart: for each quality attribute in the taxonomy, the share of respondents who specified it using a formal notation; x-axis 0% to 100%.]

Formalization is not prevalent: on average, only about 20% of the requirements are specified with a formal notation (e.g., ER models, UML with profiles, regular expressions, BNF grammars, annotations), and only a few attributes exceed 30%.

Formalization could be exploited more: even the attributes that are formally specified in more than 30% of the cases are validated with tools in only 22%, 11%, and 16% of the cases.

Test-oriented Specification Language

Proposal: express quality requirements as short, testable statements ("... cannot ...", "... shall always ...") that can be checked directly against the code.
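To make the idea concrete, the sketch below shows how the response-time requirement from the opening example ("the rendering operation has to be completed in less than 4 ms") could be written as an executable check. This is not the notation proposed in the talk; it is a plain JUnit 5 test, and the Renderer class is a hypothetical stand-in for the real rendering component.

    import static org.junit.jupiter.api.Assertions.assertTimeout;

    import java.time.Duration;
    import org.junit.jupiter.api.Test;

    class RenderingResponseTimeTest {

        // Hypothetical stand-in for the real rendering component.
        static class Renderer {
            void render() {
                // rendering work would happen here
            }
        }

        private final Renderer renderer = new Renderer();

        @Test
        void renderingCompletesWithinFourMilliseconds() {
            // Fails if the supplied block takes longer than 4 ms to complete.
            assertTimeout(Duration.ofMillis(4), renderer::render);
        }
    }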

Continuous Feedback

Example requirements: "The rendering operation has to be completed in less than 4 ms"; "Data transfer object classes must be annotated with @DTO." Both can be expressed in the test-oriented language ("... cannot ...", "... shall always ...") and validated continuously against the code.
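The @DTO requirement lends itself to the same treatment. Below is a minimal, hypothetical sketch using plain Java reflection; the @DTO annotation, the "Dto" naming convention used to find candidate classes, and the sample classes are all invented for illustration and are not part of the study.

    import java.lang.annotation.ElementType;
    import java.lang.annotation.Retention;
    import java.lang.annotation.RetentionPolicy;
    import java.lang.annotation.Target;
    import java.util.List;

    public class DtoAnnotationCheck {

        // Hypothetical marker annotation for data transfer objects.
        @Retention(RetentionPolicy.RUNTIME)
        @Target(ElementType.TYPE)
        @interface DTO {}

        // Sample classes standing in for a real codebase.
        @DTO
        static class CustomerDto {}

        static class OrderDto {} // violates the rule: missing @DTO

        public static void main(String[] args) {
            // In a real project this list would come from classpath scanning.
            List<Class<?>> candidates = List.of(CustomerDto.class, OrderDto.class);
            for (Class<?> c : candidates) {
                boolean looksLikeDto = c.getSimpleName().endsWith("Dto");
                if (looksLikeDto && !c.isAnnotationPresent(DTO.class)) {
                    System.out.println("Violation: " + c.getName() + " is not annotated with @DTO");
                }
            }
        }
    }

Run as part of the build, a check of this kind provides the continuous feedback the talk argues for.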

Summary

Findings: about 60% of quality requirements are not automatically validated; existing tools are inadequate; formal specification is rare.

Proposal: one specification language with continuous feedback.

Andrea Caracciolo
http://scg.unibe.ch/staff/caracciolo

Interviewees


#   Role              Org.  Project (domain; type)                    Team size
A   CEO, architect    C1    government / enterprise                   <5
B   business manager  C2    government / enterprise                   10-50
C   project manager   C3    insurance / enterprise                    >50
D   architect         C4    logistic / enterprise (integration)       <5
E   developer         C4    logistic / enterprise (integration)       <5
F   CTO               C5    banking / enterprise                      >50
G   architect         C2    government / enterprise                   5-10
H   architect         C2    government / enterprise                   10-50
I   architect         C6    logistic / enterprise (migration)         >50
J*  developer         C2    government / development support tool     <5
K   architect         C5    banking / enterprise                      5-10
L   architect         C6    transportation / control systems          5-10
M*  developer         C5    banking / source code analysis            >5
N*  architect         C5    banking / development support tool        5-10

Table 1. Interview study participants. Candidates with an asterisk worked in projects aimed at supporting architectural design. The remaining candidates worked as software architects or project managers in medium to large projects and have more direct experience in architectural design.

The main outcome of this qualitative study was the list of quality attributes presented in Table 2. These quality attributes were inferred by analyzing the interviews and synthesizing the main concerns using coding techniques [17]. To support this activity, we identified and labeled quality requirements in interview transcriptions as well as the documentation files (i.e., Software Architecture Documents, Developer guidelines) that we collected at the end of several interview sessions. To gather more evidence that the observations coming from the first study actually reflected the state of practice of a broader community, we created an e-survey. Over a time span of two months we collected 34 valid and complete responses. Invitations were sent to professionals selected among industrial partners and collaborators (i.e., convenience sampling method), including people involved in the first phase of the study. The survey was also advertised in several groups of interest related to software architecture hosted by LinkedIn and on Twitter [2] (i.e., voluntary sampling method). Survey participants were asked to specify whether the quality attributes identified in the first study were ever encountered in a past project, their perceived level of importance (on a scale from 1 to 5, with 5 being the highest), the formalism adopted to describe them, and the testing tool used for their validation. A complete copy of the survey can be found on our web site [1].

[2] http://www.linkedin.com; http://www.twitter.com

3 Learning from Practitioners: a Qualitative Study

During interviews, we tried to elicit a possibly wide range of distinct architecturally significant quality attributes. We asked our respondents to enumerate those concerns that could be considered fundamental for their architecture. For each of those, we asked them to describe their main properties and the form in which they were typically specified. Table 2 shows all identified quality attributes. For each quality attribute, we also present additional details collected during our quantitative study (columns 3-6 in Table 2).

Quality attributes are categorized based on the closest matching ISO 25010 [11] quality characteristic. For simplicity's sake, we decided to pair each attribute with one single category. For clarity, we also published some explanatory requirements for all presented quality attributes on our web site [1].

Quality Characteristic   Quality Attribute (Internal / External / Process)   Importance (Q1 Q2 Q3)   Fam.   Not.

Performance       Response time (E)             3    4   5     15%   14%
                  Throughput (E)                3    4   4     26%   13%
                  Hardware infrastructure (I)   2    3   4     29%    0%
Compatibility     Signature (I)                 3    4   4     18%   52%
                  File location (I)             1    3   4     29%   18%
                  Data structure (I)            2    3   4     29%   47%
                  Communication (I)             2    4   4     15%   22%
Usability         Visual design (E)             2    3   3.5    9%   21%
                  Accessibility (E)             1    2   3.5   50%    0%
Reliability       Availability (E)              4    4   5     15%   14%
                  Recoverability (E)            2    3   5     32%    5%
                  Data integrity (I)            3    3   4     18%   23%
                  Event handling (I)            2    3   4     35%   25%
                  Software update (P)           1    2   3     59%    0%
Security          Authorization (E)             4    4   5      3%   23%
                  Authentication (E)            3    4   5     21%   12%
                  Data retention policy (I)     2    3   4     12%   13%
Maintainability   Meta-annotations (I)          1    3   4     32%   39%
                  Code quality (I)              2    3   3.5   15%   19%
                  Dependencies (I)              2.5  3   4     18%   53%
                  Naming conventions (I)        2    3   3     12%   38%
Portability       Software infrastructure (I)   3    3   4     24%    8%

Table 2. Taxonomy of quality requirements (grouped by supported quality characteristic). Columns (from left to right): Matching quality characteristic; Quality requirement; Evaluated importance (first, second and third quartile); Participants who encountered the requirement in a previous project (familiarity); Participants who specified the requirement using a formal notation. Columns 3-6 contain data collected during our quantitative study.

3.1 Identified Quality Attributes

We now comment on the identified quality attributes.


Constraint   Tool use   Reported testing tools

authorization 15% SoapUI / other: Framework (JAAS)

throughput 26% JMeter, LISA, Selenium, Locust, Gatling, HP LoadRunner

response-time 17% JMeter, LISA, Selenium

data retention policy 8% no tool specified

authentication 3% other: Framework (JAAS, Spring)

data integrity 8% Moose / other: db-constraints, Framework

visual design 4% other: Framework

code quality 39% Sonar, Findbugs, Code critics, Checkstyle, Emma, Clover

meta-annotation 19% dclcheck

accessibility 0% no tool specified

communication 8% Moose, dclcheck

availability 10% DynaTrace, Gomez, Shell script + Selenium, Pingdom

event handling 12% dclcheck, Moose

data structure 16% Moose / other: Custom tools

software infrastr. 8% other: Automated declarative provisioning

signature 7% Moose, JMeter, soapUI

dependencies 22% SAVE, dclcheck, Patternity, JDepend, NDepend, Macker, IntensiVE, SmallLint, DSM tool

recoverability 0% no tool specified

software update 0% no tool specified

hardware infrastr. 6% no tool specified

file location 0% other: Guaranteed by framework

naming conventions 11% Code critics, Checkstyle, PMD, FxCop, IntensiVE, PetitParser

Table 3. Survey results related to tool-aided architectural constraints testing. Columns (from left to right): Constraint name; respondents using third-party tools for testing the constraint; adopted tools.

O6. Tools do not take advantage of existing formalizations: Figure 1 shows that some constraints (e.g., dependencies, naming conventions) are more often formally specified than automatically validated. However, formally specifying constraints without automatically verifying them is less than optimal. Based on our analysis, we observe that some adopted notations do not provide sufficient details to support validation (e.g., UML for describing signatures) and other notations are not fully taken advantage of by the existing tools (e.g., regular expressions for describing naming conventions). We think that more empirical studies are needed in order to expose actual formalization practices. The results of these studies might expose common flaws of existing notations and provide concrete evidence of practitioners' needs.
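To illustrate the point about underused formalizations: a naming convention that is already written down as a regular expression can be checked with a few lines of code, without any dedicated tool. The sketch below is a hypothetical example; the convention (service classes end in "Service") and the class names are invented for illustration and are not taken from the survey data.

    import java.util.List;
    import java.util.regex.Pattern;

    public class NamingConventionCheck {

        public static void main(String[] args) {
            // The formal specification of the convention, reused verbatim as the checker.
            Pattern serviceClassName = Pattern.compile(".*Service$");

            // In practice these names would be collected from the build or the classpath.
            List<String> serviceClasses = List.of("PaymentService", "InvoiceManager");

            for (String name : serviceClasses) {
                if (!serviceClassName.matcher(name).matches()) {
                    System.out.println("Violation: " + name + " does not match " + serviceClassName.pattern());
                }
            }
        }
    }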
