1
EVALUATING SOFTWARE ARCHITECTURES
M.R.V. Chaudron, Technische Universiteit Eindhoven
Adapted by Tom Verhoeff for 2II45 in 2009
If you haven’t analyzed it, don’t build it.
With slides from Rick Kazman
Topics in Part 2
1. From Req. to Arch.: Doing Design
2. From Arch. to Req.: Doing Evaluation
3. From Arch. to Code: Doing Implementation, code generation, testing infrastructure, code configuration management
4. From Code to Arch.: Monitoring impl. work, Reverse Engineering, Integration
5. Process, Documentation, Tools, Standards
[Diagram: ESA Software Engineering Standards, Life Cycle Verification Approach (V-model). Activities descend from User Requirements Definition through Software Requirements Definition, Architectural Design, and Detailed Design to Code, then ascend through Unit Tests, Integration Tests, System Tests, and Acceptance Tests. Products along the way: Project Request, URD, SRD, ADD, DDD, Compiled Modules, Tested Modules, Tested Subsystems, Tested System, Accepted Software. Each level is verified against a part of the SVVP (Software Verification and Validation Plan): SVVP/SR, SVVP/AD, SVVP/DD, SVVP/UT, SVVP/IT, SVVP/ST, SVVP/AT.]
2
3
System Quality Attributes (Extra-Func. Req.)
End user's view:
• Performance
• Availability
• Usability
• Security

Developer's view:
• Maintainability
• Portability
• Reusability
• Testability

Business community's view:
• Time to market
• Cost and benefits
• Projected lifetime
• Targeted market
• Integration with legacy systems
• Rollback schedule
4
Design of Software Architecture
[Diagram: design of a software architecture.
Inputs: user requirements, domain knowledge and requirements, and standards, refined (S.M.A.R.T.) into functional and extra-functional requirements.
Synthesize: group functionality in subsystems; choose a design approach for realizing the extra-functional quality properties; select an architectural style, reference architecture, and architecture tactics; model/describe the result (UML, views).
Analyze and refine: RBD, QN, RMA, ATAM, prototypes, design metrics; identify trade-offs and sensitivity points.]
5
Why analyze architecture?
• In the majority of projects, the only model available for measurement is the final implementation. This is far too late and causes excessive costs and risks.
• Every design involves tradeoffs. A software architecture is the earliest life-cycle artifact that embodies significant design decisions with high risks.
6
Heuristic (to avoid common pitfall)
Don’t evaluate what can be done easily.
Do evaluate what you need to know!
7
Types of Analysis
Quantitative: How much …?
• Analysis based on (mathematical) models
• Measurements
• Feasibility prototypes
• (Process) models
• Simulation
Qualitative: What if …?
• Architecture Tradeoff Analysis Method (ATAM)
• Cost-Benefit Analysis Method (CBAM), …
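The "How much?" side can be made concrete with a small model. A minimal sketch (illustrative only; the choice of an M/M/1 queueing model and all rates are assumptions, not from the slides):

```python
# Back-of-the-envelope quantitative performance analysis using an
# M/M/1 queueing model: one server, Poisson arrivals, exponential service.
def mm1_response_time(arrival_rate: float, service_rate: float) -> float:
    """Mean response time W = 1 / (mu - lambda) for a stable M/M/1 queue."""
    if arrival_rate >= service_rate:
        raise ValueError("queue is unstable: arrival rate >= service rate")
    return 1.0 / (service_rate - arrival_rate)

# A server handling 100 requests/s, offered 80 requests/s:
w = mm1_response_time(arrival_rate=80.0, service_rate=100.0)
print(f"mean response time: {w * 1000:.0f} ms")  # 1/(100-80) = 0.05 s = 50 ms
```

Even a model this crude lets an evaluator check a latency scenario long before the implementation exists.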
8
Architecture Tradeoff Analysis Method (ATAM): Origin
• Software Engineering Institute (SEI) at Carnegie Mellon University (CMU)
• Comparable to LaQuSo at TU/e
• Consulting role for (multi-party) projects
• Need to evaluate architectural designs independent of how they were created
• Evaluate and report in a standardized format
9
Architecture Tradeoff Analysis Method (ATAM): Overview
10
ATAM Reference
• R. Kazman et al., “The Architecture Tradeoff Analysis Method,” 1998. [Example: Remote Temperature Sensor System]
• R. Kazman, M. Klein, P. Clements, ATAM: Method for Architecture Evaluation, Technical Report CMU/SEI-2000-TR-004, August 2000. [Ch. 9 & Appendix optional. Example: Battlefield Control System]
• Chapter 11 of the BCK book. [Optional. Example: Nightingale]
11
ATAM Purpose
• Evaluate whether the design decisions satisfactorily address the quality requirements.
• Elicit the rationale of design decisions (traceability).
• Discover risks: decisions that might create (future) problems for some quality attribute.
• Discover sensitivity points: alternatives for which a slight change makes a significant difference in a quality attribute.
• Discover tradeoffs: decisions affecting more than one quality attribute, in opposite directions.
12
ATAM Output
• Precise description of the architecture
• Articulation of the business goals
• Quality requirements in terms of quality attribute scenarios
• Relation between business goals and architecture tactics (the rationale of the design)
+
• Risks
• Sensitivity points
• Tradeoff points
13
ATAM Qualifications (Reservations)
• The result depends on the quality of the specification of the architecture: garbage in, garbage out.
• Not an attempt to precisely predict the resulting quality attributes.
• Some quality properties are not easily expressed quantitatively, such as usability, interoperability, …
14
ATAM Side-effects
• Improve the architecture documentation.
• Elicit / make precise a statement of the architecture's driving quality attribute requirements.
• Process benefit: foster stakeholder communication & consensus.
15
ATAM Preconditions
The ATAM relies critically on:
• Appropriate preparation by the customer
• Clearly articulated quality attribute requirements
• Active stakeholder participation
• Active participation by the architect
• Evaluator familiarity with architectural styles and analytic models
16
ATAM STEPS
1. Explain the ATAM
2. Present business drivers
3. Present architecture
4. Identify architectural approaches
5. Generate quality attribute utility tree
6. Analyze architectural approaches
7. Brainstorm and prioritize scenarios
8. Analyze architectural approaches
9. Present results
These slides by Rick Kazman
17
1. PRESENT THE ATAM
The evaluation team presents an overview of the ATAM:
• ATAM steps in brief
• Techniques: utility tree generation, architecture elicitation and analysis, scenario brainstorming
• Outputs: architectural approaches, utility tree, scenarios, risks and “non-risks”, sensitivity points and tradeoffs
18
2. PRESENT BUSINESS DRIVERS
Goal: understand the requirements.
The customer representative describes the system's business drivers, including:
• Business context for the system
• Time to market
• Most important functional requirements
• Most important quality attribute requirements
Architectural drivers: quality attributes that “shape” the architecture.
Critical requirements: quality attributes most central to the system's success, e.g. high availability, high security, …
19
3. PRESENT ARCHITECTURE
Goal: understand the architecture.
• The architect presents an architecture overview, including:
  – the technical context of the system: systems with which it must interact
  – technical constraints, such as an OS, hardware, or middleware prescribed for use
  – architectural approaches/styles used to address the quality attribute requirements
• The evaluation team begins probing for and capturing risks.
20
4. IDENTIFY ARCHITECTURAL APPROACHES*
Goal: understand the architecture.
• Start to identify parts of the architecture that are key for realizing quality attribute goals.
• Identify any predominant architectural styles, tactics, guidelines & principles.
• Examples: 3-tier, client-server, watchdog, redundant hardware.
* The approaches are not yet analyzed in this step.
21
5. GENERATE QUALITY ATTRIBUTE UTILITY TREE
Goal: prioritize the requirements.
• Identify, prioritize, and refine the most important quality attribute goals by building a utility tree:
  – A utility tree is a top-down vehicle for characterizing the “driving” attribute-specific requirements.
  – Select the most important quality goals to be the high-level nodes (e.g. performance, modifiability, security, availability).
  – Scenarios are the leaves of the utility tree.
• Output: a characterization and a prioritization of specific quality attribute requirements.
22
EXAMPLE UTILITY TREE
Utility
• Performance
  – Data latency: Reduce storage latency on customer DB to < 200 ms; Deliver video in real time
  – Transaction throughput
• Modifiability
  – New product categories: Add CORBA middleware in < 20 person-months
  – Change COTS: Change web user interface in < 4 person-weeks
• Availability
  – H/W failure: Power outage at site 1 requires traffic redirected to site 2 in < 3 seconds; Restart after disk failure in < 5 minutes
  – COTS S/W failures: Network failure detected and recovered in < 1.5 minutes
• Security
  – Data confidentiality: Customer DB authorization works 99.999% of the time
  – Data integrity: Credit card transactions are secure 99.999% of the time
Each leaf is rated (Importance, Achievability), with H = High, M = Medium, L = Low.
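A utility tree like this can be captured as plain data so its leaves can be sorted for analysis. A hypothetical sketch (the encoding, field names, and "high importance, hard to achieve first" ordering are illustrative choices, not part of ATAM):

```python
from dataclasses import dataclass

RANK = {"H": 3, "M": 2, "L": 1}  # High / Medium / Low

@dataclass
class Leaf:
    attribute: str      # high-level quality attribute node
    refinement: str     # attribute refinement under that node
    scenario: str       # concrete scenario at the leaf
    importance: str     # H / M / L
    achievability: str  # H / M / L

leaves = [
    Leaf("Performance", "Data latency",
         "Reduce storage latency on customer DB to < 200 ms", "H", "M"),
    Leaf("Modifiability", "Change COTS",
         "Change web user interface in < 4 person-weeks", "M", "L"),
    Leaf("Availability", "H/W failure",
         "Restart after disk failure in < 5 minutes", "H", "L"),
]

# Analyze high-importance, hard-to-achieve (low-achievability) leaves first.
ordered = sorted(
    leaves,
    key=lambda leaf: (RANK[leaf.importance], -RANK[leaf.achievability]),
    reverse=True,
)
print([leaf.refinement for leaf in ordered])
```

With this ordering, the (H,L) disk-failure leaf comes before the (H,M) latency leaf, matching the intent of the prioritization step.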
23
SCENARIOS
• Scenarios are used to:
  – represent stakeholders' interests
  – understand quality attribute requirements
• Scenarios should cover a range of:
  – Use case scenarios: anticipated uses
  – Evolution scenarios: anticipated changes, e.g. growth
  – Exploratory scenarios: unanticipated stresses to the system
24
EXAMPLE SCENARIOS
• Use case scenario: A remote user requests a database report via the Web during peak period and receives it within 5 seconds.
• Evolution scenario: Add a new data server during peak hours within a downtime of at most 8 hours.
• Exploratory scenario: Half of the servers go down during normal operation without affecting overall system availability.
25
STIMULI-ENVIRONMENT-RESPONSES: a ‘formula’ for scenarios
A good scenario makes clear what the stimulus is and what the measurable response of interest is.
• Use case (performance) scenario: A remote user requests a database report via the Web during peak period and receives it within 5 seconds.
• Growth scenario: Add a new data server during peak hours within a downtime of at most 8 hours.
• Exploratory scenario: Half of the servers go down during normal operation without affecting overall system availability.
General Qual. Attr. Scenario for Scalability
• Source: system owner
• Stimulus: request to accommodate more concurrent users (usage parameter)
• Artifact: the system, incl. computing platforms
• Environment: normal operation, design/run time
• Response: add extra memory/servers (architectural parameters)
• Response measure: cost of additional hardware, change in performance
26
Specific Qual. Attr. Scenario for Scalability
• Source: system owner
• Stimulus: request to accommodate five times more concurrent users over the next two years
• Artifact: the main server cluster
• Environment: normal operation
• Response: increase the number of servers no more than sixfold, without recompiling the software
• Response measure: performance, measured as the average number of typical requests processed per minute, may not drop more than 10%
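A six-part scenario in this form can be recorded as a simple structure, with the response measure as a checkable predicate. A sketch under assumed names (the class and function are invented for illustration, not part of ATAM):

```python
from dataclasses import dataclass

@dataclass
class QAScenario:
    source: str
    stimulus: str
    artifact: str
    environment: str
    response: str
    response_measure: str

scenario = QAScenario(
    source="system owner",
    stimulus="accommodate 5x more concurrent users over the next two years",
    artifact="main server cluster",
    environment="normal operation",
    response="increase number of servers at most sixfold, no recompilation",
    response_measure="average requests/minute may not drop more than 10%",
)

def measure_met(baseline_rpm: float, scaled_rpm: float,
                max_drop: float = 0.10) -> bool:
    """Check the response measure: throughput drop stays within 10%."""
    return scaled_rpm >= baseline_rpm * (1.0 - max_drop)

print(measure_met(1000.0, 930.0))  # 7% drop: within the 10% bound -> True
```

Writing the measure as a predicate makes the scenario's pass/fail criterion explicit and testable once measurements exist.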
27
28
SAAM*: Rank Architectures

Architecture | Scenario 1 | Scenario 2 | Scenario 3 | Scenario 4 | Contention
X            | 0          | 0          | -          | -          | -
Y            | 0          | 0          | +          | +          | +
Z            | -          | +          | 0          | +          | -

Summary of suitability. (* Software Architecture Analysis Method)
29
6. ANALYZE ARCHITECTURAL APPROACHES
Goal: analyze the architecture.
The evaluation team probes architectural approaches w.r.t. specific quality attributes to identify risks:
• Identify the approaches that pertain to the highest-priority quality attribute requirements
• Generate quality-attribute-specific questions for the highest-priority quality attribute requirements
• Ask the quality-attribute-specific questions
• Identify and record risks and non-risks, sensitivity points, and tradeoffs
30
Sensitivity Point
A sensitivity point is a parameter of the architecture to which some quality attribute is highly related.
Example: a system requires high performance, and suppose throughput depends on one channel between Subsystem 1 and Subsystem 2. Then increasing the channel speed increases performance.
31
Trade-off Point
A trade-off point is a parameter of the architecture that affects multiple quality attributes in opposite directions.
Example: a system requires high performance, high reliability, and high security, with Subsystem 1 and Subsystem 2 communicating over a channel.
• Increasing the channel speed increases performance but decreases reliability.
• Increasing encryption increases security but decreases performance.
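The opposite-direction effect can be illustrated with a toy model (all numbers and formulas here are invented for illustration):

```python
# One architectural parameter (encryption key length) moves two quality
# attributes in opposite directions: a trade-off point.
def security_score(enc_bits: int) -> float:
    """More key bits -> higher (toy) security score, normalized to 256 bits."""
    return enc_bits / 256.0

def throughput(enc_bits: int, base_mbps: float = 100.0) -> float:
    """Crypto overhead costs channel speed (toy overhead model)."""
    return base_mbps / (1.0 + enc_bits / 128.0)

for bits in (0, 128, 256):
    print(bits, security_score(bits), round(throughput(bits), 1))
# As bits grow, security_score rises while throughput falls.
```

An ATAM evaluation records such a parameter as a trade-off point so the stakeholders, not the model, decide where to set it.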
32
QUALITY ATTRIBUTE QUESTIONS
• Quality attribute questions elicit architectural decisions that bear on quality attribute requirements.
• Example: Performance
  – How are priorities assigned to processes?
  – What are the message arrival rates?
• Example: Modifiability
  – Are there any places where layers/facades are circumvented?
  – What components rely on detailed knowledge of message formats?
33
Characterization of Availability
From CMU/SEI-2000-TR-004: ATAM: Method for Architecture Evaluation.
Not complete, but:
• a framework for thinking
• helps ensure coverage
• helps elicit questions
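Availability figures like the "99.999%" in the utility-tree example translate directly into allowed downtime, which is the arithmetic an evaluator checks a scenario against:

```python
# Allowed downtime per year for common availability levels ("nines").
MIN_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

for availability in (0.999, 0.9999, 0.99999):
    downtime_min = (1 - availability) * MIN_PER_YEAR
    print(f"{availability:.5%} -> {downtime_min:.1f} min downtime/year")
# "Five nines" (99.999%) allows only about 5.3 minutes of downtime per year.
```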
34
Characterization of Modifiability
35
RISKS and NON-RISKS
• Risks are potentially important architectural decisions that may cause problems.
• Non-risks are good decisions, frequently relying on implicit assumptions.
• Constituents of a risk or non-risk: architectural decision, quality attribute requirements, rationale.
• Sensitivity points are candidate risks.
36
EXAMPLE RISK and NON-RISK
• Example risk: Rules for writing business logic modules in the second tier of your 3-tier style are not clearly articulated. This could compromise modifiability.
  – A quality concern is not addressed / has not been analyzed
  – Some tactics interact in an unknown manner
• Example non-risk: Assuming message arrival rates of once per second, a processing time of less than 30 ms, and the existence of one higher-priority process, a 1-second soft deadline seems reasonable.
• Risks may also occur in the project organization, e.g. engineer X has never heard of requirement Y.
37
7. BRAINSTORM AND PRIORITIZE SCENARIOS
• Stakeholders generate scenarios using a facilitated brainstorming process; examples are used to facilitate the step.
• The new scenarios are added to the leaves of the utility tree.
Essentially a process step:
• include a larger group of stakeholders
• extend consensus (esp. on priorities)
• extend confidence in the completeness of the scenarios
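The prioritization is typically done by having each stakeholder spend a fixed budget of votes on the brainstormed scenarios. A sketch of tallying such a vote (stakeholder names and ballots are invented; the exact voting budget varies by source):

```python
from collections import Counter

scenarios = ["S1", "S2", "S3", "S4", "S5"]

# Each stakeholder casts two votes; votes may be stacked on one scenario.
ballots = {
    "architect":  ["S1", "S1"],
    "maintainer": ["S2", "S3"],
    "customer":   ["S1", "S4"],
}

tally = Counter(vote for votes in ballots.values() for vote in votes)
ranked = [s for s, _ in tally.most_common()]
print(ranked)  # S1 leads with 3 votes
```

The highest-ranked scenarios then drive the second round of analysis in step 8.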
38
8. ANALYZE ARCHITECTURAL APPROACHES
• Identify the architectural approaches impacted by the scenarios generated in the previous step
• This step continues the analysis started in step 6 using the new scenarios
• Continue identifying risks and non-risks
• Continue annotating architectural information
Essentially a process step:
• include a larger group of stakeholders
• extend consensus
• extend confidence in the completeness of the scenarios
39
9. PRESENT RESULTS
• Recapitulate the steps of the ATAM
• Present the ATAM outputs:
  – Architectural approaches
  – Utility tree
  – Scenarios
  – Risks and “non-risks”
  – Sensitivity points and tradeoffs
40
ATAM BENEFITS
• ATAM ‘buys’ time to think about an architecture, while development processes are often under time pressure
• Identification of risks early in the life cycle
• Focus on features that are essential for the stakeholders, not on technical details
• Improved architecture documentation
• Forces stakeholders to think about and prioritize quality requirements
• Documented basis for architectural decisions
The result is improved architectures.
41
ATAM – Cost/Benefit
• Cost
  – 1–2 weeks of time for 8–10 highly paid people, plus 2 days for another 10–12 people (for the full formal process!)
  – Delays project start
  – Forces development of the architecture up front
• Benefit
  – Financial: saves money
  – Forces preparation / documentation / understanding
  – Captures rationale
  – Catches architectural errors before they are built
  – Makes sure the architecture meets the scenarios
  – A more general, flexible architecture
  – Reduces risk
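The cost figures above translate into rough person-day arithmetic. A sketch with an assumed day rate (the rate and the midpoint head-counts are placeholders; adjust to your organization):

```python
# Rough cost of a full formal ATAM, using midpoints of the slide's ranges.
DAY_RATE = 1000.0        # assumed cost per person-day (placeholder)

core_team = 9 * 7.5      # ~8-10 people for ~1.5 weeks (7.5 working days)
other_staff = 11 * 2     # ~10-12 people for 2 days
person_days = core_team + other_staff

print(f"~{person_days:.0f} person-days, ~EUR {person_days * DAY_RATE:,.0f}")
```

Weighing this against the cost of a single late architectural rework is what the "Financial: saves money" benefit claims.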
42
ATAM WEAKNESSES
• Subjective judgement that depends on the experience of the participants
• No guidelines for the definition of useful change cases
• Risk of check-list thinking
Certification: LaQuSo Software Product Certification Model
43
• Consistency, Functional, Behavioral, Quality, Compliance
• Certification Criteria: Formality, Uniformity, Conformance
• 6 Product Areas: Context Description, User Requirements, High-level Design, Detailed Design, Implementation, Tests
• Results in an Achievement Level
Relationship between design & evaluation
• Why (not) use the same method for design and evaluation?
• One should not design merely to pass the evaluation; an evaluation is often limited.
• If the evaluation includes important considerations, then these should also have played a role in the design.
• Evaluation is part of design, not an add-on.
44