
Page 1:

Software Design Evaluation: Vision and Tools

The PUMA project "Performance by Unified Model Analysis"

Murray Woodside, Dorina Petriu

Carleton University

what sort of evaluation? how can it be done easily?

UML tool integration; PUMA project ideas; benefits

www.sce.carleton.ca/rads/puma/

Page 2:

Why Design Evaluation?

System perspective: early, integrated; tradeoffs between parts

Planning view: understandable by many stakeholders and by groups with subsystem concerns; maintain a view over the life cycle

Aligned with Model-Driven Design/Architecture: trend to higher-level development techniques; trend to use of predefined components (generative techniques)

Page 3:

Six capabilities

“Profile the design”: how often are objects called/created, messages sent?

Estimate resource loadings: heads-up on saturation

Budgeting for resource use and delay: create and verify budgets given to subsystem developers

Estimate performance: estimate response time and achievable throughput, and compare to requirements; point estimates, sensitivity

Analyze scale-up and find bottlenecks: intrinsic limitations in the design; go beyond the limits of the lab

Improve scalability

Page 4:

“Profile the design”

Estimate the numbers of events per system response: calls, requests, and messages; relative frequencies of paths

Based on estimates of call multipliers at each component: for one entry to the component, how many of these will occur? This “local” knowledge is more easily grasped; simple math does the rest.

The simplest numbers: a kind of accounting model for events, as in the chain below.

[Diagram: a call chain A → B → C → D with call multipliers [*5], [*7], [*10]; 10/sec entering A becomes 3500/sec at D.]
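In Python, a minimal sketch of this accounting (the rates and multipliers come from the diagram; the code itself is purely illustrative):

# Event-accounting sketch for the chain A -> B -> C -> D shown above.
# Each multiplier says how many downstream calls one entry generates.
arrival_rate = 10.0                     # responses/sec entering A
chain = [("A -> B", 5), ("B -> C", 7), ("C -> D", 10)]

rate = arrival_rate
for edge, multiplier in chain:
    rate *= multiplier                  # local knowledge, simple math
    print(f"{edge}: {rate:,.0f} calls/sec")
# final line: C -> D: 3,500 calls/sec, matching the diagram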

Page 5:

Estimate Resource loadings by component

Estimate utilization (0 to 100%) = (frequency of resource use) × (holding time); the frequency comes from the accounting model. Holding times may be estimated by judgment, or budgeted, or found from a model.

CPU loading: % utilization.

Logical resources (thread pool, buffer pool): need to estimate holding times, which generally needs a model, or use a budget value for holding time.
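A back-of-envelope check under assumed numbers (the 3500/sec frequency is carried over from the page 4 example; the 0.2 ms holding time is hypothetical):

# Utilization = frequency of resource use * holding time.
call_rate = 3500.0          # uses/sec, from the accounting model on page 4
holding_time_s = 0.0002     # assumed (or budgeted) 0.2 ms held per use

utilization = call_rate * holding_time_s
print(f"utilization: {utilization:.0%}")   # 70% -- an early heads-up on saturation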

Page 6:

Budgets for resource use and delay

Budgeting is a shift in viewpoint: instead of trying to guess or predict, we set targets. Budgeting is a familiar problem, with familiar uncertainties. Achieving the targets is a separate job, delayed until implementation, and may involve budget adjustments.

A management tool that allows a developer to focus on his own part of the system and still work towards achieving a performance plan
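A toy sketch of the create-and-verify step; the subsystem names and all numbers are invented for illustration:

# Hypothetical delay budgets handed to subsystem developers; verifying them
# against the end-to-end target is the "create and verify" step.
target_ms = 1000.0                       # assumed end-to-end response target
budgets_ms = {"front end": 250.0, "buffer wait": 150.0,
              "storage": 400.0, "slack": 200.0}

total = sum(budgets_ms.values())
assert total <= target_ms, "budgets exceed the performance plan"
print(f"budgeted {total:.0f} ms of the {target_ms:.0f} ms target")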

Page 7:

Estimate performance

A point estimate for delay or throughput at a given set of parameter values: an average, or a probability of meeting a target.

A model may be used to explore parameter changes and uncertainties by multiple runs: design sensitivity, data sensitivity, environment sensitivity, sensitivity to user behaviour.

[Plot: %miss as a function of parameter1 and parameter2.]
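As a stand-in for such multiple runs, a sweep over a textbook M/M/1 response-time formula (not a PUMA model; every number is invented):

# Sensitivity sweep: re-solve a simple model at several parameter values.
service_time_s = 0.005                    # assumed mean service time (5 ms)

for arrival_rate in (50, 100, 150, 190):  # requests/sec, the swept parameter
    rho = arrival_rate * service_time_s   # server utilization
    resp_ms = 1000 * service_time_s / (1 - rho)   # M/M/1 mean response time
    print(f"lambda={arrival_rate:>3}/s  util={rho:.0%}  response={resp_ms:5.1f} ms")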

Page 8:

Analyze scale-up, find bottlenecks

bottlenecks = limiting factors on increasing the workload

study replication strategies, protocol changes, movements of the bottleneck

study: “what if a certain function could be sped up 20%?”

study large system configurations that can’t be reproduced in the lab.

Page 9:

Improve scalability

By removing bottlenecks in the model: successive limits, similar to work in the lab but faster; subject of several studies

By scalability strategies involving replicated resources

Software redesign and re-architecture can reduce logical resource holding times or frequencies

Page 10:

PUMA project origins

Perceive: that evaluation is a worthwhile goal; difficulties in getting parameters are real but can be overcome

Fact: we have scalable performance modelling tools, but creating models is too hard

Trend to higher-level tools for software development: acceptance of UML, MDA, etc.; generative programming

Conclude: we can piggyback performance prediction on design capture in UML

PUMA: Performance by Unified Model Analysis (NSERC Strategic Grant Project)

Page 11:

PUMA

Build models from the UML design specifications, dealing with real limitations like incompleteness of the UML document

Transform automatically (or nearly?) to a model: we use “layered queueing”; others are to be supported

Assisted exploration of the design space (big question)

Designer feedback and design suggestions (big question)

Elements of a “Performance Engineering Assistant”: open-ended potential for integration with other tools, other analyses

Page 12:

PUMA: Performance Information Processing overview

[Diagram: the PUMA tool chain. U = UML Design (deployment; scenarios as sequence and activity diagrams; components) feeds, via the U2S transformation, S = Core Scenario Model (gathers and organizes structure and data). S2P then produces P = Performance Model (layered queues, Petri nets), combined with platform and environment submodels and plans for experiments. Solving P yields R = Results of analysis, with feedback paths R2P, R2S, and R2U back to the performance model, the scenario model, and the UML design.]
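The chain of stages can be pictured as a composition of transformations. The sketch below is only a Python skeleton: the stage names u2s, s2p, and solve come from the diagram, while the data shapes are invented:

# Skeleton of the U -> S -> P -> R chain from the diagram (shapes invented).
def u2s(uml_design):
    """UML design (deployment, scenarios, components) -> Core Scenario Model."""
    return {"steps": uml_design["scenarios"], "resources": uml_design["deployment"]}

def s2p(csm, platform_submodels):
    """Core Scenario Model + platform/environment submodels -> performance model."""
    return {"kind": "layered queueing", "tasks": csm["resources"], "workload": csm["steps"]}

def solve(perf_model):
    """Run a solver over the performance model; returns results of analysis."""
    return {"throughput": None, "utilizations": None}   # placeholder results

uml = {"scenarios": ["Acquire/Store Video"], "deployment": ["Applic CPU", "DB CPU"]}
results = solve(s2p(u2s(uml), platform_submodels={}))   # feedback R2U/R2S/R2P closes the loop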

Page 13:

Correspondence between a UML Spec and its Performance Model

A Building Security System (BSS) example:

UML description with performance annotations, using the standard profile for schedulability, performance and time (SPT)

Layered queueing model

Sample results

Page 14:

Performance Profile: the domain model

[Class diagram of the SPT performance domain model. A PerformanceContext contains PScenarios (hostExecDemand, responseTime), PResources, and Workloads (responseTime, priority); Workload is specialized by ClosedWorkload (population, externalDelay) and OpenWorkload (occurrencePattern). Each PScenario is an {ordered} sequence of PSteps (probability, repetition, delay, operations, interval, executionTime) linked by +predecessor/+successor associations, with one +root step. PResource (utilization, schedulingPolicy, throughput) is specialized by PProcessingResource (processingRate, contextSwitchTime, priorityRange, isPreemptible), the +host of a step, and PPassiveResource (waitingTime, responseTime, capacity, accessTime).]

Page 15:

Case Study – Building Security System (BSS)

[Use case diagram. Actors: User, Manager, Database, Video Camera. Use cases: Access control and Log entry/exit (joined by an <<includes>> relationship), Acquire/store video, Manage access rights.]

Page 16:

Deployment Diagram of BSS

Page 17:

Sequence Diagram for Video Acquire/Store Scenario

[Annotated sequence diagram (a <<PAcontext>>). Participants, each a <<PAresource>>: VideoController, AcquireProc, BufferManager (the object that manages the resource Buffer), StoreProc, and Database {PAcapacity=10}. A <<PAclosedLoad>> {PApopulation = 1, PAinterval = ((‘req’,’percentile’,95,(1,‘s’)), (‘pred’,’percentile’,95,$Cycle))} drives the outer loop *[$N] procOneImage(i), a <<PAstep>> with {PArep = $N}. Within each iteration: <<GRMacquire>> allocBuf(b) and getBuffer(), getImage(i,b) with {PAdemand=(‘asmd’,‘mean’,($P*1.5,‘ms’)), PAextOp=(network,$P)}, passImage(i,b), storeImage(i,b), writeImg(i,b) with {PAdemand=(‘asmd’,‘mean’,($B*0.9,‘ms’)), PAextOp=(writeBlock,$B)}, store(i,b), then <<GRMrelease>> releaseBuf(b) and freeBuf(b). The remaining <<PAstep>> annotations give assumed mean demands of 1.5, 1.8, 0.5, 0.5, 0.9, 1.1, 2, and 0.2 ms.]

Page 18:

Sequence Diagram for Video Acquire/Store Scenario

[Zoomed fragment of the same diagram: VideoController, AcquireProc, and BufferManager (the object that manages the resource Buffer), showing the <<PAclosedLoad>> annotation, the *[$N] procOneImage(i) step with {PArep = $N} and {PAdemand=(‘asmd’,‘mean’,(1.5,‘ms’))}, and <<GRMacquire>> allocBuf(b)/getBuffer() with {PAdemand=(‘asmd’,‘mean’,(0.5,‘ms’))}.]

Page 19:

Sequence Diagram for Access Control Scenario

[Annotated sequence diagram (a <<PAcontext>>). Participants: User, CardReader, DoorLock, Alarm, Access Controller, Database, Disk. A <<PAopenLoad>> {PAoccurrencePattern = (‘poisson’, 120, ‘s’), PArespTime = ((‘req’,’percentile’,95,(1,‘s’)), (‘pred’,’percentile’,95,$RT))} drives the scenario: readCard, admit(cardInfo), getRights()/readRights(), [not_in_cache] readData() with {PAextOp = (read, 1)} at the Disk, checkRights(), then [OK] openDoor() or [not OK] alarm() {PAprob = 0}, [need to log?] logEvent(), writeRec(), writeEvent(), and finally enterBuilding. <<PAstep>> demands (asmd means) range from 0.2 to 3 ms, with branch probabilities PAprob = 0.4 on the cache-miss read and PAprob = 0.2 on logging, and {PAextOp = (network, 1)} on one 1.8 ms step.]

Page 20:

Layered Queueing Network (LQN) model http://www.sce.carleton.ca/rads/lqn/lqn-documentation

Advantages of LQN modeling: models layered and logical resources in a natural way; gives insight into resource dependencies; scales up well for large systems

What can we get from the LQN solver: service time (mean, variance), waiting time, probability of missing a deadline, throughput, utilization, confidence intervals

[Example LQN: a ClientT task with entry clientE, running on ClientCPU, calls the DBWrite and DBRead entries of a DB task on DBCPU, which in turn call the DKWrite [1, 10] and DKRead entries of a Disk task on DBDisk.]
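As a structural illustration only (this is not LQNS input syntax, and the demand and call numbers are invented), the example can be written down as processors hosting tasks, tasks owning entries, and entries calling entries in lower layers:

# Structural sketch of the small LQN above: processors host tasks, tasks own
# entries, and entries make synchronous calls to entries of lower layers.
lqn = {
    "ClientT": {"processor": "ClientCPU",
                "entries": {"clientE": {"demand_ms": 1.0,             # invented
                                        "calls": {"DBWrite": 1, "DBRead": 2}}}},  # invented counts
    "DB":      {"processor": "DBCPU",
                "entries": {"DBWrite": {"demand_ms": 0.5, "calls": {"DKWrite": 1}},
                            "DBRead":  {"demand_ms": 0.3, "calls": {"DKRead": 1}}}},
    "Disk":    {"processor": "DBDisk",
                "entries": {"DKWrite": {"demand_ms": 10.0, "calls": {}},
                            "DKRead":  {"demand_ms": 8.0,  "calls": {}}}},
}

def total_demand_ms(entry_name, model=lqn):
    """Naive end-to-end demand: own demand plus the demand of all nested calls."""
    for task in model.values():
        if entry_name in task["entries"]:
            e = task["entries"][entry_name]
            return e["demand_ms"] + sum(n * total_demand_ms(callee)
                                        for callee, n in e["calls"].items())

print(f"{total_demand_ms('clientE'):.1f}")   # 1.0 + (0.5+10.0) + 2*(0.3+8.0) = 28.1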

Page 21:

Using the Model to Study and Improve a Design

[The same U → S → P → R tool-chain diagram as on page 12, here read as the loop for studying and improving a design: solve, inspect results, and feed changes back through R2P, R2S, and R2U.]

Page 22:

LQN Model

[LQN model of the BSS. Reference tasks: VideoController (entry acquireLoop [1.8], issuing ($N) calls per cycle) and Users (rate = 0.5/sec) through CardReader (readCard [1, 0]). Tasks and entries, with [phase-1, phase-2] demands in ms: AcquireProc: procOneImage [1.5, 0]; BufferManager: alloc [0.5, 0]; Buffer: bufEntry; AcquireProc2: getImage [12, 0] and passImage [0.9, 0]; StoreProc: storeImage [3.3, 0]; BufMgr2: releaseBuf [0.5, 0]; AccessController: admit [3.9, 0.2]; DataBase (10 threads): writeImg [7.2, 0], readRights [1.8, 0], writeRec [3, 0], writeEvent [1.8, 0]; Disk (2 threads): writeBlock [1, 0], readData [1.5, 0]; Lock: lock [0, 500]; Alarm: alarm [0, 0]; Network (infinite): network [0, 1]. Request arcs carry (phase-1, phase-2) call counts, e.g. ($P, 0) to network, ($B, 0) to writeBlock, (0.4, 0) to readData, (0, 0.2) to writeRec, plus some forwarded calls. Processors: Applic CPU, DB CPU, DiskP, LockP, AlarmP, CardP, UserP, NetP, and a Dummy processor.]

Page 23:

Using the Model: Improvements to BSS

Base case: system capacity 20 cameras; software bottleneck at the buffers

Adding software resources (4 Buffers and 2 StoreProc threads): 40 cameras, performance improvement 100%; hardware bottleneck at the processor

Replicating the processor (dual Application CPU): 50 cameras, performance improvement 150%

Increasing concurrency (moving a first-phase call to second phase): 100 cameras, performance improvement 400%
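The improvement figures are plain ratios against the 20-camera base case:

# Capacity improvements relative to the 20-camera base case.
base = 20
for change, cameras in [("4 buffers + 2 StoreProc threads", 40),
                        ("dual Application CPU", 50),
                        ("first-phase call moved to second phase", 100)]:
    print(f"{change}: {cameras} cameras, +{(cameras - base) / base:.0%}")
# +100%, +150%, +400%, as on the slide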

Page 24:

Feedback: Modified BSS Sequence Diagram

[The Video Acquire/Store sequence diagram of page 17, re-annotated with the multiplicities suggested by the analysis: AcquireProc {PAcapacity = 3}, StoreProc {PAcapacity = 6}, Database {PAcapacity = 10}; the messages and <<PAstep>> demand annotations are otherwise unchanged.]

Page 25:

Scalability study of a telecom system: deployment diagram

[Deployment diagram: a ServiceBuilder node (<<multiprocessor>>) and a Database node, connected by a <<LAN>> of s bit/sec. A ServiceBuilder component instance contains several concurrent high-level objects.]

Page 26:

Telecommunication system architecture

[Component diagram of ServiceBuilder: a replicated <<process>> RequestHandler (1..n); a <<process>> IO with IOin and IOout; a <<process>> Stack with StackIn and StackOut; a doubleBuffer with inBuffer and outBuffer; shared memories ShMem1 (alloc() {sequential}, free() {sequential}) and ShMem2 (update() {sequential}); and a <<process>> DataBase.]

Page 27:

Architectural Design Patterns

[The same component diagram with architectural patterns overlaid: PIPELINE WITH BUFFER (UpStrmFilter, DownStrmFilter, Buffer) on the IO/Stack/doubleBuffer paths; PIPELINE WITH MESSAGE (UpStrmFilter, DownStrmFilter); CRITICAL SECTION (Accessor, Shared) on ShMem1 and ShMem2; CLIENT SERVER (Client, Server) on the DataBase.]

Page 28:

Scenario with performance annotations

[Annotated sequence diagram. A <<PAclosedLoad>> {PAPopulation = $Nusers, PAextDelay = (‘asgn’, ‘mean’, (20, ‘ms’))} drives the request path: StackIn:client enqueue(req)/input(req) → IOin reserve(), write(req) on inBuffer, signal(awake) → ReqHandler read(req), process(req), write(result) on outBuffer, signal(awake) → IOout read(result), pass(result) → StackOut sendBack(result). Details of process(req): ReqHandler calls alloc() on ShMem1, interpret(script) and get(script) against the DataBase, update() on ShMem2, then free(). Each step carries a measured demand, <<PAstep>> {PAdemand = (‘meas’, ‘mean’, ...)}, with means between 0.105 and 1.998 ms.]

Page 29:

Average execution times per request

[Bar chart: execution time demands per system request (0–5 msec) for StackIn, StackOut, IOin, IOout, RequestHandler, DataBase, and the TOTAL, with each bar split into non-critical-section time and critical-section time for the Buffer, ShMem1, and ShMem2.]

Page 30:

LQN model for the telecom system

[LQN diagram for the telecom system. Tasks include StackIn, StackOut, IOin and IOout (entries pull, push), the replicated RequestHandler, the Buffer semaphore, ShMem1 (entries alloc, free), ShMem2 (entry update), the DataBase, and the execution tasks IOexec and StackExec; processors are Proc, ProcDB, and a Dummy Proc.]

Page 31:

Max throughput for 1, 4 and 6-processor configurations

[Bar chart: maximum throughput (requests/second, 0–1200) vs. replication factor of the RequestHandler (RH), for configurations of n = 1, 4, and 6 processors, with bars for (n-1), n, (n+1), and (n+2) RHs.]

Page 32:

Base Case, 1-processor: hardware bottleneck

[Chart: 1-Processor Configuration, Base Case. Utilization (0–1) vs. number of RequestHandlers (1–4), with curves for RequestHandler, Processor, IOExec, StackExec, and Database; the Processor saturates first.]

Page 33:

Base Case, 4-processor: software bottleneck

[Chart: 4-Processor Configuration, Base Case. Utilization (0–1) vs. number of RequestHandler replications (2–20), with curves for IOout, Processor, IOExec, StackExec, Database, and RequestHandler; IOout saturates.]

Page 34:

Base Case, 6-processor: stronger software bottleneck

[Chart: 6-Processor Configuration, Base Case. Utilization (0–1) vs. number of RequestHandler replications (5–18), with curves for IOout, IOexec, Processor, StackExec, DataBase, and RequestHandler.]

Page 35:

What is the bottleneck task doing?

[Two utilization-breakdown charts vs. number of RequestHandler replications (3–12). Left, RequestHandler: useful work by each RequestHandler, useful work by others on its behalf, and time spent busy while waiting for nested services. Right, IOout: useful work (read from buffer), IOout waiting for IOexec, and IOexec waiting for the semaphore / processor.]

RequestHandler is doing 2.75 ms of useful work on behalf of a system request. However, it is not the system bottleneck.

IOout is doing only 0.12 ms of useful work on behalf of a system request. Nonetheless, it is the system bottleneck.
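The general point: a task saturates on its total busy time, useful work plus blocking, not on its useful work alone. A hedged sketch (only the 0.12 ms and 2.75 ms figures come from the charts; the rate, waiting times, and replication counts are invented):

# Utilization = request rate * (useful work + time blocked on nested services),
# divided across the task's copies.
rate_per_s = 300.0                                        # invented operating point
tasks = {                                                  # per-request busy time, ms
    "IOout":          {"useful": 0.12, "waiting": 3.1, "copies": 1},   # waiting invented
    "RequestHandler": {"useful": 2.75, "waiting": 0.4, "copies": 8},   # invented
}
for name, t in tasks.items():
    util = rate_per_s * (t["useful"] + t["waiting"]) / 1000.0 / t["copies"]
    print(f"{name}: utilization per copy {util:.2f}")
# IOout: 0.97 (saturated, almost all of it waiting); RequestHandler: 0.12 per
# copy -- which is why the task with tiny useful work is still the bottleneck.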

Page 36:

Modified system architecture

The original software architecture suffers from serialization constraints at the level of the IO process, the Stack process, and the doubleBuffer.

Eliminate the serialization constraints in two steps: run each pipeline filter in its own process (thread of control), and split the pipeline buffer into two separate buffers, each controlled by its own semaphore (see the sketch after the diagram).

[Modified component diagram of ServiceBuilder: StackIn, StackOut, IOin, and IOout are now each their own <<process>>, and the doubleBuffer is split into inBuffer and outBuffer; RequestHandler (1..n), ShMem1 (alloc() {sequential}, free() {sequential}), ShMem2 (update() {sequential}), and the <<process>> DataBase are unchanged.]
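A minimal sketch of the second step, with Python threading primitives standing in for the real processes and semaphores (all names invented):

import threading

# Before: one doubleBuffer guarded by a single semaphore serializes the
# inbound and outbound pipeline stages against each other.
# After: two independent buffers, each with its own lock, so the inbound
# and outbound filters no longer serialize on a shared resource.
class Buffer:
    def __init__(self):
        self.lock = threading.Lock()   # per-buffer semaphore
        self.items = []

    def write(self, item):
        with self.lock:
            self.items.append(item)

    def read(self):
        with self.lock:
            return self.items.pop(0) if self.items else None

in_buffer, out_buffer = Buffer(), Buffer()   # the split of doubleBuffer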

Page 37:

Modified system, 4-processor configuration: software bottleneck eliminated

[Chart: 4-processor configuration, fully modified system. Utilization (0–1.2) vs. number of RH replications (2–20), with curves for Processor, DataBase, RequestHandler, IOout, IOin, StackIn, and StackOut.]

Page 38:

Modified system, 6-processor configuration: a new software bottleneck is emerging (DataBase)

[Chart: 6-Processor Configuration, fully modified system. Utilization (0–1) vs. number of RequestHandler replications (5–18), with curves for RequestHandler, Processor, DataBase, IOout, StackOut, IOin, and StackIn.]

Page 39:

Research on the SPT Profile

The SPT profile needs to be adapted to UML 2, with many changes to behaviour specification, and aligned with the new QoS profile, which describes how to specify QoS measures.

Enhancements can be made: to harmonize the performance and schedulability aspects; to provide a more formal domain model; to annotate components; possible changes to measures and parameters (the topic of a workshop in Toronto in May).

We will be participating with several other groups.

Page 40:

PUMA will enhance Model-based Development

We see a paradigm shift in the direction of MDA or MDD (Model-Driven Development): more emphasis on work on models, with generation of code and system artifacts.

Model evaluation/verification becomes more important: performance verification supports this trend, and may provide a leading example for other non-functional verification, such as reliability.

Page 41:

PUMA opens new prospects...

Libraries of submodels for platform-specific aspects

Instrument and calibrate generated code as it is produced

Use the performance model to manage later stages of performance engineering: testing, debugging

Adapt the transformations for other kinds of evaluation, based on the approach of the QVT (Query, View, Transformation) standard: introduced for transforming platform-independent specs to platform-specific versions, but capable of much more...

Page 42:

Research on Transformation Tools and QVT

At present our transformation tools are ad hoc creations based on the specific analysis, but QVT seems to be applicable: the Profile annotations support a scenario view; the Core Scenario Model is a first target model, with its own metamodel described in MOF; the performance model is a second target: can it be described in MOF?

Explore the use of QVT: most uses are only UML-to-UML, and this is more general; it may help in defining QVT; it may support other transformations for evaluation.