ITEC 451
Network Design and Analysis
You will Learn: (1)
Specifying performance requirements
Evaluating design alternatives
Comparing two or more systems
Determining the optimal value of a parameter (system tuning)
Finding the performance bottleneck (bottleneck identification)
You will Learn: (2)
Characterizing the load on the system (workload characterization)
Determining the number and sizes of components (capacity planning)
Predicting the performance at future loads (forecasting)
Basic Terms (1)
System: Any collection of hardware, software, or both
Model: Mathematical representation of a concept, phenomenon, or system
Metrics: The criteria used to evaluate the performance of a system
Workload: The requests made by the users of a system
Basic Terms (2)
Parameters: System and workload characteristics that affect system performance
Factors: Parameters that are varied in a study, especially those that depend on users
Outliers: Values (in a set of measurement data) that are too high or too low compared to the majority
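To make the last term concrete, here is a minimal Python sketch (not from the slides) that flags outliers with Tukey's 1.5 x IQR rule; the latency values are invented for the example.

```python
import statistics

def flag_outliers(data, k=1.5):
    """Flag values more than k * IQR outside the quartiles (Tukey's rule)."""
    q1, _, q3 = statistics.quantiles(data, n=4)
    iqr = q3 - q1
    low, high = q1 - k * iqr, q3 + k * iqr
    return [x for x in data if x < low or x > high]

# Hypothetical latency measurements (ms); one value is suspiciously high.
latencies_ms = [12, 14, 13, 15, 11, 13, 210, 12, 14, 13]
print(flag_outliers(latencies_ms))  # -> [210]
```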
Common Mistakes in Performance Evaluation (1)
1. No Goals (the goals drive the choice of techniques, metrics, and workload)
2. Biased Goals (e.g., to show that OUR system is better than THEIRS)
3. Unsystematic Approach
4. Analysis Without Understanding the Problem
5. Incorrect Performance Metrics
6. Unrepresentative Workload
7. Wrong Evaluation Technique
Common Mistakes in Performance Evaluation (2)
8. Overlook Important Parameters
9. Ignore Significant Factors
10. Inappropriate Experimental Design
11. Inappropriate Level of Detail
12. No Analysis
13. Erroneous Analysis
14. No Sensitivity Analysis
15. Ignore Errors in Input
Common Mistakes in Performance Evaluation (3)
16. Improper Treatment of Outliers
17. Assuming No Change in the Future
18. Ignoring Variability
19. Too Complex Analysis
20. Improper Presentation of Results
21. Ignoring Social Aspects
22. Omitting Assumptions and Limitations
Checklist for Avoiding Common Mistakes (1)
1. Is the system correctly defined and the goals clearly stated?
2. Are the goals stated in an unbiased manner?
3. Have all the steps of the analysis been followed systematically?
4. Is the problem clearly understood before analyzing it?
5. Are the performance metrics relevant for this problem?
6. Is the workload correct for this problem?
Checklist for Avoiding Common Mistakes (2)
7. Is the evaluation technique appropriate?
8. Is the list of parameters that affect performance complete?
9. Have all parameters that affect performance been chosen as factors to be varied?
10. Is the experimental design efficient in terms of time and results?
11. Is the level of detail proper?
12. Is the measured data presented with analysis and interpretation?
Checklist for Avoiding Common Mistakes (3)
13. Is the analysis statistically correct?
14. Has the sensitivity analysis been done?
15. Would errors in the input cause an insignificant change in the results?
16. Have the outliers in the input or output been treated properly?
17. Have the future changes in the system and workload been modeled?
18. Has the variance of the input been taken into account?
Checklist for Avoiding Common Mistakes (4)
19. Has the variance of the results been analyzed?
20. Is the analysis easy to explain?
21. Is the presentation style suitable for its audience?
22. Have the results been presented graphically as much as possible?
23. Are the assumptions and limitations of the analysis clearly documented?
Systematic Approach to Performance Evaluation
1. State Goals and Define the System
2. List Services and Outcomes
3. Select Appropriate Metrics
4. List the Parameters
5. Select Evaluation Techniques
6. Select Workload
7. Design Experiment(s)
8. Analyze and Interpret Data
9. Present the Results
State Goals and Define the System
Identify the goal of the study. Not trivial, but it will affect every decision and choice you make down the road.
Clearly define the system. Where you draw the boundary will:
Dictate the choice of model
Affect the choice of metrics and workload
List Services and Outcomes
Identify the services offered by the system
For each service, identify all possible outcomes
What's the point? These will help in selecting appropriate metrics
Select Appropriate Metrics (1)
These are the criteria for performance evaluation
Desired properties: Specific, Measurable, Acceptable, Realizable, Thorough
Examples?
Prefer metrics that have low variability, are non-redundant, and are complete
Select Appropriate Metrics (2) - Examples -
Successful Service Rate – Throughput
Frequency of Correct Results – Reliability
Being Available When Needed – Availability
Serving Users Fairly – Fairness
Efficiency of Resource Usage – Utilization
Now, How to Measure These?
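One possible answer, as a minimal sketch: each of these metrics can be computed from simple counters collected over an observation window. All the numbers below are invented for illustration.

```python
# Hypothetical counters from a 60-second observation window.
window_seconds = 60.0
total_requests = 10_000   # requests submitted
successful     = 9_950    # requests served correctly
uptime_seconds = 59.4     # time the system was usable
busy_seconds   = 41.2     # time the resource was actually busy

throughput   = successful / window_seconds      # successful services per second
reliability  = successful / total_requests      # fraction of correct results
availability = uptime_seconds / window_seconds  # fraction of time usable
utilization  = busy_seconds / window_seconds    # fraction of time busy

print(throughput, reliability, availability, utilization)
```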
Select Appropriate Metrics (3) - A Classification -
Higher is Better. Examples? Throughput
Lower is Better. Examples? Response time
Nominal is the Best. Examples? Utilization
Select Appropriate Metrics (4) - Criteria for Metric Set Selection -
Low variability: helps reduce the number of runs needed. Advice: avoid metrics that are ratios of two variables (a sketch of why follows this list)
Non-redundancy: helps make the results less confusing and reduces the effort. Try to find a relationship between metrics; if a simple relationship exists, keep only one
Completeness: the set of metrics should cover all the outcomes of interest
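The advice against ratio metrics can be seen numerically: the relative variability (coefficient of variation) of a ratio of two noisy measurements combines both inputs' variabilities, so it exceeds either one alone. A small sketch with made-up uniform noise:

```python
import random, statistics

random.seed(0)
# Two noisy measurements per run (values invented for illustration):
xs = [random.uniform(90, 110) for _ in range(10_000)]   # e.g. bytes moved
ys = [random.uniform(0.8, 1.2) for _ in range(10_000)]  # e.g. elapsed seconds
ratios = [x / y for x, y in zip(xs, ys)]                # per-run "throughput"

def cv(data):
    """Coefficient of variation: standard deviation relative to the mean."""
    return statistics.pstdev(data) / statistics.mean(data)

print(cv(xs), cv(ys), cv(ratios))  # the ratio's CV is larger than either input's
```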
Select Appropriate Metrics (5)- Summary -
Metrics chosen should be measurable: a numerical value can be assigned to them
Acceptable
Easy to work with (i.e., can be measured easily)
Avoid redundancy
Pay attention to the units used
Sanity check: examine the boundary conditions (e.g., best system, ideal workload) to see whether the metric is sensible
Systematic Approach to Performance Evaluation
1. State Goals and Define the System
2. List Services and Outcomes
3. Select Appropriate Metrics
4. List the Parameters
5. Select Evaluation Techniques
6. Select Workload
7. Design Experiment(s)
8. Analyze and Interpret Data
9. Present the Results
List the Parameters
Identify all system and workload parameters:
System parameters: characteristics of the system that affect performance
Workload parameters: characteristics of usage (or workload) that affect performance
Categorize them according to their effects on system performance
Determine the range of their variation or expected variation
Decide on one, or at most a couple, to vary while keeping the others fixed
Select Evaluation Technique(s) (1)
Three techniques:
Measurement
Simulation
Analytical Modeling
Select Evaluation Technique(s) (2) - Measurement, Simulation, or Analysis? -
Can be a combination of two or all three
Use the goal of the study to guide your decision
Remember, each of these techniques has its pros and cons
Select Evaluation Technique(s) (3) - Measurement -
(+) Provides realistic data
(+) Can test the limits on load
(-) The system or a prototype must be working
(-) The prototype may not represent the actual system
(-) Not easy to correlate cause and effect
Challenges:
Defining appropriate metrics
Using an appropriate workload
Statistical tools to analyze the data
Select Evaluation Technique(s) (4) - Simulation -
(+) Less expensive than building a prototype
(+) Can test under more load scenarios
(-) Synthetic, since the model is not the actual system
(-) Cannot be used to make guarantees on expected performance
Challenges:
Being careful about when to use simulation
Getting the model right
Representing the results well (graphical tools)
Learning the simulation tools
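As a toy example of the technique, here is a minimal single-server (M/M/1) queue simulation in Python; the arrival and service rates are arbitrary illustrative values, not from the slides.

```python
import random

def simulate_mm1(arrival_rate, service_rate, num_jobs, seed=1):
    """Simulate an M/M/1 (FIFO, single-server) queue; return mean response time."""
    random.seed(seed)
    arrival = 0.0        # arrival time of the current job
    server_free = 0.0    # time the server finishes the previous job
    total_response = 0.0
    for _ in range(num_jobs):
        arrival += random.expovariate(arrival_rate)   # Poisson arrivals
        start = max(arrival, server_free)             # wait if the server is busy
        server_free = start + random.expovariate(service_rate)
        total_response += server_free - arrival       # waiting + service time
    return total_response / num_jobs

print(simulate_mm1(arrival_rate=8.0, service_rate=10.0, num_jobs=100_000))
```

Note that the simulation only gives an estimate near 0.5 time units; it cannot by itself guarantee that this is the expected behavior, which is exactly the limitation the slide lists.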
Select Evaluation Technique(s) (5) - Analytical Modeling -
(+) Can make strong guarantees on expected behavior
(+) Can provide insight into cause and effect
(+) Does not require building a prototype
(-) Performance prediction is only as good as the model
Challenges:
Significant learning curve
Mathematically involved
Choosing the right model (this is where the art lies)
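For contrast, a classical analytical result for the same toy system as the simulation sketch above: the mean response time of an M/M/1 queue is R = 1 / (mu - lambda) when lambda < mu. A closed form like this is what lets analysis make strong statements, e.g., that the queue is unstable whenever lambda >= mu.

```python
def mm1_response_time(arrival_rate, service_rate):
    """Mean response time of an M/M/1 queue: R = 1 / (mu - lambda)."""
    assert arrival_rate < service_rate, "unstable: the queue grows without bound"
    return 1.0 / (service_rate - arrival_rate)

# Matches what the simulation sketch above converges to:
print(mm1_response_time(8.0, 10.0))  # -> 0.5
```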
Select Evaluation Technique(s) (6) - Bottom Line -
You can use measurement to demonstrate the feasibility of an approach.
You can use measurement or simulation to show evidence that your algorithm or system performs better than competing approaches in certain situations.
But if you would like to claim any properties of your algorithm (or system), the only option is to use analysis and mathematically prove your claim.
Select Evaluation Technique(s) (7) - When to Use What? -
It is good to be versed in all three.
1. Start with measurement or simulation to get a feel for the model or the expected behavior
2. Start with a simple model
3. Perform an analysis to predict the performance and prove some behavioral properties
4. Observe the actual performance to determine the validity of your model and your analysis
5. Use simulation for the previous step if a working system is not available or feasible
6. If significant inconsistency is observed, go back to revise the model and analysis, then repeat from Step 4
7. Finally, use simulation to verify your results for large-scale data or for scenarios that cannot be modeled with the existing expertise and available time
Systematic Approach to Performance Evaluation
1. State Goals and Define the System
2. List Services and Outcomes
3. Select Appropriate Metrics
4. List the Parameters
5. Select Evaluation Techniques
6. Select Workload
7. Design Experiment(s)
8. Analyze and Interpret Data
9. Present the Results
Select Workload
What is a workload? How do you represent it?
Range of values: what should the increment size be?
Probability distribution: find a good model that approximates reality; this may require measurement and statistical analysis. In simulation, use an appropriate random number generator to produce values (see the sketch below)
Trace from an actual system
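For the probability-distribution option, a minimal sketch of a synthetic workload generator: Poisson arrivals (exponential inter-arrival times) with heavy-tailed request sizes. The rate, the Pareto shape, and the size scale are all invented for illustration.

```python
import random

random.seed(42)  # fix the seed so the workload is reproducible

mean_interarrival = 0.1   # seconds -> about 10 requests/second (assumed)
requests = []
t = 0.0
for _ in range(1_000):
    t += random.expovariate(1.0 / mean_interarrival)  # exponential gaps
    size = random.paretovariate(2.5) * 1_000          # heavy-tailed size, bytes
    requests.append((t, size))

print(requests[:3])
```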
Design Experiment(s)
Aim to provide maximum information with minimum effort:
Field experiments can take enormous preparation time
Attempt to get several experiments done in one setup
Explore whether you can use data collected by someone else
Also, explore whether you can use remote labs
Finally, explore whether you can use simulation without losing significant validity (modifying simulation code can be time-consuming as well)
In both simulation and measurement, repeat the same experiment (for a fixed workload and fixed parameter values) a sufficient number of times for statistical validity (see the sketch after this list)
Always keep the goal in mind
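One common way to decide whether the number of repetitions is sufficient is to look at a confidence interval for the mean across runs. A minimal sketch using a normal approximation; the throughput numbers are hypothetical.

```python
import math, statistics

def mean_ci95(samples):
    """Mean and 95% confidence interval (normal approximation)."""
    m = statistics.mean(samples)
    half = 1.96 * statistics.stdev(samples) / math.sqrt(len(samples))
    return m, m - half, m + half

# Hypothetical throughput (Mb/s) from 10 repetitions of one configuration:
runs = [94.1, 95.3, 93.8, 96.0, 94.7, 95.1, 93.5, 94.9, 95.6, 94.2]
print(mean_ci95(runs))  # run more repetitions if the interval is still too wide
```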
Analyze and Interpret Data
In analytical modeling: carry out mathematical derivations that prove the expected system behavior
In measurement: statistically analyze the collected data and summarize the results by computing statistical measures
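A minimal sketch of the measurement case: summarizing one run's collected data with a few standard statistical measures. The latency samples are made up for the example.

```python
import statistics

# Hypothetical per-request latencies (ms) from one measurement run:
latencies = [12.1, 13.4, 11.8, 15.2, 12.9, 14.0, 13.3, 40.7, 12.5, 13.1]

print("mean  :", statistics.mean(latencies))
print("median:", statistics.median(latencies))   # robust to the 40.7 ms outlier
print("stdev :", statistics.stdev(latencies))
p95 = statistics.quantiles(latencies, n=20, method="inclusive")[18]
print("p95   :", p95)                            # 95th-percentile latency
```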
Systematic Approach to Performance Evaluation
1. State Goals and Define the System
2. List Services and Outcomes
3. Select Appropriate Metrics
4. List the Parameters
5. Select Evaluation Techniques
6. Select Workload
7. Design Experiment(s)
8. Analyze and Interpret Data
9. Present the Results
Present the Results (1)
In analytical modeling:
Clear statements of lemmas and theorems
Description of an algorithm with a proof of its properties
Present numerical computation results, both to show how to use the formulae and to show the effect of varying the parameters
Perform simulation or measurement to show the validity of the model and analysis
Present the Results (2)
In simulation and measurement:
Clear statement of the goals of the experiment
A list of assumptions
The experimental setup: platforms, tools, units, ranges of values for the parameters
Graphical presentation of results: the simpler the graphs are to understand, the better
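A minimal matplotlib sketch of the kind of graph the slide recommends: one metric against one factor, with error bars for the variability across repetitions. All numbers are invented.

```python
import matplotlib.pyplot as plt

# Hypothetical results: mean response time vs. offered load, with 95% CI half-widths.
load     = [0.2, 0.4, 0.6, 0.8]          # utilization
response = [0.125, 0.167, 0.250, 0.500]  # seconds
ci_half  = [0.010, 0.015, 0.030, 0.080]  # seconds

plt.errorbar(load, response, yerr=ci_half, marker="o", capsize=3)
plt.xlabel("Offered load (utilization)")
plt.ylabel("Mean response time (s)")
plt.title("Response time vs. load (illustrative data)")
plt.savefig("response_vs_load.png")
```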
Present the Results (3)
In all three, after presenting the results:
Discuss the implications for users
Discuss how a user can apply the results
Mention any additional applications that could benefit from your work
Present conclusions: what did you learn (e.g., surprises, new directions)?
Discuss limitations and future work
Review: Systematic Approach to Performance Evaluation
1. State Goals and Define the System
2. List Services and Outcomes
3. Select Appropriate Metrics
4. List the Parameters
5. Select Evaluation Techniques
6. Select Workload
7. Design Experiment(s)
8. Analyze and Interpret Data
9. Present the Results