Online Music Store
MSE Project Presentation III
Presented by: Reshma Sawant
Major Professor: Dr. Daniel Andresen
03/11/08
Phase III Presentation Outline
Project Overview
Brief Review of Phases
Action Items from Phase II
Implementation/Demo
Assessment Evaluation
Project Evaluation
Lessons learned
Project Overview
The objective of this project is to design and develop an Online
Music Store.
Target: Public Users
Product: Media for Music
User Types: User, Administrator
Functionalities for Users: Browsing, searching, buying
products, getting song recommendations, managing
personal account
Functionalities for Administrator: Manage Catalog Details,
Manage Orders, Manage Shopping Cart
Review of Phases
Phase I:
Requirement Specifications
Phase II:
Designed Web Pages
Created Test Plan
Phase III (Current):
Coding
Testing and Analysis
Action Items from Phase II
1) Correct multiplicities in Class Diagram
Multiplicity between ShoppingCart Class and
CartItem Class should be 1..*
Class Diagram
Action Items from Phase II
2) Revise SLOC count and Project Duration
Included in Project Evaluation
Implementation & Demo
Technologies Used:
IDE – Microsoft Visual Studio 2005
Technology - ASP.NET 2.0
Language – C#
Database – SQL Server 2005
Manual Testing – performed to ensure the correctness of the various parts of the code
Assessment Evaluation
Test Case # | Description | Results/Comments
USER
T-01 | System Register | Passed
T-02 | System Login | Passed
T-03 | Add to Cart | Passed
T-04 | Edit Shopping Cart | Passed
T-05 | Place Order | Passed
ADMINISTRATOR
T-06 | Create and Delete Product from a Category | Passed
T-07 | Create and Delete Category from a Genre | Passed
T-08 | Create and Delete Genre from the Catalog | Passed
T-09 | Manage Orders | Passed
T-10 | Manage Shopping Carts | Passed
E.g. Register Web Page for User
E.g. Edit Shopping Cart
Assessment Evaluation
Test Unit: btnSignup
Test Case: An empty required field (Username, Password, Confirm Password, Email, Security Question, or Security Answer)
Result: System prompts the user with the message "All fields are required. Please try again."
Test Case: Username already in use by an existing user
Result: System prompts the user to enter a different username with the message "Please enter a different username."
Test Case: Password and Confirm Password fields do not match
Result: System prompts the user with the message "The Password and Confirmation Password must match."
Test Case: All valid required fields entered
Result: System redirects the user to the secure Login Web page.
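The btnSignup checks above can be sketched as follows. This is a minimal Python sketch of the validation order only; the actual project implements it in C# with ASP.NET, and the function and field names here are invented for illustration.

```python
# Minimal sketch of the btnSignup validation order described above.
# The real project uses C#/ASP.NET; all names here are hypothetical.
REQUIRED_FIELDS = ("username", "password", "confirm_password",
                   "email", "security_question", "security_answer")

def validate_signup(form, existing_usernames):
    """Return the prompt the user would see, or "OK" on success."""
    if any(not form.get(field) for field in REQUIRED_FIELDS):
        return "All fields are required. Please try again"
    if form["username"] in existing_usernames:
        return "Please enter a different username"
    if form["password"] != form["confirm_password"]:
        return "The Password and Confirmation Password must match"
    return "OK"  # caller would redirect to the secure Login page
```

Each branch corresponds to one test case row above, checked in the same order.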
Test Unit: btnUpdate, btnDelete
Test Case: Negative number, or input other than an integer, entered in the "Quantity" field
Result: System prompts the user with the message "Please enter a valid number."
Test Case: Valid positive number entered in the "Quantity" field
Result: System updates the product quantity and displays the message "Your shopping cart was successfully updated" or "Item successfully deleted."
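A minimal sketch of the quantity check above, in Python for illustration (the real code is C#/ASP.NET; rejecting zero is an assumption, since the slide only distinguishes negative/non-integer input from valid positive numbers):

```python
def parse_quantity(raw):
    """Return the quantity as an int, or None when the input is invalid
    (non-integer, negative, or zero -- zero is rejected by assumption).
    A None result maps to the prompt "Please enter a valid number"."""
    try:
        qty = int(raw)
    except (TypeError, ValueError):
        return None
    return qty if qty > 0 else None
```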
Assessment Evaluation – Performance Testing
Goal:
Determine load in terms of concurrent users and requests
Determine Response Time – the time from when a request for a Web page is initiated until the page is completely displayed in the user's browser
Tool Used – JMeter (http://jakarta.apache.org)
Inputs to JMeter:
Number of Users
Ramp-up period – time (sec) to load the full number of users chosen
Loop Count – how many times to repeat the test
E.g. Users = 10, Loop-Count = 20, Ramp-up period = 5 sec
=> 10 users will be loaded in 5 sec, with total requests = 200 (10 × 20)
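The slide's example can be checked with a one-line calculation: total requests = users × loop count, while the ramp-up period only controls how quickly the threads are started (assuming, as a simplification, one sampler per loop iteration):

```python
def jmeter_totals(users, loop_count, ramp_up_sec):
    """Total requests issued and thread start rate for a JMeter plan
    with one sampler per loop iteration (a simplifying assumption)."""
    return {
        "total_requests": users * loop_count,
        "threads_started_per_sec": users / ramp_up_sec,
    }

# The slide's example: 10 users, loop count 20, ramp-up 5 sec
totals = jmeter_totals(10, 20, 5)  # 200 total requests, 2 threads/sec
```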
Assessment Evaluation – Performance Testing Factors
Load Type:
Peak Load – maximum number of users and requests loaded in a short duration (e.g. 5 sec)
Sustained Load – maximum users and requests loaded over a longer period (e.g. 5 min)
Connection:
Wireless connection at 54.0 Mbps
LAN connection at 100.0 Mbps
Web Pages Tested:
HTML Page (Login Web Page)
Database-Intensive Page (Home Page)
Business-Logic Page (Shopping Cart Page)
Machine Configuration:
Operating System – Windows XP Professional
Memory – 1 GB RAM
Hard Disk – 100 GB
Processor – Intel Pentium M, 1.7 GHz
Assessment Evaluation – Performance Testing Environmental Set-up
Peak Load at Wireless (54 Mbps) vs. LAN (100 Mbps) Connection

Users | Loop Count | Ramp-up period (sec) | Avg. Response Time (ms), Wireless | Avg. Response Time (ms), LAN
200 | 20000 | 5 | 8354 | 7400
600 | 20000 | 5 | 22538 | 21700
800 | 20000 | 5 | 29567 | 28600
1000 | 20000 | 5 | 38603 | 35390
Assessment Evaluation – Home Page [http://localhost:2416/CDShop/Default.aspx]
Note:
Loop-Count constant at 20,000
Ramp-up period of 5 sec
Users – 200, 600, 800, 1000
Observations:
Response Time increases linearly with the number of users for both Wireless and LAN
Max no. of users handled by the system before it becomes saturated = 1000
Response Time is lower for LAN due to its higher bandwidth
Constant Users vs. Constant Loop-Count for Wireless Connection
Scenario 1 – Users constant at 200; Loop-Count increased up to 20,000
Scenario 2 – Loop-Count constant at 20,000; Users – 200, 600, 800, 1000
Assessment Evaluation – Home Page [http://localhost:2416/CDShop/Default.aspx]
Observations:
Response Time increases rapidly with the number of users, but only slightly when the users are kept constant and only the loop-count is increased.
Reason: If the number of users is kept constant and only the loop-count is increased, the number of requests/sec handled by the server remains constant for every increase in the loop count. If the users are increased and the loop-count is kept constant, the requests/sec handled by the server increases with the number of users while the number of executions remains constant, hence the longer response times.
Assessment Evaluation – Home Page [http://localhost:2416/CDShop/Default.aspx]
Assessment Evaluation – Comparison of Response Times of all 3 Web Pages at Wireless Connection (54.0 Mbps)
Note:
Loop-Count constant at 20,000
Ramp-up period of 5 sec
Users – 200, 600, 800, 1000
Observations:
Response Time increases more for the Home Page than for the Login and Shopping Cart Pages
Lowest Response Time for the Login Page, as it submits no database requests
Moderate Response Time for the Shopping Cart Page because it performs more computations
Response Time for the Shopping Cart Page is approx. 28% more on average than for the Login Page
Response Time for the Home Page is approx. 246% more on average than for the Login Page
Users | Avg. Response Time (ms), Login Page | Avg. Response Time (ms), Shopping Cart Page | Avg. Response Time (ms), Home Page
200 | 1900 | 2500 | 8354
600 | 7439 | 7700 | 22538
800 | 8500 | 10800 | 29567
1000 | 13000 | 15400 | 38603
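As a rough cross-check, the quoted averages can be recomputed from this table by averaging the row-wise percentage increases (the rows are assumed to correspond to 200, 600, 800, and 1000 users). This method reproduces the ~246% Home Page figure; the Shopping Cart figure comes out nearer 20% by this method, so the slide's 28% may rest on a different averaging.

```python
# Response times (ms) from the table above, one entry per user level.
login = [1900, 7439, 8500, 13000]
cart  = [2500, 7700, 10800, 15400]
home  = [8354, 22538, 29567, 38603]

def avg_pct_increase(base, other):
    """Mean of the per-row percentage increases of `other` over `base`."""
    return 100 * sum((o - b) / b for b, o in zip(base, other)) / len(base)

home_vs_login = avg_pct_increase(login, home)  # ~246.9%
cart_vs_login = avg_pct_increase(login, cart)  # ~20.2%
```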
External Factors affecting Response Time
Varying Network Bandwidth
Limited System Hardware Resources (CPU, RAM, Disks)
and Configuration
JMeter Tests and Server running on the same machine
Assessment Evaluation – Home Page [http://localhost:2416/CDShop/Default.aspx]
For Peak Load:
Users – 200, 600, 800, 1000
Loop-Count constant at 20,000
Ramp-up period = 5 sec
Response Time increases rapidly with the number of users, but only slightly when the users are kept constant and only the loop-count is increased.
Response Time is highest for the Home Page, intermediate for the Shopping Cart Page, and lowest for the Login Page.
Assessment Evaluation – Summary
Wireless vs. LAN
Login Page – Wireless takes on average 9.5% more Response Time than LAN
Shopping Cart Page – Wireless takes on average 6.8% more Response Time than LAN
Home Page – Wireless takes on average 6.6% more Response Time than LAN
For Sustained Load at Wireless Connection
Assessment Evaluation – Login Page [http://localhost:2416/CDShop/Login.aspx]

Users | Loop Count | Ramp-up period (sec) | Average Response Time (ms)
800 | 16000 | 300 | 10335
Project Evaluation – Project Duration (actual)
Phase I = 86 hours
Phase II = 140.5 hours
Phase III = 304.5 hours
Total = 531 hours
Project Duration (in Months)
Estimated at the end of Phase II = 6.5 Months
Actual = 7.5 Months
Category Breakdown:
Research = 38.5 hours
Design = 37 hours
Coding = 305.5 hours
Testing = 32 hours
Documentation = 118 hours
Total = 531 hours
Project Evaluation
Project Evaluation – SLOC Count (Actual), via the LocMetrics tool (http://www.locmetrics.com)
C# Code (Including C# auto-generated code) = 2757
SQL Code = 540
XML Code = 86
CSS Code = 412
Total = 3795
SLOC Count (Estimated at the end of Phase II) = 3200 (based on prototype design in Phase I)
Project Experience – Lessons Learned:
New technology
Use of various tools for designing and testing – Visual Studio 2005, JMeter,
LocMetrics
Working with UML and Class Diagrams
Entire life cycle of the project – requirement gathering, design, coding, testing, and documentation
Testing applications at different levels