
Page 1: qe_camp_17

Analytics in Quality Assurance

Rohit Vyas, Sr. QE

Certification Team, Pune (IN)

Page 2: qe_camp_17

About Me

● QA Engineer
● Sr. QA Lead
● Sr. QE

– 366 days in role as of 25 Jan 2017

Page 3: qe_camp_17

Leveraging Analytics in QA

Page 4: qe_camp_17

Predictive Analysis

Page 5: qe_camp_17
Page 6: qe_camp_17

Predictive Analysis

Page 7: qe_camp_17

Current QA Challenges

● Which test cases need to be executed to keep the defect leakage rate below 10% while keeping coverage above 90%?

● Which tests should be included in the test suite so that it can be executed with <= 5 resources in under 10 days with 0% severe defects? (min(TC))

● How many resources are required to execute the min(TC) test suite for ModuleX with the minimum defect leakage rate within the minimum testing time frame?
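These are essentially constrained test-selection problems. A minimal sketch, assuming each test is described by the requirements it covers and its run time in days (all names and data below are hypothetical, not from the talk), is a greedy heuristic that keeps adding the test with the best coverage-per-day until the coverage target or the day budget is hit:

```python
# Greedy test selection under a coverage target and a time budget.
# A sketch of the selection problem, not a full optimizer.

def select_tests(tests, coverage_target, budget_days):
    """tests: list of (name, set_of_requirements_covered, days_to_run)."""
    selected, covered, spent = [], set(), 0.0
    all_reqs = set().union(*(reqs for _, reqs, _ in tests))
    remaining = list(tests)
    while remaining and len(covered) / len(all_reqs) < coverage_target:
        # Best value: most newly covered requirements per day of effort.
        best = max(remaining, key=lambda t: len(t[1] - covered) / t[2])
        if len(best[1] - covered) == 0 or spent + best[2] > budget_days:
            break  # nothing useful left, or the time budget is exhausted
        remaining.remove(best)
        selected.append(best[0])
        covered |= best[1]
        spent += best[2]
    return selected, len(covered) / len(all_reqs), spent

tests = [
    ("tc1", {"r1", "r2", "r3"}, 1.0),
    ("tc2", {"r3", "r4"}, 0.5),
    ("tc3", {"r5"}, 2.0),
]
suite, coverage, days = select_tests(tests, coverage_target=0.9, budget_days=10)
print(suite, coverage, days)
```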

Page 8: qe_camp_17

Role of Predictive Analytics In QA

● TC prioritization in RR
● Resource utilization
● Report generation

Page 9: qe_camp_17

Why TCP (Test Case Prioritization)?

Page 10: qe_camp_17

TCP?

● Focuses on ranking all existing test cases without eliminating any, so faults are detected sooner.

● Test cases are executed in the given order until the testing budget is exhausted.

Page 11: qe_camp_17

TCP Effect

[Figure: two plots of bugs detected (y-axis, up to 60 and 70) against test cases executed (x-axis, 0–18), illustrating how prioritization affects how quickly bugs are found.]

Page 12: qe_camp_17

How TCP? Techniques for TCP

● Text diversity-based prioritization

AllDist(T_i, PS, d) = min{ d(T_i, T_j) | T_j ∈ PS }

● Topic diversity-based prioritization
● History-based clustering

C1 = { tc_x | tc_x ∈ FT(n) }
C2 = { tc_x | tc_x ∉ C1 AND tc_x ∈ FT(n−1) }
C3 = { tc_x | tc_x ∉ (C1 ∪ C2) AND tc_x ∈ FT(n−2) }
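Here PS is the already-prioritized set, and FT(k) appears to denote the test cases that failed in run k. A minimal sketch of the diversity-based idea (the Jaccard token distance and all names below are assumptions, not the talk's actual implementation): AllDist gives each candidate's distance to its nearest already-prioritized test, and a greedy farthest-first loop always picks the candidate that maximizes it.

```python
# Diversity-based test case prioritization: a minimal sketch.
# Assumes each test is represented by a set of word tokens.

def jaccard_distance(a, b):
    """Distance between two token sets: 1 - |a ∩ b| / |a ∪ b|."""
    return 1.0 - len(a & b) / len(a | b) if (a | b) else 0.0

def all_dist(t, prioritized, d):
    """AllDist(T_i, PS, d): min distance from t to the prioritized set."""
    return min(d(t, p) for p in prioritized)

def prioritize(tests, d=jaccard_distance):
    """Greedy farthest-first ordering: start anywhere, then always pick
    the test whose nearest prioritized neighbour is farthest away."""
    remaining = list(tests)
    ordered = [remaining.pop(0)]
    while remaining:
        nxt = max(remaining, key=lambda t: all_dist(t, ordered, d))
        remaining.remove(nxt)
        ordered.append(nxt)
    return ordered

tests = [{"login", "valid", "user"},
         {"login", "invalid", "password"},
         {"checkout", "cart", "payment"}]
print(prioritize(tests))
```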

Page 13: qe_camp_17

Inputs for TCP

● Change information
● Historical fault detection
● Dynamic and static coverage data
● SRD
● Test scripts

Page 14: qe_camp_17

Data Sources

Page 15: qe_camp_17

System Under Test

Type | Release | Total Tests | New Tests | % New Tests | Median Old Test
TR   | 3.0     | 580         | 398       | 68%         | 1
RR   | 5.5     | 1055        | 39        | 4%          | 4

Type | Release | Release Date | No. of Tests | No. of Faults | Failure Rate
RR   | 3.0     | 1/12/2016    | 580          | 127           | 21.90%
RR   | 4.0     | 25/12/2016   | 1055         | 6             | 0.57%
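Reading the second table, the failure rate is simply faults divided by tests executed: 127/580 ≈ 21.9% for release 3.0 and 6/1055 ≈ 0.57% for release 4.0.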

Page 16: qe_camp_17

K-Means Clustering

● Assume a Euclidean space/distance
● Start by picking k, the number of clusters
● Initialize clusters by picking one point per cluster, then assign each remaining point to its nearest cluster
● Repeat until the clusters stabilize
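A minimal pure-Python sketch of the assign/recompute loop these bullets describe (the 2-D points and k below are illustrative):

```python
import random

def kmeans(points, k, iters=20):
    """Basic k-means on 2-D points with Euclidean distance."""
    centroids = random.sample(points, k)  # one starting point per cluster
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: (p[0] - centroids[c][0]) ** 2 +
                                            (p[1] - centroids[c][1]) ** 2)
            clusters[i].append(p)
        # Update step: move each centroid to the mean of its cluster.
        for i, cl in enumerate(clusters):
            if cl:
                centroids[i] = (sum(p[0] for p in cl) / len(cl),
                                sum(p[1] for p in cl) / len(cl))
    return centroids, clusters

points = [(1, 1), (1, 2), (2, 1), (8, 8), (9, 8), (8, 9)]
centroids, clusters = kmeans(points, k=2)
print(centroids)
```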

Page 17: qe_camp_17

Resource Allocation

● The right tester/QA?
● QA score
● How well a QA engineer meets deadlines

Page 18: qe_camp_17

Resource Allocation Problems

● Resource allocation predictions based on the analysis
● Predict the success rate of a project with n resources, each having 5+ years of QA domain expertise, within min(time_frame)
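A hypothetical sketch of such a prediction, framed as binary classification with scikit-learn; the features and training rows below are invented for illustration, and a real model would be trained on the team's historical project records:

```python
# Hypothetical sketch: predicting project success from staffing features.
from sklearn.linear_model import LogisticRegression

# Features per past project: [num_resources, avg_years_domain_expertise,
#                             planned_duration_days]
X = [[3, 2.0, 30], [5, 6.0, 20], [8, 5.5, 15], [2, 1.0, 45], [6, 7.0, 10]]
y = [0, 1, 1, 0, 1]  # 1 = released on time with acceptable quality

model = LogisticRegression().fit(X, y)

# "Success rate of a project with n resources, 5+ years expertise,
#  minimal time frame" becomes a probability query:
print(model.predict_proba([[5, 5.0, 12]])[0][1])
```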

Page 19: qe_camp_17

Understand your Resource

● Identifying performance [demographics, gender bias, skills]

● Resource allocation in RR & TR
● Resource churn detection

Page 20: qe_camp_17

Data Sets

Project | ID   | Age | Gender | Marital Status | Issues Reported | Priority of Bug | Release Time | Location | Project Complexity
aaa     | a123 | 23  | M      | S              | 12              | xx              | xx           | xx       | xx

Project Complexity | Age | Gender | Domain Expertise | Interest Level
xx                 | xx  | xx     | xx               | xx
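One hypothetical use of a dataset shaped like the tables above (the column names and values below are placeholders, not real records) is a small classifier for resource churn detection:

```python
# Hypothetical sketch: using dataset columns like those above to flag churn.
import pandas as pd
from sklearn.tree import DecisionTreeClassifier

df = pd.DataFrame({
    "age":                [23, 31, 28, 45],
    "issues_reported":    [12, 40, 25, 8],
    "project_complexity": [2, 3, 1, 3],          # encoded low=1 .. high=3
    "domain_expertise":   [1.0, 6.0, 3.0, 9.0],  # years
    "churned":            [1, 0, 1, 0],          # label: left within a year
})

features = ["age", "issues_reported", "project_complexity", "domain_expertise"]
clf = DecisionTreeClassifier(max_depth=2).fit(df[features], df["churned"])

# Score a current team member:
print(clf.predict(pd.DataFrame([[26, 15, 2, 2.0]], columns=features)))
```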

Page 21: qe_camp_17

Data Source

Page 22: qe_camp_17

Reports: Metrics That Matter

● Analytical reports

– Add value to the reports that current test tools already generate by better explaining the data collected, making them useful for future prediction and forecasting.
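As a toy illustration of the forecasting idea (the defect counts below are invented), a least-squares trend over past releases can extrapolate the next one:

```python
# Toy forecast: fit a linear trend to per-release defect counts and
# extrapolate one release ahead. The counts are hypothetical placeholders.
import numpy as np

releases = np.array([1, 2, 3, 4, 5])
defects = np.array([127, 90, 60, 35, 20])

slope, intercept = np.polyfit(releases, defects, 1)
next_release = 6
forecast = slope * next_release + intercept
print(f"Forecast defects for release {next_release}: {forecast:.0f}")
```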

Page 23: qe_camp_17

Metrics That Matter

● Measuring doneness
● Resource allocation
● Measuring performance and biases
● Beyond the check marks

Page 24: qe_camp_17

Tools

● R
● Statpro
● Excel or LibreOffice for regression

Page 25: qe_camp_17

References

● Test case prioritization

http://sealab.cs.umanitoba.ca/wp-content/uploads/2016/07/Published.pdf

Page 26: qe_camp_17
Page 27: qe_camp_17