Best Practices in Defect Reporting - Shiva Prasad


  • 1. Effective Defect Tracking and Avoiding Duplication - Shiva Prasad B.V
  • 2. Objectives of this presentation
    • Objectives
      • To write effective Defect reports
      • Reduce the number of defects returned from development
      • Improve the speed of getting defect fixes
      • Improve the credibility of test
      • Enhance teamwork between test and development
      • Overview of Defect Tracking Tools available
      • Reduce the defect rejection rate, which is currently about 40%
      • Avoid Duplication of Defects
  • 3. Introduction
    • Introduction
      • Defect reports are among the most important deliverables to come out of test. They are as important as the test plan and will have more impact on the quality of the product than most other deliverables from test.
      • It is worth the effort to learn how to write effective defect reports.
      • Avoiding duplication of bugs saves a lot of testing effort and time, not only for the testing team but also for the developers.
  • 4. Defect Remarks
    • Here are some key points to make sure the next defect report you write is an effective one.
    • 1. Condense - Say it clearly but briefly
    • 2. Accurate - Is it a defect or could it be user error, misunderstanding, etc.?
    • 3. Neutralize - Just the facts. No zingers. No humor. No emotion.
    • 4. Precise - Explicitly, what is the problem?
    • 5. Isolate - What has been done to isolate the problem?
  • 5. Defect Remarks continued..
    • 6. Re-create - What are the essentials in triggering/re-creating this problem? (environment, steps, conditions)
    • 7. Impact - What is the impact to the customer? What is the impact to test? Sell the defect.
    • 8. Debug - What does development need to make it easier to debug? (traces, dumps, logs, immediate access, etc.)
    • 9. Evidence - What documentation will prove the existence of the error? (A remark-template sketch covering these points follows below.)
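The nine points above can also be read as the skeleton of a well-formed defect remark. The sketch below is only an illustration of that idea, assuming a plain-text tracker field; the section names and the helper function are hypothetical, not part of any specific tool.

```python
# A rough sketch of a defect-remark template covering the points above.
# Section names are illustrative, not fields of any particular tracker.
DEFECT_REMARK_TEMPLATE = """\
Summary    : {summary}
Accuracy   : {accuracy_check}
Isolation  : {isolation}
Re-create  : {recreate_steps}
Impact     : {impact}
Debug data : {debug_data}
Evidence   : {evidence}
"""

def build_defect_remark(**sections: str) -> str:
    """Fill the template; sections not supplied show up as 'TBD'."""
    defaults = dict.fromkeys(
        ["summary", "accuracy_check", "isolation", "recreate_steps",
         "impact", "debug_data", "evidence"], "TBD")
    defaults.update(sections)
    return DEFECT_REMARK_TEMPLATE.format(**defaults)

# Example (wording taken from the "Do" example later in the deck):
# print(build_defect_remark(
#     summary='"exit", "next", "cancel" abend when "product description" is blank',
#     recreate_steps="Product Information screen -> clear the field -> Next"))
```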
  • 6. Essentials for Effective Defect Remarks
      • Condense
        • Say it clearly but briefly. First, eliminate unnecessary wordiness. Second, don't add in extraneous information. It is important that you include all relevant information, but make sure that the information is relevant. In situations where it is unclear how to reproduce the problem, or the understanding of the problem is vague for whatever reason, you will probably need to capture more information. Keep in mind that irrelevant information can be just as problematic as too little relevant information.
  • 7. Example
      • Defect Remark
      • Don't:
      • Suffers from TMI (too much information), most of which is not helpful.
          • I was setting up a test whose real intent was to detect memory errors. In the process I noticed a new GUI field that I was not familiar with. I decided to exercise the new field. I tried many boundary and error conditions that worked just fine. Finally, I cleared the field of any data and attempted to advance to the next screen, then the program abended. Several retries revealed that anytime there is not any data for the "product description" field you cannot advance to the next screen or even exit or cancel without abending.
  • 8. Do
      • The "exit", "next", and "cancel" functions for the
      • "Product Information" screen abends when the
      • "product description" field is empty or blank.
  • 9. Accuracy
      • Make sure that what you are reporting is really a bug. You can lose credibility very quickly if you get a reputation for reporting problems that turn out to be setup problems, user errors, or misunderstandings of the product. Before you write up the problem, make sure that you have done your homework. Before writing up the problem, consider:
      • Is there something in the setup that could have caused this? For example, are the correct versions installed and all dependencies met? Did you use the correct login, security, command/task sequence, and so forth?
      • Could an incomplete cleanup, incomplete results, or manual intervention from a previous test cause this?
  • 10. Accuracy - Continued
      • Make sure what you report is really a bug.
      • Is there something in the setup that could have caused this? For example, are the correct versions installed and all dependencies met? Did you use the correct login, security, command/task sequence?
      • Could an incomplete cleanup, incomplete results, or manual interventions from a previous test cause this?
      • Could this be the result of a network or some other environmental problem?
      • Do you really understand how this is supposed to work?
  • 11. Neutralize
      • State the problem objectively.
      • Don't try to use humor and don't use emotionally charged zingers. What you think is funny when you write the defect may not be interpreted as funny by a developer who is working overtime and is stressed by deadlines.
      • Using emotionally charged statements doesn't do anything toward fixing the problem.
      • Don't:
      • The first clause will probably be interpreted as a jab at the developer and adds no useful information.
      • Do:
      • As could have been determined from the original defect with very little effort, function ABC does indeed abend with any negative value as input.
  • 12. Precise
    • The person reading the problem description should not have to be a detective to determine what the problem is.
    • Don't:
    • In this example, it is hard to tell if the problem is 1) the twinax port not timing out, 2) the printer not returning to ready, or 3) the message on the op panel.
    • Do:
    • Precede the description with a short summary of exactly what you perceive the problem to be.
  • 13. Generalize
    • Oftentimes, developers will fix exactly what you report, without even realizing that the problem is a more general one that needs a more general fix.
  • 14. Recreate
    • You should list all the steps, including the exact syntax, file names, and sequences that you used to encounter or re-create the problem.
    • Provide an explicit example that can be used to do the re-create.
    • Don't assume that it can be re-created if you haven't verified that it can be re-created.
    • If you cannot or have not re-created the problem, it is important to note that in the defect remarks.
  • 15. Other Necessary Items for Defect tracking
    • 1) Impact 2) Severity 3) Priority 4) Environment 5) Owner 6) TestCase ID 7) Build Number 8) Release Status 9) Defect Type
    • 10) State 11) Description 12) Submitter 13) History 14) Submitted Date 15) Browser 16) Closed Date 17) Keyword 18) Effort in Hours 19) Turn Around Time (a record-style sketch of these fields follows below)
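For illustration only (not part of the original deck), the tracking fields listed above could be modeled as a simple record. The field names, types, and state values below are assumptions based on the list, not the schema of any particular tool.

```python
from dataclasses import dataclass, field
from datetime import date
from enum import Enum
from typing import List, Optional

class State(Enum):
    SUBMITTED = "Submitted"
    ASSIGNED = "Assigned"
    OPENED = "Opened"
    RESOLVED = "Resolved"
    POSTPONED = "Postponed"
    DUPLICATE = "Duplicate"
    NOT_A_BUG = "NotABug"
    CHANGED_SINCE_DESIGN = "ChangedSinceDesign"
    CLOSED = "Closed"

@dataclass
class DefectRecord:
    """One defect entry; fields mirror the list above (names/types assumed)."""
    defect_id: str
    description: str
    impact: str
    severity: str                       # e.g. "Critical", "Major", "Minor"
    priority: int
    environment: str = ""
    owner: str = ""
    testcase_id: str = ""
    build_number: str = ""
    release_status: str = ""
    defect_type: str = ""
    state: State = State.SUBMITTED
    submitter: str = ""
    submitted_date: Optional[date] = None
    browser: str = ""
    closed_date: Optional[date] = None
    keywords: List[str] = field(default_factory=list)
    effort_hours: float = 0.0
    turnaround_hours: float = 0.0
    history: List[str] = field(default_factory=list)
```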
  • 16. Defect workflow (part 1): The testing team submits the bug, which starts in the Submitted state, and it is assigned to the development lead (Muthu in this example), moving it to the Assigned state. The lead reviews the bug: is it valid, or does it need more data? If it needs more data, it is assigned back to the tester with comments (still in the Assigned state); the tester rechecks and, if the comment is valid, either closes the bug, moves it to Duplicate, or deletes it. Valid bugs continue to the next step.
  • 17. Defect workflow (part 2): The lead assigns valid bugs to developers, keeping them in the Assigned state. Developers fix them and move them to Resolved, Duplicate, NotABug, ChangedSinceDesign, or Postponed, then assign them back to the lead. The lead changes the state to NotABug, Duplicate, or ChangedSinceDesign if the bug is not already in the Opened, Postponed, or Resolved state. (A state-transition sketch of this flow follows below.)
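As a sketch only, the flow described in the two slides above could be enforced with a small state-transition table. The transitions listed are one reading of the diagram, not a definitive process definition.

```python
# Allowed defect state transitions, as sketched in the workflow slides above.
TRANSITIONS = {
    "Submitted":          {"Assigned"},
    "Assigned":           {"Opened", "Resolved", "Duplicate", "NotABug",
                           "ChangedSinceDesign", "Postponed", "Closed"},
    "Opened":             {"Assigned", "Resolved"},
    "Resolved":           {"Closed", "Assigned"},   # reopened if the fix fails retest
    "Postponed":          {"Assigned"},
    "Duplicate":          {"Closed"},
    "NotABug":            {"Closed"},
    "ChangedSinceDesign": {"Closed"},
}

def move(current: str, new: str) -> str:
    """Apply a transition, rejecting moves the workflow does not allow."""
    if new not in TRANSITIONS.get(current, set()):
        raise ValueError(f"Illegal transition: {current} -> {new}")
    return new
```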
  • 18. Avoiding Duplication
    • The goal is to provide a methodology on how to minimize the duplication of bugs/defects which are reported in the software world, thereby saving time and money to the organization.
    • Duplication of bugs is a major hurdle for the testing community. It wastes effort, time, and money, since more than one resource may report the same defect, or the same resource may raise the same defect more than once. This wastes the time not only of the testers but also of the developers.
  • 19. Proposed solution
    • The problem needs to be addressed at two levels, one at the team level and one at the individual resource level.
    • Case 1: When test cases are available.
      • Map test cases to defect IDs
      • Map defect IDs to test cases
      • Maintain the test case IDs consistently throughout all the documents for the same functionality. (A small mapping sketch follows below.)
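A minimal sketch of the two-way mapping idea, assuming plain string IDs on both sides; the test case and defect IDs used in the example are hypothetical.

```python
from collections import defaultdict
from typing import Dict, Set

# Illustrative two-way index between test cases and defects.
testcase_to_defects: Dict[str, Set[str]] = defaultdict(set)
defect_to_testcases: Dict[str, Set[str]] = defaultdict(set)

def link(testcase_id: str, defect_id: str) -> None:
    """Record the mapping in both directions so either side can be searched."""
    testcase_to_defects[testcase_id].add(defect_id)
    defect_to_testcases[defect_id].add(testcase_id)

# Example with made-up IDs: before raising a new defect against TC_LOGIN_004,
# a tester can check testcase_to_defects["TC_LOGIN_004"] for existing reports.
link("TC_LOGIN_004", "DEF-1023")
```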
  • 20. Entering Testcase description
    • Design test cases so that it is easy to enter the same description in the defect tracker.
    • Add the test case description in the defect tracker so that it saves time.
    • Makes it easy to search for a defect by keyword or test case ID.
    • Avoids duplication when a number of testers are testing simultaneously.
  • 21. Case 2: When Testcases are not available
    • Baseline the semantics used for defect tracking
    • Using unique names or URLs in the defect tracker
    • Warning system in the defect tracker to inform the tester that the bug has already been reported.
  • 22. Base line the semantics used for Defect tracking
      • For example: there is a defect in a login screen; say the password field is accepting a wrong value.
      • One resource may report it as "Password field is not validating the input" and a second resource may report the same defect as "The login screen is accepting a wrong password".
      • The idea here is to baseline their thinking and set standards for reporting bugs.
      • For instance, in this case we can avoid this problem by directing the team to use the window title followed by the field name.
      • Ex: In "Login" screen: "password" field; entering an invalid value, it accepts this value. (A small title-builder sketch follows below.)
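To make the naming convention concrete, here is a small hypothetical helper that builds a defect title from the window title and field name as suggested above; the function name and exact punctuation are assumptions, not a prescribed format.

```python
def defect_title(window: str, field_name: str, observed: str) -> str:
    """Compose a title as: window title, then field name, then observed behaviour."""
    return f'In "{window}" screen: "{field_name}" field; {observed}'

# With the login example from the slide:
# defect_title("Login", "password", "entering an invalid value, it accepts this value")
# -> 'In "Login" screen: "password" field; entering an invalid value, it accepts this value'
```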
  • 23. Using unique names or URLs in the defect tracker
    • Use URLs: this saves the developer's/tester's time in browsing to that page and eases searching for the defect by entering the URL. (The slide shows a URL used in the defect tracker.)
  • 24. For Non Web based applications
      • Team members should use the labels, object names, and error messages exactly as used in the application. Testers can even copy and paste the labels and object names from the application into the defect tracking tool.
      • Enter the path to reach the destination screen.
      • For example: Home > ContactUs > Placex > click on XYZ button.
  • 25. Searching based on Keywords
    • Create a field called Keyword. This field will contain the keywords used for reporting the bug.
    • Using inverted commas for window titles and objects makes searching for a bug more effective.
    • The bug tracking tool's search should be case insensitive and space insensitive.
    • Use nouns in inverted commas. For example, a login screen may contain a username and password; we can denote the fields in inverted commas as "username", "password", or "Login" dialog.
    • Similarly, the verbs used should also be standardized; for instance, one user may write "click" and another user may write "clicks". (A small normalized-search sketch follows below.)
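A small sketch of the case- and space-insensitive matching described above; the helper names are illustrative, not features of any particular tracker.

```python
import re

def normalize(text: str) -> str:
    """Lower-case and collapse whitespace so matching ignores case and spacing."""
    return re.sub(r"\s+", " ", text.strip().lower())

def keyword_match(query: str, keyword_field: str) -> bool:
    """True if every normalized query token appears in the defect's keyword field."""
    haystack = normalize(keyword_field)
    return all(token in haystack for token in normalize(query).split())

# keyword_match('"Login" password', '"login"  Password   invalid value')  -> True
```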
  • 26. Use a common Jargon
      • For instance, each user may use their own terminology for describing a bug: a window can be called a dialog, a pop-up, etc.
  • 27. Warning system in the defect tracker to inform the tester that the bug has already been reported
        • The defect tracker should compare the defect description or keywords field against existing defects in Opened, Assigned, Pending, or Postponed status. If more than 3-4 words (strings) match the defect being reported, it should inform the tester that a match already exists in the database; the tester then reviews the matched results before proceeding to report the bug. (A word-overlap sketch of this check follows below.)
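A rough sketch of such a warning check, using simple word overlap with the 3-4 word threshold mentioned above. Real trackers implement this differently; the function and data shapes here are purely illustrative.

```python
def possible_duplicates(new_description: str,
                        open_defects: dict,
                        min_common_words: int = 4) -> list:
    """Return IDs of existing (opened/assigned/pending/postponed) defects whose
    descriptions share at least `min_common_words` words with the new report."""
    new_words = set(new_description.lower().split())
    matches = []
    for defect_id, description in open_defects.items():
        common = new_words & set(description.lower().split())
        if len(common) >= min_common_words:
            matches.append(defect_id)
    return matches

# The tracker would show these matches so the tester can review them before
# deciding whether to go ahead and report the defect.
```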
  • 28. Use Checklist before entering a defect
        • Have you reproduced the defect again before entering the defect?
        • Have you checked for duplicate defects in the same or similar kind of module?
        • Verify that the synopsis entered is clear and concise and contains keywords that you would relate to this defect; this makes it easier to search for later. If your synopsis goes over the width of the field, it's too long.
        • Have you selected the type as defect?
        • Have you selected the relevant values from the following for your project:
          • Project Name
          • Release Name
          • OS Group
          • Version
          • HW Platform
          • Functionality
          • Feature Group and Area
  • 29. Checklist continued---
        • Have you selected the most appropriate severity of the defect based on the defect complexity?
        • Have you entered the description for:
          • Expected behavior
          • Observed behavior
        • Have you entered the description for steps to duplicate, containing:
          • Detailed steps to reproduce/duplicate, so that reviewers/developers won't request more information
          • Builds used for testing
          • Connection information like database, server name, etc.
        • Have you attached screenshot(s), log file(s), trace file(s), sample report(s), or sample application(s) for the defect?
        • Have you verified that there are no typo errors within the defect entry? (A small mandatory-field check sketch follows below.)
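Parts of this checklist lend themselves to an automated pre-submission check. The sketch below covers only the mandatory-field items; the field keys are assumptions based on the checklist above, not the actual fields of any tool, and the manual steps (reproducing, duplicate search, attachments) still rest with the tester.

```python
REQUIRED_FIELDS = [
    "project_name", "release_name", "os_group", "version", "hw_platform",
    "functionality", "feature_group", "severity", "synopsis",
    "expected_behavior", "observed_behavior", "steps_to_duplicate",
    "build", "connection_info",
]

def checklist_gaps(entry: dict) -> list:
    """Return the checklist fields that are missing or empty before submission."""
    return [name for name in REQUIRED_FIELDS if not str(entry.get(name, "")).strip()]

# An empty result means the mandatory-field part of the checklist passes.
```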
  • 30. References
      • http://qaforums.com
      • http://stickyminds.com
      • http://testing.com
      • Common Body of Knowledge (CBOK), reference book for CSTE
  • 31. Few of the defect tracking tools
    • Bugzilla - available for free download at www.bugzilla.com
    • TestTrack Pro from Seapine Software
    • IBM Rational ClearQuest
    • TrackGear from LogiGear Corporation
    • PR Tracker - www.prtracker.com
  • 32. Questions???
  • 33. Thank You