
Testing 29/Apr/2009



  1. Testing 29/Apr/2009 Petr Panuška petr.panuska@hp.com QA Manager & Offshore Team Manager, SOA Center HP Software R&D

  2. Agenda • Why is SW testing necessary • Testing principles • Testing design techniques

  3. Why is SW testing necessary • SW does not always work as we wish • Defects in SW can cause harm • Not all defects in SW result in a failure • Testing reduces the probability of undiscovered defects remaining in the software • Testing gives us confidence in the tested SW

  4. Testing Principles

  5. Testing Principles • ISTQB defines 7 Testing Principles • Testing shows presence of defects • Exhaustive testing is impossible • Early testing • Defect clustering • Pesticide paradox • Testing is context dependent • Absence-of-errors fallacy

  6. Testing shows presence of defects • Tests show the presence, not the absence, of defects • Testing can’t prove that SW is defect free • Testing reduces the probability of undiscovered defects remaining in the software • However, the fact that no defects were found is not a proof of correctness.

  7. Exhaustive testing is impossible • It is not cost-effective • Defects are not of equal risk • We need to prioritize our tests • Use risk analysis and priorities to focus testing efforts • Use appropriate testing techniques to meet the risk, giving us confidence in what testing has been performed • Exhaustive testing is equivalent to the Halting problem • There is no algorithm that can, for an arbitrary program and its input, decide whether the program works correctly (with the given input) or not

  8. Early testing • In SW development, what can be tested • Requirement specification • Use-case Analysis document • Technical Analysis document • Design document • Functional Implementation • Performance • Usability • What else can be tested • Test Description • Documentation • The price of fixing a defect increases the later it is discovered

  9. Example

  10. Example – Early testing • Defect in H310: JBoss 4.3.0 not supported on Win 2008 (although the Hermes 310 PRD requires this combination to be supported) • 7 other duplicates reported • Involving 4 people from QA and 9 people from DEV

  11. Defect clustering • Defects cluster for various reasons • A component might be more complex than others • A component might be developed by a less experienced developer • A component might be developed by a less careful developer • A component might have a poorer specification • A component might need more refactoring (introducing more other defects) • Another explanation: http://parlezuml.com/blog/?postid=242

  12. Pesticide paradox • Old tests will eventually stop finding new defects • The remaining defects become immune to these tests • To find new defects, new tests need to be introduced • Or old tests refactored • Conclusion: regression tests do not find the majority of new defects

  13. Testing is context dependent • Different kinds of tests are run in different periods • Defect testing • To discover faults or defects in the software where its behavior is incorrect or not in conformance with its specification; • A successful test is a test that makes the system perform incorrectly and so exposes a defect in the system. • Validation testing • To demonstrate to the developer and the system customer that the software meets its requirements; • A successful test shows that the system operates as intended. • We start with defect testing and perform validation testing later in the SW development process.
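The two kinds of tests on this slide can be sketched as plain unit tests (a minimal sketch; the `tax` function and its 15% rule are hypothetical, not from the slides):

```python
# Hypothetical function under test: flat 15% tax on non-negative income.
def tax(income: int) -> float:
    if income < 0:
        raise ValueError("income must be non-negative")
    return income * 0.15

# Defect testing: try to make the system misbehave (here: invalid input).
# A "successful" defect test is one that exposes incorrect behavior.
def test_defect_negative_income_rejected():
    try:
        tax(-1)
    except ValueError:
        pass  # expected: invalid input is rejected
    else:
        raise AssertionError("negative income should be rejected")

# Validation testing: show the system meets its (hypothetical) specification.
def test_validation_matches_spec():
    assert tax(1000) == 150.0  # 15% of 1000

test_defect_negative_income_rejected()
test_validation_matches_spec()
```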

  14. Testing is context dependent II. • We can also test differently because of • The type of industry (safety-critical, business, nuclear) • Number of customers (impact the SW makes) • One customer patch • New version of SW for potentially many customers

  15. Absence-of-errors fallacy

  16. Absence-of-errors fallacy • We test a system to see if it meets the documented requirements • We find and fix defects to demonstrate that the system meets these specifications • Finding and fixing defects does not help if the system is unusable and does not fulfill the user’s needs and expectations

  17. Testing Techniques

  18. Testing design techniques • Black-box techniques (no details about the product implementation are known) • Equivalence partitioning • Boundary value analysis • Decision tables • State transitions • White-box techniques (the tester knows how the tested requirement is implemented) • Statement testing and coverage • Decision testing and coverage • Other structural-based coverage

  19. Equivalence partitioning • Tax Calculator based on annual income • Input – integral number • Partitions of equivalence

  20. Equivalence partitioning II • Split the input into partitions • Valid partitions • Invalid partitions • Have one test-case for each partition • Even the invalid ones • Another example: entering the number of a month
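The month example from the slide can be sketched directly (a minimal sketch; the `is_valid_month` helper is illustrative, not from the slides):

```python
# Equivalence partitioning for "enter the number of a month" (1..12).
# Three partitions: invalid-low (< 1), valid (1..12), invalid-high (> 12).
def is_valid_month(n: int) -> bool:
    return 1 <= n <= 12

# One representative test case per partition, including the invalid ones.
cases = {
    "invalid-low": (0, False),
    "valid": (6, True),
    "invalid-high": (13, False),
}
for name, (value, expected) in cases.items():
    assert is_valid_month(value) == expected, name
```

Any other representative of a partition (e.g. 3 instead of 6) should behave the same way, which is exactly what makes one test per partition sufficient.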

  21. Boundary value analysis • Similar to equivalence partitioning, but • Test the boundary values • N-BVA (plain BVA = 0-BVA) • Test N values on each side of a boundary • BV, BV+1, BV-1, …, BV+N, BV-N
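The N-BVA idea can be sketched as a small generator (a minimal sketch; the helper name is mine, not from the slides):

```python
# N-BVA: for a boundary value bv, test bv, bv±1, ..., bv±n.
# With n = 0 this degenerates to plain BVA (just the boundary itself).
def boundary_values(bv: int, n: int = 1):
    values = [bv]
    for k in range(1, n + 1):
        values += [bv - k, bv + k]
    return sorted(values)

# Month input (valid range 1..12): the boundaries are 1 and 12.
print(boundary_values(1))   # [0, 1, 2]
print(boundary_values(12))  # [11, 12, 13]
```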

  22. Decision tables

  23. Decision tables - Example • Example – HP SOA Systinet Licensing Framework • HP SOA Systinet consists of • Core, Reporting, Lifecycle, etc. • Policy Manager (optional) • Contract Manager (optional) • An HP SOA Systinet license can limit • Number of users • Number of days • Default license included in the installer • Policy & Contract Manager included • Limited to 60 days, unlimited number of users

  24.–31. Decision tables • Example – Testing license application • (The decision tables themselves appeared as slide images and are not part of the transcript; successive slides build up the cases: Default license, Visibility Edition, Unlimited Standard Edition, Limited Standard Edition)

  32. Decision tables II • Catches all possible combinations • Helps to analyze the situation and to decide • Which use-cases must be tested • Which use-cases do not make sense • Requires knowledge of the business environment • Helps to prioritize the use-cases
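Catching all combinations can be sketched by enumerating the condition columns of a decision table (a minimal sketch; the license rule below is a simplified, hypothetical stand-in for the Systinet licensing logic, not the real framework):

```python
from itertools import product

# Conditions of a (hypothetical, simplified) license decision table.
conditions = ["policy_manager", "contract_manager", "within_day_limit"]

def license_allows(policy_manager, contract_manager, within_day_limit):
    # Hypothetical rule: optional modules only work while the license is valid.
    if not within_day_limit:
        return "expired"
    if policy_manager and contract_manager:
        return "full"
    return "core"

# Enumerate all 2**3 combinations -- the decision table catches them all,
# and reviewing the output shows which rows make business sense to test.
for row in product([True, False], repeat=len(conditions)):
    print(dict(zip(conditions, row)), "->", license_allows(*row))
```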

  33. State transitions • Example – Testing ‘Contract Request Lifecycle’ Create Request Accept Request Reject Request Revoke Request Delete Request

  34. State transitions II • A transition diagram shows the valid transitions • It does not show the invalid ones (which should also be tested) • Good for testing use-cases that can be described as transitions between states • The scenarios may contain test-cases for each • State • Transition • Event that triggers a state change (transition) • Action that may result from those transitions
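The lifecycle from slide 33 can be sketched as a transition map (a minimal sketch; the state names and which events apply in which state are assumptions read off the slide's action labels):

```python
# State-transition sketch of the 'Contract Request Lifecycle'.
# Only valid transitions are listed; everything else is invalid
# and should be tested too.
TRANSITIONS = {
    ("none", "create"):      "requested",
    ("requested", "accept"): "accepted",
    ("requested", "reject"): "rejected",
    ("accepted", "revoke"):  "revoked",
    ("requested", "delete"): "deleted",
}

def apply(state: str, event: str) -> str:
    try:
        return TRANSITIONS[(state, event)]
    except KeyError:
        raise ValueError(f"invalid transition: {event!r} in state {state!r}")

# A valid path through the diagram:
s = apply("none", "create")
s = apply(s, "accept")
assert s == "accepted"

# An invalid transition (not in the diagram, must be tested as well):
try:
    apply("rejected", "accept")
except ValueError:
    pass  # expected: rejected requests cannot be accepted
```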

  35. White-box techniques

  36. Statement testing and coverage

  37. Decision testing and coverage
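Slides 36–37 showed these coverages as images; the difference between them can be sketched with a hypothetical `discount` function (the function and its 10% rule are illustrative, not from the slides):

```python
# A single test can reach 100% statement coverage yet miss a decision outcome.
def discount(price: float, member: bool) -> float:
    total = price
    if member:
        total = price * 0.9  # 10% member discount
    return total

# Test A (member=True) executes every statement: statement coverage is 100%.
# But the False outcome of `if member` was never taken: decision coverage is 50%.
assert discount(100.0, True) == 90.0
# Adding test B exercises the other decision outcome:
assert discount(100.0, False) == 100.0
```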

  38. Other structure-based techniques • Condition Coverage • Every boolean sub-condition in a statement must be evaluated to both true and false • Full condition coverage does not imply full decision coverage! • Condition/Decision Coverage • A hybrid metric composed of the union of condition coverage and decision coverage.
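The warning that condition coverage does not imply decision coverage can be sketched concretely (a minimal sketch; `&` is used instead of `and` so that both conditions are always evaluated, avoiding short-circuiting complications):

```python
def both_ok(a: bool, b: bool) -> bool:
    # `&` instead of `and`: both operands are always evaluated.
    if a & b:
        return True
    return False

# Two tests: (True, False) and (False, True).
# Condition a is seen as both True and False; so is condition b:
# full condition coverage. Yet the decision `a & b` was False in
# both tests, so the True branch was never taken -- decision
# coverage is incomplete.
assert both_ok(True, False) is False
assert both_ok(False, True) is False
```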

  39. Other structure-based techniques II • Multiple Condition Coverage • All possible combinations of the boolean conditions in a statement must be evaluated • Path Coverage • Whether each possible path in each function has been followed. A path is a unique sequence of branches from the function entry to the exit. • Function, Call, LCSAJ, Loop, Race, etc. coverages • For more information, see http://www.bullseye.com/coverage.html
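Multiple condition coverage can be sketched by enumerating every combination of condition values (a minimal sketch; the `eligible` predicate is hypothetical):

```python
from itertools import product

# Multiple condition coverage: every combination of condition values is tested.
def eligible(age_ok: bool, income_ok: bool, resident: bool) -> bool:
    return age_ok and income_ok and resident

# 2**3 = 8 test cases cover all combinations of the three conditions.
combos = list(product([True, False], repeat=3))
for age_ok, income_ok, resident in combos:
    expected = age_ok and income_ok and resident
    assert eligible(age_ok, income_ok, resident) == expected
```

The exponential growth in cases (2^n for n conditions) is why this metric is usually reserved for the most critical code.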

  40. Example • RTCA published DO-178B, which requires minimal coverage for aeronautics SW systems based on their criticality (the coverage table appeared as a slide image and is not part of the transcript)

  41. Summary

  42. Summary • Testing Principles • Testing shows presence of defects • Exhaustive testing is impossible • Early testing • Defect clustering • Pesticide paradox • Testing is context dependent • Absence-of-errors fallacy • Testing Techniques • Black-box techniques • White-box techniques

  43. Q&A
