Friday, 15 March 2013

Creative and Critical Thinking and Testing Part 3

The previous articles in this series looked at what critical and creative thinking are and defined the stages of testing.  This article begins looking at each stage in detail and advising which type of thinking may be best to apply at that stage, as described in the diagram from the first article of this series.

Documentation review

Even though we are moving towards a more ‘Agile’ style of software development, it does not mean there should be no documentation, and for complex projects it can be vital.  Various documents can be created, and the ones most commonly accessed by testers are the requirements, design specifications and high-level design.  The old schoolers may remember this being referred to as ‘Static Testing’ or ‘Inspections and Walkthroughs’.

One of the early stages of testing is for testers to review requirement documents from a testability perspective.  When reviewing the document, the tester should be asking questions about the statements being made and thinking critically about what has been stated.  For the tester, this stage should mainly be a critical thinking exercise with some aspects of creative thinking.

The question is how do we apply critical thinking to documentation review?

One way is by the use of checklists and heuristics (rules of thumb) to prompt our thoughts on the testability of the requirements.  One example of this is shown below:

  • Do questions need to be asked of the requirement before a test can be derived? If so, it is incomplete.
  • Are there two or more requirements that are in conflict with each other? If yes, they are inconsistent.
  • Can the requirement be interpreted in more than one way? If it can, it is ambiguous.
  • Does the requirement contain the words AND, OR, NOR, IF-THEN-ELSE? If it does, it is likely to be compound.
  • Does the requirement fail to comply with any of the five criteria? If yes, then it is not testable.
  • Does the requirement deal with the ‘what’ rather than the ‘how’ (e.g. design)? For example, the requirement ‘Provide a database’ states how a function should be implemented rather than what it is that is required. It should instead read ‘Provide the capability for traceability between requirements’.
  • Is the requirement written as an operation rather than a requirement? For example, ‘It will be stored in a x20 rack’ describes the operation rather than the environment. It should read ‘It shall be designed for storage in a x20 rack’.
  • Is the requirement written using the correct terms? For example, the terms shall, will and should: requirements use shall, statements use will, goals use should.
  • Does the requirement contain the words is, was or must? These should be removed.
  • Are there any requirements that are missing? (e.g. reliability, quality)
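Parts of a checklist like this can even be automated as a first pass before the human review. Below is a minimal sketch of that idea in Python; the rule wording and the `review` function are illustrative assumptions for this article, not an existing tool, and a flagged requirement still needs a tester's critical thinking to decide whether it is genuinely a problem.

```python
"""Sketch: a first-pass requirement 'linter' based on checklist rules."""
import re

# Each rule pairs a checklist finding with a predicate over the text.
RULES = [
    ("likely compound (contains AND/OR/NOR/IF-THEN-ELSE)",
     lambda t: re.search(r"\b(AND|OR|NOR|IF|THEN|ELSE)\b", t) is not None),
    ("uses 'will'/'should' where a requirement needs 'shall'",
     lambda t: re.search(r"\b(will|should)\b", t, re.IGNORECASE) is not None
               and "shall" not in t.lower()),
    ("contains words to remove ('is', 'was', 'must')",
     lambda t: re.search(r"\b(is|was|must)\b", t, re.IGNORECASE) is not None),
]

def review(requirement: str) -> list[str]:
    """Return the checklist findings that apply to one requirement."""
    return [finding for finding, check in RULES if check(requirement)]

if __name__ == "__main__":
    for req in ["The system shall log the user in AND display the home page",
                "The report will be stored in the archive"]:
        print(req, "->", review(req) or "no findings")
```

Note that this only covers the mechanical rules; questions such as “is this requirement complete?” or “does it conflict with another?” cannot be reduced to pattern matching and remain a thinking exercise.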

Other sources/check lists that can be used to encourage critical thinking of the document under review include:

Software Quality Characteristics: this is useful for spotting missing information in requirements by asking questions such as:

  • Diagnostics: is it possible to find out details regarding customer situations?
  • Compatibility: does the product comply with common interfaces or official standards?
  • Standards Conformance: does the product conform to applicable standards, regulations, laws or ethics?

There are similarities between the heuristics and models you use during test execution and those you use when reviewing documentation, since in both cases your mind is trying to follow how the program will function.  Some of the models you use for test execution can therefore also apply to reviewing test documentation.

For example, you can use the consistency heuristics created by Michael Bolton and James Bach, known by the mnemonic HICCUPPS:

  • History
  • Image
  • Comparable products
  • Claims
  • User experience
  • Product
  • Purpose
  • Statutes

Another example comes from the Rapid Software Testing course and the book Lessons Learned in Software Testing: the use of Reference, Inference and Conference.

  • Reference: Use all documents and ensure they agree
  • Inference: What are my assumptions about the requirements?  Are my assumptions correct?
  • Conference: speak to the rest of the implementation team about the issues around testing. 

I am not suggesting using all of these methods; rather, mix and match and choose the one that works best for you.  You may choose not to use any of them and create your own (this is the creative thinking element).  If you do create your own, it would be great to share it with others.

So let us try to apply this in practice.
  • Incomplete example:
    ‘The system shall restrict access’
    Should be rewritten as: ‘The system shall control access via usernames and passwords’
  • Consistency example:
    ‘The system shall calculate all distances in miles’
    ‘The system shall calculate all speeds in km per hour’
    These should be rewritten as: ‘The system shall calculate all distances in miles’ and ‘The system shall calculate all speeds in miles per hour’
  • Inaccurate example:
    ‘All transactions shall be improved by 1 second’
    The customer actually requires faster logon times, so the requirement should read: ‘Daily average user logon shall be improved from 5 seconds to 4 seconds’
  • Ambiguous example:
    ‘An employee is entitled to 4 weeks holiday a year’
    Should be redefined as: ‘An employee is entitled to 20 working days holiday per calendar year’
  • Embedded (compound) example:
    ‘The target platform for the server system shall be Windows 2000 and Windows NT4 for the client system’
    Should be separated into 2 requirements:
    ‘The target platform for the server shall be Windows 2000’
    ‘The target platform for the client shall be Windows NT4’
Another useful checklist is the one from Ron Patton’s book Software Testing, which covers reviewing design specifications.

A good, well-thought-out product specification, with "all its t's crossed and its i's dotted," has eight important attributes:

  • Complete. Is anything missing or forgotten? Is it thorough? Does it include everything necessary to make it stand alone?
  • Accurate. Is the proposed solution correct? Does it properly define the goal? Are there any errors?
  • Precise, Unambiguous, and Clear. Is the description exact and not vague? Is there a single interpretation? Is it easy to read and understand?
  • Consistent. Is the description of the feature written so that it doesn't conflict with itself or other items in the specification?
  • Relevant. Is the statement necessary to specify the feature? Is it extra information that should be left out? Is the feature traceable to an original customer need?
  • Feasible. Can the feature be implemented with the available personnel, tools, and resources within the specified budget and schedule?
  • Code-free. Does the specification stick with defining the product and not the underlying software design, architecture, and code?
  • Testable. Can the feature be tested? Is enough information provided that a tester could create tests to verify its operation?

There are many other areas within the review phase in which critical thinking can play an important part that we have not touched on in this article, and I encourage readers to investigate further where testers can add value to a development project by thinking critically.  We can influence areas such as code reviews, walkthroughs and inspections.

The next article will look in depth at Test Planning and the style of thinking required for that stage.
