
SWEN3165

Testing Approaches

Mr. Matthew Ormsby

University of the West Indies


Mona, Kingston, Jamaica
What is a testing approach?
• a procedure for selecting or designing tests
• based on a structural or functional model of the software
• successful at finding faults
• 'best' practice
• a way of deriving good test cases
• a way of objectively measuring a test effort

Testing should be rigorous, thorough and systematic


Advantages
• Different people: similar probability of finding faults
  – gain some independence of thought
• Effective testing: find more faults
  – focus attention on specific types of fault
  – know you're testing the right thing
• Efficient testing: find faults with less effort
  – avoid duplication
  – systematic techniques are measurable

Using techniques makes testing much more effective


Systematic Techniques

Three types of systematic techniques
Static (non-execution)
• examination of documentation, source code listings, etc.

Functional (Black Box)
• based on behaviour / functionality of the software

Structural (White Box)
• based on the structure of the software
Some test techniques

Static
  • Reviews, Inspection, Walkthroughs, Desk-checking, etc.
  • Static Analysis: Control Flow, Data Flow, Symbolic Execution, etc.

Dynamic
  • Behavioural
      – Non-functional: Usability, Performance, etc.
      – Functional: Equivalence Partitioning, Boundary Value Analysis, Cause-Effect Graphing, State Transition, Random, etc.
  • Structural: Statement, Branch/Decision, Branch Condition, Branch Condition Combination, LCSAJ, Definition-Use, Arcs
Black box versus white box?

Test levels: Component, Integration, System, Acceptance

• Black box is appropriate at all levels, but dominates the higher levels of testing (System, Acceptance)
• White box is used predominantly at the lower levels (Component, Integration) to complement black box testing
Black Box Test Techniques
Black Box test design and measurement techniques
• Techniques defined in BS 7925-2
– Equivalence partitioning
– Boundary value analysis
– State transition testing
– Cause-effect graphing
– Syntax testing
– Random testing

• Also defines how to specify other techniques


Equivalence partitioning (EP)

– divide (partition) the inputs, outputs, etc. into areas which are the same (equivalent)
– assumption: if one value works, all will work
– one from each partition better than all from one

Example: assume we have to test a field which accepts ages 18 to 56. Partitions: below 18 (invalid), 18 to 56 (valid), above 56 (invalid); a sketch follows below.
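A minimal Python sketch of equivalence partitioning for this age field; the validator name and the representative values are illustrative assumptions, only the 18 to 56 rule comes from the example.

  # Hypothetical field validator for the example above: accepts ages 18-56.
  def age_is_accepted(age: int) -> bool:
      return 18 <= age <= 56

  # Equivalence partitioning: one representative value per partition instead
  # of exhaustively testing every age.
  def test_equivalence_partitions():
      assert age_is_accepted(30) is True     # valid partition: 18 to 56
      assert age_is_accepted(10) is False    # invalid partition: below 18
      assert age_is_accepted(70) is False    # invalid partition: above 56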


Boundary value analysis (BVA)

– faults tend to lurk near boundaries
– boundaries are a good place to look for faults
– test values on both sides of each boundary

Example: a field accepting values 1 to 100
  0 | 1 ...... 100 | 101   (invalid | valid | invalid)
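A minimal Python sketch of boundary value analysis for the 1 to 100 range shown above; the validator name is a hypothetical stand-in for the field under test.

  # Hypothetical validator for the 1-100 range on the number line above.
  def in_range(value: int) -> bool:
      return 1 <= value <= 100

  # Boundary value analysis: test values on both sides of each boundary.
  def test_boundaries():
      assert in_range(0) is False     # just below the lower boundary
      assert in_range(1) is True      # lower boundary
      assert in_range(100) is True    # upper boundary
      assert in_range(101) is False   # just above the upper boundary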
Example: Loan application

Inputs:
  Customer name:          2 to 64 chars
  Account number:         6 digits, 1st non-zero
  Loan amount requested:  £500 to £9000
  Term of loan:           1 to 30 years
  Monthly repayment:      minimum £10

Outputs:
  Term, Repayment, Interest rate, Total paid back
Customer name

Number of characters:  1 | 2 ...... 64 | 65   (invalid | valid | invalid)
Valid characters:      A-Z, a-z, space, hyphen, apostrophe; any other character is invalid

Condition: Customer name
  Valid partitions:    2 to 64 chars; valid chars
  Invalid partitions:  < 2 chars; > 64 chars; invalid chars
  Valid boundaries:    2 chars; 64 chars
  Invalid boundaries:  1 char; 65 chars; 0 chars
Account number

First character:   valid: non-zero; invalid: zero
Number of digits:  5 (invalid) | 6 (valid) | 7 (invalid)

Condition: Account number
  Valid partitions:    6 digits; 1st digit non-zero
  Invalid partitions:  < 6 digits; > 6 digits; 1st digit = 0; non-digit
  Valid boundaries:    100000; 999999
  Invalid boundaries:  5 digits; 7 digits; 0 digits
Loan amount

Boundaries: 499 | 500 ...... 9000 | 9001   (invalid | valid | invalid)

Condition: Loan amount
  Valid partitions:    500 to 9000
  Invalid partitions:  < 500; > 9000; 0; non-numeric; null
  Valid boundaries:    500; 9000
  Invalid boundaries:  499; 9001
Condition template (each partition and boundary gets a tag for traceability)

Customer name
  Valid partitions:    2 - 64 chars (V1); valid chars (V2)
  Invalid partitions:  < 2 chars (X1); > 64 chars (X2); invalid char (X3)
  Valid boundaries:    2 chars (B1); 64 chars (B2)
  Invalid boundaries:  1 char (D1); 65 chars (D2); 0 chars (D3)

Account number
  Valid partitions:    6 digits (V3); 1st non-zero (V4)
  Invalid partitions:  < 6 digits (X4); > 6 digits (X5); 1st digit = 0 (X6); non-digit (X7)
  Valid boundaries:    100000 (B3); 999999 (B4)
  Invalid boundaries:  5 digits (D4); 7 digits (D5); 0 digits (D6)

Loan amount
  Valid partitions:    500 - 9000 (V5)
  Invalid partitions:  < 500 (X8); > 9000 (X9); 0 (X10); non-integer (X11); null (X12)
  Valid boundaries:    500 (B5); 9000 (B6)
  Invalid boundaries:  499 (D7); 9001 (D8)
Design test cases

Test case 1
  Description:       Name: John Smith; Acc no: 123456; Loan: 2500; Term: 3 years
  Expected outcome:  Term: 3 years; Repayment: 79.86; Interest rate: 10%; Total paid: 2874.96

Test case 2
  Description:       Name: AB; Acc no: 100000; Loan: 500; Term: 1 year
  Expected outcome:  Term: 1 year; Repayment: 44.80; Interest rate: 7.5%; Total paid: 537.60
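A minimal pytest sketch of the two test cases above, assuming (hypothetically) a loan_app module that exposes calculate_loan(name, acc_no, loan, term_years) returning (repayment, interest_rate, total_paid); only the inputs and expected outcomes come from the table, everything else is illustrative.

  import pytest

  from loan_app import calculate_loan   # hypothetical interface to the system under test

  @pytest.mark.parametrize(
      "name, acc_no, loan, term_years, expected",
      [
          ("John Smith", "123456", 2500, 3, (79.86, 10.0, 2874.96)),   # test case 1
          ("AB",         "100000",  500, 1, (44.80,  7.5,  537.60)),   # test case 2
      ],
  )
  def test_loan_application(name, acc_no, loan, term_years, expected):
      repayment, rate, total = calculate_loan(name, acc_no, loan, term_years)
      assert (repayment, rate, total) == pytest.approx(expected)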
Why do both EP and BVA?

If you test boundaries only, you have covered all the partitions as well
– technically correct, and may be OK if everything works correctly!
– but if a test fails, is the whole partition wrong, or is a boundary in the wrong place? You have to test mid-partition anyway
– testing only extremes may not give confidence for typical use scenarios (especially for users)
– boundaries may be harder (more costly) to set up
Decision tables
• explore combinations of inputs, situations or events
• it is very easy to overlook specific combinations of input
• start by expressing the input conditions of interest so that each is either TRUE or FALSE, for example:
  – record found
  – file exists
  – code valid
  – policy expired
  – account in credit
  – due date > current date
Example: student access
A university computer system allows students an allocation of disc space
depending on their projects.
If they have used all their allotted space, they are only allowed restricted access,
i.e. to delete files, not to create them. This is assuming they have logged on with
a valid username and password.

What are the input and output conditions?


List the input and output conditions

• list the 'input conditions' in the first column of the table
• list the 'output conditions' under the input conditions

Input Conditions:   Valid username, Valid password, Account in credit
Output Conditions:  Login accepted, Restricted access
Determine input combinations
• add columns to the table for each unique combination of input conditions.
• each entry in the table is either ‘T’ for true or ‘F’ for false (a sketch enumerating these combinations follows the table below)

Input Conditions
Valid username T T T T F F F F
Valid password T T F F T T F F
Account in credit T F T F T F T F
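A minimal Python sketch of enumerating every T/F combination of the three input conditions listed above; the condition names come from the table, the output format is just for illustration.

  from itertools import product

  # The three input conditions from the table above.
  conditions = ["Valid username", "Valid password", "Account in credit"]

  # Every unique T/F combination (2^3 = 8 columns of the table).
  combinations = list(product([True, False], repeat=len(conditions)))

  # Print one row per condition, matching the layout of the table above.
  for name, row in zip(conditions, zip(*combinations)):
      print(f"{name:<18}", " ".join("T" if value else "F" for value in row))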
Rationalise input combinations
• some combinations may be impossible or not of interest
• some combinations may be ‘equivalent’
• use a hyphen to denote “don’t care”

Input Conditions
Valid username F T T T
Valid password - F T T
Account in credit - - F T
Complete the table
• determine the expected output conditions for each combination of input
conditions

Input Conditions
Valid username F T T T
Valid password - F T T
Account in credit - - F T
Output Conditions
Login accepted F F T T
Restricted access - - T F
Determine test case groups
• each column is at least one test case

Input Conditions
Valid username F T T T
Valid password - F T T
Account in credit - - F T
Output Conditions
Login accepted F F T T
Restricted access - - T F
Tags A B C D
Design test cases
• usually one test case for each column, but there can be none or several

Test 1: Username BrbU                   Expected: Invalid username     Tag: A
Test 2: Username usernametoolong        Expected: Invalid username     Tag: A
Test 3: Username BobU, Password abcd    Expected: Invalid password     Tag: B
Test 4: Valid user, no disc space       Expected: Restricted access    Tag: C
Test 5: Valid user with disc space      Expected: Unrestricted access  Tag: D
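A minimal sketch in Python of the logic the decision table describes, with one check per column (tags A-D); the function name and the exact rule for restricted access are illustrative assumptions, not part of the lecture material.

  # Hypothetical implementation of the decision table: maps the three input
  # conditions to the two output conditions.
  def login_outcome(valid_username: bool, valid_password: bool, in_credit: bool):
      login_accepted = valid_username and valid_password
      restricted = login_accepted and not in_credit   # restricted access only applies once logged in
      return login_accepted, restricted

  def test_decision_table_columns():
      # A: invalid username -> login rejected (restricted access is "don't care")
      assert login_outcome(False, True, True)[0] is False
      # B: valid username, invalid password -> login rejected
      assert login_outcome(True, False, True)[0] is False
      # C: valid login, account not in credit -> accepted, restricted access
      assert login_outcome(True, True, False) == (True, True)
      # D: valid login, account in credit -> accepted, unrestricted access
      assert login_outcome(True, True, True) == (True, False)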
White Box Test Techniques
White Box test design and measurement techniques
• Techniques defined in BS 7925-2
– Statement testing
– Branch / Decision testing
– Data flow testing
– Branch condition testing
– Branch condition combination testing
– Modified condition decision testing
– LCSAJ testing

• Also defines how to specify other techniques


Using structural coverage

(Flow diagram) Tests are derived from the spec and run against the software:
• Results OK?
• What's covered? Coverage OK?
• Enough tests? If not, design more tests
• apply stronger structural techniques (different structural elements) to keep increasing coverage
The test coverage trap

(Diagram: functional testedness plotted against structural testedness, the latter measured by % statement, % decision and % condition combination coverage)
• function exercised but insufficient structure covered
• structure exercised but insufficient function covered
• better testing needs both

100% coverage does not mean 100% tested!  Coverage is not thoroughness.
Statement coverage
(normally measured by a software tool)

• percentage of executable statements exercised by a test suite:

  statement coverage = (number of statements exercised / total number of statements) x 100%

• example:
  – program has 100 statements
  – tests exercise 87 statements
  – statement coverage = 87%

Typical ad hoc testing achieves 60 - 75%


Example of statement coverage

Statement numbers and code:
  1  read(a)
  2  IF a > 6 THEN
  3    b = a
  4  ENDIF
  5  print b

Test case 1: input 7, expected output 7

As all 5 statements are 'covered' by this test case, we have achieved 100% statement coverage
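A minimal Python rendering of the pseudocode above together with test case 1; the function and test names are illustrative. Run under a coverage tool (e.g. coverage.py) this single test would report 100% statement coverage, even though the a <= 6 behaviour has never been exercised.

  # Python rendering of the 5-statement example; as in the pseudocode, b is
  # only assigned inside the IF.
  def print_b(a):
      if a > 6:
          b = a
      print(b)          # fails with UnboundLocalError if a <= 6

  # Test case 1: input 7, expected output 7. This one test executes every
  # statement, so statement coverage is 100%.
  def test_case_1(capsys):
      print_b(7)
      assert capsys.readouterr().out.strip() == "7"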
Decision coverage (Branch coverage)
(normally measured by a software tool)

• percentage of decision outcomes (True and False) exercised by a test suite:

  decision coverage = (number of decision outcomes exercised / total number of decision outcomes) x 100%

• example:
  – program has 120 decision outcomes
  – tests exercise 60 decision outcomes
  – decision coverage = 50%

Typical ad hoc testing achieves 40 - 60%
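Continuing the statement coverage sketch above (an illustration, not part of the lecture material): test case 1 (input 7) exercised only the True outcome of "IF a > 6", i.e. 1 of 2 decision outcomes, so decision coverage was only 50%. Adding a test for the False outcome exposes the fault that 100% statement coverage missed.

  import pytest

  def print_b(a):                 # same sketch as in the statement coverage example
      if a > 6:
          b = a
      print(b)                    # b is never assigned when a <= 6

  # Exercising the False outcome of the decision reveals the fault: b is
  # used without ever being assigned.
  def test_false_outcome_reveals_fault():
      with pytest.raises(UnboundLocalError):
          print_b(3)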


Paths through code

(Diagram: flow graphs with increasing numbers of decisions; the more decisions, the more possible paths through the code - paths 1, 2, 3, 4, ...)

Paths through code with loops

(Diagram: a flow graph containing a loop; there is a distinct path 1, 2, 3, 4, 5, 6, 7, 8, ... for as many times as it is possible to go round the loop, which can be unlimited, i.e. infinite)
Error Guessing
Non-systematic test techniques
• Trial and error / Ad hoc
• Error guessing / Experience-driven
• User Testing
• Unscripted Testing
Error-Guessing
• always worth including
• after systematic techniques have been used
• can find some faults that systematic techniques can miss
• a ‘mopping up’ approach
• supplements systematic techniques

Not a good approach to start testing with


Error Guessing: deriving test cases
• Consider:
– past failures
– intuition
– experience
– brainstorming
– “What is the craziest thing we can do?”
Key Points
• Test techniques are ‘best practice’: help to find faults

• Black Box techniques are based on behaviour

• White Box techniques are based on structure

• Error Guessing is never the right way to start testing
