
Testing Overview
CS 4311
Frank Tsui, Orlando Karam, and Barbara Bernal, Essentials of Software
Engineering, 3rd edition, Jones & Bartlett Learning. Sections 10.1-10.2.
Hans van Vliet, Software Engineering: Principles and Practice, 3rd
edition, John Wiley & Sons, 2008. Chapter 13.
1
Outline

- V&V
  - Definitions of V&V terms
  - V&V and software lifecycle
  - Sample techniques
- Testing
  - Basics of testing
  - Levels of software testing
  - Sample testing techniques
2
Verification and Validation (V&V)

- The textbook uses the term "testing" in a general/wider sense to mean V&V.
- Q: What is V&V in software testing?
  - Groups of 2
  - What? Why? Who? Against what? When? How?
  - 5 minutes
3
What is V&V?

- Different people use the terms differently, e.g., formal vs. informal and static vs. dynamic.
- Verification
  - Evaluation of an object to demonstrate that it meets its specification. (Did we build the system right?)
  - Evaluation of the work product of a development phase to determine whether the product satisfies the conditions imposed at the start of the phase.
- Validation
  - Evaluation of an object to demonstrate that it meets the customer's expectations. (Did we build the right system?)
4
V&V and Software Lifecycle

- Throughout the software lifecycle, e.g., the V-model
5
Requirements Engineering

- Determine general test strategy/plan (techniques, criteria, team)
- Test the requirements specification for:
  - Completeness
  - Consistency
  - Feasibility (functional, performance requirements)
  - Testability (specific; unambiguous; quantitative; traceable)
- Generate acceptance/validation test data
6
Design

- Determine system and integration test strategy
- Assess/test the design for:
  - Completeness
  - Consistency
  - Handling of scenarios
  - Traceability (to and from requirements)
- Design walkthroughs and inspections
7
Implementation and Maintenance

- Implementation
  - Determine unit test strategy
  - Techniques (static vs. dynamic)
  - Tools and supporting scaffolding (drivers/harnesses, stubs); see the sketch below
- Maintenance
  - Determine regression test strategy
  - Documentation maintenance (vital!)
8
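Drivers and stubs are the scaffolding mentioned above: a driver exercises the unit under test, while a stub stands in for a dependency that is not yet built. Below is a minimal sketch in Java; all class and method names (TaxService, InvoiceCalculator, and so on) are hypothetical, not from the course materials.

// Interface that the unit under test depends on.
interface TaxService {
    double rateFor(String state);
}

// Stub: a canned stand-in for the real (perhaps unfinished) TaxService.
class TaxServiceStub implements TaxService {
    public double rateFor(String state) {
        return 0.06; // fixed answer, just enough to exercise the unit
    }
}

// The unit under test.
class InvoiceCalculator {
    private final TaxService taxService;
    InvoiceCalculator(TaxService taxService) { this.taxService = taxService; }
    double total(double subtotal, String state) {
        return subtotal * (1 + taxService.rateFor(state));
    }
}

// Driver: a throwaway main() that feeds the unit input and checks the output.
public class InvoiceCalculatorDriver {
    public static void main(String[] args) {
        InvoiceCalculator calc = new InvoiceCalculator(new TaxServiceStub());
        double actual = calc.total(100.0, "GA");
        double expected = 106.0; // oracle: 100 * (1 + 0.06)
        System.out.println(Math.abs(actual - expected) < 1e-9 ? "PASS" : "FAIL");
    }
}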
Hierarchy of V&V Techniques

- V&V
  - Dynamic techniques
    - Testing (in the narrow sense)
    - Symbolic execution
    - Model checking
  - Static techniques
    - Formal analysis
      - Proof
    - Static analysis
    - Informal analysis
      - Walkthrough
      - Inspection
      - Complementary reading
9
Definitions of V&V Terms

- "Correct" program and specification
  - Program matches its specification
  - Specification matches the client's intent
- Error (a.k.a. mistake)
  - A human activity that leads to the creation of a fault
  - A human error results in a fault, which may, at runtime, result in a failure
- Fault (a.k.a. bug)
  - The physical manifestation of an error that may result in a failure
  - A discrepancy between what something should contain (in order for failure to be impossible) and what it does contain
- Failure (a.k.a. symptom, problem, incident)
  - Observable misbehavior
  - Actual output does not match the expected output
  - Can only happen when the thing is being used
10
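To make the error/fault/failure chain concrete, here is a small illustrative Java sketch (the average method is hypothetical): the programmer's mistake is the error, the off-by-one it leaves in the loop is the fault, and the wrong output observed at runtime is the failure.

public class AverageDemo {
    // Fault: the programmer's error was writing "< values.length - 1",
    // so the last element is never added to the sum.
    static double average(int[] values) {
        int sum = 0;
        for (int i = 0; i < values.length - 1; i++) {
            sum += values[i];
        }
        return (double) sum / values.length;
    }

    public static void main(String[] args) {
        // Failure: observable only when the faulty code is executed.
        // Expected 2.0 for {1, 2, 3}, but this prints 1.0.
        System.out.println(average(new int[] {1, 2, 3}));
    }
}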
Definitions

- Fault identification: the process of determining what fault caused a failure
- Fault correction: the process of changing a system to remove a fault
- Debugging: the act of finding and fixing program errors
- Testing: the act of designing, debugging, and executing tests
- Test case: a particular set of inputs and the expected output
- Test set: a finite set of test cases working together with the same purpose
- Test oracle: any means used to predict the outcome of a test
11
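A minimal sketch of these terms in code, using JUnit 5; the Calculator class and its multiply method are hypothetical. Each test case pairs particular inputs with an expected output, and the hand-computed expected value acts as the oracle.

import static org.junit.jupiter.api.Assertions.assertEquals;
import org.junit.jupiter.api.Test;

class Calculator {
    static int multiply(int a, int b) { return a * b; }
}

class CalculatorTest {
    // Test case: input (6, 7) plus expected output 42.
    // Oracle: the hand-computed value inside the assertion.
    @Test
    void multipliesTwoIntegers() {
        assertEquals(42, Calculator.multiply(6, 7));
    }

    // A test set: several cases working together to check the sign rules.
    @Test
    void handlesSigns() {
        assertEquals(-42, Calculator.multiply(-6, 7));
        assertEquals(42, Calculator.multiply(-6, -7));
        assertEquals(0, Calculator.multiply(0, 7));
    }
}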
Where Do the Errors Come From?

- Q: What kinds of errors? Who makes them?
  - Groups of 2
  - 3 minutes
12
Where Do the Errors Come From?

- Kinds of errors
  - Missing information
  - Wrong information/design/implementation
  - Extra information
- Facts about errors
  - To err is human (but different people have different error rates).
  - Different studies indicate 30 to 85 errors per 1,000 lines of code; after extensive testing, 0.5 to 3 errors per 1,000 lines remain.
  - The longer an error goes undetected, the more costly it is to correct.
13
Types of Faults

- List all the types and causes of faults: what can go wrong in the development process?
  - Groups of 2
  - 3 minutes
14
Sample Types of Faults

- Algorithmic: algorithm or logic does not produce the proper output for the given input
- Syntax: improper use of language constructs
- Computation (precision): a formula's implementation is wrong, or the result is not computed to the correct degree of accuracy
- Documentation: documentation does not match what the program does
- Stress (overload): data structures are filled past capacity
- Capacity: the system's performance becomes unacceptable as activity reaches its specified limit
- Timing: code coordinating events is inadequate
- Throughput: the system does not perform at the speed required
- Recovery: the system encounters a failure and does not behave correctly afterward
15
Sample Causes of Faults

- Requirements: incorrect or missing requirements; incorrect translation into design
- System design: incorrect design specification
- Program design: incorrect design interpretation
- Program implementation: incorrect documentation; incorrect semantics
- Unit and system testing: incomplete testing; new faults introduced while correcting others
16
Sample V&V Techniques

- Lifecycle phases: requirements, design, implementation, testing, operation, maintenance
- Techniques applied across these phases:
  - Reviews: walkthroughs/inspections
  - Synthesis
  - Testing
  - Runtime monitoring
  - Model checking
  - Correctness proofs
17
Outline

- V&V
  - Definitions of V&V terms
  - V&V and software lifecycle
  - Sample techniques
- Testing
  - Basics of testing
  - Levels of software testing
  - Sample testing techniques
18
Question

- How do you know your software works correctly?
19
Question

- How do you know your software works correctly?
- Answer: Try it.
- Example: I have a function, say f, of one integer input. I tried f(6). It returned 35.
  - Is my program correct?
- Groups of 2
- 1 minute
20
Question

- How do you know your software works correctly?
- Answer: Try it.
- Example: I have a function, say f, of one integer input. I tried f(6). It returned 35.
  - My function is supposed to compute x*6-1. Is it correct?
  - Is my program correct? (See the sketch below.)
- Groups of 2
- 1 minute
21
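A sketch of why one passing trial settles nothing: the hypothetical implementation below returns 35 for f(6), just as x*6-1 would, yet it actually computes something else and is wrong for almost every other input.

public class OneTestIsNotEnough {
    // Supposed to compute x*6 - 1, but actually computes x*x - 1.
    static int f(int x) {
        return x * x - 1;
    }

    public static void main(String[] args) {
        System.out.println(f(6)); // 35 -- agrees with x*6 - 1, so the trial "passes"
        System.out.println(f(7)); // 48 -- but x*6 - 1 would give 41: a failure
    }
}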
Goals of Testing

- I want to show that my program is correct; i.e., it produces the right answer for every input.
- Q: Can we write tests to show this?
  - Groups of 2
  - 1 minute
22
Goals of Testing

- Can we prove a program is correct by testing?
- Yes, if we can test it exhaustively: every combination of inputs in every environment.
23
How Long Will It Take?

- Consider X+Y for 32-bit integers. How many test cases are required?
- How long will it take?
  - 1 test per second:
  - 1,000 tests per second:
  - 1,000,000 tests per second:
- Groups of 2
- 1 minute
24
How Long?

- Consider X+Y for 32-bit integers.
- How many test cases are required?
  - 2^32 * 2^32 = 2^64 ≈ 1.8*10^19
  - (The universe is about 4*10^17 seconds old.)
- How long will it take?
  - 1 test per second: 580,000,000,000 years
  - 1,000 tests per second: 580,000,000 years
  - 1,000,000 tests per second: 580,000 years
25
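A back-of-the-envelope check of those figures (a sketch; the slide rounds the results):

import java.math.BigInteger;

public class ExhaustiveTestTime {
    public static void main(String[] args) {
        // 2^32 values of X times 2^32 values of Y = 2^64 test cases.
        BigInteger cases = BigInteger.TWO.pow(64);
        BigInteger secondsPerYear = BigInteger.valueOf(365L * 24 * 3600);
        for (long rate : new long[] {1, 1_000, 1_000_000}) {
            BigInteger years = cases
                    .divide(BigInteger.valueOf(rate))
                    .divide(secondsPerYear);
            System.out.printf("%,d tests/s -> about %,d years%n", rate, years);
        }
    }
}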
Another Example

[Figure: control-flow graph with nodes A, B, and C; a loop returns to A.]

- We want to count the number of paths.
- The maximum number of iterations of the loop is 20.
- How many paths are there?
26
Another Example

- Suppose the loop does not repeat: only one pass executes.
- 5 distinct paths
27
Another Example

- Suppose the loop repeats exactly once: 5*5 = 25 distinct paths.
- If it repeats at most once: 5 + 5*5 = 30 paths.
28
Another Example

- What if it repeats exactly n times?
- 5^n paths
29
Another Example

- What if it repeats at most n times?
- Sum of 5^i for i = 1 to n: 5^n + 5^(n-1) + ... + 5
- For n = 20, the sum is about 1.2*10^14 paths: roughly 4 years at 1,000,000 tests per second. (See the sketch below.)
30
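The sketch referenced above, computing the path count and the testing time directly:

public class PathCount {
    public static void main(String[] args) {
        // Paths through a loop body with 5 alternatives, taken 1..20 times:
        // 5^1 + 5^2 + ... + 5^20
        long paths = 0;
        long power = 1;
        for (int i = 1; i <= 20; i++) {
            power *= 5;
            paths += power;
        }
        System.out.println(paths); // 119,209,289,550,780 (about 1.2*10^14)
        double years = paths / 1_000_000.0 / (365.0 * 24 * 3600);
        System.out.printf("about %.1f years at 1,000,000 tests/second%n", years);
    }
}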
Yet Another Example

- Consider testing a Java compiler.
- How many test cases would be needed to cover every possible input program?
31
Limits of Testing

- You can't test a program completely:
  - You can't test all valid inputs.
  - You can't test all invalid inputs.
  - You can't test all edited inputs.
  - You can't test in every environment.
  - You can't test all variations on timing.
  - You can't even test every path. (A path is the sequence of lines executed from start to finish.)
32
Why Bother?

- Testing cannot show the absence of faults.
- But it can show their presence!
33
Goals of Testing

- Identify errors
  - Make errors repeatable (when do they occur?)
  - Localize errors (where are they?)
- The purpose of testing is to find problems in programs so they can be fixed.
34
Cost of Testing

- Testing accounts for between 30% and 90% of the total cost of software.
- Microsoft employs one tester for each developer.
- We want to reduce the cost:
  - Increase test efficiency: #defects found per test
  - Reduce the number of tests
  - Find more defects
- How? Organize!
35
A Good Test:

- Has a reasonable probability of catching an error
- Is not redundant
- Is neither too simple nor too complex
- Is a success if it reveals a problem, and a failure if it does not
36
Outline

- V&V
  - Definitions of V&V terms
  - V&V and software lifecycle
  - Sample techniques
- Testing
  - Basics of testing
  - Levels of software testing
  - Sample testing techniques
37
Levels of Software Testing

- Unit/component testing
- Integration testing
- System testing
- Acceptance testing
- Installation testing
38
Levels of Software Testing

- Unit/component testing
  - Verify the implementation of each software element
  - Trace each test to the detailed design
- Integration testing
- System testing
- Acceptance testing
- Installation testing
39
Levels of Software Testing

- Unit/component testing
- Integration testing
  - Combine software units and test until the entire system has been integrated
  - Trace each test to the high-level design
- System testing
- Acceptance testing
- Installation testing
40
Levels of Software Testing

- Unit/component testing
- Integration testing
- System testing
  - Test the integration of hardware and software
  - Ensure the software, as a complete entity, complies with operational requirements
  - Trace each test to the system requirements
- Acceptance testing
- Installation testing
41
Levels of Software Testing

- Unit/component testing
- Integration testing
- System testing
- Acceptance testing
  - Determine whether test results satisfy the acceptance criteria of the project stakeholders
  - Trace each test to stakeholder requirements
- Installation testing
42
Levels of Software Testing

- Unit/component testing
- Integration testing
- System testing
- Acceptance testing
- Installation testing
  - Perform testing with the application installed on its target platform
43
Testing Phases: V-Model

[V-model: development phases on the left arm, each feeding a test plan for the test phase on the right arm that validates it.]

- Requirements specification -> acceptance test plan -> acceptance test -> service
- System specification -> system integration test plan -> system integration test
- System design -> sub-system integration test plan -> sub-system integration test
- Detailed design -> unit code and test
44
Hierarchy of Testing

- Testing
  - Ad hoc
  - Program testing
    - Unit testing
      - Black box: equivalence, boundary, decision table, state transition, use case, domain analysis
      - White box: control flow, data flow
    - Integration testing: top down, bottom up, big bang, sandwich
    - System testing
      - Function
      - Properties: performance, reliability, availability, security, usability, documentation, portability, capacity
    - Acceptance testing: benchmark, pilot, alpha, beta
45
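To make one leaf of this hierarchy concrete, here is a sketch of black-box equivalence-class and boundary-value tests in JUnit 5 for a hypothetical grade(score) function (valid scores 0-100, pass at 60 or above); the Grader class is illustrative, not from the course materials.

import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertThrows;
import org.junit.jupiter.api.Test;

class Grader {
    // Hypothetical unit under test: pass/fail at a cutoff of 60.
    static String grade(int score) {
        if (score < 0 || score > 100) {
            throw new IllegalArgumentException("score out of range");
        }
        return score >= 60 ? "pass" : "fail";
    }
}

class GraderTest {
    // Equivalence classes: invalid-low, fail (0..59), pass (60..100), invalid-high.
    // One representative test case per class:
    @Test void failClass() { assertEquals("fail", Grader.grade(30)); }
    @Test void passClass() { assertEquals("pass", Grader.grade(85)); }
    @Test void invalidLow() {
        assertThrows(IllegalArgumentException.class, () -> Grader.grade(-5));
    }

    // Boundary values: the edges of each class, where faults tend to cluster.
    @Test void boundaries() {
        assertEquals("fail", Grader.grade(0));
        assertEquals("fail", Grader.grade(59));
        assertEquals("pass", Grader.grade(60));
        assertEquals("pass", Grader.grade(100));
    }
}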