Lecture 26: Testing
Software Engineering, ITCS 3155, Fall 2008
Dr. Jamie Payton
Department of Computer Science, University of North Carolina at Charlotte
Dec. 9, 2008
Verification vs. Validation
- Verification: Are we building the product right? The software should conform to its specification.
- Validation: Are we building the right product? The software should do what the user really requires.
The V&V Process
- V&V must be applied at each stage in the software process.
- Two principal objectives:
  - Discover defects in a system
  - Assess whether or not the system is useful and usable in an operational situation
V&V Goals
- Verification and validation should establish confidence that the software is fit for its purpose.
- This does NOT mean completely free of defects: V&V is not an exhaustive process, and making an exhaustive determination is impossible.
- Rather, the software must be good enough for its intended use.
V&V Confidence
- How good is good enough? It depends on the system's purpose, user expectations, and the marketing environment.
  - Software function: the level of confidence depends on how critical the software is to an organization.
  - User expectations: users may have low expectations of certain kinds of software.
  - Marketing environment: getting a product to market early may be more important than finding every defect in the program.
Static and Dynamic Verification
Two complementary approaches to system checking and analysis:
- Software inspections: a form of static verification; analysis of a static system representation to discover problems.
- Software testing: a form of dynamic verification; exercising the product and observing its behavior.
Testing Strategy
- First: Unit testing. Test individual software units in isolation.
- Second: Integration testing. The focus is on the design and construction of the architecture.
- Third: Validation testing. The focus is on user-visible actions and output from the system, to determine whether the requirements are satisfied.
- Finally: System testing. The entire software system is tested as a whole.
Unit Testing
- Idea: test individual software units (components, modules) in isolation.
  - Tests usually exercise important paths in the control structure.
  - Tests focus on the internal logic and data structures of the unit.
- Test case design: design test cases to discover errors due to erroneous computations, incorrect comparisons, and improper control flow.
- Boundary testing!
  - The nth time through a loop
  - The min/max element accessed in an array
  - The min/max allowable value assigned to a variable
- Often performed by the developer.
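As a small illustration of boundary testing, here is a hypothetical unit (the function `clamp` and its test values are not from the lecture) exercised exactly at, just inside, and just outside its min/max allowable values:

```python
# Hypothetical example: unit-testing a small function at its boundaries.
def clamp(value, low, high):
    """Restrict value to the inclusive range [low, high]."""
    if value < low:
        return low
    if value > high:
        return high
    return value

# Boundary tests: at each limit, just outside each limit, and an interior value.
assert clamp(0, 0, 10) == 0      # min allowable value
assert clamp(10, 0, 10) == 10    # max allowable value
assert clamp(-1, 0, 10) == 0     # just below min
assert clamp(11, 0, 10) == 10    # just above max
assert clamp(5, 0, 10) == 5      # interior value
```

A defect such as writing `value <= low` instead of `value < low` would be caught only by the tests at the exact boundary values.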
Integration Testing
- Best practice: incremental integration. The program is constructed and tested in small increments.
- Incremental testing approaches:
  - Top-down: integrate modules by moving downward through the control hierarchy (depth-first or breadth-first). Higher-level modules are used as drivers; stubs must be written for the lower-level modules they call.
  - Bottom-up: begins construction and testing with atomic modules; higher-level drivers must be written.
- Perform regression testing after each integration.
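A minimal sketch of the top-down idea, assuming a hypothetical `order_total` module whose lower-level pricing module is not yet built: the missing module is replaced by a stub that returns canned data, so the high-level module can be tested first.

```python
# Hypothetical top-down integration sketch: the high-level module is real,
# the lower-level module it depends on is replaced by a stub.
def fetch_price_stub(item_id):
    # Stub standing in for an unfinished pricing/database module;
    # returns canned values so the caller can be exercised.
    return {"apple": 100, "pear": 150}.get(item_id, 0)

def order_total(items, fetch_price=fetch_price_stub):
    # High-level module under test; the lower-level function is injected,
    # so the stub can later be swapped for the real module.
    return sum(fetch_price(i) for i in items)

assert order_total(["apple", "pear"]) == 250
assert order_total([]) == 0
```

When the real pricing module is integrated, the same tests are rerun with it in place of the stub (regression testing after the integration step).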
Test Characteristics
A good test:
- Has a high probability of finding an error: test classes are designed around ideas about how the software might fail.
- Is not redundant: test classes test for different errors.
- Is the "best of breed": choose the best test out of a set of similar ones.
- Is neither too simple nor too complex: combining tests can save time and effort, but it can also create side effects that mask errors.
Testing Approaches
- White-box testing: uses knowledge about the internals of the program to create tests and test data. Tests demonstrate that each function is fully operational and error-free.
- Black-box testing: no knowledge of internals (control flow, data structures). Tests are constructed using only the interface.
White-Box Testing
Can derive test cases that:
- Guarantee that all independent paths have been exercised
- Exercise all logical decisions on their true and false sides
- Execute loops at their boundaries and within their bounds
- Exercise internal data structures to ensure validity
Techniques: basis path testing and control structure testing.
Basis Path Testing
1. Create a flow graph that depicts the logical flow of control.
2. Determine the cyclomatic complexity of the flow graph; this provides a quantitative measure of the logical complexity of the program.
3. Determine a basis set of linearly independent paths.
4. Prepare test cases that force execution of each path in the basis set.
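The steps above can be sketched on a hypothetical function (`grade` is an illustration, not from the lecture). It has three simple predicates, so V(G) = P + 1 = 4, and one test case is prepared per basis path:

```python
# Hypothetical example: deriving basis path tests for a small function.
def grade(score):
    if score < 0 or score > 100:   # compound predicate -> two simple predicates
        return "invalid"
    if score >= 60:                # third simple predicate
        return "pass"
    return "fail"

# V(G) = P + 1 = 3 + 1 = 4, so four test cases, one forcing each basis path:
assert grade(-1) == "invalid"    # first condition of the compound predicate true
assert grade(101) == "invalid"   # second condition of the compound predicate true
assert grade(75) == "pass"       # both range checks false, pass predicate true
assert grade(30) == "fail"       # all predicates false
```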
Flow Graph
- Depicts logical control flow.
- A node represents a statement.
  - Predicate nodes represent conditionals and have two or more outgoing edges (though not every node with two outgoing edges is a predicate node).
  - Compound predicates may be broken up into two nodes.
- An edge represents flow of control.
- Areas bounded by edges and nodes are called regions.
[Figure: a flowchart with statements 1-11 and the corresponding flow graph, with compound nodes "2,3" and "4,5" and regions R1-R4]
Finding Independent Paths
- Independent path: any path that introduces at least one new set of processing statements or a new condition. In the flow graph, it moves along at least one edge that has not been traversed before the path was defined.
- Basis set: a set of independent paths that covers every statement in the program.
- Example independent paths:
  - Path 1: 1, 11
  - Path 2: 1, 2, 3, 4, 5, 10, 1, 11
  - Path 3: 1, 2, 3, 6, 8, 9, 10, 1, 11
  - Path 4: 1, 2, 3, 6, 7, 9, 10, 1, 11
[Figure: the flow graph from the previous slide]
Determining the Basis Set
- How do we know when we've found all the paths?
- Cyclomatic complexity:
  - Defines the number of independent paths in the basis set
  - Provides an upper bound on the number of tests required
- Approaches to computing cyclomatic complexity V(G):
  - The number of regions in the flow graph
  - V(G) = E - N + 2, where E is the number of edges and N is the number of nodes
  - V(G) = P + 1, where P is the number of predicate nodes
Cyclomatic Complexity
Applying the three approaches to the example flow graph:
- Number of regions: 4 (R1-R4)
- V(G) = E - N + 2
- V(G) = P + 1, where P is the number of predicate nodes
[Figure: the flow graph annotated with regions R1-R4]
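The three formulas can be checked against the example flow graph from the earlier slides (the edge list below is reconstructed from the four example paths, so treat it as an illustration):

```python
# Computing cyclomatic complexity of the example flow graph three equivalent ways.
# Edges reconstructed from the basis paths; compound nodes kept as "2,3" and "4,5".
edges = [(1, "2,3"), ("2,3", "4,5"), ("2,3", 6), (6, 7), (6, 8),
         (7, 9), (8, 9), ("4,5", 10), (9, 10), (10, 1), (1, 11)]
nodes = {n for e in edges for n in e}

E, N = len(edges), len(nodes)   # 11 edges, 9 nodes
P = 3                           # predicate nodes: 1, "2,3", and 6
regions = 4                     # R1..R4 in the slide figure

v_edges = E - N + 2             # V(G) = E - N + 2 = 11 - 9 + 2
v_preds = P + 1                 # V(G) = P + 1 = 3 + 1
assert v_edges == v_preds == regions == 4
```

All three computations agree on V(G) = 4, matching the four independent paths listed earlier.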
Creating Basis Path Test Cases
- Choose data that sets the conditions at the predicate nodes so as to exercise each path in the basis set.
- Each test case is executed, and its results are compared to the expected results.
Control Structure Testing
General techniques:
- Condition testing: exercises each logical condition in the program.
- Data flow testing: selects tests according to the locations of definitions and uses of variables.
- Loop testing: focuses on the validity of loop constructs.
Condition Testing
- Tests the conditional statements in a program.
- Used to find condition components that may be incorrect:
  - Boolean operator errors
  - Boolean variable errors
  - Boolean parenthesis errors
  - Relational operator errors
  - Arithmetic expression errors
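A hypothetical illustration of one of these error classes, a relational operator error (`>` written where `>=` was intended): only a test input sitting exactly on the boundary of the condition distinguishes the two versions.

```python
# Hypothetical relational-operator error, and the condition test that exposes it.
def is_adult(age):
    return age >= 18             # intended condition

def is_adult_buggy(age):
    return age > 18              # relational operator error: > instead of >=

# Inputs away from the boundary cannot tell the versions apart...
assert is_adult(30) == is_adult_buggy(30)
assert is_adult(10) == is_adult_buggy(10)
# ...but a condition test at the boundary value catches the defect.
assert is_adult(18) is True
assert is_adult_buggy(18) is False
```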
Data Flow Testing
- Selects tests according to the locations of definitions and uses of variables.
- Each statement is numbered. For each statement s, define:
  - DEF(s) = {x | s contains a definition of x}
  - USE(s) = {x | s contains a use of x}
- Definition-use chain for a variable x: [x, s, s'], where x is in DEF(s) and in USE(s').
- Strategy: cover every definition-use chain for every variable.
- Limitation: does not necessarily cover all branches.
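The definitions above can be made concrete with a hypothetical four-statement straight-line program (the statement table and `du_chains` helper are illustrations, not from the lecture; branching is ignored):

```python
# Hypothetical sketch: DEF/USE sets per numbered statement, and the
# definition-use chains a data-flow test suite would need to cover.
statements = {
    1: {"def": {"x"}, "use": set()},        # x = input value
    2: {"def": {"y"}, "use": {"x"}},        # y = x * 2
    3: {"def": {"x"}, "use": {"x"}},        # x = x + 1
    4: {"def": set(), "use": {"x", "y"}},   # output(x, y)
}

def du_chains(var):
    # Pair each definition of var with every later use it can reach,
    # stopping when var is redefined (straight-line code only).
    chains = []
    for s in sorted(statements):
        if var not in statements[s]["def"]:
            continue
        for t in sorted(statements):
            if t <= s:
                continue
            if var in statements[t]["use"]:
                chains.append((var, s, t))
            if var in statements[t]["def"]:
                break   # var redefined: this definition's chains end here
    return chains

# x defined at 1 reaches uses at 2 and 3; x redefined at 3 reaches the use at 4.
assert du_chains("x") == [("x", 1, 2), ("x", 1, 3), ("x", 3, 4)]
assert du_chains("y") == [("y", 2, 4)]
```

A data-flow test suite would then include at least one test executing each of these chains.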
Loop Testing
Different testing approaches are defined for different types of loops: simple, nested, concatenated, and unstructured.
Simple loops (with a maximum of n passes):
- Skip the loop entirely
- Only one pass through
- Two passes through
- m passes through, with m < n
- n-1, n, and n+1 passes through
Loop Testing (continued)
- Nested loops:
  - Start at the innermost loop; set all other loops to their minimum values.
  - Conduct the simple loop tests for the innermost loop.
  - Work outward, conducting tests for the next enclosing loop.
  - Continue until all loops have been tested.
- Concatenated loops: if independent, use the approach for simple loops; otherwise, use the approach for nested loops.
- Unstructured loops: don't test; redesign!
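The simple-loop schedule can be sketched on a hypothetical loop with n = 5 possible passes (`sum_first` and its data are illustrations, not from the lecture):

```python
# Hypothetical simple loop tested at the pass counts the lecture prescribes.
def sum_first(values, n):
    """Sum at most the first n elements of values."""
    total = 0
    for i in range(min(n, len(values))):
        total += values[i]
    return total

data = [1, 2, 3, 4, 5]                  # at most n = 5 passes possible
assert sum_first(data, 0) == 0          # skip the loop entirely
assert sum_first(data, 1) == 1          # one pass through
assert sum_first(data, 2) == 3          # two passes through
assert sum_first(data, 3) == 6          # m passes, m < n
assert sum_first(data, 4) == 10         # n - 1 passes
assert sum_first(data, 5) == 15         # n passes
assert sum_first(data, 6) == 15         # n + 1 requested: loop must not overrun
```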
Black-Box Testing
- Focuses on functional requirements.
- No knowledge of internals (control flow, data structures); tests are constructed using only the interface.
- Complementary to white-box testing.
- Attempts to find errors due to:
  - Incorrect or missing functions
  - Interface errors
  - Errors in data structures or external database access
  - Behavior or performance errors
  - Initialization and termination errors
Black-Box Testing Techniques
- Graph-based testing methods: create a graph of important objects and their relationships, then devise a set of tests that covers the paths through the graph.
- Equivalence partitioning: divides the input domain into classes of data from which test cases are derived. An ideal test case covers an entire class.
- Boundary value analysis: selects test cases at the edges of the equivalence classes.
Equivalence Partition Testing
- Input data and output results often fall into different classes in which all members of a class are related.
- Each of these classes is an equivalence partition, or domain: the program behaves in an equivalent way for each member of the class.
- Test cases should be chosen from each partition.
Binary Search Equivalence Classes

  Sequence           | Element
  -------------------|----------------------------
  Single value       | In sequence
  Single value       | Not in sequence
  More than 1 value  | First element in sequence
  More than 1 value  | Last element in sequence
  More than 1 value  | Middle element in sequence
  More than 1 value  | Not in sequence
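A standard iterative binary search (this particular implementation is an illustration, not the lecture's code), with one black-box test drawn from each equivalence class in the table above:

```python
# Binary search over a sorted sequence; returns the index of key, or -1.
def binary_search(seq, key):
    lo, hi = 0, len(seq) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if seq[mid] == key:
            return mid
        if seq[mid] < key:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

# One test per equivalence class from the table:
assert binary_search([7], 7) == 0             # single value, in sequence
assert binary_search([7], 3) == -1            # single value, not in sequence
assert binary_search([1, 3, 5, 7], 1) == 0    # first element in sequence
assert binary_search([1, 3, 5, 7], 7) == 3    # last element in sequence
assert binary_search([1, 3, 5, 7], 3) == 1    # middle element in sequence
assert binary_search([1, 3, 5, 7], 4) == -1   # more than 1 value, not in sequence
```

Boundary value analysis would then add cases at the edges of these classes, such as the elements adjacent to the first and last positions.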
Structuring a Software Test Plan
Elements of a test plan:
- Testing process
- Requirements traceability
- Tested items
- Testing schedule
- Test recording procedures
- Hardware and software requirements
- Constraints