Verification and Validation


Chapter 5 Verification and Validation

Chapter Revision History
Revision 0: original 94/03/23 by Fred Popowich
Revision 1: modified 94/11/09 by Fred Popowich; reorganization and introduction of numbering; introduction of conditional text for FrameMaker containing material deleted for the 94-3 offering of the course
Revision 2: modified 97/07/15 by Fred Popowich; incorporation of additional material from Sommerville, and deletion of some Pressman material
Revision 3: modified 97/07/15 by Fred Popowich; incorporation of additional material from Sommerville, chapter 23
Revision 4: modified 98/11/13 by Fred Popowich; updating and streamlining of material

5.1 Fundamentals
Validation: are we building the right product?
Verification: are we building the product right?
So far, at each stage, we have produced documents which can be subjected to static verification:
   (a) inspections
   (b) analysis
   (c) formal verification
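Static verification, mentioned above, checks artifacts without executing them. As a small concrete illustration (not from the original notes), a C++ static_assert is one narrow form of it: the compiler checks the claim at build time, whereas an ordinary runtime assert would only check it during a test run.

```cpp
#include <cstdint>

// A constexpr function can be evaluated entirely at compile time.
constexpr std::int64_t mib(std::int64_t n) { return n * 1024 * 1024; }

// Static verification in miniature: the claim below is checked by the
// compiler with no test execution at all. A failing claim stops the
// build instead of surfacing later as a runtime defect.
static_assert(mib(4) == 4194304, "size computed incorrectly");
```

The same claim written with a runtime assert would be dynamic verification: it only catches the error on a run that actually reaches it.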

Program testing is still the most common dynamic verification and validation technique.
1. testing is a destructive process
   (a) you try to demolish the software that you just built
   (b) "the tedium of testing is the punishment for our errors"
2. testing is the execution of a program with the intent of finding an error
3. a good test case is one which is likely to find an error
4. a successful test is one which finds an unknown error; it is NOT a case where no bug is found!
5. testing shows the presence (not the absence) of defects
6. testing is not debugging
   (a) testing discovers the error
   (b) debugging removes it
   (c) the system should then be retested (regression testing)
7. it is best if you don't unit test your own code
   (a) it often happens that you do test your own code though
   (b) in large projects, there might be a separate testing group
   (c) the tester and programmer have to work together though

5.2 Strategies for Testing
Testing takes place throughout development:
   i. and not just in the final testing phase
   ii. the programmer needs to fix errors as they are found
1. starting at the module level and working outwards
   (a) COMPONENT TESTING (also known as UNIT TESTING)
       i. individual modules
       ii. individual units
   (b) INTEGRATION TESTING
       i. gradually put the system together
       ii. top down or bottom up
       iii. system testing
   (c) USER TESTING
       i. validation testing: meet the needs/expectations of the customer
       ii. acceptance testing, alpha test

       iii. deliver a system to end users (beta test)
2. put together Test Plans
   i. see Figure 22.4
   ii. testing in parallel with the different phases of the waterfall model (Fig 22.5)

5.2.1 Component Testing
1. testing of a single compilation unit (module)
2. the approach could also be the testing of a single procedure
   (a) decide what to test
       i. interface (structure tests)
          A. is there correct flow of information into and out of the module?
          B. must be done first
          C. check not only parameters and function values...
          D. ... but also external files
          E. interface bugs are common in languages without strong intermodule type checking
       ii. local data structures (functional tests)
       iii. boundary conditions (functional tests)
       iv. control structures (functional tests)
       v. error handling (stress tests)
       vi. some performance testing might be done on low level modules
   (b) write a driver
       i. a small main program to test all the module's features
       ii. it will import the module being tested
       iii. replaces modules that are superordinate to the tested module in the HMD
       iv. drivers will exercise the module by: (Pressman Fig 17.10)
          A. invoking a subordinate function
          B. sending a parameter from a table or external file
          C. displaying a parameter returned by the subordinate function
          D. a combination of B and C
   (c) write stubs
       i. small dummy subprograms

       ii. will be imported by the module being tested
       iii. use the subordinate modules' interfaces (header files)
       iv. replace modules that are subordinate to the tested module in the HMD
       v. you can test a module without having to use other people's buggy versions of their modules
       vi. will mostly contain do-nothing (empty) functions...
          A. just take the signature and add an empty body
       vii. ... or might have some minimum functionality (see Pressman Fig 17.8)
          A. could just display a trace message: cout << "Currently in Function Foo";
          B. complex functions could just return some constant value or a value from a table: int myfunc() { return 42; }
          C. could just display a parameter that was passed in
          D. could do a table search for an input parameter and return the associated output parameter
       trick: write an awk program which automatically generates stubs from header files
   (d) drivers/stubs represent overhead
       i. extra code to be written
       ii. the amount of code depends on the type of stub
   (e) will it be easier to unit test with high cohesion/low coupling?
   (f) note: the lowest modules don't need stubs
   (g) note: the highest modules don't need drivers

5.2.2 Integration Testing
1. the whole is greater than the sum of the parts: just because the parts work doesn't mean the whole system will work
2. nonincremental ("big bang") integration doesn't usually work
3. incremental integration is the solution

5.2.2.1 Top-Down Integration
(a) method

    i. start with the main module plus stubs
    ii. one by one, replace stubs with actual modules
       A. the actual modules will need their own stubs
    iii. test features in existing modules
       A. make sure that the interface is tested!
    iv. continue working across and downwards in the HMD
    v. regression testing may also be done
       A. conducting all or some of the previous tests
    vi. see Example 5.1 below, a HMD (a filled box denotes a stub)
    vii. also see the example in Fig 22.7 in the textbook

Example 5.1 (six HMD snapshots; the diagrams were flattened in this transcription): ACMS is integrated top down. Snapshot 1 has ACMS over stubs for CargoOp and FlightOp; in snapshots 2-3 those stubs are replaced by the actual modules, which bring in stubs for CargoADT and FlightADT; in snapshots 4-6 these are replaced in turn, with PalletADT (initially a stub) appearing below them.
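Using the module names from Example 5.1, a stub-plus-driver pair for unit testing CargoOp might look like the sketch below. The function names, signatures, and return values are hypothetical, in the trace-message / constant-value stub style described in 5.2.1.

```cpp
#include <iostream>

// Stub for the subordinate CargoADT module: a small dummy subprogram
// that satisfies the interface CargoOp imports. It prints a trace
// message and returns a constant stand-in value (hypothetical names).
int cargoWeight(int /*cargoId*/) {
    std::cout << "stub: cargoWeight called\n";  // trace message
    return 42;                                  // constant stand-in value
}

// The module under test, CargoOp, which imports CargoADT's interface.
bool cargoFits(int cargoId, int capacity) {
    return cargoWeight(cargoId) <= capacity;
}

// Driver: a small main-like routine, superordinate to CargoOp in the
// HMD, that exercises it with test parameters and checks the results.
bool driveCargoOp() {
    bool ok = true;
    ok = ok && (cargoFits(1, 100) == true);   // stub weight 42 <= 100
    ok = ok && (cargoFits(1, 10) == false);   // stub weight 42 > 10
    return ok;
}
```

Note how the stub lets CargoOp be tested before a real (possibly buggy) CargoADT exists, and the driver removes the need for a real ACMS above it.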

(b) notice that the integration requires the use of the stubs but not the drivers; it is often easier to write stubs than drivers
(c) we did it in a breadth-first manner; it can also be done in a depth-first manner
(d) since upper modules are integrated first, you can get earlier user feedback on the look and feel of the whole system
(e) major system control errors will be detected early
(f) access to real data can occur very late in the testing stages (disadvantage)
    i. some tests may have to be delayed until stubs are replaced by actual modules (disadvantage)
    ii. a depth-first strategy gets around some of these problems
(g) it can sometimes be difficult to write test cases for upper level modules
(h) upper level modules end up receiving more extensive testing, which can be good if these are the modules that are most critical

5.2.2.2 Bottom-Up Integration
(a) method
    i. start with the lowest level modules
    ii. a group of low level modules is combined into a cluster (or build)
    iii. a driver is used to coordinate testing
        A. drivers call lower level procedures with appropriate test parameters
    iv. the cluster is tested
    v. remove drivers and combine clusters
        A. the new modules will have their own drivers
(b) example: Figure 22.8
(c) no stubs are needed, but drivers are!
(d) lower level modules receive more extensive testing
(e) easier test case design

5.2.2.3 Sandwich Integration
(a) top-down integration for the upper modules
(b) bottom-up integration for the lower levels

5.2.2.4 General Testing and Integration Comments
(a) testing on unsure areas should start as early as possible
(b) all stubs and drivers should be saved

    i. whenever modules are changed, they should be re-unit tested before being reintegrated
(c) a Test Plan document contains an overall plan
    i. for integration
    ii. description of specific tests

5.2.3 User Testing
1. often incorporates a series of black box tests
2. demonstrates conformity to requirements
3. comes from the test plan that we formulated early in the lifecycle
4. acceptance testing
   (a) conducted by the end user
5. alpha/beta tests
   (a) used when the product is to be used by many customers
   (b) alpha test
       i. conducted by a customer at the developer's site
       ii. developer looking over the customer's shoulder and recording errors
   (c) beta test
       i. conducted at one or more customer sites
       ii. developer not usually present
       iii. customer records problems

5.2.4 Back-to-Back Testing
- used when more than one version of a system is available:
  - possible when N versions are developed in parallel
  - with prototypes
  - with one version of a series
- also known as comparison testing
  i. for critical systems, redundant software is often developed
     A. from the same specification, separate systems are created
  ii. multiple implementations are tested back to back
  iii. if the output from all implementations is identical, all are assumed to be correct

  iv. if the output from the black box tests differs, then further investigation is required
  v. ... but what if there is an error in the specification?
     A. then the systems are not entirely independent
  vi. recall that the Space Shuttle has multiple computers for some systems
     A. this suggests that there might be separate hardware
     B. and perhaps separate software

5.3 Techniques
1. Black Box Testing
   (a) used in integration testing and user testing
2. White Box (Glass Box) Testing
   (a) example: control structure testing
   (b) used in unit testing and integration testing
3. Interface Testing

5.3.1 Recall Our Goals:
1. FINDING incorrect functioning
2. FINDING undesirable/additional side effects
3. we are NOT trying to show that there are NO bugs present
4. we are NOT trying to raise confidence in the program
5. let's get into more detail than we would for the project!

5.3.2 White Box Testing
1. assumes the designer of test cases can see the detailed design and/or source code
2. advantages
   (a) can see design documentation, which may suggest where the weak points are
   (b) can see code, and thus can design complete tests of all components
3. disadvantages (weren't discussed in class)
   (a) internal knowledge may bias test case design to assume certain code looks good and doesn't need thorough testing
4. three arguments FOR white box testing (Jones, in Pressman pg. 455)

   (a) logic errors and incorrect assumptions are inversely proportional to the probability that a program path will be executed ("bugs creep into the corners, and congregate at the boundaries")
   (b) we often believe that a logical path is not likely to be executed when in fact it may be executed on a regular basis
   (c) typographical errors are random
5. some techniques
   (a) path testing (Pressman and Sommerville are quite consistent on this)
       i. all statements in the program are executed at least once
       ii. determine all paths through a program
       iii. can be done from the detailed design
       iv. develop a flow graph to depict logical control flow
          A. each programming language construct has a flow graph representation (Fig 23.14, pg. 474)
          B. it is easy to convert a flow chart into a flow graph
          C. sequential statements can be ignored
          D. compound conditions will need to be simplified
       v. derive a measure of the logical complexity (cyclomatic complexity)
          A. corresponds to the number of independent paths
          B. corresponds to the number of regions in the graph
          C. is E - N + 2, where E is the number of edges and N is the number of nodes
          D. or is P + 1, where P is the number of predicate nodes (conditions)
          E. e.g., a flow graph with 11 edges, 9 nodes and 3 predicate nodes has cyclomatic complexity 11 - 9 + 2 = 4 = 3 + 1
       vi. use this measure to define a basis set of execution paths
          A. each new path must include one edge not already contained in some path in the set
       vii. test cases are derived from the basis set
          A. we can figure out how many tests are needed
          B. each statement will get executed at least once
          C. prepare test cases corresponding to each path
       viii. one can automatically generate the basis set given a flow graph
       ix. some program analysers can determine if all statements have been executed at least once
       x. problems:

          A. data complexity is not taken into account
          B. it does not test all possible combinations of all possible paths
          C. too complicated for entire systems, but good for individual components
   (b) loop testing
       i. simple loops (where n is the maximum number of passes)
          A. skip the loop
          B. one pass through
          C. two passes through
          D. m passes, where m < n
          E. n-1, n and n+1 passes
       ii. nested loops
          A. Pressman describes a strategy
       iii. concatenated loops
          A. if the loops are truly independent, treat them as simple loops
          B. if dependent, treat them similarly to nested loops

5.3.3 Black Box Testing
1. we've already seen these
2. not an alternative to white (glass) box testing, but a complement to it
3. generally performed in the later stages of testing
4. based on the functional requirements of the software
5. advantages:
   (a) tests can be written early
   (b) tests can easily be written by someone other than the programmer (why is this good?)
6. disadvantages:
   (a) not necessarily as thorough
7. finds the following categories of errors:
   (a) incorrect or missing functions
   (b) interface errors
   (c) errors in internal data structures or external database access
   (d) performance errors
   (e) initialization or termination errors

8. equivalence partitioning
   i. break the input data into different partitions (that are processed in similar ways)
   ii. valid and invalid inputs form partitions
   iii. design test cases based on these partitions
      A. look at the boundaries (each side)
      B. also the midpoint of each partition
      C. look at the output partitions as well
   iv. an example is given in the book

5.3.4 Automatic Testing
1. tools to help with the testing
2. a simplified classification:
   (a) code analysis
       i. static analyzers (LINT), code auditors
       ii. look at program structure and content
       iii. no execution of the code
   (b) input
       i. test file generators
       ii. test data generators
   (c) output
       i. output comparators
       ii. assertion processors: does program execution meet programmer-supplied claims?
   (d) environment
       i. test harnesses: install the program in a test environment; make stubbing and driving easier
       ii. environment simulators (model the external environment to allow better testing)

5.3.5 Interface Testing
- done during integration testing (as discussed earlier)
- tests applied to entire subsystems, not to individual components
- important for Object Oriented Programming due to the interaction between different objects

- different types of interface:
  (a) parameter: data or function references are passed
  (b) shared memory
  (c) procedural: encapsulation (ADTs, objects)
  (d) message passing: messages are passed between subsystems
- common types of errors:
  (a) interface misuse (strong typing can prevent this)
  (b) interface misunderstanding ("oh, I assumed that it sorted the list first")
  (c) timing errors (real-time systems)
- hints:
  (a) test with null pointers
  (b) design calls with extreme values
  (c) use stress tests
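The hints above (null pointers, extreme values, stress tests) translate directly into test cases. Here is a minimal sketch against a hypothetical sumArray interface; its defensive treat-null-as-empty behaviour is an assumption of this example, not a general rule.

```cpp
#include <climits>
#include <cstddef>

// Hypothetical interface under test: sums n ints starting at data.
// Assumed spec for this example: a null pointer yields 0.
long sumArray(const int* data, std::size_t n) {
    if (data == nullptr) return 0;  // guard against interface misuse
    long total = 0;
    for (std::size_t i = 0; i < n; ++i) total += data[i];
    return total;
}

// Interface tests following the three hints.
bool testInterface() {
    bool ok = true;
    ok = ok && (sumArray(nullptr, 5) == 0);    // hint: null pointer
    int extreme[2] = {INT_MAX, INT_MIN};
    ok = ok && (sumArray(extreme, 2) == -1L);  // hint: extreme values
    int many[1000] = {};                       // hint: stress with a
    ok = ok && (sumArray(many, 1000) == 0);    // larger-than-usual input
    return ok;
}
```

The extreme-value case is the kind of call an interface misunderstanding would miss: the sum is accumulated in a long precisely so that INT_MAX + INT_MIN does not overflow mid-computation.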

5.4 Debugging
1. debugging is a consequence of successful testing
2. removing compilation errors is easy
3. finding runtime errors isn't
4. some methods of runtime debugging:
   (a) programmer observation
       i. the programmer tries to determine the cause by observing the output
       ii. tends not to work for anything but a simple program
       iii. the method is only as good as the programmer
   (b) debugging statements in the code
       i. insert special debugging statements in the code
       ii. usually insert WRITE statements
          A. to show the value of a variable
          B. to show WHERE the execution is...
       iii. but extra recompilation is often needed
       iv. good if there is no debugger
       v. appropriate for real-time systems
   (c) debugger
       i. a system utility (or part of the programming language software)
       ii. breakpoints (stopping locations) may be specified
       iii. the contents of program variables may be checked
       iv. the execution of statements may be traced
       v. dbx is used on UNIX
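The WRITE-statement approach in (b) is often wrapped in a macro so the debugging statements can be switched off without deleting them by hand (this does still require recompilation, as noted above). A small sketch; the TRACE name and message format are illustrative:

```cpp
#include <iostream>

// When DEBUG_TRACE is defined at compile time, TRACE shows WHERE the
// execution is (file:line) and the value of interest; otherwise it
// compiles away to nothing.
#ifdef DEBUG_TRACE
#define TRACE(msg) std::cerr << "[debug] " << __FILE__ << ":" << __LINE__ \
                             << " " << msg << "\n"
#else
#define TRACE(msg) do {} while (0)
#endif

// Example instrumented routine: sums the first n elements of xs.
int accumulate(const int* xs, int n) {
    int total = 0;
    for (int i = 0; i < n; ++i) {
        total += xs[i];
        TRACE("i=" << i << " total=" << total);  // variable + location
    }
    return total;
}
```

Compiling with -DDEBUG_TRACE turns the trace output on; a normal build pays no runtime cost for the disabled statements.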