Course: Ingegneria del Software II, academic year 2004-2005. Course web-site: [www.di.univaq.it/ingegneria2/] Verification and Validation. Lecturers: Henry Muccini and Vittorio Cortellessa, Computer Science Department, University of L'Aquila, Italy. muccini@di.univaq.it cortelle@di.univaq.it [www.di.univaq.it/muccini] [www.di.univaq.it/cortelle]

Acknowledgment» This material comes from: - Ian Sommerville, Software Engineering 7th edition, chapter 22 - D. J. Richardson, COEN 345 Lecture Notes 2

Objectives» To introduce software verification and validation and to discuss the distinction between them» To describe the program inspection process and its role in V & V» To explain static analysis as a verification technique 3

Topics covered A. Verification and validation planning B. Software inspections C. Automated static analysis 4

A. Verification vs validation» Verification: "Are we building the product right?" - The software should conform to its specification.» Validation: "Are we building the right product?" - The software should do what the user really requires. 5

V & V [next slides taken from D.J. Richardson LN] 6

Verification» An activity whose primary purpose and effect is to help demonstrate the integrity or self-consistency of the object being verified» Detect verification errors (defects) - Recall definition (the object was built right - veritas) - Example 1: inconsistent requirements - Example 2: design defects found during ROOM/SDL walkthroughs (controlled simulations) - Example 3: code defects found in code inspection - Example 4: poor code coverage in design test - Example 5: integration breaks the build 7

Validation» Detect validation errors (failures) - Example 1: requirements errors (omissions, invalid constraints, erroneous requirements) - Example 2: non-conformance of design to key client scenarios, missing high-risk exception handling - Example 3: functional test violations, improper or inadequate exception handling during function/stress test - Example 4: regression test violations (at product test, system test and release process time) 8

Verification or validation? Depends on the property» Example: elevator response»... if a user presses a request button at floor i, an available elevator must arrive at floor i soon - this property can be validated, but NOT verified ("soon" is a subjective quantity)»... if a user presses a request button at floor i, an available elevator must arrive at floor i within 30 seconds - this property can be verified (30 seconds is a precise quantity) 9
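The distinction can be sketched in code. The fragment below is a hypothetical illustration (the function and parameter names are mine; only the 30-second bound comes from the slide): the precise bound has a mechanical oracle that a test or checker can evaluate, whereas "soon" admits no such oracle.

```c
#include <assert.h>

/* Hypothetical sketch: a precise timing bound is mechanically checkable. */
#define MAX_RESPONSE_SECONDS 30

/* Returns 1 if the elevator's observed response meets the verifiable bound. */
int response_within_bound(int request_time_s, int arrival_time_s) {
    return (arrival_time_s - request_time_s) <= MAX_RESPONSE_SECONDS;
}
```

No analogous function can be written for "soon": deciding it requires a human judgment about user expectations, which is why that version of the property can only be validated.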

The V & V process» Is a whole life-cycle process -V & V must be applied at each stage in the software process.» Has two principal objectives - The discovery of defects in a system; - The assessment of whether or not the system is useful and usable in an operational situation. 10

V & V goals» Verification and validation should establish confidence that the software is fit for purpose.» This does NOT mean completely free of defects.» Rather, it must be good enough for its intended use and the type of use will determine the degree of confidence that is needed. 11

Static and dynamic verification» Software inspections. Concerned with analysis of the static system representation to discover problems (static verification) - May be supplemented by tool-based document and code analysis» Software testing. Concerned with exercising and observing product behaviour (dynamic verification) - The system is executed with test data and its operational behaviour is observed 12

Static and dynamic V&V [Diagram: software inspections apply to the requirements specification, high-level design, formal specification, detailed design and program; program testing applies to the prototype and the program.] 13

Why Testing and Analysis» Software is never correct, no matter which development technique is used» Any software must (should) be verified» Software testing and analysis are - important to control the quality of the product (and of the process) - very (often too) expensive - difficult and stimulating 14

Impact of software on testing and analysis» The type of software and its characteristics impact in different ways the testing and analysis activities: - different emphasis may be given to the same properties - different (new) properties may be required - different (new) testing and analysis techniques may be needed 15

Different emphasis on the same properties» Dependability requirements differ radically between - Safety-critical applications > flight control systems have strict safety requirements > telecommunication systems have strict robustness requirements - Mass-market products > dependability is less important than time to market» can vary within the same class of products: - reliability and robustness are key issues for multi-user operating systems (e.g., UNIX), less important for single-user operating systems (e.g., Windows or MacOS) 16

Different type of software may require different properties» Timing properties - deadline satisfaction is a key issue for real time systems, but can be irrelevant for other systems - performance is important for many applications, but not the main issue for hard-real-time systems» Synchronization properties - absence of deadlock is important for concurrent or distributed systems, not an issue for other systems» External properties - user friendliness is an issue for GUI, irrelevant for embedded controllers 17

Goals of Testing» Find faults (Debug Testing): Goodenough & Gerhart, Toward a Theory of Test Data Selection, IEEE TSE, June 1975. - a test is successful if the program fails» Provide confidence (Acceptance Testing) - of reliability - of (probable) correctness - of detection (therefore absence) of particular faults 18

Goals of Analysis» Formal proof of software properties - restrict properties (easier properties) or programs (structured programs) to allow algorithmic proof > data flow analysis > necessary and sufficient conditions - compromise between > accuracy of the property > generality of the program to analyze > complexity of the analysis > accuracy of the result 19

Program testing» Can reveal the presence of errors NOT their absence.» The only validation technique for non-functional requirements as the software has to be executed to see how it behaves.» Should be used in conjunction with static verification to provide full V&V coverage. 20
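A small hypothetical example of this limitation (the leap-year functions are illustrative, not part of the course material): a test set that exercises only the obvious cases passes the defective version without revealing its fault, so passing tests demonstrate only that no error was found, not that none exists.

```c
#include <assert.h>

/* Illustrative defective version: passes tests on 2016 and 2019,
   yet still contains a fault (it ignores the century rule). */
int is_leap_year_buggy(int year) {
    return year % 4 == 0;
}

/* Correct version, exposed by the additional input 1900. */
int is_leap_year(int year) {
    return (year % 4 == 0 && year % 100 != 0) || year % 400 == 0;
}
```

Both versions agree on 2016 and 2019; only the input 1900 reveals the defect, which is exactly why testing can show the presence of errors but never their absence.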

Types of testing» Defect testing - Tests designed to discover system defects. - A successful defect test is one which reveals the presence of defects in a system. - Covered in Chapter 23» Validation testing - Intended to show that the software meets its requirements. - A successful test is one that shows that a requirement has been properly implemented. 21

Testing and debugging» Defect testing and debugging are distinct processes.» Verification and validation is concerned with establishing the existence of defects in a program.» Debugging is concerned with locating and repairing these errors.» Debugging involves formulating hypotheses about program behaviour and then testing these hypotheses to find the system error. 22

The debugging process [Diagram: test results, the specification and test cases feed a cycle of: locate error → design error repair → repair error → retest program.] 23
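The hypothesis-and-retest cycle can be illustrated with a hypothetical off-by-one defect (the code and names are mine, not from the slides): a failing test confirms the hypothesis "the loop stops one element short", the loop bound is repaired, and the program is retested.

```c
#include <assert.h>

static const int sample[3] = {1, 2, 3};

/* Defective version: the loop bound stops one element short,
   so the last array element is never added. */
int sum_buggy(const int *a, int n) {
    int s = 0;
    for (int i = 0; i < n - 1; i++)
        s += a[i];
    return s;
}

/* Repaired version after the hypothesis was confirmed and the
   bound corrected; retesting now gives the expected result. */
int sum_fixed(const int *a, int n) {
    int s = 0;
    for (int i = 0; i < n; i++)
        s += a[i];
    return s;
}
```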

V & V planning» Careful planning is required to get the most out of testing and inspection processes.» Planning should start early in the development process.» The plan should identify the balance between static verification and testing.» Test planning is about defining standards for the testing process rather than describing product tests. 24

B. Software inspections Software inspections. Concerned with analysis of the static system representation to discover problems (static verification)» These involve people examining the source representation with the aim of discovering anomalies and defects.» Inspections do not require execution of a system so may be used before implementation.» They may be applied to any representation of the system (requirements, design, configuration data, test data, etc.).» They have been shown to be an effective technique for discovering program errors. 25

Inspections and testing» Inspections and testing are complementary and not opposing verification techniques.» Both should be used during the V & V process. - Inspections can check conformance with a specification but not conformance with the customer's real requirements. - Inspections cannot check non-functional characteristics such as performance, usability, etc. - Inspections are more effective for defect discovery than program testing [Fagan 86] 26

Inspection roles [Grady and Van Slack, 1994]
- Author or owner: the programmer or designer responsible for producing the program or document; responsible for fixing defects discovered during the inspection process.
- Inspector: finds errors, omissions and inconsistencies in programs and documents; may also identify broader issues that are outside the scope of the inspection team.
- Reader: presents the code or document at an inspection meeting.
- Scribe: records the results of the inspection meeting.
- Chairman or moderator: manages the process and facilitates the inspection; reports process results to the chief moderator.
- Chief moderator: responsible for inspection process improvements, checklist updating, standards development etc. 27

Inspection pre-conditions» A precise specification must be available.» Team members must be familiar with the organisation's standards.» Syntactically correct code or other system representations must be available.» An error checklist should be prepared.» Management must accept that inspection will increase costs early in the software process.» Management should not use inspections for staff appraisal, i.e. finding out who makes mistakes. 28

The inspection process [Diagram: planning (team moderator) → overview (authors) → individual preparation (inspectors) → inspection meeting (inspectors and authors) → rework (authors) → follow-up (team moderator).] 29

Inspection checklists» A checklist of common errors should be used to drive the inspection.» Error checklists are programming language dependent and reflect the characteristic errors that are likely to arise in the language.» In general, the 'weaker' the type checking, the larger the checklist.» Examples: initialisation, constant naming, loop termination, array bounds, etc. 30

Inspection checks 1
- Data faults: Are all program variables initialised before their values are used? Have all constants been named? Should the upper bound of arrays be equal to the size of the array or Size-1? If character strings are used, is a delimiter explicitly assigned? Is there any possibility of buffer overflow?
- Control faults: For each conditional statement, is the condition correct? Is each loop certain to terminate? Are compound statements correctly bracketed? In case statements, are all possible cases accounted for? If a break is required after each case in case statements, has it been included?
- Input/output faults: Are all input variables used? Are all output variables assigned a value before they are output? Can unexpected inputs cause corruption? 31
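As a hypothetical illustration (the routine is mine, not from the slides), the code below is the outcome of an inspection driven by the data-fault and control-fault checks: the original draft declared `sum` without initialising it and ran the first loop with `i <= N`, writing one element past the array bound. Both anomalies are exactly what the checklist questions target.

```c
#include <assert.h>

#define N 5

/* Corrected after inspection: sum initialised before use (data-fault
   check) and both loop upper bounds kept strictly below N (array-bound
   check). Sums 0+1+2+3+4. */
int sum_first_n(void) {
    int a[N];
    int sum = 0;
    for (int i = 0; i < N; i++)
        a[i] = i;
    for (int i = 0; i < N; i++)
        sum += a[i];
    return sum;
}
```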

Inspection checks 2
- Interface faults: Do all function and method calls have the correct number of parameters? Do formal and actual parameter types match? Are the parameters in the right order? If components access shared memory, do they have the same model of the shared memory structure?
- Storage management faults: If a linked structure is modified, have all links been correctly reassigned? If dynamic storage is used, has space been allocated correctly? Is space explicitly de-allocated after it is no longer required?
- Exception management faults: Have all possible error conditions been taken into account? 32

Inspection rate [Barnard and Price, 94]» 500 code statements/hour presented during overview.» 125 source statements/hour examined during individual preparation.» 90-125 statements/hour can be inspected during the inspection meeting.» Inspection is therefore an expensive process.» Inspecting 500 lines costs about 40 man-hours of effort - about £2,800 at UK rates. 33

C. Automated static analysis» Static analysers are software tools for source text processing.» They parse the program text and try to discover potentially erroneous conditions and bring these to the attention of the V & V team.» They are very effective as an aid to inspections - they are a supplement to but not a replacement for inspections. 34

Stages of static analysis» Control flow analysis. Checks for loops with multiple exit or entry points, finds unreachable code, etc.» Data use analysis. Detects uninitialised variables, variables written twice without an intervening assignment, variables which are declared but never used, etc.» Interface analysis. Checks the consistency of routine and procedure declarations and their use 35
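A minimal hypothetical fragment of the kind data use analysis reports (the function and names are illustrative): the code compiles and runs correctly, but an analyser would flag the first assignment to `y` as a value written and never read before being overwritten. Such anomalies are not failures in themselves; they are symptoms that are brought to the attention of the V & V team.

```c
#include <assert.h>

/* Illustrative data-use anomaly: the variable y is written twice
   with no intervening read, so the first write is a dead store
   that a static analyser would report for review. */
int data_use_anomalies(int x) {
    int y;
    y = x + 1;   /* flagged: this value is never read... */
    y = x + 2;   /* ...because it is overwritten here */
    return y;
}
```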

Stages of static analysis» Information flow analysis. Identifies the dependencies of output variables. Does not detect anomalies itself but highlights information for code inspection or review» Path analysis. Identifies paths through the program and sets out the statements executed in that path. Again, potentially useful in the review process» Both these stages generate vast amounts of information. They must be used with care. 36

LINT static analyzer for C programs
138% more lint_ex.c
#include <stdio.h>
printarray (Anarray)
int Anarray;
{
printf("%d", Anarray);
}
main ()
{
int Anarray[5];
int i;
char c;
printarray (Anarray, i, c);
printarray (Anarray);
}
139% cc lint_ex.c
140% lint lint_ex.c
lint_ex.c(10): warning: c may be used before set
lint_ex.c(10): warning: i may be used before set
printarray: variable # of args. lint_ex.c(4) :: lint_ex.c(10)
printarray, arg. 1 used inconsistently lint_ex.c(4) :: lint_ex.c(10)
printarray, arg. 1 used inconsistently lint_ex.c(4) :: lint_ex.c(11)
printf returns value which is always ignored 37

Use of static analysis» Particularly valuable when a language such as C is used, which has weak typing and hence many errors are undetected by the compiler.» Less cost-effective for languages like Java that have strong type checking and can therefore detect many errors during compilation. 38

Verification and formal methods» Formal methods can be used when a mathematical specification of the system is produced.» They are the ultimate static verification technique.» They involve detailed mathematical analysis of the specification and may develop formal arguments that a program conforms to its mathematical specification. 39

Arguments for formal methods» Producing a mathematical specification requires a detailed analysis of the requirements and this is likely to uncover errors.» They can detect implementation errors before testing when the program is analysed alongside the specification. 40

Arguments against formal methods» Require specialised notations that cannot be understood by domain experts.» Very expensive to develop a specification and even more expensive to show that a program meets that specification.» It may be possible to reach the same level of confidence in a program more cheaply using other V & V techniques. 41

Cleanroom software development» The name is derived from the 'Cleanroom' process in semiconductor fabrication. The philosophy is defect avoidance rather than defect removal.» This software development process is based on: - Incremental development; - Formal specification; - Static verification using correctness arguments; - Statistical testing to determine program reliability. 42

The Cleanroom process [Diagram: formally specify system → define software increments → construct structured program → formally verify code → integrate increment, with error rework feeding back into construction and verification; in parallel, develop operational profile → design statistical tests → test integrated system.] 43

Cleanroom process characteristics» Formal specification using a state transition model.» Incremental development where the customer prioritises increments.» Structured programming - limited control and abstraction constructs are used in the program.» Static verification using rigorous inspections.» Statistical testing of the system (covered in Ch. 24). 44

Formal specification and inspections» The state-based model is a system specification and the inspection process checks the program against this model.» The programming approach is defined so that the correspondence between the model and the system is clear.» Mathematical arguments (not proofs) are used to increase confidence in the inspection process. 45

Cleanroom process teams» Specification team. Responsible for developing and maintaining the system specification.» Development team. Responsible for developing and verifying the software. The software is NOT executed or even compiled during this process.» Certification team. Responsible for developing a set of statistical tests to exercise the software after development. Reliability growth models used to determine when reliability is acceptable. 46

Cleanroom process evaluation» The results of using the Cleanroom process have been very impressive with few discovered faults in delivered systems.» Independent assessment shows that the process is no more expensive than other approaches.» There were fewer errors than in a 'traditional' development process.» However, the process is not widely used. It is not clear how this approach can be transferred to an environment with less skilled or less motivated software engineers. 47

Key points» Verification and validation are not the same thing. Verification shows conformance with specification; validation shows that the program meets the customer's needs.» Test plans should be drawn up to guide the testing process.» Static verification techniques involve examination and analysis of the program for error detection. 48

Key points» Program inspections are very effective in discovering errors.» Program code in inspections is systematically checked by a small team to locate software faults.» Static analysis tools can discover program anomalies which may be an indication of faults in the code.» The Cleanroom development process depends on incremental development, static verification and statistical testing. 49