SOFTWARE ENGINEERING SOFTWARE VERIFICATION AND VALIDATION. Saulius Ragaišis.


SOFTWARE ENGINEERING SOFTWARE VERIFICATION AND VALIDATION Saulius Ragaišis saulius.ragaisis@mif.vu.lt

CSC2008 SE Software Verification and Validation Learning Objectives: Distinguish between program validation and verification. Describe the role that tools can play in the validation of software. Distinguish between the different types and levels of testing (unit, integration, systems, and acceptance) for medium-size software products and related materials. Create, evaluate, and implement a test plan for a medium-size code segment. Undertake, as part of a team activity, an inspection of a medium-size code segment. Discuss the issues involving the testing of object-oriented software.

Ways to improve quality Prevention of defects: software process improvement, complexity reduction, risk management, causal analysis. Detection and correction of defects: verification, validation, rework, causal analysis.

VERIFICATION VS. VALIDATION

Purpose Verification and Validation (V&V) are independent procedures that are used together for checking that a product, service, or system meets requirements and specifications and that it fulfills its intended purpose.

Verification Verification is intended to check that a product, service, or system meets a set of initial design requirements, specifications, and regulations. It is a process that is used to evaluate whether a product, service, or system complies with regulations, specifications, or conditions imposed at the start of a development phase. This is often an internal process.

Validation Validation is intended to check that development and verification procedures result in a product, service, or system that meets initial requirements, specifications, and regulations. It is a process of establishing evidence that provides a high degree of assurance that a product, service, or system accomplishes its intended requirements. This is often an external process.

Common sense definitions Validation: "Are you building the right thing?" Verification: "Are you building it right?" "Building the right thing" refers back to the user's needs, while "building it right" checks that the specifications are correctly implemented by the system.

V&V process Must be applied at each stage of the software development process to be effective. Objectives: the discovery of defects, and the assessment of whether the system is usable in an operational situation.

V&V techniques Static methods Reviews (Walkthroughs, Inspections, Program reviews) May be supplemented by tool-based analysis Dynamic methods Testing Model based verification Simulation Mathematics based methods Formal methods

Static. Reviews. Walkthroughs: code (verification); documents: ConOps, SRS (validation); STEP, SAD, SDS (verification). Inspections: code (verification); document audits (verification). Program reviews: customer involved (validation); no customer involved (verification).

Effectiveness of static methods Can be applied from the early stages of the software life cycle. More than 60% of program defects can be detected by program inspections. More than 90% of program defects may be detectable using more rigorous mathematical program verification. The defect detection process is not confused by the existence of previous defects.

Dynamic. Testing (Verification) Unit Test: Testing the individual software modules, components, or units. Integration Testing: After unit test, the system is put together in increments. Integration testing focuses on the interfaces between software components (OO thread-based, cluster-based testing) System Testing: One goal of system testing is to ensure that the system functions as specified in the specification

Dynamic. Testing (Validation) System Testing: Another goal of system testing is to ensure that the system functions as the client expected in a controlled environment. User Acceptance Test: A set of formal tests run for the client, and specified by the client. When the system passes these tests, the software has been accepted by the client as meeting the requirements.

V model

Program testing Can only reveal the presence of errors, not prove their absence. A successful test discovers one or more errors. Testing is the only validation technique that should be used for non-functional (e.g. performance) requirements. Should be used in conjunction with static verification to ensure full product coverage.
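The idea that a successful test is one that reveals an error can be illustrated with a minimal sketch (the buggy `average` function and its test are hypothetical):

```python
def average(values):
    # Defective implementation: crashes on an empty list instead of
    # reporting a sensible error -- the defect the second test reveals.
    return sum(values) / len(values)

def test_average():
    assert average([2, 4, 6]) == 4            # passes: reveals nothing
    try:
        average([])                           # a successful defect test:
        assert False, "expected a failure"    # it exposes the crash
    except ZeroDivisionError:
        pass
```

The first assertion passing proves nothing about absence of errors; only the second test, which provokes a failure, tells us something definite.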

Types of testing Defect testing Tests designed to discover system defects A successful defect test reveals the presence of defects in the system Statistical testing Tests designed to reflect the frequency of user inputs Used for reliability estimation

Mathematics based methods Verification is based on mathematical arguments which demonstrate that a program is consistent with its specification. Programming language semantics must be formally defined. The program must be formally specified.

Automated static analysis Performed by software tools that process the source code. Can be used to flag potentially erroneous conditions for the inspection team to examine. These tools should be used to supplement reviews, not replace them.

Static analysis checks Data faults (e.g. variables not initialized) Control faults (e.g. unreachable code) Input/output faults (e.g. duplicate variables output) Interface faults (e.g. parameter type mismatches) Storage management faults (e.g. pointer arithmetic)
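Two of these fault classes can be shown in a short (hypothetical) fragment that runs, yet would be flagged by a static analyzer such as pylint:

```python
def classify(score):
    if score >= 50:
        return "pass"
    else:
        return "fail"
    return "unknown"     # control fault: unreachable code

def summarise(scores):
    total = 0            # data fault: variable assigned but never used
    if scores:
        label = classify(scores[0])
    return label         # data fault: 'label' is unbound when scores == []
```

The code behaves correctly for non-empty input, which is exactly why such faults survive testing and are best caught statically.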

Static analysis stages Control flow analysis: checks loops for multiple entry points or exits; finds unreachable code. Data use analysis: finds uninitialized variables and variables declared but never used. Interface analysis: checks the consistency of function prototypes and their uses. Information flow analysis: examines the dependencies of output variables; highlights places for inspectors to look at closely. Path analysis: identifies paths through the program and the order of statements executed on each path; highlights places for inspectors to look at closely.

Costs to fix a defect (relative cost by the time introduced vs. the time detected):

Time introduced     Time detected
                    Req.   Arch.   Constr.   System test   Post-release
Requirements         1x     3x     5-10x       10x          10-100x
Architecture         -      1x      10x        15x          25-100x
Construction         -      -       1x         10x          10-25x

Cost of quality Includes all costs of quality-related activities. Quality costs = prevention costs + detection and appraisal costs + failure costs (internal failure costs + external failure costs).
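The formula can be made concrete with hypothetical figures (all numbers below are invented for illustration):

```python
# Hypothetical cost figures (in thousands) illustrating the formula
prevention = 20               # process improvement, training
detection_and_appraisal = 35  # reviews, testing, test infrastructure
internal_failure = 15         # rework and retesting before release
external_failure = 30         # field failures, support, legal fees

quality_costs = (prevention + detection_and_appraisal
                 + internal_failure + external_failure)
print(quality_costs)  # 100
```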

Quality cost components Direct costs: reviews/inspections, unit testing, system testing, acceptance testing, test planning and design, computer time, resources (terminals, staff, etc.). Indirect costs: rework, recovery, corrective action costs, failure analysis meetings, debugging, retesting, legal fees.

Goal of Verification and Validation Establish confidence that the software is fit for its intended purpose. The software may or may not have all defects removed. The intended use of the product determines the degree of confidence needed.

Confidence parameters Software function: How critical is the software to the organization? User expectations: Certain kinds of software have low user expectations. Marketing environment: Getting a product to market early might be more important than finding all defects.

V&V planning Careful planning is required to get the most out of verification and validation processes. Planning should start early in the development process. The plan should identify the balance between static verification and testing. Test planning must define standards for the testing process, not just describe product tests.

TESTING

Testing strategy characteristics To perform effective testing, formal technical reviews (inspections) should be conducted. Testing begins at component level and works outward toward integration of the entire system. Different testing techniques are appropriate at different points in time. Testing is conducted by the developer and (for large projects) an independent test group. Testing and debugging are different activities, but debugging must be accommodated in any test strategy.

Misconceptions The developer of software should do no testing at all. The software should be "tossed over the wall" to strangers who will test it mercilessly. Testers get involved with the project only when testing is about to begin.

Criteria for completion of testing You're never done testing; the burden simply shifts from you (the software engineer) to your customer. You're done testing when you run out of time or money. Using statistical modeling and software reliability theory, models of software failures (uncovered during testing) as a function of execution time can be developed.

Strategic issues Specify product requirements in a quantifiable manner long before testing commences. State testing objectives explicitly. Understand the users of the software and develop a profile for each user category. Develop a testing plan that emphasizes rapid cycle testing. Build robust software that is designed to test itself. Use effective inspections as a filter prior to testing. Conduct inspections to assess the test strategy and test cases themselves. Develop a continuous improvement approach for the testing process.

Testing priorities Exhaustive testing is the only way to show that a program is defect free, but exhaustive testing is not possible. Tests must exercise the system's capabilities, not its components. Testing old capabilities is more important than testing new capabilities. Testing typical situations is more important than testing boundary value cases.

Unit testing Focuses verification effort on the smallest unit of software design: the software component or module. Tests the internal processing logic and data structures within the unit. Driver: a main program that accepts test case data, passes it to the unit, and prints the results. Stub (dummy subprogram): replaces modules that are subordinate to (called by) the unit being tested.
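The driver/stub arrangement can be sketched as follows (all names, including the `convert` unit and the rate service it depends on, are hypothetical):

```python
# Unit under test: depends on a subordinate rate-lookup module
def convert(amount, currency, fetch_rate):
    return round(amount * fetch_rate(currency), 2)

# Stub: a dummy subprogram standing in for the real (perhaps
# not yet implemented) rate service the unit calls
def stub_rate(currency):
    return {"EUR": 1.0, "USD": 1.1}[currency]

# Driver: accepts test case data, passes it to the unit,
# and prints the results
def driver():
    for amount, currency, expected in [(10, "USD", 11.0), (5, "EUR", 5.0)]:
        actual = convert(amount, currency, stub_rate)
        print(amount, currency, actual,
              "OK" if actual == expected else "FAIL")
```

The stub isolates the unit from its collaborators, so a failure observed by the driver can only come from the unit itself.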

Integration testing A systematic technique for constructing the software architecture while at the same time conducting tests to uncover errors associated with interfacing. Integration strategies: Big bang approach Top-down integration (Depth-first, Breadth-first) Bottom-up integration (clusters) Sandwich testing (combined approach)

Regression testing Regression testing is an important strategy for reducing side effects. Run regression tests every time a major change is made to the software. Classes of test cases in regression test suite: A representative sample of tests that will exercise all software functions. Additional tests that focus on software functions that are likely to be affected by the change. Tests that focus on the software components that have been changed.

Smoke testing An integration testing approach used when software products are being developed. Essential activities: coded software components are integrated into a build; a series of tests is designed with the intent to uncover "show stopper" errors; the build is integrated with other builds, and the entire product (in its current form) is smoke tested daily.

Smoke testing (2) The smoke test should exercise the entire system from end to end. It does not have to be exhaustive, but it should be capable of exposing major problems. The smoke test should be thorough enough that if the build passes, you can assume that it is stable enough to be tested more thoroughly. [McConnell] Benefits: Integration risk is minimized. The quality of the end-product is improved. Error diagnosis and correction are simplified. Progress is easier to assess.

Interface testing guidelines Design tests so actual parameters passed are at extreme ends of formal parameter ranges. Test pointer variables with null values. Design tests that cause components to fail. Use stress testing in message passing systems. In shared memory systems, vary the order in which components are activated.
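Two of these guidelines (extreme parameter values and null values) can be sketched against a hypothetical `head` component:

```python
def head(items, n):
    # Return the first n items; reject a null (None) list explicitly.
    if items is None:
        raise ValueError("items must not be None")
    return items[:max(0, n)]

# Actual parameters at the extreme ends of the formal parameter ranges
assert head([1, 2, 3], 0) == []
assert head([1, 2, 3], 99) == [1, 2, 3]

# Null-value test: the component should fail in a controlled way
try:
    head(None, 1)
    assert False
except ValueError:
    pass
```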

Critical modules. Characteristics: addresses several requirements, has a high level of control (resides relatively high in program structure), is complex or error prone, or has definite performance requirements.

Test documentation The overall plan for integration and a description of specific tests are documented in a Test Specification. The Test Specification contains a test plan and a test procedure, is a work product of the software process, and becomes part of the software configuration. The test plan describes the overall strategy: testing is divided into phases and builds that address specific functional and behavioral characteristics.

Software test plan components Testing process Requirements traceability Testing schedule Test recording procedures Testing HW and SW requirements Testing constraints Items tested

Strategies for OO software Unit testing -> class testing Integration testing strategies: Thread-based testing (integrates the set of classes required to respond to one input or event) Use-based testing (integrates the set of classes required to respond to one use case) Cluster testing (integrates the set of classes required to demonstrate one collaboration) Test case design draws on conventional methods, but also encompasses special features

Validation testing Validation test criteria Configuration review Alpha testing Beta testing

System testing Recovery testing: forces the software to fail in a variety of ways and verifies that recovery is properly performed. Security testing: verifies that protection mechanisms built into a system will, in fact, protect it from improper penetration. Stress testing: executes a system in a manner that demands resources in abnormal quantity, frequency, or volume. Performance testing: tests the run-time performance of software within the context of an integrated system.

System testing guidelines Testing of critical systems must often rely on simulators for sensor and actuator data (rather than endanger people or profit). Tests for normal operation should be done using a safely obtained operational profile. Tests for exceptional conditions will need to involve simulators.

Software testability Operability Observability Controllability Decomposability Simplicity Stability Understandability

Characteristics of a good test Has a high probability of finding an error. Is not redundant. Should be "best of breed". Should be neither too simple nor too complex.

Designing of test cases Objective: to uncover errors. Criteria: in a complete manner. Constraint: with a minimum of effort and time.

White-box testing White-box testing (also known as structural testing) is a method of testing software that tests internal structures or workings of an application, as opposed to its functionality (i.e. black-box testing). In white-box testing an internal perspective of the system, as well as programming skills, are used to design test cases.

White-box test design techniques Basis path testing Control structure testing: Condition testing Data flow testing

Basis path testing The goal is to ensure that all statements and conditions are executed at least once. An independent path is any path through the program that introduces at least one new set of processing statements or a new condition. The number of independent paths equals the cyclomatic complexity of the graph representing the program. Cyclomatic complexity = (the number of regions) = (number of edges - number of nodes + 2) = (number of predicate nodes + 1).
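As a concrete (hypothetical) example, the function below has two predicate nodes, so V(G) = 2 + 1 = 3, and a basis set of three independent paths must be exercised:

```python
def grade(score):
    if score >= 90:      # predicate node 1
        return "A"
    if score >= 50:      # predicate node 2
        return "pass"
    return "fail"

# V(G) = number of predicate nodes + 1 = 3, so the basis set
# contains three independent paths -- one test case for each:
assert grade(95) == "A"
assert grade(70) == "pass"
assert grade(30) == "fail"
```

The three assertions together execute every statement and both outcomes of each condition at least once, which is exactly what basis path testing requires.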

Control structure testing Condition testing: a test case design method that exercises the logical conditions contained in a program module. Loop testing: simple loops, nested loops, concatenated loops, unstructured loops. Data flow testing: selects test paths of a program according to the locations of definitions and uses of variables in the program.

Black-box testing Black-box testing is a method of software testing that tests the functionality of an application as opposed to its internal structures or workings. This method of test can be applied to all levels of software testing.

Black-box test design techniques Boundary value analysis Equivalence partitioning Decision table testing All-pairs testing State transition tables
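The first two techniques can be sketched against a hypothetical age check, using only the specified input range (valid: 18 to 120):

```python
def can_register(age):
    # Valid equivalence partition: 18..120;
    # invalid partitions: below 18, above 120.
    return 18 <= age <= 120

# Equivalence partitioning: one representative value per partition
assert can_register(10) is False
assert can_register(40) is True
assert can_register(200) is False

# Boundary value analysis: values at and just beyond each boundary,
# where off-by-one defects typically hide
assert can_register(17) is False
assert can_register(18) is True
assert can_register(120) is True
assert can_register(121) is False
```

Note that both techniques need only the specification, never the code, which is what makes them black-box.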

OO software testing methods Fault-based testing The tester looks for plausible faults (i.e., aspects of the implementation of the system that may result in defects). To determine whether these faults exist, test cases are designed to exercise the design or code. Class testing Inheritance does not obviate the need for thorough testing of all derived classes. In fact, it can actually complicate the testing process. Scenario-based testing Concentrates on what the user does, not what the product does. This means capturing the tasks (via use-cases) that the user has to perform, then applying them and their variants as tests.

OO software testing methods (2) Random testing generate a variety of random (but valid) test sequences Partition testing state-based partitioning attribute-based partitioning category-based partitioning Inter-class testing

Defect seeding Defect seeding (error seeding) is also known as "bebugging". It acts as a reliability measure for the release of the product. Usually one group of members in the project injects the defects while another group tests to remove them. The purpose of the exercise is that, while finding the known seeded defects, previously undiscovered defects may also be uncovered. Seeded defects are similar to real defects; therefore, they are not very obvious and easy to detect.
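A common use of seeding (a sketch, not from the slides) is to estimate remaining defects: if testing finds s of the S seeded defects together with n real ones, the total number of real defects is estimated as N ≈ n * S / s, assuming real and seeded defects are found in the same proportion.

```python
def estimate_total_defects(seeded, seeded_found, real_found):
    # Seeding-based estimate: real defects are assumed to be found
    # in the same proportion as the seeded ones.
    return real_found * seeded / seeded_found

# 50 defects seeded; testing finds 40 of them plus 100 real defects,
# suggesting roughly 100 * 50/40 = 125 real defects in total
# (so about 25 real defects remain undiscovered).
print(estimate_total_defects(50, 40, 100))  # 125.0
```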

Bug tracking system A bug tracking system is a software application that is designed to help quality assurance and programmers keep track of reported software bugs in their work. It may be regarded as a type of issue tracking system. Typically bug tracking systems are integrated with other software project management applications. Having a bug tracking system is extremely valuable in software development, and they are used extensively by companies developing software products. Consistent use of a bug or issue tracking system is considered one of the "hallmarks of a good software team".

Examples of bug tracking software JIRA: a browser-based bug, issue, task and defect tracking system and project management software solution used for open source and enterprise projects. (Atlassian Software Systems) Advanced Defect Tracking: designed for small and large software companies to simplify their feature development, bug tracking and helpdesk support. (Borderwave Software) Bugzilla: a defect tracking system allowing individuals or groups of developers to keep track of outstanding bugs in their product effectively.

Testing and Debugging These are two distinct processes. Testing is concerned with establishing the existence of defects in a program. Debugging is concerned with locating and repairing these defects. Debugging involves formulating a hypothesis about program behavior and then testing this hypothesis to find the error.

Definition of debugging Debugging is the activity performed after executing a successful test case (one that reveals an error). Debugging consists of determining the exact nature and location of the suspected error and fixing it. Of the two aspects of debugging, locating the error represents about 95% of the activity.

Debugging process

Psychological considerations Debugging is done by the person who developed the software, and it is hard for that person to acknowledge that an error was made. Of all the software-development activities, debugging is the most mentally taxing because of the way in which most programs are designed and because of the nature of most programming languages (i.e., the location of any error is potentially any statement in the program). Debugging is usually performed under a tremendous amount of pressure to fix the suspected error as quickly as possible. Compared to the other software-development activities, comparatively little research, literature, and formal instruction exist on the process of debugging.

Debugging approaches Debugging by Brute Force Debugging by Induction Debugging by Deduction Debugging by Backtracking Debugging by Testing

Debugging Guidelines (Error Locating) Guidelines suggested by G. J. Myers: Think. If you reach an impasse, sleep on it. If you reach an impasse, describe the problem to someone else. Use debugging tools only as a second resort. Avoid experimentation.

Debugging Guidelines (Error Repairing) Guidelines suggested by G. J. Myers: Where there is one bug, there is likely to be another. Fix the error, not just a symptom of it. The probability of the fix being correct is not 100%. The probability of the fix being correct drops as the size of the program increases. Beware of the possibility that an error correction creates a new error. The process of error repair should put one back temporarily in the design phase. Change the source code, not the object code.

REVIEWS

Errors vs. Defects Error: a quality problem found before the software is released to end users. Defect: a quality problem found only after the software has been released to end users. Some approaches (e.g. PSP, TSP) also account for defects passed between software development phases.

Software reviews People examine a work product to discover anomalies and defects. Reviews may be applied to any work product (document, model, test data, code, etc.). Reviews do not require system execution, so they may occur before implementation.

Types of reviews Author review Pair review Formal technical review, inspection Audits

Review success factors Very effective technique for discovering defects. It is possible to discover several defects in a single review. In testing one defect may in fact mask another. They reuse domain and programming knowledge (allowing reviewers to help avoid making common errors).

Reviews vs. Testing These are complementary processes. Reviews can check conformance to specifications, but not compliance with the customer's real needs (except when customers participate in the review). Testing must be used to check compliance with non-functional system characteristics like performance, usability, etc.

Inspections Formalizes the approach to reviews. Focus is on defect detection, not defect correction. Defects uncovered may be logic errors, coding errors, or non-compliance with standards.

Inspection preconditions A precise specification must be available. Team members must be familiar with the organization's standards. An inspection checklist must be prepared in advance. Inspectors should make an individual review of the work product being inspected. Management must accept that inspections will increase early development costs. Inspections should not be used to evaluate staff performance.

Inspection procedure System overview presented to inspection team. Work product and associated documents are distributed to team in advance. Errors discovered during the inspection are recorded. Product modifications are made to repair defects. Re-inspection may or may not be required.

Code inspection fault classes Data faults (e.g. array bounds) Control faults (e.g. loop termination) Input/output faults (e.g. all data read) Interface faults (e.g. parameter assignment) Storage management faults (e.g. memory leaks) Exception management faults (e.g. all error conditions trapped)

Inspection team Should have at least 4 team members: the product author; inspectors (who look for errors, omissions, and inconsistencies); a moderator (who chairs the meeting and records the errors uncovered).

Inspection rates (examples) 500 statements per hour during overview. 125 statements per hour during individual preparation. 90-125 statements per hour can be inspected by a team. Including preparation time, each 100 lines of code costs one person day (if a 4 person team is used).

Software audit A software audit is a type of software review in which one or more auditors who are not members of the software development organization conduct "an independent examination of a software product, software process, or set of software processes to assess compliance with specifications, standards, contractual agreements, or other criteria" [IEEE Std. 1028-1997, IEEE Standard for Software Reviews].

Examples of tools for static code analysis Historical: Lint, the original static code analyzer for C code. Multi-language: IBM Rational AppScan Source Edition (C/C++, .NET, Java, JSP, JavaScript, ColdFusion, Classic ASP, PHP, Perl, VisualBasic 6, PL/SQL, T-SQL, and COBOL). Java: Jtest, a testing and static code analysis product by Parasoft. C/C++: Eclipse, an IDE that includes a static code analyzer (CODAN).

What have we learned? Concepts of verification and validation Testing fundamentals Ideas for debugging The importance of reviews

QUESTIONS?