Software and Verification Technologies
Model Driven Testing Overview
July 2003
www.agedis.de
w3.haifa.il.ibm.com\softwaretesting\gtcb
IBM Internal Use Only
IBM Labs in Haifa
Outline
- Motivation
- Process
- Technology
  - Test generation
  - Test execution
  - Test analysis
- Use cases
- Deployments
Defect Cost Over Time (APAR: $15,000-$40,000)
(Chart) Cost to repair a defect in each phase:
- Coding: $25
- Unit test: $130
- Function test: $250
- Field test: $1,000
- Post release: $14,000
85% of defects are introduced in the coding phase; the chart also shows the percentage of defects found in each phase.
Source: Applied Software Measurement, Capers Jones, 1996
In the News
Source: Gartner Group
Downtime Costs (per Hour)
- Brokerage operations: $6,450,000
- Credit card authorization: $2,600,000
- eBay (one 22-hour outage): $225,000
- Amazon.com: $180,000
- Package shipping services: $150,000
- Home shopping channel: $113,000
- Catalog sales center: $90,000
- Airline reservation center: $89,000
- Cellular service activation: $41,000
- On-line network fees: $25,000
- ATM service fees: $14,000
Sources: InternetWeek 4/3/2000; Fibre Channel: A Comprehensive Introduction, R. Kembel 2000, p. 8, based on a survey by Contingency Planning Research
Current State of the Art: Cause of System Crashes
(Chart) Causes of VAX system crashes in 1985 and 1993 [Murp95], extrapolated to 2001:
- Hardware/OS failures: 70% in 1985, 28% in 1993, an estimated 10% in 2001
- System management (operator actions and problem handling): rising to 53% by 1993 and an estimated 69% in 2001
- Other causes: application, power, and network failure
Failures due to people are rising but hard to measure: how do you get an administrator to admit a mistake? (Heisenberg effect)
(Based on the lecture "Recovery Oriented Computing" by Dave Patterson, Berkeley)
Testing Problem Today
- 80% of testers' focus is on making testing possible; only 20% is on making it meaningful
- Most defects discovered in system test could have been discovered in function test
- Cost of developing and supporting private test automation solutions in each lab
- Gap between developer and tester environments
- Gap between unit, function, and system test
- System Under Test complexity
Model Based Testing Process
(Diagram) From the system specs and design, the process runs in five steps:
1. Modelling: design a state machine model (in the model editor) and a test interface
2. Test generation: GOTCHA produces an abstract test suite
3. Translation: Spider turns the abstract suite into test scripts and test code
4. Test execution: the executor tool runs the test code against the System Under Test
5. Execution analysis: the execution log and bug data feed back into the process
Benefits of Model Driven Testing
- Starting from specification
  - Involves testers early in the development process
  - Teams testers with developers
  - Forces testability into product design
- Building a behavioural model and test interface
  - Finds design and specification bugs - before code exists
  - The model is the test plan - and is easily maintained
- Automated test suite generation
  - Coverage is guaranteed - increases testing thoroughness
  - Zero test suite maintenance costs
- Automated test suite execution
  - Finds code and interface bugs
  - Includes a framework for the testing of distributed applications
  - Reduces test execution costs
Model Driven Testing
- Function test
  - Model the procedures, functions, classes: both public and private behavior
  - White box and black box testing
- Component compliance test
  - Model the external functionality of a component
  - Test the actual behavior against the model (spec)
  - Black box testing (e.g., API level)
- Integration test
  - Model the interaction between components
  - Test the interaction and system behavior
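The component compliance idea above can be sketched in a few lines: run the behavioural model (the spec) and the component in lockstep through the same calls and flag the first divergence. This is a minimal illustration, not the deck's tooling; the `CounterModel`/`CounterSUT` classes and the deliberately buggy counter are hypothetical.

```python
# Hypothetical sketch: black-box compliance test that drives a model and a
# stand-in SUT through the same call sequence and compares observable results.

class CounterModel:
    """Specification model: a counter bounded at `limit`."""
    def __init__(self, limit=3):
        self.value, self.limit = 0, limit
    def inc(self):
        if self.value < self.limit:
            self.value += 1
        return self.value

class CounterSUT:
    """Stand-in for the real component under test."""
    def __init__(self):
        self.value = 0
    def inc(self):
        self.value += 1          # deliberate bug: ignores the limit
        return self.value

def compliance_test(model, sut, steps):
    """Compare model and SUT outputs step by step; report the first divergence."""
    for i in range(steps):
        expected, actual = model.inc(), sut.inc()
        if expected != actual:
            return f"divergence at step {i}: expected {expected}, got {actual}"
    return "conforms"

print(compliance_test(CounterModel(), CounterSUT(), 5))
```

Because only observable results are compared, this stays black box: the test never inspects the SUT's internals, only its API-level responses.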
GOTCHA Coverage Directed Test Generation
- Behavior model describes:
  - Data types
  - Variables
  - Behavior rules (methods)
- Generate tests to cover the input
  - Cover the behavior rules
  - Cover the rule parameter combinations
  - Cover transitions between parameter combinations
- Generate tests to cover the behavior
  - Cover variable combinations
  - Cover transitions between variable combinations
- Interactive test generation
  - Walk through the model
  - Record and play back the walk-through
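The core of coverage-directed generation can be sketched as a breadth-first walk over a state machine that emits one test (a path from the initial state) per transition, guaranteeing transition coverage. This is a minimal sketch of the idea, not GOTCHA's GDL input or algorithm; the toggle/reset model and `generate_tests` helper are illustrative.

```python
# Hypothetical sketch of coverage-directed test generation: BFS over a finite
# state machine, emitting one covering path per transition (behavior rule).
from collections import deque

def generate_tests(initial, rules):
    """rules: dict state -> {action: next_state}. Returns transition-covering paths."""
    paths = {initial: []}            # shortest action path to each reached state
    tests, covered = [], set()
    frontier = deque([initial])
    while frontier:
        state = frontier.popleft()
        for action, nxt in rules[state].items():
            if (state, action) not in covered:
                covered.add((state, action))
                tests.append(paths[state] + [action])   # test exercising this rule
            if nxt not in paths:
                paths[nxt] = paths[state] + [action]
                frontier.append(nxt)
    return tests

rules = {
    "OFF": {"power_on": "ON"},
    "ON":  {"toggle": "ON", "power_off": "OFF"},
}
for test in generate_tests("OFF", rules):
    print(test)
```

Every transition of the model appears in at least one generated path, which is the "coverage is guaranteed" property the deck claims for generated suites.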
Test Execution Engine
- Distributed test support for major platforms (Windows, Unix)
- Direct execution in Java, C, C++, command line, sockets
- Translation to existing test harnesses
- Automated comparison with predicted results (traceable back to the ATS)
- Synchronous & asynchronous support:
  - Environment to SUT interactions
  - SUT to environment interactions
- Add invariant operations, e.g., setup and cleanup
- Test object multiplication and stepwise synchronization
- Interactive and batch execution
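The "automated comparison with predicted results" point can be illustrated as follows: each abstract test step carries a predicted result, and the engine drives the SUT, compares actual against predicted, and reports a verdict tagged with the abstract-suite step id for traceability. The step tuple format, `run_suite` function, and `Stack` stub are hypothetical, not the engine's real interfaces.

```python
# Hypothetical sketch: execute an abstract test suite against a SUT and
# produce per-step verdicts traceable back to ATS step ids.

def run_suite(suite, sut):
    """suite: list of (step_id, method_name, args, predicted_result)."""
    verdicts = []
    for step_id, call, args, predicted in suite:
        actual = getattr(sut, call)(*args)          # drive the SUT
        verdicts.append((step_id, "pass" if actual == predicted else "fail"))
    return verdicts

class Stack:
    """Stand-in SUT: a stack whose operations report observable results."""
    def __init__(self):
        self._items = []
    def push(self, x):
        self._items.append(x)
        return len(self._items)                     # observable: depth after push
    def pop(self):
        return self._items.pop()                    # observable: popped value

suite = [
    ("ats-1", "push", (7,), 1),   # predicted depth after first push
    ("ats-2", "push", (9,), 2),
    ("ats-3", "pop",  (),   9),   # predicted popped value
]
print(run_suite(suite, Stack()))
```

Keeping the ATS step id in every verdict is what lets a failure be traced back from the execution log to the abstract test suite.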
Distributed Test Components
(Diagram) The System Under Test (SUT) spans objects within processes, processes within hosts, and hosts connected by a network.
System Overview
(Diagram) A Test Suite Driver (TSD) on the main host coordinates Host Managers (HM) on hosts 1-3; each Host Manager controls one or more Process Controllers (PC), and each Process Controller drives test objects (O).
Legend: TSD = Test Suite Driver, HM = Host Manager, PC = Process Controller, O = Object
Test Object Proxy
(Diagram) A Process Controller connects over the network to SUT objects, directly or through an Object Proxy.
- The Process Controller may interact:
  - directly with the SUT object
  - indirectly via Object Proxies created by the tester
- Java and C++ proxy support
- Wizards for creating proxy templates
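The proxy indirection above is the classic proxy pattern: the controller calls the same interface either on the SUT object or on a tester-written proxy that forwards the call, letting the tester add setup, logging, or marshalling without touching the SUT. A minimal sketch, with illustrative class names (`SUTObject`, `ObjectProxy`) that are not from the deck:

```python
# Hypothetical sketch of the object-proxy idea: a proxy exposes the same
# interface as the SUT object and forwards calls, recording them as it goes.

class SUTObject:
    def serve(self, request):
        return f"handled:{request}"

class ObjectProxy:
    """Forwards calls to the real SUT object; the tester can add logging,
    setup, or marshalling here without modifying the SUT."""
    def __init__(self, target):
        self._target, self.log = target, []
    def serve(self, request):
        self.log.append(request)               # record the interaction
        return self._target.serve(request)

direct = SUTObject()
proxy = ObjectProxy(SUTObject())
# Direct and proxied calls are interchangeable from the controller's view.
assert direct.serve("ping") == proxy.serve("ping") == "handled:ping"
print(proxy.log)
```

Because the proxy preserves the interface, test scripts need not know whether they are talking to the SUT directly or through the indirection layer.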
Test Analysis Technology
- Defect cluster analysis
- Coverage analysis of test suite and execution trace
- Feedback to test generation
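The coverage-analysis-with-feedback loop can be sketched simply: compare the transitions required by the model against those observed in the execution trace, and hand the uncovered ones back as directives for the next generation round. The `coverage` function and sample data are illustrative, not the analyzer's real format.

```python
# Hypothetical sketch of coverage analysis: measure which model transitions an
# execution trace exercised, and report the missed ones as feedback for
# further test generation.

def coverage(required, trace):
    """required: set of (state, action) transitions; trace: observed list."""
    covered = set(trace) & required
    missed = required - covered
    return len(covered) / len(required), sorted(missed)

required = {("OFF", "power_on"), ("ON", "toggle"), ("ON", "power_off")}
trace = [("OFF", "power_on"), ("ON", "toggle"), ("ON", "toggle")]

ratio, missed = coverage(required, trace)
print(f"{ratio:.0%} covered, missed: {missed}")
```

The `missed` list is exactly the kind of feedback the slide describes: it tells the generator which behaviors the next suite still needs to reach.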
Model Driven Testing Architecture
(Diagram) A user interface (editor/browser) feeds the test model and test generation directives to a compiler, which produces an intermediate format encoding. The test generator turns this into an abstract test suite; together with test execution directives it drives test execution, which yields a test suite execution trace fed to the analyzer.
Use Cases
Data exchange formats:
- ATS: Abstract Test Suite (XML)
- GD: Test Generation Directives
- TED: Test Execution Directives (XML)
- SET: Test Suite Execution Trace (XML)
Tools used:
- GOTCHA Test Generator
- AGEDIS Test Generator
- TSDriver: Test Suite Driver
- TSBrowser: Test Suite Browser
- Wizard: TED & Proxy Wizard
- Con: Concurrency Tester
- AGEDIS Test Suite Editor
- AGEDIS Test Suite Coverage Analyzer
- AGEDIS Defect Cluster Analyzer
GOTCHA Test Modeling
(Diagram) From the spec, the tester writes a GDL model (plus coverage criteria and test constraints) in a text editor; GOTCHA, with test generation options, produces an abstract test suite viewable in TSBrowser.
AGEDIS Test Modeling
(Diagram) From the spec, the tester builds an AML model (plus coverage criteria and test purposes) in Objecteering; the AGEDIS generator, with test generation options, produces an abstract test suite viewable in TSBrowser.
AGEDIS Test Suite Editor
(Diagram) The AGEDIS Test Suite Editor edits the abstract test suite, which is viewable in TSBrowser.
Test Suite Reuse: SPIDER
(Diagram) The abstract test suite and test execution directives (prepared with the Input Variation Wizard, the ATS-TED Editor, and the Execution Directives Wizard) feed SPIDER, which runs the test through object proxies (with cloning and concurrency control) against the System Under Test, producing a test suite event trace viewable in TSBrowser.
Feedback and Analysis
(Diagram) The test suite execution engine emits a test suite trace; the coverage analyzer and defect analyzer turn it into reports and into new test generation directives fed back to the test generator.
Simulation Driven Test Execution
(Diagram) A model simulator (GOTCHA or AGEDIS) drives SPIDER, which executes against the System Under Test and records a test suite event trace.
Deployments
- Service processor controller
- Wireless
- Storage controller
- Mainframe
- Messaging middleware
- Database application
- Call center application
Three Tier Test
- A large number of DB update methods
- Three tier WebSphere application
- COBOL code with a Java testing interface
- Rapid deployment: a 5-person team working within 2 weeks
  - Automated model template from COBOL
  - Wizard for the execution interface
  - SQL code generated
- Bugs found: 21 code, 19 error handling, 39 documentation
File System Test
- Retest of functions
- Modelling and translation by testers
- Comparison:
  - Original test: 18 bugs, 12 person-months
  - Pilot test: 15 original bugs + 2 escapes, 10 person-months (including learning curve)
- Conclusions:
  - Efficient way to free the tester for creative testing
  - Replaces a large part of the manual test case writing

Defects by severity (# / %):
- Severity 1: 0 (0%)
- Severity 2: 10 (58.8%)
- Severity 3: 6 (35.2%)
- Severity 4: 1 (5.8%)

Defects by ODC trigger (# / %):
- Coverage: 6 (35.2%)
- Variation: 1 (5.8%)
- Sequencing: 8 (47.0%)
- Interaction: 1 (5.8%)
- Load: 1 (5.8%)
Call Center Test
- Live test of a Java-based call center
- Modelling and test execution
- Function test, regression test, system test: reusing the function test
- Distributing the test on many hosts
- Multiplying test clients

Defects by severity (# / %):
- Severity 1: 1 (2.7%)
- Severity 2: 21 (56.7%)
- Severity 3: 14 (37.8%)
- Severity 4: 1 (2.7%)

Defects by trigger (# / %):
- Coverage: 7 (18.9%)
- Variation: 4 (10.8%)
- Sequencing: 17 (45.9%)
- Interaction: 5 (13.5%)
- Reliability: 2 (5.4%)
- Recovery: 2 (5.4%)
Engineering Software: Microcode Service Processor
- Components under test:
  - State Manager
  - Event Notifier
- Challenges:
  - Asynchronous test
  - Simulation environment
  - Hardware environment
- 8 defects discovered
- Test multiplication found a difficult synchronization bug