Executable UML: the silver bullet? Or maybe not.
Dr. Joel Henry
October 22, 2008
Background: Overview
- Model Driven Development (MDD)
- Unified Modeling Language (UML)
- Executable UML (xuml)
- Testing challenges: when? where? how?
- xuml testing
- Integration into the process
- Preliminary research results
Model Driven Development
Model Driven Development aims to make:
- software development more domain-driven, as opposed to software-restricted
- model development in a specific domain more efficient (in terms of development time)
- maintenance a model-centered rather than software-centered activity (a challenge!)
Model Driven Development
Model Driven Development has advantages:
- Models are free of implementation artifacts and directly represent domain knowledge
- Domain experts can play a direct role in development
- Implementations for various platforms can be generated (web, standalone, mobile device)
Model Driven Development
- The domain expert develops model(s)
- Using code generation templates, the model is transformed to executable code
- The generated code is merged with manually written code
- Maintenance is done HERE, in the models!
Model Driven Development
[Figure: example model of an instruction pipeline: instruction fetch, hit/miss checks in the L1 and L2 caches, decode, and instruction miss delay paths]
Model Driven Development
Recap: Model Driven Development
- Specifying requirements is development
- Graphical and mathematical specification
- Model first, then generate source code
- Examples: MatrixX, Matlab/Simulink, xuml (restricted UML that can be simulated)
UML: A Brief Primer
- Graphical representation: class diagrams, use cases, sequence diagrams, state charts
- Textual descriptions: pseudocode in methods, textual state transition actions
- Many processes: design or reverse-engineer
UML Example class diagram
UML Example use case
UML Example sequence diagram
UML Example state diagram
Challenges of UML
- Computationally incomplete: UML describes a system by specifying the desired results (use cases, sequence diagrams)
- Specifies what the software produces but not how (and the devil is in the details)
- Missing key ingredients:
  - Method implementations are specified by language-dependent pseudocode
  - Actions associated with state machines are specified by an uninterpreted text string (the informal action part of UML)
Background: Executable UML
- Specifying requirements is development
- Graphical and mathematical specification
- xuml: restricted UML that can be executed (simulated)
- xuml = UML 1.x - semantically weak elements + precisely defined action semantics
Executable UML
xuml is an executable version of UML, with:
- a clearly defined model structure
- precise semantics for actions
- an action specification language for methods
- an accompanying process
xuml is based on:
- a strict development process
- executable models
- large-scale reuse -> pattern-based design
Executable UML
xuml contains:
- Domains (divide the problem into smaller problems)
- Use cases (how domains work together)
- Sequence diagrams (what happens when)
- Within domains:
  - class diagrams, each class with a state transition diagram
  - methods written in a precise, but limited, ASL
- Bridges between domains
Executable UML domain model
Executable UML use case diagram
Executable UML sequence diagram
Executable UML
- The model is a precise specification
- Can be simulated from use cases (Platform Independent Model, PIM)
- Easier transition to the target system (Platform Specific Model, PSM)
- No ambiguity in the models
- Model results immediately available (simulation)
- Managers feel confident that progress is being made
- No software development needed... but wait
Executable UML
The model requires:
- State transition diagrams (simple, but...)
- Interfaces between classes and domains (requires some thought...)
- An action specification language (another programming language...)
- Testing (the BIGGEST CHALLENGE)
  - How do we test these models?
  - Without testing, how do we know any of the above is correct?
xuml System Development
System Requirements -> Specify Domains -> Build PIM (use cases, class diagrams, state charts, methods) -> Specify PIM-to-PSM Translation -> Generate PSM -> Deployed System
Where does TESTING fit?
Requirements Driven Testing
[Figure: overlap of SOFTWARE REQUIREMENTS and SOFTWARE FUNCTIONALITY]
- Required but unimplemented functionality
- Required and implemented functionality
- Implemented but unacceptable functionality
- Implemented but acceptable functionality
Where is the line between acceptable and unacceptable?
Requirements Driven Testing
What does this mean?
- Model and system functionality drives testing
- Use knowledge of the model design to generate tests
Why do this beyond black-box testing?
- Monitor model functionality while testing
- Detect defects and the locations/causes of defects
What does this require?
- Knowledge of the model and system
- Tools to generate and configure test data
- The ability to identify defects in test results
Requirements Driven Testing
[Figure: ball-and-urn analogy: test cases are drawn against the model-based software (PIM or PSM), producing test results that are compared against the requirements]
Requirements Driven Testing: Analysis
[Figure: ball-and-urn analogy: each test case (1, 2, 3) yields its own set of output values and events for analysis]
Requirements Driven Testing
Ball-and-urn analogy: what does this mean?
- Test cases target specific software functionality
- Likely to find defects
- Critical for safety, reliability, success, etc.
What to test?
- Values around critical points
- Large numbers of input value combinations
- Sufficient coverage of each equivalence class
What are the results?
- Test coverage for input ranges and combinations
- Output range coverage, reliability, MTTF, etc.
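The "what to test" bullets above can be sketched in code. This is a minimal illustration (the helper names are mine, not the talk's tooling): generate values just around each critical point, and sample each equivalence class once.

```python
def boundary_values(critical_points, accuracy):
    """Generate test inputs just below, at, and just above each critical point."""
    values = []
    for p in critical_points:
        values.extend([p - accuracy, p, p + accuracy])
    return values

def equivalence_class_samples(classes):
    """Pick one representative (the midpoint) from each (lo, hi) equivalence class."""
    return [(lo + hi) / 2 for lo, hi in classes]

# Example: a speed input with critical points at 0 and 100, accuracy 0.1,
# and two equivalence classes [0, 100] and [100, 200]
tests = (boundary_values([0, 100], 0.1)
         + equivalence_class_samples([(0, 100), (100, 200)]))
```

A fuller generator would also take the cross product of such value sets to get the "large number of input value combinations" the slide mentions.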
Testing Goals
A testing solution requires:
- An innovative, reusable, long-term testing environment
- Requirements- and structure-driven testing
- Implementation without change to the models
- Defect detection, test case re-execution, and testing measurement
- Testing the model and the translated model with the same tests
- Leveraging past success with Matlab/Simulink
Testing Requirements
Requirements:
- Input file/matrix
- Output file/matrix
- Sample time variable or set frequency
Variable ranges:
- Input variable min, max, and accuracy
- Output variable min, max, and accuracy
Defects/exceptions/faults:
- Identification
- Tracing
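One way to capture these testing requirements for a model is as a small configuration structure. This is an assumed format for illustration (the file names, variable names, and schema are mine, not the talk's actual tool schema):

```python
# Hypothetical test configuration covering the requirements above:
# input/output files, sample time, and per-variable min/max/accuracy.
test_config = {
    "input_file": "tests/inputs.csv",
    "output_file": "tests/outputs.csv",
    "sample_time": 0.01,  # fixed sampling frequency, in seconds
    "variables": {
        "speed":    {"role": "input",  "min": 0.0, "max": 120.0, "accuracy": 0.1},
        "throttle": {"role": "output", "min": 0.0, "max": 1.0,   "accuracy": 0.01},
    },
}
```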
Test Execution
Create test data:
- functions, freehand, imported
Execute tests:
- configure input data
- wrap the model, simulate, unwrap
- capture output values
Capture results:
- inputs, states, outputs
- detect exceptions
Analyze results across multiple tests
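The execute/capture steps above can be sketched as a small harness loop. This is a generic sketch, not the talk's actual tool: the model is abstracted as a callable, and "wrap, simulate, unwrap" collapses into one call.

```python
def run_tests(model, test_cases):
    """Execute each test case against a model callable, capturing
    inputs, outputs, and any exception raised during simulation."""
    results = []
    for case in test_cases:
        record = {"input": case, "output": None, "exception": None}
        try:
            record["output"] = model(case)   # wrap model, simulate, unwrap
        except Exception as exc:             # detect exceptions
            record["exception"] = repr(exc)
        results.append(record)
    return results
```

Keeping inputs, outputs, and exceptions together in each record is what later enables analysis across multiple tests.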
Defect Detection
- Simple value range detection
- Percent change: flags an exception if the output value changes more than a specified percent over a specified number of steps
- Absolute change: flags an exception if the output value changes more than a specified amount over a specified number of steps
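The percent-change and absolute-change rules translate directly into a sliding-window check. A minimal sketch (function and parameter names are mine):

```python
def detect_change_exceptions(series, steps, max_percent=None, max_absolute=None):
    """Return the indices where the output changed more than the allowed
    percent or absolute amount over a window of `steps` samples."""
    flagged = []
    for i in range(steps, len(series)):
        old, new = series[i - steps], series[i]
        delta = abs(new - old)
        if max_absolute is not None and delta > max_absolute:
            flagged.append(i)
        elif max_percent is not None and old != 0 and 100 * delta / abs(old) > max_percent:
            flagged.append(i)
    return flagged
```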
Defect Detection: Advanced Exceptions
Combinations of exception definitions:
- disjoint ranges
- exception definitions scoped by time range
- combinatorial definitions based on multiple exception definitions
Overall system reliability:
- scenario-based reliability per major function
- overall reliability combining scenario reliabilities
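One plausible way to combine per-scenario reliabilities into an overall figure is to weight each scenario by how often it occurs in use (an operational-profile weighting). The slides don't specify the combination rule, so this is an assumption for illustration:

```python
def scenario_reliability(passed, total):
    """Estimate one scenario's reliability as the fraction of passing tests."""
    return passed / total

def overall_reliability(scenarios):
    """Combine (weight, reliability) pairs into an overall reliability,
    weighting each scenario by its share of operational usage.
    This weighted-sum rule is an assumption, not the talk's stated method."""
    return sum(weight * rel for weight, rel in scenarios)

# Two scenarios: 90% of usage with reliability 0.99, 10% with 0.90
r = overall_reliability([(0.9, 0.99), (0.1, 0.90)])
```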
Constraint Determination
- A search-based method to find the min or max values of a Simulink outport
- Two methods:
  - genetic algorithm
  - combination of Simplex and simulated annealing
- Research tool (needs an interface, help, etc.)
- International acceptance (paper invited to a conference in Oxford, UK in 2007)
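To show the flavor of such a search, here is a toy simulated-annealing loop over a single scalar input (the actual tool searches Simulink models with multiple inputs and also offers a genetic algorithm; this simplified sketch is mine, not the tool's implementation):

```python
import math
import random

def anneal_min(model, lo, hi, steps=5000, temp=1.0, cooling=0.999, seed=0):
    """Toy simulated annealing: search the input range [lo, hi] for the
    value that minimizes a scalar model output, returning (input, output)."""
    rng = random.Random(seed)
    x = rng.uniform(lo, hi)
    y = model(x)
    best_x, best_y = x, y
    for _ in range(steps):
        # Propose a nearby input, clamped to the allowed range
        candidate = min(hi, max(lo, x + rng.gauss(0, (hi - lo) * 0.05)))
        cy = model(candidate)
        # Always accept improvements; accept worse moves with a
        # probability that shrinks as the temperature cools
        if cy < y or rng.random() < math.exp((y - cy) / max(temp, 1e-9)):
            x, y = candidate, cy
            if y < best_y:
                best_x, best_y = x, y
        temp *= cooling
    return best_x, best_y
```

Running the same search on the negated output yields the global maximum, matching the tool's reported min/max pair with associated input values.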
Constraint Determination
[Figure: the constraint determination tool runs tests against a Matlab/Simulink model and reports the global minimum, the global maximum, and the input values associated with each]
Integration of Tools and Methods
- Where to place the tools?
- How to use the tools effectively?
- What to do with the results?
- How to gain acceptance?
MDA Testing and Tool Usage
[Figure: spiral development model adapted for MDA testing. Each cycle: determine objectives, alternatives, and constraints; evaluate alternatives and resolve risks (risk analysis, simulate, test); develop and verify the software and system (algorithm development, select solution, simulate, test, translate, link to 3GL code, build and verify, execute tests, starting from initial requirements); plan the next iteration]
xuml Testing Placement
[Figure: the xuml development flow with testing inserted at two points: testing while building the PIM (use cases, class diagrams, state charts, operations), using test data and drivers to validate the PIM; and testing again after generating the PSM, before the deployed system]
xuml Testing Approach
- Build a set of testing domains independent of the application domains
- Implement reusable testing methods within the testing domains
- Encapsulate application/testing domain coupling in bridges
- Include the ability to automate testing
- New applications require only new bridges
- Suitable for both PIM and PSM testing
xuml Testing Approach
[Figure: testing domains (Data Creation Functions, Test Data, File Access Functions, Expected Results, Test Execution Functions, Data Storage Functions, Test Measurement) connected through bridges to the application domains]
xuml Testing Domains
- Data Creation Functions: user-configurable functions that generate test data
- File Access Functions: functions to retrieve data from XML files or a database
- Test Data: data format conversion functions
xuml Testing Domains
- Expected Results: organize output data into a consistent format
- Test Execution Functions: a set of test types that can be executed on any application domain (through bridges)
xuml Testing Domains
- Data Storage Functions: output the test data, actual and expected results, and exceptions in a consistent format
- Test Measurement: functions that perform a variety of test measurements (MTTF, range coverage, etc.)
xuml Development Testing
[Figure: the testing domains connected through bridges to the application domains]
1. Read input data
2. Read expected results
3. Configure the input data (initialize, call-data pairs)
4. Execute the tests
5. Capture test results
6. Calculate test measurements
7. Store the test results
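The seven-step cycle above can be sketched as a driver that takes each testing-domain capability as a callable. The function names are hypothetical, and the bridges between domains are abstracted into plain function calls:

```python
def development_test_cycle(read_inputs, read_expected, execute, measure, store):
    """Sketch of the seven-step development-testing cycle. Each
    testing-domain function is passed in as a callable (names are
    illustrative, not the talk's actual interfaces)."""
    inputs = read_inputs()                         # 1. read input data
    expected = read_expected()                     # 2. read expected results
    pairs = list(zip(inputs, expected))            # 3. configure call-data pairs
    results = [execute(i) for i, _ in pairs]       # 4-5. execute tests, capture results
    measurements = measure(results,
                           [e for _, e in pairs])  # 6. calculate test measurements
    store(results, measurements)                   # 7. store the test results
    return measurements
```

The maintenance-testing cycle on the next slide reuses the same shape; only step 1 (inputs from field data) and step 2 (expected results from the deployed system or from new functionality) change.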
xuml Maintenance Testing
[Figure: the testing domains connected through bridges to version 2.0 of the application domains]
1. Read input data from data gathered during use
2. Read expected results:
   a) actual results from the deployed system, OR
   b) expected results for new functionality
3. Configure the input data (initialize, call-data pairs)
4. Execute the tests
5. Capture test results
6. Calculate test measurements
7. Store the test results
Reuse of xuml Testing Domains
[Figure: the same testing domains connected through new bridges to the new application domains]
1. Read input data
2. Read expected results
3. Configure the input data (initialize, call-data pairs)
4. Execute the tests (using the new test bridges)
5. Capture test results
6. Calculate test measurements
7. Store the test results
xuml Testing Process
xuml Testing Comparison
How did it work?
- Unit testing: 31% less time than testing C++
- Integration testing: 9% less time than C++
- Requirements driven testing: still working, but...
  - More time to build the testing domains than the application!
  - 90% less time for the 2nd application (reusing test bridges and test files)
Caveats?
- Unit testing has NO reuse across applications
- Integration testing has some reuse (test harness)
- Requirements driven testing is largely reusable (testing domains)
Questions?
This research was funded by: Lockheed Martin (Denver)
This research was done with: MRI Technologies
With strong support from: The University of Montana
Acronyms
- MDA: model driven architecture
- PIM: platform independent model
- PSM: platform specific model
- xuml: executable unified modeling language
- MTTF: mean time to failure
- MATT: Matlab automated testing tool
- RATT: reliability automated testing tool
- GIST: graphical input specification tool