Deliver robust products at reduced cost by linking model-driven software testing to quality management.


Quality management white paper, September 2009

Page 2

Contents
2  Closing the productivity gap between development and testing
3  The benefits of model-driven software testing
4  The role of model-based testing in early detection of defects
6  Creating model-driven tests
9  The missing link: making tests available to QA
9  Store and rerun tests from almost any source
11 Integrating testing into the design process

Closing the productivity gap between development and testing

With embedded product designs becoming more complex and product lifecycles shrinking, development efficiency is essential. Fortunately, the emergence of the Unified Modeling Language (UML) and Systems Modeling Language (SysML) standards has provided the opportunity for breakthrough gains in streamlining the development process. These standards enable engineers and software designers to significantly improve their productivity by transitioning from a code-based development process to model-driven development (MDD). Using models, software engineers can more clearly understand and analyze requirements, define design specifications, test system concepts using simulation and automatically generate code for direct deployment on the target hardware. Development teams can standardize processes and automate repetitive tasks to boost productivity and enhance regulatory compliance through self-documenting data and workflows. As a result, engineers and developers are able to deliver more complex and intelligent designs in much less time.

These benefits of model-based development are well documented. But software testing is another story altogether. The testing process is still largely code based, creating a serious productivity gap in the development process: designs can be produced much faster than they can be tested. It's not that model-based testing tools don't exist. They do, and they have been proven to save time by enabling developers to test in the same language in which they design: UML. Another advantage is that when developers start with scenarios as requirements, they are testing against what customers and marketing staff have agreed to.

Page 3

So why do developers continue to code their own test scripts? A major roadblock for model-based testing is the lack of integration with quality assurance (QA). Why aren't model-based tests available to the QA team? The main reason is that QA professionals don't understand UML and have little reason to learn it.

This white paper explains how organizations can overcome this obstacle. It highlights the benefits of model-based testing and describes a model-driven approach designed to create tests that can be executed from external sources such as QA. No matter who executes the tests, results are available to all affected practitioners, closing the loop from requirements to verification and validation of use cases.

The benefits of model-driven software testing

Year-over-year studies by the Embedded Market Forecasters organization show that model-based testing helps companies deliver complex designs on time while meeting predesign expectations for performance, systems functionality and features. These studies compare actual data gathered from surveys of practitioners in three types of development and testing projects: legacy, transitional and enhanced. The legacy projects used no model-based practices at all, either in development or testing. The transitional projects used model-based development, but code-level testing. And the enhanced projects used models to drive both development and testing.

Page 4

The most recent comparative data shows that enhanced projects produce designs that cost, on average, tens of thousands of dollars less than those produced in legacy and transitional environments. Enhanced designs also were completed more quickly, requiring up to six weeks less time from start to shipment. And MDD and model-driven testing (MDT) together resulted in better design outcomes (see figure 1). [1]

                          Code-based with      Code-based with      MDT with MDD
                          no MDD (legacy)      MDD (transitional)   (enhanced)
Performance               76.4%                80.0%                88.8%
Systems functionality     76.7%                80.0%                81.5%
Features and schedule     72.0%                74.0%                70.3%

Figure 1: Projects using both model-driven development and testing completed significantly more designs with outcomes that met predesign expectations.

Early detection of defects was one of the key reasons that enhanced projects were able to deliver better outcomes in a shorter period of time.

The role of model-based testing in early detection of defects

According to the National Institute of Standards and Technology, 80 percent of development costs are spent on identifying and fixing defects. [2] The majority of defects are most likely introduced during the early stages of design and development, when errors can be fixed more easily and at a fairly low cost. Unfortunately, most defects tend to be found late in the process, sometimes after product release.

Page 5

Figure 2: The cost of correcting defects increases geometrically throughout the development process. (The chart plots induced errors and the cost of design change over time, across the requirements, design, implementation and test phases: $80 per defect during the requirements phase, $240 per defect during the design and construction phase, $960 per defect during the QA/testing phase, and $7,600 per defect once released as a product.)

The obvious response is to perform tests as early as possible in the development process. That way, errors can be found and fixed before they propagate.

Verification and validation are the two fundamental strategies for software testing. Verification answers the question, "Is the feature developed in the right way?" Validation answers the question, "Is it the right feature?" When the development and test environments are separate, because the first is model-centric and the second is code-centric, no one can navigate from the test cases all the way back to the use cases to validate features against requirements.
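To put the per-defect costs in figure 2 in perspective, here is a small worked example; the count of 50 defects is hypothetical and chosen only for scale.

\[
50 \times \$240 = \$12{,}000 \quad \text{(defects found during design and construction)}
\]
\[
50 \times \$7{,}600 = \$380{,}000 \quad \text{(the same defects found after release)}
\]

Finding those 50 defects during design rather than after release saves roughly $368,000, a cost ratio of about 32 to 1.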

Page 6

If testing can be extended forward in the process to include the design model, quality managers and software engineers can check that each feature meets requirements as soon as it is implemented in the model. And if the testing process uses model-based validation activities to automate test code, developers are able to be more accurate and save significant time verifying code coverage, detecting memory problems or finding other run-time issues. [3] In short, extending traditional code-centric test case development into modeling test architectures and test case behaviors, and using the design model to create more efficient and higher-quality test cases, can result in better coverage of the requirements and ultimately higher-quality deliverables.

Creating model-driven tests

The IBM Rational solution is based on the UML 2.0 testing profile, which integrates testing into UML, enhancing it with concepts such as test architectures and test behaviors. Test architectures extend the existing UML 2.0 structural concepts to describe the elements involved in a test and their relationships. Similarly, test behaviors extend the existing UML 2.0 behavioral concepts to encompass all observations and activities during the test. [4]
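For readers who think in code rather than diagrams, the following hand-written C++ sketch maps the testing profile's structural concepts onto ordinary classes: a system under test (SUT), a test component that stimulates and observes it, and a test context that owns both and records a verdict. The Dimmer example and every name in it are hypothetical; this illustrates the concepts only and is not code produced by Rhapsody or TestConductor.

#include <iostream>

// Hypothetical system under test (SUT): a simple lamp dimmer.
class Dimmer {
public:
    void increase() { if (level_ < 100) level_ += 25; }
    int  level() const { return level_; }
private:
    int level_ = 0;
};

// "Test component" (testing-profile term): drives the SUT and observes it.
class DimmerTestComponent {
public:
    explicit DimmerTestComponent(Dimmer& sut) : sut_(sut) {}
    bool checkIncreaseRaisesLevel() {
        const int before = sut_.level();
        sut_.increase();                  // stimulus
        return sut_.level() > before;     // observation
    }
private:
    Dimmer& sut_;
};

// "Test context": owns the SUT and the test components, runs the test
// cases and produces a verdict, mirroring the profile's structure.
class DimmerTestContext {
public:
    void run() {
        Dimmer sut;
        DimmerTestComponent tc(sut);
        const bool pass = tc.checkIncreaseRaisesLevel();
        std::cout << "TC_increase_raises_level: " << (pass ? "pass" : "fail") << "\n";
    }
};

int main() {
    DimmerTestContext ctx;
    ctx.run();
    return 0;
}

In the profile itself these roles are expressed as stereotyped UML model elements rather than hand-written classes, which is what allows design and test to share one modeling environment.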

Page 7

The testing profile offers a taxonomy for testing artifacts that integrates well with UML, giving the IBM Rational solution a single model-driven environment where both design and testing can coexist. All design artifacts, test artifacts and test reports can be integrated within the same browser. Design and test artifacts are always in sync, and navigation between them is easy. Requirements can be linked to design components, which can then be linked to test cases and to test execution reports for comprehensive traceability (see figure 3).

Figure 3: The UML testing profile enables model-driven testing where the design process and test process are fully integrated, providing an efficient approach to software validation and verification.

Page 8

Developers are able to design and test their models using the same diagrams.

In UML, applications are designed through diagrams, including sequence diagrams and activity diagrams. MDT with IBM Rational Rhapsody software allows tests to be drawn as these same diagram types, which means the developer doesn't need to learn a new language in order to build tests. The software can execute these tests against the design and provide comprehensive test results. The tests execute on the desktop and on the target (one common way of achieving that portability is sketched at the end of this page). This can work quite well for developers. It may not, however, work as well for QA.

Figure 4: Model-driven testing with IBM Rational Rhapsody software enables you to test your models the same way you design them: using diagrams.

In a traditional MDT environment, these tests have to be executed, and their results reported, inside the UML tool. This means QA teams have been left out of the loop for executing tests and getting their results.
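One common way to run the same test on the desktop and on the target is to write the test scenario against a small platform abstraction and swap in a stub on the host, as in the C++ sketch below. The OutputPort interface, the TARGET_BUILD macro and all other names are hypothetical and do not describe how Rhapsody TestConductor implements this.

#include <iostream>

// Abstraction the test scenario depends on.
struct OutputPort {
    virtual void write(int value) = 0;
    virtual int  read() const = 0;
    virtual ~OutputPort() = default;
};

#if !defined(TARGET_BUILD)
// Desktop stand-in; a target build would instead wrap a real
// memory-mapped register or driver (omitted here).
class HostStubPort : public OutputPort {
public:
    void write(int value) override { last_ = value; }
    int  read() const override { return last_; }
private:
    int last_ = 0;
};
#endif

// The test scenario itself is identical on host and target.
bool testEchoesWrittenValue(OutputPort& port) {
    port.write(42);
    return port.read() == 42;
}

int main() {
#if !defined(TARGET_BUILD)
    HostStubPort port;
    std::cout << (testEchoesWrittenValue(port) ? "pass" : "fail") << "\n";
#endif
    return 0;
}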

Page 9

The missing link: making tests available to QA

Despite the benefits, the primary reason organizations fail to adopt model-driven tests is that these tests are hard to reuse. Other people in the product development process don't know UML, and they don't have access to the machines being tested. There is little incentive to spend time developing tests unless they can be used throughout the development process and product lifecycle.

IBM Rational Quality Manager software links with IBM Rational Rhapsody TestConductor Add On software to make tests available to QA. The integration between these tools allows model tests created by developers in Rhapsody TestConductor Add On software to be stored in a central repository and executed by QA managers who are unfamiliar with UML. The integration provides a central repository to run tests from almost any source, store the results and enable access to those results.

Store and rerun tests from almost any source

QA teams have to test whether or not the product or system meets requirements from all sorts of different perspectives, not just software. Multiple teams are developing multiple components, and creating tests, at any given time. QA managers also write their own tests. To realize the full benefit of MDT, QA managers should be able to reuse tests to make sure that completed system components work together as designed and meet business objectives. With the Rational MDT solution, both model-based and code-based tests can be stored in a central repository. They are always available for the QA manager to execute at any point in the development process and product lifecycle when it is necessary to validate that functional additions and changes operate as expected and don't cause other parts of the system to break.
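The central-repository idea, running tests from almost any source behind one interface and keeping their results, can be sketched generically. The C++ below is an illustration only; it does not reflect the actual Rational Quality Manager or TestConductor APIs, and the two registered tests are trivial placeholders.

#include <functional>
#include <iostream>
#include <map>
#include <string>

// Generic sketch of a central test repository: tests from different
// sources (model-generated or hand-written) are registered behind one
// uniform interface, so any caller can execute them and review results.
class TestRepository {
public:
    using TestFn = std::function<bool()>;

    void add(const std::string& name, TestFn test) { tests_[name] = std::move(test); }

    void runAll() {
        for (const auto& [name, test] : tests_)
            results_[name] = test();               // store result for later review
    }

    void report() const {
        for (const auto& [name, passed] : results_)
            std::cout << name << ": " << (passed ? "pass" : "fail") << "\n";
    }

private:
    std::map<std::string, TestFn> tests_;
    std::map<std::string, bool>   results_;
};

int main() {
    TestRepository repo;
    // A test that might originate from a model-based tool...
    repo.add("model.dimmer_increase_raises_level", [] { return 25 > 0; });
    // ...and one written directly in code by QA.
    repo.add("code.startup_time_within_budget", [] { return 180 <= 200; });
    repo.runAll();
    repo.report();
    return 0;
}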

Page 10

Consider an example: a new component is ready to be added to a system under development. As part of the requirements validation process, the QA manager executes a range of tests from Rational Quality Manager software, which includes technology designed to identify and select the minimum number of tests required for a given level of coverage. Via the Rational Quality Manager dashboard, the QA manager can view automatically generated test reports that indicate which portions of the system pass and which fail. By clicking on a failed test, the QA manager can navigate to exactly where the problem lies in the design model and the associated code.

In this scenario, the test that failed is for a software component. Though executed from Rational Quality Manager software, the test is actually run in the Rational Rhapsody TestConductor Add On software. The appropriate developer is alerted automatically. The defect tracking functionality enables the developer to see all the ways the test failed and the context in which it failed (what the QA manager did that caused the failure). And when the QA manager is notified that the defect is fixed, she can rerun the initial tests to see whether or not the system fails again.

Essentially, model-centric validation activities are used to drive code-centric, automated verification activities. The result is an integrated validation and verification testing process that leverages the model information and automates much of the code-centric verification activity.
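The traceability chain running through this scenario (requirement to design element to test case to result to defect context) can be pictured as a simple linked record. The C++ sketch below is a generic illustration with hypothetical field names; it does not depict how the IBM tools actually store or exchange this information.

#include <iostream>
#include <string>
#include <vector>

// Each result carries links back to the requirement and the design-model
// element it verifies, so a failure can be navigated to its source.
struct TestResult {
    std::string testCase;        // e.g. "TC_max_level_clamped"
    std::string requirementId;   // e.g. "REQ-043"
    std::string modelElement;    // e.g. "Dimmer::increase (state machine)"
    bool        passed;
    std::string failureContext;  // what the tester did when it failed
};

// Report every failure together with its traceability links.
void notifyDeveloperOfFailures(const std::vector<TestResult>& results) {
    for (const auto& r : results) {
        if (!r.passed) {
            std::cout << "Defect: " << r.testCase
                      << " (requirement " << r.requirementId
                      << ", model element " << r.modelElement << ")\n"
                      << "  context: " << r.failureContext << "\n";
        }
    }
}

int main() {
    std::vector<TestResult> results = {
        {"TC_increase_raises_level", "REQ-042", "Dimmer::increase", true,  ""},
        {"TC_max_level_clamped",     "REQ-043", "Dimmer::increase", false,
         "QA issued five increase commands in under a second"},
    };
    notifyDeveloperOfFailures(results);
    return 0;
}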

Page 11

Integrating testing into the design process

Developers who have migrated from a code-centric approach to an MDD approach have been able to significantly increase their productivity. By leveraging the IBM approach to MDT, the QA and testing organization can realize similar productivity increases, with the ability to:

• Trace and easily navigate between requirements, design artifacts, test architectures, test cases and test execution reports, all from within a single browser.
• Link test cases, regardless of how they are captured, to their test architecture.
• Execute the same test cases on the host development platform and on the target without modifying them.
• Graphically monitor the progress of executed tests and graphically identify the causes of failures.

For most companies that depend on software to differentiate their products and services, it's really not a question of whether to transition to MDT. It's a question of when. With IBM, you can transition in steps and at a pace that makes the most sense for your business.

For more information

To learn more about how IBM can help you transition to model-driven testing, contact your IBM representative or IBM Business Partner, or visit:

IBM Rational Rhapsody TestConductor: ibm.com/software/rational/products/rhapsody/developer/features (click on the Validation and Testing tab)
Rational Quality Manager: ibm.com/software/awdtools/rqm/index.html

© Copyright IBM Corporation 2009
IBM Corporation, Software Group, Route 100, Somers, NY 10589 U.S.A.
Produced in the United States of America, September 2009. All Rights Reserved.

IBM, the IBM logo, ibm.com, Rational, and Rhapsody are trademarks or registered trademarks of International Business Machines Corporation in the United States, other countries, or both. If these and other IBM trademarked terms are marked on their first occurrence in this information with a trademark symbol (® or ™), these symbols indicate U.S. registered or common law trademarks owned by IBM at the time this information was published. Such trademarks may also be registered or common law trademarks in other countries. A current list of IBM trademarks is available on the Web at "Copyright and trademark information" at ibm.com/legal/copytrade.shtml. Other company, product, or service names may be trademarks or service marks of others.

References in this publication to IBM products or services do not imply that IBM intends to make them available in all countries in which IBM operates. The information contained in this documentation is provided for informational purposes only. While efforts were made to verify the completeness and accuracy of the information contained in this documentation, it is provided "as is" without warranty of any kind, express or implied. In addition, this information is based on IBM's current product plans and strategy, which are subject to change by IBM without notice. IBM shall not be responsible for any damages arising out of the use of, or otherwise related to, this documentation or any other documentation. Nothing contained in this documentation is intended to, nor shall have the effect of, creating any warranties or representations from IBM (or its suppliers or licensors), or altering the terms and conditions of the applicable license agreement governing the use of IBM software.

IBM customers are responsible for ensuring their own compliance with legal requirements. It is the customer's sole responsibility to obtain advice of competent legal counsel as to the identification and interpretation of any relevant laws and regulatory requirements that may affect the customer's business and any actions the customer may need to take to comply with such laws.

1. Jerry Krasner, "The Economics of Defective Software," Embedded Market Forecasters, July 2008.
2. National Institute of Standards and Technology (NIST), "Software Errors Cost U.S. Economy $59.5 Billion Annually," NIST news release, June 2002.
3. Martin Stockl, "Validation vs. Verification: When Resources Are Limited, Which One First?," IBM Rational, March 2009.
4. Moshe Cohen, "Transitioning from Code-Based to Model-Driven Software Testing, Part 1: The Basics of the UML 2.0 Software Testing Profile," Embedded.com, November 2007.

RAW14161-USEN-00