Tool Selection and Implementation


Slide 1: Tool Selection and Implementation
Paul Gerrard, Systeme Evolutif Limited
email: paulg@evolutif.co.uk
http://www.evolutif.co.uk

Slide 2: Agenda
- What Can Test Execution Tools Do For You?
- Good and Bad Reasons for Buying a Tool
- What to Look for in a Tool
- Tool Implementation
- What is Success?
- Close

Slide 3: What Can Test Execution Tools Do For You?

Slide 4: Drawbacks of Manual Testing
- Costly
- Slow
- Error-prone

Slide 5: Speed Factors
- Inputs are entered at human speeds
- Testing is only conducted during the hours people work
- Outputs are checked at human speeds

Slide 6: Error-Proneness of Manual Testing
- Cloning of test cases
- Inexact repetition of tests
- Inaccurate results checking
- Of the faults left after testing, the majority had been exposed by the tests but were not noticed by the testers (based on work by Capers Jones and Basili)

Slide 7: Boredom Index
[Chart: the amount of tool support needed, from none to much, plotted against the measure of boredom, from "time flies" to "dead boring"]

Slide 8: And the Reality...
- 90% of organisations have CAST tools (usually test execution)
- 40-50% of CAST tools end up as shelfware
- <10% have benefited significantly
- >75% want more CAST tools
- Everyone knows there is great potential; few succeed and achieve real, lasting benefits

Slide 9: Good and Bad Reasons for Buying a Tool

Slide 10: Bad Reasons for Buying a Tool
- Test faster!
- Test more!
- Save money!
- Test earlier!
- Find more bugs!
- Do regression testing!
- Get tools, not people!

Slide 11: Test Activities
- planning the testing to be done, both static and dynamic
- designing the test conditions (logical design)
- preparing the test input cases (physical and logical design)
- preparing test data (physical design)
- preparing the expected results from the requirements specification
- running the tests
- examining the mismatches when the expected results do not agree with the actual results
- isolating bug symptoms so they can be corrected
- monitoring what tests have been performed
- evaluating the quality of the testing performed, and extending the tests where required
- inspection of code, designs, requirements and test cases
- assessing non-functional aspects of software, such as usability and performance
- evaluating the quality of the software tested, i.e. the release decision
- rerunning tests after bugs have been corrected
- updating tests when software is changed

Slide 12: Where Can the Tool Assist?
(The same list of test activities as Slide 11, asking which of them a tool can help with.)

Slide 13: Success with Test Running Tools
- Have realistic expectations: no silver bullet
- Commitment (management and testers)
- Implementation project: plan, mobilise, needs, select, train, pilot, review, roll-out
- Tactical use of tools (not blanket use)
- Tools are for life, not just Christmas
- PROCESS, THEN TOOLS = BENEFITS

Slide 14: What to Look for in a Tool

Slide 15: Overview of the Selection Process
[Process diagram with the steps: define the problem, consider automation as a solution, make the business case, define required features, constraints, short-list, evaluate, demo, trial, decide]

Slide 16: Is a Tool the Right Solution?
- Tools are not the only way:
  - code inspections are effective at finding faults
  - better documentation and test management can reduce the problem of omitting or repeating tests
  - better impact analysis reduces the number of tests to be run
- Tools are sexy, easy to buy and fun!
- Process improvement is hard: people, organisation, and resistance to change can be daunting and hard to overcome

Slide 17: Tool Selection Considerations
- Just what do you want to automate?
  - automated running or automated thinking?
  - regression testing a mature product?
  - finding bugs during development?
- Are your people interested in using tools?
- What skills are available to use the tool?
  - users can't use technical tools
  - automated scripting needs programmer skills

Slide 18: Tool Selection Considerations (2)
- What technical environment(s) will the tool be used in?
- Are you organised enough to use tools?
  - who will design the tests?
  - who will write the automated scripts?
- Is this a one-off implementation, or are tools part of an infrastructure project?
- If the wrong tool is selected, the benefits will not be achieved

Slide 19: Tool Selection and Evaluation Team
- Give someone responsibility for managing the selection and evaluation process
- A single individual authorised to:
  - investigate what tools are available
  - prepare a shortlist
- Before you start, you need to know:
  - what type of tool is needed
  - who might use it
  - the factors a tool must meet to qualify for the shortlist

Slide 20: Evaluating the Shortlist
- Involve representatives from the groups planning to use the tool, covering the different job functions who will use it
- If you trial the tools, usability is an important consideration for non-technical users, so involve technical support staff: non-technical users need this support
- The selection and evaluation team may become the implementation team

Slide 21: How Much Help Should the Tool Be?
- How will we know a tool is effective? Do the testers feel better?
- You need measurable criteria for success: if the length of time taken to run tests manually is the problem, how much quicker should the tests run using a tool?
- Setting measurable criteria is not so difficult; setting reasonable expectations is the problem

Slide 22: Measurable Success Criteria, an Example
- Manual execution of tests currently takes 4 man-weeks
- In the first 3 months of using the tool, 50 per cent of these tests should be automated, with the whole test suite run in 2 to 2½ man-weeks
- Next year at this time, we aim to have 80 per cent of the tests automated, with the equivalent test suite being run in 5 man-days
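As a rough way of sanity-checking targets like those on Slide 22, you can model the suite effort as the remaining manual portion plus an allowance for supervising the automated runs. The sketch below is illustrative only; the one man-day supervision figure is an assumption, not a number from the presentation.

```python
# Rough sanity check of the Slide 22 targets (assumed 5-day man-weeks).
MANUAL_SUITE_DAYS = 20  # 4 man-weeks of purely manual execution

def suite_effort(automated_fraction, automation_overhead_days):
    """Estimate man-days to run the whole suite once part of it is automated.

    automated_fraction: share of tests that have been automated (0.0 to 1.0)
    automation_overhead_days: assumed man-days spent starting, supervising
                              and analysing the automated runs per pass
    """
    manual_part = (1 - automated_fraction) * MANUAL_SUITE_DAYS
    return manual_part + automation_overhead_days

# 3-month target: 50% automated, ~1 man-day of supervision per pass
print(suite_effort(0.5, 1))   # 11 man-days, a little over 2 man-weeks
# 1-year target: 80% automated, same assumed overhead
print(suite_effort(0.8, 1))   # 5 man-days
```

Under these assumptions the stated targets are internally consistent, which is exactly the kind of check that keeps expectations reasonable.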

Slide 23: Tool Implementation

Slide 24: Tool Implementation Process
- Assemble team
- Management commitment
- Publicity
- Internal marketing
- Pilot
- Pilot evaluation
- Phased implementation
- Post-implementation review

Slide 25: Keys to Success
- Selling the concept
  - commitment to testing is a prerequisite
  - tools can save time and money, but only if time is currently being spent on the task to be automated
- Selecting the right tool
  - the tool should fit the test process, or you will have to refine/develop the test process at the same time
  - define the stages of testing the tool supports: not all testing can be automated!

Slide 26: Keys to Success (2)
- Implementation
  - CAST tools are no different from any software: process, training, documentation
  - pilot to gain quick wins and build support
- Roll-out the things that work
  - learn from the pilot what works and what doesn't
  - move skilled resources with the tool
  - measure success and publicise it
- One-off successes are difficult to roll out

Slide 27: Three Routes to Shelfware
[Diagram: failure modes along the lifecycle of requirements, selection, implementation and roll-out, leading to "Shelfware through Abandonment", "Shelfware through Neglect" and "Shelfware through Banishment"]
- Testers at the grass roots reject the idea of CAST
- Technical champions can't make the business case
- No one adequately learns to use the tool
- The tool proves too much trouble to use; never again!
- The team that knows the tool is disbanded; the tool is never rolled out

Slide 28: Pilot Project
- Try out the tool on a small pilot project first
  - the risk of problems encountered is much lower
  - it helps you to iron out process problems
- Make a business case for the pilot:
  - objectives for the pilot, e.g. lessons to be learned
  - implementation concerns
  - benefits to be gained

Slide 29: Evaluation of the Pilot
- Compare results with the business case
- If the objectives were met, the lessons learned will help the next project gain more benefits
- If the objectives were not met, either the tool is not suitable or the tool is not yet being used in a suitable way
  - decision: abandon the tool, restate realistic objectives, or change the approach to gain success next time

Slide 30: Planned Phased Installation
- Publicise the success of the pilot
- Plan and conduct training, prepare in-house manuals
- Nominate a change management team to act as internal consultants
- Main risks to a successful roll-out:
  - failure to follow through with training
  - over-ambition
  - under-investment

Slide 31: What is Success?

Slide 32: Damn Those Faults!
- As usual, testing paradoxes...
- A successful test detects a fault... but that stops our automated test working
- A manual tester can cope easily: stop, log an incident, do another test...
- The tester-programmer says he can write us a general-purpose error handling routine
- Ah, that gets us going again
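The slides don't show what such a routine looks like. As a minimal sketch, assuming a Python-style scripted suite, the "general-purpose error handling routine" is essentially a wrapper that logs an incident and moves on so the unattended run keeps going; the function and log-file names here are hypothetical.

```python
import logging
import traceback

logging.basicConfig(filename="test_run.log", level=logging.INFO)

def run_unattended(test_cases):
    """Run each scripted test; on any failure, log an incident and carry on.

    test_cases: iterable of (name, callable) pairs, where each callable
    raises an exception on failure. Structure is illustrative only.
    """
    incidents = []
    for name, test in test_cases:
        try:
            test()
            logging.info("PASS %s", name)
        except Exception:
            # The "general-purpose error handling routine": record the
            # incident so the overnight run keeps going...
            logging.error("FAIL %s\n%s", name, traceback.format_exc())
            incidents.append(name)
    return incidents  # ...but someone still has to analyse these in the morning
```

This gets the run going again, which is exactly the seductive part: the run completing is not the same as the testing succeeding, as the next slide points out.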

Slide 33: False Sense of Security
- It works! An unattended run of 73 scripts!
- But at what cost? More than we thought
- With what compromises? We took out all the test checks...
- It's a test, Jim, but not as we know it
- It works! But what is it? It works! But what is "works"?
- What DOES this test prove?

Slide 34: The Next Software Release
- So, we have test scripts that run reliably
- We've found a few bugs too
- The new release is quite different, and all our tests fail dramatically
- It takes several days to get them working again
- Now, let's get testing!
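To make "what does this test prove?" concrete, here is an illustrative contrast, not taken from the presentation: the login() function is a made-up stand-in for whatever the script drives. A script with its checks stripped out will report a pass whatever the application does; only the version with a check left in can tell us anything.

```python
def login(user, password):
    # Stand-in for driving the real application under test; assume it
    # returns the title of the page the application lands on.
    return "Welcome, " + user

def test_login_no_checks():
    # The "compromised" script: it drives the application but verifies
    # nothing, so an unattended run reports PASS whatever comes back.
    login("alice", "secret")

def test_login_with_check():
    # The same step with its check left in: this one can actually fail,
    # so a pass means something.
    title = login("alice", "secret")
    assert title == "Welcome, alice", f"unexpected page: {title}"
```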

Slide 35: Regression Testing
- The next software release arrives before we have got the test scripts working again
- Project Manager: "What do you mean, you haven't started yet?"
- Need to plan for script maintenance
- Need to script for maintainability

Slide 36: Close
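The presentation doesn't prescribe a technique for "scripting for maintainability". One common approach, an assumption on my part rather than the author's method, is to route every script through a shared action layer so a UI change in the new release means one fix rather than 73. The driver interface and locator names below are hypothetical.

```python
# Hypothetical shared action library: test scripts call these helpers
# instead of hard-coding screen details themselves.

LOCATORS = {
    # One place to update when the new release renames or moves fields.
    "username": "id=user-name",
    "password": "id=pass-word",
    "login":    "id=login-btn",
}

class LoginActions:
    def __init__(self, driver):
        self.driver = driver  # whatever GUI test-tool driver is in use

    def login(self, user, password):
        # Drive the application through the shared locator table.
        self.driver.type(LOCATORS["username"], user)
        self.driver.type(LOCATORS["password"], password)
        self.driver.click(LOCATORS["login"])

# A test script stays short and survives UI changes untouched:
#   LoginActions(driver).login("alice", "secret")
```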

Slide 37: Do Automated Tests Find Bugs?
- If the system we are automating tests for has a bug, when is the bug found? During recording, of course!
- Does the script we replay the second time round find a new bug? Not very often
- It is the process of automating test scripts that finds the bugs
- Do we need the tool to do this?

Slide 38: Conclusion
- What do you really, really want a tool for?
  - tools don't necessarily find bugs, but they are more fun than manual testing (sometimes)
  - automated tests provide confidence for rapid development, if the testers can keep up
  - they are essential for certain types of test, but do you know which?
- Be careful to separate test design from script development: two skills, both are required
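The slides don't say how to keep test design and script development apart in practice. One way to draw the line, an assumption in the spirit of data-driven testing rather than anything stated above, is to let the test designer own a plain table of conditions and expected results while the script developer owns the code that executes each row. The payment example below is invented purely for illustration.

```python
# Test design: a table the tester owns, no programming skill needed.
DESIGNED_TESTS = [
    # (description,                input amount, expected message)
    ("accept a normal payment",           50.00, "payment accepted"),
    ("reject a negative payment",        -10.00, "amount must be positive"),
]

# Script development: the programmer owns the code that drives the application.
def submit_payment(amount):
    # Stand-in for the real application; behaviour assumed for the example.
    return "payment accepted" if amount > 0 else "amount must be positive"

def run_designed_tests():
    for description, amount, expected in DESIGNED_TESTS:
        actual = submit_payment(amount)
        status = "PASS" if actual == expected else "FAIL"
        print(f"{status}: {description}")

run_designed_tests()
```

Either skill on its own leaves a gap: good test design with brittle scripts, or robust scripts that check the wrong things.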

Slide 39: Papers
- Systeme Evolutif web site: www.evolutif.co.uk
- "Testing GUI Applications": a strategy for successful GUI test design and test automation
- "Selecting and Evaluating CAST Tools"

2000 Systeme Evolutif Ltd