Agile Testing
Dr. Ronen Bar-Nahor, ronen@agilesparks.com

AgileSparks
- We help companies improve by adopting agile principles and practices.
- We provide training and coaching at all organizational levels, from senior management to developers.
- Our team consists of Agile professionals with diverse expertise; several coaches participate in each implementation.
- Successfully completed dozens of projects.
- www.agilesparks.com

Agile Testing Course: 15-16/11
Agenda
- Agile overview
- The challenge for QA
- The cost of technical debt and late feedback
- Automation approach
- Continuous integration
- Exploratory testing
- Scaling
- QA through the Scrum lifecycle

Why Agile?
Because almost everything is changing:
- Scope
- Our understanding
- Our estimates
- Organizational constraints
Agile Process

The waterfall approach - manage the chaos:
- QA as the last defender of quality
- Strict change control
- Detailed preparation and planning
- Heavy documentation as the basis for planning
- Gates, sign-offs, entry criteria
- QA keeps automation only for itself, mainly focused on regression
Agile - The Challenge for QA
- Based on short iterations that produce a Potentially Shippable Product
- Endorse change; inspect and adapt
- Value interactions over comprehensive documentation
- Testware as executable specs
- Right hand of the PO within the team
- Collaboration skills with developers and business people

Agile - The Challenges for QA
- Shared team ownership of quality - risk of loss of identity (QA org.? Dev.? Test engineers?)
- Build quality in: zero defects! Continuous integration! Stop & fix
- Scaling: system/portfolio/non-functional testing
The Cost of Technical Debt
The mind-set of "Done is Done!!!"

Agile Testing Goal
Testing is not a phase; testing is the only way to be sure that features/tasks implemented during a given iteration or sprint are actually done (DoD, TDD, ATDD).
Definition of Technical Debt
Anything that slows down development - leftovers, cutting corners. Can you give examples?
- Defects
- Code that is not executed (but supported)
- Code complexity
- Coupling and spaghetti code
- Environment setup and distribution
- Build failures
- Manual testing
- Installation, test data upgrade
- No documentation and no leveraging of knowledge

User Story Done Criteria
Another view of how debt is created: late feedback. Along the path Planning -> Analysis -> Design -> Coding -> Unit Test -> Functional Test -> Non-Functional Test -> Integration Testing -> GA, there are many opportunities for us to say we are "done".
Done in Large Projects
- Work is tested at the integration layer, and integration bugs are passed back for immediate fixing (integration layer sitting on top of Layers 1-5).
- Better: teams organized around product features, cutting across the user-interface, business-logic and persistence layers.
- Having 20% of the scope usable beats having 50% just developed.
- One code base with continuous integration.
User Story Done Criteria

Weak Definition of Done: iteration after iteration produces a "product", but we keep deferring stuff - refactoring, security testing, user documentation, technical debt, UAT... Is that really a shippable product?

Robust Definition of Done: if the Product Manager said "Ship what you showed me", would you be ready within one stabilization iteration? The goal is to minimize code-freeze periods.
So, What Is the Problem?
- Legacy code
- Existing manual testware
- Existing tools and techniques
- Existing skills, structure and culture
- QA/developer role perception
A real holistic change! But don't be dogmatic: Agile is a framework/toolbox - apply it correctly based on your constraints.
Testing Levels - Brian Marick's Agile Testing Matrix
- Q1 (technology-facing, supports programming): unit tests, component/integration tests, code-quality tools; automated.
- Q2 (business-facing, supports programming): functional tests (automated and manual), story tests/examples, simulations, prototypes.
- Q3 (business-facing, critiques the product): user acceptance tests, exploratory tests, usability tests; mostly manual.
- Q4 (technology-facing, critiques the product): performance tests, load tests, security and other "-ility" testing; tool-based.
Continuous Integration (CI) and Automation (Q1, Q2)

Objectives for Automation
- Having working software on a daily basis (CI)
- Promote faster, higher-coverage and more efficient testing
- Increase productivity and time to market by:
  - Reducing code-freeze periods
  - Providing early feedback on quality and integration issues
  - Better control over testing coverage and quality
- Ensure a reliable system!!!
Agile - The Challenges for QA: Early Feedback, Potentially Shippable Product (PSP)
- Waterfall (Level 0): feedback after months, only at release.
- Scrum Level 1: weeks - a PSP at the end of each sprint, with separate R&D and QA sprints.
- Scrum Level 2: days - R&D and QA within the same sprint, a PSP every sprint.
- Scrum Level 3: hours - pairing.
Copyright of AgileSparks LTD

Test-Driven Development
The cycle: add a test -> run the tests (the new test fails) -> write some code -> run the tests -> if they pass, refactor; repeat until development is finished.
- "Keep it simple, stupid" (KISS); "You ain't gonna need it" (YAGNI); "Fake it till you make it"
- Early feedback
- Drives the design of the program; focus on the interface (what) rather than the implementation
- Less code is written
- More confidence to make changes
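The red-green-refactor cycle above can be sketched in a few lines. This is a minimal illustration, assuming a hypothetical `price_with_discount` function that is not part of the course material:

```python
# TDD sketch: the tests are written first (red), then just enough code is
# written to make them pass (green), then the design is cleaned up (refactor).

# Step 1 (red): write the tests before any implementation exists.
def test_ten_percent_discount():
    assert price_with_discount(100.0, 0.10) == 90.0

def test_no_discount():
    assert price_with_discount(100.0, 0.0) == 100.0

# Step 2 (green): write just enough code to make the tests pass.
def price_with_discount(price: float, discount: float) -> float:
    return price * (1.0 - discount)

# Step 3 (refactor): with passing tests as a safety net, clean up the design.
test_ten_percent_discount()
test_no_discount()
```

Note that the tests pin down the interface (what the function does) before any decision about the implementation is made, which is exactly the design pressure TDD is meant to create.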
Acceptance Test-Driven Development (ATDD)
- Criteria specified by the customer are automated into acceptance tests.
- The customer has an automated mechanism to decide whether the software meets their requirements.
- Drives unit test-driven development (UTDD).
- Keeps development teams continuously focused on what the customer really wants from that user story.
- SOA example...

Agile Automation Pyramid - Effort Distribution (ROI decreases as cost rises toward the top)
- GUI (5%): exploratory tests run through the GUI; expensive, more brittle, late feedback; slow execution and low coverage; done by QA.
- Acceptance - service/API level (15%): business logic behind the GUI; understood by the customer/PO!!!; written by the Product Owner, QA and Dev against the AUT.
- Unit & component/integration (80%): in the system's language (by dev.); fast feedback, part of coding; also component tests and UNFT.
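An acceptance test at the service/API level, as the pyramid recommends, can look like the following sketch. The `CartService` class and its criterion are hypothetical illustrations, not from the course:

```python
# ATDD sketch: a customer acceptance criterion written as an executable
# given/when/then test against the service layer, with no GUI involved.

class CartService:
    """Hypothetical service-level API standing in for the application."""
    def __init__(self):
        self._items = []

    def add_item(self, name: str, price: float, qty: int = 1):
        self._items.append((name, price, qty))

    def total(self) -> float:
        return sum(price * qty for _, price, qty in self._items)


def test_cart_total_includes_quantities():
    # Given an empty cart
    cart = CartService()
    # When the customer adds two books at 12.50 and one pen at 3.00
    cart.add_item("book", 12.50, qty=2)
    cart.add_item("pen", 3.00)
    # Then the cart total is 28.00
    assert cart.total() == 28.00

test_cart_total_includes_quantities()
```

Because the test exercises the business logic behind the GUI, the PO can read it, and it stays fast and stable compared with a GUI script.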
Typical Automation Formula
- Purchase an expensive GUI test-execution tool (see Rational, Mercury, Compuware, etc.).
- Define a lot of paper test procedures.
- Hire an automation team to automate each one.
- Build a comprehensive test library and framework. Keep fixing it.
This can work only if your product is very easy to test and doesn't change much.

KSF for Agile Automation
- Believe that the automation objectives are doable!!!
- "Any use of tools to support all aspects of testing" [James Bach]
- Test toolsmith role - gathers and applies a wide variety of tools to support testing and testers; measured by reduced testing time and by how much it helps testers.
- Promote right usage by everyone, as a native part of the development and testing work.
- Build quality in - test engineering:
  - Testability as part of architecture and design
  - Mocks/integration approach, data generation, component independency, tracing tools, etc.
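Designing for testability - component independency via mocks, as listed above - can be sketched with Python's standard `unittest.mock`. The `OrderProcessor` class and its gateway API are hypothetical illustrations:

```python
from unittest.mock import Mock

# Testability sketch: the order logic depends on an *injected* gateway,
# so tests can substitute a mock instead of calling a real external system.

class OrderProcessor:
    def __init__(self, gateway):
        self.gateway = gateway  # injected dependency, replaceable in tests

    def checkout(self, amount: float) -> str:
        # The processor only relies on the gateway's charge() interface.
        if self.gateway.charge(amount):
            return "confirmed"
        return "declined"

# In a test, the real gateway is replaced by a mock:
gateway = Mock()
gateway.charge.return_value = True

processor = OrderProcessor(gateway)
assert processor.checkout(49.90) == "confirmed"
gateway.charge.assert_called_once_with(49.90)
```

The design choice is the point: because the dependency is injected rather than constructed inside the class, the component can be tested in isolation and wired to the real gateway only in production.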
Practical Incremental Approach
1. Legacy system: malfunctioning, low-quality code; start by covering sanity and risky areas with automated tests.
2-4. As new features are added, each increment adds new test coverage and refactored code, gradually turning low-quality code into high-quality code covered by automated tests.
- Apply TDD to incrementally repair touch points as new features are added.
- Manual affected-regression testing only after risk analysis.
- Automate sanity and risky areas with an independent team.

Goals of CI
Continuous Integration is the practice of integrating early and often, so as to avoid the pitfalls of "integration hell". The ultimate goal is to stop and fix as early as possible.
CI Flow
- Developer: code + build + unit testing until stable; get latest and merge; local build + test + code analysis; check in.
- Product build: build and package; unit testing; code-quality checks.
- Mini suite: deploy and test - integration/acceptance/system; profiling; log analysis.
- Portfolio integration: pick up & deploy; end-to-end flows; test-failure report.

Exploratory Testing and User Acceptance (Q3, Q4)
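The "stop and fix as early as possible" behavior of the CI flow above can be sketched as a fail-fast pipeline. The stage names and the placeholder steps are illustrative assumptions, not a real build configuration:

```python
# CI gate sketch: run the pipeline stages in order and stop at the first
# failure, so the team gets the earliest possible "stop and fix" signal.

def run_pipeline(stages):
    """Run (name, step) pairs in order; return the first failing stage, or None."""
    for name, step in stages:
        if not step():
            return name  # fail fast: report the broken stage immediately
    return None

# Placeholder stages mirroring the CI flow; each lambda stands in for a
# real command (compiler, test runner, static analyzer, deploy script).
stages = [
    ("build", lambda: True),
    ("unit tests", lambda: True),          # fast, cheap checks run first
    ("code analysis", lambda: False),      # this gate fails in the example
    ("deploy + acceptance", lambda: True), # never reached after a failure
]

print(run_pipeline(stages))  # prints: code analysis
```

Ordering the cheap, fast stages first is what makes the feedback early: an expensive deploy-and-test stage only runs once everything before it is green.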
Exploratory Testing
The chance that you will find a problem on the second execution of a script is substantially lower than if you ran a new test instead. Exploratory testing is simultaneous learning, test design, and test execution.

Session-Based Testing
A method specifically designed to make exploratory testing auditable and measurable on a wider scale (vs. freestyle testing).
Source: http://en.wikipedia.org/wiki/exploratory_testing
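Making exploratory sessions measurable means turning session reports into aggregate data. The sketch below assumes a hypothetical minimal report format (charter, minutes, bugs, issues), not the official session-report template:

```python
# Session-based testing sketch: aggregate simple session reports into
# metrics for debriefs and reporting. The report format is a hypothetical
# minimal example.

def aggregate_sessions(reports):
    totals = {"sessions": 0, "minutes": 0, "bugs": 0, "issues": 0}
    for report in reports:
        totals["sessions"] += 1
        totals["minutes"] += report["minutes"]
        totals["bugs"] += len(report["bugs"])
        totals["issues"] += len(report["issues"])
    return totals

reports = [
    {"charter": "Explore checkout flow with invalid cards",
     "minutes": 90, "bugs": ["crash on empty CVV"], "issues": ["slow env"]},
    {"charter": "Explore profile editing",
     "minutes": 60, "bugs": [], "issues": []},
]

print(aggregate_sessions(reports))
# prints: {'sessions': 2, 'minutes': 150, 'bugs': 1, 'issues': 1}
```

Aggregates like these are what distinguish session-based testing from freestyle exploration: charters and metrics make the work auditable without scripting it.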
Session-Based Structure
- Charter: 1-3 sentences; the flow to start with.
- Session: uninterrupted, 1-2 hours; start with the charter and explore; create test cases on the fly and record them.
- Report: flows, areas, charter coverage, bugs, issues.
- Debrief: with the manager, about process and results.
- Parsing results: store the report results as aggregate data for reporting and metrics.
Source: http://www.satisfice.com/rst.pdf

When Scaling [Scott Ambler, IBM]
Independent Integration Team
- Uses expensive testing tools and a production-like environment.
- Pulls stable builds based on quality and plan.
- Focuses on higher-level stories and cross-product business flows; focuses on "-ility" testing.
- Provides teams with automation flows (ATDD).
- Finds integration bugs for immediate fixing.

Release Look-Ahead Planning (example)
[Diagram: feature teams such as Ordering, Eligibility and AMSS plan sprints S1-S3 against an internal delivery plan, with sync points ("give & get"), external drops and integration demos run by the integration team; MMFs such as "Compare" and "Eligibility" are mapped sprint by sprint across UI, service and back-end teams.]
QA Through the Scrum Lifecycle
Testers support quality infusion through the entire team and product cycle.

Release Planning - Testing Strategy (iteration zero)
- Risk areas
- Tools, automation strategy, test data
- Constraints of integrations and dependencies
- Regression and code-freeze strategy
- Non-functional testing
- QA allocation to teams
QA Activities During Iteration Planning
[Scrum flow: Release Backlog (epics & user stories) -> Iteration Planning (PO + team define iteration scope) -> Iteration Backlog (team level) -> daily meetings during the sprint -> sprint review, where new functionality is demonstrated -> GA. Timeboxes, roles, rules.]
During iteration planning, QA:
- Acts as the right hand of the Product Owner
- Helps shape the Definition of Done
- Analyzes risks and affected regression
- Plans automation
- Plans builds (for integration testing)
- Reviews the defect backlog
- Takes part in estimation and sizing in the planning session
Iteration Execution

Not recommended: mini-waterfall iterations, where the sprint is split into sequential Planning, Design, Code and Test phases.

Recommended: do some of everything daily. For each story, analysis, design, coding, integration, testing and everything else required for "Done" overlap, so stories reach "Done" one after another throughout the iteration rather than all at the end.

Development of a work unit: detailed requirements and analysis -> design -> implementation and developer testing -> QA and acceptance testing -> deploy -> everything else required for "Done" -> iteration status review.
QA Role During the Iteration
- Participate in design
- Develop test data and test flows in parallel to development
- Team accountability for executing the test flows
- Automate relevant regression scenarios
- Exploratory testing
- Support the PO in elaborating the DoD for future stories
- As much as possible, dedicated to the team

Iteration Review
Iteration Review and Retrospective
- Confirm to the PO that all scenarios were tested
- Present quality metrics
- Conduct a cross-QA retrospective session

Thanks!