Selling Improved Testing: Reducing Customer Pain
Technology Advance Partners

Technology Advance Partners is a software services consulting firm with deep experience in Information Technology management, solution architecture and program delivery. Our solutions are tool- and technology-agnostic, helping Clients select the best technology and software to meet the needs of the business. We help Clients establish cost-effective information lifecycle governance programs that make the best use of their data across the information lifecycle, from initiation to disposal. We provide our Clients the management experience, process maturity and program management expertise gained from our combined experience leading enterprise IT projects.

Technology Services:
- Quality and Testing Life Cycle
  o Test Coverage
  o Test Data Management
  o Test Data Fabrication
  o Data De-identification & Masking
- Information Lifecycle Governance
  o Production Application Archiving
  o Application Data Decommissioning
- Data Analytics
  o Data Architecture
  o Advanced Data Analysis
  o Information Governance
- Data Security
  o Data Access Management
  o File Access Management
  o Security Policy Management
  o Security Data Analytics
- Data Integration
  o Master Data Management
  o Data Cleansing and Enrichment
  o Real Time Data Integration

Technology Advance Partners is led by Bill Wolf, Anthony Minstein and Andy Sorrell, with a combined 100+ years of experience in Information Technology. They are acknowledged business and technology leaders with expertise across healthcare (payer and provider), banking and retail.
Prior engagements:
- Strategic consulting for a Fortune 100 company adopting Hadoop and Big Data Analytics
- Managing and implementing a data security strategy across 8,000 databases for a Fortune 10 healthcare payer
- Development of a Data Privacy Architecture for a Fortune 50 insurance company with multiple mainframes and 100,000+ servers
- Developing a global retirement program for a Fortune 10 company, including tools, processes and policies
- Designing, implementing and testing new architectures adding High Availability and Disaster Recovery

Contact: Bill Wolf, wwolf@techadvancepartners.com, 612-719-9066; or Anthony Minstein, aminstein@techadvancepartners.com, 502-216-3443
Testing Pain Point: DATA!

- Regardless of platform, discipline, methodology or project type, every IT project needs test data
- As earlier lifecycle phases (requirements, design, development) are delayed, end dates do not change, compressing the time available for testing, an activity that is part of all phases
- Testing best practices rely on test planning, solid requirements and clear design; these are rarely understood or practiced, in spite of best intentions, so testing is a reactive process at best
- There are never enough test environments
- In the absence of test environments, test data proliferates, typically copied from Production with PHI/PII values intact and without the rigorous monitoring found in Production
- Copied production (test) data contains production PII values, exposing the organization to both security breaches and audit fines
- In Agile(ish) methodologies, data must be fabricated to match changing schema designs and emerging requirements
- What data is fabricated is typically incomplete, incorrect and insufficient to match business rules, or tests only positive conditions
- Refreshing environments is time-consuming and resource-intensive; test data becomes stale
- There is little opportunity to reset data state; testers continually create new test data
- Testing must cover multiple, diverse platforms, systems and data structures
- Each tester (business, development and QA) invests time and effort finding, creating and maintaining their own data
- Resolving a defect in Production can cost 100 times more than fixing that defect earlier in the lifecycle
- Speed-to-market is unintentionally more important than delivered quality: DevOps equals automation
Waterfall Lifecycle: V-Model

ITIL V-model for waterfall solution delivery. [Diagram: the left leg of the V descends through Stakeholder Requirements (Business Objectives / Requirements), System Requirements / Design, and Component Requirements and Design, gated by the Business Requirements Review, System Requirements Review, and Preliminary/Critical Design Reviews. The right leg ascends through Component Test (functional and non-functional), System Integration Test (functional and non-functional), System Acceptance / Validation, and Pre-Production Verification, gated by the Test Readiness Review and Production Readiness Review, culminating in Satisfaction of Customer Needs.]

Test data implications at each level:
- Requirements, analysis and design activities: the initial data model is developed, evolving and expanding as business requirements are further explored
- Component and sub-system verification: data models are in flux; test data is acquired from Production or created, with testers and development competing for data and conflicting over data state
- IT QA integrated and pre-production verification: data must match business use cases and verify end-to-end business rule processing; identify what environment, composed of what components, is available and when

Technology Advance Partners, 2017, techadvancepartners.com
Agile Lifecycle

Agile is a formal process, with testing integrated throughout each 30-day delivery increment ("sprint") and an emphasis on face-to-face communication between end user and development.

SCRUM roles: Product Owner, Team Member, Scrum Master

Daily (24-hour) standup questions:
- What have I completed?
- What am I planning to do?
- What are my impediments?

Flow: Product Backlog, Sprint Backlog, 30-day Sprint, potentially shippable product, Sprint Retrospective

- A developer may begin by coding test cases (Test-Driven Development), fabricating data to match needs, with a focus on positive test cases for the assigned component
- Final integration occurs at the conclusion of the sprint: integration of individual feature data designs
- There is still a need for comprehensive end-to-end business use cases, and for data to match modified data structures and new business rules
- Agile is intended to deliver software into release as quickly as possible; it does not resolve test data problems
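The test-first pattern described above can be sketched as follows. This is a minimal illustration only; the component (`is_eligible_for_retail_portal`), the fabrication helper, and the eligibility rule are all hypothetical, not taken from any Optim tool.

```python
# Hypothetical sketch of test-first development for one component:
# the developer writes the test (and fabricates the data it needs)
# before the production code exists, focusing on positive cases first.

def make_customer(customer_type="Retail", id_scheme="Single"):
    """Fabricate a minimal customer record matching the current schema."""
    return {"customer_type": customer_type, "id_scheme": id_scheme, "active": True}

def is_eligible_for_retail_portal(customer):
    """Component under test: only active retail customers qualify."""
    return customer["active"] and customer["customer_type"] == "Retail"

# Positive test case, written first, against fabricated data
def test_retail_customer_is_eligible():
    customer = make_customer(customer_type="Retail")
    assert is_eligible_for_retail_portal(customer)

test_retail_customer_is_eligible()
```

Note that, exactly as the slide warns, this kind of fabrication covers the assigned component's positive paths; negative cases and cross-component data still need to come from somewhere.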
The Variables of Testing

Test data must meet multiple specifications, spanning testing, development, and database/structure concerns:
- Testing scope: data properties covering new requirements, new business rules, corner cases, edge cases, regression of vendor changes, and regression of legacy changes
- Application expectation: business rules that describe legitimate application data, emerging within an Agile methodology
- Evolving data structures: database logic (referential integrity, CHECK constraints), structural rules enforced on the database, and federated data sources across multiple stakeholder applications

How do you meet test data needs when data structures are not yet defined, or even designed?
Test Data Solution: Calculated Test Coverage

The design/documentation dependency can be eliminated, because test data requirements are based on test coverage: test coverage data requirements are calculated from the source variables of the initial (first sprint/phase) data design.

[Diagram: the initial data design defines Customer Type (Retail, Wholesale, Warehousing, Full Logistics) and Customer ID (Single, Multiple) attributes; crossing the known attribute values produces a coverage matrix of RECORD 1 through RECORD 12.]

Create a test data coverage matrix matching coverage needs from the known metadata; this results in 12 records.
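The coverage-matrix calculation can be sketched as a cross product over the known attribute domains. The attribute names and the second attribute's values below are illustrative assumptions chosen so the matrix size matches the slide's 12-record example; the real domains would come from the project's metadata, and Optim's actual algorithm may differ.

```python
from itertools import product

# Hypothetical attribute domains from an initial data design;
# the real names and values would come from the project's metadata.
attributes = {
    "customer_type": ["Retail", "Wholesale", "Warehousing", "Full Logistics"],
    "site_type": ["Single", "Multiple", "Retail"],  # illustrative assumption
}

# Cross every attribute value with every other to build the coverage matrix.
coverage_matrix = [
    dict(zip(attributes.keys(), combo))
    for combo in product(*attributes.values())
]

print(len(coverage_matrix))  # 4 x 3 = 12 coverage records
```

Each entry in `coverage_matrix` is one required test data record, derived purely from metadata, before any data has been extracted or fabricated.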
Test Data Solution: Filter with Realistic Business Rules

Filter the calculated test coverage data requirements using realistic business rules; filter out unrealistic test cases: warehousing and logistics customers would not have retail stores.

[Diagram: the same 12-record coverage matrix from the initial (first sprint/phase) data design, with the warehousing-plus-retail and logistics-plus-retail combinations removed.]

Applying the business rules filters test data needs down to 10 records.
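The filtering step can be sketched as a predicate over the generated matrix. As before, the attribute names and domains are illustrative assumptions (chosen to reproduce the 12-to-10 reduction in the slide), and the rule is the one the slide names: warehousing and logistics customers have no retail stores.

```python
from itertools import product

# Rebuild the hypothetical 12-record coverage matrix (illustrative domains).
attributes = {
    "customer_type": ["Retail", "Wholesale", "Warehousing", "Full Logistics"],
    "site_type": ["Single", "Multiple", "Retail"],
}
coverage_matrix = [
    dict(zip(attributes, combo)) for combo in product(*attributes.values())
]

def is_realistic(record):
    """Business rule: warehousing and logistics customers have no retail stores."""
    if record["customer_type"] in ("Warehousing", "Full Logistics"):
        return record["site_type"] != "Retail"
    return True

filtered = [r for r in coverage_matrix if is_realistic(r)]
print(len(filtered))  # 12 combinations minus 2 unrealistic ones = 10
```

Keeping the rules as named predicates means that, when a business rule changes mid-project, only the predicate is edited and the matrix is recalculated.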
Test Data Solution: Integrate with Optim

Integrate with Optim, matching test data to the test data coverage matrix:
- Create the test data coverage matrix, matching coverage needs from known metadata (test coverage data requirements calculated from source variables)
- The Optim Test Data Orchestrator asks: can the test data need be satisfied from an existing Optim TDM extract?
- If so, Optim Test Data Management satisfies the test data need by expanding an existing extract or creating a new one, with Optim Data Privacy de-identifying structured data values and UMASK de-identifying unstructured data values
- If the test data need cannot be satisfied from existing sources, Optim Test Data Fabricator creates the data
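Optim Data Privacy and UMASK supply de-identification out of the box; as a language-neutral illustration of the key property (consistent, repeatable masking that preserves joins across tables), here is a minimal sketch. This is not Optim's algorithm, just an assumption-laden example using a keyed hash.

```python
import hashlib

def mask_ssn(ssn: str, secret: str = "site-specific-secret") -> str:
    """Deterministically replace an SSN with a fake but repeatable value,
    so the same input always masks to the same output and joins between
    tables still line up. Illustrative sketch only, not Optim's method."""
    digest = hashlib.sha256((secret + ssn).encode()).hexdigest()
    digits = "".join(str(int(c, 16) % 10) for c in digest[:9])
    return f"{digits[:3]}-{digits[3:5]}-{digits[5:9]}"

# The real PII value never leaves the masking step...
assert mask_ssn("123-45-6789") != "123-45-6789"
# ...but the masked value is consistent wherever the SSN appears.
assert mask_ssn("123-45-6789") == mask_ssn("123-45-6789")
```

Consistency is what makes masked data usable for integration testing: a customer de-identified in one extract still matches their rows in every other extract.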
Test Data Solution: Absorb Design Changes

As the design changes, integrate the addition of new columns or tables across single or federated data sources.

[Diagram: the initial (first sprint/phase) data design, with Customer Type (Retail, Wholesale, Warehousing, Full Logistics) and Customer ID (Single, Multiple), gains new attributes: Customer ID Type (Office, Warehouse, Sales Office), Contact Name, Contact Phone, and Address.]

- Reuse existing components and add only what is needed
- Optim work orders are created to alert to the presence of new or altered columns
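The "alert on new or altered columns" step amounts to diffing two schema versions. A minimal sketch, assuming simple table-to-column-set metadata (the column names are the ones from this slide's evolving design; how Optim represents its metadata internally is not shown here):

```python
# Detect new columns between two schema versions, the kind of change
# that Optim work orders alert on, so test data work can be scoped
# to only what changed while everything else is reused.

old_schema = {"CUSTOMER": {"customer_type", "customer_id"}}
new_schema = {"CUSTOMER": {"customer_type", "customer_id",
                           "customer_id_type", "contact_name",
                           "contact_phone", "address"}}

def new_columns(old, new):
    """Return a mapping of table name to the sorted list of added columns."""
    return {
        table: sorted(cols - old.get(table, set()))
        for table, cols in new.items()
        if cols - old.get(table, set())
    }

print(new_columns(old_schema, new_schema))
```

A table absent from `old_schema` entirely would show up with all of its columns listed, covering the new-table case as well as the new-column case.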
The Automated Test Data Factory

Automation and integration power the test data factory and DevOps delivery:
- Agility: begin test data analysis earlier in the lifecycle
- Speed-to-market: move analysis further upstream in the process, integrating with existing assets
- Data de-identification of both structured and unstructured data eliminates security and audit concerns
- Reduced size of test databases, with a focus on only the test data necessary for thorough test coverage
- A library of test data extracts, ready for loading to match the testing need of each environment
- With smaller, fit-to-purpose test extracts, environments can be refreshed faster and more easily
- Tools may be implemented as part of an integrated platform, or individually, matching site requirements, budget, environment and culture
Test Data Factory

The customer's application test world spans many change scenarios: vendor releases of new functionality, custom development (Agile and waterfall), middle-tier transaction changes and new middle-tier transactions, extranet web changes and new extranet applications, custom and vendor break-fixes, new custom functional releases, and infrastructure and infrastructure DevOps upgrades.

For each scenario, the factory answers a set of questions and applies the matching tool:
- Extract from Production? Optim TDM/TDO
- Fabricate for a new schema or business rules? Optim TDF fabricates data to match new or revised schemas and new business rules
- Reuse test data from prior environments? Optim TDM/TDO adds data for new, expanded test cases
- Superset data to identify the failover point: Optim TDM
- With each production release, create or refresh the regression data bed: Optim TDM
- Refresh state and rerun tests?
- De-identify structured and unstructured data: Optim Data Privacy and UMASK
- Calculate test coverage: Test Data Orchestrator

Logical test environments, with typical test data sizes as a percentage of Production:
- Development: 5%
- System Test: 20%
- QA/Acceptance: 20%
- Performance Test: 150%+
- Break Fix: 20%
- Regression: 20%
The Pain Killer

The analgesic is automation, integration and fit-to-purpose: provide solutions to known problems.

Problem / Solution / Test Data Platform:
- Project schedule compression: alleviate schedule compression with test data automation; enable data fulfillment to drive the test engine (Test Data Orchestrator)
- Test best practices: coordinate testing around the knowns; calculate test data needs from test coverage, based on the data source model and data variables (Test Data Orchestrator)
- Lack of test environments: maintain a library of reusable test data subsets to load where needed (Optim Test Data Management)
- Proliferation of sensitive data: employ data privacy/de-identification on both structured and unstructured data sources (Optim Data Privacy for structured, UMASK for unstructured)
- Data fabrication: fabricate data to fill holes (Optim Test Data Fabricator)
- Environment refresh: structured extract/de-identify and load from a library of reusable data subsets (Optim Test Data Management)
- Test data reset: structured extract/de-identify and load from a library of reusable data subsets (Optim Test Data Management)
- Multiple diverse platforms: test data consistency across affected platforms, with data targets federated as a single target (Optim Test Data Management)
- Individual tester test data: maintain a library of reusable test data subsets to load where needed (Optim Test Data Management)
Potential Considerations

Implementing technology innovation demands expectation management.

Concern / How to address the concern:
- Ease of implementation (how difficult to achieve results): implementation takes advantage of existing processes; Test Coverage (TDO) integrates with existing TDM artifacts to find necessary data and alert on new or changed tables/fields; fabrication requires some understanding of the source and target databases
- Cost of implementation (how expensive to achieve results): right-size the initial targets; there is limited investment upfront
- Time to implement (how quickly can results be achieved): depends on the size and breadth of the source system; more tables and columns, with more extensive business rules, demand more time and effort; still faster than each testing resource performing the same work on their own
- Resource impact of implementation (who is required to implement): tools able to import logical models and interrogate metadata reduce resource impact
- Complexity of design (are end users able to apply the tools?): once configuration is completed, end users are able to rerun tests, refresh state and reload data
- Integration with other existing tools (e.g., TDM, test automation): tools may be installed and accessed as a single, integrated platform, or as individual units interacting with existing, installed products
- There are no EASY buttons, but there are easier buttons: all tools are designed with reuse in mind; shared repositories and saved objects reduce the heavy lifting
- Does the solution extend to both structured and unstructured data? Data privacy, masking and data generation apply to both structured and unstructured data sources
Thank You

Questions?

There are no EASY buttons, but there are easier buttons.