T-76.5613 Software Testing and Quality Assurance, 23.10.2006
Test Documentation and Reporting
SoberIT
IEEE 829 Standard for Software Test Documentation
- Test Planning: test plan (project level, phase level)
- Test Specification: test design specification, test case specification, test procedure specification
- Test Reporting: transmittal report, test log, incident report, test summary report
The number of test documents needed, and their format, thoroughness and level of detail, depend on the context.
Test plan: a project plan for testing
- A document describing the scope, approach, resources, and schedule of intended testing activities. It identifies test items, the features to be tested, the testing tasks, responsibilities, schedules, and risks.
- A test plan can be viewed either as a product or as a tool.
- Test plan as a product: structure, format and level of detail are determined not only by what's best for the effectiveness of the testing effort, but also by what the customer or a regulating agency requires.
- Test plan as a tool: creating long, impressive, or detailed test planning documents is not the best use of your limited time. A test plan is a valuable tool to the extent that it helps you manage your testing project and achieve your testing goals.
Test Plan (IEEE Std 829) 1/5
1 Test plan identifier
2 Introduction
- Product to be tested, objectives, scope of the test plan
- Software items and features to be tested
- References to project authorization, project plan, QA plan, CM plan, relevant policies & standards
3 Test items
- Test items including version/revision level
- Items include end-user documentation and bug fixes
- How transmitted to testing
- References to software documentation
Test Plan (IEEE Std 829) 2/5
4 Features to be tested
- Identify test design / specification techniques
- Reference requirements or other specs
5 Features not to be tested
- Deferred features, environment combinations, ...
- Reasons for exclusion
6 Approach
- How you are going to test this system
- Activities, techniques and tools
- Detailed enough to estimate
- Completion criteria: specify the degree of comprehensiveness (e.g. coverage) and other criteria (e.g. faults)
- Identify constraints (environment, staff, deadlines)
Test Plan (IEEE Std 829) 3/5
7 Item pass/fail criteria
- What constitutes success of the testing: e.g. coverage, bug count, bug rate, number of executed tests
- Is NOT product release criteria
8 Suspension and resumption criteria
- For all or parts of testing activities
- Which activities must be repeated on resumption
9 Test deliverables
- Test plan, test design specification, test case specification, test procedure specification
- Test item transmittal report, test logs, test incident reports, test summary reports
Test Plan (IEEE Std 829) 4/5
10 Testing tasks
- Including inter-task dependencies & special skills
- Estimates
11 Environment
- Physical, hardware, software, tools
- Mode of usage, security, office space
- Test environment set-up
12 Responsibilities
- To manage, design, prepare, execute, witness, check, resolve issues, provide the environment, provide the software to test
13 Staffing and training needs
Test Plan (IEEE Std 829) 5/5
14 Schedule
- Test milestones in the project schedule
- Item transmittal milestones
- Additional test milestones (e.g. environment ready)
- What resources are needed, and when
15 Risks and contingencies
- Testing project risks
- Contingency and mitigation plan for each identified risk
16 Approvals
- Names and dates of approval
Course Test Plan Template (for exercise phase 3)
- The high-level structure for your test plan
- The details below each chapter give you an idea of its contents
- Apply information from your course book, the lectures, and the IEEE 829-1998 standard to make as good a test plan as possible
- Remember: content is what counts; grading is not based on word count
1. Introduction
2. Tested items and features
3. Testing approach
4. Resources
5. Tasks and schedule
6. Risks and contingencies
7. Approvals
Test plan quality criteria
- Usefulness: will the test plan effectively serve its intended functions?
- Clarity: is the test plan self-consistent and sufficiently unambiguous?
- Accuracy: is the test plan document accurate with respect to any statements of fact?
- Adaptability: will it tolerate reasonable change and unpredictability in the project?
- Efficiency: does it make efficient use of available resources?
- Usability: is the test plan document concise, maintainable, and helpfully organized?
- Compliance: does the test plan meet externally imposed requirements?
- Foundation: is the test plan the product of an effective test planning process?
- Feasibility: is the test plan within the capability of the organization that must use it?
Source: Kaner, Bach, Pettichord. Lessons Learned in Software Testing. 2002
Test Case Specification (IEEE Std 829)
1. Test-case-specification identifier: specifies the unique identifier.
2. Test items: detailed feature, code module, etc. to be tested; references to product specifications or other design docs.
3. Input specifications: each input required to execute the test case (by value with tolerances, or by name); identifies all appropriate databases, files, terminal messages, etc.; specifies all required relationships between inputs (for example, timing).
4. Output specifications: results expected from executing the test case; outputs and features (for example, response time) required of the test items; exact value (with tolerances where appropriate) for each required output or feature.
5. Environmental needs: hardware, software, test tools, facilities, staff, etc.
6. Special procedural requirements: special constraints.
7. Intercase dependencies: lists the identifiers of test cases that must be executed prior to this test case, and the nature of the dependencies.
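The seven fields above map naturally onto a data structure. A sketch; the field names follow the slide, while the class itself and the example identifier and values are invented for illustration and are not part of the standard:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class TestCaseSpec:
    identifier: str                    # 1. unique identifier
    test_items: List[str]              # 2. features/modules under test
    input_spec: Dict[str, str]         # 3. required inputs and relationships
    output_spec: Dict[str, str]        # 4. expected results, tolerances
    environment: List[str]             # 5. hw/sw/tools/staff needed
    special_requirements: List[str] = field(default_factory=list)  # 6.
    depends_on: List[str] = field(default_factory=list)  # 7. intercase deps

# A hypothetical test case using the structure:
tc = TestCaseSpec(
    identifier="TC-LOGIN-001",
    test_items=["login module v1.2"],
    input_spec={"username": "alice", "password": "secret"},
    output_spec={"result": "logged in", "response_time_ms": "<= 2000"},
    environment=["Windows 2000", "build 1.34.52"],
)
print(tc.identifier)  # TC-LOGIN-001
```

Writing the specification as data makes the intercase dependencies (field 7) machine-checkable, e.g. for ordering test execution.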
Test design specification
- Specifies one set of tests and refines the test plan information
- Specific methods, tools and techniques
- Groups together tests that cover a certain set of features
- References to test procedures and test cases
- Pass/fail criteria
Test procedure specification
- Used if needed, i.e. when the test plan and test design specification are not enough
- Specifies the steps for executing a set of test cases
- Special requirements for executing the tests
- References to test cases
- Steps and any measurements to be made
Reporting test results
- Evaluation of the tested software: found defects and issues; the testers' assessment of the quality; risk assessment based on test results and gained knowledge
- Testing project management report: comparison of planned vs. actual
- A post mortem for tests to come: weak and omitted areas; ideas for new tests; risks
A test record (log) contains
- A chronological record of relevant details about the tests' execution
- Unambiguous identities and versions of: the software under test (exact build, versions of components, ...); the test specifications; the testing environment
- Attributes of the environments in which the testing is conducted: the hardware used (e.g. amount of memory, CPU model, mass storage devices), the system software used, the resources available
- Activity entries: date and time for the beginning and end of activities; identity of the tester; execution description; results; anomalous events; defect IDs; references to other documents (test case, test design specification, ...)
Check the results
- Follow the plan and mark off progress on the test script. Note that these records are used to establish that all test activities have been carried out as specified.
- Document actual outcomes from the test
- Capture any other ideas you have for new test cases
- Compare the actual outcome with the expected outcome, and log discrepancies accordingly: software fault; test fault (e.g. expected results wrong); environment or version fault; test run incorrectly
- Log coverage and other planned metrics for the measures specified as test completion criteria
Defect reporting
A defect report is a technical document written to describe the symptoms of a defect, in order to
- Communicate the impact and circumstances of a quality problem
- Prioritize the defect for repair
- Help a programmer locate and fix the underlying fault
Defect reports are the most frequent and visible results of testing work, and an important communication channel from testing to development. They are challenging to write:
- Bearing bad news
- Explaining complicated behaviour
- Communicating to people with a different mindset, using as few words as possible
- The goal is to get people to fix their mess instead of creating some new fancy functionality
Reporting the found defects
- Report defects immediately; don't leave it until the end of the test session
- Make sure the defect has not been previously reported
- Find out how to reproduce the defect: it is then easier to isolate and get fixed
- Write specific and clear defect reports: spend some time finding out what the actual defect is, under which conditions it occurs, and what the specific expected and actual outcomes were
- Be non-judgmental in reporting bugs: bug reports need to be non-judgmental and non-personal; reports should be written against the product, not the person, and state only the facts
An effective bug description
Useful bug reports are ones that get bugs fixed!
- Minimal: just the facts and details necessary; an exact sequence of steps that shows the problem
- Singular: only one bug per report, and only one report per bug
- Obvious and general: use steps that are easily performed, show the bug to be as general as possible, and are readily seen by users. If a programmer or tester has to decipher a bug, they may spend more time cursing the submitter than solving the problem.
- Reproducible: isolate and reproduce what seems like random software behavior. If an engineer can't see it or conclusively prove that it exists, the engineer will probably stamp it "WORKSFORME" or "INVALID" and move on to the next bug.
- Severity: show clearly how severe the consequences are if this defect is delivered to operation
10 steps to great defect reports
1. Structure: testing must be structured, so that you understand what you are doing.
2. Reproduce: clear steps; three tries.
3. Isolate: which factors affect the defect, and how.
4. Generalize: try to find the more general case in which the defect occurs.
5. Compare: does the same defect exist in other versions and in other parts of the product?
6. Summarize: communicate in a single sentence the essence and significance of the defect.
7. Condense: remove any excess information; use just the words you need and describe only the necessary steps.
8. Disambiguate: remove confusing or misleading words; be clear.
9. Neutralize: as a bearer of bad news, express yourself calmly; don't attack the programmer or use unnecessary humour or sarcasm.
10. Review: e.g. an informal check by another tester, or pair testing.
Rex Black, 2004. Critical Testing Processes.
Motivate fixing the defect
Make the defect look more serious
- Find a credible scenario that demonstrates the impact of the defect
- E.g. a realistic story that describes how a user can lose data when this defect occurs
Make the defect look more general
- You discovered the defect in some specific case; what is the most general case in which the defect occurs?
- E.g. if you first found that the system cannot cope with ~ and \, you might be able to generalize the defect into "the system only accepts the characters a-z, A-Z and 0-9, and not any special characters including -, ä, and ö"
Defect report
1. Defect-report identifier
2. Title: a short description of the defect
3. Bug description: a detailed description of the defect
- Date, time and finder
- Test item and environment, including version and build numbers
- Expected results
- Actual results
- Repeatability (whether repeated; whether occurring always, occasionally or just once)
- Additional information that may help to isolate and correct the cause of the incident
4. Severity of the bug
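A minimal sketch of checking a defect report for the fields listed above, e.g. as a submission gate in a bug tracker. The field names and the example report are illustrative assumptions, not mandated by any standard:

```python
# Required fields, following the slide's report structure (assumed names).
REQUIRED = {"id", "title", "description", "date", "item", "expected",
            "actual", "repeatability", "severity"}

def missing_fields(report: dict) -> set:
    """Return the required defect-report fields absent from `report`."""
    return REQUIRED - report.keys()

# A hypothetical, complete report:
report = {"id": "BUG-42",
          "title": "Adding equal even numbers gives a result one too big",
          "description": "See repro steps", "date": "2006-10-23",
          "item": "Calculator 1.0", "expected": "2+2=4", "actual": "2+2=5",
          "repeatability": "always", "severity": "major"}
print(sorted(missing_fields(report)))  # []: nothing is missing
```

A check like this catches incomplete reports before they reach a developer, which supports the "minimal but complete" goals on the previous slides.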
Reporting a bug: an exercise
Suppose that you are running tests on the Windows Calculator and find the following results:
1+1=2, 2+2=5, 3+3=6, 4+4=9, 5+5=10, 6+6=13, 4+6=10, 4+5=9
Write a bug title and bug description that effectively describe the problem.
Reporting a bug: one solution
Title: Adding a pair of equal even numbers gives a result that is too big (by one)
Description:
- Setup: start version 1.0 of Calculator
- Repro steps: try adding pairs of equal even numbers such as 2+2, 4+4, and 10+10. Also try adding pairs of equal odd numbers such as 3+3, 5+5, and 13+13, and pairs of unequal numbers such as 1+2, 4+6, and 15+13.
- Expected result: the correct answer for all pairs: 2+2=4, 4+4=8, ...
- Actual result: for pairs of equal even numbers, the answer is one too big: 2+2=5, 4+4=9, 10+10=21 and so on
- Other info: this wasn't tried exhaustively, but the bug occurred in many instances from 2+2 to 65536+65536. The bug doesn't seem to occur with odd numbers or unequal pairs.
- Environment: Windows 2000, 5.00.2195, Service Pack 4
- Reporter: Jack Debugger
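To see why the generalized title is the right one, here is a hypothetical faulty add() that reproduces the observed pattern, together with checks mirroring the repro steps. The injected fault is invented for illustration and says nothing about the real Calculator's code:

```python
def buggy_add(a: int, b: int) -> int:
    """Hypothetical faulty addition reproducing the exercise's observations."""
    result = a + b
    if a == b and a % 2 == 0:
        result += 1  # injected fault: equal even operands give one too many
    return result

# The checks mirror the repro steps from the sample report:
assert buggy_add(2, 2) == 5    # actual (wrong) result from the report
assert buggy_add(4, 4) == 9    # equal even pair: one too big
assert buggy_add(3, 3) == 6    # equal odd pair: correct
assert buggy_add(4, 6) == 10   # unequal pair: correct
print("pattern reproduced")
```

Note how the report's hypothesis ("equal even numbers") is exactly the condition a developer would grep for, which is what a well-generalized title buys you.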
Test Summary Report (IEEE Std 829) 1/2
1. Test-summary-report identifier
2. Summary
- Summarizes the evaluation of the test items
- Identifies the items tested (including their version/revision level)
- Indicates the environments in which the testing took place
- Supplies references to the documentation of the testing process
3. Variances
- Indicates any variances of the actual testing process from the test plan or test procedures
- Specifies the reason for each variance
4. Comprehensiveness assessment
- Evaluates the comprehensiveness of the actual testing process against the criteria specified in the test plan
- Identifies features or feature combinations that were not sufficiently tested, and explains the reasons for the omission
Test Summary Report (IEEE Std 829) 2/2
5. Summary of results
- Summarizes the success of testing (such as coverage, numbers of defects, severities, etc.)
- Identifies all resolved and unresolved incidents
6. Evaluation
- Provides an overall evaluation of each test item, including its limitations, based upon the test results and the item-level pass/fail criteria
7. Summary of activities
- The major testing activities and events
- Resource consumption (total staffing level, total person-hours, total machine time, total elapsed time used for each of the major testing activities, ...)
8. Approvals
- Specifies the persons who must approve this report (and the whole testing phase)
Meetings, reports and project control issues
- The test manager should collect and track the metrics regularly (weekly) and direct the testing group's efforts through status meetings
- Test results must be reported regularly to the whole project group (or development team)
- Depending on the software development model used, the required pace of feedback (test results) varies from one day to over a month
- The mission of a testing team is to produce, for the rest of the organisation, relevant information about the quality status of the system, promptly and in a useful form
- Summary reports are provided to upper management at least at every milestone
Status reports to management
Reports should be brief and contain (J. Rakos):
- Activities and accomplishments during the reporting period
- Problems encountered since the last meeting/report
- Problems solved
- Outstanding problems
- Current state versus plan
- Expenses versus budget
- Plans for the next time period
What is the most important information for upper management from a testing project?
The most important information (my answer)
1. Evaluation of the quality and status of the software development (project)
- Brief, easy and quick to understand
- Clearly brings forth the relevant information
- Risks
2. Problems that require management actions
3. Status of testing versus plans
- Accomplishments, coverage, defect counts and rates
- Expenses
- Required changes to plans
Testing Dashboard (6.6.2003, Build 1.34.52)

Functional area    Activity   Coverage   Comments
File management    low        2
Main hierarchy     high       2          Some weird behavior, still testing
Formatting         high       2          Some critical bugs found
Drag and drop      pause      1          Doesn't work with other applications
Imports            ready      3
Exports            blocked    0          Can't test: no specs for export formats
Overall GUI        blocked    3          A lot of small bugs
Editing area       pause      2          Looks good
Clipboard                     0          Not delivered
Property tables               0          Not implemented
Preferences        high       1          Nothing serious yet
Help               pause      2          Mainly spelling and grammar issues

Coverage scale: 3 = it's tested, 2 = features checked, 1 = we tried it (once), 0 = nothing
Quality column (smiley icons on the original slide): Perfect / Hmm / Aargghhh!

Kaner et al. 2002. Lessons Learned in Software Testing.
Risk-based reporting
(Chart: residual risks plotted against progress through the planned testing. All identified risks are open at the start; the curve falls from Start towards the Planned End, and its height at Today shows the residual risks of releasing today.)
Gerrard, P. & Thompson, N. 2002. Risk-based E-business Testing.
Risk-based testing
Risk-based test case design techniques
- The goal is to analyse product risks and use that information to design good tests (test cases)
- How could this system fail?
- Error guessing, failure models, experience
Risk-based test management
- Estimating risks for each function or feature
- Using risk analysis to prioritise testing
- Choosing what to test first, and what to test most
Qualitative vs. quantitative risk statements

Technical incompatibility
- Qualitative: very likely; could affect many users; system facilities not fully available or usable
- Quantitative: 70 % likelihood; may affect 80 % of users; 7 facilities could be unusable, 3 difficult to use

Funding withdrawn
- Qualitative: unlikely; severe impact
- Quantitative: 5 % probability; 95 % chance of project cancellation (5 % find other sponsors)
Qualitative risk analysis

                 unlikely          mid-likely          likely
Severe impact    severe/unlikely   severe/mid-likely   severe/likely
Some impact      some/unlikely     some/mid-likely     some/likely
Low impact       low/unlikely      low/mid-likely      low/likely

The severe-and-likely corner holds the nightmares; the low-and-unlikely corner is insignificant; the low-and-likely corner holds the nuisances. Most of the benefits come from addressing the severe, likely risks.
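The matrix can be sketched as a small ordinal lookup. Treating a cell's priority as the sum of the two ordinal levels is one simple convention chosen here for illustration, not the only possible scoring:

```python
# Ordered qualitative levels, lowest first (names follow the matrix).
IMPACT = ["low", "some", "severe"]
LIKELIHOOD = ["unlikely", "mid-likely", "likely"]

def risk_cell(impact: str, likelihood: str) -> int:
    """Score 0 (nuisance corner) .. 4 (nightmare corner) for a matrix cell."""
    return IMPACT.index(impact) + LIKELIHOOD.index(likelihood)

print(risk_cell("severe", "likely"))   # 4: the nightmares
print(risk_cell("low", "unlikely"))    # 0: insignificant
```

Sorting identified risks by this score puts the "most benefits" cells at the top of the testing backlog.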
Example: statistical risk analysis matrix
Risk exposure = cost x probability. Each feature (Interest Calculation, Close Account, Create Account) is scored on weighted factors (development cost, customer cost, average, new functionality, design quality, size, complexity); the weighted sum, multiplied by the probability, gives the feature's risk exposure. The idea is to get the features into priority order - somehow.
Modified slide, originally from Ståle Amland
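The matrix's arithmetic can be sketched as follows. The factor names follow the slide's column headings, while the weights, scores and probability are illustrative values, not the original cell contents:

```python
def risk_exposure(scores: dict, weights: dict, probability: float) -> float:
    """Weighted sum of factor scores times probability = risk exposure."""
    cost = sum(scores[f] * weights[f] for f in scores)
    return cost * probability

# Hypothetical weights and scores (scale 1-5), per the Amland-style matrix:
weights = {"new_func": 3, "design_quality": 2, "size": 1, "complexity": 2}
scores = {"new_func": 3, "design_quality": 2, "size": 3, "complexity": 3}

print(risk_exposure(scores, weights, probability=0.5))  # 11.0
```

Computing the exposure for every feature and sorting descending produces the priority order the slide is after.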
Test catalogs can be helpful
Collections
- an empty collection
- exactly one element (this is somewhat low-yield)
- more than one element
- the maximum number of elements (if that's not reasonable, try it at least once with more than just a few elements)
- duplicate elements
Searching collections
- match not found (if searching a limited subset of a collection, arrange for there to be a match outside that subset)
- one match (the best possible place to put it is probably just before the end bounds, if such a thing makes sense)
- more than one match
Finding subsets (filtering out elements)
- filtering out no elements (subset is the same as the original)
- filtering out one element
- filtering out all but one element
- filtering out all elements (subset is empty)
Special collection elements
- the collection contains itself (redundant with the next one, but the first one to try)
- indirect containment: the collection contains a collection that contains the original collection
Pairs of collections
- both collections empty
- first collection has one element, second has none
- first collection has no elements, second has one
- both collections have more than one element
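The collection catalog above lends itself to parameterised test data. A minimal sketch; MAX is an assumed implementation limit standing in for "the maximum number of elements":

```python
MAX = 100  # assumed maximum collection size for the system under test

# Catalog entries from the slide, as named inputs:
collection_cases = {
    "empty": [],
    "one element": [1],
    "several elements": [3, 1, 2],
    "maximum size": list(range(MAX)),
    "duplicates": [1, 1, 2],
}

def run_catalog(function):
    """Apply `function` to every catalogued input; collect cases that raise."""
    failures = {}
    for name, data in collection_cases.items():
        try:
            function(list(data))  # copy so the function can't corrupt the catalog
        except Exception as exc:
            failures[name] = exc
    return failures

print(run_catalog(sorted))  # {}: sorted() survives every catalogued case
```

Running a whole catalog against each new collection-handling function is a cheap way to apply this checklist systematically.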
Containers
Appending to a container's contents
- container initially empty
- container initially not empty
- adding just enough elements to fill it
- a full container: adding one element
- a partially full container: adding one too many elements
- adding zero new elements
Overwriting a container's contents
- new contents have one more element than will fit
- new contents just fit
- zero elements used (is the container correctly emptied?)
- some elements added, but fewer than in the original (are the old contents correctly cleared?)
Deleting elements
- the container has one element (low yield)
- the container is already empty
Files
Ability to operate on the file
- the file exists; it's readable / not readable; it's writeable / not writeable
- the file does not exist: it doesn't exist but can be created; it doesn't exist and can't be created
File types
- an ordinary file
- a directory/folder
- an alias or symbolic link (note that both exist on Mac OS X)
- a special file (Unix)
Names
- Whatever the name names does not exist
- Two different names name the same thing
Pathnames (POSIX / Mac OS X)
- Empty (for example, in a Unix shell, give it the argument "", which is equivalent to ".")
- Absolute; example: /tmp/foo
- Relative; example: tmp/foo
If the program is likely to take the pathname apart (to use one of its parts, or to build a new pathname):
- containing the component "..", as in ../../dir/x
- slash at end: foo/
- no slash: foo
- slash in the middle: foo/bar
- more than one slash: foo/bar/baz
- duplicate slashes: foo//bar (should be equivalent to foo/bar)
Pathname as an argument to a command-line command
- pathname begins with a dash (how do you get the command to work on -file?)
Percentage
- If the percentage is calculated from some count, try to make the count be zero.
Textual input / quoting
Unpaired quotes: like Unix's use of \ to quote many special characters, or comments that apply from a comment character like # or // to the end of the line.
- Quote mark as the first character in the text. Example: \abcd
- Quote mark somewhere in the middle of the text. Example: ab\cd
- Quote mark as the last character (quoting nothing). Example: abcd\
Paired quotes: like double quotes (") around strings, or /* */ around Java comments. I'll use /* and */ for start-of-quote and end-of-quote.
- Quote everything: no unquoted text
- A quote in the middle, with "real" text before and after it. Example: ab/*de*/fg
- No closing quote mark. Example: ab/*de
- Opening quote as the last character in the text. Example: /*
- Nested quotes. Example: /*/*hi*/*/
- Nothing inside the quote (low yield). Example: /**/
- Open quote without a close quote. Example: /*no end
- Close quote without an open quote. Example: no beginning */
Combinations
- Single-quoted double-quote mark. Example: \"some text without closing quote
- Double quote where the close-quote immediately follows a single-quote mark. Example: \/*Not quoted.
Textual input: boundary and special cases
- Nothing; empty (clear default)
- 0; LB-1 (LB = lower boundary); LB; UB (UB = upper boundary); UB+1
- Far below LB; far above UB
- UB number of chars; UB+1 number of chars; far beyond UB chars
- Negative
- Non-digit (/, ASCII 47); non-digit (:, ASCII 58)
- Upper ASCII (128-254) characters; ASCII 255
- Wrong data type; expressions
- Leading zeros; leading spaces
- Non-printing characters; O/S file name; upper ASCII; upper case; lower case
- Modifiers (Ctrl, Alt, etc.); function keys
- Edited with backspace and delete
- Input while processing
- Language-reserved characters
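The length-related rows of this checklist can be generated mechanically for a text field with lower bound LB and upper bound UB characters. This sketch covers only those rows (the keyboard and data-type rows need manual testing), and the bounds used are illustrative:

```python
def length_boundaries(lb: int, ub: int) -> dict:
    """Boundary-length strings for a text field accepting lb..ub characters."""
    return {
        "empty": "",
        "LB-1": "a" * (lb - 1),       # just under the lower boundary
        "LB": "a" * lb,               # exactly at the lower boundary
        "UB": "a" * ub,               # exactly at the upper boundary
        "UB+1": "a" * (ub + 1),       # just over the upper boundary
        "far above UB": "a" * (ub * 10),
    }

# Hypothetical field accepting 2..8 characters:
cases = length_boundaries(lb=2, ub=8)
print(len(cases["UB+1"]))  # 9
```

Feeding every generated string to the field under test covers the 0/LB-1/LB/UB/UB+1 rows in one loop.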